Too Much Text

How to Handle Too Much Text in a Prompt

Ways of Managing Large Contexts When Working with LLMs

LLMs have a limit on the size of the context they can take as input. AgenticFlow AI provides features that help you manage context size and prevent run-time errors.
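If you want a rough sense of whether a prompt will fit before running a workflow, you can estimate its token count yourself. The sketch below is illustrative only and is not part of AgenticFlow; the tiktoken encoding, file name, and context limit are assumptions you should adapt to the model you call.

```python
# Illustrative sketch (not AgenticFlow internals): estimating whether a prompt
# fits a model's context window before sending it, using the tiktoken library.
import tiktoken

CONTEXT_LIMIT = 128_000  # assumption: context window of the model you plan to call

def estimated_tokens(text: str) -> int:
    """Rough token count using a common OpenAI encoding."""
    enc = tiktoken.get_encoding("cl100k_base")
    return len(enc.encode(text))

prompt = open("knowledge_base.txt").read()  # hypothetical large input
if estimated_tokens(prompt) > CONTEXT_LIMIT:
    print("Prompt too large: summarize it or select only the most relevant parts.")
```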

Handling Large Content

In an LLM action, under “Advanced options,” you will find a section for handling large content. Every variable included in the prompt is listed in this section, each with an Edit button next to it.

Options for Handling Large Content

Small content can be fed to the LLM directly, but for large content, such as a knowledge base, AgenticFlow AI provides two options:

  1. Summarize the content

  2. Select the content most relevant to the query/goal (vector search)

Clicking on Edit will allow you to access these options.

Summarize

This option reads the provided content and produces a summary. Under advanced settings, you can provide a prompt to guide the summarization; for example, state the objective or goal of the analysis. You can also choose the large language model used for summarization.
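For intuition, the sketch below shows what prompt-guided summarization looks like in plain code. It is a conceptual example, not AgenticFlow's implementation; the OpenAI model name, guidance text, and input file are assumptions.

```python
# Conceptual sketch of prompt-guided summarization (not AgenticFlow's internal code).
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def summarize(content: str, guidance: str, model: str = "gpt-4o-mini") -> str:
    """Summarize content, steering the summary toward a stated objective."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": f"Summarize the user's text. {guidance}"},
            {"role": "user", "content": content},
        ],
    )
    return response.choices[0].message.content

summary = summarize(
    content=open("report.txt").read(),  # hypothetical large document
    guidance="Focus on customer complaints and their root causes.",
)
```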

Most Relevant Data

This option runs a semantic search over the data and extracts the parts most relevant to the provided query; a conceptual sketch follows the advanced options below.

Advanced Options for Most Relevant Data

  • Query: By default, the query is extracted from the original prompt. However, it is highly recommended to set a custom value for the search query. You can type in a query or use {{}} to include a variable, for example, {{question}} or {{search_query}}.

  • Query Type: You can select between vector and keyword search.

  • Columns: By default, all columns from the row of data are included in the results. However, you can specify a subset of columns to exclude unnecessary data.

  • Page Size: This parameter indicates the number of matching entries to be fetched as the search results. It is set to 100 by default.
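To make the Query, Columns, and Page Size options concrete, the sketch below shows a minimal version of this kind of retrieval: embed the rows, embed the query, and keep the top matches. It is a conceptual illustration, not AgenticFlow's implementation; the embedding model and example rows are assumptions.

```python
# Conceptual sketch of "most relevant data" retrieval: embed the rows, embed the
# query, and keep the top matches. Not AgenticFlow's implementation; the embedding
# model and the example rows are assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

rows = [
    {"title": "Refund policy", "body": "Refunds are issued within 14 days.", "author": "ops"},
    {"title": "Shipping", "body": "Orders ship within 2 business days.", "author": "ops"},
]
columns = ["title", "body"]  # subset of columns to keep, as in the Columns option
page_size = 1                # number of matches to return, as in the Page Size option
query = "How long do refunds take?"

model = SentenceTransformer("all-MiniLM-L6-v2")
texts = [" ".join(str(r[c]) for c in columns) for r in rows]
row_vecs = model.encode(texts, normalize_embeddings=True)
query_vec = model.encode([query], normalize_embeddings=True)[0]

scores = row_vecs @ query_vec               # cosine similarity (vectors are normalized)
top = np.argsort(scores)[::-1][:page_size]  # indices of the best matches
results = [{c: rows[i][c] for c in columns} for i in top]
print(results)
```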

Full

This option returns the entire content (e.g., the whole knowledge base). Content that is too large can cause run-time errors, so this option is not recommended when working with large data.
