LLM Prompt

Large Language Model Prompt

Providing information to an LLM and requesting an action

What is a Prompt?

A prompt is the piece of text you use to communicate with an LLM: it provides information and requests an action. Although this communication happens in natural language, not all prompts are equally effective.

Even though LLMs have advanced significantly, they still have limitations. A well-written prompt is one of the most important factors in getting good results from an LLM. This page summarizes tips for writing effective prompts.

Tips on Writing a Good Prompt (Prompt Engineering)

1. Provide Necessary Information at the Top

If the LLM needs information to perform a task, provide the information at the beginning of the prompt. Use keywords such as “Context” to specify what the information represents.

Example:

Context: The user is asking about the weather forecast.
Task: Provide a detailed weather forecast for the next three days.
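
If you assemble the prompt programmatically, for example in a Code - Python step, one way to keep the context at the top is simply to build the string in that order. A minimal sketch, where user_question is a hypothetical variable supplied by an earlier workflow step:

# user_question is a hypothetical input from an earlier step
user_question = "Will it rain this weekend?"

# Context first, task second, so the LLM reads the background before the request
prompt = (
    "Context: The user is asking about the weather forecast.\n"
    f"User question: {user_question}\n"
    "Task: Provide a detailed weather forecast for the next three days."
)
print(prompt)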

2. Keep Instructions Short and Precise

Specify exactly what needs to be done, in as few words as possible.

Example:

Summarize the following text in one paragraph.

3. Note Expectations Clearly

State what you expect rather than what you don’t want.

Example:

Answer should be informative and useful.

4. Include Formatting Instructions

When necessary, specify the format or structure you want for the output.

Example:

List the items in bullet points:
- Item 1
- Item 2
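
Formatting instructions also make the output easier to post-process. As a rough sketch, if a later Code - Python step needs the list items individually, you might ask for bullet points and then split them out; llm_output here is a hypothetical placeholder for the text returned by the LLM action:

# llm_output stands in for the text returned by the LLM action
llm_output = "- Item 1\n- Item 2\n- Item 3"

# Keep only the lines that follow the requested "- " bullet format
items = [line[2:].strip() for line in llm_output.splitlines() if line.startswith("- ")]
print(items)  # ['Item 1', 'Item 2', 'Item 3']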

5. Specify the Scope Using Identifiers

Use delimiters such as quotation marks (") or triple quotes (""") to clearly mark where the data provided in the prompt begins and ends.

Example:

"""
The quick brown fox jumps over the lazy dog.
"""
Summarize the text above.
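
Delimiters are especially useful when the text comes from a variable, such as the output of a File to Text or Extract Website Content step, because they keep the LLM from confusing the data with the instruction. A minimal sketch, where source_text is a hypothetical variable holding the document to summarize:

# source_text is a hypothetical variable holding the document to summarize
source_text = "The quick brown fox jumps over the lazy dog."

# Wrap the data in triple quotes so its boundaries are unambiguous
prompt = f'"""\n{source_text}\n"""\nSummarize the text above.'
print(prompt)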

6. Explicitly Note Constraints and Goals

For complex prompts, categorize information into sections like Constraints and Goals. Mention goals after the constraints.

Example:

Constraints: Use no more than 200 words.
Goals: Provide a comprehensive overview of the topic.

7. Place Important Instructions Near the End for Large Prompts

For larger prompts, place goals and important instructions as close as possible to the end of the prompt.

Example:

Context: Detailed explanation of a complex topic.
...
Goals: Summarize the key points in a concise manner.

8. Stick to One Term for the Same Concept

Use consistent terminology throughout the prompt to avoid confusion.

Example:

Use "customer" throughout the prompt instead of alternating with "client" or "user".

9. Include Examples Specific to Your Data Context

Providing examples can significantly enhance the LLM's performance.

Example:

Example: "The cat sat on the mat."
Rewrite in passive voice: "The mat was sat on by the cat."
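
When you have several examples, one common pattern is to keep them as data and format them into the prompt, which makes them easy to update as your use case evolves. A rough sketch, where the example pairs and input_sentence are hypothetical placeholders for your own data:

# Hypothetical few-shot examples drawn from your own data
examples = [
    ("The cat sat on the mat.", "The mat was sat on by the cat."),
    ("The chef cooked the meal.", "The meal was cooked by the chef."),
]
input_sentence = "The dog chased the ball."

# Show each example pair, then the sentence to rewrite
shots = "\n".join(
    f'Example: "{src}"\nRewrite in passive voice: "{dst}"' for src, dst in examples
)
prompt = f'{shots}\nNow rewrite this sentence in passive voice: "{input_sentence}"'
print(prompt)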

By following these tips, you can create effective prompts that guide LLMs to produce accurate and useful responses.
