LLM - Advanced Settings

Advanced Settings Available for LLMs

There is a lot more to LLMs than just the prompt and the model. Click on “Advanced options” on the LLM action to access additional settings that can enhance the performance and functionality of the LLM.

Adding Conversation History

Conversation history allows you to provide context and specify roles for a conversation with the LLM. Click on + Add Row to add lines of conversation between a “user” and “ai”.

The only acceptable roles are “ai” and “user”.

Conversation history is useful in conversational agents: it gives the model context from earlier turns and enables a more personalized experience. For example, after “Hello, my name is Sam”, the model is likely to respond with something like “Hello Sam - how can I help you?”

System Prompt

A system prompt is composed of notes, instructions, and guidelines that guide the AI to assume a certain role, follow a specific format, or adhere to limitations. For example: "You are an expert on the solar system. Answer the following questions in a concise and informative manner."

Strip Line-Breaks

Strip line-breaks is a text preprocessing parameter. If set to Yes, all line breaks are removed from the provided prompt. This can be handy when the prompt slightly exceeds the context window of the selected model.

Temperature

Temperature is a hyperparameter, ranging from 0 to 1, that affects the randomness (sometimes referred to as creativity) of the LLM’s response. Higher temperature values produce more random, creative, and diverse responses, but the response may also drift from the intended context.

In the next sections, we will explain how to:

  • Automatically validate LLM responses

  • Handle large amounts of text/context

Conversation History

Adding Conversation History

  1. Click on + Add Row.

  2. Enter the lines of conversation, specifying the role as either “user” or “ai”.

  3. Use conversation history to maintain context across interactions, improving the personalization and relevance of the AI’s responses.

Example

User: "What is the capital of France?"
AI: "The capital of France is Paris."

User: "Can you tell me more about it?"
AI: "Sure, Paris is known for its art, fashion, and culture..."
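The rows you add form an ordered list of role-tagged messages. The sketch below models that structure in Python; the dictionary field names are illustrative assumptions, not AgenticFlow's actual schema, but the role restriction matches the documented rule that only “user” and “ai” are accepted:

```python
# Hypothetical in-memory model of the conversation-history rows.
# Only "user" and "ai" are valid roles, per the setting's documentation.
ALLOWED_ROLES = {"user", "ai"}

def add_row(history, role, content):
    """Append one line of conversation, rejecting any other role."""
    if role not in ALLOWED_ROLES:
        raise ValueError(f"role must be one of {ALLOWED_ROLES}, got {role!r}")
    history.append({"role": role, "content": content})
    return history

history = []
add_row(history, "user", "What is the capital of France?")
add_row(history, "ai", "The capital of France is Paris.")
add_row(history, "user", "Can you tell me more about it?")
```

Because each new prompt is sent along with this list, the model can resolve references like “it” in the last row back to Paris.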

System Prompt

Setting a System Prompt

  1. Enter the notes, instructions, and guidelines for the AI.

  2. The system prompt helps the AI understand its role, the expected format, and any specific limitations.

Example

"You are an expert on the solar system. Answer the following questions in a concise and informative manner."
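Conceptually, the system prompt is delivered to the model ahead of the user's message, so the role and constraints are in place before the question is answered. A minimal sketch of that ordering (the message format is an assumption modeled on common chat APIs, not AgenticFlow's internal schema):

```python
# Sketch: the system prompt precedes the user's question in the message list,
# so the model adopts its role and constraints before answering.
def build_messages(system_prompt: str, user_prompt: str) -> list:
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    "You are an expert on the solar system. Answer the following questions "
    "in a concise and informative manner.",
    "How many planets orbit the Sun?",
)
```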

Strip Line-Breaks

Enabling Strip Line-Breaks

  1. Set the Strip Line-Breaks parameter to Yes.

  2. This will remove all line breaks from the provided prompt, making it more compact and potentially allowing it to fit within the context window of the selected model.

Temperature

Setting Temperature

  1. Adjust the Temperature parameter to a value between 0 and 1.

  2. Higher values increase the randomness and creativity of the responses, while lower values produce more deterministic and contextually accurate responses.

Example

  • High Temperature (e.g., 0.9): "Once upon a time in a land far, far away, there was a magical unicorn that loved to dance in the moonlight."

  • Low Temperature (e.g., 0.2): "The capital of France is Paris."
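Under the hood, temperature works by rescaling the model's output scores before a token is sampled; this is the standard mechanism, sketched here for intuition rather than as AgenticFlow's exact implementation. Low temperature sharpens the distribution toward the single most likely token (deterministic answers), while high temperature flattens it (more diverse, creative answers):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw scores into sampling probabilities.
    Dividing by a small temperature sharpens the distribution;
    dividing by a large one flattens it."""
    scaled = [l / max(temperature, 1e-6) for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]                           # illustrative token scores
sharp = softmax_with_temperature(logits, 0.2)      # low temp: near-deterministic
flat = softmax_with_temperature(logits, 0.9)       # high temp: more diverse
```

With temperature 0.2, nearly all probability mass lands on the top-scoring token; with 0.9, the alternatives keep meaningful probability, which is why high-temperature responses vary more between runs.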

By leveraging these advanced settings, you can significantly enhance the functionality and performance of your LLM actions in AgenticFlow AI.
