Plugin Tools Configuration
🔌 What are Plugin Tools?
Plugin tools enable your AI agent to execute individual workflow nodes as tools during conversations. Unlike workflow tools that run entire multi-step workflows, plugin tools give your agent access to specific node capabilities like:
LLM nodes: Call other AI models for specialized tasks
API nodes: Make HTTP requests to external services
Data transformation nodes: Convert formats, extract structured data
Integration nodes: Execute actions in connected services (Telegram, Google Sheets, etc.)
Utility nodes: Perform calculations, string operations, and more
Key Differences from Workflow Tools:
Workflow Tools: Execute complete multi-step workflows with complex logic
Plugin Tools: Execute single workflow nodes for specific, atomic operations
Use Plugin Tools When: You need fine-grained control over individual operations
Use Workflow Tools When: You need to execute complex, multi-step processes
⚙️ Plugin Tool Configuration
Each plugin tool requires the following configuration:
Required Fields
Plugin ID: The workflow node type to use. Examples: echo, llm, openai_ask_chat_gpt
Plugin Version: The version of the node. Example: 1.0.0 (standard for all nodes)
Optional Fields
Connection: Connection ID for nodes that require authentication. Some nodes require a connection to external services; check the specific node's documentation in the Node Library to see if a connection is required.
Input Config: Pre-configure specific input fields. Use this when you want to fix certain parameters and hide them from the agent.
🔧 Input Configuration (Advanced)
The input_config feature lets you pre-configure specific input fields for a plugin tool. This is useful when you want to:
Fix certain parameters: Set a specific model, temperature, or other settings
Simplify the agent's decision-making: Remove fields the agent doesn't need to decide on
Enforce consistency: Ensure certain values are always used
Hide complexity: Keep technical details away from the agent
How It Works
When you configure an input field:
The field is removed from the tool's schema presented to the agent
The agent cannot override this value
The pre-configured value is automatically merged during execution
The field becomes invisible to the AI model
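The behavior above can be sketched in Python. This is an illustrative model only; the function names are hypothetical and not part of the platform's actual implementation:

```python
# Illustrative sketch of input_config semantics; helper names are hypothetical.
def prepare_tool_schema(node_schema, input_config):
    """Drop pre-configured fields so the agent never sees them in the schema."""
    return {k: v for k, v in node_schema.items() if k not in input_config}

def merge_inputs(agent_inputs, input_config):
    """Merge pre-configured values at execution time; they always win."""
    merged = dict(agent_inputs)
    for field, cfg in input_config.items():
        merged[field] = cfg["value"]
    return merged

node_schema = {"prompt": "string", "model": "string", "temperature": "number"}
input_config = {
    "model": {"value": "google-gemini-2.0-flash-lite"},
    "temperature": {"value": 0.7},
}

# The agent only sees (and supplies) "prompt"; model and temperature are fixed.
visible = prepare_tool_schema(node_schema, input_config)
final = merge_inputs({"prompt": "Summarize this."}, input_config)
```

Note that the pre-configured value overwrites anything the agent sends for the same field, which is what makes the value impossible to override.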
Configuration Format
Each input field configuration consists of:
{
"field_name": {
"value": "the_actual_value",
"description": "Optional: why this value is set"
}
}
Example 1: Pre-configure LLM Model
Configure an LLM plugin with a fixed model and temperature:
{
"plugin_id": "llm",
"plugin_version": "1.0.0",
"input_config": {
"model": {
"value": "google-gemini-2.0-flash-lite",
"description": "Fixed model for cost control"
},
"temperature": {
"value": 0.7,
"description": "Balanced creativity"
}
}
}
Result: The agent can use the LLM tool but only needs to provide the prompt. The model and temperature are automatically set to your configured values.
Example 2: Pre-configure Message Template
Configure an echo plugin with a pre-configured greeting:
{
"plugin_id": "echo",
"plugin_version": "1.0.0",
"input_config": {
"data": {
"value": "Hello, World!",
"description": "Pre-configured greeting message"
}
}
}
Example 3: Pre-configure ChatGPT with Fixed Prompt
Configure OpenAI ChatGPT with a fixed system behavior:
{
"plugin_id": "openai_ask_chat_gpt",
"plugin_version": "1.0.0",
"connection": "openai_connection_id",
"input_config": {
"prompt": {
"value": "Repeat the following value: {{user_input}}"
},
"model": {
"value": "google-gemini-2.0-flash-lite"
},
"temperature": {
"value": 0.7
}
}
}
Note: You can use template variables like {{user_input}} in pre-configured values if the node supports templating.
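A minimal sketch of how such {{variable}} placeholders could be substituted, assuming simple double-brace syntax (the node's actual templating engine may differ):

```python
import re

def render_template(value, variables):
    """Replace {{name}} placeholders with values; unknown names are left intact.
    Illustrative only -- not the platform's actual templating implementation."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        value,
    )

prompt = render_template(
    "Repeat the following value: {{user_input}}",
    {"user_input": "hello"},
)
```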
📋 Common Plugin Tool Configurations
1. Add LLM Plugin Tool
Give your agent access to another AI model for specialized tasks:
Configuration:
{
"plugin_id": "llm",
"plugin_version": "1.0.0",
"input_config": {
"model": {
"value": "google-gemini-2.0-flash-lite"
}
}
}
Use Cases:
Use a reasoning model for complex logic
Use a vision model for image analysis
Use a fast model for simple tasks
2. Add API Call Plugin Tool
Allow your agent to make HTTP requests:
Configuration:
{
"plugin_id": "api_call",
"plugin_version": "1.0.0"
}
Use Cases:
Fetch data from external APIs
Send data to third-party services
Integrate with custom backends
3. Add String to JSON Plugin Tool
Parse JSON strings into structured data:
Configuration:
{
"plugin_id": "string_to_json",
"plugin_version": "1.0.0"
}
Use Cases:
Parse API responses
Convert string data to structured format
Handle JSON in conversations
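Conceptually, this node does the same job as Python's standard json.loads: it turns a raw JSON string into structured data that downstream tools can work with. A sketch of the equivalent operation:

```python
import json

# A raw string such as an API response body...
raw = '{"status": "ok", "items": [1, 2, 3]}'

# ...becomes a structured object with addressable fields.
parsed = json.loads(raw)
```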
4. Add Telegram Send Message Plugin
Send Telegram messages:
Configuration:
{
"plugin_id": "telegram_send_message",
"plugin_version": "1.0.0",
"connection": "telegram_connection_id"
}
Use Cases:
Send notifications to Telegram
Alert users via Telegram
Automated messaging
5. Add OpenAI ChatGPT Plugin
Use OpenAI's ChatGPT as a specialized tool:
Configuration:
{
"plugin_id": "openai_ask_chat_gpt",
"plugin_version": "1.0.0",
"connection": "openai_connection_id",
"input_config": {
"model": {
"value": "gpt-4o"
},
"temperature": {
"value": 0.7
}
}
}
Use Cases:
Use GPT-4o for specific reasoning tasks
Delegate complex analysis to a specialized model
Use vision capabilities for image understanding
🔍 Available Plugins
You can configure any of the 193+ workflow nodes as a plugin tool for your agent. Each node becomes available as a tool that your agent can execute during conversations.
Browse Available Nodes
To find the right plugin for your use case:
Node Library - Complete reference of all 193+ workflow nodes organized by category
Nodes by Category - Browse nodes by functionality (AI & LLM, Utilities, Integrations, etc.)
Nodes Alphabetical - Find nodes by alphabetical order
What You'll Find in Node Documentation
Each node's documentation page includes:
Description: What the node does and its capabilities
Input Parameters: Required and optional fields you can configure
Connection Requirements: Whether the node needs a connection to external services
Output Schema: What data the node returns
Examples: Usage examples and common configurations
Finding the Plugin ID
The Plugin ID is the node's technical name shown in the node documentation. For example:
echo - Echo node
llm - LLM node
openai_ask_chat_gpt - OpenAI ChatGPT node
telegram_send_message - Telegram Send Message node
api_call - API Call node
Tip: Use the search function in the Node Library to quickly find nodes by keyword or functionality.
⚠️ Important Considerations
Connection Requirements
Some plugins require a connection to external services. To check if a specific node requires a connection:
Check Node Documentation: Navigate to the Node Library and find the specific node
Review Requirements: The node documentation will specify if a connection is required
Configure Connection: If required:
Navigate to: Project Settings → Connections
Add Connection: For the required service (OpenAI, Telegram, etc.)
Get Connection ID: Copy the connection ID from the connection settings
Configure Plugin: Set the connection field to the connection ID
Note: Each node's documentation page in the Node Library specifies its connection requirements. Always refer to the specific node documentation for accurate connection information.
Cost Considerations
Plugin tools consume credits when executed:
LLM plugins: Cost varies by model (see Model Selection)
API plugins: May incur external API costs
Media plugins: Generation/processing costs apply
Best Practice: Use input_config to set cost-effective models for budget control.
Security Best Practices
Pre-configure sensitive parameters using input_config:
API endpoints that should remain fixed
Rate limits
Safety parameters
Model selection for consistent behavior
Use connections securely:
Store API keys and credentials in connection settings
Never expose credentials in input_config values
Reference connections by ID only
Monitor plugin usage:
Review conversation logs regularly
Track which plugins are being executed
Monitor credit consumption
Check for unexpected behavior
💡 Best Practices
1. Start Simple
Begin with basic utility plugins like echo or string_to_json to understand the behavior before adding complex integrations.
2. Use Input Config Strategically
Pre-configure fields that should remain constant:
Model selection for consistent behavior
Temperature for predictable outputs
API endpoints that shouldn't change
3. Combine with System Prompt
Guide your agent on when to use specific plugins:
System Prompt Example:
"You have access to the following tools:
- Use the 'llm' plugin for complex reasoning tasks
- Use the 'api_call' plugin to fetch external data
- Use the 'string_to_json' plugin when parsing JSON responses
Always explain which tool you're using and why."
4. Check Node Documentation
Before configuring a plugin:
Review the node's documentation in the Node Library
Check connection requirements
Understand required vs. optional input fields
Review examples and use cases
5. Monitor Plugin Usage
Review conversation logs to see:
Which plugins are being used
How often they're executed
Success/failure rates
Credit consumption
6. Layer Multiple Plugins
Create specialized workflows by combining plugins:
Fetch data with api_call
Parse with string_to_json
Analyze with llm
Notify with telegram_send_message
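The four-step chain above can be sketched as a pipeline. The functions below are stand-ins for the plugins, with hypothetical signatures and canned data; they are not a real SDK:

```python
import json

# Stand-ins for the plugins in the chain; signatures and data are illustrative.
def api_call(url):
    """Stand-in for the api_call plugin: returns a raw response body."""
    return '{"temp_c": 21}'

def string_to_json(raw):
    """Stand-in for the string_to_json plugin."""
    return json.loads(raw)

def llm(prompt):
    """Stand-in for the llm plugin."""
    return f"Analysis of: {prompt}"

def telegram_send_message(text):
    """Stand-in for the telegram_send_message plugin."""
    return {"sent": True, "text": text}

# Fetch -> parse -> analyze -> notify
data = string_to_json(api_call("https://example.com/weather"))
summary = llm(f"current temperature is {data['temp_c']}C")
result = telegram_send_message(summary)
```

In practice the agent decides when to invoke each tool; the point is that the output of one plugin becomes the input of the next.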
🔄 Updating Plugin Configuration
You can modify plugin tool configuration at any time:
Add New Plugins
Simply add new plugin configurations to the plugins array.
Modify Existing Plugins
Update the configuration fields:
Update input_config values
Change connection IDs
Modify plugin version if needed
Remove Plugins
Remove plugin configurations from the plugins array to disable them.
Note: Changes take effect immediately for new conversations. Existing conversations may need to be refreshed.
📖 Related Documentation
Workflow Tools Configuration - Configure complete workflows as tools
MCP Tools Configuration - Configure MCP protocol integrations
Node Library - Complete reference of all 193+ nodes
System Prompt Configuration - Guide your agent's tool usage
Model Selection - Choose the right AI model for your agent
🆘 Troubleshooting
Plugin Tool Not Appearing
Possible Causes:
Invalid plugin_id - Verify the node type exists
Missing connection for nodes that require it
Configuration validation errors
Solution: Check the browser console for validation errors.
Plugin Execution Fails
Possible Causes:
Missing required input fields
Invalid connection credentials
Node-specific errors (rate limits, API failures)
Solution: Review the execution logs and verify all required inputs are provided or pre-configured.
Agent Not Using Plugin
Possible Causes:
System prompt doesn't guide plugin usage
Plugin not relevant to conversation context
Agent chose alternative approach
Solution: Update system prompt with clear guidance on when to use specific plugins.
Input Config Not Working
Possible Causes:
Field name doesn't match node schema
Invalid value type for the field
Field is required but not configured
Solution: Verify field names match the node's input schema exactly.
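A quick sanity check of this kind can be sketched as follows, assuming you have the node's input schema at hand (the helper name and schema shape are hypothetical):

```python
def unknown_config_fields(input_config, node_schema):
    """Return input_config field names that the node's schema doesn't declare.
    Illustrative helper; the platform's own validation may report more detail."""
    return [field for field in input_config if field not in node_schema]

# "modle" is a typo for "model", so it will be flagged.
errors = unknown_config_fields(
    {"modle": {"value": "gpt-4o"}},
    {"model": "string", "prompt": "string"},
)
```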
For detailed node-specific configuration, see: Node Library Reference