OpenAI Search

A guide to using the OpenAI Search action for web searches within the OpenAI ecosystem.

The OpenAI Search action provides a way to conduct web searches using OpenAI's capabilities. While not a standalone search engine in the traditional sense, it is most powerful as a research step before interacting with an OpenAI language model, such as in the LLM Action.

It's designed to gather relevant, up-to-date information from the web that can then be passed as context to a chat completion or instruction-following model, improving the factual accuracy and timeliness of its responses.

Connection Setup

You will need an OpenAI account and an API key.

  1. Sign up for an account on the OpenAI Platform.

  2. Find your API key in your account settings under API Keys.

  3. In AgenticFlow, navigate to Settings > Connections and add a new OpenAI Connection, providing your API key.

How It Works

The OpenAI Search action takes your query and uses it to find relevant snippets of information from across the web. The key difference from other search actions is that its output is formatted specifically to be used effectively as context in a subsequent prompt to an OpenAI model.
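To make that contract concrete, here is a hypothetical Python sketch of the action's behavior. The function name and canned data are illustrative only, not AgenticFlow's actual implementation:

```python
def openai_search(query: str, limit: int = 5) -> list[dict]:
    """Mock of the OpenAI Search action: a query in, context-ready snippets out."""
    # The real action performs a live web search; this stub only shows the
    # shape of the output (an array of title/link/snippet objects).
    canned = [
        {
            "title": f"Result about {query}",
            "link": "https://example.com/article",
            "snippet": f"An up-to-date excerpt covering {query}.",
        },
    ]
    return canned[:limit]

results = openai_search("Apple event", limit=1)
```

Because every result carries the same three fields, the array drops cleanly into a prompt template without extra parsing.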

Configuration

Input Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| Connection | Connection | Select the OpenAI connection you created. |
| Query | Text | The topic or question you want to research. |
| Limit | Number | The maximum number of search results to return. Defaults to 5. |

Output Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| Results | Array | An array of objects, where each object contains the title, link, and a snippet of the search result. |
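For instance, a `Results` value with two entries might look like the following (the values are illustrative), and individual fields can be read by key:

```python
# Illustrative sample of the Results output; values are made up.
results = [
    {
        "title": "Apple event recap",
        "link": "https://example.com/recap",
        "snippet": "A summary of the announcements from the event...",
    },
    {
        "title": "Hands-on coverage",
        "link": "https://example.com/hands-on",
        "snippet": "Early impressions of the newly announced hardware...",
    },
]

# Every result object exposes the same three fields.
first = results[0]
print(first["title"], first["link"])
```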

Example: Answering a Question with Up-to-Date Info

Let's build a workflow to answer a question about a very recent event, ensuring the LLM has the latest information.

  1. Get the Question: The workflow starts with a Text Input action where you ask: "Summarize the key announcements from Apple's latest event."

  2. Configure the OpenAI Search Action:

    • Connection: Select your OpenAI connection.

    • Query: {{text_input_action.output}}

    • Limit: 3 (to get the top 3 most relevant articles/summaries).

  3. Provide Context to the LLM Action:

    • The Results output will be an array of search results.

    • Connect this to an LLM Action.

    • Set the LLM's prompt to:

      You are a helpful assistant. Using ONLY the following search results as your context, please answer the user's question.
      
      Search Results:
      {{openai_search_action.results}}
      
      User's Question:
      {{text_input_action.output}}

This RAG (Retrieval-Augmented Generation) pattern ensures the LLM isn't relying solely on its training data, which may be outdated; instead, it uses fresh information from the web to provide a current and accurate answer.
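The prompt assembly in step 3 can be sketched in code as follows. This is a minimal illustration: the field names follow the `Results` output described above, and the template mirrors the example prompt; the helper function itself is hypothetical:

```python
def build_rag_prompt(question: str, results: list[dict]) -> str:
    """Assemble the context-grounded prompt used in the example workflow."""
    # Flatten the search results into a readable context block.
    formatted = "\n".join(
        f"- {r['title']} ({r['link']}): {r['snippet']}" for r in results
    )
    return (
        "You are a helpful assistant. Using ONLY the following search results "
        "as your context, please answer the user's question.\n\n"
        f"Search Results:\n{formatted}\n\n"
        f"User's Question:\n{question}"
    )
```

In AgenticFlow the templating engine performs this substitution for you via `{{openai_search_action.results}}` and `{{text_input_action.output}}`; the function above only shows what the final prompt looks like.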
