# PML LLM

A guide to the PML LLM action, which uses Prompt Markup Language for advanced, cacheable prompting.

The PML LLM Action provides a powerful way to interact with Large Language Models using Prompt Markup Language (PML). PML is a specialized language that allows you to construct complex, modular, and cacheable prompts. This enables significant performance improvements by reusing attention states for recurring parts of your prompts.

This action is designed for advanced use cases where prompt structures are complex or contain significant reusable components. For simpler LLM interactions, consider using the standard **[LLM Action](./llm.md)**.

## How It Works

The PML LLM Action leverages a "Prompt Cache" system. Instead of recomputing the entire prompt every time, it identifies and caches the attention states of reusable modules. This means that for subsequent requests, only the new or changed parts of the prompt need to be processed, dramatically reducing latency.
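
For example, a schema might define a long, shared instruction as a reusable module. The first request computes and caches that module's attention states; later requests that reference the same module only pay for the parts that changed. The sketch below is hypothetical (the `support_bot` schema and its contents are illustrative, not part of this guide's example) and follows the same PML syntax as the A/B-testing example later on this page:

```xml
<!-- Hypothetical schema: a long, shared instruction defined once as a module -->
<schema name="support_bot">
  <module name="system_instructions">
    You are a support assistant for Acme Corp. Answer politely and concisely,
    and always include a link to the relevant help-center article.
  </module>

  <!-- Parameterized module for the part of the prompt that changes each time -->
  <module name="user_question">{{question}}</module>
</schema>

<!-- First request: computes and caches the attention states for system_instructions -->
<prompt schema="support_bot">
  <system_instructions/>
  <user_question question="How do I reset my password?"/>
</prompt>

<!-- Later request: reuses the cached system_instructions states; only the new question is processed -->
<prompt schema="support_bot">
  <system_instructions/>
  <user_question question="How do I update my billing details?"/>
</prompt>
```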

## Configuration

### Input Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| Connection | Connection | Select the LLM connection to use (e.g., OpenAI, Claude, Perplexity). |
| Model | Select | Choose the specific LLM you want to use. |
| PML Schema | Text | An XML-based schema defining the reusable modules, parameters, and structure of your prompt. |
| PML Prompt | Text | The specific prompt, written in PML, that references the schema and provides values for any parameters. |
| Temperature | Number | Controls the randomness of the output; higher values produce more varied responses, lower values more deterministic ones. |
| Max Tokens | Number | The maximum length of the generated response, in tokens. |

### Output Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| Output | Text | The text generated by the LLM based on the PML prompt. |

## Example: A/B Testing Email Copy

This example demonstrates how to use the PML LLM Action to A/B test different email subject lines while reusing the main body of the email.

**PML Schema** (`email_test_schema.xml`):

```xml
<schema name="email_test">
  <!-- Shared email body, reused unchanged across every test variant -->
  <module name="email_body">
    Hi {{contact_name}},

    I wanted to follow up on our conversation last week. I've attached the proposal we discussed.

    Best,
    Alex
  </module>

  <!-- Alternative subject lines; each prompt selects one of them -->
  <union name="subject_lines">
    <module name="subject_a">Following up on our chat</module>
    <module name="subject_b">Your proposal is ready</module>
  </union>
</schema>
```

**PML Prompt**:

```xml
<prompt schema="email_test">
  <email_body contact_name="Jane"/>
  <subject_lines>
    <subject_a/>
  </subject_lines>
</prompt>
```

Because `email_body` is defined as a reusable module, its attention states are computed and cached once; each A/B variant only has to process the subject line that differs, making the test much more efficient.
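
To run the second variant, only the union selection changes; the cached `email_body` states are reused as-is:

```xml
<prompt schema="email_test">
  <email_body contact_name="Jane"/>
  <subject_lines>
    <subject_b/>
  </subject_lines>
</prompt>
```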
