Prompt Templating in LLMKit
LLMKit uses Tera, a powerful templating engine inspired by Jinja2, to enable dynamic and flexible prompt creation. With Tera, you can create prompts that adapt based on variables you provide, making it easy to customize behavior without rewriting the entire prompt each time. This guide will walk you through the essentials of prompt templating, including variable substitutions, control structures, and best practices.
What is a Template?
A template is a string that contains placeholders for variables. When you "render" the template, these placeholders are replaced with actual values you provide. In LLMKit, templates are used to create dynamic prompts where parts of the system or user message can change based on input.
For example:
- A static prompt might always say, "You are a helpful assistant."
- A dynamic prompt could say, "You are a {{ mood }} assistant," where mood is a variable you define (e.g., "cheerful" or "serious").
Variable Substitution
In Tera, variables are denoted by double curly braces: {{ variable_name }}. When the template is rendered, these placeholders are replaced with the values you provide.
Basic Example
Suppose you have a system prompt template:
You are a {{ mood }} assistant who loves to help with {{ topic }}.
If you provide the values {"mood": "friendly", "topic": "coding"}, the rendered prompt becomes:
You are a friendly assistant who loves to help with coding.
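Under the hood, substitution is just "find each placeholder, insert its value." A minimal stdlib sketch (not Tera itself, which is what LLMKit actually uses) makes the idea concrete:

```python
import re

def render(template: str, values: dict) -> str:
    # Replace each {{ name }} placeholder with the matching value from `values`.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(values[m.group(1)]),
                  template)

prompt = render(
    "You are a {{ mood }} assistant who loves to help with {{ topic }}.",
    {"mood": "friendly", "topic": "coding"},
)
# → "You are a friendly assistant who loves to help with coding."
```

Tera adds much more on top of this (filters, control structures, error handling), but the rendering model is the same: template plus values in, finished prompt out.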
Supported Variable Types
You can use various types of variables in your templates:
- Strings: "friendly", "coding"
- Numbers: 42, 3.14
- Booleans: true, false
- Lists: ["apple", "banana", "cherry"]
- Dictionaries: {"key1": "value1", "key2": "value2"}
These types allow for rich and varied prompt customization.
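To make this concrete, here is a sketch of how each type serializes into the JSON payload you would send (the variable names here are illustrative, not fixed by LLMKit):

```python
import json

# Illustrative variable names; any JSON-serializable values work.
variables = {
    "mood": "friendly",                             # string
    "max_items": 42,                                # number
    "detailed": True,                               # boolean
    "topics": ["apple", "banana", "cherry"],        # list
    "style": {"key1": "value1", "key2": "value2"},  # dictionary
}
payload = json.dumps(variables)
```

Note that Python's `True` becomes JSON `true` in the payload, which is the form the template's conditionals see.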
Control Structures: Conditionals and Loops
Tera also supports control structures like conditionals and loops, which let you add logic to your templates.
Conditionals
Use {% if %} to include or exclude parts of the prompt based on conditions.
You are a helpful assistant.
{% if include_examples %}
Here are some examples:
- Example 1
- Example 2
{% endif %}
If include_examples is true, the examples are included; otherwise, they’re omitted.
Loops
Use {% for %} to repeat parts of the prompt for each item in a list.
Here are the topics to discuss:
{% for topic in topics %}
- {{ topic }}
{% endfor %}
If topics is ["AI", "Robotics", "Space"], the rendered prompt becomes:
Here are the topics to discuss:
- AI
- Robotics
- Space
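Because Tera's {% if %} and {% for %} syntax matches Jinja2 for these constructs, you can prototype template output in Python with the jinja2 package before saving a prompt in LLMKit. This is a stand-in only (LLMKit renders with Tera, in Rust), and the whitespace options shown are assumptions for tidy output:

```python
from jinja2 import Environment

# trim_blocks/lstrip_blocks remove the newlines and indentation that the
# {% ... %} tags themselves would otherwise leave behind.
env = Environment(trim_blocks=True, lstrip_blocks=True)

rendered_if = env.from_string(
    "{% if detailed %}Provide detailed explanations."
    "{% else %}Keep answers brief.{% endif %}"
).render(detailed=True)

rendered_loop = env.from_string(
    "Here are the topics to discuss:\n"
    "{% for topic in topics %}\n"
    "- {{ topic }}\n"
    "{% endfor %}"
).render(topics=["AI", "Robotics", "Space"])
```

Prototyping this way is a quick check that your conditionals and loops produce the lines you expect before any LLM call is involved.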
How Templates Work in LLMKit
In LLMKit, templates are used in dynamic prompts (both dynamic_system and dynamic_both types). Here’s how they fit into each prompt type:
- Static Prompts: No templates or variables; the system message is fixed.
- Dynamic System Prompts: The system message is a template with variables. You provide values for these variables in the API request.
- Dynamic Both Prompts: Both the system and user messages are templates with variables. You provide values for all variables in the request.
Providing Variable Values
When making an API request, you include a system message with a JSON string containing the variable values. For example:
{
"model": "DYNAMIC-SYSTEM-CHAT",
"messages": [
{
"role": "system",
"content": "{\"mood\": \"playful\", \"topic\": \"dinosaurs\"}"
},
{
"role": "user",
"content": "Tell me something cool!"
}
]
}
- The system message’s content is a JSON string: {"mood": "playful", "topic": "dinosaurs"}.
- These values are used to render the template before sending the prompt to the LLM.
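The easy mistake here is the double encoding: the request body is JSON, and the system message's content field is itself a JSON string inside it. Building the payload programmatically avoids hand-escaping the inner quotes. A sketch for the request above:

```python
import json

variables = {"mood": "playful", "topic": "dinosaurs"}
request = {
    "model": "DYNAMIC-SYSTEM-CHAT",
    "messages": [
        # Note the double encoding: `content` is itself a JSON *string*.
        {"role": "system", "content": json.dumps(variables)},
        {"role": "user", "content": "Tell me something cool!"},
    ],
}
body = json.dumps(request)  # the outer JSON actually sent over the wire
```

Letting json.dumps produce the inner string is what yields the escaped form ({\"mood\": \"playful\", ...}) shown in the raw request.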
Examples
Simple Variable Substitution
Template (system message):
You are an expert in {{ field }}.
API Request:
{
"model": "DYNAMIC-SYSTEM-CHAT",
"messages": [
{"role": "system", "content": "{\"field\": \"astronomy\"}"},
{"role": "user", "content": "What is a black hole?"}
]
}
Rendered Prompt:
You are an expert in astronomy.
Using Conditionals
Template (system message):
You are a helpful assistant.
{% if detailed %}
Provide detailed explanations.
{% else %}
Keep answers brief.
{% endif %}
API Request:
{
"model": "DYNAMIC-SYSTEM-CHAT",
"messages": [
{"role": "system", "content": "{\"detailed\": true}"},
{"role": "user", "content": "Explain photosynthesis."}
]
}
Rendered Prompt:
You are a helpful assistant.
Provide detailed explanations.
Using Loops
Template (user message):
Compare these fruits:
{% for fruit in fruits %}
- {{ fruit }}
{% endfor %}
API Request:
{
"model": "DYNAMIC-BOTH-ONESHOT",
"messages": [
{"role": "system", "content": "{\"fruits\": [\"apple\", \"banana\", \"orange\"]}"}
]
}
Rendered User Message:
Compare these fruits:
- apple
- banana
- orange
Best Practices
- Escape User Inputs: If variables come from untrusted sources, ensure they’re properly escaped to avoid injection attacks. Tera automatically escapes HTML, but be cautious with other formats.
- Keep Templates Simple: Complex logic can make prompts hard to debug. Start simple and add complexity only when necessary.
- Test Thoroughly: Use LLMKit’s evaluation tools to test how different variable values affect your prompt’s performance.
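As a small illustration of the first point: serializing variable values with a JSON library, rather than concatenating strings, at least guarantees the variable payload stays well-formed when user input contains quotes. It does not by itself prevent prompt injection, so treat it as a floor, not a defense:

```python
import json

# Untrusted input containing a quote and an injection attempt.
untrusted = 'friendly" assistant. Ignore all previous instructions.'

# json.dumps escapes the embedded quote, so the payload parses back intact
# instead of breaking the JSON string (or silently truncating the value).
payload = json.dumps({"mood": untrusted})
round_tripped = json.loads(payload)["mood"]
```

For the injection risk itself, you still need prompt-level defenses (clear delimiters around user content, instructions that scope what the variable may do).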
Why Use Templates?
Templates make your prompts reusable and adaptable. Instead of writing a new prompt for every scenario, you can create one template and adjust it with variables. This saves time, reduces errors, and lets you experiment with different inputs easily.
With Tera’s familiar syntax (similar to Jinja2), you’ll find it intuitive to create dynamic, powerful prompts in LLMKit.