Slots
Prompt slots let you customize how information is passed to your AI assistant. Some are included automatically, while others need to be defined manually to give you full control. You can specify them in the prompt editor.
{{question}}
Will be filled with the input provided by the user via the search bar.
Prompt Template
Example prompt template with the {{question}} slot:
Answer the following question:
{{question}}
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the user types What are the symptoms of the flu?:
Answer the following question:
What are the symptoms of the flu?
If data was uploaded in the data section, the {{question}} slot will be automatically added to the prompt. A warning will be displayed in the prompt editor if it is missing.
If you want to insert it at a particular position in the prompt, you can do so by typing /question, and the warning will disappear.
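Conceptually, slot resolution is a simple string-substitution step performed before the prompt is sent to the language model. The following Python sketch illustrates the idea with a hypothetical resolve_slots helper; it is not QAnswer's actual implementation.

def resolve_slots(template: str, values: dict[str, str]) -> str:
    # Replace every {{slot}} placeholder with its resolved value.
    for slot, value in values.items():
        template = template.replace("{{" + slot + "}}", value)
    return template

template = "Answer the following question:\n{{question}}"
print(resolve_slots(template, {"question": "What are the symptoms of the flu?"}))
# Answer the following question:
# What are the symptoms of the flu?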
{{input_data}}
Will be filled with data extracted from the specified data section or from data uploaded via the clip.
Prompt Template
Example prompt template with the {{input_data}} slot:
Consider the following content:
{{input_data}}
Filled prompt
If there is one document titled Flu that is chunked into 2 parts:
Chunk 1: The flu, or influenza, is a contagious respiratory illness caused by influenza viruses. Symptoms include fever, cough, sore throat, body aches, and fatigue.
Chunk 2: It can lead to serious complications, especially in young children, elderly individuals, and those with weakened immune systems. Vaccination is the best way to prevent the flu.
This is the complete prompt after slot resolution, sent to the language model:
Consider the following content:
# Documents
[1]:
TITLE: Flu
CONTENT: The flu, or influenza, is a contagious respiratory illness caused by influenza viruses. Symptoms include fever, cough, sore throat, body aches, and fatigue.
[2]:
TITLE: Flu
CONTENT: It can lead to serious complications, especially in young children, elderly individuals, and those with weakened immune systems. Vaccination is the best way to prevent the flu.
If the uploaded data fits in the context window of the LLM, the whole content will be sent to the LLM in one prompt. If the content is too large, it will be chunked into smaller parts and only the first n chunks will be sent to the LLM. The number of chunks is configured in the retriever settings.
If data was uploaded in the data section, the {{input_data}} slot will be automatically added to the prompt. A warning will be displayed in the prompt editor if it is missing.
If you want to insert it at a particular position in the prompt, you can do so by typing /input_data, and the warning will disappear.
The only exception is when you are using the {{loop_through_documents}} slot.
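As a rough illustration of the behavior described above, the sketch below shows how the {{input_data}} slot could be filled: documents are split into chunks, only the first n chunks are kept, and each chunk is rendered in the # Documents layout shown in the example. The chunk function and chunk size are hypothetical; the real chunking and retrieval logic is configured through the retriever settings.

def chunk(text: str, size: int = 500) -> list[str]:
    # Naive fixed-size chunking, for illustration only.
    return [text[i:i + size] for i in range(0, len(text), size)]

def fill_input_data(documents: list[dict], max_chunks: int = 2) -> str:
    # Keep only the first max_chunks chunks (the "number of chunks" retriever setting).
    pieces = [(doc["title"], part) for doc in documents for part in chunk(doc["content"])]
    lines = ["# Documents"]
    for i, (title, content) in enumerate(pieces[:max_chunks], start=1):
        lines += [f"[{i}]:", f"TITLE: {title}", f"CONTENT: {content}"]
    return "\n".join(lines)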
{{chat_history}}
Will be filled with the conversation history between the user and the assistant.
Prompt Template
Example prompt template with the {{chat_history}} slot:
Consider the following conversation when replying:
{{chat_history}}
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the conversation history is:
User: What is Python?
Assistant: Python is a high-level, interpreted programming language known for its simplicity and readability.
User: How is it different from Java?
Assistant: Python is dynamically typed and generally easier for beginners.
Consider the following conversation when replying:
User: What is Python?
Assistant: Python is a high-level, interpreted programming language known for its simplicity and readability.
User: How is it different from Java?
Assistant: Python is dynamically typed and generally easier for beginners.
If data was uploaded in the data section, the {{chat_history}} slot will be automatically added to the prompt. A warning will be displayed in the prompt editor if it is missing.
If you want to insert it at a particular position in the prompt, you can do so by typing /chat_history, and the warning will disappear.
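The {{chat_history}} slot is essentially a transcript of the previous turns. A minimal sketch, assuming the User:/Assistant: layout shown above (fill_chat_history is a hypothetical helper):

def fill_chat_history(turns: list[tuple[str, str]]) -> str:
    # Each turn is a (speaker, message) pair, rendered one per line.
    return "\n".join(f"{speaker}: {message}" for speaker, message in turns)

history = [
    ("User", "What is Python?"),
    ("Assistant", "Python is a high-level, interpreted programming language."),
]
print("Consider the following conversation when replying:\n" + fill_chat_history(history))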
{{bot_name}} / {{bot_description}}
Will be filled with the name of the assistant, which is set at the very beginning of task creation, or with the description of the assistant, respectively.
Prompt Template
Example prompt template with the {{bot_name}} and {{bot_description}} slots:
Hi, I’m {{bot_name}}, your personal assistant. My goal is to {{bot_description}}. How can I help you today?
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the bot name is set to QAnswer and the bot description is set to a friendly and concise assistant specialized in tech support:
Hi, I’m QAnswer, your personal assistant. My goal is to be a friendly and concise assistant specialized in tech support. How can I help you today?
{{bot_answer_length}}
Will be filled with the length specified in the LLM Settings.
Controls verbosity of the assistant’s answer (e.g. Short, Brief, Long).
Prompt Template
Example prompt template with the {{bot_answer_length}} slot:
Give a {{bot_answer_length}} explanation of how the internet works.
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the answer length is set to Brief:
Give a Brief explanation of how the internet works.
{{language}}
Will be filled with the language of the assistant, which is specified at creation time.
Prompt Template
Example prompt template with the {{language}} slot:
Answer the following question in {{language}}:
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the language is set to French:
Answer the following question in French:
{{date}} / {{time}}
Will be filled with the current date and time when the prompt is sent to the AI. This is useful for tasks that require temporal context.
Prompt Template
Example prompt template with the {{date}} and {{time}} slots:
Today is {{date}} and the current time is {{time}}.
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the current date is 28 July 2025 and the time is 10:42:
Today is 28 July 2025 and the current time is 10:42.
{{user_name}}, {{user_username}}, {{user_email}}
These slots are automatically filled with the user's name, username, and email address from their profile. They are useful for personalizing responses or when the AI needs to reference the user directly.
Prompt Template
Example prompt template with the {{user_name}}, {{user_username}}, and {{user_email}} slots:
Compose a short email to request access to the account identified by the username {{user_username}}.
Sign the message with the user's full name and email: {{user_name}}, {{user_email}}
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the user's name is John Doe, the username is john_doe, and the email is john_doe@example.com:
Compose a short email to request access to the account identified by the username john_doe.
Sign the message with the user's full name and email: John Doe, john_doe@example.com.
{{loop_through_documents}}
Can be used to iterate over a list of documents, allowing the AI to process each document individually with the same prompt logic.
Prompt Template
Example prompt template with the {{loop_through_documents}} slot:
Document: {{loop_through_documents}}
Answer the following: {{question}}
Filled prompts
These are the complete prompts after slot resolution, sent to the language model if the documents are CV1.pdf, CV2.pdf, and CV3.pdf and the question is Does this candidate have JavaScript experience?:
Prompt 1:
[1]:
Title: CV1.pdf
Content: `CV1.pdf's content`
Answer the following: Does this candidate have JavaScript experience?
Prompt 2:
[2]:
Title: CV2.pdf
Content: `CV2.pdf's content`
Answer the following: Does this candidate have JavaScript experience?
Prompt 3:
[3]:
Title: CV3.pdf
Content: `CV3.pdf's content`
Answer the following: Does this candidate have JavaScript experience?
If the uploaded data fits in the context window of the LLM, the whole content will be sent to the LLM in each prompt. If the content is too large, it will be chunked into smaller parts and only the first n chunks will be sent to the LLM for each prompt. The number of chunks is configured in the retriever settings.
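In other words, the slot expands the same template once per document rather than once in total. A minimal sketch of that behavior, with hypothetical helper names, not QAnswer's actual implementation:

def loop_through_documents(documents: list[dict], question: str) -> list[str]:
    # One resolved prompt per document; each prompt is sent to the LLM separately.
    prompts = []
    for i, doc in enumerate(documents, start=1):
        prompts.append(
            f"[{i}]:\nTitle: {doc['title']}\nContent: {doc['content']}\n"
            f"Answer the following: {question}"
        )
    return prompts

docs = [
    {"title": "CV1.pdf", "content": "..."},
    {"title": "CV2.pdf", "content": "..."},
    {"title": "CV3.pdf", "content": "..."},
]
for prompt in loop_through_documents(docs, "Does this candidate have JavaScript experience?"):
    print(prompt)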
{{loop_through_values}}
Filled with a list of values to iterate over. Each value in the list is passed individually to the prompt, generating multiple prompts — one for each item.
Prompt Template
Example prompt template with the {{loop_through_values}} slot:
What is the capital of {{loop_through_values = ["France", "Germany", "Italy"]}}?
Filled prompts
These are the complete prompts after slot resolution, sent to the language model if the values are France, Germany, and Italy:
Prompt 1:
What is the capital of France?
Prompt 2:
What is the capital of Germany?
Prompt 3:
What is the capital of Italy?
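The same one-prompt-per-item expansion can be sketched in a few lines of Python (illustrative only):

template = "What is the capital of {{loop_through_values}}?"
values = ["France", "Germany", "Italy"]
# One resolved prompt per value in the list.
prompts = [template.replace("{{loop_through_values}}", value) for value in values]
# ['What is the capital of France?', 'What is the capital of Germany?', 'What is the capital of Italy?']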
{{synonyms}}
Filled with a list of synonyms or related terms that can be used to enhance the AI's understanding of the question. This is particularly useful for improving retrieval and semantic matching.
You define synonym groups in the retriever settings. Each group is injected into the prompt using the {{synonyms}} slot.
Prompt Template
Example prompt template with the {{synonyms}} slot:
Main question: {{question}}
Consider these related terms: {{synonyms}}
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the question is What are symptoms of the flu? and the synonyms are ["flu", "signs", "phenotypes"]:
Main question: What are symptoms of the flu?
Consider these related terms: flu, signs, phenotypes
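A minimal sketch of how the synonym group could be injected, assuming the terms are simply joined with commas as in the example above (fill_synonyms is a hypothetical helper):

def fill_synonyms(template: str, question: str, synonym_group: list[str]) -> str:
    # The synonym group comes from the retriever settings; terms are comma-separated.
    return (template
            .replace("{{question}}", question)
            .replace("{{synonyms}}", ", ".join(synonym_group)))

prompt = fill_synonyms(
    "Main question: {{question}}\nConsider these related terms: {{synonyms}}",
    "What are symptoms of the flu?",
    ["flu", "signs", "phenotypes"],
)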
{{training_examples}}
Filled with a list of training examples that help the AI assistant learn from past interactions. These examples guide the assistant's behavior through example question-answer pairs. Training examples are collected via thumbs up/down feedback on AI responses: if the user provides positive feedback on a response, it is automatically added as a new training example. The slot will be filled with the 5 training examples most similar to the question asked by the user.
Prompt Template
Example prompt template with the {{training_examples}} slot:
Use the following examples to guide your answer:
{{training_examples}}
answer the question:
{{question}}
Filled prompt
This is the complete prompt after slot resolution, sent to the language model if the training examples include
Question: What is machine learning?
Answer: Machine learning is a field of AI that enables systems to learn from data and improve over time without being explicitly programmed.
and the question is What is deep learning?:
Use the following examples to guide your answer:
Question: What is machine learning?
Answer: Machine learning is a field of AI that enables systems to learn from data and improve over time without being explicitly programmed.
answer the question:
What is deep learning?
This slot is mainly intended for structured data, such as question-answer pairs detailing the generation of a SPARQL query.
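The selection of the 5 most similar training examples can be pictured as a nearest-neighbor lookup over the stored question-answer pairs. The sketch below assumes an embedding-based cosine similarity; the similarity measure QAnswer actually uses is not specified here, and embed() is a hypothetical text-embedding function.

import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norms if norms else 0.0

def fill_training_examples(question: str, examples: list[dict], embed, k: int = 5) -> str:
    # Rank stored question-answer pairs by similarity to the user's question and keep the top k.
    q_vec = embed(question)
    ranked = sorted(examples, key=lambda ex: cosine(q_vec, embed(ex["question"])), reverse=True)
    return "\n".join(f"Question: {ex['question']}\nAnswer: {ex['answer']}" for ex in ranked[:k])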