Slots

Prompt slots control how information is passed to the AI assistant. Some are included automatically; others must be defined manually. Specify them in the prompt editor.


{{question}}

Filled with the user's input from the search bar.

Prompt Template

Example prompt template with the {{question}} slot:

Answer the following question:
{{question}}

Filled prompt

Complete prompt after slot resolution, sent to the model when the user types What are the symptoms of the flu?:

Answer the following question:
What are the symptoms of the flu?
Warning
If data was uploaded in the data section, the {{question}} slot is automatically added to the prompt. A warning is displayed in the prompt editor if the slot is missing. To insert it at a specific position in the prompt, type /question; the warning will then disappear.
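Conceptually, slot resolution is a simple placeholder substitution. The following is an illustrative sketch only (the platform's actual resolution logic is internal; the function name is an assumption):

```python
# Illustrative sketch: replace each {{slot}} placeholder with its value.
# The real slot-resolution logic is internal to the platform.
def fill_prompt(template: str, slots: dict) -> str:
    for name, value in slots.items():
        template = template.replace("{{" + name + "}}", value)
    return template

template = "Answer the following question:\n{{question}}"
prompt = fill_prompt(template, {"question": "What are the symptoms of the flu?"})
# prompt == "Answer the following question:\nWhat are the symptoms of the flu?"
```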

{{input_data}}

Filled with data from the specified data section or from data uploaded via the paperclip icon.

Prompt Template

Consider the following content:

{{input_data}}

Filled prompt

If there is one document titled Flu that is chunked into two parts:

Consider the following content:

# Documents
[1]:
TITLE: Flu
CONTENT: The flu, or influenza, is a contagious respiratory illness caused by influenza viruses. Symptoms include fever, cough, sore throat, body aches, and fatigue.
[2]:
TITLE: Flu
CONTENT: It can lead to serious complications, especially in young children, elderly individuals, and those with weakened immune systems. Vaccination is the best way to prevent the flu.
Info
If the uploaded data fits in the context window of the LLM, the whole content is sent to the LLM in one prompt. If the content is too large, it is chunked into smaller parts and only the first n chunks are sent to the LLM. The number of chunks is configured in the retriever settings.
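The chunking rule above can be sketched as follows. This is a hypothetical approximation: character counts stand in for tokens, and all limits are assumptions (in the real system, n comes from the retriever settings):

```python
# Sketch of the chunking behaviour described above. Character counts stand
# in for tokens; all limits here are assumptions, not the real settings.
def build_input_data(content: str, context_limit: int, chunk_size: int, n: int) -> list[str]:
    if len(content) <= context_limit:
        return [content]  # fits the context window: sent in one prompt
    # too large: split into chunks and keep only the first n
    chunks = [content[i:i + chunk_size] for i in range(0, len(content), chunk_size)]
    return chunks[:n]
```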
Warning
If data was uploaded in the data section, the {{input_data}} slot is automatically added to the prompt. A warning is displayed in the prompt editor if the slot is missing. To insert it at a specific position in the prompt, type /input_data; the warning will then disappear. The only exception is when you are using the {{loop_through_documents}} slot.

{{chat_history}}

Filled with the conversation history between the user and the assistant.

Prompt Template

Consider the following conversation when replying:

{{chat_history}}

Filled prompt

Consider the following conversation when replying:

User: What is Python?
Assistant: Python is a high-level, interpreted programming language known for its simplicity and readability.
User: How is it different from Java?
Assistant: Python is dynamically typed and generally easier for beginners.
Warning
If data was uploaded in the data section, the {{chat_history}} slot is automatically added to the prompt. A warning is displayed in the prompt editor if the slot is missing. To insert it at a specific position in the prompt, type /chat_history; the warning will then disappear.
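The filled prompt above suggests the history is rendered as alternating "User:"/"Assistant:" lines. A minimal sketch, assuming that format (the real rendering logic is internal):

```python
# Sketch: render conversation turns in the "User:"/"Assistant:" format
# shown in the filled prompt above. An assumption, not the real renderer.
def format_chat_history(turns: list[tuple[str, str]]) -> str:
    return "\n".join(f"{role}: {text}" for role, text in turns)

history = format_chat_history([
    ("User", "What is Python?"),
    ("Assistant", "Python is a high-level, interpreted programming language."),
])
```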

{{bot_name}} / {{bot_description}}

Filled with the assistant name and description, set at creation time.

Prompt Template

Hi, I'm {{bot_name}}, your personal assistant. My goal is to {{bot_description}}. How can I help you today?

Filled prompt

Hi, I'm QAnswer, your personal assistant. My goal is to be a friendly and concise assistant specialized in tech support. How can I help you today?

{{bot_answer_length}}

Filled with the length value from LLM Settings. Controls verbosity (e.g. Short, Brief, Long).

Prompt Template

Give a {{bot_answer_length}} explanation of how the internet works.

Filled prompt

Give a Brief explanation of how the internet works.

{{language}}

Filled with the assistant language specified at creation time.

Prompt Template

Answer the following question in {{language}}:

Filled prompt

Answer the following question in French:

{{date}} / {{time}}

Filled with the current date and time at the moment the prompt is sent. Useful for tasks requiring temporal context.

Prompt Template

Today is {{date}} and the current time is {{time}}.

Filled prompt

Today is 28 July 2025 and the current time is 10:42.

{{user_name}}, {{user_username}}, {{user_email}}

Automatically filled with the user's name, username, and email from their profile. Use these to personalize responses or reference the user directly.

Prompt Template

Compose a short email to request access to the account identified by the username {{user_username}}.
Sign the message with the user's full name and email: {{user_name}}, {{user_email}}

Filled prompt

Compose a short email to request access to the account identified by the username john_doe.
Sign the message with the user's full name and email: John Doe, john_doe@example.com.

{{loop_through_documents}}

Iterates over a list of documents, processing each one individually with the same prompt logic.

Prompt Template

Document: {{loop_through_documents}}
Answer the following: {{question}}

Filled prompts

These are the complete prompts after slot resolution, sent to the language model if the documents are CV1.pdf, CV2.pdf, and CV3.pdf, and the question is Does this candidate have JavaScript experience?:

Prompt 1:
[1]:
  Title: CV1.pdf
  Content: CV1.pdf's content
Answer the following: Does this candidate have JavaScript experience?

Prompt 2:
[2]:
  Title: CV2.pdf
  Content: CV2.pdf's content
Answer the following: Does this candidate have JavaScript experience?

Prompt 3:
[3]:
  Title: CV3.pdf
  Content: CV3.pdf's content
Answer the following: Does this candidate have JavaScript experience?
Info
If the uploaded data fits in the context window of the LLM, the whole content is sent to the LLM in each prompt. If the content is too large, it is chunked into smaller parts and only the first n chunks are sent to the LLM for each prompt. The number of chunks is configured in the retriever settings.
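The per-document iteration can be sketched as below. The document block format is taken from the filled prompts above; the function signature and variable names are assumptions:

```python
# Sketch of the per-document loop: one prompt per document, same template.
# The "[i]: / Title: / Content:" block format mirrors the filled prompts
# shown above; everything else here is an assumption.
def loop_through_documents(template: str, docs: list[tuple[str, str]], question: str) -> list[str]:
    prompts = []
    for i, (title, content) in enumerate(docs, start=1):
        block = f"[{i}]:\n  Title: {title}\n  Content: {content}"
        prompt = template.replace("{{loop_through_documents}}", block)
        prompts.append(prompt.replace("{{question}}", question))
    return prompts
```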

{{loop_through_values}}

Filled with a list of values to iterate over. Each value in the list is passed individually to the prompt, generating multiple prompts — one for each item.

Prompt Template

What is the capital of {{loop_through_values = ["France", "Germany", "Italy"]}}?

Filled prompts

Prompt 1:
What is the capital of France?

Prompt 2:
What is the capital of Germany?

Prompt 3:
What is the capital of Italy?
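The expansion above can be sketched as follows. Parsing the inline `{{loop_through_values = [...]}}` syntax this way is an assumption based on the example:

```python
import json
import re

# Sketch: expand an inline value list into one prompt per value. How the
# platform actually parses this syntax is an assumption.
def loop_through_values(template: str) -> list[str]:
    m = re.search(r"\{\{loop_through_values\s*=\s*(\[.*?\])\s*\}\}", template)
    values = json.loads(m.group(1))  # e.g. ["France", "Germany", "Italy"]
    return [template[:m.start()] + v + template[m.end():] for v in values]
```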

{{synonyms}}

Filled with a list of synonyms or related terms that can be used to enhance the AI's understanding of the question. This is particularly useful for improving retrieval and semantic matching. You define synonym groups in the retriever settings. Each group is injected into the prompt using the {{synonyms}} slot.

Prompt Template

Main question: {{question}}
Consider these related terms: {{synonyms}}

Filled prompt

Main question: What are symptoms of the flu?
Consider these related terms: symptoms, signs, phenotypes

{{feedback_examples}}

Filled with a list of feedback examples that help the AI assistant learn from past interactions. These examples guide the assistant's behavior through example question-answer pairs, collected via thumbs up/down feedback on AI responses. When positive feedback is given, the response can be added as a feedback example by clicking the "Train" button in the interface. The slot is filled with the 5 most similar feedback examples based on the question asked by the user.

Prompt Template

Use the following examples to guide your answer:
{{feedback_examples}}
answer the question:
{{question}}

Filled prompt

Use the following examples to guide your answer:
Question: What is machine learning?
Answer: Machine learning is a field of AI that enables systems to learn from data and improve over time without being explicitly programmed.
answer the question:
What is deep learning?
Info
Mainly intended for structured data, such as question-answer pairs detailing the generation of a SPARQL query.
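Selecting the five most similar examples can be sketched with a naive word-overlap score. The real similarity measure is not documented here (it is presumably embedding-based); treat this purely as an approximation of the described behaviour:

```python
# Illustrative sketch: rank stored feedback examples by word overlap with
# the user's question and keep the top k. The real similarity measure is
# not documented here; this is an assumption.
def top_feedback_examples(question: str, examples: list[dict], k: int = 5) -> list[dict]:
    q_words = set(question.lower().split())
    def overlap(example: dict) -> int:
        return len(q_words & set(example["question"].lower().split()))
    return sorted(examples, key=overlap, reverse=True)[:k]
```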