Prompts

Prompts are the starting point of every conversation: the input that steers what an LLM generates. They range from a short phrase to several paragraphs, and their quality directly shapes the quality of the response. In LangDB, creating a prompt and attaching it to a model is as straightforward as writing a SQL query.

Creating a Prompt

To create a prompt explicitly, use the CREATE PROMPT SQL command.


CREATE PROMPT openai_prompt (
    system "You are a helpful assistant. Your task is to answer the user's query as accurately as possible."
);

That's all it takes to create a prompt. For additional parameters, refer to the CREATE PROMPT syntax reference.

Using a Prompt

Once you have created a prompt, you can attach it to a model to generate responses. You can maintain multiple prompts, each covering a different use case, and attach whichever one you need when creating a model.

CREATE MODEL qa_bot(
    ...
    prompt_name = 'openai_prompt',
    ...
);
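
Once the model is created, you can query it directly from SQL. The snippet below is only a sketch: it assumes models are invoked like table functions and uses an illustrative question, so check the model querying documentation for the exact call syntax in your LangDB version.

-- Sketch (assumed invocation syntax): ask the newly created model a question.
SELECT *
FROM qa_bot('What is a prompt in LangDB?');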

Deleting a Prompt

If you want to delete a prompt, use the DROP PROMPT SQL command. You can also append the CASCADE keyword to remove dependent models as well, as shown below.

DROP PROMPT openai_prompt;
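
For example, to drop the prompt together with any models that depend on it:

DROP PROMPT openai_prompt CASCADE;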

Embedded Prompts

Alternatively, instead of creating a prompt separately, you can embed a prompt directly while creating the model itself.

For example:

CREATE MODEL chat_assistant(
    input
) ENGINE = OpenAI(api_key = 'sk-proj-xxxxxxxxxxxxxxxxxxx', model_name = 'gpt-3.5-turbo', temperature = 0.3)
PROMPT ( system "You are a helpful assistant. Your task is to answer the user's query as accurately as possible.
Question: {{input}}")
SETTINGS retries = 2;
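
With an embedded prompt, the model's input column is substituted into {{input}} at query time. The call below is a sketch: the question is illustrative and, as above, the exact invocation syntax may vary by LangDB version.

-- Sketch: the positional argument is bound to the model's `input` column,
-- which fills {{input}} in the embedded prompt.
SELECT *
FROM chat_assistant('How do I attach a prompt to a model?');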

Prompting Best Practices

  • Be Specific: The more specific your prompt, the more targeted the response.
  • Use Examples: Including a few examples in your prompt helps the model understand the expected output.
  • Use Keywords: Include keywords relevant to your use case to steer the response.
  • Use Formatting: Structure your prompt with formatting such as bullet points or numbered steps.
  • Use Context: Provide relevant context so the model can interpret the query correctly.

Helpful Prompt Templates

Below are a few prompts that can help you get better responses from your model. You can use these to get started with LangDB.

QA Prompt

Use the following prompt for a question-answering model on your data sources.

You are an advanced question-answering AI assistant. Your task is to provide accurate, concise, and helpful responses to user questions. If the question is unclear, politely ask for clarification. If the question requires step-by-step instructions, provide them in a logical and easy-to-follow manner. Be friendly and professional in your tone.
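
To use this template in LangDB, you can register it with CREATE PROMPT as shown earlier. The prompt name qa_prompt below is just an example.

CREATE PROMPT qa_prompt (
    system "You are an advanced question-answering AI assistant. Your task is to provide accurate, concise, and helpful responses to user questions. If the question is unclear, politely ask for clarification. If the question requires step-by-step instructions, provide them in a logical and easy-to-follow manner. Be friendly and professional in your tone."
);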

ReAct Prompt

ReAct is a prompting paradigm that integrates reasoning and acting with LLMs. It prompts LLMs to produce verbal reasoning traces and actions for a given task. This approach enables the system to engage in dynamic reasoning, allowing it to create, maintain, and adjust plans for action. Additionally, it facilitates interaction with external environments to incorporate additional information into the reasoning process.

Use the following format:
Question: the input question you must answer
Thought: you should always think about what to do
Action: the action to take, should be one of [{{tool_names}}]
Action Input: the input to the action
Observation: the result of the action
... (this Thought/Action/Action Input/Observation can repeat N times)
Thought: I think I have enough information to answer the question. Based on the data retrieved, I can now answer the question.
Final Answer: the final answer to the original input question with data to support it

Begin!
Question: {{input}}
Thought: I need to find the answer to the question.

Sentiment Classification Prompt

Use the following prompt for a sentiment classification model on your textual data.

Classify the text into neutral, negative, or positive
Text: {{input}}
Sentiment:
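
As a sketch of how this template could be wired up, the model below embeds it directly, following the embedded prompt syntax shown above. The model name, engine parameters, and API key placeholder are illustrative; adjust them to your own setup.

CREATE MODEL sentiment_classifier(
    input
) ENGINE = OpenAI(api_key = 'sk-proj-xxxxxxxxxxxxxxxxxxx', model_name = 'gpt-3.5-turbo', temperature = 0)
PROMPT ( system "Classify the text into neutral, negative, or positive
Text: {{input}}
Sentiment:")
SETTINGS retries = 2;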