
Prompt Role Support

System, user, and assistant prompts are crucial for effectively utilizing a Large Language Model. These prompts work together to create a coherent and functional conversational flow.

Nitro enables developers to configure dialogs and implement advanced prompt engineering, such as few-shot learning.

System Prompt

  • Definition: Sets up the assistant's behavior.
  • Example: pre_prompt: "You are a Pirate"

User Prompt

  • Definition: Requests or comments directed towards the assistant, forming the conversation's core.
  • Example: user_prompt: "USER:"

Assistant Prompt

  • Definition: Responses generated by the assistant, including stored responses or developer-provided examples.
  • Example: ai_prompt: "ASSISTANT:"
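
To see how the three roles fit together, here is a minimal Python sketch of prefix-style prompt assembly. The `build_prompt` helper and the simple newline-joined template are illustrative assumptions; the exact template Nitro applies depends on the model you load.

```python
def build_prompt(pre_prompt, system_prompt, user_prompt, ai_prompt, user_message):
    # Concatenate the behavior setup and role prefixes around one user turn,
    # ending with the assistant prefix so the model continues from there.
    return (
        f"{pre_prompt}\n"
        f"{system_prompt}\n"
        f"{user_prompt} {user_message}\n"
        f"{ai_prompt}"
    )

# Using the example values from this page.
prompt = build_prompt(
    pre_prompt="You are a Pirate",
    system_prompt="ASSISTANT'S RULE: ",
    user_prompt="USER:",
    ai_prompt="ASSISTANT:",
    user_message="Hello, who is your captain?",
)
print(prompt)
```

The final `ai_prompt` prefix is left open-ended on purpose: the model's completion after "ASSISTANT:" becomes the assistant's reply.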

Example usage

To illustrate, let's create a "Pirate assistant":

NOTE: "ai_prompt", "user_prompt" and "system_prompt" are prefixes indicating the role. Configure them based on your model.

Prompt Configuration
curl http://localhost:3928/inferences/llamacpp/loadmodel \
  -H 'Content-Type: application/json' \
  -d '{
    "ctx_len": 128,
    "ngl": 100,
    "pre_prompt": "You are a Pirate. Using drunk language with a lot of Arr...",
    "system_prompt": "ASSISTANT'\''S RULE: ",
    "user_prompt": "USER:",
    "ai_prompt": "ASSISTANT: "
  }'
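
Building the same configuration from a Python dict sidesteps shell-quoting pitfalls such as the apostrophe in "ASSISTANT'S RULE: ". This is a sketch assuming a Nitro server at the default address; `load_model` is a hypothetical helper name and is defined but not called here.

```python
import json
from urllib import request

# Same settings as the curl example above.
payload = {
    "ctx_len": 128,
    "ngl": 100,
    "pre_prompt": "You are a Pirate. Using drunk language with a lot of Arr...",
    "system_prompt": "ASSISTANT'S RULE: ",
    "user_prompt": "USER:",
    "ai_prompt": "ASSISTANT: ",
}

def load_model(url="http://localhost:3928/inferences/llamacpp/loadmodel"):
    # POST the configuration to a running Nitro server (requires the server
    # to be up; not invoked in this sketch).
    req = request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return request.urlopen(req)

# Serializing through json.dumps guarantees valid JSON regardless of
# apostrophes or other characters in the prompt strings.
body = json.dumps(payload)
print(body)
```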

Testing the Assistant

Pirate Assistant
curl http://localhost:3928/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "Hello, who is your captain?"
      }
    ]
  }'
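
The same request body can be built programmatically; note that the `messages` array must be valid JSON, with no trailing comma after the last element. A minimal sketch of the single-turn request shown above:

```python
import json

# Request body for the chat completion call: one user turn.
chat_request = {
    "messages": [
        {"role": "user", "content": "Hello, who is your captain?"}
    ]
}

# json.dumps always emits valid JSON, so trailing-comma mistakes
# that are easy to make in hand-written curl payloads cannot occur.
print(json.dumps(chat_request, indent=2))
```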