Anaconda Desktop is currently available through a limited early access program. Anaconda AI Navigator provides many of the same capabilities and is available to all users.
System prompts vs. user prompts
System prompts and user prompts serve different purposes in shaping model behavior. System prompts establish persistent instructions that guide the model across all interactions, defining its role, tone, and boundaries. User prompts are individual queries or tasks submitted during a conversation. Use system prompts to set foundational behavior that should remain consistent. Use user prompts for specific, one-time requests. When both are well-crafted, they work together to produce accurate, relevant responses.
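In a chat-style API call, the system prompt and the user prompt travel as separate messages. The sketch below assumes an OpenAI-compatible Python client and a locally hosted model; the endpoint URL, API key, and model name are placeholders for illustration, not values defined by this documentation.

```python
# Minimal sketch: the system prompt sets persistent behavior, the user prompt
# carries the one-time request. Endpoint, key, and model name are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed local, OpenAI-compatible endpoint
    api_key="not-needed-locally",         # placeholder
)

messages = [
    # System prompt: persistent instructions applied to every turn
    {"role": "system", "content": (
        "You are a concise assistant for Python data questions. "
        "Answer in plain language and keep responses under 150 words."
    )},
    # User prompt: a specific, one-time request made during the conversation
    {"role": "user", "content": "How do I read a CSV file into a pandas DataFrame?"},
]

response = client.chat.completions.create(model="example-model", messages=messages)
print(response.choices[0].message.content)
```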
The CARE framework

When crafting a system prompt, use the CARE framework to ensure you’ve covered the essential components:

| Component | Description |
| --- | --- |
| Clear instruction | Define what the model should do |
| Additional context | Provide background information the model needs |
| Response format | Specify how output should be structured |
| Examples or constraints | Include samples of desired output or set limitations |
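As an illustration, the sketch below assembles a system prompt from the four CARE components. The documentation-assistant scenario and all of the wording are invented for illustration.

```python
# Hypothetical system prompt built from the CARE components.
care_prompt = "\n".join([
    # Clear instruction: define what the model should do
    "You answer questions about the team's internal Python style guide.",
    # Additional context: background information the model needs
    "The style guide covers code formatting, docstrings, and commit messages.",
    # Response format: specify how output should be structured
    "Reply with at most three bullet points, each a single sentence.",
    # Examples or constraints: set limitations on the output
    "If a question falls outside the style guide, say so rather than guessing.",
])
```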
System prompt best practices
An effective system prompt shapes how the model interacts with users, ensuring that responses are consistent and aligned with the intended behavior. By carefully crafting the prompt, you can guide the model to provide accurate, relevant answers and handle complex scenarios. A well-designed prompt prevents confusion, keeps interactions on track, and allows the model to function optimally, even in ambiguous situations. The sketch after this list shows how these practices can be combined in a single system prompt.

- Define the Role
  - Clearly specify what the model is expected to do in its interactions. This focuses the model’s behavior, ensuring that it generates relevant responses and reduces the likelihood of unclear or off-topic replies.
- Set the Tone and Style
  - Choose a tone that matches the context (technical, conversational, or professional) so the model’s replies align with audience expectations.
- Clarify Output Format
  - Specify how the model should present its responses. Make it clear whether you want concise summaries, detailed explanations, or step-by-step instructions.
- Set Constraints and Boundaries
  - Constraints and boundaries both limit model behavior, but they serve different purposes:
    - Boundaries define topics or content the model should avoid entirely. Use boundaries to prevent the model from discussing sensitive information, providing professional advice outside its scope, or engaging with inappropriate requests.
    - Constraints define requirements for how the model responds. Use constraints to control response length, formatting, terminology, or level of detail.
- Provide Examples
  - When you need consistent formatting or a specific style, include an example of the desired output in your system prompt. Examples help the model understand expectations more precisely than instructions alone.
- Use Delimiters
  - Delimiters help separate different parts of your prompt, making it easier for the model to distinguish instructions from reference content or examples. Common delimiters include `###`, `---`, and `"""`. Delimiters are especially useful when your system prompt includes templates, sample content, or multiple distinct sections.
- Account for Ambiguous Situations
  - Provide clear fallback instructions for handling unclear or incomplete input. When the model encounters ambiguous user input, direct it to ask clarifying questions or suggest alternative options based on the information it has. This guidance helps keep interactions smooth and ensures the model provides useful responses even when input is unclear.
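The sketch below combines the practices above in a single system prompt: a defined role and tone, an explicit output format, a length constraint, a boundary, `###` delimiters around an embedded example, and a fallback for ambiguous input. The support-bot scenario and every detail in it are invented for illustration.

```python
# Hypothetical system prompt demonstrating the best practices above.
SYSTEM_PROMPT = (
    # Role and tone
    "You are a support assistant for a command-line backup tool. "
    "Use a professional, friendly tone and plain language.\n\n"
    # Output format and a length constraint
    "Respond with a one-sentence summary followed by numbered steps, "
    "and keep the whole response under 200 words.\n\n"
    # Boundary: a topic to avoid entirely
    "Do not give advice on recovering data from physically damaged drives; "
    "refer those requests to the support team.\n\n"
    # Example of the desired output, set off with ### delimiters
    "### Example response\n"
    "Summary: You can schedule nightly backups from the settings panel.\n"
    "1. Open Settings and choose Schedule.\n"
    "2. Pick a time and save.\n"
    "###\n\n"
    # Fallback for ambiguous input
    "If a request is unclear or incomplete, ask one clarifying question "
    "before answering."
)
```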
Iterative refinement
Effective system prompts rarely emerge fully formed. Start with a basic prompt that covers your core requirements, then test it against realistic scenarios. Refine based on the results. When your prompt produces unexpected results, use the following guidance to identify and address the problem:

| Problem | Likely cause | Solution |
| --- | --- | --- |
| Responses are too vague | Insufficient detail in prompt | Add specific requirements and context |
| Responses are too long | No length constraints | Specify word count or format limits |
| Responses are too technical | Audience not defined | Indicate the user’s expertise level |
| Model fabricates information | No grounding in provided facts | Supply verified information and instruct the model to use only what’s provided |
| Formatting is inconsistent | No format specification | Define output structure or provide an example |
| Model goes off-topic | Role not clearly defined | Strengthen role definition and add boundaries |
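As a sketch of one refinement pass, suppose testing surfaced two of the problems above: responses ran too long and the model fabricated details. Assuming that scenario, the revision below adds a length constraint and grounds the model in supplied facts; the product details are invented for illustration.

```python
# First draft: too open-ended, so answers were long and sometimes fabricated.
DRAFT_PROMPT = "You are a helpful assistant that answers questions about our product."

# Verified facts supplied to the model (illustrative values only).
PRODUCT_FACTS = (
    "- The free tier allows 3 projects.\n"
    "- Paid plans start at $10 per month.\n"
)

REVISED_PROMPT = (
    "You are a support assistant that answers questions about our product.\n"
    # Fixes "responses are too long": add an explicit length constraint
    "Answer in no more than two sentences.\n"
    # Fixes "model fabricates information": restrict answers to supplied facts
    "Use only the facts listed between the ### markers; if the answer is not "
    "covered there, say you don't know.\n"
    "###\n" + PRODUCT_FACTS + "###"
)
```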