Creating the Optimal ChatGPT Prompt: Best Practices and Examples
Introduction
The quality of ChatGPT's responses depends largely on how well you formulate your prompt - that is, your input or question [1]. A precisely formulated prompt can help you get more accurate, relevant, and useful results. In this article, we'll show you how to create an optimal ChatGPT prompt. We'll present proven methods (best practices) recommended by official sources like OpenAI and Microsoft, and provide practical examples for various use cases. Our goal is to give you a guide for effectively "communicating" with ChatGPT - following the motto: you program the AI with words [1].
Best Practices for Effective ChatGPT Prompts
Formulate clearly, specifically and precisely
Make your request as unambiguous as possible to leave little room for interpretation. Both OpenAI and Microsoft emphasize that concrete instructions lead to more relevant and focused responses [2][3]. Avoid vague phrasing or overly general questions. Instead of a broad, open-ended request, state exactly what you want: name the topic, the scope, the target audience, and any constraints, as illustrated in the sketch below.
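To make the difference concrete, here is a minimal Python sketch contrasting a vague prompt with a specific one (the wording of both prompts is purely illustrative, not taken from OpenAI's or Microsoft's examples):

    # Vague: the model has to guess topic, scope, and audience.
    vague_prompt = "Tell me something about electric cars."

    # Specific: topic, length, audience, and focus are all stated explicitly.
    specific_prompt = (
        "Explain in about 150 words, for readers without a technical background, "
        "how the range of an electric car is affected by cold weather. "
        "Focus on the battery and the heating system."
    )

    print(specific_prompt)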
Include context and references
Provide the model with the necessary context it needs for a well-founded response. If your question relates to a specific text, data, or previous chat history, make this information available. OpenAI recommends clearly separating such additional content from the actual instruction, for example by using quotation marks or a clear keyword [2]. A prompt for summarization might, for instance, state the instruction first and then append the text to be summarized between triple quotation marks [1][2], as in the sketch below.
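The following Python sketch shows one way such a prompt could be sent via OpenAI's Chat Completions API, with the text to be summarized clearly set off by triple quotation marks. The model name and the placeholder article text are assumptions; adapt them to your own setup:

    from openai import OpenAI

    client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

    article = "...paste the article text here..."  # placeholder for the source text

    # Instruction first, then the reference text clearly delimited by triple quotes.
    prompt = (
        "Summarize the text below as a bullet point list of the most important points.\n\n"
        f'Text: """{article}"""'
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat-capable model works
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)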
Specify desired format and output
Describe in what format or style the response should come. Should the output be, for example, a list, a paragraph, a JSON object, or a poem? State this explicitly. OpenAI recommends making the desired output format clear - if necessary by including an example of it in the prompt [2]. You could, for instance, ask for the answer as a JSON object with named fields or as a bulleted list with a fixed number of entries; one way to do this is sketched below.
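Here is a sketch of a format-focused prompt: it names the exact fields and asks for JSON only, so the reply can be parsed directly. The topic, field names, and placeholder reply are illustrative:

    import json

    # The prompt states the output format explicitly: a JSON array with fixed field names.
    prompt = (
        "Name three books about artificial intelligence. "
        "Answer only with a JSON array in which each entry has the fields "
        '"title", "author" and "year". Do not add any text outside the JSON.'
    )

    # After sending this prompt (e.g. via the API call shown above), the reply
    # can be parsed directly.
    reply = '[{"title": "Example", "author": "Jane Doe", "year": 2021}]'  # placeholder reply
    books = json.loads(reply)
    print(books[0]["title"])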
Assign a role or persona in the ChatGPT prompt
It can be useful to have the model assume a specific role to steer the tone and perspective of the response [4]. You could start the prompt with a phrase like "You are an experienced [profession] ..." and then state the actual task; a sketch follows below.
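When using the API, the role is typically set in the system message; in the chat window you can simply put the same sentence at the start of your prompt. A sketch with an illustrative persona and task:

    # The system message sets the persona and tone; the user message carries the task.
    messages = [
        {
            "role": "system",
            "content": "You are an experienced travel guide who answers in a friendly, "
                       "concise tone and always mentions one insider tip.",
        },
        {
            "role": "user",
            "content": "Recommend three sights in Lisbon for a rainy afternoon.",
        },
    ]

    # These messages would be passed to client.chat.completions.create(...) as shown earlier.
    print(messages[0]["content"])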
Use examples (Few-Shot Prompts)
If a simple prompt (without examples) doesn't deliver the desired result, examples can help. The idea behind this: show the model what you expect by demonstrating one or more example inputs with the corresponding desired outputs in the prompt. This approach is called few-shot prompting. For example, for a complicated extraction task, you could first provide one or two example sentences plus the correctly extracted information, and then follow with a third sentence for which the model should generate the output. OpenAI shows an example where keywords are extracted from texts: first two example texts with matching keyword lists are given, then a third text followed by an empty "Keywords 3:" line that the model should complete [2]. Through such examples in the prompt, ChatGPT better understands what is desired. However, examples should be limited to the most important cases so the prompt doesn't become unnecessarily long; a sketch follows this paragraph.
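A sketch of a few-shot prompt for the keyword-extraction case: two worked examples establish the pattern, and the prompt ends with an empty "Keywords 3:" line for the model to complete. The example texts loosely follow those in OpenAI's guide [2]; the third text is a placeholder:

    # Two demonstrations followed by the unfinished third case (few-shot prompting).
    few_shot_prompt = (
        "Extract keywords from the texts below.\n\n"
        "Text 1: Stripe provides APIs that web developers can use to integrate "
        "payment processing into their websites and mobile applications.\n"
        "Keywords 1: Stripe, payment processing, APIs, web developers, websites, "
        "mobile applications\n\n"
        "Text 2: OpenAI has trained language models that are very good at "
        "understanding and generating text.\n"
        "Keywords 2: OpenAI, language models, text understanding, text generation\n\n"
        "Text 3: <insert the new text here>\n"
        "Keywords 3:"
    )

    print(few_shot_prompt)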
Break down complex tasks
If your request is too extensive or multi-part, consider breaking it down into smaller sub-steps [4]. You can first send one aspect of the task to ChatGPT, then check the response and build on it with follow-up questions. For example, you could first ask for a list of ideas and then ask for details about each idea, rather than demanding everything in a single prompt. OpenAI recommends dividing complex tasks into simpler, sequential steps [4]. This way, each individual request remains manageable and the AI can focus more specifically on each step. You also retain more control over the entire process and can make adjustments if needed.
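A sketch of this two-step approach: the first call asks only for ideas, and its (checked) answer is fed into a second, narrower prompt. The ask() helper is hypothetical and stands in for the API call shown earlier:

    def ask(prompt: str) -> str:
        """Hypothetical helper that sends one prompt to ChatGPT and returns the reply.
        In practice it would wrap client.chat.completions.create(...) as shown earlier;
        here it returns a placeholder so the sketch runs on its own."""
        return f"(model reply to: {prompt[:40]}...)"

    # Step 1: only collect ideas, nothing else.
    ideas = ask("List 5 blog post ideas about prompt writing, titles only.")

    # Step 2: build on the checked result with a narrower follow-up prompt.
    outline = ask(
        "Take the first idea from the list below and write a 5-point outline for it.\n\n"
        + ideas
    )
    print(outline)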
Enable step-by-step thinking
For tricky questions, it can help to explicitly instruct the model to work through the problem step by step before giving its final answer, an approach often called chain-of-thought prompting. A short addition such as "Think through the problem step by step and explain your reasoning before stating the result" is usually enough to make the answer more careful and easier to follow; a sketch follows below.
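A sketch of such a step-by-step instruction for a small reasoning task (the task itself is illustrative):

    # Asking the model to reason first and answer last tends to reduce careless errors.
    prompt = (
        "A train leaves at 09:40 and arrives at 13:15. How long does the journey take?\n"
        "First work through the calculation step by step, "
        "then state the final answer in the format 'Duration: X hours Y minutes'."
    )
    print(prompt)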
Formulate positively rather than negatively
Phrase instructions as positive action requests rather than prohibitions. OpenAI shows that a prompt consisting only of prohibitions ("do not do X", "never mention Y") is easy for the model to misinterpret, and that it works better to state what the model should do instead, including how it should behave when it cannot fulfill the request [2]. A sketch contrasting the two styles follows below.
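A sketch contrasting a prohibition-only instruction with a positively phrased one; both examples are illustrative and only loosely follow the customer-service example in OpenAI's guide [2] (the help URL is a placeholder):

    # Only says what NOT to do - the model has to guess the desired behavior.
    negative_prompt = (
        "Answer the customer's question. Do not ask for the user name or password. "
        "Do not repeat the question."
    )

    # Says what TO do instead, including the fallback behavior.
    positive_prompt = (
        "Answer the customer's question in two to three friendly sentences. "
        "If you need account details, refer the customer to the help article at "
        "www.example.com/help instead of asking for credentials."
    )
    print(positive_prompt)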
Iteratively refine and test the prompt
View prompting as an iterative process. Rarely will you achieve the perfect output on the first try. It's normal and recommended to gradually improve the prompt: rephrase your request, add details or remove unnecessary parts, and observe how the responses change. OpenAI points out that a change to the prompt may immediately show better results for some examples, but might perform worse on a broader test basis [4]. Therefore, it makes sense to try a prompt with several different inputs to ensure the results are consistently good. Also use the possibility in an ongoing ChatGPT conversation to ask follow-up questions or adjust the instruction without having to start from scratch - ChatGPT "remembers" the history. According to experts, you can thus gradually steer the response in the desired direction, for example by asking after an initial response for a shorter version, a different tone, or additional details.
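A sketch of this iterative refinement as it looks via the API, where the conversation history is passed along explicitly so that each follow-up only needs to describe the adjustment. The helper and the replies are placeholders:

    # The running message list is what lets ChatGPT "remember" earlier turns.
    history = [
        {"role": "user", "content": "Write a product description for a solar-powered garden lamp."}
    ]

    def ask_with_history(messages):
        """Placeholder for a call such as client.chat.completions.create(model=..., messages=messages)."""
        return "(model reply)"

    reply = ask_with_history(history)
    history.append({"role": "assistant", "content": reply})

    # Follow-up turns steer the result without starting over.
    for follow_up in [
        "Shorten this to three sentences.",
        "Make the tone more playful and add a call to action.",
    ]:
        history.append({"role": "user", "content": follow_up})
        reply = ask_with_history(history)
        history.append({"role": "assistant", "content": reply})

    print(reply)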
Anatomy of a good prompt: Structure following the "o1 Prompt"
A particularly structured approach is described by developer Swyx with the so-called "o1 Prompt" [6]. Here, a good prompt is divided into four logical sections - a concept that can help you formulate complex tasks for ChatGPT clearly and precisely:
- Goal: What exactly do you want? Describe precisely what you expect from the model - e.g. "List the 3 best hikes near San Francisco".
- Return Format: In what form should the response appear? (e.g. as a list with specific fields, as a table, JSON, short paragraphs etc.). The more detailed, the better.
- Warnings: Are there things the model should pay attention to? (e.g. "Ensure the locations actually exist and the time specifications are correct").
- Context Dump: Background info and personal preferences that help the model deliver more suitable results - such as previous experiences, limitations or special wishes.
This structure is particularly suitable for more complex tasks or when you expect very specific results. You can lay the four sections out as separate paragraphs or separate them with delimiters (e.g. "--") in the prompt. Tip: even if you only use 2 or 3 of the elements, the quality of the response often increases. A template sketch follows below.
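A sketch of the four sections as a reusable template: the goal and warnings repeat the examples above, while the return format and context dump values are illustrative:

    # Template with the four sections of the "o1 Prompt", separated by blank lines.
    O1_TEMPLATE = """\
    Goal:
    {goal}

    Return Format:
    {return_format}

    Warnings:
    {warnings}

    Context Dump:
    {context}
    """

    prompt = O1_TEMPLATE.format(
        goal="List the 3 best hikes near San Francisco.",
        return_format="For each hike: name, start point, length in km, and one sentence on why it is worth doing.",
        warnings="Ensure the locations actually exist and the time specifications are correct.",
        context="I hike with a dog, prefer coastal views, and have already done Lands End.",
    )
    print(prompt)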
Practical Application Examples
Summarizing a text
Suppose you want ChatGPT to summarize a longer article or paragraph. An effective prompt might look like this:
""
2. Additionally, the number of desired points (3-5) is precisely specified, which is more precise than e.g. "summarize briefly"2. ChatGPT can thus better extract the content and list the most important information concisely.
Creative content (story)
If you want ChatGPT to write a story or creative text, it's worth specifying the style and theme exactly: name the genre, the desired length, the tone, and any characters or plot elements that must appear. A prompt might, for example, ask for a humorous short story of about 300 words about a robot that learns to cook, told from the robot's perspective.
Code generation and help
If you use ChatGPT for programming (e.g. to generate code or explain errors), you should tailor the prompt to the developer perspective. A good approach is to nudge the model to present code directly in the response: OpenAI notes that a "leading word" such as import at the end of a prompt encourages the model to output the response directly as code [2]. Describe the desired function with clearly numbered requirements, then end the prompt with import (see the sketch below). ChatGPT will typically respond with a Python code block (including the def definition and, if needed, an import statement such as import math) and then provide an explanation in prose. This way you get both the requested code and its explanation.
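A sketch of such a developer prompt, following OpenAI's tip of ending with the leading word import so that the reply starts directly with Python code [2]; the concrete task is illustrative:

    # Numbered requirements plus a trailing "import" nudge the model to answer with code first.
    code_prompt = (
        "Write a simple Python function that\n"
        "1. asks me for a number in miles\n"
        "2. converts miles to kilometers\n"
        "Explain the code briefly afterwards.\n\n"
        "import"
    )
    print(code_prompt)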
Q&A with expert knowledge
Imagine you need an expert answer, for example in the medical field. Instead of simply asking the question in one short sentence, combine the techniques above: assign the model a suitable expert role (e.g. "You are an experienced general practitioner"), provide the relevant context such as symptoms, duration, and prior findings, and state how detailed and in what form you want the answer. The response will then be tailored to your situation instead of remaining generic.
Summary
In summary: An optimal ChatGPT prompt is unambiguous, detailed, and contains all relevant information so the model knows exactly what is required. By providing context, format specifications and possibly examples, you can steer the AI responses in the desired direction. The best practices presented here - validated by OpenAI, Microsoft and other experts - serve as a guide. Ultimately, practice in prompt writing improves results: Don't hesitate to try different formulations and learn from the model's reactions. Over time, you'll develop a feel for which prompt techniques are most effective to get precise and helpful responses from ChatGPT.
Sources
- [1] Effective Prompts for AI: The Essentials - MIT Sloan Teaching & Learning Technologies
- [2] Best practices for prompt engineering with the OpenAI API | OpenAI Help Center
- [3] Azure OpenAI Service - Azure OpenAI | Microsoft Learn
- [4] The Official ChatGPT-Prompt Engineering Guide from OpenAI Is Here
- [5] The art of the prompt: How to get the best out of generative AI - Source
- [6] latent.space: o1 Skill Issue – How to Write a Prompt