Ollama system prompt


I recently started using Ollama and was unimpressed by some models: they did not follow instructions, especially regarding their output format. I knew about the model's system prompt, but I thought it was fixed in the model.

Then I found out that you can change the system prompt at run time with the /set system command, and immediately most models responded as expected. That was so much better!
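For example, in an interactive ollama run session (the model name and prompt text are just illustrations):

$ ollama run llama2
>>> /set system You are a helpful assistant. Always answer in valid JSON.
>>> Why is the sky blue?

The new system prompt applies to the rest of the session; it replaces the one baked into the model's Modelfile.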

To set the system prompt with an API call, add the "system" parameter as described in the documentation:

curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "system": "This is your new system prompt"
