Ollama
I recently started using Ollama and was unimpressed by some models because they did not follow instructions, especially regarding their output format. I knew models had a system prompt, but I thought it was baked into the model.
Then I found out you can change the system prompt at runtime with the /set system command, and immediately most models responded as expected. That was so much better!
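For example, inside an interactive ollama run session, overriding the system prompt might look like this (llama2 is just a placeholder for whatever model you have pulled):

```
$ ollama run llama2
>>> /set system "Reply with a single short sentence."
>>> Why is the sky blue?
```

Every prompt after the /set system line is answered under the new instructions, for the rest of that session.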
To set the system prompt in an API call, add the "system" parameter to the request body, as described in the documentation:
curl http://localhost:11434/api/generate -d '{
"model": "llama2",
"prompt": "Why is the sky blue?",
"system": "This is your new system prompt"
}'
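The same request can be made from Python with just the standard library. This is a minimal sketch, assuming a local Ollama server on the default port and a pulled llama2 model; build_payload and generate are my own helper names, not part of any Ollama client library:

```python
import json
import urllib.request

# Default Ollama endpoint, same as in the curl example above
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model, prompt, system):
    # The "system" field overrides the model's built-in system prompt
    # for this request only.
    return {
        "model": model,
        "prompt": prompt,
        "system": system,
        "stream": False,  # ask for one JSON object instead of a stream
    }

def generate(model, prompt, system):
    # POST the payload and return the model's text response.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt, system)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("llama2", "Why is the sky blue?",
                   "Reply with a single short sentence."))
```

With "stream": False the server returns the whole answer in one JSON object, which keeps the client code short; drop it if you want the default streamed chunks.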