From df56f1ee5eb5f464b581402cb00c4fe7dd5de774 Mon Sep 17 00:00:00 2001
From: Jeffrey Morgan
Date: Mon, 19 Feb 2024 22:16:42 -0500
Subject: [PATCH] Update faq.md

---
 docs/faq.md | 22 ++++++++++++++++++++++
 1 file changed, 22 insertions(+)

diff --git a/docs/faq.md b/docs/faq.md
index 8f345776..b1883ce2 100644
--- a/docs/faq.md
+++ b/docs/faq.md
@@ -14,6 +14,28 @@ curl -fsSL https://ollama.com/install.sh | sh
 
 Review the [Troubleshooting](./troubleshooting.md) docs for more about using logs.
 
+## How can I specify the context window size?
+
+By default, Ollama uses a context window size of 2048 tokens.
+
+To change this when using `ollama run`, use `/set parameter`:
+
+```
+/set parameter num_ctx 4096
+```
+
+When using the API, specify the `num_ctx` parameter:
+
+```
+curl http://localhost:11434/api/generate -d '{
+  "model": "llama2",
+  "prompt": "Why is the sky blue?",
+  "options": {
+    "num_ctx": 4096
+  }
+}'
+```
+
 ## How do I configure Ollama server?
 
 Ollama server can be configured with environment variables.
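
As a sketch outside the patch itself: the curl request added above can be reproduced from Python's standard library. This is hypothetical client code for illustration, not part of the change; it only assumes the `/api/generate` endpoint and `num_ctx` option shown in the diff.

```python
import json

# Request body equivalent to the curl example in the patch.
# The "options" object carries per-request runtime parameters;
# num_ctx raises the context window from the 2048-token default.
payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "options": {
        "num_ctx": 4096,
    },
}

body = json.dumps(payload)
print(body)

# Sending it requires a running Ollama server, so the call is
# left commented out here:
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=body.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     for line in resp:  # the generate endpoint streams JSON lines
#         print(line.decode("utf-8").rstrip())
```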