From f5e3939220e9cd3d7a636708bc9df031ebfd4854 Mon Sep 17 00:00:00 2001
From: Jeffrey Morgan
Date: Thu, 25 Jul 2024 23:10:18 -0400
Subject: [PATCH] Update api.md (#5968)

---
 docs/api.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/api.md b/docs/api.md
index 0ab70383..2d4fe28f 100644
--- a/docs/api.md
+++ b/docs/api.md
@@ -418,6 +418,7 @@ Generate the next message in a chat with a provided model. This is a streaming e
 
 - `model`: (required) the [model name](#model-names)
 - `messages`: the messages of the chat, this can be used to keep a chat memory
+- `tools`: tools for the model to use if supported. Requires `stream` to be set to `false`
 
 The `message` object has the following fields:
 
@@ -432,7 +433,6 @@ Advanced parameters (optional):
 - `options`: additional model parameters listed in the documentation for the [Modelfile](./modelfile.md#valid-parameters-and-values) such as `temperature`
 - `stream`: if `false` the response will be returned as a single response object, rather than a stream of objects
 - `keep_alive`: controls how long the model will stay loaded into memory following the request (default: `5m`)
-- `tools`: external tools the model can use. Not all models support this feature.
 
 ### Examples
 
@@ -1286,4 +1286,4 @@ curl http://localhost:11434/api/embeddings -d '{
     0.8785552978515625, -0.34576427936553955, 0.5742510557174683, -0.04222835972905159, -0.137906014919281
   ]
 }
-```
\ No newline at end of file
+```
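
For context, a chat request using the `tools` field described by the added line might look like the sketch below. The model name and tool schema are illustrative rather than taken from the patch, and `stream` is set to `false` as the new parameter description requires.

```shell
# Illustrative sketch: one hypothetical function tool, with streaming disabled
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [
    { "role": "user", "content": "What is the weather in Toronto?" }
  ],
  "stream": false,
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string", "description": "The name of the city" }
          },
          "required": ["city"]
        }
      }
    }
  ]
}'
```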