
Ollama

  • Run models easily
  • Download, manage and import models

Install

pip install ollama

Example quickstart

import ollama
ollama.generate("./llama-7b-ggml.bin", "hi")

Reference

ollama.load

Load a model for generation

ollama.load("model name")

ollama.generate

Generate a completion

ollama.generate(model, "hi")
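
A minimal sketch combining load and generate, reusing the model path from the quickstart above. Whether an explicit load is required before generate, and what generate returns, is not documented here and is assumed:

import ollama

# Path to a local GGML model file, as in the quickstart above
model = "./llama-7b-ggml.bin"

# Load the model, then request a completion
# (treating the return value as the generated text is an assumption)
ollama.load(model)
completion = ollama.generate(model, "hi")
print(completion)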

ollama.models

List available local models

models = ollama.models()
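
Assuming models() returns an iterable of model names (the return type is not documented here), listing them might look like:

import ollama

# Print each locally available model (assumes an iterable of names)
for name in ollama.models():
    print(name)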

ollama.serve

Serve the Ollama HTTP server

Coming Soon

ollama.pull

Download a model

ollama.pull("huggingface.co/thebloke/llama-7b-ggml")

ollama.import

Import a model from a file

ollama.import("./path/to/model")

ollama.search

Search for compatible models that Ollama can run

ollama.search("llama-7b")

Future CLI

In the future, there will be an easy CLI for testing out models

ollama run huggingface.co/thebloke/llama-7b-ggml
> Downloading [================>          ] 66.67% (2/3) 30.2MB/s