# Ollama
- Run models easily
- Download, manage and import models
## Install

```
pip install ollama
```

## Example quickstart
```python
import ollama

# Download the model, load it, then generate a completion
model_name = "huggingface.co/thebloke/llama-7b-ggml"
ollama.pull(model_name)
ollama.load(model_name)
ollama.generate(model_name, "hi")
```
## Reference
### `ollama.load`
Load a model for generation
```python
ollama.load("model name")
```
### `ollama.generate`
Generate a completion
```python
ollama.generate(model, "hi")
```
### `ollama.models`
List available local models
```python
models = ollama.models()
```
### `ollama.serve`
Serve the Ollama HTTP server
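
The other helpers above are shown being called directly, so presumably the server is started the same way. A minimal sketch (the no-argument call is an assumption; this README does not show `serve`'s signature):

```python
import ollama

# Start the Ollama HTTP server (assumed to take no arguments and block
# until stopped; signature not confirmed by this README)
ollama.serve()
```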
## Coming Soon
### `ollama.pull`
Download a model
```python
ollama.pull("huggingface.co/thebloke/llama-7b-ggml")
```
### `ollama.import`
Import a model from a file
```python
ollama.import("./path/to/model")
```
### `ollama.search`
Search for compatible models that Ollama can run
```python
ollama.search("llama-7b")
```
## Future CLI
In the future, there will be an easy CLI for trying out models:
```
ollama run huggingface.co/thebloke/llama-7b-ggml
> Downloading [================> ] 66.67% (2/3) 30.2MB/s
```