Ollama API Usage Examples

This guide walks through practical Ollama API usage examples: calling the API with curl and integrating it into your applications with the official Python library.

Prerequisites

Install Ollama

curl -fsSL https://ollama.com/install.sh | sh

Check Ollama

ollama --version

Pull llama3.2 model

ollama pull llama3.2

Run llama3.2 model

ollama run llama3.2

How to use Ollama API

Access the Ollama API locally using curl


curl http://localhost:11434/api/generate -d '{ "model": "llama3.2", "prompt": "How are you today?"}'
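By default, `/api/generate` streams its reply as one JSON object per line, each carrying a `response` fragment and a `done` flag. The full reply can be reassembled by concatenating the fragments. A minimal sketch in Python, using illustrative sample data rather than a real server reply:

```python
import json

# Two sample lines of a streamed /api/generate reply (illustrative data,
# not an actual server response).
stream_lines = [
    '{"model": "llama3.2", "response": "I\'m doing ", "done": false}',
    '{"model": "llama3.2", "response": "well, thanks!", "done": true}',
]

# Concatenate the "response" fragment of every streamed JSON object.
full_reply = "".join(json.loads(line)["response"] for line in stream_lines)
print(full_reply)  # I'm doing well, thanks!
```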

The default streamed response is not very friendly to read. Add the `stream` parameter set to `false` to receive a single JSON object whose `response` field contains the full reply.

curl http://localhost:11434/api/generate -d '{ "model": "llama3.2", "prompt": "How are you today?", "stream": false}'
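With the single-object response, the reply text can be pulled out directly on the command line. A sketch assuming the local server is running and `jq` is installed:

```shell
# Extract just the model's reply from the non-streamed JSON response.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "How are you today?", "stream": false}' \
  | jq -r '.response'
```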

Use Ollama Python library

Ollama Python library link: https://github.com/ollama/ollama-python

The Ollama Python library is the easiest way to integrate Python 3.8+ projects with Ollama. First, set up a Python virtual environment: install python3-pip and the venv package, create the environment, and install the ollama package into it.

apt install python3-pip python3.12-venv
python3 -m venv myvenv
./myvenv/bin/python3 -m pip install ollama

There are many ways to call the Ollama API from Python. Here are two relatively simple ones; for more information, please refer to the link above.

1. Create a Python file

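A minimal sketch of such a file, assuming the virtual environment set up above and a running local server with llama3.2 pulled (run it with `./myvenv/bin/python3`):

```python
import ollama

# Send a single chat message to the local Ollama server.
response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "How are you today?"}],
)

# The reply text lives in the message content.
print(response["message"]["content"])
```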

2. Use Python Shell

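The same calls work interactively. A sample session sketch, assuming the setup above (the actual reply will vary per run):

```
$ ./myvenv/bin/python3
>>> import ollama
>>> result = ollama.generate(model="llama3.2", prompt="How are you today?")
>>> print(result["response"])
```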