ollama

package
v0.0.0-...-7871f83
Published: Dec 23, 2025 License: Apache-2.0 Imports: 11 Imported by: 0

Documentation

Overview

Package ollama provides an adapter for integrating Ollama models with AgentMesh. It enables local model execution without API keys or cloud dependencies, and defines sentinel errors for the Ollama model package.

Index

Constants

This section is empty.

Variables

var (
	// ErrYieldFalse is returned when a yield function returns false.
	ErrYieldFalse = errors.New("model/ollama: yield returned false")

	// ErrInvalidToolMessage is returned when a tool message is invalid.
	ErrInvalidToolMessage = errors.New("model/ollama: invalid tool message")
)

Functions

func WithModel

func WithModel(modelName string) func(o *Options)

WithModel sets the Ollama model to use (e.g., "llama3.2", "mistral", "codellama").

func WithNumPredict

func WithNumPredict(numPredict int) func(o *Options)

WithNumPredict sets the maximum number of tokens to predict. -1 means no limit (default).

func WithSeed

func WithSeed(seed int) func(o *Options)

WithSeed sets the random seed for deterministic output. -1 means random seed (default).

func WithTemperature

func WithTemperature(temperature float64) func(o *Options)

WithTemperature controls randomness in the output (0.0 to 2.0). Lower values make output more deterministic.

func WithTopK

func WithTopK(topK int) func(o *Options)

WithTopK limits sampling to the K most likely next tokens.

func WithTopP

func WithTopP(topP float64) func(o *Options)

WithTopP uses nucleus sampling: only tokens with cumulative probability <= P are considered.

Types

type Client

type Client interface {
	Chat(ctx context.Context, req *api.ChatRequest, fn api.ChatResponseFunc) error
	Generate(ctx context.Context, req *api.GenerateRequest, fn api.GenerateResponseFunc) error
}

Client defines the interface for interacting with the Ollama API.

type ClientWrapper

type ClientWrapper struct {
	// contains filtered or unexported fields
}

ClientWrapper wraps an Ollama client to implement the Client interface.

func NewClientWrapper

func NewClientWrapper(client *api.Client) (*ClientWrapper, error)

NewClientWrapper creates a new ClientWrapper. Returns an error if the client parameter is nil.

func (*ClientWrapper) Chat

func (c *ClientWrapper) Chat(ctx context.Context, req *api.ChatRequest, fn api.ChatResponseFunc) error

Chat implements the Client interface by delegating to the wrapped client's Chat method.

func (*ClientWrapper) Generate

func (c *ClientWrapper) Generate(ctx context.Context, req *api.GenerateRequest, fn api.GenerateResponseFunc) error

Generate implements the Client interface by delegating to the wrapped client's Generate method.

type Model

type Model struct {
	// contains filtered or unexported fields
}

Model wraps the Ollama API client for chat completion.

func NewModel

func NewModel(optFns ...func(o *Options)) *Model

NewModel creates a new Ollama model using the default client from environment.

func NewModelFromClient

func NewModelFromClient(client *api.Client, optFns ...func(o *Options)) (*Model, error)

NewModelFromClient creates a model from an existing Ollama client. Returns an error if the client is nil.
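NewModelFromClient is the constructor to reach for when the Ollama server is not at the default local address. A sketch, assuming the upstream github.com/ollama/ollama/api package, whose NewClient takes a base URL and an *http.Client (the host below is illustrative):

```go
// Point the client at a non-default Ollama server.
base := &url.URL{Scheme: "http", Host: "remote-host:11434"}
client := api.NewClient(base, http.DefaultClient)

m, err := ollama.NewModelFromClient(client, ollama.WithModel("llama3.2"))
if err != nil {
	// A nil client is the documented error case.
	log.Fatal(err)
}
_ = m
```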

func NewModelFromClientWrapper

func NewModelFromClientWrapper(wrapper *ClientWrapper, optFns ...func(o *Options)) (*Model, error)

NewModelFromClientWrapper creates a model from a wrapped client. Returns an error if the wrapper is nil.

func (*Model) Capabilities

func (m *Model) Capabilities() model.Capabilities

Capabilities returns the features and limitations of Ollama models.

func (*Model) Generate

func (m *Model) Generate(ctx context.Context, req model.Request) iter.Seq2[model.Response, error]

Generate sends messages to the Ollama model and yields responses. Supports both streaming and non-streaming modes.
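Because Generate returns an iter.Seq2[model.Response, error], it can be consumed with Go 1.23's range-over-func syntax. A sketch of the loop shape; the contents of model.Request and how to read a model.Response are assumptions here, not taken from the package:

```go
req := model.Request{ /* conversation messages; fields not shown in this doc */ }
for resp, err := range m.Generate(ctx, req) {
	if err != nil {
		// Handle the error; breaking out of the loop stops iteration.
		log.Fatal(err)
	}
	fmt.Println(resp) // inspect or stream each (possibly partial) response
}
```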

func (*Model) Name

func (m *Model) Name() string

Name returns the configured Ollama model identifier.

type Options

type Options struct {
	// contains filtered or unexported fields
}

Options configures Ollama model behavior.
