llm

package
v0.1.0
Published: Feb 24, 2026 License: Apache-2.0 Imports: 18 Imported by: 0

Documentation

Overview

Package llm provides LLM provider interfaces and implementations. It also includes a tracing wrapper for LLM providers.

Index

Constants

const (
	GroqBaseURL       = "https://api.groq.com/openai/v1"
	MistralBaseURL    = "https://api.mistral.ai/v1"
	XAIBaseURL        = "https://api.x.ai/v1"
	OpenRouterBaseURL = "https://openrouter.ai/api/v1"
	OllamaLocalURL    = "http://localhost:11434/v1"
	LMStudioLocalURL  = "http://localhost:1234/v1"
)

Provider-specific base URLs.

Variables

This section is empty.

Functions

func InferProviderFromModel

func InferProviderFromModel(model string) string

InferProviderFromModel returns the provider name based on model name patterns. This allows users to just specify a model name without explicitly setting the provider.
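The exact pattern table is not documented on this page, but the behavior can be sketched as simple prefix matching on the model name. The prefixes and fallback below are illustrative assumptions, not the package's actual rules:

```go
package main

import (
	"fmt"
	"strings"
)

// inferProvider is a hypothetical sketch of the kind of prefix matching
// InferProviderFromModel could perform. The pattern list is invented for
// illustration; consult the package source for the real mapping.
func inferProvider(model string) string {
	m := strings.ToLower(model)
	switch {
	case strings.HasPrefix(m, "claude"):
		return "anthropic"
	case strings.HasPrefix(m, "gpt") || strings.HasPrefix(m, "o1"):
		return "openai"
	case strings.HasPrefix(m, "gemini"):
		return "google"
	case strings.HasPrefix(m, "mistral"):
		return "mistral"
	default:
		return "openai-compat"
	}
}

func main() {
	fmt.Println(inferProvider("claude-sonnet-4"))  // anthropic
	fmt.Println(inferProvider("gemini-2.0-flash")) // google
}
```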

func ThinkingLevelToAnthropicBudget

func ThinkingLevelToAnthropicBudget(level ThinkingLevel, configBudget int64) int64

ThinkingLevelToAnthropicBudget converts a thinking level to an Anthropic budget-token count. It returns 0 for off, or a reasonable default for each level.

func ThinkingLevelToBool

func ThinkingLevelToBool(level ThinkingLevel) bool

ThinkingLevelToBool converts a thinking level to a boolean for providers that only support on/off.
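The two conversions above follow from the documented contract: off maps to 0 tokens (or false), and each remaining level gets a default budget. A minimal sketch, where the specific default token counts and the precedence of an explicit config budget are assumptions:

```go
package main

import "fmt"

type ThinkingLevel string

const (
	ThinkingOff    ThinkingLevel = "off"
	ThinkingLow    ThinkingLevel = "low"
	ThinkingMedium ThinkingLevel = "medium"
	ThinkingHigh   ThinkingLevel = "high"
)

// toBudget mirrors the documented contract: 0 for off, otherwise an
// explicit config budget or a per-level default. The default numbers
// here are illustrative assumptions, not the package's actual values.
func toBudget(level ThinkingLevel, configBudget int64) int64 {
	if level == ThinkingOff {
		return 0
	}
	if configBudget > 0 {
		return configBudget
	}
	switch level {
	case ThinkingLow:
		return 2048
	case ThinkingMedium:
		return 8192
	case ThinkingHigh:
		return 16384
	}
	return 0
}

// toBool collapses a level to on/off for providers that only support a
// boolean thinking switch.
func toBool(level ThinkingLevel) bool {
	return level != ThinkingOff
}

func main() {
	fmt.Println(toBudget(ThinkingHigh, 0), toBool(ThinkingOff)) // 16384 false
}
```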

Types

type AnthropicConfig

type AnthropicConfig struct {
	APIKey       string
	IsOAuthToken bool   // True if APIKey is an OAuth access token (uses Bearer auth)
	BaseURL      string // Optional custom endpoint
	Model        string
	MaxTokens    int
	Thinking     ThinkingConfig
	Retry        RetryConfig
}

AnthropicConfig holds configuration for the Anthropic provider.

type AnthropicProvider

type AnthropicProvider struct {
	// contains filtered or unexported fields
}

AnthropicProvider implements the Provider interface using the official Anthropic SDK.

func NewAnthropicProvider

func NewAnthropicProvider(cfg AnthropicConfig) (*AnthropicProvider, error)

NewAnthropicProvider creates a new Anthropic provider using the official SDK.

func (*AnthropicProvider) Chat

func (p *AnthropicProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat implements the Provider interface.

type ChatRequest

type ChatRequest struct {
	Messages  []Message `json:"messages"`
	Tools     []ToolDef `json:"tools,omitempty"`
	MaxTokens int       `json:"max_tokens,omitempty"`
}

ChatRequest represents a chat request to the LLM.

type ChatResponse

type ChatResponse struct {
	Content      string             `json:"content"`
	Thinking     string             `json:"thinking,omitempty"`
	ToolCalls    []ToolCallResponse `json:"tool_calls,omitempty"`
	StopReason   string             `json:"stop_reason"`
	InputTokens  int                `json:"input_tokens"`
	OutputTokens int                `json:"output_tokens"`
	Model        string             `json:"model"`
	TTFTMs       int64              `json:"ttft_ms,omitempty"` // Time to first token in milliseconds
}

ChatResponse represents a chat response from the LLM.

type GoogleConfig

type GoogleConfig struct {
	APIKey    string
	Model     string
	MaxTokens int
	Thinking  ThinkingConfig
	Retry     RetryConfig
}

GoogleConfig holds configuration for the Google provider.

type GoogleProvider

type GoogleProvider struct {
	// contains filtered or unexported fields
}

GoogleProvider implements the Provider interface using the official Google Gemini SDK.

func NewGoogleProvider

func NewGoogleProvider(cfg GoogleConfig) (*GoogleProvider, error)

NewGoogleProvider creates a new Google Gemini provider using the official SDK.

func (*GoogleProvider) Chat

func (p *GoogleProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat implements the Provider interface.

func (*GoogleProvider) Close

func (p *GoogleProvider) Close() error

Close closes the underlying client.

type Message

type Message struct {
	Role       string             `json:"role"` // user, assistant, tool, system
	Content    string             `json:"content"`
	ToolCalls  []ToolCallResponse `json:"tool_calls,omitempty"`
	ToolCallID string             `json:"tool_call_id,omitempty"` // For tool result messages
}

Message represents an LLM message.

type MockProvider

type MockProvider struct {

	// ChatFunc can be overridden for custom behavior
	ChatFunc func(ctx context.Context, req ChatRequest) (*ChatResponse, error)
	// contains filtered or unexported fields
}

MockProvider is a mock LLM provider for testing.

func NewMockProvider

func NewMockProvider() *MockProvider

NewMockProvider creates a new mock provider.

func (*MockProvider) CallCount

func (p *MockProvider) CallCount() int

CallCount returns the number of Chat calls made.

func (*MockProvider) Chat

func (p *MockProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat implements the Provider interface.

func (*MockProvider) LastRequest

func (p *MockProvider) LastRequest() *ChatRequest

LastRequest returns the last request.

func (*MockProvider) Reset

func (p *MockProvider) Reset()

Reset resets the call count.

func (*MockProvider) SetError

func (p *MockProvider) SetError(err error)

SetError sets an error to return.

func (*MockProvider) SetResponse

func (p *MockProvider) SetResponse(content string)

SetResponse sets the response content.

func (*MockProvider) SetStopReason

func (p *MockProvider) SetStopReason(reason string)

SetStopReason sets the stop reason.

func (*MockProvider) SetTokenCounts

func (p *MockProvider) SetTokenCounts(input, output int)

SetTokenCounts sets the token counts.

func (*MockProvider) SetToolCall

func (p *MockProvider) SetToolCall(name string, args map[string]interface{})

SetToolCall sets a single tool call response.

func (*MockProvider) SetToolCalls

func (p *MockProvider) SetToolCalls(calls []ToolCallResponse)

SetToolCalls sets multiple tool call responses.

type OllamaCloudConfig

type OllamaCloudConfig struct {
	APIKey    string
	BaseURL   string // defaults to https://ollama.com
	Model     string
	MaxTokens int
	Thinking  ThinkingConfig
	Retry     RetryConfig
}

OllamaCloudConfig holds configuration for the Ollama Cloud provider.

type OllamaCloudProvider

type OllamaCloudProvider struct {
	// contains filtered or unexported fields
}

OllamaCloudProvider implements the Provider interface for Ollama's cloud API. This uses Ollama's native /api/chat endpoint, not the OpenAI-compatible endpoint.

func NewOllamaCloudProvider

func NewOllamaCloudProvider(cfg OllamaCloudConfig) (*OllamaCloudProvider, error)

NewOllamaCloudProvider creates a new Ollama Cloud provider.

func (*OllamaCloudProvider) Chat

func (p *OllamaCloudProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat implements the Provider interface.

type OpenAICompatConfig

type OpenAICompatConfig struct {
	APIKey       string
	BaseURL      string
	Model        string
	MaxTokens    int
	ProviderName string // For logging/identification
	Thinking     ThinkingConfig
	Retry        RetryConfig
}

OpenAICompatConfig holds configuration for OpenAI-compatible providers.

type OpenAICompatProvider

type OpenAICompatProvider struct {
	// contains filtered or unexported fields
}

OpenAICompatProvider implements the Provider interface for OpenAI-compatible APIs. This includes Groq, Mistral, LiteLLM, OpenRouter, local Ollama, LMStudio, etc.

func NewGroqProvider

func NewGroqProvider(cfg OpenAICompatConfig) (*OpenAICompatProvider, error)

NewGroqProvider creates a Groq provider (uses OpenAI-compatible API).

func NewLMStudioProvider

func NewLMStudioProvider(cfg OpenAICompatConfig) (*OpenAICompatProvider, error)

NewLMStudioProvider creates an LMStudio local provider (uses OpenAI-compatible API).

func NewMistralProvider

func NewMistralProvider(cfg OpenAICompatConfig) (*OpenAICompatProvider, error)

NewMistralProvider creates a Mistral provider (uses OpenAI-compatible API).

func NewOllamaLocalProvider

func NewOllamaLocalProvider(cfg OpenAICompatConfig) (*OpenAICompatProvider, error)

NewOllamaLocalProvider creates an Ollama local provider (uses OpenAI-compatible API).

func NewOpenAICompatProvider

func NewOpenAICompatProvider(cfg OpenAICompatConfig) (*OpenAICompatProvider, error)

NewOpenAICompatProvider creates a new OpenAI-compatible provider.

func NewOpenRouterProvider

func NewOpenRouterProvider(cfg OpenAICompatConfig) (*OpenAICompatProvider, error)

NewOpenRouterProvider creates an OpenRouter provider (uses OpenAI-compatible API).

func NewXAIProvider

func NewXAIProvider(cfg OpenAICompatConfig) (*OpenAICompatProvider, error)

NewXAIProvider creates an xAI (Grok) provider (uses OpenAI-compatible API).

func (*OpenAICompatProvider) Chat

func (p *OpenAICompatProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat implements the Provider interface.

type OpenAIConfig

type OpenAIConfig struct {
	APIKey    string
	BaseURL   string // Optional custom endpoint
	Model     string
	MaxTokens int
	Thinking  ThinkingConfig
	Retry     RetryConfig
}

OpenAIConfig holds configuration for the OpenAI provider.

type OpenAIProvider

type OpenAIProvider struct {
	// contains filtered or unexported fields
}

OpenAIProvider implements the Provider interface using the official OpenAI SDK.

func NewOpenAIProvider

func NewOpenAIProvider(cfg OpenAIConfig) (*OpenAIProvider, error)

NewOpenAIProvider creates a new OpenAI provider using the official SDK.

func (*OpenAIProvider) Chat

func (p *OpenAIProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat implements the Provider interface.

type Provider

type Provider interface {
	// Chat sends a chat request and returns the response.
	Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)
}

Provider is the interface for LLM providers.

func NewProvider

func NewProvider(cfg ProviderConfig) (Provider, error)

NewProvider creates a provider based on the configuration. If Provider is empty, it will be inferred from the Model name.
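The selection logic implied here (infer the provider from the model when the Provider field is empty, then dispatch on the name) can be sketched with the constructors stubbed out. Everything below the switch arms is an assumption about structure, not the package's actual code:

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// selectProvider sketches NewProvider's dispatch: fill in an empty
// provider name from the model, then validate it against the known set.
// The inference rule here is a stand-in for InferProviderFromModel.
func selectProvider(provider, model string) (string, error) {
	if provider == "" {
		if strings.HasPrefix(strings.ToLower(model), "claude") {
			provider = "anthropic"
		} else {
			provider = "openai"
		}
	}
	switch provider {
	case "anthropic", "openai", "google", "groq", "mistral", "openai-compat":
		return provider, nil // real code would call the matching constructor
	}
	return "", errors.New("unknown provider: " + provider)
}

func main() {
	p, _ := selectProvider("", "claude-sonnet-4")
	fmt.Println(p) // anthropic
}
```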

func WithTracing

func WithTracing(p Provider, providerName string) Provider

WithTracing wraps a provider with tracing instrumentation.

type ProviderConfig

type ProviderConfig struct {
	Provider     string         `json:"provider"` // anthropic, openai, google, groq, mistral, openai-compat
	Model        string         `json:"model"`
	APIKey       string         `json:"api_key"`
	IsOAuthToken bool           `json:"is_oauth_token"` // True if APIKey is an OAuth access token (Anthropic only)
	MaxTokens    int            `json:"max_tokens"`
	BaseURL      string         `json:"base_url"` // Custom API endpoint (for OpenRouter, LiteLLM, Ollama, LMStudio)
	Thinking     ThinkingConfig `json:"thinking"` // Thinking/reasoning configuration
	RetryConfig  RetryConfig    `json:"retry"`    // Retry configuration
}

ProviderConfig holds configuration for the Provider adapter.

func (*ProviderConfig) ApplyDefaults

func (c *ProviderConfig) ApplyDefaults()

ApplyDefaults applies default values.

func (*ProviderConfig) Validate

func (c *ProviderConfig) Validate() error

Validate validates the configuration.

type ProviderFactory

type ProviderFactory interface {
	// GetProvider returns a provider for the given profile name.
	// Empty profile name returns the default provider.
	GetProvider(profile string) (Provider, error)
}

ProviderFactory creates providers based on configuration.

type RetryConfig

type RetryConfig struct {
	MaxRetries  int           `json:"max_retries"`  // Max retry attempts (default 5)
	MaxBackoff  time.Duration `json:"max_backoff"`  // Max backoff duration (default 60s)
	InitBackoff time.Duration `json:"init_backoff"` // Initial backoff (default 1s)
}

RetryConfig holds retry settings for LLM calls.

type SingleProviderFactory

type SingleProviderFactory struct {
	// contains filtered or unexported fields
}

SingleProviderFactory wraps a single provider (for backward compatibility).

func NewSingleProviderFactory

func NewSingleProviderFactory(p Provider) *SingleProviderFactory

NewSingleProviderFactory creates a factory that always returns the same provider.

func (*SingleProviderFactory) GetProvider

func (f *SingleProviderFactory) GetProvider(profile string) (Provider, error)

GetProvider returns the single provider regardless of profile.

type Summarizer

type Summarizer struct {
	// contains filtered or unexported fields
}

Summarizer uses an LLM to summarize content and answer questions.

func NewSummarizer

func NewSummarizer(provider Provider) *Summarizer

NewSummarizer creates a new summarizer with the given LLM provider.

func (*Summarizer) Summarize

func (s *Summarizer) Summarize(ctx context.Context, content, question string) (string, error)

Summarize extracts information from content based on a question.

type ThinkingConfig

type ThinkingConfig struct {
	// Level: "auto", "off", "low", "medium", "high"
	// Auto uses heuristic classifier to determine level per-request.
	Level ThinkingLevel

	// BudgetTokens for Anthropic extended thinking (optional, 0 = provider default)
	BudgetTokens int64
}

ThinkingConfig holds thinking configuration.

type ThinkingLevel

type ThinkingLevel string

ThinkingLevel represents the thinking/reasoning effort level.

const (
	ThinkingOff    ThinkingLevel = "off"
	ThinkingLow    ThinkingLevel = "low"
	ThinkingMedium ThinkingLevel = "medium"
	ThinkingHigh   ThinkingLevel = "high"
	ThinkingAuto   ThinkingLevel = "auto"
)

func InferThinkingLevel

func InferThinkingLevel(messages []Message, tools []ToolDef) ThinkingLevel

InferThinkingLevel analyzes the request and returns an appropriate thinking level. This is a zero-cost heuristic classifier: no LLM calls, just pattern matching.

func ResolveThinkingLevel

func ResolveThinkingLevel(config ThinkingConfig, messages []Message, tools []ToolDef) ThinkingLevel

ResolveThinkingLevel resolves the thinking level for a request. If config is Auto, it uses the heuristic classifier.
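The resolution contract is simple: a configured non-auto level wins outright, and only Auto falls through to the heuristic classifier. A minimal sketch, where the keyword heuristic itself is entirely invented for illustration:

```go
package main

import (
	"fmt"
	"strings"
)

type ThinkingLevel string

const (
	ThinkingOff  ThinkingLevel = "off"
	ThinkingLow  ThinkingLevel = "low"
	ThinkingHigh ThinkingLevel = "high"
	ThinkingAuto ThinkingLevel = "auto"
)

// resolve mirrors the documented contract: an explicit level is returned
// as-is; auto delegates to a cheap keyword check. The keywords and the
// tool-count threshold are illustrative assumptions.
func resolve(configured ThinkingLevel, lastUserMsg string, toolCount int) ThinkingLevel {
	if configured != ThinkingAuto {
		return configured
	}
	m := strings.ToLower(lastUserMsg)
	if strings.Contains(m, "prove") || strings.Contains(m, "debug") || toolCount > 3 {
		return ThinkingHigh
	}
	return ThinkingLow
}

func main() {
	fmt.Println(resolve(ThinkingOff, "prove this theorem", 0))  // off: explicit config wins
	fmt.Println(resolve(ThinkingAuto, "prove this theorem", 0)) // high: heuristic fires
}
```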

type ToolCallResponse

type ToolCallResponse struct {
	ID   string                 `json:"id"`
	Name string                 `json:"name"`
	Args map[string]interface{} `json:"args"`
}

ToolCallResponse represents a tool call from the LLM.

type ToolDef

type ToolDef struct {
	Name        string                 `json:"name"`
	Description string                 `json:"description"`
	Parameters  map[string]interface{} `json:"parameters"`
}

ToolDef represents a tool definition for the LLM.

type TracingProvider

type TracingProvider struct {
	// contains filtered or unexported fields
}

TracingProvider wraps a Provider with OpenTelemetry tracing.

func (*TracingProvider) Chat

func (p *TracingProvider) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)

Chat implements the Provider interface with tracing.
