agent.AI()
Universal LLM interface for Go with OpenAI and OpenRouter support, structured outputs via Go structs, and streaming responses.
The AI() method provides a simple, type-safe interface to Large Language Models, automatically converting Go structs into JSON schemas for structured output.
Return Type: Returns *ai.Response which provides .Text() for simple text or .Into(&struct{}) for structured output parsing.
Basic Usage
import (
	"context"
	"fmt"
	"log"

	"github.com/Agent-Field/agentfield/sdk/go/agent"
	"github.com/Agent-Field/agentfield/sdk/go/ai"
)
// Simple text response
response, err := app.AI(ctx, "What is the capital of France?")
if err != nil {
	log.Fatal(err)
}
fmt.Println(response.Text()) // "The capital of France is Paris."
fmt.Printf("Tokens used: %d\n", response.Usage.TotalTokens)

Function Signature
func (a *Agent) AI(
	ctx context.Context,
	prompt string,
	opts ...ai.Option,
) (*ai.Response, error)

Parameters
Context
Standard Go context.Context for cancellation and timeouts:
// With timeout
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()
response, err := app.AI(ctx, "Long task...")Prompt
The user message to send to the LLM:
response, err := app.AI(ctx, "Explain quantum computing")Options
Functional options for customizing the AI call:
Option	Type	Description
ai.WithSystem(prompt)	string	System prompt that defines the AI's behavior
ai.WithSchema(v)	struct	Go struct converted to a JSON schema for structured output
ai.WithModel(name)	string	Override of the default model, e.g. "openai/gpt-4o-mini"
ai.WithTemperature(t)	float64	Sampling temperature; lower values are more deterministic
ai.WithMaxTokens(n)	int	Maximum number of completion tokens
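These options compose freely. For example, a deterministic, low-cost classification call might combine several of them (the prompt strings here are illustrative):

response, err := app.AI(ctx,
	"Classify this support ticket",
	ai.WithSystem("You are a support triage assistant"),
	ai.WithModel("openai/gpt-4o-mini"),
	ai.WithTemperature(0.0),
	ai.WithMaxTokens(50))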
Common Patterns
System Prompt
Define AI behavior with a system prompt:
response, err := app.AI(ctx,
	"What's 2+2?",
	ai.WithSystem("You are a math tutor who explains step-by-step"))

Structured Output with Go Structs
The standout feature is type-safe AI responses defined with Go structs:
// Define your response schema
type WeatherResponse struct {
	Location    string  `json:"location" description:"City name"`
	Temperature float64 `json:"temperature" description:"Temperature in Celsius"`
	Conditions  string  `json:"conditions" description:"Weather conditions"`
}
// Call AI with schema
response, err := app.AI(ctx,
	"What's the weather in Paris?",
	ai.WithSystem("You are a weather assistant"),
	ai.WithSchema(WeatherResponse{}))

// Parse into struct
var weather WeatherResponse
if err := response.Into(&weather); err != nil {
	log.Fatal(err)
}
fmt.Printf("%s: %.1f°C, %s\n",
	weather.Location,
	weather.Temperature,
	weather.Conditions)

Automatic JSON Schema Conversion: The SDK automatically converts Go structs to JSON schemas compatible with OpenAI's structured output format. The description tag helps the LLM understand what each field means.
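For reference, the WeatherResponse struct above maps to roughly the following JSON schema; the exact envelope the SDK emits may differ by version:

{
  "type": "object",
  "properties": {
    "location":    { "type": "string", "description": "City name" },
    "temperature": { "type": "number", "description": "Temperature in Celsius" },
    "conditions":  { "type": "string", "description": "Weather conditions" }
  },
  "required": ["location", "temperature", "conditions"]
}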
Model Selection
Override the default model for specific tasks:
// Use cheaper model for simple tasks
response, err := app.AI(ctx,
	"Summarize in one sentence",
	ai.WithModel("openai/gpt-4o-mini"))

// Use better model for complex reasoning
response, err := app.AI(ctx,
	"Analyze this complex dataset",
	ai.WithModel("openai/gpt-4o"))

// Use OpenRouter for multi-provider access
response, err := app.AI(ctx,
	"Write a story",
	ai.WithModel("anthropic/claude-3.5-sonnet"))

Temperature Control
Adjust creativity vs determinism:
// Deterministic output (for data extraction, analysis)
response, err := app.AI(ctx,
	"Extract the date from: 'Meeting on Jan 15th'",
	ai.WithTemperature(0.0))

// Creative output (for stories, brainstorming)
response, err := app.AI(ctx,
	"Write a creative story about AI",
	ai.WithTemperature(1.2))

Complex Structured Outputs
Nested structs with arrays:
type SentimentAnalysis struct {
	Sentiment  string   `json:"sentiment" description:"overall sentiment: positive, negative, or neutral"`
	Confidence float64  `json:"confidence" description:"confidence score from 0 to 1"`
	Keywords   []string `json:"keywords" description:"key phrases that influenced the sentiment"`
	Reasoning  string   `json:"reasoning" description:"brief explanation of the analysis"`
}
response, err := app.AI(ctx,
	"I absolutely love this product! Best purchase ever.",
	ai.WithSchema(SentimentAnalysis{}))
if err != nil {
	log.Fatal(err)
}

var analysis SentimentAnalysis
if err := response.Into(&analysis); err != nil {
	log.Fatal(err)
}

fmt.Printf("Sentiment: %s (%.2f confidence)\n", analysis.Sentiment, analysis.Confidence)
fmt.Printf("Keywords: %v\n", analysis.Keywords)
fmt.Printf("Reasoning: %s\n", analysis.Reasoning)

Using AI in Reasoners
Perfect pattern for AI-powered reasoners:
type DocumentSummary struct {
	Summary        string   `json:"summary"`
	KeyPoints      []string `json:"key_points"`
	ActionItems    []string `json:"action_items"`
	RequiresAction bool     `json:"requires_action"`
}

app.RegisterReasoner("summarize_document", func(ctx context.Context, input map[string]any) (any, error) {
	document, ok := input["document"].(string)
	if !ok {
		return nil, fmt.Errorf("input field \"document\" must be a string")
	}

	response, err := app.AI(ctx,
		fmt.Sprintf("Summarize this document:\n\n%s", document),
		ai.WithSystem("You are a professional document analyst"),
		ai.WithSchema(DocumentSummary{}),
		ai.WithTemperature(0.3))
	if err != nil {
		return nil, err
	}

	var summary DocumentSummary
	if err := response.Into(&summary); err != nil {
		return nil, err
	}

	return map[string]any{
		"summary":     summary,
		"tokens_used": response.Usage.TotalTokens,
		"model":       response.Model,
	}, nil
})

Response Object
The ai.Response type provides access to the complete LLM response:
Properties

Prop	Description
Usage	Token usage for the call: PromptTokens, CompletionTokens, TotalTokens
Model	The model that generated the response (string)

Methods

Method	Description
Text() string	Returns the response content as plain text
Into(&target) error	Parses structured output into the given struct pointer
Usage Information
response, _ := app.AI(ctx, "Hello")
fmt.Printf("Prompt tokens: %d\n", response.Usage.PromptTokens)
fmt.Printf("Completion tokens: %d\n", response.Usage.CompletionTokens)
fmt.Printf("Total tokens: %d\n", response.Usage.TotalTokens)
fmt.Printf("Model used: %s\n", response.Model)Streaming Responses
For long responses, use AIStream() instead:
chunks, errs := app.AIStream(ctx,
	"Write a long essay about AI",
	ai.WithMaxTokens(2000))

for chunk := range chunks {
	if len(chunk.Choices) > 0 && chunk.Choices[0].Delta.Content != "" {
		fmt.Print(chunk.Choices[0].Delta.Content)
	}
}

if err := <-errs; err != nil {
	log.Printf("Stream error: %v", err)
}

Streaming is ideal for real-time display and reduces perceived latency for long responses. The SDK uses Server-Sent Events (SSE) under the hood.
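If you also need the complete text once the stream finishes, accumulate the deltas while printing them. A minimal sketch reusing the chunk shape from the example above (requires the strings package):

var sb strings.Builder
for chunk := range chunks {
	if len(chunk.Choices) > 0 {
		delta := chunk.Choices[0].Delta.Content
		sb.WriteString(delta) // accumulate the full response
		fmt.Print(delta)      // while displaying it in real time
	}
}
if err := <-errs; err != nil {
	log.Printf("Stream error: %v", err)
}
fullText := sb.String() // complete text, available after the stream ends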
OpenRouter Configuration
Access 100+ models from multiple providers:
aiConfig := &ai.Config{
APIKey: os.Getenv("OPENROUTER_API_KEY"),
BaseURL: "https://openrouter.ai/api/v1",
Model: "meta-llama/llama-4-maverick", // OpenRouter format: provider/model
SiteURL: "https://myapp.com", // For OpenRouter rankings
SiteName: "My AI App",
}
app, err := agent.New(agent.Config{
NodeID: "my-agent",
AIConfig: aiConfig,
// ... other config
})Available Models via OpenRouter
// OpenAI models
ai.WithModel("openai/gpt-4o")
ai.WithModel("openai/gpt-4o-mini")
// Anthropic models
ai.WithModel("anthropic/claude-3.5-sonnet")
ai.WithModel("anthropic/claude-3-haiku")
// Meta models
ai.WithModel("meta-llama/llama-4-maverick")
ai.WithModel("meta-llama/llama-3.1-405b")
// Google models
ai.WithModel("google/gemini-pro-1.5")
// And 100+ more...

Error Handling
Always handle errors from AI calls:
response, err := app.AI(ctx, prompt, ai.WithSchema(MyStruct{}))
if err != nil {
	// Handle API errors (rate limits, invalid requests, etc.)
	log.Printf("AI call failed: %v", err)
	return nil, err
}

var result MyStruct
if err := response.Into(&result); err != nil {
	// Handle JSON parsing errors
	log.Printf("Failed to parse response: %v", err)
	return nil, err
}
Best Practices

Use Struct Tags
Always include description tags for better LLM understanding:
// ✅ Good - clear descriptions help LLM
type Analysis struct {
	Category   string `json:"category" description:"primary category: tech, business, or personal"`
	Urgency    int    `json:"urgency" description:"urgency level from 1 (low) to 5 (high)"`
	ActionItem string `json:"action_item" description:"specific next action to take"`
}

// ❌ Bad - no guidance for LLM
type Analysis struct {
	Category   string `json:"category"`
	Urgency    int    `json:"urgency"`
	ActionItem string `json:"action_item"`
}

Handle Optional Fields
Use omitempty for optional fields:
type Response struct {
	Required string  `json:"required" description:"always present"`
	Optional *string `json:"optional,omitempty" description:"may be nil"`
}
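Because Optional is a pointer, check it for nil before dereferencing:

var r Response
if err := response.Into(&r); err != nil {
	log.Fatal(err)
}
if r.Optional != nil {
	fmt.Println(*r.Optional) // only dereference when the field was present
}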
Choose the Right Temperature

Match the temperature to the task:
// Data extraction/analysis - deterministic
ai.WithTemperature(0.0)
// General tasks - balanced
ai.WithTemperature(0.7)
// Creative writing - varied
ai.WithTemperature(1.2)

Cost Optimization
Use cheaper models when possible:
// Simple classification - use mini
response, err := app.AI(ctx, "Is this spam?",
ai.WithModel("openai/gpt-4o-mini"),
ai.WithMaxTokens(10))
// Complex reasoning - use full model
response, err := app.AI(ctx, "Analyze this complex scenario",
ai.WithModel("openai/gpt-4o"),
ai.WithMaxTokens(1000))
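To keep track of spend across calls, you can log the usage fields shown earlier. The logUsage helper below is hypothetical, not part of the SDK:

// logUsage is a hypothetical helper that records per-call token consumption.
func logUsage(label string, resp *ai.Response) {
	log.Printf("[%s] model=%s prompt=%d completion=%d total=%d",
		label, resp.Model,
		resp.Usage.PromptTokens,
		resp.Usage.CompletionTokens,
		resp.Usage.TotalTokens)
}

// Example: logUsage("spam-check", response)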