Go SDK Overview
Build production-grade AI agents in Go with automatic workflow tracking and structured outputs.
The Agentfield Go SDK provides idiomatic Go interfaces for building distributed agent systems with automatic parent-child workflow tracking, AI-powered reasoners, and seamless Agentfield control plane integration.
Installation
```bash
# Navigate to your Go project
cd your-project

# Add the Agentfield Go SDK
go get github.com/Agent-Field/agentfield/sdk/go
```

The Go SDK requires Go 1.21 or later. Make sure your `go.mod` specifies the correct version.
Quick Start
Create your first AI-powered agent in under 5 minutes:
```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/Agent-Field/agentfield/sdk/go/agent"
	"github.com/Agent-Field/agentfield/sdk/go/ai"
)

func main() {
	ctx := context.Background()

	// Configure AI (OpenRouter or OpenAI)
	aiConfig := &ai.Config{
		APIKey:      os.Getenv("OPENROUTER_API_KEY"),
		BaseURL:     "https://openrouter.ai/api/v1",
		Model:       "meta-llama/llama-4-maverick",
		Temperature: 0.7,
	}

	// Create agent
	app, err := agent.New(agent.Config{
		NodeID:        "my-agent",
		Version:       "1.0.0",
		AgentFieldURL: "http://localhost:8080",
		AIConfig:      aiConfig,
	})
	if err != nil {
		log.Fatal(err)
	}

	// Register a reasoner
	app.RegisterReasoner("greet", func(ctx context.Context, input map[string]any) (any, error) {
		name := input["name"].(string)

		// Call AI with structured output
		response, err := app.AI(ctx,
			fmt.Sprintf("Generate a personalized greeting for %s", name),
			ai.WithTemperature(0.9))
		if err != nil {
			return nil, err
		}

		return map[string]any{
			"message": response.Text(),
			"tokens":  response.Usage.TotalTokens,
		}, nil
	})

	// Initialize and run
	if err := app.Initialize(ctx); err != nil {
		log.Fatal(err)
	}
	if err := app.Run(ctx); err != nil {
		log.Fatal(err)
	}
}
```

Key Features
📟 CLI Mode: Dual-Mode Binaries
Go agents compile to single binaries that operate in two modes:
- Server Mode: Connect to AgentField control plane for verifiable, auditable execution with auto-discovery
- CLI Mode: Run standalone as fast, native CLI tools with zero dependencies
Same binary, both modes. Enable with the `WithCLI()` option. Perfect for building production-ready CLI tools that can optionally integrate with the control plane for distributed workflows and cryptographic audit trails.
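The dual-mode idea can be sketched without the SDK: a small dispatcher inspects the command line at startup and picks either server or CLI behavior. The `pickMode` helper below is hypothetical and only illustrates the concept; the SDK's `WithCLI()` option handles this wiring for you.

```go
package main

import (
	"fmt"
	"os"
)

// pickMode decides how the binary should run: with no arguments it
// behaves as a long-running server; with a subcommand it acts as a
// one-shot CLI tool. (Illustrative only, not the SDK's actual logic.)
func pickMode(args []string) string {
	if len(args) > 1 {
		return "cli"
	}
	return "server"
}

func main() {
	switch pickMode(os.Args) {
	case "cli":
		// Execute the requested reasoner once and exit.
		fmt.Printf("running %q as a one-shot CLI command\n", os.Args[1])
	case "server":
		// Register with the control plane and serve requests.
		fmt.Println("starting in server mode")
	}
}
```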
🚀 Zero-Config Workflow Tracking
Parent-child relationships are automatically created when reasoners call other reasoners. No manual DAG management required.
```go
// Parent reasoner
app.RegisterReasoner("orchestrate", func(ctx context.Context, input map[string]any) (any, error) {
	// Calling a child automatically creates the parent-child link
	result1, _ := app.Call(ctx, "my-agent.analyze", input)
	result2, _ := app.Call(ctx, "my-agent.summarize", input)

	// The Agentfield control plane tracks: orchestrate -> [analyze, summarize]
	return combine(result1, result2), nil
})
```

Automatic Dependency Injection: The SDK automatically sets the `X-Run-ID`, `X-Execution-ID`, and `X-Parent-Execution-ID` headers. The Agentfield control plane builds workflow DAGs without any manual tracking.
🤖 AI-Powered Reasoners
Built-in AI client with OpenAI and OpenRouter support, plus structured outputs via Go structs.
```go
type Analysis struct {
	Sentiment  string   `json:"sentiment" description:"positive, negative, or neutral"`
	Confidence float64  `json:"confidence" description:"confidence score 0-1"`
	Keywords   []string `json:"keywords" description:"key phrases"`
}

app.RegisterReasoner("analyze", func(ctx context.Context, input map[string]any) (any, error) {
	response, err := app.AI(ctx, input["text"].(string),
		ai.WithSchema(Analysis{})) // Type-safe structured output!
	if err != nil {
		return nil, err
	}

	var analysis Analysis
	response.Into(&analysis)
	return analysis, nil
})
```

🔄 Streaming Responses
Real-time AI responses with Go channels.
```go
chunks, errs := app.AIStream(ctx, "Write a story")
for chunk := range chunks {
	if len(chunk.Choices) > 0 {
		fmt.Print(chunk.Choices[0].Delta.Content)
	}
}
// Surface any streaming error once the chunk channel closes.
if err := <-errs; err != nil {
	log.Fatal(err)
}
```

⚡ Serverless Support
Deploy to Lambda, Cloud Functions, or any serverless platform.
```go
// HTTP entrypoint (Cloud Run / Functions)
http.ListenAndServe(":8080", app.Handler())

// Raw event entrypoint (Lambda)
func LambdaHandler(ctx context.Context, event map[string]any) (map[string]any, error) {
	normalize := func(e map[string]any) map[string]any {
		return map[string]any{
			"path":   stringFrom(e, "rawPath", "path"),
			"target": stringFrom(e, "target", "reasoner", "skill"),
			"input":  e["input"],
		}
	}
	result, status, err := app.HandleServerlessEvent(ctx, event, normalize)
	if err != nil {
		return map[string]any{"statusCode": 500, "body": map[string]any{"error": err.Error()}}, nil
	}
	return map[string]any{"statusCode": status, "body": result}, nil
}

// stringFrom returns the first non-empty string found under the given keys.
func stringFrom(e map[string]any, keys ...string) string {
	for _, k := range keys {
		if s, ok := e[k].(string); ok && s != "" {
			return s
		}
	}
	return ""
}
```

Environment Variables
The SDK automatically reads from environment variables for convenient configuration:
AI Provider Configuration
Standard OpenAI configuration:
```bash
export OPENAI_API_KEY="sk-proj-..."
export AI_MODEL="gpt-4o"  # or gpt-4o-mini, gpt-4-turbo, etc.
```

Available models:
- `gpt-4o` - Latest GPT-4 Optimized (recommended)
- `gpt-4o-mini` - Faster, cheaper GPT-4
- `gpt-4-turbo` - Previous generation
- `gpt-3.5-turbo` - Legacy model
OpenRouter for multi-provider access:
```bash
export OPENROUTER_API_KEY="sk-or-v1-..."
export AI_MODEL="anthropic/claude-3-5-sonnet-20241022"
```

Popular models on OpenRouter:
- `anthropic/claude-3-5-sonnet-20241022` - Claude 3.5 Sonnet
- `anthropic/claude-3-5-haiku-20241022` - Claude 3.5 Haiku (faster)
- `meta-llama/llama-4-maverick` - Llama 4 Maverick
- `google/gemini-pro-1.5` - Gemini 1.5 Pro
- `openai/gpt-4o` - GPT-4 via OpenRouter
- `perplexity/llama-3.1-sonar-large-128k-online` - Perplexity with web search
See OpenRouter Models for the complete list.
For OpenAI-compatible APIs (Ollama, LocalAI, etc.):
```bash
export OPENAI_API_KEY="your-api-key"
export AI_BASE_URL="http://localhost:11434/v1"  # Ollama
export AI_MODEL="llama3.2"
```

Examples:
- Ollama: `http://localhost:11434/v1`
- LocalAI: `http://localhost:8080/v1`
- Azure OpenAI: `https://<resource>.openai.azure.com/`
Control Plane Configuration
```bash
# Control plane URL (default: http://localhost:8080)
export AGENTFIELD_URL="http://localhost:8080"

# Optional authentication token
export BRAIN_TOKEN="your-token"
```

Complete Configuration Example
Local development:
```bash
# AI provider
export OPENAI_API_KEY="sk-proj-..."
export AI_MODEL="gpt-4o"

# Control plane
export AGENTFIELD_URL="http://localhost:8080"
```

Production with OpenRouter:
```bash
# AI provider (OpenRouter for multi-model access)
export OPENROUTER_API_KEY="sk-or-v1-..."
export AI_MODEL="anthropic/claude-3-5-sonnet-20241022"

# Control plane
export AGENTFIELD_URL="https://agentfield.company.com"
```

OpenRouter Benefits
Why OpenRouter? OpenRouter provides:
- 100+ models from OpenAI, Anthropic, Meta, Google, Cohere, etc.
- Single API - no need to integrate multiple providers
- Unified billing - one invoice for all models
- Automatic routing - fallback to other models if primary fails
- Cost optimization - use cheaper models for simple tasks
The Go SDK auto-detects which service you're using based on `OPENROUTER_API_KEY` or `OPENAI_API_KEY`.
For complete environment variable reference, see:
- Environment Variables Reference - All Go SDK variables
- Agent Package Documentation - Go-specific configuration
SDK Components
The Go SDK is organized into focused packages:
| Package | Purpose | Key Types |
|---|---|---|
| `agent` | Agent lifecycle, reasoner registration | `Agent`, `Config`, `Reasoner` |
| `ai` | LLM integration, structured outputs | `Client`, `Config`, `Response` |
| `client` | Agentfield control plane communication | `Client` (internal) |
| `types` | Shared types and interfaces | `ExecutionContext` |
Next Steps
- CLI Mode Overview - Build dual-mode binaries (CLI + server)
- Agent Configuration - Learn how to configure agents for different deployment modes
- Registering Reasoners - Register reasoners with automatic workflow tracking
- Calling Reasoners - Execute reasoners with parent-child relationship tracking
- AI Integration - Use AI with structured outputs and streaming
Comparison with Python SDK
Both SDKs provide the same core functionality with language-specific idioms:
| Feature | Python SDK | Go SDK |
|---|---|---|
| AI Calls | `await agent.ai("prompt")` | `agent.AI(ctx, "prompt")` |
| Structured Output | `schema=Model` kwarg | `ai.WithSchema(Model{})` option |
| Reasoner Registration | `@agent.reasoner` decorator | `agent.RegisterReasoner()` method |
| Calling Reasoners | `await agent.call()` | `agent.Call(ctx, ...)` |
| Streaming | `stream=True` kwarg | `agent.AIStream()` method |
| Configuration | Keyword arguments | Functional options pattern |
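The "functional options pattern" in the last row is a standard Go idiom, and it is what makes calls like `app.AI(ctx, prompt, ai.WithTemperature(0.9))` possible. Here is a generic, self-contained sketch of the pattern; the type and option names are illustrative, not the SDK's real types.

```go
package main

import "fmt"

// Options holds per-call settings; an Option mutates it. This is the
// functional options pattern the Go SDK uses in place of Python's
// keyword arguments. (Names here are illustrative.)
type Options struct {
	Temperature float64
	Model       string
}

type Option func(*Options)

func WithTemperature(t float64) Option {
	return func(o *Options) { o.Temperature = t }
}

func WithModel(m string) Option {
	return func(o *Options) { o.Model = m }
}

// apply builds an Options value from defaults plus any overrides,
// which is how variadic ...Option parameters are typically consumed.
func apply(opts ...Option) Options {
	o := Options{Temperature: 0.7, Model: "gpt-4o"} // defaults
	for _, fn := range opts {
		fn(&o)
	}
	return o
}

func main() {
	o := apply(WithTemperature(0.9))
	fmt.Println(o.Temperature, o.Model) // temperature overridden, model keeps its default
}
```

The benefit over a plain config struct is that callers only mention the settings they want to change, and new options can be added later without breaking existing call sites.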
System Requirements
- Go: 1.21 or later
- Agentfield Control Plane: v1.0.0 or later
- Network: HTTP access to Agentfield control plane
- Optional: OpenAI or OpenRouter API key for AI features