This project primarily provides two things:

- **Golang SDK** for building LLM-powered applications:
  - Make LLM calls to multiple providers through a unified interface.
  - Build agents with custom tools, MCP servers, conversation history, and durable execution.
- **AI Gateway** for managing access to and rate limiting LLM calls, and for building and deploying agents without writing code, with built-in observability for LLM calls and the agent loop.
Add the SDK to your project:

```shell
go get -u github.com/curaious/uno
```
Initialize the SDK:

```go
client, err := sdk.New(&sdk.ClientOptions{
	LLMConfigs: sdk.NewInMemoryConfigStore([]*gateway.ProviderConfig{
		{
			ProviderName: llm.ProviderNameOpenAI,
			ApiKeys: []*gateway.APIKeyConfig{
				{
					Name:   "Key 1",
					APIKey: os.Getenv("OPENAI_API_KEY"),
				},
			},
		},
	}),
})
```

Step 1: Create a model instance
```go
// OpenAI
model := client.NewLLM(sdk.LLMOptions{
	Provider: llm.ProviderNameOpenAI,
	Model:    "gpt-4.1-mini",
})

// Anthropic
model := client.NewLLM(sdk.LLMOptions{
	Provider: llm.ProviderNameAnthropic,
	Model:    "claude-haiku-4-5",
})
```

Step 2: Make the LLM call
```go
// Completions
resp, err := model.NewResponses(
	context.Background(),
	&responses.Request{
		Instructions: utils.Ptr("You are a helpful assistant. You greet the user with a light joke."),
		Input: responses.InputUnion{
			OfString: utils.Ptr("Hello!"),
		},
	},
)

// Embeddings
resp, err := model.NewEmbedding(context.Background(), &embeddings.Request{
	Input: embeddings.InputUnion{
		OfString: utils.Ptr("The food was delicious and the waiter..."),
	},
})

// Text-to-speech
resp, err := model.NewSpeech(context.Background(), &speech.Request{
	Input: "Hello, this is a test of the text-to-speech system.",
	Model: "tts-1",
	Voice: "alloy",
})
```

Refer to the documentation for advanced usage:
- Text Generation
- Tool Calling
- Reasoning
- Structured Output
- Image Generation
- Web Search Tool
- Code Execution Tool
```go
agent := client.NewAgent(&sdk.AgentOptions{
	Name:        "Hello world agent",
	Instruction: client.Prompt("You are a helpful assistant. You are interacting with the user named {{name}}"),
	LLM:         model,
	Parameters: responses.Parameters{
		Temperature: utils.Ptr(0.2),
	},
})

out, err := agent.Execute(context.Background(), &agents.AgentInput{
	Messages: []responses.InputMessageUnion{
		responses.UserMessage("Hello!"),
	},
})
```

Refer to the documentation for more advanced usage:
- System Prompt
- Function Tools
- MCP Tools
- Agent as a Tool
- Human in the loop
- Conversation History
- History Compaction or Summarization
- Durable Execution via Restate
- Durable Execution via Temporal
- Serving Agent through HTTP
Prerequisite: Docker and Docker Compose installed and running.

Install the AI Gateway and run it locally:

```shell
npx -y @curaious/uno
```

Then visit http://localhost:3000.
Refer to the documentation for advanced usage:
- LLM Gateway
- No-code Agent Builder
Apache 2.0