GenX - Universal LLM Interface

GenX is a universal abstraction layer for Large Language Models (LLMs).

Design Goals

  1. Provider Agnostic: Single API for OpenAI, Gemini, and other providers
  2. Streaming First: Native support for streaming responses
  3. Tool Orchestration: Rich function calling and tool management
  4. Agent Framework: Build autonomous AI agents (Go only)

Architecture

graph TB
    subgraph app["Application Layer"]
        subgraph agent["Agent Framework (Go)"]
            react[ReActAgent]
            match[MatchAgent]
            sub[SubAgents]
        end
        subgraph tools["Tool System"]
            func[FuncTool]
            gen[GeneratorTool]
            http[HTTPTool]
            comp[CompositeTool]
        end
    end
    
    subgraph core["Core Abstraction"]
        ctx[ModelContext<br/>Builder]
        generator[Generator<br/>Trait]
        stream[Stream<br/>Chunks]
    end
    
    subgraph providers["Provider Adapters"]
        openai[OpenAI]
        gemini[Gemini]
        other[Other]
    end
    
    app --> core
    core --> providers

Core Concepts

ModelContext

Contains all inputs for LLM generation:

  • Prompts: System instructions (named prompts)
  • Messages: Conversation history
  • Tools: Available function definitions
  • Params: Model parameters (temperature, max_tokens, etc.)
  • CoTs: Chain-of-thought examples
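
For illustration, here is a minimal Go sketch of how these five inputs could be grouped. The struct layout, field names, and the simplified string-content Message are placeholders inferred from the list above (the architecture diagram suggests the real ModelContext is assembled via a builder); this is not GenX's confirmed API.

package main

import "fmt"

// Message is a simplified placeholder (role plus plain-text content);
// the Message Types and Content Types sections below describe the richer model.
type Message struct {
    Role    string // "user", "assistant", "system", "tool"
    Content string
}

// ModelContext is a placeholder that mirrors the inputs listed above.
type ModelContext struct {
    Prompts  map[string]string // named system prompts
    Messages []Message         // conversation history
    Tools    []string          // names of available function definitions
    Params   map[string]any    // temperature, max_tokens, ...
    CoTs     []string          // chain-of-thought examples
}

func main() {
    mc := ModelContext{
        Prompts:  map[string]string{"default": "You are a helpful assistant."},
        Messages: []Message{{Role: "user", Content: "What is the capital of France?"}},
        Tools:    []string{"search", "calculator"},
        Params:   map[string]any{"temperature": 0.2, "max_tokens": 1024},
    }
    fmt.Printf("%d message(s), %d tool(s)\n", len(mc.Messages), len(mc.Tools))
}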

Generator

Interface for LLM providers:

  • GenerateStream(): Streaming text generation
  • Invoke(): Structured function call
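
One plausible Go shape for this interface, reusing the ModelContext and Message placeholders from the sketch above; the Stream type is sketched under the next heading, and the exact signatures (context argument, pointer types, error returns) are assumptions.

package genx // hypothetical package name, for illustration only

import "context"

// Generator sketches the provider interface described above. ModelContext and
// Message stand for the placeholders sketched earlier (assume they live in
// this package); Stream is sketched in the next subsection.
type Generator interface {
    // GenerateStream starts a streaming generation for the given inputs.
    GenerateStream(ctx context.Context, mc *ModelContext) (Stream, error)

    // Invoke performs a single structured call, e.g. to obtain a function
    // call, returning a complete message rather than a stream (assumed).
    Invoke(ctx context.Context, mc *ModelContext) (*Message, error)
}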

Stream

Streaming response handler:

  • Next(): Get next message chunk
  • Close(): Close stream
  • CloseWithError(): Close with error
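
Continuing the same sketch, a plausible shape for the stream handle plus a typical consumption loop; the method signatures and the io.EOF end-of-stream convention are assumptions.

package genx // hypothetical package name, for illustration only

import "io"

// Stream sketches the streaming handle described above; the method signatures
// and the io.EOF end-of-stream convention are assumptions.
type Stream interface {
    // Next returns the next message chunk, or an error (io.EOF is assumed
    // here) once the stream is exhausted.
    Next() (*Message, error)
    // Close releases the underlying stream.
    Close() error
    // CloseWithError aborts the stream, recording the given error.
    CloseWithError(err error) error
}

// drain shows a typical consumption loop for such a stream.
func drain(s Stream, onChunk func(*Message)) error {
    defer s.Close()
    for {
        msg, err := s.Next()
        if err == io.EOF {
            return nil // stream finished normally
        }
        if err != nil {
            return err
        }
        onChunk(msg)
    }
}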

Message Types

  Type        Description
  user        User input
  assistant   Model response
  system      System prompt (in messages)
  tool        Tool call/result

Content Types

  Type         Description
  Text         Plain text
  Blob         Binary data (images, audio)
  ToolCall     Function call request
  ToolResult   Function call response
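
To show how the two tables fit together, the hedged Go sketch below pairs a message role with typed content parts. It refines the string-content Message placeholder used earlier; any type or field names beyond those listed above are assumptions.

package main

import "fmt"

// Content is a placeholder union of the content variants listed above.
type Content interface{ isContent() }

// Text is plain text.
type Text struct{ Value string }

// Blob carries binary data such as images or audio.
type Blob struct {
    MIMEType string
    Data     []byte
}

// ToolCall is a function call requested by the model.
type ToolCall struct {
    Name string
    Args string // JSON-encoded arguments
}

// ToolResult is the response produced for a tool call.
type ToolResult struct {
    Name   string
    Result string
}

func (Text) isContent()       {}
func (Blob) isContent()       {}
func (ToolCall) isContent()   {}
func (ToolResult) isContent() {}

// Message pairs a role from the first table with content parts from the second.
type Message struct {
    Role  string // "user", "assistant", "system", "tool"
    Parts []Content
}

func main() {
    msgs := []Message{
        {Role: "user", Parts: []Content{Text{Value: "Describe this image."}, Blob{MIMEType: "image/png"}}},
        {Role: "assistant", Parts: []Content{ToolCall{Name: "search", Args: `{"q":"cats"}`}}},
        {Role: "tool", Parts: []Content{ToolResult{Name: "search", Result: "..."}}},
    }
    fmt.Println(len(msgs), "messages")
}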

Agent Framework (Go only)

Agent Types

  Agent        Description
  ReActAgent   Reasoning + Acting pattern
  MatchAgent   Intent-based routing

Tool Types

  Tool                Description
  FuncTool            Go function wrapper
  GeneratorTool       LLM-based generation
  HTTPTool            HTTP requests
  CompositeTool       Tool pipeline
  TextProcessorTool   Text manipulation
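
As a concrete illustration of the FuncTool idea, the sketch below wraps an ordinary Go function so an agent could call it by name; the FuncTool fields and call signature are placeholders, not GenX's actual definitions.

package main

import (
    "context"
    "fmt"
    "strconv"
)

// FuncTool is a placeholder for the "Go function wrapper" tool type above;
// the field names and call signature are assumptions.
type FuncTool struct {
    Name        string
    Description string
    Call        func(ctx context.Context, args map[string]string) (string, error)
}

func main() {
    // Wrap an ordinary Go function as a tool an agent could invoke by name.
    double := FuncTool{
        Name:        "double",
        Description: "Doubles the integer passed as the 'n' argument.",
        Call: func(ctx context.Context, args map[string]string) (string, error) {
            n, err := strconv.Atoi(args["n"])
            if err != nil {
                return "", fmt.Errorf("invalid n: %w", err)
            }
            return strconv.Itoa(n * 2), nil
        },
    }

    out, err := double.Call(context.Background(), map[string]string{"n": "21"})
    fmt.Println(out, err) // 42 <nil>
}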

Event System

Agents emit events for fine-grained control:

  • EventChunk: Output chunk
  • EventEOF: Round ended
  • EventClosed: Agent completed
  • EventToolStart: Tool execution started
  • EventToolDone: Tool completed
  • EventToolError: Tool failed
  • EventInterrupted: Execution interrupted
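
A hedged sketch of consuming these events: only the event names come from the list above, while the channel-based delivery, the Event payload, and the enum representation are assumptions made for illustration.

package main

import "fmt"

// EventKind enumerates the event names listed above; the underlying
// representation is a placeholder.
type EventKind int

const (
    EventChunk EventKind = iota
    EventEOF
    EventClosed
    EventToolStart
    EventToolDone
    EventToolError
    EventInterrupted
)

// Event is a placeholder payload; real events likely carry richer data.
type Event struct {
    Kind EventKind
    Text string // chunk text, tool name, or error message, depending on Kind
}

// handle shows how an application might react to each event kind while an
// agent (e.g. a ReActAgent) is running.
func handle(events <-chan Event) {
    for ev := range events {
        switch ev.Kind {
        case EventChunk:
            fmt.Print(ev.Text) // stream output as it arrives
        case EventEOF:
            fmt.Println("\n-- round ended --")
        case EventToolStart:
            fmt.Println("tool started:", ev.Text)
        case EventToolDone:
            fmt.Println("tool finished:", ev.Text)
        case EventToolError:
            fmt.Println("tool failed:", ev.Text)
        case EventInterrupted:
            fmt.Println("interrupted")
        case EventClosed:
            return // agent completed; stop consuming
        }
    }
}

func main() {
    ch := make(chan Event, 3)
    ch <- Event{Kind: EventChunk, Text: "Hello"}
    ch <- Event{Kind: EventEOF}
    ch <- Event{Kind: EventClosed}
    close(ch)
    handle(ch)
}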

Configuration (agentcfg)

YAML/JSON configuration for agents and tools:

type: react
name: assistant
prompt: |
  You are a helpful assistant.
generator:
  model: gpt-4
tools:
  - $ref: tool:search
  - $ref: tool:calculator

Supports $ref for reusable components.
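
To make the field layout concrete, the sketch below parses a document of this shape with the third-party gopkg.in/yaml.v3 package; the Go struct mirrors the keys shown above but is illustrative, not GenX's agentcfg schema, and the loader is not GenX's own.

package main

import (
    "fmt"
    "log"

    "gopkg.in/yaml.v3"
)

// AgentConfig mirrors the YAML keys shown above; it is illustrative,
// not GenX's actual agentcfg schema.
type AgentConfig struct {
    Type      string `yaml:"type"`
    Name      string `yaml:"name"`
    Prompt    string `yaml:"prompt"`
    Generator struct {
        Model string `yaml:"model"`
    } `yaml:"generator"`
    Tools []struct {
        Ref string `yaml:"$ref"` // reference to a reusable tool definition
    } `yaml:"tools"`
}

const doc = `
type: react
name: assistant
prompt: |
  You are a helpful assistant.
generator:
  model: gpt-4
tools:
  - $ref: tool:search
  - $ref: tool:calculator
`

func main() {
    var cfg AgentConfig
    if err := yaml.Unmarshal([]byte(doc), &cfg); err != nil {
        log.Fatal(err)
    }
    fmt.Println(cfg.Type, cfg.Name, cfg.Generator.Model, len(cfg.Tools)) // react assistant gpt-4 2
}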

Provider Support

  Provider          Go   Rust
  OpenAI
  Gemini
  Compatible APIs

Examples Directory

  • examples/go/genx/ - Go examples
  • examples/rust/genx/ - Rust examples