
Model Configuration

Configure different LLM providers for CUGA

CUGA supports multiple LLM providers with flexible configuration options. You can configure models through TOML files or override specific settings using environment variables.

Supported Platforms

  • OpenAI - GPT models via OpenAI API (also supports LiteLLM via base URL override)
  • IBM WatsonX - IBM's enterprise LLM platform
  • Azure OpenAI - Microsoft's Azure OpenAI service
  • Groq - Groq's LLM platform
  • OpenRouter - LLM API gateway provider

Option 1: OpenAI 🌐

Setup Instructions:

  1. Create an account at platform.openai.com
  2. Generate an API key from your API keys page
  3. Add to your .env file:
    # For OpenAI:
    AGENT_SETTING_CONFIG="settings.openai.toml"  # any settings file under ./configurations/models can be used
    OPENAI_API_KEY="XXXXXX"
    OPENAI_API_VERSION="2024-08-01-preview"

Default Values:

  • Model: gpt-4o
  • API version: OpenAI's default (override with OPENAI_API_VERSION)
  • Base URL: OpenAI's default endpoint

Environment Variable Override: You can override the model name using:

MODEL_NAME="gpt-4-turbo"
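The override precedence can be sketched in a few lines of Python; `resolve_model_name` is an illustrative helper, not part of CUGA's code:

```python
import os

def resolve_model_name(toml_default: str) -> str:
    # The MODEL_NAME environment variable, when set, takes
    # precedence over the default from the active TOML file.
    return os.environ.get("MODEL_NAME", toml_default)

os.environ["MODEL_NAME"] = "gpt-4-turbo"
print(resolve_model_name("gpt-4o"))  # → gpt-4-turbo
```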

Option 2: IBM WatsonX 🔵

Setup Instructions:

  1. Access IBM WatsonX
  2. Create a project and get your credentials:
    • Project ID
    • API Key
    • Region/URL
  3. Add to your .env file:
    # For WatsonX
    AGENT_SETTING_CONFIG="settings.watsonx.toml"  # any settings file under ./configurations/models can be used
    WATSONX_PROJECT_ID="XXXXXX"
    WATSONX_URL="https://us-south.ml.cloud.ibm.com"
    WATSONX_APIKEY="XXXXXX"

Default Values:

  • Model: meta-llama/llama-4-maverick-17b-128e-instruct-fp8

Environment Variable Override: You can override the model name using:

MODEL_NAME="meta-llama/llama-3.1-8b-instruct"

Option 3: Azure OpenAI

Setup Instructions:

  1. Add to your .env file:
    # For Azure OpenAI
    AGENT_SETTING_CONFIG="settings.azure.toml"  # any settings file under ./configurations/models can be used
    AZURE_OPENAI_API_KEY="XXXXXX"
    AZURE_OPENAI_ENDPOINT="XXXXXX"

Environment Variable Override: You can override the model name using:

MODEL_NAME="gpt-4o"

Option 4: Groq

Setup Instructions:

  1. Add to your .env file:
    # For Groq
    AGENT_SETTING_CONFIG="settings.groq.toml"
    GROQ_API_KEY="XXXX"
  2. Install the Groq dependency group:
    uv sync --group groq

Environment Variable Override: You can override the model name using:

MODEL_NAME="openai/gpt-oss-120b"

Option 5: OpenRouter

Setup Instructions:

  1. Create an account at openrouter.ai
  2. Generate an API key from your account settings
  3. Add to your .env file:
    # For OpenRouter
    AGENT_SETTING_CONFIG="settings.openrouter.toml"
    OPENROUTER_API_KEY="XXXX"
    OPENROUTER_BASE_URL="https://openrouter.ai/api/v1"

Default Values:

  • Base URL: https://openrouter.ai/api/v1
  • Model: Configurable via MODEL_NAME environment variable

Environment Variable Override: You can override the model name using:

MODEL_NAME="openai/gpt-4o"

Note: OpenRouter provides access to multiple model providers. Refer to OpenRouter's model documentation for available models and their pricing.
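Because OpenRouter exposes an OpenAI-compatible API, a request can be built with nothing but the standard library. The sketch below constructs (but does not send) a chat-completions request from the environment variables above; `openrouter_chat_request` is a hypothetical helper for illustration:

```python
import json
import os
import urllib.request

def openrouter_chat_request(prompt: str) -> urllib.request.Request:
    # Build (but do not send) a chat-completions request against
    # OpenRouter's OpenAI-compatible endpoint.
    base = os.environ.get("OPENROUTER_BASE_URL", "https://openrouter.ai/api/v1")
    body = json.dumps({
        "model": os.environ.get("MODEL_NAME", "openai/gpt-4o"),
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )

req = openrouter_chat_request("Hello")
print(req.full_url)
```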

LiteLLM Support

CUGA supports LiteLLM through the OpenAI configuration by overriding the base URL:

  1. Add to your .env file:
    # For LiteLLM:
    AGENT_SETTING_CONFIG="settings.openai.toml"  # any settings file under ./configurations/models can be used
    OPENAI_API_KEY="XXXXXX"
    OPENAI_BASE_URL="XXXXXX"

Environment Variable Override: You can override the model name using:

MODEL_NAME="gpt-4-turbo"
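The mechanism is simply base-URL resolution: when `OPENAI_BASE_URL` is set, OpenAI-compatible clients talk to that address (for example, a LiteLLM proxy) instead of the default endpoint. A minimal sketch, with a hypothetical local proxy address:

```python
import os

def resolve_openai_base_url() -> str:
    # When OPENAI_BASE_URL is set, OpenAI-compatible clients send
    # requests to that address (e.g. a LiteLLM proxy) instead of
    # the default OpenAI endpoint.
    return os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")

os.environ["OPENAI_BASE_URL"] = "http://localhost:4000"  # hypothetical LiteLLM proxy
print(resolve_openai_base_url())  # → http://localhost:4000
```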

Configuration Files

CUGA uses TOML configuration files located in src/cuga/configurations/models/:

  • settings.openai.toml - OpenAI configuration (also supports LiteLLM via base URL override)
  • settings.watsonx.toml - WatsonX configuration
  • settings.azure.toml - Azure OpenAI configuration
  • settings.groq.toml - Groq configuration
  • settings.openrouter.toml - OpenRouter configuration

Each file contains agent-specific model settings that can be overridden by environment variables.

Settings File Content Structure

Each settings file contains configuration for different CUGA agents. You can configure each agent independently with different models and parameters:

OpenAI Configuration Example

[agent.task_decomposition.model]
platform = "openai"
temperature = 0.1
max_tokens = 1000

[agent.planner.model]
platform = "openai"
temperature = 0.1
max_tokens = 5000

[agent.chat.model]
platform = "openai"
temperature = 0.1
max_tokens = 5000

[agent.shortlister.model]
platform = "openai"
temperature = 0.1
max_tokens = 7000

WatsonX Configuration Example

[agent.shortlister.model]
platform = "watsonx"
model_name = "meta-llama/llama-4-maverick-17b-128e-instruct-fp8"
temperature = 0.1
max_tokens = 7000

[agent.planner.model]
platform = "watsonx"
model_name = "meta-llama/llama-4-maverick-17b-128e-instruct-fp8"
temperature = 0.1
max_tokens = 5000

[agent.chat.model]
platform = "watsonx"
model_name = "meta-llama/llama-4-maverick-17b-128e-instruct-fp8"
temperature = 0.1
max_tokens = 5000

OpenRouter Configuration Example

[agent.task_decomposition.model]
platform = "openrouter"
temperature = 0.1
max_tokens = 1000

[agent.planner.model]
platform = "openrouter"
temperature = 0.1
max_tokens = 5000

[agent.chat.model]
platform = "openrouter"
temperature = 0.1
max_tokens = 5000

[agent.shortlister.model]
platform = "openrouter"
temperature = 0.1
max_tokens = 7000

🔄 Switching Between Providers

Method 1: Environment Variable

# Switch to OpenAI
export AGENT_SETTING_CONFIG="settings.openai.toml"

# Switch to WatsonX
export AGENT_SETTING_CONFIG="settings.watsonx.toml"

# Switch to Azure
export AGENT_SETTING_CONFIG="settings.azure.toml"

# Switch to Groq
export AGENT_SETTING_CONFIG="settings.groq.toml"

# Switch to OpenRouter
export AGENT_SETTING_CONFIG="settings.openrouter.toml"

Method 2: Edit .env File

# Edit .env file
AGENT_SETTING_CONFIG="settings.openai.toml"
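Editing the .env file amounts to replacing a single line. The sketch below shows that edit as a pure string transformation; `set_provider` is a hypothetical helper, not a CUGA utility:

```python
def set_provider(env_text: str, provider: str) -> str:
    # Rewrite the AGENT_SETTING_CONFIG line in a .env-style string,
    # appending it if it is not present yet.
    line = f'AGENT_SETTING_CONFIG="settings.{provider}.toml"'
    kept = [l for l in env_text.splitlines()
            if not l.startswith("AGENT_SETTING_CONFIG")]
    return "\n".join(kept + [line])

env = 'OPENAI_API_KEY="XXXXXX"\nAGENT_SETTING_CONFIG="settings.openai.toml"'
print(set_provider(env, "watsonx"))
```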

📚 Next Steps

After configuring your model:

  1. Environment Setup: Configure other system settings
  2. Customization Overview: Fine-tune CUGA behavior
  3. Advanced Usage: Explore advanced features

Model configured? Move to Environment Setup!