Codex and Windsurf use the TOML configuration format and support only system-level (global) configuration. The CLI automatically detects which variant you have installed.
Windsurf is a variant of Codex with enhanced features. The configuration is identical for both.
Configuration File
Location: ~/.codex/config.toml
model_provider = "megallm"
model = "gpt-5"
[model_providers.megallm]
name = "OpenAI using Chat Completions"
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY"
query_params = {}
[tools]
web_search = true
file_browser = true
Codex/Windsurf only supports system-level configuration. Project-level configuration is not available.
Environment Variable
The configuration references an environment variable for the API key:
export MEGALLM_API_KEY="sk-mega-your-api-key-here"
This is added to your shell configuration file:
~/.bashrc (bash)
~/.zshrc (zsh)
~/.config/fish/config.fish (fish)
PowerShell profile (Windows)
Verify Environment Variable
echo $MEGALLM_API_KEY
# Output: sk-mega-your-api-key-here
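In setup scripts, you can fail fast when the key is missing; a minimal sketch that works in any POSIX shell:
# Abort early if the key is not set
if [ -z "$MEGALLM_API_KEY" ]; then
  echo "MEGALLM_API_KEY is not set" >&2
  exit 1
fi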
Manual Setup
If you prefer not to use the CLI:
# 1. Create directory
mkdir -p ~/.codex
# 2. Create config file
cat > ~/.codex/config.toml << 'EOF'
model_provider = "megallm"
model = "gpt-5"
[model_providers.megallm]
name = "OpenAI using Chat Completions"
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY"
query_params = {}
[tools]
web_search = true
file_browser = true
EOF
# 3. Add environment variable to shell config
echo 'export MEGALLM_API_KEY="your-api-key"' >> ~/.bashrc
# 4. Reload shell
source ~/.bashrc
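Note that re-running step 3 appends a duplicate export line each time. If you script the setup, a small guard (a sketch assuming bash and ~/.bashrc) keeps it idempotent:
# Append the export only if no MEGALLM_API_KEY line exists yet
grep -q 'MEGALLM_API_KEY' ~/.bashrc || \
  echo 'export MEGALLM_API_KEY="your-api-key"' >> ~/.bashrc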
Configuration Options
Model Selection
Change the default model in the configuration:
model = "claude-opus-4-1-20250805" # or any supported model
Available models:
gpt-5 - Latest GPT model
gpt-4 - GPT-4
gpt-4o - GPT-4 Optimized
claude-opus-4-1-20250805 - Claude Opus
claude-sonnet-4 - Claude Sonnet
gemini-2.5-pro - Gemini Pro
See the Models Catalog for the full list.
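To confirm which model IDs your key can actually access, you can query the models endpoint. The sketch below assumes the response follows the OpenAI convention of a JSON object with a data array of models, each carrying an id field:
# List the model IDs available to your key
curl -s -H "Authorization: Bearer $MEGALLM_API_KEY" \
  https://ai.megallm.io/v1/models | \
  python3 -c "import json, sys; [print(m['id']) for m in json.load(sys.stdin)['data']]"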
Tools
Enable or disable integrated tools:
[tools]
web_search = true # Enable web search capability
file_browser = true # Enable file browser
terminal = true # Enable terminal access
code_execution = true # Enable code execution
Advanced Configuration
model_provider = "megallm"
model = "gpt-5"
temperature = 0.7
max_tokens = 4096
top_p = 0.9
[model_providers.megallm]
name = "OpenAI using Chat Completions"
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY"
query_params = {}
[tools]
web_search = true
file_browser = true
terminal = true
[ui]
theme = "dark"
font_size = 14
show_line_numbers = true
Multiple API Keys
If you need different API keys for different purposes:
Using Environment Variables
# Development key
export MEGALLM_API_KEY="sk-mega-dev-key"
# Production key
export MEGALLM_API_KEY_PROD="sk-mega-prod-key"
Switching Configurations
# Create backup of current config
cp ~/.codex/config.toml ~/.codex/config.toml.backup
# Development config
cat > ~/.codex/config.toml.dev << 'EOF'
model_provider = "megallm"
model = "gpt-4o-mini"
[model_providers.megallm]
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY_DEV"
EOF
# Production config
cat > ~/.codex/config.toml.prod << 'EOF'
model_provider = "megallm"
model = "gpt-5"
[model_providers.megallm]
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY_PROD"
EOF
# Switch to dev
cp ~/.codex/config.toml.dev ~/.codex/config.toml
# Switch to prod
cp ~/.codex/config.toml.prod ~/.codex/config.toml
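Copying the files by hand works, but a small shell function makes switching a one-word command. This is a sketch for ~/.bashrc or ~/.zshrc, assuming the .dev and .prod files created above:
# Switch the active Codex config between dev and prod
codex-config() {
  case "$1" in
    dev|prod)
      cp ~/.codex/config.toml."$1" ~/.codex/config.toml
      echo "Switched Codex config to $1"
      ;;
    *)
      echo "usage: codex-config {dev|prod}" >&2
      return 1
      ;;
  esac
}
Then codex-config dev or codex-config prod swaps the configuration before you start Codex.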
Windsurf-Specific Features
Windsurf includes additional configuration options:
model_provider = "megallm"
model = "gpt-5"
[model_providers.megallm]
name = "OpenAI using Chat Completions"
base_url = "https://ai.megallm.io/v1"
env_key = "MEGALLM_API_KEY"
query_params = {}
[windsurf]
cascade_mode = true # Enable Cascade AI feature
multi_file_edit = true # Allow editing multiple files
context_awareness = "enhanced" # enhanced, standard, minimal
[tools]
web_search = true
file_browser = true
terminal = true
supercomplete = true # Windsurf autocomplete feature
Verification
Check Configuration File
# View configuration
cat ~/.codex/config.toml
# Validate TOML syntax (if toml-cli installed)
toml-check ~/.codex/config.toml
# Check file permissions
ls -la ~/.codex/config.toml
Test API Connection
# Test API with your credentials
curl -H "Authorization: Bearer $MEGALLM_API_KEY " \
-H "Content-Type: application/json" \
https://ai.megallm.io/v1/models
# Should return list of available models
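Listing models only proves authentication. To verify an end-to-end completion, you can send a minimal chat request; this sketch assumes the endpoint accepts the OpenAI Chat Completions request format, as the provider name in the config suggests:
# Minimal smoke test (assumes an OpenAI-compatible chat endpoint)
curl -s https://ai.megallm.io/v1/chat/completions \
  -H "Authorization: Bearer $MEGALLM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5", "messages": [{"role": "user", "content": "Reply with: ok"}]}'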
Test Codex/Windsurf
# Run Codex/Windsurf
codex # or 'windsurf'
# Check version
codex --version # or 'windsurf --version'
Troubleshooting
Configuration file not found
Check if the directory exists and create it if missing:
mkdir -p ~/.codex
# Then create config.toml
Verify the file path:
# Should be exactly:
~/.codex/config.toml
# Not:
~/.config/codex/config.toml # Wrong location
API key not working
Check that the environment variable is set:
echo $MEGALLM_API_KEY
If it is empty:
# Add to shell config
echo 'export MEGALLM_API_KEY="your-key"' >> ~/.bashrc
source ~/.bashrc
Verify key format:
Must start with sk-mega-
At least 20 characters
No extra spaces or quotes
Test the key:
curl -H "Authorization: Bearer $MEGALLM_API_KEY" \
  https://ai.megallm.io/v1/models
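The format checks above can also be scripted; a minimal POSIX-shell sketch:
# Check the key's prefix and length
case "$MEGALLM_API_KEY" in
  sk-mega-*)
    if [ ${#MEGALLM_API_KEY} -ge 20 ]; then
      echo "Key format looks OK"
    else
      echo "Key is too short"
    fi
    ;;
  *)
    echo "Key does not start with sk-mega-"
    ;;
esac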
Wrong model provider being used
Check the config file:
grep model_provider ~/.codex/config.toml
# Should show: model_provider = "megallm"
Verify the base_url:
grep base_url ~/.codex/config.toml
# Should show: base_url = "https://ai.megallm.io/v1"
Ensure no typos:
model_provider = "megallm" # Correct
# model_provider = "megalm" # Wrong (typo)
# model_provider = "openai" # Wrong (different provider)
Validate syntax:
# If you have toml-cli installed
toml-check ~/.codex/config.toml
# Or use Python 3.11+ (tomllib is in the standard library; note that ~ must be expanded)
python3 -c "import tomllib, os; tomllib.load(open(os.path.expanduser('~/.codex/config.toml'), 'rb'))"
Common TOML mistakes:
# Wrong - missing quotes
model_provider = megallm
# Correct
model_provider = "megallm"
# Wrong - incorrect section syntax
model_providers.megallm
base_url = "..."
# Correct
[model_providers.megallm]
base_url = "..."
Codex/Windsurf not detecting configuration
Restart Codex/Windsurf:
# Close all instances
pkill codex # or 'pkill windsurf'
# Start fresh
codex # or 'windsurf'
Check for multiple config files:
find ~ -name "config.toml" -path "*/.codex/*"
# Should only show one file
Verify permissions:
chmod 644 ~/.codex/config.toml
Why System-Level Only?
Codex and Windsurf don’t support project-level configuration because:
Single Instance - Codex/Windsurf runs as a single instance across all projects
Global Settings - Tool preferences apply system-wide
Simplified Management - One configuration to manage
Workaround for Project-Specific Keys:
Use environment variables in your project:
# In project directory
cat > .env << 'EOF'
export MEGALLM_API_KEY=project-specific-key
EOF
# Load before running Codex
source .env && codex
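If you use this pattern often, a small wrapper function (a sketch; set -a exports every variable the .env file defines, whether or not it uses export) reduces it to one command:
# Load ./.env if present, then launch codex with those variables exported
codex-here() {
  if [ -f .env ]; then
    set -a
    . ./.env
    set +a
  fi
  codex
}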
Or create shell aliases:
# In ~/.bashrc or ~/.zshrc
alias codex-project-a='MEGALLM_API_KEY="key-for-project-a" codex'
alias codex-project-b='MEGALLM_API_KEY="key-for-project-b" codex'
Best Practices
Backup Configuration: Keep a backup of config.toml before making changes.
Use Environment Variables: Store API keys in environment variables, not in the config file.
Version Control: You can commit config.toml if env_key is used (no hardcoded keys).
Regular Updates: Keep Codex/Windsurf updated for the latest features.
Comparison: Codex vs Windsurf
Feature            | Codex     | Windsurf
Base Configuration | TOML      | TOML
MegaLLM Support    | Yes       | Yes
Config Location    | ~/.codex/ | ~/.codex/
Cascade AI         | No        | Yes
Supercomplete      | No        | Yes
Multi-file Edit    | Basic     | Enhanced
Both Codex and Windsurf use the same configuration file location and format.
Next Steps