
Configuration

Idep reads its config from (in order):

  1. ~/.config/idep/config.toml (XDG Base Dir spec, primary)
  2. ~/.idep/config.toml (fallback)

A starter config is included in the repo root as config.example.toml.
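The lookup order above can be sketched in shell. This is an illustrative sketch of the documented behavior, not Idep's actual implementation:

```shell
# Resolve the config path the way the docs describe: the XDG location first,
# then the legacy ~/.idep fallback.
config="${XDG_CONFIG_HOME:-$HOME/.config}/idep/config.toml"
if [ ! -f "$config" ]; then
  config="$HOME/.idep/config.toml"   # fallback location
fi
echo "using config: $config"
```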

[ai]
backend = "ollama" # ollama | anthropic | huggingface | openai
model = "codellama:13b"
endpoint = "http://localhost:11434" # optional — ollama and openai-compat only

[ai.auth]
api_key = "..." # optional — anthropic, huggingface, openai only
| Key | Type | Default | Description |
| --- | --- | --- | --- |
| ai.backend | string | (none) | Backend type: ollama, anthropic, huggingface, or openai |
| ai.model | string | (none) | Model identifier (e.g., codellama:13b, claude-3-5-sonnet-20241022) |
| ai.endpoint | string | http://localhost:11434 (ollama only) | API endpoint URL. Optional; used by ollama and openai-compat backends. |
| ai.auth.api_key | string | (none) | API key for anthropic, huggingface, and openai. Optional if the IDEP_API_KEY env var is set. |

The IDEP_API_KEY environment variable can supply the API key in place of ai.auth.api_key. If set, it overrides the config file value.

export IDEP_API_KEY="sk-ant-..."
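The precedence can be sketched as follows; file_key is a hypothetical stand-in for the value parsed from config.toml:

```shell
# IDEP_API_KEY, when set, wins over the api_key read from the config file.
file_key="sk-from-config-file"   # hypothetical value parsed from config.toml
if [ -n "${IDEP_API_KEY:-}" ]; then
  api_key="$IDEP_API_KEY"        # environment variable takes precedence
else
  api_key="$file_key"            # otherwise fall back to the config file
fi
echo "$api_key"
```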

Idep follows the XDG Base Directory specification:

  • Config home: $XDG_CONFIG_HOME/idep/config.toml (defaults to ~/.config/idep/config.toml)
  • Data home: $XDG_DATA_HOME/idep/ (defaults to ~/.local/share/idep/)
  • Cache home: $XDG_CACHE_HOME/idep/ (defaults to ~/.cache/idep/)

Override by setting XDG_CONFIG_HOME, XDG_DATA_HOME, or XDG_CACHE_HOME environment variables.
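For example, to keep all of Idep's state under a single custom root (the directory names here are illustrative):

```shell
# Relocate config, data, and cache; Idep will then read
# $HOME/idep-root/config/idep/config.toml, and so on for data and cache.
export XDG_CONFIG_HOME="$HOME/idep-root/config"
export XDG_DATA_HOME="$HOME/idep-root/data"
export XDG_CACHE_HOME="$HOME/idep-root/cache"
```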

| Backend | API Key Required | Cloud Dependency | Best For |
| --- | --- | --- | --- |
| Ollama | No | None | Local, private, offline |
| Anthropic | Yes | Moderate | Quality completions, Claude models |
| HuggingFace | Yes | Moderate | Open models, flexible hosting |
| OpenAI-compat | Yes | Moderate–High | Flexibility, Groq speed, LM Studio |
Ollama (local):

[ai]
backend = "ollama"
model = "codellama:13b"
endpoint = "http://localhost:11434"

Anthropic:

[ai]
backend = "anthropic"
model = "claude-3-5-sonnet-20241022"

[ai.auth]
api_key = "sk-ant-..."

HuggingFace:

[ai]
backend = "huggingface"
model = "bigcode/starcoder2-15b"

[ai.auth]
api_key = "hf_..."

OpenAI-compatible (Groq shown here):

[ai]
backend = "openai"
model = "mixtral-8x7b-32768"
endpoint = "https://api.groq.com/openai/v1"

[ai.auth]
api_key = "gsk_..."

OpenAI:

[ai]
backend = "openai"
model = "gpt-4o-mini"

[ai.auth]
api_key = "sk-..."

(Omit endpoint to use OpenAI’s default: https://api.openai.com/v1)