# Configuration
## Config File Location

Idep reads its config from the following locations, in order:

1. `~/.config/idep/config.toml` (XDG Base Directory spec, primary)
2. `~/.idep/config.toml` (fallback)

A starter config is included in the repo root as `config.example.toml`.
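To get started, one option is to copy the starter config into the primary location (this assumes you have cloned the repo and are running from its root):

```shell
# Create the XDG config directory and copy the starter config into it
mkdir -p ~/.config/idep
cp config.example.toml ~/.config/idep/config.toml
```

Edit the copied file to pick a backend and model before running Idep.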
## Configuration Structure

```toml
[ai]
backend = "ollama"                   # ollama | anthropic | huggingface | openai
model = "codellama:13b"
endpoint = "http://localhost:11434"  # optional — ollama and openai-compat only

[ai.auth]
api_key = "..."                      # optional — anthropic, huggingface, openai only
```
## Config Reference

| Key | Type | Default | Description |
|---|---|---|---|
| `ai.backend` | string | — | Backend type: `ollama`, `anthropic`, `huggingface`, `openai` |
| `ai.model` | string | — | Model identifier (e.g., `codellama:13b`, `claude-3-5-sonnet-20241022`) |
| `ai.endpoint` | string | `http://localhost:11434` (ollama only) | API endpoint URL. Optional for ollama and openai-compat. |
| `ai.auth.api_key` | string | — | API key for anthropic, huggingface, openai. Optional if the `IDEP_API_KEY` env var is set. |
## Environment Variables

### IDEP_API_KEY

Alternative to `ai.auth.api_key`. If set, the environment variable overrides the config file value.

```sh
export IDEP_API_KEY="sk-ant-..."
```
## XDG Base Directory Spec

Idep follows the XDG Base Directory specification:

- Config home: `$XDG_CONFIG_HOME/idep/config.toml` (defaults to `~/.config/idep/config.toml`)
- Data home: `$XDG_DATA_HOME/idep/` (defaults to `~/.local/share/idep/`)
- Cache home: `$XDG_CACHE_HOME/idep/` (defaults to `~/.cache/idep/`)

Override these by setting the `XDG_CONFIG_HOME`, `XDG_DATA_HOME`, or `XDG_CACHE_HOME` environment variables.
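For example, to point Idep at a different config location for the current shell session (the `~/dotfiles/config` path here is purely illustrative):

```shell
# Idep will then look for $XDG_CONFIG_HOME/idep/config.toml,
# i.e. ~/dotfiles/config/idep/config.toml in this example
export XDG_CONFIG_HOME="$HOME/dotfiles/config"
```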
## Backend Comparison

| Backend | API Key Required | Cloud Dependency | Best For |
|---|---|---|---|
| Ollama | No | None | Local, private, offline |
| Anthropic | Yes | Moderate | Quality completions, Claude models |
| HuggingFace | Yes | Moderate | Open models, flexible hosting |
| OpenAI-compat | Yes | Moderate–High | Flexibility, Groq speed, LM Studio |
## Example Configs

### Ollama (Local)

```toml
[ai]
backend = "ollama"
model = "codellama:13b"
endpoint = "http://localhost:11434"
```
### Anthropic

```toml
[ai]
backend = "anthropic"
model = "claude-3-5-sonnet-20241022"

[ai.auth]
api_key = "sk-ant-..."
```
### HuggingFace

```toml
[ai]
backend = "huggingface"
model = "bigcode/starcoder2-15b"

[ai.auth]
api_key = "hf_..."
```
### OpenAI-compatible (Groq)

```toml
[ai]
backend = "openai"
model = "mixtral-8x7b-32768"
endpoint = "https://api.groq.com/openai/v1"

[ai.auth]
api_key = "gsk_..."
```
### OpenAI Direct

```toml
[ai]
backend = "openai"
model = "gpt-4o-mini"

[ai.auth]
api_key = "sk-..."
```

Omit `endpoint` to use OpenAI's default: `https://api.openai.com/v1`.