Getting Started
Prerequisites
- Rust (stable), installed via rustup
- Cargo — included with Rust
- Git
- Optional: Ollama for local AI inference (recommended for first run)
Idep pins its Rust version via rust-toolchain.toml in the repo root. rustup will auto-install the correct version on first cargo build.
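For reference, such a pin is a small TOML file in the repo root. The sketch below shows the general shape; the channel value is illustrative, not the repo's actual pin:

```toml
# rust-toolchain.toml (illustrative; see the repo root for the real pin)
[toolchain]
channel = "1.75.0"  # hypothetical version; rustup installs whatever the repo specifies
```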
Clone and Build
```shell
git clone https://github.com/idep-editor/idep
cd idep
cargo build --release
```

A successful build means all crates compile and the binary is ready at target/release/idep.
Configure Your Backend
Create the config directory and file:
```shell
mkdir -p ~/.config/idep
cp config.example.toml ~/.config/idep/config.toml
```

Edit ~/.config/idep/config.toml to select your AI backend. See Configuration for all options.
Ollama (Local, No API Key)
Best for: privacy, offline use, and no ongoing cost.
```toml
[ai]
backend = "ollama"
model = "codellama:13b"
endpoint = "http://localhost:11434"
```

Install Ollama from ollama.com, then pull a model:
```shell
ollama pull codellama:13b
```

Recommended models: codellama:13b, deepseek-coder:6.7b, starcoder2:15b.
Anthropic
Best for: highest-quality completions with Claude models.
```toml
[ai]
backend = "anthropic"
model = "claude-3-5-sonnet-20241022"

[ai.auth]
api_key = "sk-ant-..."
```

Or set the environment variable:

```shell
export IDEP_API_KEY="sk-ant-..."
```

HuggingFace
Best for: open models and flexible hosting.
```toml
[ai]
backend = "huggingface"
model = "bigcode/starcoder2-15b"

[ai.auth]
api_key = "hf_..."
```

OpenAI-compatible
Best for: Groq, Together AI, LM Studio, or any other OpenAI-compatible endpoint.
```toml
[ai]
backend = "openai"
model = "gpt-4o-mini"
endpoint = "https://api.groq.com/openai/v1"

[ai.auth]
api_key = "gsk_..."
```

The endpoint field accepts any OpenAI-compatible base URL. Omit endpoint to use OpenAI directly (https://api.openai.com/v1).
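For example, a config that talks to OpenAI itself simply drops the endpoint line (the model name here is illustrative):

```toml
[ai]
backend = "openai"
model = "gpt-4o-mini"
# no endpoint key: defaults to https://api.openai.com/v1

[ai.auth]
api_key = "sk-..."
```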
Verify It Works
```shell
cargo build --release
./target/release/idep --version
```

Expected output:
```
idep 0.0.2
```

If you configured Ollama, start the server in another terminal:
```shell
ollama serve
```

Then run a completion test. The exact command depends on UI availability in v0.0.2; check TODO.md for current status.
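In the meantime, one way to confirm the Ollama side is up is to probe its model-listing endpoint. This sketch assumes curl is available and the default port from the config above:

```shell
# Probe the local Ollama server on its default port (11434).
# /api/tags lists pulled models; a successful response means the server is up.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  status="reachable"
else
  status="not reachable"
fi
echo "Ollama server is $status"
```

If the server is not reachable, start it with ollama serve and re-run the check.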