AI Settings
Configure how Cadeo's AI features work in Settings > AI.

Local AI (Ollama) — Desktop Only
Run AI models on your machine for complete privacy.
Available Models
| Model | RAM Required | Notes |
|---|---|---|
| DeepSeek R1 7B | 16 GB minimum | Recommended for most users |
| DeepSeek R1 14B | 16 GB minimum | Better quality |
| DeepSeek R1 32B | 32 GB minimum | Best quality |
Cadeo auto-detects your Mac's RAM and:
- Disables models that require more RAM than available
- Shows a Recommended badge on the best-fit model
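The RAM gating above can be sketched as follows. This is a minimal illustration, not Cadeo's actual code: the model list mirrors the table above, and the rule for the Recommended badge (the smallest model that fits, per the table's "Recommended for most users" note) is an assumption.

```typescript
import * as os from "node:os";

interface ModelSpec {
  name: string;
  minRamGb: number; // minimum RAM from the table above
}

const MODELS: ModelSpec[] = [
  { name: "DeepSeek R1 7B", minRamGb: 16 },
  { name: "DeepSeek R1 14B", minRamGb: 16 },
  { name: "DeepSeek R1 32B", minRamGb: 32 },
];

// Models needing more RAM than the machine has are disabled; the
// Recommended badge goes to the first (smallest) enabled model.
// The tie-break rule here is illustrative.
function classifyModels(totalRamGb: number) {
  const enabled = MODELS.filter((m) => m.minRamGb <= totalRamGb);
  return {
    enabled: enabled.map((m) => m.name),
    recommended: enabled[0]?.name ?? null,
  };
}

// Detect this machine's RAM (Node's os module as a stand-in).
const totalRamGb = Math.round(os.totalmem() / 1024 ** 3);
console.log(classifyModels(totalRamGb));
```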
Managing Models
- Download — fetch a model, with a progress bar showing percent complete
- Delete — remove installed models to free disk space
- Status — shows install size in GB
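For the download progress bar, Ollama's `POST /api/pull` endpoint streams newline-delimited JSON chunks; once layer downloads begin, chunks carry `completed` and `total` byte counts. A sketch of turning one such chunk into the percentage shown on the bar (the chunk shape follows Ollama's API; the helper itself is illustrative):

```typescript
// One streamed chunk from Ollama's POST /api/pull.
interface PullChunk {
  status: string;
  completed?: number; // bytes downloaded so far
  total?: number;     // total bytes for the current layer
}

// Map a chunk to a 0-100 percentage, or null when no byte counts
// are available yet (e.g. "pulling manifest" or verification steps).
function pullProgress(chunk: PullChunk): number | null {
  if (chunk.completed == null || chunk.total == null || chunk.total === 0) {
    return null;
  }
  return Math.floor((chunk.completed / chunk.total) * 100);
}

console.log(pullProgress({ status: "pulling", completed: 512, total: 1024 })); // → 50
```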
Ollama Status
Shows whether the AI engine is running or offline, with version number and a Refresh button.
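A status check like this can be sketched against Ollama's `GET /api/version` endpoint on its default local port (11434). The function below is an assumption about how such a check might look, not Cadeo's implementation:

```typescript
// Probe the local Ollama engine; a failed request means offline.
async function ollamaStatus(
  baseUrl = "http://localhost:11434",
): Promise<{ running: boolean; version?: string }> {
  try {
    const res = await fetch(`${baseUrl}/api/version`);
    if (!res.ok) return { running: false };
    const body = (await res.json()) as { version: string };
    return { running: true, version: body.version };
  } catch {
    return { running: false }; // engine not running or not installed
  }
}
```

The Refresh button would simply re-run this probe and update the displayed version.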
Cloud AI
Uses Anthropic's cloud AI for processing.
- Free tier: 10 requests per day — usage bar shows remaining requests
- BYOK (Bring Your Own Key): Unlimited requests with your own API key
- Usage bar turns orange at 90% or more of the daily quota
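The free-tier usage bar reduces to a small calculation over the 10-request daily quota stated above; the function below is an illustrative sketch:

```typescript
const DAILY_QUOTA = 10; // free-tier requests per day, from the text above

// Remaining requests and the bar color: orange at 90%+ consumption.
function usageBar(used: number) {
  const pct = Math.min(100, (used / DAILY_QUOTA) * 100);
  return {
    remaining: Math.max(0, DAILY_QUOTA - used),
    color: pct >= 90 ? "orange" : "default",
  };
}
```

With a BYOK key configured, the quota check would be bypassed entirely.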
Provider Preference
Choose your preferred AI provider:
- Local first — use Ollama, fall back to cloud if unavailable
- Cloud with local fallback — prefer cloud, use local if offline
- Cloud only — always use cloud AI
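The three preferences above amount to a simple fallback order. A sketch, where the preference names, availability flags, and `null` (no provider reachable) behavior are illustrative assumptions:

```typescript
type Preference = "local-first" | "cloud-with-local-fallback" | "cloud-only";
type Provider = "local" | "cloud";

// Resolve which provider handles a request given the user's preference
// and current availability of each backend.
function resolveProvider(
  pref: Preference,
  localAvailable: boolean, // Ollama running with a model installed
  online: boolean,         // cloud AI reachable
): Provider | null {
  switch (pref) {
    case "local-first":
      return localAvailable ? "local" : online ? "cloud" : null;
    case "cloud-with-local-fallback":
      return online ? "cloud" : localAvailable ? "local" : null;
    case "cloud-only":
      return online ? "cloud" : null;
  }
}
```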
Data Privacy
- Save AI conversations locally only — when enabled, AI chat history stays on your device and doesn't sync to the cloud