AI Settings

Configure how Cadeo's AI features work in Settings > AI.

Local AI (Ollama) — Desktop Only

Run AI models on your machine for complete privacy.

Available Models

Model           | RAM Required  | Notes
DeepSeek R1 7B  | 16 GB minimum | Recommended for most users
DeepSeek R1 14B | 16 GB minimum | Better quality
DeepSeek R1 32B | 32 GB minimum | Best quality

Cadeo auto-detects your Mac's RAM and:

  • Disables models that require more RAM than available
  • Shows a Recommended badge on the best-fit model
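The gating described above can be sketched as a small pure function. This is an assumed reading of the behavior, not Cadeo's actual implementation; in particular, "best-fit" is interpreted here as the largest model the machine can run.

```python
# Model names and RAM floors are taken from the table in this section.
MODELS = [
    ("DeepSeek R1 7B", 16),
    ("DeepSeek R1 14B", 16),
    ("DeepSeek R1 32B", 32),
]

def gate_models(installed_ram_gb: int) -> list[dict]:
    """Disable models that need more RAM than is installed; badge the best fit."""
    rows = [
        {"model": name, "enabled": installed_ram_gb >= need, "recommended": False}
        for name, need in MODELS
    ]
    # Assumption: the "Recommended" badge goes to the largest model that fits.
    fitting = [r for r in rows if r["enabled"]]
    if fitting:
        fitting[-1]["recommended"] = True
    return rows
```

On a 16 GB Mac this enables the 7B and 14B models, disables 32B, and badges 14B.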

Managing Models

  • Download — fetch a model, with a progress bar showing percentage complete
  • Delete — remove installed models to free disk space
  • Status — shows install size in GB

Ollama Status

Shows whether the AI engine is running or offline, with version number and a Refresh button.

Cloud AI

Uses Anthropic's cloud AI for processing.

  • Free tier: 10 requests per day — usage bar shows remaining requests
  • BYOK (Bring Your Own Key): Unlimited requests with your own API key
  • Usage bar turns orange at 90%+ consumption
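The free-tier accounting above reduces to simple arithmetic. This sketch uses the 10-requests-per-day cap and the 90% orange threshold stated in this section; the return shape and colour names are illustrative.

```python
FREE_TIER_DAILY_LIMIT = 10  # free-tier cap from this section

def usage_state(used: int, limit: int = FREE_TIER_DAILY_LIMIT) -> dict:
    """Remaining requests plus the warning colour the usage bar would show."""
    remaining = max(limit - used, 0)
    pct = used * 100 / limit
    return {"remaining": remaining, "color": "orange" if pct >= 90 else "default"}
```

At 9 of 10 requests (90%) the bar turns orange; at 8 it is still the default colour.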

Provider Preference

Choose your preferred AI provider:

  • Local first — use Ollama, fall back to cloud if unavailable
  • Cloud with local fallback — prefer cloud, use local if offline
  • Cloud only — always use cloud AI

Data Privacy

  • Save AI conversations locally only — when enabled, AI chat history stays on your device and doesn't sync to the cloud