Getting Started
Choose Your Adventure
Quilltap talks to AI models, but which models and where they run is entirely up to you. Pick the path that suits your situation — you can always add the other later.
Run Models Locally
Install Ollama and run open-source models on your own hardware. Free forever, completely private, works offline — no API key, no account, no bill.
Ollama Setup

Use a Cloud Provider
Connect to OpenAI, Anthropic, Google Gemini, OpenRouter, Grok, and more. Nothing to install — just an API key and you're off to the races.
Cloud Provider Setup

Not sure? Start with Ollama if you want zero cost and full privacy. Go with a cloud provider if you want access to the most powerful models without worrying about hardware. Many users end up using both — a local model for everyday chats and background tasks, and a cloud model for the heavy lifting.