Lowkey is a local-first AI chat app that runs entirely on your machine. No cloud. No tracking. No bullshit.
Runs fully offline using local models like Qwen and Gemma.
Runs models directly on your hardware
No internet required after setup
Tokens stream as they're generated, so replies appear immediately
Qwen, Gemma, LLaMA, and other GGUF models
Just install and use
Windows & Linux
What is Lowkey?
Lowkey is an offline, local-first AI chat application that runs entirely on your computer using local language models.
Is Lowkey free?
Yes. Lowkey is completely free and does not require subscriptions or accounts.
Does Lowkey need an internet connection?
Only for the initial model download. After setup, Lowkey works fully offline.
Is my data sent anywhere?
No. All conversations stay on your machine. No data is uploaded, logged, or tracked.
Does Lowkey collect analytics?
No analytics, no tracking, no telemetry. Your usage stays private.
Where are my chats stored?
Chats are stored locally on your device inside the app’s data directory.
Which models does Lowkey support?
Lowkey supports GGUF models such as Qwen and Gemma, along with other llama.cpp-compatible models.
Can I use my own models?
Yes. You can place your own GGUF models in the models directory.
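Adding a model is just a file copy. The sketch below simulates that step with temporary directories so it runs anywhere; the file name and the assumption that the app picks up any `*.gguf` file in its models directory are illustrative, not confirmed details of Lowkey (check the app's settings for the real path):

```python
import shutil
import tempfile
from pathlib import Path

# Simulated locations so this sketch runs anywhere; Lowkey's real
# models directory lives inside its own data directory (assumption).
downloads = Path(tempfile.mkdtemp())
models_dir = Path(tempfile.mkdtemp()) / "models"
models_dir.mkdir(parents=True)

# Stand-in for a GGUF file you downloaded (placeholder bytes, not a real model).
model_file = downloads / "qwen2.5-1.5b-instruct-q4_k_m.gguf"
model_file.write_bytes(b"GGUF")

# Drop the file into the models directory; the assumption here is that
# the app scans this folder for *.gguf files on startup.
shutil.copy2(model_file, models_dir / model_file.name)
print(sorted(p.name for p in models_dir.glob("*.gguf")))
```

Real GGUF files range from hundreds of megabytes to many gigabytes, so prefer a move over a copy if disk space is tight.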
Do responses stream in real time?
Yes. Responses are streamed token-by-token for a fast, responsive experience.
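Token-by-token streaming just means the UI renders each piece of the reply as the model emits it, rather than waiting for the full response. A minimal Python sketch of the pattern — the generator below is a stand-in for the model, not Lowkey's actual inference code:

```python
import time

def stream_tokens(prompt: str):
    """Illustrative stand-in for a local model: yields tokens one at a time.
    A real backend (e.g. llama.cpp) produces tokens incrementally in the
    same way; here the list and delay just simulate generation."""
    for token in ["Local ", "models ", "can ", "stream ", "too."]:
        time.sleep(0.01)  # simulate per-token generation time
        yield token

# The UI appends each token as it arrives instead of waiting for the full reply.
reply = ""
for token in stream_tokens("hello"):
    print(token, end="", flush=True)
    reply += token
print()
```

The key design point is that the consumer never blocks on the whole answer: each token is usable the moment it is yielded.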
Do I need a GPU?
No. Lowkey works on CPU, and GPU acceleration is used when available.
Will it run on a low-end machine?
Honestly? Your potato PC is also cool.
Is Lowkey open source?
Yes. Lowkey is open source and available on GitHub.
Can I contribute?
Please do, I'd love some help! Check out the GitHub repo for contribution guidelines.