Originally published on Dev.to
Most AI assistants today depend heavily on the cloud: your data leaves your device, travels to external servers, and you rely on third-party APIs.
I wanted to explore a different approach.
So I built CrustAI — a self-hosted AI assistant that runs entirely on your machine, powered by local LLMs through Ollama, and integrated in real time with Telegram, WhatsApp, Discord and Slack.
Key ideas behind the project:
- Total privacy (no data leaves your machine)
- Real-time messaging integrations
- Long-term memory between conversations
- Offline speech-to-text and text-to-speech
- Extensible Node.js architecture with REST API
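The local-LLM piece of the list above rests on Ollama's HTTP API, which listens on `localhost` by default. As a minimal sketch (not CrustAI's actual code), a Node.js service can talk to it with a plain `fetch` call; the model name `llama3` and the helper names here are my own illustrative choices:

```javascript
// Minimal sketch of querying a local Ollama server from Node.js (v18+,
// which ships a global `fetch`). Assumes a default Ollama install
// listening on localhost:11434; nothing leaves the machine.

const OLLAMA_URL = "http://localhost:11434/api/generate";

// Build the JSON payload Ollama's /api/generate endpoint expects.
function buildPrompt(model, prompt) {
  return { model, prompt, stream: false }; // stream: false → one JSON reply
}

// Send a prompt to the local model and return the completion text.
async function askLocalLLM(model, prompt) {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildPrompt(model, prompt)),
  });
  const data = await res.json();
  return data.response; // Ollama puts the completion in `response`
}
```

Because the endpoint is just HTTP on localhost, the same pattern extends naturally to a REST API in front of it.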
This is not theoretical. You can clone the repo and run it today.
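To make the messaging side concrete, here is a hedged sketch of how a Telegram bridge can work using nothing but Telegram's raw Bot API over HTTPS (long polling via `getUpdates`, replies via `sendMessage`). The env var name and the `handleMessage` routing are illustrative assumptions, not CrustAI's actual wiring:

```javascript
// Sketch: bridging Telegram messages to a local responder via the raw
// Bot API. Assumes a bot token in TELEGRAM_BOT_TOKEN (illustrative name).

const TOKEN = process.env.TELEGRAM_BOT_TOKEN;
const API = `https://api.telegram.org/bot${TOKEN}`;

// Decide what to send back; in a real assistant this is where the
// local-LLM call would go. Placeholder logic for the sketch.
function handleMessage(text) {
  if (!text) return "Send me some text.";
  return `Local model would answer: ${text}`;
}

// Long-poll getUpdates and answer each incoming text message.
async function poll(offset = 0) {
  const res = await fetch(`${API}/getUpdates?timeout=30&offset=${offset}`);
  const { result } = await res.json();
  for (const update of result) {
    const msg = update.message;
    if (msg && msg.text) {
      await fetch(`${API}/sendMessage`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          chat_id: msg.chat.id,
          text: handleMessage(msg.text),
        }),
      });
    }
    offset = update.update_id + 1; // acknowledge processed updates
  }
  return poll(offset); // keep polling
}
```

Swapping `getUpdates` polling for a webhook, or the Bot API for Discord/Slack SDKs, changes the transport but not the shape of the bridge.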
GitHub repository:
https://github.com/DaveSimoes/CrustAI
I’d love to hear thoughts from the community.