I built a desktop AI agent that my 60+ year-old mom can use: .exe/.dmg installers, Ollama support, 300MB idle RAM
A designer-built desktop AI agent installs in seconds, works fully local with Ollama, and uses just 300MB idle RAM.
A designer, frustrated by the technical hurdles of running local AI agents, has launched Skales—a native desktop application that installs like any normal software. Built by yaboyskales as an Electron app, Skales eliminates the need for Docker, terminal commands, or complex setup. It downloads as a .exe for Windows or .dmg for Mac, allowing non-technical users like his 60+ year-old mother and 6-year-old son to install and run it instantly. The app supports fully local inference via Ollama or connects to cloud providers like OpenAI, Claude, and Gemini, keeping all data stored locally in ~/.skales-data.
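To make "fully local inference via Ollama" concrete: Ollama runs an HTTP server on localhost (port 11434 by default), so a desktop app can talk to it with a plain POST request and no data ever leaves the machine. The sketch below is illustrative, not Skales's actual code; the function name and request shape are assumptions based on Ollama's public `/api/generate` endpoint.

```typescript
// Hypothetical sketch of how a desktop app might call a local Ollama server.
// Ollama listens on http://localhost:11434 by default; this is not Skales's code.

interface OllamaRequest {
  url: string;
  body: { model: string; prompt: string; stream: boolean };
}

// Build a non-streaming generate request for the local Ollama endpoint.
function buildOllamaRequest(model: string, prompt: string): OllamaRequest {
  return {
    url: "http://localhost:11434/api/generate",
    body: { model, prompt, stream: false },
  };
}

// Usage (requires a running Ollama instance, so the network call is commented out):
// const req = buildOllamaRequest("llama3", "Summarize my inbox");
// const res = await fetch(req.url, { method: "POST", body: JSON.stringify(req.body) });
```

Because the request is ordinary HTTP to localhost, swapping between a local model and a cloud provider is just a matter of changing the URL and adding an API key.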
Skales functions as an autonomous desktop agent with ReAct reasoning, bi-temporal memory, and browser automation via Playwright. It features a persistent desktop buddy interface, multi-agent group chats where different models can debate, and native integrations with Gmail, Telegram, WhatsApp, and Google Calendar. Despite its capabilities, the app maintains a lightweight footprint, idling at around 300MB of RAM. It's source-available under the BSL-1.1 license, free for personal use, with the code published to prevent commercial reselling by large companies. The project represents a significant step toward democratizing powerful AI agent technology for everyday users.
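The ReAct pattern mentioned above interleaves a model's reasoning ("thought") with tool calls ("action") and feeds the results ("observation") back in, looping until the agent decides it is done. A minimal sketch, with the model stubbed out as a scripted policy and the tool names purely illustrative (this is the general pattern, not Skales's implementation):

```typescript
// Minimal ReAct-style loop: thought -> action -> observation, until "finish".
// The steps here are scripted; in a real agent an LLM would produce each step.

type Tool = (input: string) => string;

interface Step {
  thought: string; // the agent's reasoning for this step
  action: string;  // name of a tool to call, or "finish"
  input: string;   // tool input, or the final answer when finishing
}

// Execute steps against a tool registry, collecting observations,
// until the agent emits a "finish" action carrying the final answer.
function runReAct(steps: Step[], tools: Record<string, Tool>): string {
  const observations: string[] = [];
  for (const step of steps) {
    if (step.action === "finish") return step.input;
    const tool = tools[step.action];
    observations.push(tool ? tool(step.input) : `unknown tool: ${step.action}`);
  }
  return observations.join("; ");
}

// Toy run: a calculator tool, then a finishing step.
const tools: Record<string, Tool> = {
  calc: (expr) => String(Function(`"use strict"; return (${expr})`)()),
};
const answer = runReAct(
  [
    { thought: "I need 6 * 7", action: "calc", input: "6 * 7" },
    { thought: "I have the result", action: "finish", input: "42" },
  ],
  tools,
);
```

In a real agent the same loop would dispatch to tools like a Playwright browser session or a Gmail client instead of a toy calculator.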
- Installs like normal software (.exe/.dmg) with no Docker or terminal commands required, built for non-technical users.
- Supports fully local AI via Ollama or cloud APIs (OpenAI, Claude, Gemini), idling at ~300MB RAM with local data storage.
- Features include a desktop buddy, ReAct autopilot, multi-agent chats, and native app integrations (Gmail, Telegram, WhatsApp, Google Calendar).
Why It Matters
By packaging an AI agent as ordinary installable software, Skales dramatically lowers the barrier to entry, putting agent technology within reach of non-technical individuals and small teams.