I built "Gloss" -- a local-first, privacy-focused NotebookLM alternative in Rust. Features hybrid search, local model support, and explicit RAG control.
Developer builds open-source research workspace with hybrid search, local AI models, and fully transparent RAG.
A developer has launched Gloss, an open-source, local-first research workspace built in Rust as a direct alternative to Google's NotebookLM. The project addresses key concerns with commercial offerings: black-box architectures, data-privacy risks, and forced reliance on proprietary APIs. Rather than a thin wrapper around a hosted model, Gloss is a complete, transparent RAG (retrieval-augmented generation) environment in which users can audit the entire retrieval path. Its core is a custom semantic-memory crate implementing hybrid search: HNSW (hierarchical navigable small world) indexing for dense vector similarity, combined with TF-IDF/BM25 scoring for exact keyword matching.
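The summary does not say how Gloss merges its dense and lexical rankings, but a common technique for combining an HNSW result list with a BM25 result list is reciprocal rank fusion (RRF). The sketch below is illustrative only; the function name `rrf` and the constant `k = 60` are assumptions, not Gloss's actual API.

```rust
use std::collections::HashMap;

/// Reciprocal rank fusion: merge a dense (e.g. HNSW) ranking and a
/// lexical (e.g. BM25) ranking into one scored list. Each document
/// earns 1 / (k + rank) from every list it appears in; `k` damps the
/// influence of top-ranked items so neither retriever dominates.
fn rrf(dense: &[&str], lexical: &[&str], k: f64) -> Vec<(String, f64)> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for ranking in [dense, lexical] {
        for (rank, doc) in ranking.iter().enumerate() {
            // Ranks are 1-based in the classic RRF formula.
            *scores.entry(doc.to_string()).or_insert(0.0) += 1.0 / (k + rank as f64 + 1.0);
        }
    }
    let mut fused: Vec<(String, f64)> = scores.into_iter().collect();
    fused.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    fused
}

fn main() {
    // "doc_b" ranks high in both lists, so fusion should surface it first.
    let dense = ["doc_a", "doc_b", "doc_c"];
    let lexical = ["doc_b", "doc_d", "doc_a"];
    for (doc, score) in rrf(&dense, &lexical, 60.0) {
        println!("{doc}: {score:.4}");
    }
}
```

A hybrid like this covers both failure modes the article implies: vector search catches paraphrases that keyword matching misses, while BM25 catches exact identifiers and rare terms that embeddings blur together.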
Gloss emphasizes user control and transparency. It supports local inference with models like Mistral, Llama 3, and Qwen via a local server, while also offering optional API integrations. The system strictly adheres to defined context constraints, allowing users to see exactly which sources are cited and why. The interface features a clean, three-panel split (Sources, Chat, Studio) designed for inspecting evidence alongside AI generation. The project is available on GitHub for community testing and feedback.
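One way to make "strict adherence to defined context constraints" concrete is to build the prompt only from explicitly selected, index-tagged source chunks, so every citation the model emits can be traced to an exact passage. The sketch below is a hypothetical illustration of that idea; the `SourceChunk` type, `build_prompt` function, and character-budget cutoff are assumptions, not Gloss's real internals.

```rust
/// Hypothetical source chunk as it might appear in the Sources panel.
struct SourceChunk {
    doc: &'static str,
    text: &'static str,
}

/// Assemble a prompt containing ONLY the user-selected chunks, each
/// numbered so answers can cite "[1]", "[2]", ... verifiably. Chunks
/// that would exceed the context budget are dropped, never truncated
/// silently, keeping the retrieval path auditable.
fn build_prompt(question: &str, chunks: &[SourceChunk], max_chars: usize) -> String {
    let mut prompt = String::from("Answer using ONLY the numbered sources below.\n\n");
    for (i, c) in chunks.iter().enumerate() {
        let entry = format!("[{}] ({}) {}\n", i + 1, c.doc, c.text);
        if prompt.len() + entry.len() > max_chars {
            break; // enforce the context budget strictly
        }
        prompt.push_str(&entry);
    }
    prompt.push_str(&format!("\nQuestion: {question}\n"));
    prompt
}

fn main() {
    let chunks = [SourceChunk {
        doc: "notes.md",
        text: "Gloss stores sources locally.",
    }];
    println!("{}", build_prompt("Where are sources stored?", &chunks, 2000));
}
```

Because the prompt is assembled deterministically from visible chunks, the user can verify in the Sources panel exactly what evidence the model was given, which is the transparency claim the article makes.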
- Built in Rust for speed, safety, and low memory footprint, contrasting with cloud-dependent tools.
- Features a custom hybrid search backend combining HNSW vector search with TF-IDF/BM25 keyword matching.
- Provides fully transparent RAG with auditable retrieval paths and support for local models like Llama 3 and Mistral.
Why It Matters
It offers professionals a verifiable, private research tool that keeps sensitive data local and audit trails clear.