AnchorNote: Exploring Speech-Driven Spatial Externalization for Co-Located Collaboration in Augmented Reality
A new AR prototype captures spoken ideas as digital sticky notes, anchored in physical space via live transcription.
Researchers Diya Hundiwala and Andrés Monroy-Hernández have introduced AnchorNote, an experimental Augmented Reality (AR) system designed to transform co-located collaboration. The system allows multiple users in the same physical space to externalize their ideas simply by speaking. Using live speech transcription and Large Language Model (LLM) summarization, AnchorNote automatically creates digital sticky notes that are anchored to specific locations in the shared AR environment. This approach aims to replicate the low-friction, spatial benefits of physical sticky notes—rapid capture, rearrangement, and group attention coordination—within a digital, persistent workspace.
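The article does not include the authors' implementation, but the capture step it describes (spoken utterance → transcription → LLM summary → spatially anchored note) can be sketched roughly as follows. All names here are hypothetical, and a trivial word-truncation function stands in for the real transcription and LLM summarization services:

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class AnchoredNote:
    """A digital sticky note pinned to a point in the shared AR space."""
    summary: str                           # short summary text shown on the note
    transcript: str                        # full transcribed utterance, kept for reference
    position: Tuple[float, float, float]   # anchor coordinates in world space
    author: str                            # which participant spoke the idea

def capture_note(transcript: str,
                 position: Tuple[float, float, float],
                 author: str,
                 summarize: Callable[[str], str]) -> AnchoredNote:
    """Turn one already-transcribed utterance into an anchored note.

    `summarize` stands in for the LLM call; any text -> text function works.
    """
    return AnchoredNote(summary=summarize(transcript),
                        transcript=transcript,
                        position=position,
                        author=author)

def naive_summarize(text: str) -> str:
    """Placeholder summarizer: keep only the first eight words."""
    words = text.split()
    return " ".join(words[:8]) + ("…" if len(words) > 8 else "")

note = capture_note(
    "We could color-code the notes by theme so groups are easy to spot",
    position=(0.4, 1.2, -0.8),
    author="P7",
    summarize=naive_summarize,
)
print(note.summary)
```

In a real system the `summarize` callable would wrap an LLM API, and `position` would come from the AR headset's spatial tracking rather than being passed in directly.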
In a two-phase iterative study involving 20 participants, the team evaluated AnchorNote during brainstorming and thematic grouping tasks. The findings revealed a trade-off: while the system successfully reduced the manual effort of writing and capturing ideas, it also introduced new coordination costs. The act of speaking to create a permanent, spatial object changed how participants formulated their thoughts, timed their contributions, and organized the collective workspace. The research uses AnchorNote as an exploratory probe to understand how speech-driven, spatial externalization in AR fundamentally restructures collaborative cognition and team coordination.
The paper, published on arXiv, contributes valuable design implications for the next generation of co-located AR collaboration tools. It highlights that simply digitizing familiar workflows like sticky notes is not enough; designers must account for the subtle ways speech interfaces and spatial persistence alter group dynamics. The work sits at the intersection of Human-Computer Interaction (HCI) and AI, demonstrating how LLMs can act as real-time collaborators in shaping shared understanding.
- Uses live transcription & LLM summarization to convert speech into anchored digital notes.
- Study with 20 participants found it reduced writing effort but added new coordination overhead.
- Serves as a research probe into how AR and AI reshape collaborative idea formulation.
Why It Matters
The study reveals hidden trade-offs in designing AI-augmented collaborative workspaces, insight that is crucial for future enterprise AR tools.