Break the Window: Exploring Spatial Decomposition of Webpages in XR
A new research prototype shatters the single-window paradigm, turning webpages into spatial UI chunks for mixed reality.
A team from Microsoft Research and the University of Washington has published a CHI 2026 paper introducing 'Break-the-Window' (BTW), a prototype that fundamentally rethinks how we interact with the web in Extended Reality (XR). Instead of replicating the desktop metaphor of a single floating browser window, BTW automatically decomposes live, fully functional webpages into distinct, movable UI panels—like navigation bars, content sections, or media players—that can be freely arranged in 3D space. This spatial decomposition allows users to attach panels to physical surfaces, cluster related elements, or spread information across their field of view, moving beyond the constraints of a flat, bounded rectangle.
The technical prototype supports both direct hand-touch and controller ray-based interactions for manipulating these spatial chunks. A formative study with XR practitioners and a qualitative study with 15 participants revealed key benefits, including support for distributed attention across multiple information sources and the ability to create spatial layouts that convey meaning (like placing a map panel on a physical wall). However, the research also surfaced significant challenges for future development, including increased coordination effort for users, precision issues with mid-air interaction, and the current lack of shared conventions for spatial UI design. This work is a foundational step toward a post-window era for the web in spatial computing, inviting the broader community to explore how information architecture and interaction must evolve.
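The paper's implementation details are not reproduced here, but the core decomposition idea can be sketched conceptually: walk a page's structure, treat landmark regions (navigation, main content, asides, media) as independent chunks, and emit each as a movable panel with an initial 3D placement. The names below (`PageNode`, `Panel`, `decompose`) are illustrative assumptions, not the prototype's actual API.

```typescript
// Hypothetical sketch of spatial webpage decomposition.
// All names are illustrative; the BTW prototype's real API is not public.

type PageNode = {
  tag: string;            // element tag, e.g. "nav", "main"
  children: PageNode[];
};

type Panel = {
  label: string;                        // which page region this panel holds
  position: [number, number, number];   // initial placement in 3D space
};

// Tags this sketch treats as decomposable landmark chunks.
const LANDMARKS = new Set(["nav", "main", "section", "aside", "video"]);

// Walk the tree; each top-level landmark becomes its own panel,
// fanned in an arc in front of the user. A real system would instead
// anchor panels to detected physical surfaces.
function decompose(root: PageNode): Panel[] {
  const panels: Panel[] = [];
  const visit = (node: PageNode) => {
    if (LANDMARKS.has(node.tag)) {
      const angle = panels.length * 0.4 - 0.8;
      panels.push({
        label: node.tag,
        position: [Math.sin(angle) * 1.5, 1.2, -Math.cos(angle) * 1.5],
      });
      return; // keep a landmark's contents together in this sketch
    }
    node.children.forEach(visit);
  };
  visit(root);
  return panels;
}

// Example: a page with a nav bar, an article body, and a sidebar.
const page: PageNode = {
  tag: "body",
  children: [
    { tag: "nav", children: [] },
    { tag: "main", children: [{ tag: "video", children: [] }] },
    { tag: "aside", children: [] },
  ],
};

console.log(decompose(page).map((p) => p.label)); // ["nav", "main", "aside"]
```

Note that the video stays inside the `main` panel because the sketch stops at the outermost landmark; deciding how finely to split nested regions is exactly the kind of open design question the paper raises.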
- The 'Break-the-Window' prototype automatically splits live webpages into interactive, movable panels for placement in 3D space.
- A study with 15 participants found the approach enables distributed attention and spatial meaning-making but increases coordination effort.
- The research, accepted to CHI 2026, identifies key challenges like interaction precision and the need for new spatial UI conventions.
Why It Matters
This foundational research could define the next paradigm for how we access and organize information in AR/VR workspaces.