Probing More-Than-Human Representation in Crisis Resilience Planning: An HCI Researcher Perspective
New research uses AI agents and VR to explore representing animals and ecosystems in disaster resilience planning.
A research team of eight Human-Computer Interaction (HCI) specialists, led by Tram Thi Minh Tran, has published a paper probing how emerging technologies like AI and VR can represent non-human perspectives in crisis planning. Accepted to the prestigious CHI 2026 conference, the work addresses a critical gap: disaster resilience planning remains overwhelmingly human-centered, excluding the species and ecological systems that are also devastated by floods, fires, and other crises. The researchers conducted a workshop in which participants used two novel design probes, a voice-based conversational AI agent and an immersive embodied VR prototype, not to test usability but to deliberately surface and debate the profound ethical and design choices involved in giving 'voice' to nature.
The findings reveal that using AI to translate ecological data or animal behavior into a human-understandable 'voice' is far from a neutral technical task. Instead, it is a fraught design challenge that raises core tensions among legitimacy (who decides how nature is represented?), authority (does a synthesized voice carry weight in planning decisions?), and authenticity (can AI ever truly speak for a river or a forest?). This positions crisis planning as a critical real-world testbed for examining AI-mediated representation. The research provides empirical insight for developers and planners, suggesting that future tools must navigate these representational politics carefully, moving beyond technical feasibility to consider the ethical implications of speaking for the more-than-human world.
- Workshop used a voice-based AI conversational agent and an immersive VR prototype as design probes to explore non-human representation.
- Key finding: Using AI to give 'voice' to nature creates tensions among legitimacy, authority, and authenticity—it is not a simple act of translation.
- Paper accepted to CHI 2026, positioning crisis resilience planning as a critical site for testing AI- and immersion-mediated representation.
Why It Matters
The work forces planners and AI developers to confront the ethical weight of using technology to represent ecosystems in life-or-death decisions.