Adversarial Coevolutionary Illumination with Generational Adversarial MAP-Elites
New QD method discovers diverse strategies across games and robotics.
Researchers from multiple institutions have introduced GAME (Generational Adversarial MAP-Elites), a novel coevolutionary Quality-Diversity (QD) algorithm that evolves both competing sides of an adversarial problem. Unlike conventional QD methods that hold one side fixed, GAME alternates which side evolves each generation. It integrates a vision embedding model (VEM) to automatically generate behavior descriptors from video, eliminating the need for domain-specific descriptor engineering. The team validated GAME across three distinct adversarial domains: a multi-agent battle game, a soft-robot wrestling environment, and a deck-building game, showing it consistently outperforms one-sided QD baselines.
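The alternating loop described above can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the scalar genome, the `rollout` scoring function, the binned `descriptor` (standing in for the VEM), and all parameter values are assumptions made for the sketch.

```python
import random

BINS = 10  # behavior-descriptor bins (assumption; the paper derives descriptors from a VEM)

def rollout(me, opp):
    """Hypothetical head-to-head score of genome `me` against opponent `opp` (toy game)."""
    return 1.0 - abs(me - (1.0 - opp))

def descriptor(g):
    """Stand-in behavior descriptor: bin the genome directly into a MAP-Elites cell."""
    return min(int(g * BINS), BINS - 1)

def try_insert(archive, genome, fitness):
    """Standard MAP-Elites insertion: keep the fitter elite per descriptor cell."""
    cell = descriptor(genome)
    best = archive.get(cell)
    if best is None or fitness > best[1]:
        archive[cell] = (genome, fitness)

def game_loop(generations=40, batch=20, seed=1):
    rng = random.Random(seed)
    archives = ({}, {})  # one MAP-Elites archive per competing side
    # Bootstrap each side against a fixed placeholder opponent.
    for side in (0, 1):
        for _ in range(batch):
            g = rng.random()
            try_insert(archives[side], g, rollout(g, 0.5))
    for gen in range(generations):
        side = gen % 2                        # alternate which side evolves this generation
        mine, theirs = archives[side], archives[1 - side]
        opponents = [elite[0] for elite in theirs.values()]
        for _ in range(batch):
            parent = rng.choice(list(mine.values()))[0]
            child = min(1.0, max(0.0, parent + rng.gauss(0, 0.1)))
            # Fitness = mean score against a sample of the frozen side's elites.
            foes = rng.sample(opponents, min(5, len(opponents)))
            fit = sum(rollout(child, f) for f in foes) / len(foes)
            try_insert(mine, child, fit)
    return archives
```

Freezing one archive while the other evolves is what gives the generational arms-race structure: each side's fitness landscape is defined by the opposing side's current elites.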
GAME's experiments revealed several evolutionary phenomena, including arms-race dynamics in which both sides continuously adapt, enhanced novelty through generational extinction of old strategies, and the preservation of neutral mutations as critical stepping stones toward high performance. While GAME successfully illuminates all three adversarial problems, its capacity for truly open-ended discovery remains constrained by the search spaces used. Published in IEEE Transactions on Evolutionary Computation, the work demonstrates broad applicability and highlights opportunities for future research into open-ended adversarial coevolution. Code and videos are available online.
- GAME alternates evolution between competing sides each generation, unlike fixed-side QD methods
- Vision embedding model (VEM) eliminates need for domain-specific behavior descriptors by operating on video
- Validated across 3 domains: a multi-agent battle game, soft-robot wrestling, and a deck-building game
Why It Matters
GAME enables more open-ended AI evolution in adversarial settings, potentially advancing game AI, robotics, and competitive simulations.