Research & Papers

Functional Decomposition and Shapley Interactions for Interpreting Survival Models

A new method decomposes time-dependent feature interactions, addressing a core limitation of standard additive explanations.

Deep Dive

A research team led by Sophie Hanna Langbein introduces Survival Functional Decomposition (SurvFD) and SurvSHAP-IQ. These tools provide a principled framework for interpreting machine learning models that predict time-to-event outcomes, such as patient survival. By decomposing higher-order, time-dependent feature interactions, they address the inherent non-additivity of hazard functions. This lets data scientists and medical researchers pinpoint when and why standard additive explanation methods fail for critical predictive tasks.
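To see why additive explanations break down here, consider a minimal sketch (not the authors' SurvFD implementation; the model, coefficients, and function names below are purely illustrative). A toy proportional-hazards model with a multiplicative interaction term yields a survival curve whose pairwise interaction effect is nonzero and changes over time, so no single additive attribution can describe it at every time point:

```python
import math

def survival(t, x1, x2, b1=0.8, b2=0.5, b12=1.2):
    # Toy proportional-hazards model with an explicit interaction term:
    # S(t | x) = exp(-t * exp(b1*x1 + b2*x2 + b12*x1*x2)).
    # Coefficients are arbitrary, chosen only for illustration.
    hazard = math.exp(b1 * x1 + b2 * x2 + b12 * x1 * x2)
    return math.exp(-t * hazard)

def pairwise_interaction(t, x1, x2):
    # Second-order Moebius term (for two features, this coincides with
    # the Shapley interaction index): the joint effect of x1 and x2 on
    # the survival curve minus the sum of their individual effects,
    # relative to the baseline (0, 0).
    return (survival(t, x1, x2) - survival(t, x1, 0.0)
            - survival(t, 0.0, x2) + survival(t, 0.0, 0.0))

for t in (0.1, 0.5, 1.0, 2.0):
    print(f"t={t:.1f}  interaction={pairwise_interaction(t, 1.0, 1.0):+.3f}")
```

The printed interaction term is nonzero and varies with t, which is exactly the time-dependent, higher-order structure that a single additive SHAP-style summary cannot capture and that the decomposition described above makes explicit.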

Why It Matters

Enables trustworthy interpretation of AI used in high-stakes fields like healthcare and risk analysis.