Image & Video

CT-Guided Spatially-varying Regularization for Voxel-Wise Deformable Whole-Body PET Registration

Anatomy-adaptive regularization improves tumor tracking in 296-patient study

Deep Dive

A team of researchers from multiple institutions, led by Xiangcen Wu, has introduced a novel approach to whole-body Positron Emission Tomography (PET) registration that leverages CT data to apply spatially-varying regularization. Traditional deep-learning-based deformable registration uses a single global regularization weight for the dense displacement field (DDF), an approach that struggles with anatomical heterogeneity: rigid structures like bones require strong regularization to prevent unrealistic deformations, while soft tissues need more flexibility. The new method constructs a voxel-wise regularization map from the paired CT volume acquired during PET/CT scans, adapting constraints based on tissue type.
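To make the idea concrete, here is a minimal NumPy sketch of voxel-wise regularization weighting. The HU thresholds, weight values, and function names are illustrative assumptions, not the paper's actual design: CT intensities are mapped to a per-voxel weight (strong for bone, weak for soft tissue), which then scales a diffusion-style smoothness penalty on the displacement field.

```python
import numpy as np

def regularization_map(ct_hu, w_soft=0.1, w_bone=1.0,
                       hu_soft=100.0, hu_bone=300.0):
    """Map CT Hounsfield units to a voxel-wise regularization weight.

    Soft tissue (low HU) gets a weak weight, bone (high HU) a strong one,
    with a linear ramp in between. Thresholds and weights are illustrative,
    not taken from the paper.
    """
    t = np.clip((ct_hu - hu_soft) / (hu_bone - hu_soft), 0.0, 1.0)
    return w_soft + t * (w_bone - w_soft)

def weighted_smoothness(ddf, weight):
    """Spatially weighted diffusion regularizer.

    ddf:    dense displacement field, shape (3, D, H, W)
    weight: voxel-wise regularization map, shape (D, H, W)
    Returns the mean of weight * |grad(ddf)|^2 over components and axes.
    """
    penalty = 0.0
    for c in range(ddf.shape[0]):        # each displacement component
        grads = np.gradient(ddf[c])      # spatial gradients along D, H, W
        penalty += sum((weight * g ** 2).mean() for g in grads)
    return penalty

# Toy example: a small "bone" block embedded in soft tissue.
ct = np.full((8, 8, 8), 40.0)            # soft tissue ~40 HU
ct[2:6, 2:6, 2:6] = 1000.0               # bone ~1000 HU
w = regularization_map(ct)               # strong weights inside the block
ddf = np.random.default_rng(0).normal(size=(3, 8, 8, 8))
loss = weighted_smoothness(ddf, w)       # penalizes bone deformation more
```

In a training loop, this weighted penalty would replace the single global smoothness term, so the network is free to deform soft tissue while bone regions are kept near-rigid.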

Evaluated on a clinical dataset of 296 patients scanned with 18F-PSMA and 18F-FDG tracers, the method achieved statistically significant improvements over weakly-supervised baselines in both whole-body registration and organ-wise alignment. This advancement is critical for multi-parametric tumor characterization and tracking metastatic disease progression, offering more precise alignment across different tracers and time points. The work is available on arXiv under the Image and Video Processing subject area.

Key Points
  • CT-guided voxel-wise regularization adapts constraints: stronger for bones, weaker for soft tissues
  • Evaluated on 296 patients with 18F-PSMA and 18F-FDG tracers
  • Significant improvements in whole-body and organ-wise alignment over baselines

Why It Matters

Enables more accurate PET registration for better tumor tracking and disease progression assessment.