Image & Video

Exploring Remote Photoplethysmography for Neonatal Pain Detection from Facial Videos

Non-contact pulse signals from video could replace subjective pain scales for newborns...

Deep Dive

A team of researchers from IIIT Allahabad and other institutions has introduced a novel non-contact method for neonatal pain detection using remote photoplethysmography (rPPG) from facial videos. Published on arXiv, the study addresses a critical gap in neonatal care: unaddressed pain in newborns can lead to delayed development and slower weight gain. Yet traditional pain assessment relies on subjective behavioral scales or contact-based sensors, which are unsuitable for long-term monitoring and pose infection risks, especially in NICU settings during pandemics such as COVID-19.

The proposed method extracts pulse signals from facial video frames by selecting regions of interest (ROIs) least affected by skin deformation, using a quality parameter to filter out noisy temporal signals. The signal-to-noise ratio then serves as a fitness parameter to identify the most reliable rPPG clip. Experimental results on the iCOPEvid dataset demonstrate that rPPG signals carry useful discriminative information for neonatal pain detection, with the blue color channel significantly outperforming the red and green channels. Furthermore, fusing rPPG features with audio features yields better performance than either modality alone, suggesting a promising multimodal approach to automated, contactless pain monitoring for vulnerable infants.
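To make the pipeline concrete, here is a minimal sketch of the two steps described above: spatially averaging one color channel over an ROI to obtain a pulse trace, and ranking candidate clips by spectral signal-to-noise ratio. The function names, the RGB channel ordering, and the assumed neonatal heart-rate band (1.5-4 Hz, i.e. 90-240 bpm) are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def extract_rppg(frames, roi, channel=2):
    """Spatially average one color channel over an ROI per frame.
    frames: (T, H, W, 3) uint8 array; roi: (y0, y1, x0, x1).
    channel=2 is blue under RGB ordering (the channel the study found best).
    """
    y0, y1, x0, x1 = roi
    patch = frames[:, y0:y1, x0:x1, channel].astype(np.float64)
    signal = patch.mean(axis=(1, 2))   # one sample per frame
    return signal - signal.mean()      # remove the DC offset

def snr_fitness(signal, fps, band=(1.5, 4.0)):
    """Ratio of spectral power inside an assumed heart-rate band
    (1.5-4 Hz here) to power outside it; higher means a cleaner pulse."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / (spectrum[~in_band].sum() + 1e-12)

def best_clip(clips, roi, fps):
    """Pick the clip whose rPPG trace has the highest SNR fitness."""
    signals = [extract_rppg(clip, roi) for clip in clips]
    scores = [snr_fitness(sig, fps) for sig in signals]
    best = int(np.argmax(scores))
    return best, signals[best]
```

In practice the ROI would come from a face detector and be screened by the paper's quality parameter; this sketch only shows why a pulsatile blue-channel trace scores higher than broadband noise.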

Key Points
  • Uses remote photoplethysmography (rPPG) from facial videos to detect neonatal pain non-invasively
  • Blue color channel rPPG signals outperform the red and green channels for pain detection
  • Combining rPPG with audio features improves multimodal pain detection over individual modalities
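The third point, combining modalities, can be sketched as straightforward feature-level fusion: normalize each modality's feature matrix, then concatenate before classification. The paper's actual fusion scheme and feature sets are not specified here; this z-score-and-concatenate approach is an illustrative assumption.

```python
import numpy as np

def fuse_features(rppg_feats, audio_feats):
    """Feature-level fusion of two modalities.
    rppg_feats: (n_samples, n_rppg) array; audio_feats: (n_samples, n_audio).
    Each modality is z-scored per feature so neither dominates by scale,
    then the two are concatenated into one vector per sample.
    """
    def zscore(x):
        return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-12)
    return np.hstack([zscore(rppg_feats), zscore(audio_feats)])
```

The fused vectors would then feed any standard classifier, which is the usual way a multimodal combination beats either modality alone when their errors are uncorrelated.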

Why It Matters

Enables continuous, contactless pain monitoring for vulnerable neonates, reducing infection risks and improving developmental outcomes.