BreathAI: Transfer Learning-Based Thermal Imaging for Automated Breathing Pattern Recognition
Researchers' new AI uses thermal cameras to monitor breathing, outperforming sound-based methods.
A research team led by Hamza Kheddar, Yassine Himeur, and Abbes Amira has published a paper on arXiv introducing BreathAI, a novel system for automated breathing pattern recognition. Unlike traditional sound-based methods, their approach leverages thermal imaging and a sophisticated deep learning architecture called the Adaptive Transfer Learning and Thresholding-based Deep Learning Model (ATL-TDLM). The core innovation lies in its use of hierarchical deep feature extraction combined with adaptive multi-thresholding (AMT) to precisely segment thermal data, enabling highly accurate tracking of subtle temperature changes associated with breathing.
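To make the AMT idea concrete, here is a minimal sketch of multi-threshold segmentation applied to thermal frames. Everything in it is an illustrative assumption, not the paper's actual method: it uses simple percentile-based thresholds rather than the authors' adaptive scheme, and a synthetic "nostril hotspot" whose temperature oscillates stands in for real thermal video.

```python
import numpy as np

def adaptive_multi_threshold(frame, n_classes=3):
    # Split a thermal frame into n_classes temperature bands using
    # percentile-derived thresholds (a simple stand-in for AMT).
    qs = np.linspace(0, 100, n_classes + 1)[1:-1]
    thresholds = np.percentile(frame, qs)
    return np.digitize(frame, thresholds)  # label map in {0..n_classes-1}

def breathing_signal(frames, n_classes=3):
    # Mean temperature of the warmest band per frame; exhalation warms
    # the nostril region, so this series oscillates with breathing.
    signal = []
    for frame in frames:
        labels = adaptive_multi_threshold(frame, n_classes)
        signal.append(frame[labels == n_classes - 1].mean())
    return np.array(signal)

# Synthetic demo: a warm patch whose temperature oscillates
# with a period of 20 frames, mimicking a breathing cycle.
rng = np.random.default_rng(0)
frames = []
for t in range(60):
    frame = 30.0 + rng.normal(0, 0.1, (32, 32))          # background skin temp
    frame[12:20, 12:20] += 3.0 + 1.5 * np.sin(2 * np.pi * t / 20)
    frames.append(frame)

sig = breathing_signal(frames)  # oscillating 1-D breathing signal
```

Once the frame sequence is reduced to a 1-D temperature series like `sig`, peaks and troughs can be read as exhalation and inhalation events for downstream classification.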
The model further improves its performance through knowledge distillation-based fine-tuning (KD-FT), which optimizes the transfer of learned features, and contrastive representation learning (CRL), which more cleanly separates the inhalation and exhalation phases. This combination of techniques has yielded a state-of-the-art accuracy of 98.8%, significantly outperforming existing models. The system is also designed to be computationally efficient, making real-time monitoring feasible. Its primary application is the non-contact, silent detection of respiratory disorders such as sleep apnea and asthma, offering a more comfortable and continuous monitoring option than intrusive sensors or microphones.
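The KD-FT and CRL objectives roughly correspond to two standard losses, sketched below with generic stand-ins: a temperature-scaled KL distillation loss and a supervised contrastive loss over phase labels. The temperatures, shapes, and loss forms are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = z / T - (z / T).max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T**2
    # as in standard knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T**2)

def contrastive_loss(emb, labels, tau=0.5):
    # Supervised contrastive loss: embeddings sharing a phase label
    # (inhalation vs. exhalation) are pulled together, others pushed apart.
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = emb @ emb.T / tau
    n, loss = len(labels), 0.0
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue
        denom = np.exp(sim[i][np.arange(n) != i]).sum()
        loss += -np.mean([sim[i, j] - np.log(denom) for j in pos])
    return loss / n
```

In this framing, the distillation term keeps the fine-tuned student close to a pretrained teacher's soft predictions, while the contrastive term shapes the embedding space so the two breathing phases form distinct clusters.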
- Uses thermal imaging instead of sound, enabling silent, non-contact breathing monitoring.
- Achieves 98.8% accuracy by combining adaptive multi-thresholding and contrastive learning.
- Designed for efficient real-time use in detecting sleep apnea and asthma.
Why It Matters
Enables continuous, discreet health monitoring without wearable sensors, potentially improving early diagnosis of respiratory conditions.