Subjective and Objective Quality-of-Experience Evaluation Study for Live Video Streaming
New dataset of 1,155 distorted live videos aims to solve a major blind spot for platforms like Twitch and TikTok.
A team of researchers has published a landmark study addressing a critical gap in live-streaming technology: how to accurately measure viewer satisfaction, or Quality of Experience (QoE). While mature metrics exist for video on demand (VoD), live streaming presents unique challenges such as variable frame rates and real-time compression. To close this gap, the team created 'TaoLive QoE,' the first large-scale QoE dataset built specifically for live video. It includes 42 source clips from real broadcasts and 1,155 distorted versions that simulate common streaming problems, providing a crucial benchmark for training and evaluating AI models.
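Distorted variants like these are typically generated by sweeping a grid of compression and frame-rate settings over each source clip. The sketch below builds ffmpeg command lines for such a grid; the specific CRF levels, frame rates, and file-naming scheme are illustrative assumptions, not the study's actual parameters.

```python
from itertools import product

# Hypothetical distortion grid (assumption, not the paper's settings):
# each source clip is paired with several compression levels and
# reduced frame rates to mimic real-world live-streaming degradation.
CRF_LEVELS = [23, 28, 33, 38]   # H.264 constant rate factor (higher = worse quality)
FRAME_RATES = [30, 24, 15, 10]  # target output frame rates

def distortion_commands(src, out_dir="distorted"):
    """Build one ffmpeg command per (compression, frame-rate) pair."""
    cmds = []
    for crf, fps in product(CRF_LEVELS, FRAME_RATES):
        out = f"{out_dir}/{src.rsplit('.', 1)[0]}_crf{crf}_fps{fps}.mp4"
        cmds.append(["ffmpeg", "-i", src, "-c:v", "libx264",
                     "-crf", str(crf), "-r", str(fps), out])
    return cmds

cmds = distortion_commands("clip01.mp4")
print(len(cmds))  # 16 distorted variants per source clip in this toy grid
```

With 42 source clips, even a modest grid like this quickly yields hundreds of distorted videos, which is how datasets of this scale are assembled.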
Using this dataset, the researchers conducted subjective human studies to establish ground-truth QoE scores and then benchmarked existing evaluation models against them. They found that current models perform poorly on live content. In response, they developed 'Tao-QoE,' a new end-to-end AI model. Unlike traditional methods that rely on network statistics (QoS), Tao-QoE analyzes multi-scale semantic features and optical flow-based motion to directly predict a retrospective quality score, offering a more human-aligned assessment of what makes a live stream watchable.
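The general recipe the article describes, extracting per-frame semantic and motion features, pooling them over time, and regressing one retrospective score, can be sketched as below. The shapes, pooling, and regression head are toy assumptions for illustration, not Tao-QoE's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (assumptions): per-frame semantic embeddings, e.g. from a
# vision backbone, and optical-flow-derived motion features per frame.
T, D = 120, 64                       # frames, semantic embedding dimension
semantic = rng.normal(size=(T, D))   # multi-scale semantic features, flattened
motion = rng.normal(size=(T, 1))     # optical-flow-based motion feature

def predict_qoe(semantic, motion, w=None):
    """Fuse per-frame features, pool over time, regress one QoE score."""
    fused = np.concatenate([semantic, motion], axis=1)  # (T, D+1)
    pooled = fused.mean(axis=0)                         # simple temporal pooling
    if w is None:                                       # untrained toy head
        w = np.ones(pooled.shape[0]) / pooled.shape[0]
    return float(1 / (1 + np.exp(-(w @ pooled))))       # squash into (0, 1)

score = predict_qoe(semantic, motion)
print(score)  # one retrospective QoE score for the whole video
```

The key design point is end-to-end prediction from the pixels' own content, rather than inferring satisfaction indirectly from network-side QoS statistics.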
- Introduces 'TaoLive QoE,' the first dedicated dataset with 1,155 distorted live-streaming videos for AI training and benchmarking.
- Highlights the failure of existing Video-on-Demand (VoD) QoE models to accurately assess live content and its characteristic distortions, such as frame-rate variation and real-time compression artifacts.
- Proposes a new AI model, 'Tao-QoE,' that uses semantic and motion features instead of network stats, achieving better correlation with human scores.
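"Correlation with human scores" in QoE benchmarking usually means Spearman (rank order, SROCC) and Pearson (linear, PLCC) correlation between predicted scores and mean opinion scores. A minimal sketch with hypothetical numbers (the values below are made up for illustration):

```python
import numpy as np

def pearson(x, y):
    """Pearson linear correlation coefficient (PLCC)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def spearman(x, y):
    """Spearman rank correlation (SROCC): Pearson on the ranks.
    No tie correction in this sketch."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(x), rank(y))

# Hypothetical ground-truth mean opinion scores vs. model predictions
mos  = [1.2, 2.8, 3.1, 4.0, 4.7]
pred = [1.0, 2.5, 3.4, 3.9, 4.8]
print(round(spearman(mos, pred), 3), round(pearson(mos, pred), 3))
```

A model "achieving better correlation" means both numbers sit closer to 1.0 on held-out videos than those of prior VoD-oriented models.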
Why It Matters
Enables platforms like Twitch and TikTok to algorithmically optimize video quality in real time, directly improving viewer retention and engagement.