Modeling Stage-wise Evolution of User Interests for News Recommendation
A novel framework uses stage-wise temporal graphs and dual LSTM/attention branches to capture both long-term habits and short-term trends.
A research team led by Zhiyong Cheng has introduced a novel AI framework designed to solve a core challenge in personalized news recommendation: the dynamic, time-sensitive nature of user interests. Traditional models that rely on a single, static interaction graph struggle to capture the rapid shifts driven by emerging events and trending topics. This new approach proposes a unified architecture that learns user preferences from dual temporal perspectives. A global preference modeling component captures long-term collaborative signals and stable reading habits from the overall interaction graph. Simultaneously, a local preference modeling component addresses short-term dynamics by partitioning a user's historical interactions into distinct, stage-wise temporal subgraphs.
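The stage-wise partitioning idea can be illustrated concretely. The sketch below (not the authors' implementation; the function name, equal-width time windows, and triple format are illustrative assumptions) splits a user's timestamped interactions into a fixed number of temporal stages, each yielding the edge list of one bipartite user-news subgraph:

```python
from collections import defaultdict

def partition_into_stages(interactions, num_stages):
    """Split (user, news_id, timestamp) interactions into equal-width
    time windows; each window's edges form one stage-wise subgraph."""
    ts = [t for _, _, t in interactions]
    lo, hi = min(ts), max(ts)
    width = (hi - lo) / num_stages or 1  # guard against a zero-length span
    stages = defaultdict(list)
    for u, n, t in interactions:
        # Clamp the last timestamp into the final stage.
        idx = min(int((t - lo) / width), num_stages - 1)
        stages[idx].append((u, n))  # edge in that stage's bipartite subgraph
    return [stages[i] for i in range(num_stages)]

# Toy history: three clicks spread over a session.
history = [("u1", "n1", 0), ("u1", "n2", 5), ("u1", "n3", 9)]
print(partition_into_stages(history, 3))
```

Real systems would more likely partition by calendar buckets or per-user quantiles, but the structure is the same: one edge list per stage, fed to a graph encoder to produce per-stage user embeddings.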
Within this local module, the framework employs a two-branch neural network to analyze evolving interests. One branch uses a Long Short-Term Memory (LSTM) network to model the progressive, sequential evolution of a user's most recent engagements. The other branch leverages a self-attention mechanism to identify and weigh long-range temporal dependencies across different behavioral stages. This combination allows the model to understand both immediate context shifts and broader interest patterns over time. The paper, accepted at the ACM Web Conference 2026 (WWW '26), reports that extensive experiments on two large-scale real-world datasets demonstrate the model's superiority. It consistently outperforms strong existing baselines, proving particularly effective at delivering recommendations that are both fresher and more relevant across diverse user behaviors and temporal settings.
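A minimal NumPy sketch of the two-branch idea, under stated assumptions: the dimensions, initialization, and mean-pooling of attention outputs are illustrative, not the paper's architecture. One branch runs an LSTM cell over per-stage embeddings to capture sequential progression; the other applies scaled dot-product self-attention across all stages to capture long-range dependencies; the two outputs are concatenated into a local preference vector:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_branch(stages, W, U, b, d):
    """Single-layer LSTM over stage embeddings; returns the final hidden state."""
    h, c = np.zeros(d), np.zeros(d)
    for x in stages:
        z = W @ x + U @ h + b          # all four gates at once, shape (4d,)
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)     # update cell state
        h = o * np.tanh(c)             # update hidden state
    return h

def attention_branch(stages):
    """Scaled dot-product self-attention over stages, mean-pooled."""
    X = np.stack(stages)                                     # (T, d)
    scores = X @ X.T / np.sqrt(X.shape[1])                   # (T, T)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                        # row-wise softmax
    return (w @ X).mean(axis=0)                              # (d,)

rng = np.random.default_rng(0)
d, T = 8, 5                            # embedding size, number of stages
stages = [rng.normal(size=d) for _ in range(T)]
W = rng.normal(size=(4 * d, d)) * 0.1  # toy, untrained weights
U = rng.normal(size=(4 * d, d)) * 0.1
b = np.zeros(4 * d)

local_pref = np.concatenate([lstm_branch(stages, W, U, b, d),
                             attention_branch(stages)])      # shape (2d,)
```

In a trained model these branches would be learned end-to-end and fused with the global preference embedding before scoring candidate news items.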
- Proposes a unified framework with global & local modeling to capture both long-term preferences and short-term, context-dependent interest changes.
- Uses stage-wise temporal subgraphs and a dual-branch network (LSTM for progression, self-attention for dependencies) to model interest evolution.
- Outperforms existing baselines in experiments on large-scale datasets, delivering significantly fresher and more relevant news recommendations.
Why It Matters
This research could lead to news feeds and content platforms that adapt in near real time to shifts in user interest, improving both the freshness of recommendations and user engagement.