MoodMap: Heartbeat-Adaptive Music Recommendation System
MoodMap is a UX research-driven music recommendation system that selects music in real time based on the listener's physiological state. Instead of relying on listening history, playlists, or manual mood selection, the system uses heart rate (HR) and heart rate variability (HRV) as the primary interaction signals.
The project transforms cardiovascular data into affective music selection by mapping HR and HRV onto a valence–energy model. As a result, the listener's body becomes the interface: the system continuously interprets physiological state and selects music from an affectively structured corpus.
Problem
Most music recommendation systems are driven by historical behavior: what the listener previously played, what similar users liked, or what is currently popular. However, these systems do not respond to what the listener's body is doing in the present moment.
This creates a UX problem in contexts such as post-exercise recovery, stress, fatigue, or sleep preparation. The user may need music that matches or regulates their physiological state, yet the interface still requires manual searching, skipping, or mood selection.
UX Research Foundation
This project began as a UX research investigation into physiologically responsive music interaction. The research identified four core pain points in existing recommendation systems, wearable health applications, and affective music datasets.
- Temporal disconnection: recommendations are not connected to real-time physiological state.
- Cognitive overhead: mood-based filters require the user to self-assess and manually choose.
- Mapping gap: there is no clear bridge between affective music tags and cardiovascular signals.
- Data and catalogue limitations: existing affective music datasets and music recommendation catalogues are not structured for real-time playback.
These findings directly shaped the product direction. Instead of asking users to choose a mood, the system lets the body report its current state through HR and HRV. MoodMap is therefore not only a music player but also the outcome of a UX research process focused on reducing cognitive effort and making physiological state actionable.
Research Approach
1. User Pain Point Analysis
Defined the interaction problem around real-time physiological state, the cognitive effort of manual mood selection, and the absence of body-aware recommendation logic.
2. Dataset Extension
Extended the MTG-Jamendo corpus by adding valence and energy coordinates to mood categories, making the catalogue queryable by affective state.
3. Physiological Mapping
Designed a mapping model that translates HR and HRV deviation into target affective coordinates for real-time track selection.
Core Framework
MoodMap is organized around three connected layers:
- Physiological input layer: heart rate and HRV are captured as the primary interaction signals.
- Affective corpus layer: 59 mood categories are mapped to valence–energy coordinates.
- Recommendation layer: the system selects tracks by matching cardiovascular state to affective position.
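A minimal sketch of these three layers in Python; all names, fields, and value ranges here are illustrative assumptions rather than the prototype's actual schema:

```python
from dataclasses import dataclass

@dataclass
class VitalSample:
    """Physiological input layer: one HR/HRV reading."""
    hr_bpm: float          # heart rate, beats per minute
    hrv_rmssd_ms: float    # heart rate variability (RMSSD), milliseconds

@dataclass
class Track:
    """Affective corpus layer: a track placed in valence-energy space."""
    track_id: str
    mood: str              # one of the 59 mood categories
    valence: float         # pleasantness axis, assumed range [-1, 1]
    energy: float          # arousal axis, assumed range [-1, 1]

def recommend(sample: VitalSample, corpus: list[Track]) -> Track:
    """Recommendation layer: match cardiovascular state to affective
    position (elaborated in the System Design sketches below)."""
    ...
```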
System Design
The system uses the MTG-Jamendo corpus as its music foundation. In its original form, the dataset contains categorical mood and genre tags but no quantitative valence or energy values. Therefore, the corpus could not directly support physiological music selection.
To solve this, I designed a valence–energy annotation layer that assigns each mood category a coordinate in Russell's circumplex model of affect. This transforms the dataset from a static collection of mood tags into a dynamically queryable affective catalogue.
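Conceptually, the annotation layer is a lookup table from categorical mood tags to circumplex coordinates. A hypothetical excerpt, with placeholder coordinates rather than the project's actual annotation values:

```python
# Each MTG-Jamendo mood tag receives a coordinate in Russell's
# circumplex (valence on x, energy on y). Numbers are placeholders.
MOOD_COORDINATES = {
    "happy":     ( 0.80,  0.60),
    "energetic": ( 0.50,  0.90),
    "calm":      ( 0.60, -0.70),
    "sad":       (-0.70, -0.50),
    "dark":      (-0.60,  0.20),
    # ... remaining entries, one per mood category (59 in total)
}

def coordinates_for(mood_tag: str) -> tuple[float, float]:
    """Turn a categorical mood tag into a queryable affective position."""
    return MOOD_COORDINATES[mood_tag]
```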
The recommendation pipeline then maps heart rate deviation to the energy axis and HRV deviation to the valence axis. The system queries the corpus using weighted distance and selects the track whose affective position best matches the listener's current or target physiological state.
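A sketch of that pipeline, reusing the Track type from the layer sketch above; the gain constants, deviation directions, and axis weights are assumptions for illustration, since the mapping parameters are documented in the full research rather than here:

```python
import math

def clamp(x, lo=-1.0, hi=1.0):
    return max(lo, min(hi, x))

def target_coordinates(hr, hrv, hr_baseline, hrv_baseline,
                       hr_gain=0.02, hrv_gain=0.01):
    # HR above baseline pushes the energy axis up; HRV above baseline
    # pushes valence up (both directions and gains are assumptions).
    energy = clamp(hr_gain * (hr - hr_baseline))
    valence = clamp(hrv_gain * (hrv - hrv_baseline))
    return valence, energy

def select_track(corpus, valence, energy, w_valence=1.0, w_energy=1.0):
    # `corpus` is a list of Track objects (see the layer sketch above).
    # Pick the track at the smallest weighted Euclidean distance in
    # the valence-energy plane.
    def distance(track):
        return math.sqrt(w_valence * (track.valence - valence) ** 2
                         + w_energy * (track.energy - energy) ** 2)
    return min(corpus, key=distance)
```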
Interaction Design Translation
The central interaction design principle is that the body becomes the interface. Rather than asking users to choose a mood, the system reads cardiovascular state and performs the selection automatically. This reduces interaction load at moments when the user may already be tired, stressed, overstimulated, or trying to self-regulate.
- No manual mood selection: HR and HRV replace explicit mood filtering.
- Session-based calibration: the system responds to deviations from the user's own baseline.
- Mirror mode: music matches the listener's current physiological state.
- Guide mode: music gently moves the listener toward a target state such as relaxation.
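A sketch of how session calibration and the two modes could compose; the relaxation target, averaging approach, and step size are illustrative assumptions:

```python
RELAXATION_TARGET = (0.5, -0.6)  # assumed (valence, energy) for "relaxed"

def session_baseline(samples):
    # Session-based calibration: average the opening readings so all
    # deviations are relative to this listener, this session.
    # `samples` is a list of VitalSample objects (see the layer sketch).
    hr = sum(s.hr_bpm for s in samples) / len(samples)
    hrv = sum(s.hrv_rmssd_ms for s in samples) / len(samples)
    return hr, hrv

def choose_target(current_ve, mode, goal=RELAXATION_TARGET, step=0.25):
    # Mirror mode: match the listener's current affective position.
    # Guide mode: move a fraction of the way toward the goal so the
    # shift stays gentle rather than abrupt.
    if mode == "mirror":
        return current_ve
    v, e = current_ve
    gv, ge = goal
    return (v + step * (gv - v), e + step * (ge - e))
```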
UX Design Principles
The research produced four design principles for physiologically responsive interfaces:
- The body is the interface: cardiovascular state performs the navigation.
- Interpretability is required: every selection can be traced to HR, HRV, and a lookup-table entry.
- Calibration personalizes without profiling: the system adapts to session-local baseline values.
- Determinism supports trust: fixed mappings make the system reproducible and auditable.
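As a sketch of the interpretability and determinism principles, each selection can emit a trace of exactly the inputs that produced it; the field names here are hypothetical:

```python
def explain_selection(sample, baseline, target, track):
    # Deterministic trace: the same inputs always yield the same
    # selection, and every field points at a concrete input.
    return {
        "hr_bpm": sample.hr_bpm,
        "hrv_rmssd_ms": sample.hrv_rmssd_ms,
        "session_baseline": baseline,        # (hr, hrv)
        "target_valence_energy": target,     # (valence, energy)
        "selected_track": track.track_id,
        "matched_mood_category": track.mood, # lookup-table entry
    }
```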
Impact
MoodMap reframes music recommendation as a real-time physiological interaction problem. Instead of asking what the user wants to hear, the system asks what the user's body is reporting now and how music can respond to that state.
The project contributes a transparent and reproducible UX framework for health-adjacent adaptive systems. It combines dataset design, affective annotation, physiological mapping, and interaction design into a single working prototype.
Prototype & Documentation
Interactive Prototype
Explore the heartbeat-adaptive music recommendation prototype, where HR and HRV guide real-time music selection through the MoodMap interface.
Full UX Research Documentation
Read the full research: user pain points, MTG-Jamendo corpus extension, valence–energy annotation, and HR/HRV mapping logic.