Music Cognition · Affective Computing · Multimodal Research

Real-Time Music & Human Response System

Music is one of the most powerful drivers of human emotion — yet we are only beginning to understand how and why it affects us the way it does. Measuring that response in real time, with scientific precision, opens a new frontier for how music is experienced, recommended, generated, and applied in health.

Domain

Multimodal Human-Computer Interaction (HCI)

Signals

Audio, EEG, HR, HRV, Face

Methods

UX research, signal synchronization, prototyping

Output

Time-aligned biofeedback & sound structure dataset

Problem Definition & Research Approach

🎯 User Need

Whether music is streamed, generated, or applied therapeutically, the systems delivering it never truly know how listeners feel. Affective state remains invisible, and personalization stays surface-level.

🔍 Root Cause

Research in music and in the health sciences has grown rapidly, but the two fields have remained largely isolated from each other. Structural and human-response data are rarely studied together in real time, creating a data scarcity that limits what AI systems in this space can learn and do. See Full UX Research for Root Cause Analysis.

1. Structuring the Music

Musical events are broken down into labeled, time-stamped units — making it possible to link specific moments in a piece to how a listener responds.
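As a rough illustration, each event can be represented as a small time-stamped record. The sketch below is only a minimal example; field names such as `event_type` and `label` are assumptions for illustration, not the project's actual schema.

```python
# Minimal sketch of a labeled, time-stamped musical event record.
# Field names are illustrative assumptions, not the project's schema.
from dataclasses import dataclass

@dataclass
class MusicalEvent:
    onset_s: float      # event onset, seconds from the start of the piece
    duration_s: float   # how long the event lasts
    event_type: str     # e.g. "chord", "section_boundary", "melodic_peak"
    label: str          # e.g. "V7", "I", "chorus begins"

# A piece becomes an ordered list of such events, so any listener response
# can later be looked up against the event active at that timestamp.
events = [
    MusicalEvent(onset_s=12.4, duration_s=1.8, event_type="chord", label="V7"),
    MusicalEvent(onset_s=14.2, duration_s=2.1, event_type="chord", label="I"),
]
```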

2. Capturing Human Response

As music plays, the system records brain activity, heart rate, facial expressions, and behavioral reactions — all at the same time.
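A minimal sketch of what simultaneous capture against a shared clock can look like is shown below. The placeholder reader functions stand in for real EEG, heart-rate, and facial-analysis sources, and the stream names and sampling rates are assumptions; an actual setup would typically pull from hardware SDKs or a streaming layer such as Lab Streaming Layer (LSL).

```python
# Illustrative sketch: several signal streams sampled in parallel, each
# time-stamped against one shared reference clock.
import time
import threading
import queue

t0 = time.monotonic()                 # one reference clock for every stream
samples = queue.Queue()               # (stream_name, timestamp_s, value) tuples
stop = threading.Event()

def capture(stream_name, read_sample, rate_hz):
    period = 1.0 / rate_hz
    while not stop.is_set():
        samples.put((stream_name, time.monotonic() - t0, read_sample()))
        time.sleep(period)

streams = [
    ("eeg", lambda: 0.0, 256),        # placeholder readers returning dummy values
    ("heart_rate", lambda: 60.0, 1),
    ("face_valence", lambda: 0.5, 30),
]
threads = [threading.Thread(target=capture, args=s, daemon=True) for s in streams]
for th in threads:
    th.start()

time.sleep(1.0)                       # capture for one second in this demo
stop.set()
print(f"collected {samples.qsize()} time-stamped samples")
```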

3. Live Monitoring Dashboard

All data streams are displayed in a single real-time interface, giving researchers a clear view of how listeners respond moment by moment.
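A hedged sketch of the loop behind such a view: poll the most recent sample from each stream and refresh one consolidated snapshot. The console output and the `latest_samples` helper are purely illustrative; a real dashboard would render these values in a plotting or web UI.

```python
# Minimal sketch of a live monitoring refresh loop (console output only).
import time
import random

def latest_samples():
    # Placeholder: in practice this would read the most recent value from
    # each capture buffer rather than generate random numbers.
    return {
        "chord": random.choice(["V7", "I", "ii", "IV"]),
        "eeg_alpha": random.uniform(0.0, 1.0),
        "heart_rate": random.uniform(55, 95),
        "face_valence": random.uniform(-1, 1),
    }

for _ in range(5):                       # a few refresh cycles for the demo
    snapshot = latest_samples()
    line = " | ".join(f"{k}: {v:.2f}" if isinstance(v, float) else f"{k}: {v}"
                      for k, v in snapshot.items())
    print(f"t={time.strftime('%H:%M:%S')}  {line}")
    time.sleep(0.5)
```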

4. Synchronized Dataset

Every session produces a structured dataset linking each physiological response directly to the musical moment that triggered it.
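One common way to build such a dataset is an as-of join, matching each physiological sample to the most recent musical event at or before its timestamp. The sketch below uses pandas with illustrative column names; these are assumptions rather than the project's actual schema.

```python
# Sketch: align physiological samples with the musical event active at each
# timestamp, producing one synchronized table per session.
import pandas as pd

events = pd.DataFrame({
    "t": [12.4, 14.2, 18.0],             # event onsets in seconds
    "chord": ["V7", "I", "ii"],
})
physio = pd.DataFrame({
    "t": [12.5, 13.1, 14.3, 18.2],        # sample timestamps in seconds
    "heart_rate": [72, 75, 71, 78],
    "eeg_alpha": [0.41, 0.38, 0.52, 0.47],
})

# merge_asof pairs each sample with the event whose onset most recently
# preceded it, yielding one row per response with its musical context.
dataset = pd.merge_asof(physio.sort_values("t"), events.sort_values("t"), on="t")
print(dataset)
```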

System Design

The system brings together every signal — audio, brain activity, heart rate, facial expression, and behavioral response — into a single, time-aligned interface. Researchers can see exactly what a listener was experiencing, and when. It grew directly out of the research process, translating an identified gap in multimodal human-response analysis into a functional, integrated research tool.

Interaction Design

The interface makes invisible processes visible. As music plays, every signal — harmonic events, facial expressions, brain activity, and heart rate — appears in real time, giving researchers a moment-by-moment picture of the listener's experience.

  • Musical events: each chord and structural moment is labeled and tracked live.
  • Behavioral signals: facial expressions and task responses capture attention and engagement.
  • Physiological signals: EEG, heart rate, and HRV reflect the body's continuous response.
  • Temporal alignment: all streams are synchronized so every response can be traced back to its musical cause.
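As a small example of tracing a response back to its musical cause, the helper below (a hypothetical `response_window` function with an assumed 3-second window) pulls the physiological samples that follow a given event onset.

```python
# Sketch: given an event onset, return the response samples in the window
# that follows it. Window length and data layout are illustrative assumptions.
def response_window(samples, onset_s, window_s=3.0):
    """Return (timestamp, value) pairs within `window_s` seconds after an onset."""
    return [(t, v) for t, v in samples if onset_s <= t < onset_s + window_s]

heart_rate = [(12.5, 72), (13.1, 75), (14.3, 71), (16.9, 74), (18.2, 78)]
print(response_window(heart_rate, onset_s=14.2))   # samples following the "I" chord
```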

The result is a research tool that turns a complex, multi-layered human experience into something measurable, comparable, and actionable.

Impact

For the first time, researchers can observe how a listener's brain, body, and behavior respond to specific musical moments — all in one place, in real time. This creates a foundation for music systems that genuinely understand their users. The dataset generated by this system can directly inform how music recommendation algorithms are trained, how generative systems adapt to emotional state, and how music-based interventions are designed and evaluated in health contexts.

Demo & Documentation