Demonstration of a multimodal real-time software framework integrating musical structure analysis with behavioral and electrophysiological data.
I am a music theory master's student whose work investigates how musical structure shapes perception in real time. My academic path began in journalism and early music cognition, leading to international conference presentations and a peer-reviewed publication. Over time, my interests expanded toward computational modeling, culminating in the development of a multimodal research framework that synchronizes harmonic analysis with EEG, audio, and behavioral data streams. This system grew out of both my earlier research in generative approaches to musical structure and my industry experience as a UX/Sound Designer, where I created cognitively optimized auditory alerts. Currently at UMass Amherst, I continue to integrate music theory, signal processing, and human-centered design to examine how harmonic motion, attention, and emotional response interact within dynamic listening environments.
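To make the idea of synchronizing streams concrete, below is a minimal illustrative sketch, not the author's actual framework: it assumes the harmonic-analysis events and the EEG recording share a common clock, and maps each chord-onset timestamp to its nearest EEG sample index. All names (align_events_to_eeg, sfreq, the chord annotations) are hypothetical.

```python
# Minimal sketch (assumed design, not the author's implementation):
# align timestamped harmonic-analysis events to an EEG sample grid
# that shares a common clock with the annotation stream.
import numpy as np

def align_events_to_eeg(event_times_s, eeg_start_s, sfreq, n_samples):
    """Map event onset times (seconds, shared clock) to nearest EEG sample indices.

    Events that fall outside the recorded EEG window are dropped.
    """
    event_times_s = np.asarray(event_times_s, dtype=float)
    # Convert onset times to fractional sample positions relative to EEG start.
    positions = (event_times_s - eeg_start_s) * sfreq
    indices = np.round(positions).astype(int)
    # Keep only events that land inside the recording.
    valid = (indices >= 0) & (indices < n_samples)
    return indices[valid], valid

if __name__ == "__main__":
    sfreq = 250.0                # EEG sampling rate in Hz (assumed)
    eeg_start_s = 12.000         # EEG stream start time on the shared clock
    n_samples = int(60 * sfreq)  # one minute of EEG
    # Hypothetical chord-onset annotations from the harmonic-analysis stream.
    chords = [("I", 12.48), ("V7", 14.02), ("vi", 15.77), ("IV", 17.31)]
    onsets = [t for _, t in chords]
    idx, valid = align_events_to_eeg(onsets, eeg_start_s, sfreq, n_samples)
    kept = [c for c, v in zip(chords, valid) if v]
    for (label, t), i in zip(kept, idx):
        print(f"{label:>3} at {t:7.3f}s -> EEG sample {i}")
```

Nearest-sample rounding like this is only a starting point; a real-time system would also need to handle clock drift and stream jitter, which this sketch deliberately ignores.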
Vancouver, BC, Canada - JW Marriott Parq Vancouver Hotel
Poster presentation: "Dynamic Temporal Alignment of EEG and Music: A Novel Framework for Real-Time Music
Cognition Research."
Scheduled for March 8, 2026
Bologna, Italy - Palazzo dei Congressi
Workshop (90 minutes): "A Multimodal Real-Time Tool for Music Therapy: Investigating Musical Memory and Emotional Engagement in Cognitive Decline."