BioSensing Music

Role

UX & UI Designer


Tools

Figma




Team

Project Manager


A UX research and product design prototype at the frontier of biometric music personalization

BioSensing Music

Understanding Music Taste Through Bio-Sensing Behavior

A speculative UX research and product design project exploring how physiological responses to music can reveal deeper, unconscious preferences.

Overview

Music profoundly affects the human body and mind — yet most streaming platforms rely solely on behavioral data like skips or replays to infer preferences. What if we could understand someone's music taste by measuring how their body feels during listening, rather than what they say they like?

This project explores the intersection of biometric sensing and emotional design: using heart rate, brainwaves, stress markers, and pupil dilation to map a listener’s true emotional response to music.

Research Synthesis

Although self-initiated, this project imagined future collaborations with:

  • Wearable Tech Providers (e.g. Muse, Apple Watch)
  • Streaming Platforms (e.g. Spotify, Apple Music)
  • Emotionally-aware Users (e.g. listeners with ADHD, anxiety, or other neurodivergent traits)

Frameworks from behavioral science, biometric UX, and affective computing were used to validate the design direction.

Challenge

How might we reveal a user’s authentic music preferences by decoding their physiological reactions — beyond likes, skips, or playlists?

Research

Cardiac & HRV Responses to Music

  • Roque et al. (2013) found both relaxing (baroque) and intense (metal) music decreased HRV — suggesting moderate physiological stress driven by sound volume.
  • Trappe & Voit (2016) showed that classical music (e.g. Mozart) lowered heart rate and blood pressure significantly, while pop music (ABBA) had no effect.
  • Balchin et al. (2021) found that jazz, piano, and lo-fi genres increased HRV, improving cardiovascular recovery during focused listening.
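The HRV findings above hinge on one measurable quantity: beat-to-beat variation in heart rhythm. As a rough illustration, a common time-domain HRV metric (RMSSD) can be computed from successive RR intervals; the interval series below are hypothetical, not data from any cited study.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (a standard
    time-domain HRV metric). Higher values generally indicate a calmer,
    more parasympathetically driven state."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A relaxed listener shows larger beat-to-beat variation than a stressed one:
calm = [810, 845, 790, 860, 820]       # hypothetical RR series (ms)
stressed = [700, 702, 699, 701, 700]
print(rmssd(calm) > rmssd(stressed))   # True
```

This is why "jazz, piano, and lo-fi increased HRV" reads as a recovery signal: the variation between beats widens as the body settles.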

EEG & Brainwave Correlates

  • Rogenmoser et al. (2016) linked emotional music to changes in alpha and theta activity — with high arousal suppressing alpha, and positive valence boosting frontal theta.
  • Ramkhalawansingh et al. (2018) summarized that genre, liking, and emotional tone produce distinct EEG patterns: e.g. Mozart increases frontal alpha, while disliked music does not.

Pupil Dilation as Emotional Arousal

  • Gingras et al. (2015) showed that pupil size increased with emotionally arousing music. Personal involvement with music amplified this effect — but liking a song could reduce the arousal response, showing emotional regulation.

Stress & Autonomic Measures

  • Linnemann et al. (2015): relaxing self-chosen music reduced salivary cortisol and alpha-amylase.
  • Thoma et al. (2013): relaxing music before a stressor improved post-stress recovery (faster ANS normalization).
  • Adiasto et al. (2022): meta-analysis found music’s stress-reducing effects varied widely depending on genre, tempo, and user intent.

Individual Differences

  • Personal traits (e.g. high absorption, trait anxiety) intensified physiological responses.
  • Musical genre preference didn’t always predict response — pointing to subconscious or context-based factors.

Design Process

Music Timeline + Biometric Overlay

A chronological interface where users can see what music they listened to and how their body responded — in real time. This moves beyond play counts or likes, letting users reflect on how music actually made them feel physiologically.

For each song played, the timeline visualizes biometric changes using colored overlays:

Pupil Dilation (computer/laptop only)
  • Indicates emotional arousal
  • Dilated pupils = heightened emotional intensity
  • Constricted pupils = calm or disengagement

Stress Markers
  • High stress = reduced heart rate variability (HRV), elevated cortisol
  • Low stress = increased HRV, calmer baseline

Brain Wave Patterns (EEG)
  • Alpha suppression = arousal or mental effort
  • Frontal theta = emotional engagement or focus
  • High beta = anxiety or high alert

Heart Rate (BPM)
  • Increased BPM = excitement, anxiety, or movement
  • Decreased BPM = relaxation or disengagement
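The overlay legend above is essentially a mapping from raw readings to qualitative states. A minimal sketch of that mapping, with placeholder thresholds and baselines (none of these numbers come from the prototype), might look like:

```python
def classify_heart_rate(bpm, resting_bpm=65):
    """Map BPM relative to a resting baseline onto the overlay's labels.
    The 15%/5% thresholds are illustrative assumptions."""
    if bpm > resting_bpm * 1.15:
        return "excitement/anxiety/movement"
    if bpm < resting_bpm * 0.95:
        return "relaxation/disengagement"
    return "baseline"

def classify_stress(hrv_rmssd_ms, baseline_rmssd_ms=50):
    """Lower HRV than the listener's baseline renders as the
    high-stress overlay color; higher HRV as low stress."""
    return "high stress" if hrv_rmssd_ms < baseline_rmssd_ms else "low stress"

# Per-song overlay state for a hypothetical listening moment:
print(classify_heart_rate(92), "|", classify_stress(34))
```

In the prototype, a classification like this would drive the colored overlay drawn along each song's segment of the timeline.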

Bio-Matched Discovery

Instead of using genres, this recommendation engine finds music that resonates with a user’s biological response. For example:

  • A track that reduces stress via increased HRV
  • A focus-inducing song based on increased frontal theta
  • A motivational track that raises arousal (reflected in alpha suppression) without elevating stress markers
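The "resonance" logic can be sketched as a scoring problem: rank tracks by how well their previously measured biometric deltas align with the listener's current goal. The goal weightings and track data below are illustrative assumptions, not values from the prototype.

```python
GOALS = {
    # goal -> desired signed direction of change per signal
    "reduce_stress": {"hrv": +1, "heart_rate": -1},
    "focus":         {"frontal_theta": +1, "alpha": -1},
}

def resonance_score(track_deltas, goal):
    """Sum each measured delta, signed by whether the goal wants that
    signal to rise (+1) or fall (-1). Missing signals count as zero."""
    return sum(direction * track_deltas.get(signal, 0.0)
               for signal, direction in GOALS[goal].items())

def recommend(tracks, goal):
    """Rank track names by resonance with the listener's goal."""
    return sorted(tracks, key=lambda t: resonance_score(tracks[t], goal),
                  reverse=True)

# Hypothetical per-track biometric deltas observed during past listening:
tracks = {
    "lofi_a":  {"hrv": +8.0, "heart_rate": -4.0},
    "metal_b": {"hrv": -6.0, "heart_rate": +12.0},
}
print(recommend(tracks, "reduce_stress"))  # ['lofi_a', 'metal_b']
```

A genre label never enters the ranking: a metal track that happened to calm a particular listener would outrank a lo-fi track that did not.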

Outcome

  • Created a high-fidelity prototype in Figma, visualizing music-body interaction
  • Designed data architecture linking physiological signals to emotional states
  • Proposed new recommendation logic based on resonance, not genre
  • Developed real-time visualization tools for emotion-aware playback

Takeaways

Designing for the Invisible
This project pushed me to work with signals users don’t see or verbalize — like brainwaves or pupil dilation — and translate them into intuitive UX.

From Data to Meaning
Interpreting raw signals (like alpha suppression or salivary cortisol) into emotional insight required metaphor, context, and deep empathy.

Tech that Feels
Bio-sensing interfaces have the potential to deepen self-awareness. I now think more about software as an extension of emotional intelligence.

Final Thought

Music is more than preference — it’s embodied emotion. This project imagines a future where our tools don't just follow our clicks, but understand how we feel.

Thanks for stopping by!

Connect with me to create something enduring!

Design is born of order, shaped by vision,
and destined to endure.