Sonifying heart signals to understand team dynamics in COVID care.
Figure 1: Distances in the pitch class helix of the spiral array, used for mapping data to sounds.
Figure 2: TFC data for the patient versus a medical team member.
Figure 3: Project workflow.
Figure 4: Visualization of TFC data.
Videos 1 and 2: Visualization and sonic representation of the heartbeats of the patient and the medical team members. Color-coded squares indicate the degree of synchronization between the patient (top) and the medical team members.
Keywords
Music Tech, User Research, Data Science.
Technologies
Python, R, & MATLAB (data preparation and analysis), SuperCollider (sound synthesis), Processing (visualization).
Background
The COVID-19 pandemic underscored the potential of data sonification, a technique that translates numerical data into sound. While many such applications focused on epidemiological data, the human experiences during the pandemic were largely overlooked. This project aimed to address that gap by sonifying the heartbeats of a COVID-19 patient and a medical team, offering a unique auditory representation of their shared experience.
Aim
The aim of this work was to create a musical representation of heart signals that reflects how a medical team comes together during the treatment of a COVID-19 patient. More specifically, the goal was to explore Time-Frequency Coherence (TFC) and heartbeat rhythms within this group.
Approach
To quantify the relationship between the heart signals of the patient and the medical team, we analyzed Heart Rate Variability (HRV) data. These data were processed with a custom MATLAB script to calculate Time-Frequency Coherence (TFC), a time-resolved measure of spectral coherence between two signals.
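To illustrate the underlying idea, the sketch below estimates spectral coherence between two synthetic heart-rate series using SciPy's `coherence` (ordinary magnitude-squared coherence, not the project's custom MATLAB time-frequency analysis); the signals, sampling rate, and noise levels are all assumptions for demonstration only.

```python
import numpy as np
from scipy.signal import coherence

# Two synthetic HRV-like series (assumed 4 Hz resampled heart-rate signals);
# the actual analysis used a custom MATLAB time-frequency coherence script.
fs = 4.0                                   # assumed resampling rate in Hz
t = np.arange(0, 300, 1 / fs)              # five minutes of data
rng = np.random.default_rng(0)
shared = np.sin(2 * np.pi * 0.1 * t)       # a common ~0.1 Hz (LF band) oscillation
patient = shared + 0.5 * rng.standard_normal(t.size)
clinician = shared + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence per frequency bin (0 = unrelated, 1 = fully coherent)
f, cxy = coherence(patient, clinician, fs=fs, nperseg=256)

# Mean coherence in the low-frequency HRV band (0.04-0.15 Hz)
lf = (f >= 0.04) & (f <= 0.15)
print(round(float(cxy[lf].mean()), 2))
```

Because both series share the 0.1 Hz oscillation, coherence peaks near that frequency; in the uncorrelated bands it stays low.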
I then developed a rule-based system that mapped TFC values to distances between pitches in the spiral array, allowing for the creation of sonic representations of patient-team interactions. In this system, coherent signals correspond to consonant sounds, while incoherent signals are represented by dissonant sounds.
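The rule-based mapping can be sketched as a simple lookup from a TFC value to a musical interval; the thresholds and the specific intervals below are illustrative assumptions (the project's actual rules used distances between pitches in the spiral array), but they preserve the stated principle that coherent signals yield consonance and incoherent signals yield dissonance.

```python
def tfc_to_interval(tfc: float) -> int:
    """Map a TFC value in [0, 1] to an interval in semitones above a root.

    Illustrative rule set: high coherence -> consonant intervals,
    low coherence -> dissonant intervals.
    """
    if not 0.0 <= tfc <= 1.0:
        raise ValueError("TFC must lie in [0, 1]")
    if tfc >= 0.8:
        return 7    # perfect fifth (highly consonant)
    if tfc >= 0.6:
        return 4    # major third
    if tfc >= 0.4:
        return 9    # major sixth
    if tfc >= 0.2:
        return 6    # tritone
    return 1        # minor second (highly dissonant)

# A sliding window of TFC values becomes a sequence of intervals to synthesize
windows = [0.9, 0.65, 0.3, 0.1]
print([tfc_to_interval(v) for v in windows])   # [7, 4, 6, 1]
```

In the project itself, such interval choices would then drive sound synthesis in SuperCollider.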
The effectiveness of this sonic representation was evaluated through a listening experiment with 41 participants, who were presented with two conditions: coherent and incoherent signals. Responses were analyzed using Poisson regression and chi-squared tests. Additionally, interactive visualizations were created to further illustrate the data.
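The chi-squared part of the analysis can be sketched as a test of independence on a 2x2 table of correct versus incorrect identifications per condition; the counts below are made up for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical response counts:   correct  incorrect
table = np.array([[30,      11],    # coherent condition
                  [27,      14]])   # incoherent condition

# Chi-squared test of independence between condition and correctness
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}, dof={dof}")
```

A small p-value would indicate that identification accuracy differs between the coherent and incoherent conditions.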
Findings
Results demonstrated that the proposed system lets listeners distinguish between low and high coherence between heart signals, with an overall accuracy of 69%. This study underscores the potential of sonifying physiological data to convey complex relationships and to enhance understanding of cardiovascular health.