Fly Me to the Moon: A Pilot Experiment on Music Neuroscience
Background and Aim
Cooperation in music performance is a foundational subject in the study of music cognition. When musicians perform together, they take on different roles (leader, follower, or observer), creating a dynamic interplay. But what exactly occurs in their brains during these interactions? This experiment investigates the neural mechanisms underlying cooperative music performance, focusing on how brain activity reflects these roles.
The Pilot Experiment
The accompanying video showcases one of our pilot experiments conducted at the Tsinghua Laboratory of Brain and Intelligence. Musicians were equipped with functional near-infrared spectroscopy (fNIRS) devices, which measure blood oxygen levels in various brain regions. In this experiment, the performers were researchers themselves, myself included (on clarinet).
Key Visuals in the Video
Bottom Left Corner: A live video of the performance involving three performers and six listeners, all equipped with HuiChuang fNIRS devices.
Top Left Corner: A real-time three-dimensional audio spectrum, analyzed with the Insight 2 plug-in in Pro Tools (see the spectrogram sketch after this list).
Right Side: Brain activity (HbO images) of the three performers, providing a glimpse of oxygenation patterns during specific moments of the performance.
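The spectrum shown in the video comes from the Insight 2 plug-in running inside Pro Tools, but a similar time-frequency picture can be reproduced offline for later analysis. The snippet below is a minimal sketch in Python using SciPy and Matplotlib, assuming a hypothetical mono mixdown of the performance named performance_mix.wav; it illustrates the idea behind the display, not the plug-in's actual algorithm.

```python
# Minimal offline spectrogram of the performance audio (illustrative only).
# "performance_mix.wav" is a hypothetical mixdown, not a file from the study.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

fs, audio = wavfile.read("performance_mix.wav")  # sample rate, samples
if audio.ndim > 1:                               # fold stereo down to mono
    audio = audio.mean(axis=1)

# Short-time Fourier analysis: ~46 ms windows at 44.1 kHz, 50% overlap.
freqs, times, power = spectrogram(audio, fs=fs, nperseg=2048, noverlap=1024)

plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Spectrogram of the performance (illustrative)")
plt.show()
```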
Initial Observations
1. 0:42 - 0:50: During the clarinet's melody in Section A2, noticeable activation appears in the frontal area, associated with emotional processing.
2. 1:12 - 1:30: While the bass player performs the walking bass in Section B1, significant left frontal activation occurs, persisting throughout the section.
3. 1:45 - 2:00: In Section B2, as the pianist takes over the melody, activation increases in the frontal and parietal regions, linked to motor functions.
These observations, while intriguing, are preliminary: the readings may be inaccurate, and fNIRS measures a hemodynamic response that lags the underlying neural activity by several seconds. Rigorous analysis and replication are essential before drawing robust conclusions.
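Turning these eyeballed moments into numbers would mean averaging the HbO traces over repeated section onsets rather than reading single video frames. The following is a minimal sketch of such block averaging in Python with NumPy, assuming a hypothetical array hbo of shape (channels, samples), the recording's sampling rate fs, and a list of section-onset times in seconds; it illustrates the idea, not the analysis pipeline we actually plan to use.

```python
import numpy as np

def section_average(hbo, fs, onsets_s, window_s=(0.0, 10.0)):
    """Average the HbO response over repeated section onsets.

    hbo      : array (n_channels, n_samples) of HbO concentration change
    fs       : fNIRS sampling rate in Hz
    onsets_s : onset times in seconds (e.g., every start of Section B1)
    window_s : analysis window relative to each onset, in seconds
    """
    start = int(window_s[0] * fs)
    stop = int(window_s[1] * fs)
    epochs = []
    for onset in onsets_s:
        i0 = int(onset * fs) + start
        i1 = int(onset * fs) + stop
        if i0 >= 0 and i1 <= hbo.shape[1]:
            segment = hbo[:, i0:i1]
            # Baseline-correct each channel to its value at the onset sample.
            epochs.append(segment - segment[:, :1])
    # Mean across repetitions: one averaged time course per channel.
    return np.mean(epochs, axis=0)
```

Averaging over repetitions (or over multiple takes of the piece) attenuates noise and slow hemodynamic drift, which is what makes single-frame readings like those above unreliable on their own.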
Beyond Science: The Future of Interactive Music
This pilot experiment extends beyond traditional neuroscience, exploring the potential of integrating brain activity into interactive music performances. Such endeavors could revolutionize audio-visual experiences, enabling real-time interaction between performers' neural states and audience perception. Notable precedents include concerts featuring real-time brain activity projections, such as those by pianist Kong Xiangdong.
Looking ahead, I aim to deepen the understanding of the relationship between music and brain activity. This research aspires to broaden the horizons of interactive and audio-visual performances, creating new opportunities for artistic and scientific exploration.


Appendix: Data Processing Details
Data processed using NirSpark V1.7.5.
Preprocessing:
Automatic motion correction; band-pass filter (0.01–1 Hz); conversion to hemoglobin concentrations via the modified Beer–Lambert law (DPF = 6).
Colorbar:
±0.5 mmol/L·mm (a relatively small range, chosen so that the activation patterns are clearly visible in the images).
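NirSpark is a GUI tool, so the steps above are menu settings rather than a script. For readers who prefer code, here is a rough equivalent sketched with the open-source MNE-Python toolchain, assuming a hypothetical SNIRF export of one performer's recording; MNE's algorithms (e.g., TDDR for motion correction) are not identical to NirSpark's, so this is an approximation of the same chain, not the pipeline actually used.

```python
# Sketch of an equivalent preprocessing chain in MNE-Python (illustrative).
# "performer1.snirf" is a hypothetical export; algorithms differ from NirSpark.
import mne
from mne.preprocessing.nirs import (
    optical_density,
    temporal_derivative_distribution_repair,
    beer_lambert_law,
)

raw = mne.io.read_raw_snirf("performer1.snirf", preload=True)

# 1. Raw intensity -> optical density.
raw_od = optical_density(raw)

# 2. Motion-artifact correction (TDDR, standing in for "auto motion correction").
raw_od = temporal_derivative_distribution_repair(raw_od)

# 3. Band-pass filter 0.01-1 Hz, as in the settings above.
raw_od.filter(l_freq=0.01, h_freq=1.0)

# 4. Optical density -> HbO/HbR concentrations via the modified Beer-Lambert law,
#    with a partial pathlength factor of 6 (the DPF = 6 setting above).
raw_haemo = beer_lambert_law(raw_od, ppf=6.0)
```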