AffectMachine

The AffectMachine is a controllable music generation system that lets users steer the output music toward a desired emotional state. It was originally designed to be embedded in biofeedback systems (such as brain-computer interfaces) to help users become aware of, and ultimately mediate, their own dynamic affective states.
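As a rough illustration of what "controllable" means here, the sketch below maps a target point in valence-arousal space to coarse musical parameters (tempo, mode, volume, register). The function names, ranges, and mappings are hypothetical and do not reflect the actual AffectMachine codebase; the system's real generative rules are described in the publication listed at the bottom of this page.

```python
# Hypothetical sketch of affect-conditioned music parameters.
# The mappings below are illustrative assumptions, not the
# actual AffectMachine API.

from dataclasses import dataclass


@dataclass
class MusicParams:
    tempo_bpm: float   # faster tempo for higher arousal
    mode: str          # major for positive valence, minor for negative
    volume: float      # louder for higher arousal (0.0-1.0)
    register: int      # pitch offset in semitones; higher for positive valence


def affect_to_params(valence: float, arousal: float) -> MusicParams:
    """Map valence and arousal (each in [-1, 1]) to musical parameters."""
    return MusicParams(
        tempo_bpm=90 + 50 * arousal,             # roughly 40-140 BPM
        mode="major" if valence >= 0 else "minor",
        volume=0.5 + 0.4 * arousal,
        register=int(12 * valence),              # shift up to an octave
    )


print(affect_to_params(valence=0.6, arousal=-0.3))
```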

Using the AffectMachine together with the Empatica E4 for physiological sensing.

Across several experiments and demos, we tested the system with a range of input physiological signals, including EEG, electrodermal activity (EDA), and heart rate.
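To give a sense of how such signals could drive the generator, here is a minimal sketch that reduces heart rate and EDA to an arousal estimate and uses frontal alpha asymmetry from EEG as a rough valence proxy. The baselines, weightings, and function names are assumptions for illustration, not the system's actual signal-processing pipeline.

```python
# Illustrative reduction of physiological readings to a valence-arousal
# estimate. Baselines and ranges are assumed, not taken from the system.

def estimate_arousal(heart_rate_bpm: float, eda_microsiemens: float) -> float:
    """Combine heart rate and EDA into an arousal score in [-1, 1]."""
    hr_norm = (heart_rate_bpm - 70) / 30       # assume ~70 BPM resting baseline
    eda_norm = (eda_microsiemens - 2.0) / 3.0  # assume ~2 uS resting baseline
    arousal = 0.5 * hr_norm + 0.5 * eda_norm
    return max(-1.0, min(1.0, arousal))


def estimate_valence(alpha_left: float, alpha_right: float) -> float:
    """Frontal alpha asymmetry as a crude valence proxy in [-1, 1].

    Relatively higher right-frontal alpha power (i.e. greater left-frontal
    activity) is commonly read as more positive affect.
    """
    asymmetry = (alpha_right - alpha_left) / (alpha_right + alpha_left)
    return max(-1.0, min(1.0, asymmetry))


print(estimate_arousal(heart_rate_bpm=85, eda_microsiemens=4.0))
print(estimate_valence(alpha_left=0.8, alpha_right=1.1))
```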

Showcasing the AffectMachine together with the Muse 2 at the ArtScience Museum in Singapore, as part of the MENTAL: Colours of Wellbeing exhibition.

Role

I worked on developing the AffectMachine together with Prof. Kat Agres, Adyasha Dash, and a group of composers at the Yong Siew Toh Conservatory of Music. I took point on developing the codebase for the AffectMachine and helped translate the composers' musical ideas into code. My gratitude to Nishka Khendry, who assisted with development as part of her undergraduate final-year project, and to Prof. Kat for bringing me on board.


Publications

AffectMachine-Classical: a novel system for generating affective classical music
Frontiers in Psychology, 2023
Kat Agres, Adyasha Dash, Phoebe Chua