AffectMachine
The AffectMachine is a controllable music generation system that lets users steer the output music toward a desired emotional state. It was originally designed to be incorporated into biofeedback systems (such as brain-computer interfaces) to help users become aware of, and ultimately regulate, their own dynamic affective states.
Across several experiments and demos, we tested the system with various input physiological signals including EEG, electrodermal activity (EDA), and heart rate.
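To give a flavour of how such a system can be steered, here is a minimal, hypothetical sketch of mapping an affect state (valence and arousal, each in [-1, 1]) to musical parameters. The function name, parameter ranges, and mapping rules are illustrative assumptions for this page, not the AffectMachine's actual implementation.

```python
def affect_to_params(valence: float, arousal: float) -> dict:
    """Illustrative mapping from valence/arousal in [-1, 1] to musical parameters."""
    # Clamp inputs to the expected range (hypothetical convention).
    valence = max(-1.0, min(1.0, valence))
    arousal = max(-1.0, min(1.0, arousal))
    # Higher arousal -> faster tempo (illustrative 60-140 BPM range).
    tempo_bpm = 100 + 40 * arousal
    # Positive valence -> major mode; negative -> minor.
    mode = "major" if valence >= 0 else "minor"
    # Arousal also scales loudness (illustrative MIDI velocity range).
    velocity = int(75 + 35 * arousal)
    return {"tempo_bpm": tempo_bpm, "mode": mode, "velocity": velocity}

print(affect_to_params(0.8, 0.5))
# → {'tempo_bpm': 120.0, 'mode': 'major', 'velocity': 92}
```

In a biofeedback setting, the valence/arousal estimate would be derived continuously from the physiological signal (e.g. EEG or EDA), so the music shifts as the listener's state shifts.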
Role
I developed the AffectMachine together with Prof. Kat Agres, Adyasha Dash, and a group of composers at the Yong Siew Toh Conservatory of Music. I took point on the AffectMachine codebase and helped translate the composers' musical ideas into code. My gratitude to Nishka Khendry, who assisted with development as part of her undergraduate final-year project, and to Prof. Kat for bringing me on board.
Publications
AffectMachine-Classical: a novel system for generating affective classical music
Frontiers in Psychology, 2023
Kat Agres, Adyasha Dash, Phoebe Chua