Abletweet: Harnessing Social Media APIs for Encoding, Co-Creating and Performing Improvised Generative Electronic Music

James Curtis

Abstract

Abletweet is a novel Max for Live device that uses the Twitter API and Node.js to facilitate improvised generative electronic music performances inside Ableton Live. The generation of MIDI sequence data from tweet text is shaped parametrically by the performer using musically appropriate functions such as scale and rhythmic quantisation. Keywords identified through sentiment analysis, using the AFINN-165 wordlist and the Emoji Sentiment Ranking, accent the generated sequences with minor and major triad chords according to their sentiment ranking. The device is also designed to be bidirectional: existing MIDI clips created by the performer can be encoded as tweets, sent over Twitter to other users, and imported into Ableton Live in real time, harnessing social media as a communication platform for creativity, co-creation and performance. Abletweet seeks to encourage artists both to harness social media APIs as a performable, bidirectional networking protocol and to consider openly accessible data as a medium with which to create musical works.
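As a rough sketch of the kind of mapping the abstract describes, the Node.js fragment below converts tweet characters into scale-quantised MIDI pitch numbers and accents sentiment-bearing words with major or minor triads. It assumes the `sentiment` npm module, which combines the AFINN-165 wordlist with the Emoji Sentiment Ranking; the specific character-to-pitch mapping, scale table and triad voicing are illustrative guesses rather than Abletweet's actual implementation.

```javascript
// Illustrative sketch only: the scale table, character-to-pitch mapping and
// triad logic are assumptions, not Abletweet's documented internals.
const Sentiment = require('sentiment'); // npm module built on AFINN-165 + Emoji Sentiment Ranking
const sentiment = new Sentiment();

const C_MAJOR = [0, 2, 4, 5, 7, 9, 11]; // scale degrees as semitones above the root

// Snap a raw MIDI pitch to the nearest note of the chosen scale.
function quantiseToScale(pitch, scale = C_MAJOR, root = 60) {
  const rel = pitch - root;
  const octave = Math.floor(rel / 12);
  const degree = ((rel % 12) + 12) % 12;
  const nearest = scale.reduce((a, b) =>
    Math.abs(b - degree) < Math.abs(a - degree) ? b : a);
  return root + octave * 12 + nearest;
}

// Map each character of a tweet onto a two-octave range, then quantise.
function tweetToNotes(text, root = 60) {
  return [...text].map((ch) =>
    quantiseToScale(root + (ch.codePointAt(0) % 24), C_MAJOR, root));
}

// Accent a sentiment-bearing word with a major or minor triad,
// following the word's AFINN-165 ranking.
function accentChord(word, rootNote = 60) {
  const score = sentiment.analyze(word).score;
  if (score === 0) return [rootNote];  // neutral word: plain note
  const third = score > 0 ? 4 : 3;     // positive: major third; negative: minor
  return [rootNote, rootNote + third, rootNote + 7];
}

console.log(tweetToNotes('what a wonderful gig').slice(0, 8));
console.log(accentChord('wonderful')); // [60, 64, 67]: a C major triad
```

In a Max for Live context, pitch lists like these would typically be passed from the Node process back to the Max patch to be written into a MIDI clip; the snippet stops at the note data itself.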

Bio

James Curtis is a graduate researcher and sessional lecturer at RMIT University, Melbourne. Having completed a double degree in Fine Art (Sound and Spatial Practice) and Design (Industrial, First Class Honours), James received the Vice Chancellor's Award for Academic Excellence for his research in sound and interaction design in 2017. He is currently undertaking a PhD in the RMIT School of Design on a full-time scholarship; his research focuses on developing Artificial Intelligence-mediated creative design tools for musicians.