An Algorave with FoxDot

Ryan Kirkbride

University of Leeds, Leeds, United Kingdom [email protected]

Description

FoxDot is a new system developed as an extension to the Python programming language that interfaces with SuperCollider to create music. While FoxDot’s development is still in its infancy, the purpose of this proposal to perform at the International Conference on Live Interfaces (ICLI) is to showcase its ability to create music easily, quickly, and in a human-readable format. This performance will not only demonstrate the advantages of integrating an existing programming language into a Live Coded music performance by importing existing libraries, whose output will be rerouted into musical patterns, but will also exemplify the ease of “blank slate” performances with this new system.

Figure 1. Live image processing with FoxDot

Live coding can be used to create music from a wide range of genres but is most commonly associated with live performances of dance music at events known as “Algoraves”. Performances usually consist of one or more laptop performers using a programming language to create music while projecting their screens, so that audience members can gain an insight into the performer’s creative thinking by watching the code being written in real time. “Algorave” events are designed to get people dancing, and the nature of Live Coding allows the performer to react and engage with the audience and create a fun and exciting atmosphere. For this reason, my proposal is to perform using FoxDot in a semi-improvised “Algorave” style in a venue for a duration of 25 to 30 minutes. The music will be generated by combining synthesised sounds and the manipulation of samples through the use of objects that iterate over musical patterns in an algorithmic fashion, as sketched in the short example below.
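To give a flavour of this approach, the following is a minimal illustrative sketch of FoxDot's pattern-player style; the instrument names and pattern values are drawn from FoxDot's defaults and are placeholders rather than material from the planned set.

    from FoxDot import *

    # Set the tempo of the shared clock
    Clock.bpm = 130

    # A synthesised bass line that iterates over a pattern of scale degrees
    b1 >> bass([0, 0, 3, 4], dur=[1, 1/2, 1/2, 1], amp=0.8)

    # A sample-based drum pattern: each character maps to a percussion sound
    d1 >> play("x-o-x-o[--]")

    # Lines can be re-evaluated live to change the music while it plays
    b1 >> bass([0, 2, 4, 7], dur=1/2)

Because each player object simply steps through its pattern in time with the clock, re-evaluating a line replaces the pattern without stopping the music, which is what makes the semi-improvised “blank slate” style practical in performance.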

FoxDot is a new interface for musical expression and, as a Live Coding language, is inherently a form of notation for Human-Computer Interaction (HCI). By projecting the screen, the performance also becomes audiovisual in nature. One of the advantages of using Python as the foundation language for FoxDot is the ease with which external code can be imported into a performance, either from Python’s existing library or from a user’s own module. This is demonstrated in one of my pieces, “Webs”1, which uses a Python module for downloading web pages and converts the HTML into music. While the type of music generated in this instance is not appropriate for a nightclub setting, I am currently writing a plug-in module using OpenCV to complement the “Algorave” style I intend to perform (see Figure 1). The plug-in connects to, and displays the image captured from, a web-cam and generates Open Sound Control (OSC) messages to send to SuperCollider based on my gestures. I will then perform live-coded image processing to alter the display and consequently change the sonic output, combining multiple types of HCI into one performance. A rough sketch of this capture-and-send loop is given below.
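As an illustration only (the plug-in itself is still in development, and the OSC address, port, and brightness-based control value here are assumptions), a capture-and-send loop of this kind might look as follows, using OpenCV to read web-cam frames and the python-osc library to forward a control value to SuperCollider:

    import cv2
    from pythonosc.udp_client import SimpleUDPClient

    # sclang listens for OSC messages on port 57120 by default
    client = SimpleUDPClient("127.0.0.1", 57120)

    cap = cv2.VideoCapture(0)  # open the default web-cam

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # Derive a simple control value from the image, e.g. average brightness
        brightness = float(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean()) / 255.0

        # Forward the value to SuperCollider; the address "/foxdot/gesture" is hypothetical
        client.send_message("/foxdot/gesture", brightness)

        # Display the (possibly processed) image so the audience can see it
        cv2.imshow("FoxDot camera", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

In performance, the image-processing steps applied to each frame would themselves be live coded, so that changing the processing chain visibly alters the projected image and audibly alters the control data sent to SuperCollider.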

Biography

Ryan Kirkbride graduated from the University of Leeds in 2014 with a first class degree in Computer Science before completing his MA in the summer of 2015. He started working on FoxDot as part of his master’s module in Composition and has since continued its development. One of Ryan’s research interests lies in algorithmic composition, and his master’s dissertation, “The Infinite Remix Machine”2, was part of the research workshop at the Electronic Visualisation and the Arts (EVA) 2015 conference in London. He is currently in the first year of his PhD, studying the use of non-verbal communication in ensemble performances using motion capture technology, and spends his free time working on FoxDot and researching Live Coding.

1 https://www.youtube.com/watch?v=EnaKvs-GlYo
2 http://ewic.bcs.org/content/ConWebDoc/54873