Article

User-Influenced/Machine-Controlled Playback: The variPlay Music App Format for Interactive Recorded Music

Justin Paterson 1,*, Rob Toulson 2 and Russ Hepworth-Sawyer 3

1 London College of Music, University of West London, St Mary’s Rd, Ealing, London W5 5RF, UK
2 School of Media Art and Design, University of Westminster, Watford Rd, Northwick Park, Middlesex HA1 3TP, UK
3 School of Performance & Media Production, York St John University, Lord Mayors Walk, York YO31 7EX, UK

* Correspondence: [email protected]
Received: 19 July 2019; Accepted: 29 August 2019; Published: 3 September 2019

Abstract: This paper presents an autoethnography of the five-year ‘variPlay’ project. The project drew on three consecutive rounds of research funding to develop an app format that could host both user interactivity, allowing the sound of recorded music to be changed in real time, and a machine-driven mode that could remix autonomously, playing back a different version of a song on every listen or changing partway through on user demand. The final funded phase involved commercialization, with the release of three apps featuring artists from the roster of the project partner, Warner Music Group. The concept and operation of the app are discussed, alongside reflection on salient matters such as product development, music production, mastering, and issues encountered during commercialization itself. The final apps received several thousand downloads around the world, in territories such as France, the USA, and Mexico. Opportunities for future development are also presented.

Keywords: interactive music; dynamic music; app; remix; automatic remix; commercialization; HCI; music production

1.