An Algorave with FoxDot


An Algorave with FoxDot

Ryan Kirkbride, University of Leeds, Leeds, United Kingdom, [email protected]

Description

FoxDot is a new Live Coding system, developed as an extension to the Python programming language, that interfaces with SuperCollider to create electronic music. While FoxDot's development is still in its infancy, the purpose of this proposal to perform at the International Conference on Live Interfaces (ICLI) is to showcase its ability to create music easily, quickly, and in a human-readable format. The performance will demonstrate the advantages of integrating an existing programming language into a Live Coded music performance by importing existing libraries, whose application will be rerouted into musical patterns, and will also exemplify the ease of "blank slate" performances with this new system.

Figure 1. Live image processing with FoxDot

Live coding can be used to create music from a wide range of genres but is most commonly associated with live performances of dance or drum and bass music at nightclubs, known as "Algoraves". Performances usually consist of one or more laptop performers using a programming language to create music while projecting their screens, so that audience members can gain an insight into the performer's creative thinking by watching the code being written in real time. "Algorave" events are designed to get people dancing, and the nature of Live Coding allows the performer to react and engage with the audience and create a fun and exciting atmosphere. For this reason my proposal is to perform using FoxDot in a semi-improvised "Algorave" style in a nightclub venue for a duration of 25 to 30 minutes. The music will be generated by combining synthesised sounds with the manipulation of samples, through the use of objects that iterate over musical patterns in an algorithmic fashion.
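The idea of objects that iterate over musical patterns can be sketched in plain Python. This is a minimal illustrative stand-in; the `Pattern` class below is hypothetical and is not FoxDot's actual implementation:

```python
from itertools import cycle

# A hypothetical sketch of a looping musical pattern object,
# not FoxDot's real Pattern class.
class Pattern:
    def __init__(self, values):
        self.values = list(values)
        self._cycle = cycle(self.values)

    def __next__(self):
        # Each call yields the next value, wrapping around indefinitely
        return next(self._cycle)

    def transpose(self, amount):
        # Derive a new pattern with every scale degree shifted
        return Pattern(v + amount for v in self.values)

p = Pattern([0, 2, 4, 7])
notes = [next(p) for _ in range(6)]   # wraps around: [0, 2, 4, 7, 0, 2]
```

In a live-coded set, a player object repeatedly pulls values from patterns like this to schedule notes, which is what lets a single short line of code describe an endlessly repeating musical phrase.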
FoxDot is a new interface for musical expression and, as a Live Coding language, is inherently a form of notation for Human-Computer Interaction (HCI). By projecting the screen, the performance also becomes audiovisual in nature. One of the advantages of using Python as the foundation language for FoxDot is the ease with which external code can be imported into a performance from Python's existing library or a user's own module. This is demonstrated in one of my pieces, "Webs"1, which uses a Python module for downloading web pages and converts the HTML into music. While the type of music generated in that instance is not appropriate for a nightclub setting, I am currently writing a plug-in module using OpenCV to complement the "Algorave" style I intend to perform (see figure 1). The plug-in connects to, and displays the image captured from, a web-cam and generates Open Sound Control (OSC) messages to send to SuperCollider based on my gestures. I will then perform live-coded image processing to alter the display and consequently change the sonic output, combining multiple types of HCI in one performance.

Biography

Ryan Kirkbride graduated from the University of Leeds in 2014 with a first-class degree in Computer Science before completing his MA in Computer Music in the summer of 2015. He started working on FoxDot as part of his master's module in Composition and has continued its development since. One of Ryan's research interests is algorithmic composition, and his master's dissertation, "The Infinite Remix Machine"2, was part of the research workshop at the Electronic Visualisation and the Arts (EVA) 2015 conference in London. He is currently in the first year of his PhD, studying the use of non-verbal communication in ensemble performances using motion capture technology, and spends his free time working on FoxDot and researching Live Coding.

1 https://www.youtube.com/watch?v=EnaKvs-GlYo
2 http://ewic.bcs.org/content/ConWebDoc/54873
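The "Webs" piece maps downloaded HTML onto musical material. As a toy sketch of how such a mapping might work (the function and scale below are hypothetical illustrations, not the actual code behind "Webs"), each character's code point can be folded onto a degree of a musical scale:

```python
# Hypothetical sketch: convert HTML text into scale degrees.
# Not the actual mapping used in the "Webs" piece.
MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the major scale

def html_to_degrees(html, length=8):
    # Keep only letters and digits, then fold each code point
    # onto a degree of the scale
    text = "".join(ch for ch in html if ch.isalnum())
    return [MAJOR[ord(ch) % len(MAJOR)] for ch in text[:length]]

degrees = html_to_degrees("<p>Hello</p>")   # [0, 4, 5, 5, 5, 11, 0]
```

A list of degrees produced this way could then be fed to a pattern-driven player, so that any web page yields a deterministic but unpredictable melody.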
Recommended publications
  • Concerts, Sound Installations, Performances, Artist Talks, Films, Workshops
    KONTAKTE '17, 28 September to 1 October 2017, Akademie der Künste, Berlin. Biennial for Electroacoustic Music and Sound Art: concerts, sound installations, performances, artist talks, films, workshops. A festival presented by the Studio for Electroacoustic Music of the Akademie der Künste, in collaboration with Deutsche Gesellschaft für Elektroakustische Musik, Berliner Künstlerprogramm des DAAD, Universität der Künste Berlin, Hochschule für Musik Hanns Eisler Berlin, Technische Universität Berlin, Klangzeitort, Helmholtz-Zentrum Berlin, Ensemble ascolta, Musik der Jahrhunderte (Stuttgart), Institut für Elektronische Musik und Akustik der Kunstuniversität Graz, Laboratorio Nacional de Música Electroacústica de Cuba, singuhr – projekte, Heroines of Sound, Lebenshilfe Berlin, Deutschlandfunk Kultur and France Culture. Programme sections: overview, concerts, installations, forum, exhibition, workshop, biographies, partners, site map, tickets and information. Contact: Studio für Elektroakustische Musik der Akademie der Künste, Hanseatenweg 10, 10557 Berlin, phone +49 (0)30 20057-2236, www.adk.de/sem, [email protected], www.adk.de/kontakte17, #kontakte17. The two years since the first edition of KONTAKTE in 2015 have been an eventful time for the Studio for Electroacoustic Music.
    In mid-2015 the Studio received a generous donation in kind of decommissioned studio equipment from Deutsche Telekom which, following the necessary planning and maintenance work, has opened up new production possibilities since 2016.
  • An AI Collaborator for Gamified Live Coding Music Performances
    Autopia: An AI Collaborator for Gamified Live Coding Music Performances
    Norah Lorway, Matthew Jarvis, Arthur Wilson, Edward J. Powley, and John Speakman
    Abstract. Live coding is "the activity of writing (parts of) a program while it runs" [20]. One significant application of live coding is in algorithmic music, where the performer modifies the code generating the music in a live context. Utopia is a software tool for collaborative live coding performances, allowing several performers (each with their own laptop producing its own sound) to communicate and share code during a performance. We propose an AI bot, Autopia, which can participate in such performances, communicating with human performers through Utopia. This form of human-AI collaboration allows us to explore the implications of computational creativity from the perspective of live coding.
    1 Background
    ... regards to this dynamic between the performer and audience, where the level of risk involved in the performance is made explicit. However, in the context of the system proposed in this paper, we are more concerned with the effect that this has on the performer themselves. Any performer at an algorave must be prepared to share their code publicly, which inherently encourages a mindset of collaboration and communal learning with live coders.
    1.2 Collaborative live coding
    Collaborative live coding takes its roots from laptop orchestras/ensembles such as the Princeton Laptop Orchestra (PLOrk), an ensemble of computer-based instruments formed at Princeton University [19].
  • Spaces to Fail In: Negotiating Gender, Community and Technology in Algorave
    Spaces to Fail in: Negotiating Gender, Community and Technology in Algorave Feature Article Joanne Armitage University of Leeds (UK) Abstract Algorave presents itself as a community that is open and accessible to all, yet historically, there has been a lack of diversity on both the stage and dance floor. Through women- only workshops, mentoring and other efforts at widening participation, the number of women performing at algorave events has increased. Grounded in existing research in feminist technology studies, computing education and gender and electronic music, this article unpacks how techno, social and cultural structures have gendered algorave. These ideas will be elucidated through a series of interviews with women participating in the algorave community, to centrally argue that gender significantly impacts an individual’s ability to engage and interact within the algorave community. I will also consider how live coding, as an embodied techno-social form, is represented at events and hypothesise as to how it could grow further as an inclusive and feminist practice. Keywords: gender; algorave; embodiment; performance; electronic music Joanne Armitage lectures in Digital Media at the School of Media and Communication, University of Leeds. Her work covers areas such as physical computing, digital methods and critical computing. Currently, her research focuses on coding practices, gender and embodiment. In 2017 she was awarded the British Science Association’s Daphne Oram award for digital innovation. She is a current recipient of Sound and Music’s Composer-Curator fund. Outside of academia she regularly leads community workshops in physical computing, live coding and experimental music making. Joanne is an internationally recognised live coder and contributes to projects including laptop ensemble, OFFAL and algo-pop duo ALGOBABEZ.
  • International Computer Music Conference (ICMC/SMC)
    Conference Program of the 40th International Computer Music Conference joint with the 11th Sound and Music Computing Conference. Music Technology Meets Philosophy: From digital echos to virtual ethos. ICMC|SMC|2014, 14-20 September 2014, Athens, Greece. Editor: Kostas Moschos. Published by: The National and Kapodistrian University of Athens, Music Department and Department of Informatics & Telecommunications, Panepistimioupolis, Ilissia, GR-15784, Athens, Greece; and The Institute for Research on Music & Acoustics, http://www.iema.gr/, Adrianou 105, GR-10558, Athens, Greece. IEMA ISBN: 978-960-7313-25-6. UOA ISBN: 978-960-466-133-6. September 2014, all copyrights reserved. Contents: sponsors, preface, summer school ...
  • Combining Live Coding and Vjing for Live Visual Performance
    CJing: Combining Live Coding and VJing for Live Visual Performance, by Jack Voldemars Purvis. A thesis submitted to the Victoria University of Wellington in fulfilment of the requirements for the degree of Master of Science in Computer Graphics. Victoria University of Wellington, 2019. Abstract: Live coding focuses on improvising content by coding in textual interfaces, but this reliance on low level text editing impairs usability by not allowing for high level manipulation of content. VJing focuses on remixing existing content with graphical user interfaces and hardware controllers, but this focus on high level manipulation does not allow for fine-grained control where content can be improvised from scratch or manipulated at a low level. This thesis proposes the code jockey practice (CJing), a new hybrid practice that combines aspects of live coding and VJing practice. In CJing, a performer known as a code jockey (CJ) interacts with code, graphical user interfaces and hardware controllers to create or manipulate real-time visuals. CJing harnesses the strengths of live coding and VJing to enable flexible performances where content can be controlled at both low and high levels. Live coding provides fine-grained control where content can be improvised from scratch or manipulated at a low level while VJing provides high level manipulation where content can be organised, remixed and interacted with. To illustrate CJing, this thesis contributes Visor, a new environment for live visual performance that embodies the practice. Visor's design is based on key ideas of CJing and a study of live coders and VJs in practice.
  • Improv: Live Coding for Robot Motion Design
    Improv: Live Coding for Robot Motion Design
    Alexandra Q. Nilles, Chase Gladish and Mattox Beckman (Department of Computer Science) and Amy LaViers (Mechanical Science and Engineering Department), University of Illinois at Urbana-Champaign
    ABSTRACT: Often, people such as educators, artists, and researchers wish to quickly generate robot motion. However, current toolchains for programming robots can be difficult to learn, especially for people without technical training. This paper presents the Improv system, a programming language for high-level description of robot motion with immediate visualization of the resulting motion on a physical or simulated robot. Improv includes a "live coding" wrapper for ROS ("Robot Operating System", an open-source robot software framework which is widely used in academia and industry, and integrated with many commercially available robots). Commands in Improv are compiled to ROS messages. The language is inspired by choreographic techniques, and allows the user to compose and transform movements in space and time. In this paper, we present our work on Improv so far, as well as the design decisions made throughout its creation.
  • AIMC2021 Paper
    Autopia: An AI collaborator for live networked computer music performance
    Norah Lorway (Academy of Music & Theatre Arts, Falmouth University, UK), Edward J. Powley (Games Academy, Falmouth University, UK) and Arthur Wilson (School of Communication, Royal College of Art, UK)
    Abstract. This paper describes an AI system, Autopia, designed to participate in collaborative live coding music performances using the Utopia software tool for SuperCollider. This form of human-AI collaboration allows us to explore the implications of mixed-initiative computational creativity from the perspective of live coding. As well as collaboration with human performers, one of our motivations with Autopia is to explore audience collaboration through a gamified mode of interaction, namely voting through a web-based interface accessed by the audience on their smartphones. The results of this are often emergent, chaotic, and surprising to performers and audience alike.
    1 Introduction
    1.1 Live Coding
    Live coding is the activity of manipulating, interacting and writing parts of a program whilst it runs (Ward et al., 2004). Whilst live coding can be used in a variety of contexts, it is most commonly used to create improvised computer music and visual art. The diversity of musical and artistic output achievable with live coding techniques has seen practitioners perform in many different settings, including jazz bars, festivals and algoraves: events in which performers use algorithms to create both music and visuals in the context of a rave. What began as a niche practice has evolved into an international community of artists, programmers, and researchers.
  • Manifesto for a Musebot Ensemble: a Platform for Live Interactive Performance Between Multiple Autonomous Musical Agents
    Manifesto for a Musebot Ensemble: A platform for live interactive performance between multiple autonomous musical agents
    Oliver Bown (Design Lab, University of Sydney), Benjamin Carey (Creativity and Cognition Studios, University of Technology Sydney, NSW, Australia) and Arne Eigenfeldt (School of Contemporary Arts, Simon Fraser University, Vancouver, BC, Canada)
    Abstract: In this paper we draw on previous research in musical metacreation (MuMe) to propose that novel creative forms are needed to propel innovation in autonomous creative musical agents. We propose the "musebot", and the "musebot ensemble", as one such novel form that we argue will provide new opportunities for artistic practitioners working in the MuMe field to better collaborate, evaluate work, and make meaningful contributions both creatively and technically. We give details of our specification and designs for the musebot ensemble project.
    Keywords: generative music, autonomous agents, performance, computer music, musical metacreation, live algorithms.
    ... and the opportunities it will bring, discuss the challenges and questions faced, and present the design and specification of our multi-agent system, along with a set of tools and example agents that we have created.
    Objectives and 'Genres' of Musical Metacreation
    MuMe straddles and sometimes integrates scientific and artistic objectives. Some MuMe tasks have identifiable measures of success, either because they are fully objective [6], or can be clearly measured by expert users [7]. Others have clear usability goals in contexts where the aim is to support creativity [8]. Yet others face problems of evaluation because of their creatively open-ended nature.
  • Fun and Software
    Fun and Software: Exploring Pleasure, Paradox and Pain in Computing. Edited by Olga Goriunova. Bloomsbury Academic, an imprint of Bloomsbury Publishing Inc, New York and London. First published 2014. © Olga Goriunova and contributors, 2014. All rights reserved; no part of this publication may be reproduced or transmitted in any form without prior permission in writing from the publishers. ISBN HB: 978-1-6235-6094-2; ePub: 978-1-6235-6756-9; ePDF: 978-1-6235-6887-0. Printed and bound in the United States of America.
    Contents: Acknowledgements; Introduction (Olga Goriunova); 1. Technology, Logistics and Logic: Rethinking the Problem of Fun in Software (Andrew Goffey); 2. Bend Sinister: Monstrosity and Normative Effect in Computational Practice (Simon Yuill); 3. Always One Bit More, Computing and the Experience of Ambiguity (Matthew Fuller); 4. Do Algorithms Have Fun? On Completion, Indeterminacy
  • Dancecult 10(1) Programming Technique
    Algorithmic Electronic Dance Music. Guest Editors: Shelly Knotts and Nick Collins. Dancecult: Journal of Electronic Dance Music Culture, Volume 10, Number 1, 2018. ISSN 1947-5403. In memory of Ed Montano, Dancecult Reviews Editor, Production Editor, Production Assistant and Operations Director, 2009-2018. Dancecult is a peer-reviewed, open-access e-journal for the study of electronic dance music culture (EDMC), published yearly at <http://dj.dancecult.net>. Launched in 2009 as a platform for interdisciplinary scholarship on the shifting terrain of EDMCs worldwide, Dancecult houses research exploring the sites, technologies, sounds and cultures of electronic music in historical and contemporary perspectives. Playing host to studies of emergent forms of electronic music production, performance, distribution, and reception, as a portal for cutting-edge research on the relation between bodies, technologies, and cyberspace, as a medium through which the cultural politics of dance is critically investigated, and as a venue for innovative multimedia projects, Dancecult is the leading venue for research on EDMC. Executive Editor: Graham St John (University of Fribourg, CH). From the Floor Editors: Alice O'Grady (University of Leeds, UK) and Dave Payling (Staffordshire University, UK). Reviews Editor: Toby Young (University of Oxford, UK). Cover: The Yorkshire Programming Ensemble (TYPE) live at Open Data Institute in 2017; image shows laptop band members Lucy Cheesman, Ryan Kirkbride, and Laurie
  • Reflections on the Creative and Technological Development of the Audiovisual Duo—The Rebel Scum
    Rogue Two: Reflections on the Creative and Technological Development of the Audiovisual Duo, The Rebel Scum
    Feature Article
    Ryan Ross Smith and Shawn Lawson, Monash University (Australia) / Rensselaer Polytechnic Institute (US)
    Abstract: This paper examines the development of the audiovisual duo Obi-Wan Codenobi and The Wookie (authors Shawn Lawson and Ryan Ross Smith respectively). The authors trace a now four-year trajectory of technological and artistic development, while highlighting the impact that a more recent physical displacement has had on the creative and collaborative aspects of the project. We seek to reflect upon the creative and technological journey through our collaboration, including Lawson's development of The Force, an OpenGL shader-based live-coding environment for generative visuals, while illuminating our experiences with, and takeaways from, live coding in practice and performance, EDM in general and algorave culture specifically.
    Keywords: live coding; collaboration; EDM; audiovisual; Star Wars
    Ryan Ross Smith is a composer, performer and educator based in Melbourne, Australia. Smith has performed and had his music performed in North America, Iceland, Denmark, Australia and the UK, and has presented his work and research at conferences including NIME, ISEA, ICLI, ICLC, SMF and TENOR. Smith is also known for his work with Animated Notation, and his Ph.D. research website is archived at animatednotation.com. He is a Lecturer in composition and creative music technology at Monash University in Melbourne, Australia. Email: ryanrosssmith [@] gmail [.] com. Web: <http://www.ryanrosssmith.com/>
    Shawn Lawson is a visual media artist creating the computational sublime. As Obi-Wan Codenobi, he live-codes real-time computer graphics with his software The Force & The Dark Side.
  • In Kepler's Gardens
    In Kepler's Gardens: A global journey mapping the 'new' world. Ars Electronica 2020, Festival for Art, Technology & Society, September 9-13, 2020, Ars Electronica, Linz. Edited by Gerfried Stocker, Christine Schöpf and Hannes Leopoldseder. Themes: AUTONOMY, UNCERTAINTY, ECOLOGY, DEMOCRACY, TECHNOLOGY, HUMANITY, spanning locations including Helsinki, Jerusalem, Cairo, Bangkok, Esch, Tallinn, Nicosia, Grenoble, Melbourne, Linz, Paris, London, Oslo, Moscow, Auckland, Vilnius, Amsterdam, Tokyo, Johannesburg, Munich, Bucharest, Belgrade, San Sebastian, Riga, Dublin, Daejeon, Utrecht, Vienna, Los Angeles, Ljubljana, Buenos Aires, Barcelona, Montréal, Chicago, Seoul, Prague, Hong Kong, Athens, Bergen, Berlin, Leiden, Milan, New York, Brussels, Taipei and Jakarta. Editing: Veronika Liebl, Anna Grubauer, Maria Koller, Alexander Wöran. Translations (German to English): Douglas Deitemyer, Daniel Benedek. Copyediting: Laura Freeburn, Mónica Belevan. Graphic design and production: main layout by Cornelia Prokop (Lunart Werbeagentur), cover by Gerhard Kirchschläger; typefaces Minion and IBM Plex Sans. Printed by Gutenberg-Werbering Gesellschaft m.b.H., Linz, on PEFC-certified paper from sustainably managed forests and controlled sources (MagnoEN Bulk 1,1 Vol., 115 g/m², 300 g/m²). © 2020 Ars Electronica. © 2020 for the reproduced works by the artists,