Beacon: Exploring Physicality in Digital Performance
Anna Weisling, Georgia Institute of Technology, North Ave NW, Atlanta, GA 30332, USA, [email protected]
Anna Xambó, Queen Mary University of London, Mile End Road, London E1 4NS, UK, [email protected]

Abstract
Live performances which involve digital technology often strive toward clear correspondences between distinct media modes, particularly those works which combine audio and video. Often, the process of creating and executing such performances involves mapping schemes which are encased within the digital system, producing content which is tightly synchronized but with relationships which can feel rigid and unexpressive. Within this paper we present a collaborative process between visualist and musician, which builds toward a method for promoting co-creativity in multimedia performance and prioritizes the performer's physical presence and interaction with digital content. Through the development of two autonomous systems, a novel physical interface and an interactive music system, we summarize our creative process of co-exploration of system capabilities and extended periods of experimentation and exploration. From this experience, we offer an early-stage framework for approaching engaging digital audiovisual relationships in live performance settings.

Author Keywords
Digital performance; co-creativity; tangible user interfaces.

CCS Concepts
• Hardware → Tactile and hand-based interfaces; • Applied computing → Sound and music computing

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright held by the owner/author(s).
TEI '18, March 18–21, 2018, Stockholm, Sweden. ACM 978-1-4503-5568-1/18/03. https://doi.org/10.1145/3173225.3173312

Dear College of Design Diversity Council Members,

We, Anna Weisling, Takumi Ogata, and Anna Xambó, have been accepted to present our work at the prestigious international conference New Interfaces for Musical Expression (NIME), to be held in Copenhagen May 15th to 18th (http://www.nime2017.org/). In this letter we provide information about the conference, ourselves, how our visit will have an impact on the recruitment of diverse students, and how we will share the lessons learned.

The Conference: New Interfaces for Musical Expression (NIME)
This internationally renowned conference is a yearly showcase of scientific research on cutting-edge music technology and musical interface design. Submissions are peer reviewed (acceptance rate around 30%), and the publications are compiled in the proceedings, which are freely accessible under a Creative Commons license. In the Google Scholar Music & Musicology section, it is listed as the 7th most relevant publication in the world.

Bios
Takumi Ogata is a musician and instrument design researcher. After obtaining his BS in sound engineering at the University of Michigan, he began attending Georgia Tech as an MS student in the Music Technology program. As an instrument designer, his main interest is incorporating robotics into his designs. His work has been presented at the National Student Electronic Music Event (NSEME) conference, and he won 2nd place in the 2017 Guthman Musical Instrument Competition.
Anna Weisling explores the relationship between sound and image and the performance possibilities shared by both. She has a Master's degree in Sonic Arts from Queen's University Belfast and is currently pursuing a PhD in Digital Media at Georgia Tech.

Anna Xambó is currently a Postdoctoral Fellow at the Center for Music Technology and Digital Media Program (Georgia Tech), as well as a composer, performer, and producer of experimental electronic music. Xambó holds an MSc degree specializing in HCI and music technology from Universitat Pompeu Fabra (Spain) and completed her PhD on collaborative music computing at the Open University (UK).

Summary of the research presented

Beacon by Anna Weisling and Anna Xambó (Performance)
Beacon is an audiovisual performance for electronic musician and live visualist. The audiovisual paradigm presented in the work is decidedly musical, though each agent acts with autonomy, and the resulting dynamics embody a reconsideration of visual music and soundtracks. The visualist and musician are linked through more than their dynamics; the visual system itself is amplified, contributing to the shape and dynamics of the sonic material.

Robotically Augmented Electric Guitar by Takumi Ogata (Demo)
Robotically Augmented Electric Guitar is a novel robotic guitar that establishes shared control between human performers and mechanical actuators. Unlike other mechatronic guitar instruments that perform pre-programmed music automatically, this guitar allows distributed control between the human and robotic components. Furthermore, the rhythmic patterns can be algorithmically or stochastically generated by the hammer, which supports real-time interactive improvisation.
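To make the idea of stochastically generated rhythmic patterns concrete, here is a minimal SuperCollider sketch of real-time random triggering sent to an actuator over OSC. It is an illustration only, not Ogata's implementation: the OSC address (/hammer/strike), port, velocity range, and set of note lengths are assumptions invented for the example.

    // Hypothetical sketch: stochastic rhythm generation for a hammer actuator.
    // Address, port, and message format are assumed, not taken from the instrument.
    (
    ~actuator = NetAddr("127.0.0.1", 57200);        // assumed microcontroller bridge

    Tdef(\hammer, {
        loop {
            var velocity = rrand(40, 110);          // random strike strength
            ~actuator.sendMsg("/hammer/strike", velocity);
            [0.125, 0.25, 0.25, 0.5].choose.wait;   // inter-onset interval drawn at random
        }
    }).play;
    // Tdef(\hammer).stop;  // the performer can stop or retrigger at any time
    )

Because the pattern is generated live rather than pre-programmed, a performer could start, stop, or reshape it mid-performance, which is the kind of shared, improvisatory control the demo description points to.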
Introduction
The physical and the digital share a performance space with increasing frequency. We attach sensors to dancers (e.g., [1]), we mount cameras in the rafters (e.g., [8]). We map our media onto our bodies, our screens, and our speakers; we utilize computer technology to extend technique, build novel relationships with media and technology, and produce richer performances [4]. But the physical-digital contract is too often considered a system with which to send information from one medium or discipline to another, rather than a framework within which we can collaborate in new ways.

Figure 1: Distaff, a visual instrument.

One of the byproducts of this development is that the computer, as it extends our capabilities, at the same time runs the risk of removing expressivity from our hands [6]. Technology is commonly used in ways which allow any spectator to experience a piece as a polished, safe production, with little regard for what expressive or collaborative capabilities might have been sacrificed in the interest of ease or security. Often culminating in one-to-one mapping, this synchronous input-output relationship is digital mickey-mousing, a way for a secondary source of content to be driven by the primary element of performance. The visual content becomes a music video to accompany musicians, or music becomes the soundtrack for a dancer. This approach abandons the rich potential for experimental performances before it even begins. Though this condition can be observed in nearly all facets of performance, audiovisual pieces are a prime example of both the potential for engaging new work and the pitfalls of poor or misguided design.

Beacon
Beacon is an audiovisual performance for electronic musician and live visualist. The audiovisual paradigm presented in the work is decidedly musical, though each agent acts with autonomy, and the resulting dynamics embody a reconsideration of both visual music and soundtracks. The visualist and musician are linked through more than their dynamics; the visual system itself is amplified, contributing to the shape and dynamics of the sonic material.

In Beacon, aural and visual content is considered as expressive material which evolves together over time, shaped by both the digital system's properties and the agency of the human performers. Several guiding principles have been employed in order to explore the possible performance dynamics in this work, including exploring physical and digital system capabilities, identifying and experimenting with audiovisual relationships, developing compositional form, and building toward an audiovisual performance framework. In the following sections, we will discuss these techniques and the resulting consequences for composition, instrument design, and performance.

Beacon has been performed twice to date, in very different spaces. The first performance was tailored to an 8-channel sound system at the 2017 Root Signals Festival (Georgia Southern University, Statesboro, Georgia, United States), and the second was held at the NIME 2017 conference (Stengade, Copenhagen, Denmark).

All of the visual content for the piece is produced live on a specialized visual instrument, the Distaff [14], which requires the performer to focus on gesture, feedback, and physicality within the collaborative space (see Figure 1). With core mechanics pulled from a Technics turntable, the visualist must control the instrument's rate of rotation through power and pressure, allowing the object to effectively "push back" against the fingertips (see Figures 2 and 3). With physiological phenomena such as reverse rotation (the "wagon-wheel effect") and persistence of vision considered, the output of the instrument ranges from fluid textural movements to stuttered, strobing lights, with both ends of the spectrum geared toward immersing the viewer in an unfiltered physical experience of the senses.

Figure 2: Fingertip pressing on the Distaff. Photo: NIME 2017 / Stengade.
Figure 3: Operating the Distaff. Photo: NIME 2017 / Stengade.
Figure 4: An example of MIRLC code, the audio engine.

The audio engine is a prototype built on the audio programming language SuperCollider [9]. An example of code is shown in Figure 4. […] in exploring mutual interaction. This refers to bidirectional control, and therefore the use of feedback is welcome.
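Figure 4 itself is not reproduced in this excerpt. As a stand-in, the following minimal SuperCollider sketch (not the authors' MIRLC-based engine) illustrates the kind of relationship described above, in which the amplified visual instrument contributes to the shape and dynamics of the sonic material: an envelope follower tracks the instrument's amplified signal and uses it to control the level and brightness of a synthesized texture. The input bus, filter settings, and synthesis choices are assumptions made for the example.

    // Minimal illustrative sketch (not the authors' engine), with the server booted (s.boot):
    // the amplified visual instrument, assumed to arrive on the first hardware input,
    // shapes the dynamics and brightness of a simple sustained texture.
    (
    {
        var distaff = SoundIn.ar(0);                     // pickup/mic on the instrument (assumed input 0)
        var level   = Amplitude.kr(distaff, 0.01, 0.3);  // follow its loudness
        var texture = LPF.ar(Saw.ar([55, 55.3]), 400 + (level * 4000));
        texture * level.lag(0.1)                         // louder gestures open up the sound
    }.play;
    )

A bidirectional version, closer to the mutual interaction the authors describe, would also send analysis data back toward the visual system; that return path is beyond this sketch.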