Artists’ Article

Motion Tracking of a Fish as a Novel Way to Control Electronic Music Performance

Shaltiel Eloul, Gil Zissu, Yehiel H. Amo and Nori Jacoby

Abstract

The authors have mapped the three-dimensional motion of a fish onto various electronic music performance gestures, including loops, melodies, arpeggio and DJ-like interventions. They combine an element of visualization, using an LED screen installed on the back of an aquarium, to create a link between the fish's motion and the sonified music. This visual addition provides extra information about the fish's role in the music, enabling the perception of versatile and developing auditory structures during the performance that extend beyond the sonification of the momentary motion of objects.

Popular electronic music performances are situations in which highly structured music defines the mood and social interactions among people within the same space. This style of music typically combines accessible musical phrases with variations that enhance the experience for the human ear [1]. Many clubs and electronic venues also create multisensory environments in an attempt to enhance club-goers' experiences. Introducing an interactive musical aquarium is one such enhancement, creating subtle similarities between the movement of fish in an aquarium and the electronic music style.

From auditory to visual domains, fish exhibit a wide variety of swimming patterns and behaviors in response to their environments. Their movements are visually attractive to the human eye and have also proven to be valuable in cognitive studies [2] and psychological therapy [3]. The hydrodynamic shape and body movements of fish (which are related to their size and type) merge into the aquatic locomotion that has captivated human onlookers for thousands of years [2,4]. The earliest recorded reference to fish motion is perhaps that of Aristotle, who, in the 4th century BCE, tried to relate a fish's fins to its straight motion, somewhat anticipating Isaac Newton's identification of the fins as the physical basis of the generation of thrust for locomotion in water in the 17th century [2].

This unique locomotion can be both patterned and random, and therefore our translation of the fish's movement into musical syntax observed the following rules: On the one hand, the fish's repetitive paths could be captured as loops, while on the other hand, occasional random movements could trigger surprising musical responses and effects. An electronic club music genre that has analogous properties is minimal techno [5], characterized by slowly developing tracks that exploit the use of repetition while making generous use of synthesized sounds and prerecorded realistic sounds. This genre is an inspiration for our work on translating the motion of a fish into various electronic music styles as a new way to express the interaction between visual and auditory senses in club and electronic music scenes.

Music computing artists have been increasingly interested in identifying new methods of expressive engagement with interactive music systems [6–8], i.e. systems that express gestures to generate and control musical parameters and signals for virtual instruments [9–13]. Another emerging area of study is the sonification of various natural phenomena by analyzing data or tracking objects. This includes water sonification [14], music based on climate [15], music maps of the routes of stars [16] and sounds based on molecules and biomolecules [17,18]. The aim of our work is to create an expressive interactive performance with a stimulating phenomenon, in this case the sonification of fish motion, by using motion tracking.

Shaltiel Eloul (artist-student), St Cross College, Oxford, St Giles, OX1-3LZ U.K.

Gil Zissu (artist), University of the Arts, London, WC1V 7EY U.K.

Yehiel H. Amo (artist), Beer-Sheva, Israel.

Nori Jacoby (advisor and researcher), Department of Music, Bar Ilan University, Ramat Gan, Israel, 52900, and Edmond and Lily Safra Center for the Brain Sciences, Hebrew University of Jerusalem, Jerusalem.

See the MIT Press journals website for supplemental files associated with this issue.

Tracking visual motion (e.g. dancing, natural phenomena or animals' motion [19]) in real time and the sonification of motion are now relatively manageable tasks owing to the wide availability of high-speed cameras and tracking devices such as the Kinect [20,21]. Yet the translation of these tasks into intriguing onstage musical performances remains challenging, both technologically and artistically [22].


Fig. 1. Sketch of equipment used during performances and exhibitions. All the equipment is portable, and setup requires approximately one hour. (© Shaltiel Eloul)

This difficulty is likely a result of the significant gap between the demonstration of an idea in a research framework ("in the lab") and the creation of a compelling live performance. When performing, the artist must consider many parameters, such as the quality of the music, visual interpretation, the audience's ability to understand the artistic ideas and, above all, the artist's ability to maintain the audience's interest throughout the performance. Therefore, the artist's challenge is to turn a new technological art idea into an enjoyable and pleasant experience for the audience in a live show, exhibition or club performance. Here, we present the idea of creating and playing music generated by the motion of a fish, which we have developed into an onstage performance.

Performance Background

The first major work to address the challenge of translating fish movement into music was produced at the San Francisco Tape Music Center by Ramon Sender in 1962 [23]. In his Tropical Fish Opera, four instrumentalists used a tank to create music scores for an improvisational performance. The players improvised the music by drawing staff lines on the fish's tank and trying to play the positions of the fish as it swam up and down in the tank.

Sonification using optical tracking was proposed by Pendse et al. in 2008 [24] and more recently by Baldan et al. in 2012 [25]; these works describe the idea of multiple fish sonifications created according to the animals' size and location in an aquarium. The studies demonstrate a restricted method of sonification, which is important for the general study of sonification and the implementation of sound from visual events. However, their auditory results did not seem to attempt to stimulate the listener's attunement to various music phrases or to develop music structures over time. Moreover, their works were focused mostly on solving technological issues in sonification, such as motion tracking with multiple objects, rather than directly applying it to the creation of music or to the combination of their technology with time-based performance.

In 2011, Nikolaidis and Weinberg [26] created a general model for dynamic sonification of visual movements into "low-level" elements such as pitch and panning as well as "high-level" elements such as melodic attraction. Still, music that relied solely on this model would be limited to tracked objects over a specific time. To avoid such specific momentary movements, we used a background screen that can interact with the movement of the tracked object, rendering the melody and musical ideas more complex and diverse. For example, "painting" the movement of the fish in real time can provide a visual record of the fish's routes or previous positions, as shown below in the section "Audio and Visual Translation Patches." Then, after this information has been captured on the screen, we can use it to implement large-scale musical ideas (such as creating rhythm, arpeggios or loops) without losing the relation between the music and the fish's motion. Above all, we must also take into account aspects of visual and music perception, aesthetic value and logistical organization, so as to provide a performance that appeals to both musicians and nonmusicians.


Installation

Sketches of the aquarium equipment and photographs from live installations are shown in Figs 1 and 2, respectively. We positioned the fish in an aquarium (Fig. 1a) onstage, as depicted in the upper photo in Fig. 2. To detect the three-dimensional motion of the fish, we placed two FireWire cameras at the top (Fig. 1b) and the right side of the aquarium (Fig. 1c). Three parameters—position, velocity and acceleration—are translated to control musical gestures and triggers. To compute the fish's motion and all music elements, we used two laptop computers (Fig. 1d), a synthesizer (Fig. 1e) and one portable audio card (Fig. 1f). We further combined a visual interpretation with an LED screen (Fig. 1g) positioned on the back of the aquarium. This visual element is controlled by the fish's motion and by music triggers using a third laptop. We found that this added visual interpretation is crucial to the audience's understanding of the interaction between the fish and the music, as we explain below.

Applicative Methodology

The live and time-based art we present addresses a central concern in the engagement of technology with the interactive design of musical and visual experience. The experience of the performance relies on the ability of the musicians to integrate independent fish motion into familiar structures of electronic music. These involve loops, DJ-like machine effects such as panning and filtering, and the creation of live melodic layers over an existing electronic track.

Fig. 2. Photographs from separate installations, 2013. The upper photo was taken during an installation in an auditorium. The lower photo was taken at an installation on the lighting floor above a theater stage. (© Shaltiel Eloul. Photos: Gil Zissu.)

To connect the audience to the performance, we decided to avoid demanding cognitive ideas, which might easily distract listeners from the integration of the visuals and the music. On the other hand, we felt that the musical concept should be continuously developed and varied to maintain the audience's curiosity and the performance structure. Therefore, we attempted to develop the performance slowly, moving from the most transparent interactions between the fish and the music to more complex interactions. As we designed the performance, we examined a number of aspects in order to understand how to maximize the interaction among the listener, the fish's motion and the music.

We swiftly realized that playing music based on more than one fish makes the connection between the sound and the motion of the fish difficult to capture and barely understandable; focused on deciphering one fish's part in the creative process, the listener becomes distracted from the whole experience. We also realized that visual elements behind the aquarium play an important role in allowing the listener to understand the fish's role. Our choices of visual effects are relatively easy to understand and constitute a "bridge" between the fish's motion and the music, making the experience of the performance comprehensible and thus allowing the listener to enjoy the performance. Furthermore, in order for the listener to distinguish between the background music and the fish's role, the musical style needs to be accessible and full of constant, easily recognizable patterns. The background music cannot involve solo lead parts, as those could confuse the listener.

Given the logistical and aesthetic aspects of the performance, we built a portable aquarium with a wheeled stand, allowing for rapid construction and portability between installations (lower photo in Fig. 2). The fish's natural behavior (e.g. swimming speed) and our ability to maintain necessary water conditions in the aquarium (e.g. temperature, oxygen bubbling and filtering) are other essential aspects of the performance. Additionally, for ethical and practical considerations, we had to ensure that the fish's life is pleasant before, during and after the show by allowing it to move in a natural way. Therefore, we designed the aquarium with a separator on the side of the tank that provides a hiding place (Fig. 1h), which is common in many fish tanks. We positioned all the aquarium equipment, including the bubbler, filter and heater, within the hiding place, allowing the fish to live in comfortable conditions. It is important to mention that our fish, a dominant red zebra cichlid, did not show any sign of distress before, during or after the performance.


It ate normally, moved calmly and explored the tank, showing some curiosity by tracking the visualization on the screen at the back of the aquarium. The fish could also move into its hiding place at any time.

Technology and Performance Methodology

We developed a modular program using a Max/MSP/Jitter [27] environment, containing three patches that allow the use of three standard laptops, each equipped with a 2.4 GHz processor. In this way, we smoothed the motion of visual effects and avoided significant digital signal processing (DSP) latency.

The first patch controls image processing for tracking the fish and translating its motion into a position, velocity (speed) and acceleration. The second patch handles music translation and DSP, and the third controls the visualization on the LED screen at the back of the aquarium.

Fig. 3. The "looper" preset: The fish creates a musical rhythm by painting a matrix of 10 sounds (vertical axis) along a bar that consists of eighth, sixteenth and thirty-second notes (horizontal axis). (© Shaltiel Eloul. Photo: Gil Zissu.)

We implemented this tracking process using the computer vision for Jitter library [28]. Two low-latency FireWire cameras are used to capture the fish at a resolution of 320 × 240 pixels, yielding up to 25 frames per second (fps) (Fig. 1b,c). We implement a paradigm for tracking the fish in the performance as follows: During the first step, we use normal image handling to optimize contrast and brightness and thus make the colors more distinctive and compressed. In the second step, we apply an unusual background subtraction, as normal continuous background subtraction [29] is not applicable to this system, given that the fish may move slowly or remain still, leading to an unsatisfactory detection of position. To overcome this issue, we apply a static background subtraction that is renewed on the empty half of the aquarium whenever the fish is swimming in the other half. This tactic helps to keep the subtraction far enough from the fish's position or that of its shadow. Next, blob tracking (which identifies position and size) is introduced from the cv.jit toolbox, which uses a binary image followed by a Kalman filter to track the fish's true position [29]. In the last step, we simply use a low-pass filter of the tracked position while maintaining a practical latency of around 50 milliseconds (ms). The filter takes a series of tracked matrices and cuts off any large distance fluctuations by applying the following formula:

(new position) = (last position) × 0.95 + (new detected position) × 0.05

This low-pass filter is necessary for a stable, smooth estimation of the fish's speed. The fish's coordinates indicate its position at any frame, and its first-order discrete derivative magnitude (scalar speed, which we denote as V) is then evaluated using the following formula:

V = f √(Δx² + Δy² + Δz²)

where f is the frame rate of the input signal and Δx, Δy and Δz are the differences between two subsequent frames in the tracked position of x (horizontal axis), y (depth axis) and z (vertical axis and the height of the aquarium), respectively. In the same manner, the acceleration magnitude (a) describes large changes in the fish's motion, as follows:

a = f |ΔV|

where ΔV is the difference in V between two subsequent frames.

After the tracking paradigm is computed on the first laptop, the data is sent wirelessly via the TCP/IP protocol to the audio translation and visual translation patches on the two other laptops.
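To make the arithmetic concrete, the following is a minimal Python sketch of the smoothing filter and the two discrete derivatives above. It is an illustration only: the actual system is implemented as Max/MSP/Jitter patches, and the function names and sample coordinates here are ours, not the authors'.

```python
import math

FPS = 25.0    # camera frame rate from the article (up to 25 fps)
ALPHA = 0.05  # weight of the newly detected position; 0.95 of the old one is kept

def smooth(last_pos, detected_pos, alpha=ALPHA):
    """Low-pass filter from the article:
    (new position) = (last position) * 0.95 + (new detected position) * 0.05."""
    return tuple(l * (1.0 - alpha) + d * alpha
                 for l, d in zip(last_pos, detected_pos))

def speed(prev_pos, pos, fps=FPS):
    """Scalar speed V = f * sqrt(dx^2 + dy^2 + dz^2) between two frames."""
    return fps * math.sqrt(sum((p - q) ** 2 for p, q in zip(pos, prev_pos)))

def acceleration(prev_v, v, fps=FPS):
    """Acceleration magnitude a = f * |dV|, computed "in the same manner"."""
    return fps * abs(v - prev_v)

# Example with three smoothed (x, y, z) positions, in pixels:
p0, p1, p2 = (100.0, 50.0, 30.0), (104.0, 50.0, 33.0), (112.0, 52.0, 38.0)
v1, v2 = speed(p0, p1), speed(p1, p2)
a2 = acceleration(v1, v2)
print(v1, v2, a2)
```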

Audio and Visual Translation Patches

The fish's position, speed and acceleration parameters are translated to shape music gestures and triggers via four presets—creating loops, music dialogue, DJ gestures and melody composition—that control and influence the music in differing ways to achieve a variety of experiences for the listener.

Creating Loops

This preset translates the fish's movement by painting cells with respect to its x–z pathways in the aquarium. These are visualized on the screen in the background (Fig. 3). Each cell that is painted by the fish triggers a sample in a matrix, as in a standard loop machine. The rows of the matrix describe the sampled sound, which we prepare in advance, and the columns are the metrical position within the bar (1/32, 1/16 or 1/8 of a bar). We use samples from electronic drum kits, glitches and other noisy sounds. Delay and reverb effects are added when the fish arrives at corners. The loop results in an abstract structure when using 32 beats per bar, or in a repetitive rhythm and a danceable loop when using 8 beats per bar (Fig. 3). In addition, we can predetermine the beats per minute (bpm) of the loop by mapping the fish's speed V (from 0 to vmax) onto an experimental range of 120–240 bpm, as sketched below.
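Here is a minimal sketch of how such a looper grid could work, assuming the 10 × 32 matrix of Fig. 3 and the camera resolution given above. The real preset runs as a Max/MSP patch; all names in this sketch are illustrative.

```python
N_ROWS, N_COLS = 10, 32     # 10 sounds x 32 metrical positions per bar (Fig. 3)
TANK_W, TANK_H = 320, 240   # camera resolution along the x and z axes, in pixels

# grid[row][col] is True once the fish has "painted" that cell
grid = [[False] * N_COLS for _ in range(N_ROWS)]

def paint(x, z):
    """Quantize the fish's x-z position into a cell and mark it as painted."""
    col = min(int(x / TANK_W * N_COLS), N_COLS - 1)
    row = min(int(z / TANK_H * N_ROWS), N_ROWS - 1)
    grid[row][col] = True

def samples_at(col):
    """Rows (sample indices) the loop machine triggers at metrical position col."""
    return [row for row in range(N_ROWS) if grid[row][col]]

def bpm_from_speed(v, v_max):
    """Map the fish's speed 0..v_max linearly onto the 120-240 bpm range."""
    return 120.0 + 120.0 * min(v / v_max, 1.0)
```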


Music Dialogue

In this preset, the aquarium is "split" into two sides: left and right (Fig. 4). Each side is programmed for a different musical style and visual atmosphere, and the fish controls the type of music and visual effects by swimming in each section of the aquarium. The visualization of the LED screen uses OpenGL in Jitter visual programming. This preset allows clarity via sharp shifts in the music according to large changes in the fish's position, helping the listener understand the components of the system: the fish's motion, the visualization of the screen and how the two merge into music gestures.

Fig. 4. Music dialogue preset: The fish determines the music's atmosphere. When the fish swims to the black side, the music becomes "noisy," and two vertical white lines "flicker," synchronized with the rhythm. If the fish enters the pink side, the music is "chill" (with a relaxing ambience). (© Shaltiel Eloul. Photo: Gil Zissu.)

DJ Gestures

The x, y and z coordinates of the fish control musical modulation and effects: cutoff filtering, pitchbend, volume and left-to-right panning. The fish alone controls the background music, because its position and speed are mapped to those parameters. Table 1 shows how these effects are mapped according to the motion of the fish in the aquarium; a sketch of this mapping follows below. Given our goal of making gestures that are easy for a listener to understand but do not distort the background music, gesture mapping is determined by trial and error. We combine the background LED screen with visual effects designed in Max/Jitter to enhance the connection between the motion of the fish and the musical gestures. One example of a simple enhancement is shown in Fig. 5, in which a tracking ball behind the fish changes shape, size and vibration speed according to the fish's velocity and to amplitude peaks in the music audio (made using OpenGL in Jitter visual programming).
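As a rough illustration of the mappings just described (the numeric ranges follow Table 1 below; the actual gestures are implemented in Max/MSP, and the dictionary keys here are our own naming):

```python
def lin(value, in_min, in_max, out_min, out_max):
    """Clamped linear mapping of value from [in_min, in_max] to [out_min, out_max]."""
    t = max(0.0, min(1.0, (value - in_min) / (in_max - in_min)))
    return out_min + t * (out_max - out_min)

def dj_gestures(x, y, v, v_max):
    """Map the fish's coordinates and speed onto the master-channel effects."""
    return {
        "pitchbend_semitones": lin(x, 0, 320, -12.0, 12.0),
        "cutoff_hz": lin(y, 0, 240, 500.0, 20_000.0),
        "pan": lin(x, 0, 240, -1.0, 1.0),   # -1 = full left, +1 = full right
        "volume_db": lin(v, 0, v_max, -3.0, 0.0),
    }
```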

Melody Composition

Baldan et al. [25] mapped sound parameters such as pitch, panning and timbre to a large number of fish according to various physical characteristics and positions, resulting in a very rich multiplicity of random sounds. In our work, the fish is in charge of the melody, in a role similar to the player of a synthesizer. The position, speed and acceleration of the fish send MIDI messages to an external synthesizer (a Novation UltraNova). The MIDI messages trigger notes at certain accelerations. Many sound presets can be used with various attack-decay-sustain-release (ADSR) envelopes, such as lead-synth, pads or strings. MIDI messages that are controlled by the fish's position and velocity are sent to the two wheels of the synthesizer: the pitchbend wheel and the modulation effect wheel, as illustrated in Fig. 6.

The preset is designed for the performance and is well suited to function as solo parts above the background music. The fish is restricted to playing specific types of scales (diatonic, pentatonic, etc.), and there are specific rules for each music phrase (listed in Table 2); a sketch of these rules follows below.

Fig. 5. A visual enhancement of the tracked fish on the LED screen. A simple geometrical shape follows the fish (upper photo), vibrates and changes size and colors (lower photo) to enhance the connection between the fish's movements and music events. (© Shaltiel Eloul. Photo: Gil Zissu.)

Table 1. Mapping effects on the master music channel (background music).

Gesture        Parameter            Scale
Pitchbend      x (0–320 pixels)     −12 tones to +12 tones
Cutoff filter  y (0–240 pixels)     500 Hz–20 kHz
Panning        x (0–240 pixels)     100% left → 100% right
Volume         V (speed, 0–vmax)    −3 dB → 0 dB
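The following sketch illustrates how such rules might be expressed in code, assuming a diatonic (major) scale, a note-on acceleration threshold as in the "note on" row of Table 2 and standard MIDI value ranges for the pitchbend and modulation wheels. It is our reading of the mapping, not the authors' patch, and it repeats the small `lin` helper so it is self-contained.

```python
MAJOR = [0, 2, 4, 5, 7, 9, 11]   # diatonic degrees; pentatonic etc. work the same way

def lin(value, in_min, in_max, out_min, out_max):
    t = max(0.0, min(1.0, (value - in_min) / (in_max - in_min)))
    return out_min + t * (out_max - out_min)

def note_from_height(z, z_max=240, base=48, octaves=2, scale=MAJOR):
    """Restrict the melody to a scale: pick a degree from the fish's height z."""
    degrees = octaves * len(scale)
    i = min(int(z / z_max * degrees), degrees - 1)
    return base + 12 * (i // len(scale)) + scale[i % len(scale)]

def melody_events(x, z, v, a, v_max, a_threshold):
    """Continuous wheel messages from position/speed, plus a note whenever the
    acceleration crosses the threshold."""
    events = [("pitchwheel", int(lin(x, 0, 320, -8192, 8191))),
              ("modulation_cc1", int(lin(v, 0, v_max, 0, 127)))]
    if a > a_threshold:
        events.append(("note_on", note_from_height(z), 100))  # velocity 100
    return events
```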


We determined each sound preset using these rules. In this way, the listener can relate the motion of the fish to familiar popular music patterns that reflect the fish's characteristic motion. As an enhancement and development of this preset that we combine in performance, we use the visualization of the LED backscreen to play arpeggios without losing the relation between the melody and the fish's motion. When the fish accelerates or makes a turn, it creates notes that are presented as 3D balls connected by wires (Fig. 7). The notes then make an arpeggio melody, whose tempo is determined by the speed of the fish; a sketch of this tempo mapping follows below. The visualization, created with OpenGL and the open-source Java library TRAER.PHYSICS [30] to make a particle system, applies forces and handles positions in real time. In addition, when the fish bumps into one of the balls, the ball sticks to the fish, and the fish virtually "grabs" and drags the ball to change and develop the arpeggio melody.

Fig. 6. The synthesizer allows the fish to "play" melodies of electronic sounds by sending MIDI messages. The synthesizer is a Novation UltraNova. (© Shaltiel Eloul. Photo: Gil Zissu.)
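A small sketch of the speed-to-tempo mapping for the arpeggio: the 0.125–0.5 s interval range is our assumption, and `get_speed` and `send_note` stand in for the tracking and MIDI layers of the real system.

```python
import time

def arpeggio_interval(v, v_max, slowest=0.5, fastest=0.125):
    """Map the fish's speed onto the time between arpeggio notes (in seconds):
    the faster the fish, the faster the arpeggio."""
    t = max(0.0, min(1.0, v / v_max))
    return slowest + t * (fastest - slowest)

def play_arpeggio(ball_pitches, get_speed, send_note, v_max, cycles=4):
    """Cycle through the current list of ball pitches; the interval is re-read
    from the fish's speed before every note."""
    for _ in range(cycles):
        for pitch in ball_pitches:
            send_note(pitch)  # e.g. a MIDI note-on to the synthesizer
            time.sleep(arpeggio_interval(get_speed(), v_max))
```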

Performance and Music Creation

All the above presets are combined into two distinct performance modes. In the first mode, the fish creates the music ab initio, in layers (e.g. creating a loop, adding a melody and then making gestures on the music that was created). Onstage, we begin the performance with one music element—the "creating loops" preset, which gives a clear visual picture of how the loop is created by the fish. After finding an interesting loop containing the base beat, we add further elements and layers. For example, we may add arpeggios or let the fish play melodies with predefined sounds. When the music becomes rich, we let the fish become the DJ by controlling effects such as filtering, panning, volume and pitchbend. Then we can repeat the set. As audience members grow familiar with the role of the fish, they typically are able to achieve a greater engagement with the musical result. In the second mode, we set the music in the background or play it alongside performers onstage, and the fish affects the background music by controlling music parameters, changing the music atmosphere or adding a melody.

We combined supplementary media, including sample video and sample sounds, which accompany the present article [31], to illustrate the main ideas of the performance and supply additional information regarding experimental performances that examine levels of interaction. Another demo provides an audio file of unedited music created by the fish during the installation. Last, a third audio file presents the result of the fish creating an arpeggio and controlling the parameters of background samples [32].

Fig. 7. An arpeggio visualization on the LED screen at the back of the aquarium. Every ball is mapped to a pitch in the arpeggio. The fish can create a ball or virtually "grab" it to create an interesting development of the melody. The active pitch/ball, at each moment in the arpeggio, is illuminated in white light. The arpeggio's tempo is also affected by the speed of the fish. (© Shaltiel Eloul. Photo: Gil Zissu.)

Table 2. Mapping preset parameters for playing a melody.

Rules      Preset Type                                        Mapping
Scale      Octave, random, diatonic, pentatonic, chromatic    x, y or z
Range      ±1 octave → ±4 octaves                             —
Pitchbend  MIDI command (0–127)                               x, y or z
Note on    MIDI trigger                                       Threshold acceleration: V = 0 → 20% of vmax
Note off   Duration in milliseconds                           0–4,000 milliseconds

Conclusion and Future Work

We have described a music technology project that tracks the motion of a fish to play and control electronic music in live performance. We also propose a new understanding of the utilization of this music technology idea, taking it out of the "lab" and onto the stage.


The complexity of the interaction between the fish and the music is reduced to a set of gestures/presets, extending, in the music performance aspect, earlier versions suggested by previous work [24–26]. In the same manner, we have found that trying to use more than one fish can easily garble the clarity of the interaction we seek. Our solution to this problem involves the introduction of an LED screen at the back of the aquarium, allowing the listener to understand the fish's role in the music while also inviting new ideas for creating music.

To allow more complex gestures and presets using more than one fish, interaction experiments should first be conducted with musicians and nonmusicians in order to examine their perception of the level of interaction between the fish and the music, as well as the experience as a whole. In addition, further music presets can be examined to elevate the performance into higher levels of melody and interaction. For example, we examined a new preset that permits an arpeggio to be played by a combination of more than one fish painted on the back LED screen, such as by making each fish responsible for one note in the arpeggio, with the notes being changed by the positions of the various fish. This technique could permit melodies to develop into more interesting and complex structures without clouding the clarity of the interaction between the music and the fish's motion, thus providing audiences with high levels of music experience and interactivity that are directly influenced by the fish's behavior as an individual or in a group.

Acknowledgments

This project was supported and funded by the music project syndrome.nu. A special thanks to Eran Shlomi, Almog Kalifa and Dima Davidovich, who participated in the installation and design of the portable aquarium and are an integral part of the project. We acknowledge Carmel Raz for her editing, proofreading and professional review of this article. Finally, we thank Liv Fiertag for her proofreading and language advice.

References and Notes

1 A. Bennett, Popular Music and Youth Culture: Music, Identity and Place (New York: Palgrave Macmillan, 2000).

2 K. Stephens, B. Pham and A. Wardhani, "Modelling Fish Behaviour," in Graphite 2003: Proceedings of the First International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia (Melbourne, 2003) pp. 71–78. Aristotle's observations mentioned above appear in his Progression of Animals.

3 S. Barker, K. Rasmussen and Al M. Best, "Effect of Aquariums on Electroconvulsive Therapy Patients," Anthrozoös 16, No. 3, 229–240 (2003).

4 R. Blake, Fish Locomotion (Cambridge, U.K.: Cambridge Univ. Press, 1983).

5 P. Sherburne, "Digital Discipline: Minimalism in House and Techno," in Christopher Cox and Daniel Warner, eds., Audio Culture: Readings in Modern Music (New York: Continuum, 2006) pp. 321–322.

6 A. Camurri, "Interactive Dance/Music Systems," in Proceedings of the 1995 International Computer Music Conference (ICMC-95) (Banff, Canada, and San Francisco: ICMA, 1995) pp. 245–252.

7 G. Castellano et al., "Expressive Control of Music and Visual Media by Full-Body Movement," in Proceedings of the 2007 Conference on New Interfaces for Musical Expression (NIME07) (New York, 2007).

8 E. Edmonds, A. Martin and S. Pauletto, "Audio-Visual Interfaces in Digital Art," in Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (Singapore, 2004).

9 K. Peacock, "Instruments to Perform Color-Music: Two Centuries of Technological Experimentation," Leonardo 21, No. 4, 397–406 (1988).

10 S.R. Wagler, "Sonovision: A Visual Display of Sound," Leonardo 3, No. 4, 443–445 (1970).

11 A. Abbado, "Perceptual Correspondences of Abstract Animation and Synthetic Sound," M.S. thesis, Media Laboratory, Massachusetts Institute of Technology (1988).

12 J. Drew and M. Harrison, "Eye Music: The Graphic Art of New Musical Notation," exh. cat., Serpentine Gallery, London, 26 July–31 August 1986.

13 T. Machover and J. Chung, "Hyperinstruments: Musically Intelligent and Interactive Performance and Creativity Systems," in Proceedings of the 1989 International Computer Music Conference (Columbus, OH, 1989) pp. 186–190.

14 B.L. Sturm, "Pulse of an Ocean: Sonification of Ocean Buoy Data," Leonardo 38, No. 2, 143–149 (2005).

15 M. Quinn, "The Climate Symphony and Other Sonifications of Ice Core, Radar, DNA, Seismic and Solar Wind Data," in Proceedings of the 2001 International Conference on Auditory Display (Espoo, Finland, 29 July–1 August 2001) pp. 56–61.

16 R. Winton et al., "A Sonification of Kepler Space Telescope Star Data," in Proceedings of the 18th International Conference on Auditory Display, Atlanta, GA, 2012, pp. 18–21.

17 T. Delatour, "Molecular Music: The Acoustic Conversion of Molecular Vibrational Spectra," Computer Music Journal 24, No. 3, 48–68 (2000).

18 J. Dunn and M.A. Clark, "Life Music: The Sonification of Proteins," Leonardo 32, No. 1, 25–32 (1999).

19 J.A. Paradiso and F. Sparacino, "Optical Tracking for Music and Dance Performance," in Fourth Conference on Optical 3D Measurement Techniques, ETH, Zurich (1997).

20 G. Odowichuk et al., "Sensor Fusion: Towards a Fully Expressive 3D Music Control Interface," in Communications, Computers and Signal Processing (PacRim), 2011 IEEE Pacific Rim Conference, Victoria, BC, 23–26 August 2011, pp. 836–841.

21 D. Chattopadhyay, "Multimodal Tagging of Human Motion Using Skeletal Tracking with Kinect," M.Sc. thesis, Department of Computer Science, State Univ. of New York at Stony Brook (2011).

22 M.M. Wanderley and M. Battier, "Meaning in Music Gesture: Trends in Gestural Control of Music," in Trends in Gestural Control of Music, eds. Wanderley and Battier (Paris: Centre Pompidou, 2000).

23 D. Bernstein, The San Francisco Tape Music Center: 1960s Counterculture and the Avant-Garde (Berkeley, CA: University of California Press, 2008).

24 A. Pendse, M. Pate and B.N. Walker, "The Accessible Aquarium: Identifying and Evaluating Salient Creature Features for Sonification," in Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, Halifax, NS, 13–15 October 2008, pp. 297–298.


25 S. Baldan, L.A. Ludovico and D.A. Mauro, "Musica sull'acqua: A Motion Tracking Based Sonification of an Aquarium in Real Time," in Proceedings of the 9th Sound and Music Computing Conference, Copenhagen, Denmark, 2012, pp. 69–74.

26 R. Nikolaidis and G. Weinberg, "Generative Musical Tension Modeling and Its Application to Dynamic Sonification," Computer Music Journal 36 (2011) pp. 55–64.

27 More information is available at .

28 cv.jit is an external collection of Max/MSP/Jitter tools of computer vision for Jitter.

29 M. Basavaiah, Optical Flow Based Moving Object Detection and Tracking System: Development of Optical Flow Based Moving Object Detection and Tracking System on an Embedded DSP Processor (Coventry, UK: LAP Lambert Academic Publishing, 2012).

30 For more information on TRAER.PHYSICS, contact .

31 We have uploaded a video to Vimeo to accompany this article: http://vimeo.com/72027888. (The video is also provided as a supplementary file.)

32 The sound examples are provided as supplementary files to this article.

Manuscript received 3 April 2014.
