
Experiments with Virtual Reality Instruments


Proceedings of the 2005 International Conference on New Interfaces for Musical Expression (NIME05), Vancouver, BC, Canada

Teemu Mäki-Patola (MSc, researcher), Juha Laitinen and Aki Kanerva (research assistants), Tapio Takala (professor)
Laboratory of Telecommunications Software and Multimedia, Helsinki University of Technology
+358 9 451 5849, +358 50 {353 1243, 544 8673}, +358 9 451 3222
[email protected], {jmlaitin,aki.kanerva}@tml.hut.fi, [email protected]

ABSTRACT

In this paper, we introduce and analyze four gesture-controlled musical instruments. We briefly discuss the test platform designed to allow for rapid experimentation with new interfaces and control mappings. We describe our design experiences and discuss the effects of system features such as latency, resolution and lack of tactile feedback. The instruments use virtual reality hardware and computer vision for user input, and three-dimensional stereo vision as well as simple desktop displays for providing visual feedback. The instrument sounds are synthesized in real time using physical sound modeling.

Keywords
Design, virtual instrument, gesture, widgets, physical sound modeling, control mapping.

1. INTRODUCTION

Physical sound modeling is an active research area. Real-time implementations of these models make it possible to alter any parameter of the model while playing, offering more freedom for lively performances. This creates a need for controllers whose input flexibility matches the control complexity of the sound models. Virtual reality (VR) input technology, such as data gloves and location/orientation trackers with gesture analysis, is one way of offering several natural degrees of freedom. We have created several musical instruments that use this approach. See our project web site for additional information and videos [20].

An article by Paradiso [13] and a book edited by Wanderley and Battier [17] offer a good introduction to existing electronic interfaces and gestural controllers. Many of them have been created during the last few decades, even a few commercial ones [23], [26]. However, there has been little research on virtual reality interfaces for sound control [1], [8], [10]. The interfaces presented in this paper are perhaps more "instrument-like" than most other virtual reality interfaces, which have tended to be interactive sound environments or interactive filters.

There are few quantitative studies that compare sound control interfaces [16], [19]. Also, the importance of parameter mappings has only lately become a topic of active study [3], [4]. On the other hand, there is a lot of literature on how to design musical interfaces [2], [5], [9], [10], [18]. However, more research on the effects of individual interface properties is needed. For instance, virtual reality interfaces are bound to differ from classical instruments, as the medium and its properties are fundamentally different. VR technology introduces some latency, cannot easily simulate tactile feedback, and is limited in both spatial and temporal resolution. For instrument design in VR, it is important to know the effects of these properties. Only then can we find out the strengths and weaknesses of the approach, and give suggestions on the kinds of interfaces it is well suited for.

Our analyses of the presented instruments have two emphases. First, we state our design experiences and research on the above concepts. Second, we use Sergi Jorda's theoretical framework [6] to evaluate the potential and expressiveness of the instruments. The presented instruments utilize two interaction approaches: gestural control, and interaction with virtual objects called widgets. In this context, gestural control means control by body motion without directly interacting with any physical or virtual objects. Virtual widgets are computer graphics objects the user can interact with. For example, the Virtual Xylophone instrument includes mallet and plate widgets.

2. HARDWARE

Most of the instruments were created in a Cave-like virtual room, called EVE, in our laboratory [22]. Visualization of the virtual environment is back-projected onto three walls and the floor of a three-by-three-meter cube-shaped room. The users perceive the visualization three-dimensionally through active stereo shutter glasses. Virtual objects can be created to be perceived at any distance around the user. The rendering is done by an SGI InfiniteReality Onyx2 running the IRIX 6.5 operating system.

User input comes from data gloves (5DT) and a magnetic motion tracker (Ascension Technologies MotionStar). MIDI devices can also be used. The motion tracker has six sensors. It samples the three-dimensional location and orientation of each sensor at a rate of 100 Hz. The spatial resolution is about 1 cm / 2 degrees. Both gloves measure the user's finger flexure and return one integer value for each finger defining how much the finger is bent. The gloves are not able to simulate tactile feedback.

EVE's sound system consists of 15 loudspeakers surrounding the cube-shaped room, behind the screen walls. Vector Base Amplitude Panning [14] is used to make the sounds originate from any desired direction.
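
To illustrate the panning principle, the following minimal sketch computes two-dimensional VBAP gains for a horizontal loudspeaker ring. It is a simplified stand-in, not the EVE implementation: the speaker layout, the use of Python/numpy and all constants are our assumptions, and full VBAP as in [14] also handles three-dimensional speaker triplets.

    import numpy as np

    def vbap_2d(source_deg, speaker_degs):
        """Gains for one sound source on a horizontal loudspeaker ring.

        The source direction p is written as a non-negative combination of
        the two adjacent loudspeaker unit vectors l_i, l_j:
            p = g_i * l_i + g_j * l_j   ->   g = solve([l_i l_j], p)
        The pair for which both gains are non-negative encloses the source.
        Gains are normalized to unit power so loudness stays constant.
        """
        def unit(deg):
            return np.array([np.cos(np.radians(deg)), np.sin(np.radians(deg))])

        p = unit(source_deg)
        order = sorted(range(len(speaker_degs)), key=lambda i: speaker_degs[i])
        for k in range(len(order)):
            i, j = order[k], order[(k + 1) % len(order)]
            base = np.column_stack([unit(speaker_degs[i]), unit(speaker_degs[j])])
            g = np.linalg.solve(base, p)
            if np.all(g >= -1e-9):
                return (i, j), g / np.linalg.norm(g)
        raise ValueError("no enclosing loudspeaker pair found")

    # Hypothetical 8-speaker ring (EVE itself has 15 loudspeakers):
    ring = [0, 45, 90, 135, 180, 225, 270, 315]
    pair, gains = vbap_2d(30.0, ring)
    print(pair, gains)  # -> (0, 1): the 0 and 45 degree speakers share the source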


Two of the instruments were also implemented in a desktop environment. Their user interface is based on a web camera and computer vision technology, and they run on a typical Linux PC.

3. INSTRUMENT INTERFACE SYSTEM

The main goal of the instrument interface system was to make prototyping quick and easy. Any input from any interface can be easily mapped to any parameter(s) of the controlled sound model(s). Next, we describe the main components of the system.

- Input devices. The system can use a variety of input devices, such as data gloves, magnetic trackers, computer vision and MIDI controllers. Input is acquired from the devices at a high and constant rate, allowing for precise rhythmic control of sound.

- Input processing. The user interface components can access all collected input data and process it in any way to extract desirable features, such as gestures, and produce any number of new output values/events. For instance, a xylophone plate component may send its distance from a mallet, its collision speed with the mallet, the location of the collision in 3D space, and so on.

- Control parameter mapping module. The parameters from the interface components control the parameters of the sound models through a mapping module. The mappings can be created using a visual mapping editor application.

- Control language. We developed an expressive control language that contains MIDI as a subset. As physical sound models can be computationally heavy, control was made possible over a network.

- Visual feedback. The performer is surrounded by a 270-degree virtual reality view, able to produce any kind of stereoscopic computer graphics.

- Collision detection and other object interaction. We currently use a simple collision detection library [21], but we are in the process of integrating a more physically realistic one [25] into the system.

- Sound models or MIDI synthesizers. A plug-in based software synthesizer called Mustajuuri [24] is used as the DSP platform. Each sound synthesis model is converted into a Mustajuuri plug-in. As MIDI is a subset of our control language, MIDI synthesizers can be controlled as well. However, in the instruments presented here, we did not use commercial MIDI devices, but concentrated on VR interfaces and physical sound models instead.

3.1 Mapping the Control Parameters

Control parameter mapping means routing the user interface parameters into the control parameters of the sound model(s). Our configurable mapping module allows any number of input parameters to influence any number of output parameters.

Figure 1. Screenshot of a mapping from the mapping editor. The mappings are created as node trees (the Nx+y node in the figure performs a scaling and offsets the input value).

For ease of use, we made a graphical editor for creating and editing the control mappings. Its visual node tree gives the user a clear view of each mapping and of the parameters in it.
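
As a rough sketch of the idea, not the authors' module, a mapping can be represented as a tree of nodes that pull values from their inputs; the Nx+y node of Figure 1 then reduces to a one-line function. All class names and the sample mapping below are hypothetical.

    class Node:
        """One node of a control mapping tree (hypothetical miniature)."""
        def __init__(self, fn, *inputs):
            self.fn, self.inputs = fn, inputs

        def value(self):
            return self.fn(*(node.value() for node in self.inputs))

    class Const(Node):
        """A leaf holding a constant or the latest reading of an input device."""
        def __init__(self, v):
            self.v = v

        def value(self):
            return self.v

    def nx_plus_y(x, n, y):
        """The Nx+y node of Figure 1: scale the input x by n, offset by y."""
        return n * x + y

    # Hypothetical mapping: tracker height (meters) -> synth pitch (Hz).
    tracker_height = Const(1.3)                   # would be updated at 100 Hz
    pitch = Node(nx_plus_y, tracker_height, Const(400.0), Const(110.0))
    print(pitch.value())                          # -> 630.0, sent to the DSP side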
4. ANALYSIS FRAMEWORK

In Jorda's analysis framework [6], efficiency and learning curve are the main concepts. The framework is aimed at analyzing professional instruments and does not directly take into account user satisfaction or the appeal of the experience. Our instruments were mostly showcases inspecting possibilities, designed to offer inspiring experiences. Despite this, we use the framework in our analysis, as it offers one useful perspective. We also share our design experiences in the analysis part.

4.1 Efficiency

The efficiency of a musical instrument measures the relation of musical output complexity to control input complexity. This value is scaled by performer freedom, which represents how much control the performer has over the output. Although a CD player produces great music, its performer freedom is low, as the user can do little to influence the music. The equation parameters are not easily quantifiable, and we will only attempt to estimate them in our analysis.

    instEfficiency = (outputComplexity * performerFreedom) / controlInputComplexity    (1) [6]

4.2 Learning curve

The learning curve represents the behavior of efficiency as a function of practice time. Simple instruments, such as a Tibetan singing bowl, will likely not allow one to find new levels of expression after years of playing in the way a piano or a violin will. A good balance between challenge, boredom and frustration is sought [6]. Classical, already established instruments are accepted to have a long learning curve. People are also accustomed to noticing subtleties in their playing. However, new instruments are likely to require a shorter learning curve in order to become accepted.

5. INSTRUMENTS

We will now describe four instruments created with the system presented earlier. Each description is followed by design experiences and a short analysis of the instrument in question.

5.1 Virtual Xylophone

The Virtual Xylophone interface consists of a user-definable number of virtual xylophone plates and two virtual mallets. In their hands, the performer holds two magnetic sensors, from which the VR mallet visualizations extend. The mallets inherit the location and orientation of the sensors and cast shadows on the plates. Collisions are detected between the virtual mallets and the xylophone plates, controlling the corresponding DSP objects accordingly. Hit velocity is mapped to the amplitude of an impulse sent to the sound model. The hit location on each plate affects the decay time of the sound. The instrument allows a polyphony of several sound models.

On the left side of the performer, the interface displays a large piano keyboard. The performer can grab keys with their left data glove.

This creates a new xylophone plate of the chosen note, attached to the user's hand. The note name, such as C5, is also displayed above the plate. The performer can then move the plate to any place and orientation in 3D space. Grabbing and moving existing plates is also possible. The arrangement can be saved and loaded. Figure 2 shows a performer playing a complex layout.

The possibility of moving plates and creating new ones allows the user interface to be customized even for the needs of individual pieces of music. Chords can be constructed by piling plates on top of each other. Short passages can be made by stacking plates further apart, played with one swift motion that goes through them all. Thus, difficult passages can be made easier to play.

Figure 2. A performer playing the Virtual Xylophone. By moving and creating new plates, the performer has created a custom interface for playing a particular musical piece.
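
The collision-to-sound mapping described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the plate representation, thresholds and scaling constants are all assumed, and the real system uses a collision detection library [21] rather than this simple axis-aligned test.

    import numpy as np

    class Plate:
        """A virtual xylophone plate: a horizontal rectangle in space."""
        def __init__(self, center, size, note):
            self.center = np.asarray(center, float)   # x, y, z of plate center
            self.size = np.asarray(size, float)       # width, depth
            self.note = note

    def detect_hit(plate, tip_prev, tip_now, dt):
        """Map a mallet-plate collision to sound control events.

        Implements the mapping described above: hit velocity -> impulse
        amplitude, hit location on the plate -> decay time.  Returns None
        when the mallet tip did not cross the plate surface from above.
        """
        tip_prev = np.asarray(tip_prev, float)
        tip_now = np.asarray(tip_now, float)
        crossed = tip_prev[2] > plate.center[2] >= tip_now[2]
        over = np.all(np.abs(tip_now[:2] - plate.center[:2]) <= plate.size / 2)
        if not (crossed and over):
            return None
        velocity = (tip_prev[2] - tip_now[2]) / dt            # m/s downward
        amplitude = min(1.0, velocity / 5.0)                  # scaling assumed
        # Normalized distance from plate center: 0 (center) .. 1 (edge).
        offset = np.max(np.abs(tip_now[:2] - plate.center[:2]) / (plate.size / 2))
        decay_s = 1.5 * (1.0 - 0.5 * offset)                  # edge hits die faster
        return plate.note, amplitude, decay_s

    c5 = Plate(center=[0.0, 0.0, 1.0], size=[0.30, 0.12], note="C5")
    print(detect_hit(c5, tip_prev=[0.02, 0.0, 1.04], tip_now=[0.02, 0.0, 0.98], dt=0.01))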

5.1.1 Analyses

Because both the mallets and the plates exist only virtually, the mallets can pass through the plates without resistance or recoil. While at first this seemed to bother some users, it elicited a new playing style, where plates could be struck from both sides by moving the hands on smooth, elliptical paths. This is markedly different from traditional xylophone playing, and allows for playing with continuous motion.

Due to the tracker's limited spatial resolution, the motion paths in playing the Virtual Xylophone are larger than with a normal xylophone. There is a latency of approximately 60 ms between striking and hearing the sound. Yet, without tactile feedback, the sound and the visualization, both delayed, are the only cues of the exact collision time. Thus, the latency is not so noticeable.

The Virtual Xylophone is effectively a different instrument than the traditional xylophone. Plates can fill the entire space around the user, and are relatively large. Moving between them is not as fast as with a traditional xylophone. The sound is more flexible, and the configurable user interface offers possibilities for making playing easier and for making effects, such as patterns and chords. The instrument is intuitive to anyone familiar with the traditional xylophone.

Based on initial feedback from early user tests, the configurable interface considerably increased the appeal of the instrument. Making chords and short sequences was seen to be very rewarding. The people testing it were excited and inspired.

The efficiency of the Virtual Xylophone can be altered by creating different layouts of plates. The performer has real-time control over three parameters: amplitude, note and decay time. All are controlled by varying the style of striking. The performer can also create chords and sequences and improve the interface by placing the plates optimally for each musical piece. This increases the efficiency considerably. Yet it also interferes with learning, as each interface is different and usable mostly for a particular piece. We do not yet know if some layout would be a good general solution. Possibly some core of the layout could stay the same.

The learning curve of the instrument is gentle in the beginning. The concepts of playing and modifying the interface are easy to grasp. The instrument is likely to stay interesting for a long time. Unlike with the traditional xylophone, this does not come so much from acquiring kinaesthetic mastery, but from discovering the possibilities of the interface, and from better understanding how individual interfaces should be designed.

5.2 Gestural FM Synthesizer

The Gestural FM Synthesizer is an evolution of the somewhat famous Theremin [26]. The original Theremin, invented in 1919, is considered to be the first gesture-controlled instrument. It is played by moving one's hands in the air near two antennae. The right hand controls the pitch of the instrument's simple sine wave oscillator. Moving the hand closer to the antenna increases the pitch, and moving it away decreases it. Amplitude is controlled by moving the left hand relative to a loop-shaped antenna located on the left side of the instrument.

Playing the original Theremin is difficult, and requires perfect pitch hearing, because the only feedback is aural. Finding notes on the instrument requires recognizing them through hearing. The basic sine wave sound also limits expressiveness.

We created the Gestural FM Synthesizer in the spirit of the original Theremin. For a more interesting sound, we replaced the sine wave with a preconfigured FM synthesizer. The sound was set to simulate brass with infinite sustain for a continuous sound. The instrument is played with data gloves. The pitch of the sound is controlled by moving the right hand up and down, and the amplitude by opening and closing the fingers of the right hand, as if the performer were letting the sound out of his hand. When the left hand is closed, its relative position alters the timbre through slight changes in the modulation indexes. When the left hand is open, the modulation indexes remain unchanged.

The instrument also offers a visualization of a musical scale as a vertical piano keyboard. A thin line is projected from the performer's hand to the current pitch on the keyboard. The pitch is continuous; the keyboard is only a visual aid.
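
The control mapping can be illustrated with a minimal two-operator FM sketch. The actual brass patch, its carrier/modulator ratio and the hand-to-pitch curve are not specified at this level of detail in the paper, so the constants below are assumptions.

    import numpy as np

    SAMPLE_RATE = 44100

    def fm_block(pitch_hz, amp, mod_index, ratio=1.0, dur=0.05):
        """One audio block of simple two-operator FM synthesis:
            y(t) = amp * sin(2*pi*fc*t + I*sin(2*pi*fm*t)),  fm = ratio * fc
        A stand-in for the preconfigured brass-like patch; the real patch
        settings (operators, ratio, envelopes) are not given in the paper.
        """
        t = np.arange(int(SAMPLE_RATE * dur)) / SAMPLE_RATE
        fm = ratio * pitch_hz
        return amp * np.sin(2 * np.pi * pitch_hz * t
                            + mod_index * np.sin(2 * np.pi * fm * t))

    # Hypothetical glove/tracker readings mapped to control parameters:
    right_hand_y = 1.4      # m; height -> pitch (exponential map assumed)
    right_fingers = 0.8     # 0 = fist .. 1 = open; opening lets the sound out
    left_fingers_closed = True
    left_hand_x = 0.25      # m; alters timbre only while the left hand is closed

    pitch_hz = 110.0 * 2.0 ** (2.0 * right_hand_y)
    index = 2.0 + 4.0 * left_hand_x if left_fingers_closed else 2.0
    block = fm_block(pitch_hz, amp=right_fingers, mod_index=index)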
5.2.1 Analyses

As a result of the visual feedback, users found it much easier to find particular notes on the Gestural FM Synthesizer, compared to the original Theremin. We suspect that the visual feedback is of considerable help in early rehearsal of the instrument. Its importance is likely to lessen as the performer learns the instrument better and comes to rely on their inner representation.

With continuous sound instruments, system latency is less noticeable than with percussion instruments. We conducted a user test of the just noticeable latency on a normal Theremin [12]. The first latency that the subjects statistically noticed was 30 ms. However, they were not certain of their answers until latencies reached 60 to 70 ms. We also tested how latency affects playing accuracy and the time to reach desired notes on a Theremin and on a Virtual Reality Theremin [11].

The study suggested that latencies up to 60 ms do not impair playing these instruments. Mostly, this is because the instruments do not offer tactile feedback. As a function of latency, the time to reach notes increased roughly five times as much as the introduced latency. In light of these tests, virtual reality interfaces seem feasible for controlling continuous sound, despite the latency. With better hardware the latency can also be reduced considerably.

As mentioned, the traditional Theremin is a difficult instrument. The few virtuoso performers have years of intensive rehearsal behind them. The learning curve of the Gestural FM Synthesizer is likely to be steeper in the beginning, as the visual feedback supports learning. Seeing where the absolute notes are makes it possible to play the instrument without perfect pitch hearing.

Vibrato is easy to produce on the traditional Theremin by waving the fingers of the pitch hand. However, because of the low resolution of our tracker, the required motion is slightly larger on our instrument. As a result, it is more tiring to produce vibrato. Thus, we added the option of automatic halftone vibrato, switched on when the performer moves the right hand away from their body. After a threshold, the distance alters the vibrato speed between 4 Hz and 8 Hz. This option can be switched off at the performer's preference. The pitch scale can be modified but, because of the limited resolution, should not be too small.
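
A sketch of this automatic vibrato mapping: past a threshold, hand distance sets the vibrato rate between 4 Hz and 8 Hz, with a depth of a halftone. The 4-8 Hz range and halftone depth come from the text; the threshold, the distance span and the exact depth convention are our assumptions.

    import numpy as np

    def auto_vibrato(base_hz, hand_dist_m, t):
        """Automatic halftone vibrato, as described above: switched on when
        the right hand passes a distance threshold from the body, with the
        distance mapped to a 4..8 Hz vibrato rate.
        """
        THRESHOLD_M, SPAN_M = 0.4, 0.3           # meters (assumed values)
        if hand_dist_m <= THRESHOLD_M:
            return base_hz                       # vibrato off near the body
        x = min(1.0, (hand_dist_m - THRESHOLD_M) / SPAN_M)
        rate_hz = 4.0 + 4.0 * x                  # 4 Hz .. 8 Hz, as in the paper
        semitones = 0.5 * np.sin(2 * np.pi * rate_hz * t)   # +/- half a semitone
        return base_hz * 2.0 ** (semitones / 12.0)

    # A 440 Hz tone wobbles within a halftone, faster as the hand reaches out:
    for t in np.arange(0.0, 0.5, 0.1):
        print(round(auto_vibrato(440.0, hand_dist_m=0.55, t=t), 1))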
The performer controls four parameters with the hand motions. Pitch and amplitude are controlled more than the modulation indexes, which alter the timbre of the sound. However, all of them can be controlled in real time. Total mastery of controlling all parameters fluently and meaningfully is likely to require a lot of practice. Because of the extra sound parameters, the instrument efficiency is likely to be higher than that of a traditional Theremin.

5.3 Virtual Membrane

The Virtual Membrane is an interface built around a sound model of a rectangular membrane [15], supported at the edges. Its physical properties, such as dimensions, tension, damping and a few other material properties, can be modified in real time. It is visualized as a simple, textured rectangular plate matching the physical dimension parameters of the sound model.

The user interacts with the membrane with two mallets, one in each hand. The hit location is mapped to the excitation point of the sound model. Hitting different locations produces a different timbre. Hit velocity is mapped to the amplitude of the impulse given to the excitation location. Plate dimensions, tension and the speed of sound in the material are controlled by virtual slider widgets. These are mapped directly to the parameters of the sound model. The sliders can be moved by touching them with a virtual mallet held in the performer's left hand. It is possible to quickly "scratch" the sliders even while the sound is playing after a hit.

The membrane allows the user to experiment with a highly flexible sound model. Its sound is realistic, and the material options range from leather to wood, to metal and beyond. The sound propagation can also be visualized as an animated wave grid on the virtual plate (see Figure 3). This visualizes how the impulse propagates on the surface of the membrane.

Figure 3. Playing the Virtual Membrane with a waveform visualization, and playing the exhibition version.

5.3.1 Analyses

Because of the flexible and realistic sound model, the instrument attracts one's interest. Test users remained fascinated with the instrument for a long time. Also, hitting a leather membrane a hundred square feet in size produces a magnificent, thunder-like sound. The user really feels that he is making something happen.

The ability to alter physical parameters while the plate is vibrating opens up interesting possibilities that are not readily available in the real world. The material can be continuously changed while waves are travelling on it. For instance, tension can be low as the user hits the membrane, resulting in a sound of so low a frequency that it is not audible. Increasing the tension after the hit makes the waves faster and creates powerful, rumbling sounds. Thus, the attack can be omitted, and only the decay used for special effects.

Another version of the membrane, based on a web camera and displayed at a science exhibition, uses one slider for interpolating between predefined material sets. The interpolation is also visualized by morphing the drum plate material on screen. Another slider is used for modifying decay length (material friction), and a third one for modifying the size of the plate. Each slider can be "scratched" while playing.

In the VR room version, the performer has simultaneous real-time control over four parameters: impulse amplitude, two-dimensional hit location, and one of four sound model parameters in the form of sliders. Additionally, the other three parameters can be controlled by focusing on a different slider.

It is not clear what a performer could learn to do when practicing the instrument for a long time. In addition to using it as a drum with a large sound scale, it offers possibilities for interesting and uncommon sound effects. Currently it could be used, for example, as an atmospheric sound generation tool for movies.

Tactile feedback could be added to the interface by using a MIDI drum set for inputting amplitude and hit location. The MIDI drum could then be expanded with gestural control of the material parameters. Tactile feedback would allow for faster and more accurate playing. Material parameters could be controlled with the feet, similar to a kettledrum, or with body location. This would free both hands for drumming. With a MIDI interface, the efficiency of the instrument would be that of a normal drum expanded with larger control complexity and performer freedom in the form of additional control of the sound.
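
The actual sound model is based on transfer function methods [15]. To show, in spirit, how an excitation point, damping and a mid-sound tension change interact, here is a toy finite-difference membrane instead, an explicitly different and much cruder scheme; the grid size and all constants are arbitrary.

    import numpy as np

    class ToyMembrane:
        """Clamped rectangular membrane on a finite-difference grid.

        The Virtual Membrane's model uses transfer function methods [15];
        this deliberately simpler scheme only illustrates exciting a point
        and changing the "tension" while waves are travelling.
        """
        def __init__(self, nx=40, ny=30, c=0.40, damping=0.001):
            self.u = np.zeros((ny, nx))          # displacement, current step
            self.u_prev = np.zeros((ny, nx))     # displacement, previous step
            self.c = c                           # wave speed; c <= 0.5 is stable
            self.damping = damping               # "material friction"

        def excite(self, ix, iy, amp):
            self.u[iy, ix] += amp                # mallet hit at a grid point

        def step(self):
            lap = (np.roll(self.u, 1, 0) + np.roll(self.u, -1, 0) +
                   np.roll(self.u, 1, 1) + np.roll(self.u, -1, 1) - 4 * self.u)
            nxt = (2 * self.u - self.u_prev + self.c ** 2 * lap) * (1 - self.damping)
            nxt[0, :] = nxt[-1, :] = nxt[:, 0] = nxt[:, -1] = 0.0  # clamped edges
            self.u_prev, self.u = self.u, nxt
            return self.u

    m = ToyMembrane()
    m.excite(10, 12, amp=1.0)
    samples = []
    for i in range(400):
        if i == 200:
            m.c = 0.49          # raise the "tension" while waves are travelling
        samples.append(m.step()[15, 20])        # pickup point -> audio samples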


5.4 Virtual Air Guitar

Playing air guitar means imitating rock guitar gestures along with music, without the actual physical instrument. It is more showmanship than musical performance and does not require real musical skills. Nevertheless, the musical component is always present. The goal of our Virtual Air Guitar (VAG) project was to create a more interactive version of the experience, something that the users could control instead of just acting along with. The Virtual Air Guitar [7] is actually playable. It has an electric guitar sound and is controlled with guitar-playing gestures performed in the air.

We have made two versions of the Virtual Air Guitar. One is implemented in the virtual room, and the other on a generic desktop Linux PC with a web camera interface. Both versions use the distance between the user's hands to determine pitch, and plucking is done by moving the right hand in a strumming motion. The pitch scale can be either fret-based or continuous. Slides are made by moving the left hand along the imaginary guitar neck, and vibrato is produced by shaking the left hand.

The instrument uses an extended Karplus-Strong sound model, which is tuned to match a Stratocaster guitar. The sound goes through a simulated tube amplifier and effects chain to produce the distorted electric guitar sound.

In order to achieve a realistic result that sounds like real guitar playing, we implemented a guitar control language and created a software component that is controlled with this language. The component controls the six strings of the sound model, and keeps track of the state of each string and the performer. The controller understands elementary playing techniques such as hammer-ons, pull-offs, vibrato, fret-based slides and mutes. The component then modifies the parameters of the sound model accordingly.

The Virtual Air Guitar interface offers different play modes, such as free play, pentatonic scale play, strumming and rock solo modes. The desktop version became a major attraction in Finland's largest science exhibition. For more information on the Virtual Air Guitar, see our comprehensive article about it [7].

Figure 4. Left: rocking on with the VR version of the Virtual Air Guitar. Right: computer vision view of the desktop version. A web camera interface tracks the yellow gloves and uses gesture recognition to detect plucks, vibrato, slides and mutes.
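
The string synthesis core can be illustrated with the classic Karplus-Strong loop. The sketch below is only the textbook noise-burst-plus-averaging algorithm with a tanh stand-in for the amplifier chain; the actual VAG model is an extended, Stratocaster-tuned version with a full tube and effects simulation [7], and the hand-distance scale mapping shown is hypothetical.

    import numpy as np

    SAMPLE_RATE = 44100

    def pluck(freq_hz, dur_s=1.0, damp=0.996):
        """Classic Karplus-Strong pluck: a noise burst circulating through
        a delay line with a two-point average that slowly damps it out.
        """
        n = max(2, int(SAMPLE_RATE / freq_hz))   # delay length sets the pitch
        buf = np.random.uniform(-1.0, 1.0, n)    # the noise burst is the pluck
        out = np.empty(int(SAMPLE_RATE * dur_s))
        for i in range(len(out)):
            out[i] = buf[i % n]
            buf[i % n] = damp * 0.5 * (buf[i % n] + buf[(i + 1) % n])
        return out

    def drive(x, gain=8.0):
        """Crude stand-in for the simulated tube amplifier and effects chain."""
        return np.tanh(gain * x)

    # Hand distance picks a note from a pentatonic minor scale (mapping assumed):
    A_MINOR_PENTATONIC = [110.0, 130.8, 146.8, 164.8, 196.0, 220.0]   # Hz
    hand_distance_m = 0.55
    idx = min(len(A_MINOR_PENTATONIC) - 1, int(hand_distance_m / 0.12))
    audio = drive(pluck(A_MINOR_PENTATONIC[idx]))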
5.4.1 Analyses

From the selection of instruments we have created, the VAG has proven to be the most popular. It is thrilling for people to produce guitar music out of thin air. Also, the electric guitar is a "cool" performance instrument. Who hasn't dreamed of being a rock star, even for a short while?

The VAG is made so that even an unskilled performer can map his motion intensity into relatively good-sounding rock guitar playing. A built-in scale quantization makes sure that only notes or chords that fit well together are produced.

Naturally, the quantization gets in the way of playing actual songs. However, without quantization, finding specific notes is difficult, because there is only air between the hands. The right hand's motion also causes the distance to fluctuate. Thus, we cannot use too accurate a scale, even though the location of the right hand is lowpass-filtered. On the exhibition version, the two play modes include the four chords of the intro to "Smoke on the Water", and a freer solo mode on a pentatonic minor scale.

The performer controls pitch and volume. In most of the modes this control is limited to a predefined scale. Thus, the actual efficiency of the instrument is not near that of a normal guitar. However, the VAG is an entertainment device rather than a professional instrument, and as such it functions well. We have witnessed users of all ages play the instrument. They have been curious and enthusiastic about it, and have almost always walked away with a smile on their face.

As the VAG was designed to be playable by a beginner, its learning curve is likely to flatten already after a short practice. However, as it supports the playing, even a beginner can have lots of fun with it and experience what it is like to play a rock guitar.

6. DISCUSSION

Virtual reality is a different medium compared to our physical world. Replicating interfaces of traditional instruments in virtual reality may not bring about useful results unless we are extending existing instruments with additional control. Familiarity with a real-world counterpart helps to grasp the concept and supports playing in the beginning. However, it may not be easy to find new ways of playing, which is needed as the instruments are different. Users of traditional instruments may not even be interested in using virtual replicas, as the original ones work fine. Instead, we should discover the kinds of interfaces that are best suited for the VR medium. For instance, the presented Gestural FM Synthesizer extends the expressiveness of the Theremin, and is not hampered by latency or lack of tactile feedback. For making instruments in a new medium, the properties of the medium and their musical effects should be well understood in order to work around the limitations and utilize the strengths.

The efficiency of our instruments is reduced by low spatial and temporal resolution, as well as by latency. Low resolution requires larger motion paths. Together with latency, it makes fast and accurate playing difficult. However, these properties could be improved with existing technology. Lifting the limitations would also allow for faster instruments with vast possibilities.

One of the most promising features of virtual reality interfaces is the potential for visualization. Unlike in physical instruments, where visual feedback can only directly correspond to the physics of sound production, such as vibrating strings, virtual instruments allow for intelligent visual feedback. A physical instrument's appearance is static, but a virtual instrument can modify its visual features in real time. For example, the Gestural FM Synthesizer includes a visualization that helps users reach correct pitches.

In addition to providing visual cues, visualization could be used to teach the user how to play an instrument. For example, the FM Synthesizer could include a scrolling musical score that marks the correct position of the hand in the air.

In addition to teaching music, features of the instrument sound can also be visualized. For example, a virtual widget could change color and texture according to the properties of the sound it is linked to while playing. Visualizing sound has been studied already, but there is little research available on visualizing the interpreted concepts that humans make of sounds. For example, the term "brightness" is subjective.

The meanings of these terms, and how to visualize these concepts, could offer interesting knowledge for computer instrument design.

In addition to adding to the usability of an instrument, visualization can also affect the entire performance. Playing a musical instrument is not an isolated act, but is most often placed in the context of a concert or other performance. The performer is not only producing music, but creating a multimedia show for the audience. By visualizing sounds in an aesthetically appealing way, artists are given more possibilities for engrossing the audience. At the time of writing, music visualization based on audio analysis is gaining increased attention, especially in the music style of electronica. These visualizations are most often abstract patterns that change color and shape to the rhythm and spectral content of the music. By linking the visualization to the instruments that produce the sound, it can be made much more detailed, reacting to changes in individual sounds. Basically, this generalizes the concept of mapping to cover the routing of control parameters to the visualization as well as to the sound model.

7. CONCLUSIONS

In this paper, we have presented a virtual reality software system designed for making musical instruments. Four instruments created with the system were presented. The instruments were analysed against a framework that considers the learning curve and efficiency of an instrument. Each instrument's analysis was also accompanied by our design experiences. The effects of several interface properties were discussed in relation to the design of musical instruments using novel input and output hardware.

8. ACKNOWLEDGMENTS

This research was supported by the Pythagoras Graduate School, funded by the Finnish Ministry of Education and the Academy of Finland, and by an EU IST program (IST-2001-33059) [20].

9. REFERENCES

[1] Choi, I. A Manifold Interface for Kinesthetic Notation in High-Dimensional Systems. In Trends in Gestural Control of Music, Battier and Wanderley, eds., IRCAM, Centre Georges Pompidou, Paris, 2000.
[2] Cook, P. Principles for Designing Computer Music Controllers. NIME Workshop - CHI, 2001.
[3] Hunt, A., Wanderley, M., Paradis, M. The Importance of Parameter Mapping in Electronic Instrument Design. Proceedings of the Conference on New Interfaces for Musical Expression (NIME), 2002.
[4] Hunt, A., Wanderley, M., Kirk, R. Towards a Model for Instrumental Mapping in Expert Musical Interaction. Proc. of the International Computer Music Conference, 2000.
[5] Hunt, A. Radical User Interfaces for Real-time Musical Control. PhD Thesis, University of York, UK.
[6] Jordà, S. Digital Instruments and Players: Part I - Efficiency and Apprenticeship. Proceedings of the Conference on New Interfaces for Musical Expression (NIME04), Hamamatsu, Japan, 2004.
[7] Karjalainen, M., Mäki-Patola, T., Kanerva, A., Huovilainen, A. and Jänis, P. Virtual Air Guitar. Proc. AES 117th Convention, San Francisco, CA, October 28-31, 2004.
[8] Lanier, J. Virtual Reality and Music. http://www.advanced.org/jaron/vr.html (visited 20.1.2005)
[9] Machover, T. Instruments, Interactivity, and Inevitability. Proceedings of the NIME International Conference, 2002.
[10] Mulder, A. Design of Virtual Three-Dimensional Instruments for Sound Control. PhD Thesis, Simon Fraser University, 1998.
[11] Mäki-Patola, T. and Hämäläinen, P. Effect of Latency on Playing Accuracy of Two Continuous Sound Instruments Without Tactile Feedback. Proc. Int. Conference on Digital Audio Effects (DAFx'04), Naples, Italy, 2004.
[12] Mäki-Patola, T. and Hämäläinen, P. Latency Tolerance for Gesture Controlled Continuous Sound Instrument Without Tactile Feedback. Proc. International Computer Music Conference (ICMC), Miami, USA, November 1-5, 2004.
[13] Paradiso, J. Electronic Music Interfaces: New Ways to Play. IEEE Spectrum, 34(12), 18-30, 1997. Later expanded as an online article, 1998: http://web.media.mit.edu/~joep/SpectrumWeb/SpectrumX.html (visited 20.1.2005)
[14] Pulkki, V. Spatial Sound Generation and Perception by Amplitude Panning Techniques. PhD Thesis, Helsinki University of Technology, Espoo, Finland, 2001.
[15] Trautmann, L., Petrausch, S., Rabenstein, R. Physical Modeling of Drums by Transfer Function Methods. Proc. Int. Conf. on Acoustics, Speech & Signal Processing (ICASSP), Salt Lake City, Utah, May 2001.
[16] Vertegaal, R., Eaglestone, B. Comparison of Input Devices in an ISEE Direct Timbre Manipulation Task. Interacting with Computers 8, 1, pp. 113-130, 1996.
[17] Wanderley, M., Battier, M., eds. Trends in Gestural Control of Music. IRCAM - Centre Pompidou, 2000.
[18] Wanderley, M. Performer-Instrument Interaction: Applications to Gestural Control of Music. PhD Thesis, Univ. Pierre et Marie Curie - Paris VI, Paris, France, 2001.
[19] Wanderley, M., Orio, N. Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI. Computer Music Journal, 26:3, pp. 62-76, Fall 2002.
[20] ALMA project instruments page (visited 13.4.2005): http://www.tml.hut.fi/~tmakipat/alma/almawebisivu/HUTTMLIndex.html
[21] ColDet library website (visited 20.1.2005): http://photoneffect.com/coldet/
[22] EVE home page (visited 30.1.2005): http://eve.hut.fi/
[23] I-Cube website (visited 18.1.2005): http://infusionsystems.com/catalog/index.php
[24] Mustajuuri DSP software webpage (visited 19.1.2005): http://www.tml.hut.fi/~tilmonen/mustajuuri/
[25] Open Dynamics Engine (visited 20.1.2005): http://ode.org/
[26] Theremin info pages (visited 19.1.2005): http://www.theremin.info/
[27] The Yamaha Miburi System (visited 20.1.2005): http://www.spectrum.ieee.org/select/1297/miburi.html
