Applications of Feedback Control to Musical Instrument Design
APPLICATIONS OF FEEDBACK CONTROL TO MUSICAL INSTRUMENT DESIGN

A DISSERTATION SUBMITTED TO THE DEPARTMENT OF ELECTRICAL ENGINEERING AND THE COMMITTEE ON GRADUATE STUDIES OF STANFORD UNIVERSITY IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

Edgar Joseph Berdahl
December 2009

© 2010 by Edgar Joseph Berdahl. All Rights Reserved. Re-distributed by Stanford University under license with the author. This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. http://creativecommons.org/licenses/by-nc-nd/3.0/us/

This dissertation is online at: http://purl.stanford.edu/pb603fs3989

I certify that I have read this dissertation and that, in my opinion, it is fully adequate in scope and quality as a dissertation for the degree of Doctor of Philosophy.
Julius Smith, III, Primary Adviser

I certify that I have read this dissertation and that, in my opinion, it is fully adequate in scope and quality as a dissertation for the degree of Doctor of Philosophy.
Gunter Niemeyer

I certify that I have read this dissertation and that, in my opinion, it is fully adequate in scope and quality as a dissertation for the degree of Doctor of Philosophy.
J Salisbury, Jr

Approved for the Stanford University Committee on Graduate Studies.
Patricia J. Gumport, Vice Provost Graduate Education

This signature page was generated electronically upon submission of this dissertation in electronic format. An original signed hard copy of the signature page is on file in University Archives.

Preface

In 1963, Max Mathews wrote that the "numbers which a modern digital computer generates can be directly converted to sound waves. The process is completely general, and any perceivable sound can be so produced" [106]. Since roughly this time, composers have been able to synthesize an amazing variety of sounds using diverse techniques [134].
However, compared with the development of traditional music over several centuries, the development of computer music has been so quick and so eager that the quality of the physical interaction between the performer and the instrument has deteriorated, to the detriment of both the performance and its reception by the audience.

By now laptops have become computationally powerful yet relatively inexpensive and portable, so it is no surprise that many computer music performers experiment with the sound synthesis and sensing capabilities that laptops offer. Indeed, this approach is the easiest and most cost-effective one. Consider a typical live performance of modern computer music. A performer might cause a loudspeaker to emit a musical sound wave by alternately pressing the keys, shaking the accelerometer, and/or sliding his or her finger across the trackpad of a laptop. Yet the relationship between these small movements and the sound may not be immediately obvious. As a consequence, the members of the audience or other performers on stage may not build an accurate conceptual model describing the musical interaction. What do they perceive about the quality of the performance? How can they judge the degree of virtuosity of the performer? How quickly is the performer able to improve his or her own playing level? How can we make it easier for people to build a conceptual model of a novel musical interaction?

I argue that we need to search for satisfying answers to these big questions now that we have harnessed the power of computation and sensing. An arbitrary programmed interaction involving sensors, computers, and loudspeakers will not necessarily be immediately comprehensible, because there is an overabundance of design freedom.
Rather than being haunted by this overabundance, known to some through the dictum that "programmability is a curse" [53], we should develop designs that help performers and audiences form accurate conceptual models of the musical interaction. At present, I cannot provide general answers to all of the questions posed above. However, I would like to point out that Physical Interaction Design presents us with a design method that addresses these questions adequately.

Physical Interaction Design is predicated on the fact that humans are physical beings, accustomed to interacting with objects in the physical environment [165][111]. We frequently build and refine internal conceptual models for these interactions [169], and we can construct higher-order models out of more basic models that we have already developed [139]. We can use these models to transfer skills between virtual and real environments [86]. When we design musical instruments that incorporate meaningful physical interactions, whether those interactions are implemented in the real world or in a virtual world, the instruments promote the formation of rich conceptual models.

Some computer music performance paradigms are beginning to reflect Physical Interaction Design and, in particular, Newton's third law, which states that "for every action there is an equal and opposite reaction." We can re-interpret Newton's third law rather loosely and abstractly as follows: when a performer manipulates a sensor to cause a sound to be synthesized, the resulting sound wave should emanate from an actuator near the sensor. For example, laptop orchestras are already widely adopting the convention that each musician's sound source should be collocated with the musician's physical actions [159]. This approach requires more cables and amplifiers; however, it makes it easier for performers and audience members alike to understand where sounds are coming from and why they are being synthesized.
As a consequence, this approach helps the performers and the audience form accurate conceptual models.

I did not plan for the instrument designs outlined in this thesis to conform to the notions of Physical Interaction Design. I merely set out with the idea of enhancing the musician's interaction with the instrument by providing improved force feedback. In other words, my goal was to study how to make electronic musical instruments more tangible, so that the performance of electronic music would seem less artificial, "dry," "synthetic," and "mechanical" [29]. Starting out from an engineer's point of view, I studied classes of controllers for generating force feedback. Positive real controllers were guaranteed to result in a stable control system when connected to any dissipative acoustic musical instrument. To my great surprise, I discovered that any positive real controller is equivalent to a passive mechanical system. In other words, a large set of controllers that are guaranteed to be stable when applied to any dissipative acoustic musical instrument correspond to physical analog models! From that point onward, I repeatedly found that physical analog systems provided solutions to the practical control problems I studied. As a consequence, I have become an unexpectedly fervent proponent of Physical Interaction Design.

I would like to push the idea of applying Physical Interaction Design to Newton's third law a bit further. Let us consider a stricter interpretation of Newton's third law. Assume now that a sensor and an actuator are perfectly collocated in the mathematical sense described in Section 2.3.2. When the performer pushes on the sensor, the collocated actuator pushes back on the musician. In other words, while the performer is in contact with the musical sounding object, he or she feels the natural, structural vibrations of the object that he or she is inducing.
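The positive real property mentioned above has a simple frequency-domain signature: a transfer function H(s) is positive real only if Re{H(jω)} ≥ 0 for all ω, which is the sense in which such a controller can only absorb energy, like a passive mechanical element. As a minimal numerical sketch (my illustration, not from the dissertation; the spring-damper values b and k are arbitrary), one can test this necessary condition on a frequency grid and compare a passive spring-damper impedance Z(s) = b + k/s against a non-passive "negative damper":

```python
import numpy as np

def satisfies_pr_condition(H, omegas):
    """Numerically check Re{H(jw)} >= 0 on a frequency grid.
    This is a necessary (not sufficient) condition for positive realness."""
    return all(H(1j * w).real >= -1e-12 for w in omegas)

# Impedance of a passive spring-damper: Z(s) = b + k/s
# (illustrative values: damping b in N*s/m, stiffness k in N/m)
b, k = 2.0, 50.0
Z = lambda s: b + k / s

# A "negative damper" G(s) = -b injects energy and is not passive
G = lambda s: -b

omegas = np.logspace(-2, 4, 1000)          # rad/s
print(satisfies_pr_condition(Z, omegas))   # True  -> candidate passive controller
print(satisfies_pr_condition(G, omegas))   # False -> could destabilize the instrument
```

Here Re{Z(jω)} = b ≥ 0 at every frequency, consistent with the claim that such a controller corresponds to a physical analog model, while the negative damper fails the check immediately.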
The physical interaction is collocated with the result: the sound wave emanates from the position of the collocated sensor and actuator. I argue that sound and touch should be collocated.

I discuss two paradigms for designing these types of musical sounding objects. In Chapters 1-4, I explain feedback controlled acoustic musical instruments, which enable a musician to interact with a traditional, acoustic musical instrument whose acoustical properties are modified by collocated feedback. The performer interacts directly with the physical medium responsible for sound production. Haptic musical instruments are presented in Chapters 5-7. A large class of collocated haptic musical instruments enables the performer to interact with virtual musical instruments. These instruments promote natural physical interactions and motivate the guidelines I provide in Chapter 8.

Dedication

This is dedicated to my parents Paul Berdahl and Jane Berdahl and my brother Carl Berdahl for their love and support.

Acknowledgements

I would like to extend my most heartfelt thanks to my first two thesis readers, Julius O. Smith III (Music/EE) and Günter Niemeyer (ME/EE), for their ongoing encouragement and support. I came to Stanford University specifically to work with Julius, and he never ceased to inspire me with his seemingly never-ending knowledge of digital signal processing and sound synthesis. Julius gave me the freedom to study unconventional yet fundamental approaches to synthesizing musical sound. He supplied me with financial support from CCRMA and the RealSimple project, and when I needed physical laboratory space, he generously provided me with access to one of the desks in his own office. Without Günter's guidance, this thesis would not have been possible either. First, he aided me by opening my eyes to the practical knowledge necessary to make feedback control systems work well in practice.
In addition, he taught me a great deal about how to organize and develop research in a rigorous framework. I would further like to thank Kenneth Salisbury (CS/Surgery), my third reader, especially for his emphatic support in studying new applications of haptics to synthesizing sound. Finally, I thank Scott Klemmer, my committee chair, for encouraging me to think about how my thesis work relates to the principles of human-computer interaction.

Many other researchers have influenced me during my dissertation research. Most notably, Bill Verplank and Chris Chafe inspired me to consider how to make it easier to play physical models in real performance contexts.