
MAT 276IA: Introduction to Algorithmic Composition and Sound Synthesis!

Instructor: Dr. Charlie Roberts - [email protected]
Assisted by: Ryan McGee - [email protected]
Credits: 4
Time: Wednesday / Friday 10:00AM - 11:50AM
Room: Elings Hall, 2003 (MAT Conference Room)
Office Hours: TBD

// Course Synopsis!

This course provides an introduction to techniques of electroacoustic production through the lenses of sound synthesis and algorithmic composition. We will begin with basic acoustics and digital audio theory, and advance to sound synthesis techniques and algorithms for exploring their use. The course will explore these topics using the browser-based creative coding environment Gibber (http://gibber.mat.ucsb.edu). The language used in Gibber is JavaScript, and a basic overview of JavaScript will be provided. No programming experience is required (although a small amount of previous programming experience is preferred; please contact the instructor if you have questions about this), and Gibber is written with beginning programmers in mind. Despite this, it offers a number of advanced features not available in most contemporary music programming languages, including sample-accurate timing and intra-block audio graph modification, audio-rate modulation of timing with concurrent clocks, and powerful abstractions for sequencing and defining musical mappings. It also possesses a versatile graphics library and many interactive affordances.

JavaScript is becoming an increasingly important language in the electroacoustic landscape. It is used in a variety of DAWs (such as Logic Pro and Reaper) and is an important part of the Max/MSP ecosystem. Learning the basics of JavaScript also means that you can create interactive compositions for the browser, the best vehicle for widespread dissemination of audiovisual works.

Students will leave the course with a high-level understanding of various synthesis techniques: additive, subtractive, granular, FM, and physical modeling, as well as knowledge of digital audio effects. They will explore the use of algorithms to create compositions using these synthesis techniques. Finally, they will publish their work to the web as compositions that are synthesized in realtime using browser-based audio technologies. Course discussions will be accompanied by both in-class and assigned listenings.
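For a first taste of what browser-based audio looks like in code, here is a minimal sketch in plain JavaScript using the standard Web Audio API. (This is not Gibber's syntax, which is considerably more concise; the names ctx, note, osc, and amp are purely illustrative.)

    // Create an audio context -- the entry point to the Web Audio API.
    const ctx = new AudioContext();

    // Object notation: a plain JavaScript object describing a note.
    const note = { frequency: 440, duration: 2 };

    // Function calls: create an oscillator and a gain node.
    const osc = ctx.createOscillator();
    const amp = ctx.createGain();
    osc.frequency.value = note.frequency;
    amp.gain.value = 0.25;

    // Connect oscillator -> gain -> speakers.
    osc.connect(amp);
    amp.connect(ctx.destination);

    // Start now, stop after the note's duration.
    osc.start();
    osc.stop(ctx.currentTime + note.duration);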
// Instructor Bios!

Charlie Roberts is a Postdoctoral Fellow in the AlloSphere Research Group at UCSB, where his research explores human-computer interaction in virtual reality and creative coding environments. He is the primary author of Gibber, a creative coding environment for the browser, and has given performances in the US, Europe, and Asia using it to improvise audiovisual art. His upcoming scholarly work includes a journal article on Gibber, an invited chapter in the Oxford Handbook of Algorithmic Music, and an article on teaching with Gibber in the Journal of Music, Technology and Education.

Ryan McGee is an engineer, composer, and new media artist experienced in the development of custom and commercial software for spatial audio, sonification, sound design, interactivity, and mobile applications. He holds a BSEE from the University of Texas at Dallas and an MS in Multimedia Engineering from UCSB. Ryan is currently completing his PhD dissertation on Spatial Modulation Synthesis, a novel technique unifying sound spatialization and timbre.

// Grades!

30% - Four mini-assignments exploring specific concepts presented in class
25% - Midterm composition / documentation
30% - Final composition / documentation
15% - Participation

// Attendance!

Don’t miss class! If you have to miss a class, please let Ryan and me know beforehand. Missing multiple classes will result in a lowering of the participation portion of your grade.

// Schedule!

Week 1. Introduction to Gibber and JavaScript!
In this first week, we will focus on learning the basics of JavaScript while having fun with Gibber. The musical emphasis will be on creating rhythmic patterns using percussion instruments. By the end of the week, students will understand the basics of JavaScript (object notation, variables, and function calls) and the possibilities of Gibber and web-based audio composition.

Week 2. Acoustics, the Sampling Theorem, Sine Waves, and Additive Synthesis!
We start by discussing the physics of audio production and the phenomenology of audio reception. We then move to principles for converting digital audio representations into physical waveforms, and conclude the week with an overview of sine waves, additive synthesis, and enveloping.
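To preview additive synthesis in code: complex timbres are built by summing sine waves. Below is a minimal sketch in plain Web Audio JavaScript rather than Gibber; the 1/n amplitude weighting, which approximates a sawtooth, is just one possible choice, and the closing ramps illustrate the enveloping mentioned above.

    const ctx = new AudioContext();
    const master = ctx.createGain();
    master.connect(ctx.destination);

    // Sum the first eight harmonics of a 110 Hz fundamental;
    // weighting partial n by 1/n approximates a sawtooth wave.
    const fundamental = 110;
    for (let n = 1; n <= 8; n++) {
      const osc = ctx.createOscillator(); // sine is the default waveform
      const amp = ctx.createGain();
      osc.frequency.value = fundamental * n;
      amp.gain.value = 1 / n;
      osc.connect(amp);
      amp.connect(master);
      osc.start();
    }

    // Envelope the summed result: quick attack, slow decay.
    const now = ctx.currentTime;
    master.gain.setValueAtTime(0, now);
    master.gain.linearRampToValueAtTime(0.2, now + 0.05);
    master.gain.linearRampToValueAtTime(0, now + 3);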
Week 3. Filters, Subtractive Synthesis, and Modulation!
We examine the frequency content of canonical waveforms (triangle, saw, square, PWM, etc.), followed by a discussion of audio filters. We extend the previous week’s coverage of envelopes to include the use of low-frequency oscillators and envelope followers for modulation.

Week 4. Twentieth-Century Algorithms in Action: Serialism, Twelve-Tone Technique, and Algorithmic Processing!
What are the possibilities for the algorithmic manipulation of musical sets? We discuss serial techniques and look at time-based approaches for manipulating musical information. Along the way, we learn more about creating and manipulating functions in JavaScript, and about the powerful pattern manipulations available in Gibber.

Week 5. Frequency Modulation and Human-Computer Interaction!
We learn how to make your favorite sounds from 80s pop music using the synthesis technique of frequency modulation. We also look at more contemporary uses of frequency modulation, and discuss the incorporation of interactive elements into algorithmic music.

Week 6. Realtime Composition and Performance!
Live coding is a (typically) improvisatory performance practice in which audiovisual works are programmed in front of audience members. What temporal strategies do live coders use in their performances? How can these strategies be employed in “traditional” algorithmic composition?

Week 7. Algorithms Part 2 and Digital Audio FX!
We discuss canonical digital audio effects and explore algorithms for manipulating them. We also discuss a variety of algorithms typically used in media arts practice, such as the Boids flocking algorithm, Conway’s Game of Life, and L-systems, covering their historical use in composition and their manipulation in Gibber.

Week 8. Sampling, Re-sampling, and Granular Synthesis!
How can live audio input be incorporated into compositional strategies? What is the potential of sampling algorithmic output and processing it? What algorithms effectively process sampled algorithmic output? Granular synthesis using audio samples is also explored.

Week 9. An Introduction to Visual Music and Audiovisual Mapping!
Gibber contains a variety of graphical affordances and abstractions for easily creating mappings between audio and visual objects. How can we design algorithms that effectively manipulate both audio and visual elements concurrently? What types of connections between audio and visual signals are most easily perceived by audiences?

Week 10. Other Music Programming Languages + Final Project Workshop!
What does all of this look like in other music programming languages? We briefly compare and contrast audio synthesis and algorithm use in Gibber with other environments such as SuperCollider, ChucK, Tidal, and Overtone. We’ll also leave time to get assistance from Ryan, Charlie, and your fellow classmates in preparing your final projects for presentation.
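One practical note for web-published projects: most browsers will not start audio until the user interacts with the page, so realtime pieces typically begin from a click handler. A minimal sketch of that pattern in plain JavaScript follows; the single oscillator is simply a placeholder for an actual composition.

    // Wait for the first click anywhere on the page, then start audio.
    document.addEventListener('click', () => {
      const ctx = new AudioContext();

      // Placeholder content: replace with your composition.
      const osc = ctx.createOscillator();
      const amp = ctx.createGain();
      osc.frequency.value = 220;
      amp.gain.value = 0.2;
      osc.connect(amp);
      amp.connect(ctx.destination);
      osc.start();
    }, { once: true });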