Digital Music Input Rendering for Graphical Presentations in SoundStroll (MaxMSP)

Justin Kerobo
School of Computer Science and Music
Earlham College
Richmond, Indiana 47374
Email: [email protected]

Abstract—A graphical presentation is produced at the display of a host computer such that a scene description is rendered and updated by received digital music input. The digital music input is matched against trigger events of the scene description, and the actions of each matched trigger event are executed in accordance with the action processes of the scene description, thereby updating the scene description with respect to the objects depicted in the scene on which the actions are executed. The updated scene description is then rendered. The system provides a means for connecting a graphics API to a musical instrument digital interface (e.g., MIDI) data stream, possibly from Ableton or Reason, to MaxMSP, producing a graphical presentation in SoundStroll in MaxMSP.

Keywords—MIDI, LaTeX, OpenGL, Graphics, MaxMSP, Music Technology, Ableton, FFT, IFFT, Computer Music.

I. INTRODUCTION

A variety of computer software programs are available for defining and manipulating objects in a virtual three-dimensional (3D) world. For example, "3DSMax" from Autodesk, Inc. and SolidWorks provide an assortment of tools in a convenient graphical user interface (GUI) for manipulating and editing 3D virtual objects. Programs for computer display screensavers also permit manipulation of moving images. Also very popular are programs for manipulating video clips, multimedia clips, and the like, such as Max, Aperture, and ArKaos. Another popular medium that supports creativity with computers is the family of software applications built around the musical instrument digital interface (MIDI) standard.

The MIDI standard permits connection of musical instruments with digital output to related digital sound processing devices, including computers with sound cards and sound editing applications, soundboards, broadcast equipment, and the like. Music has commonly been performed with instruments that send digital MIDI data since the introduction of MIDI in 1983. MIDI provides a flexible set of instructions that are sent over a serial data link from a controller to a receiver, which processes those commands in a variety of ways that pertain to the output functions of the receiving device. The data and instructions most commonly involve sounds and music, but can also include instructions for machine control and lighting control devices.

A separate branch of technology is the development of computer video graphics: the digital electronic representation and manipulation of virtual worlds comprised of three-dimensional objects in a 3D space, with applications in many fields, from microscopic imaging to galactic modeling, and notably computer graphics for films and gaming environments. There have been a few attempts to associate the direct performance of music with computer video and graphics to create new art forms. One program, Bliss Paint for the Macintosh, used MIDI input to change colors on an evolving rendering of a fractal image. Another program, ArKaos, uses MIDI commands to play video clips in a DJ-like process. A third program, MaxMSP, uses MIDI commands in a flexible environment to drive video clips, audio clips, and external events.

There are many computer programs that control sound in various ways in response to a MIDI command stream. The "3DMIDI" program appears to be unsupported, and it is not clear whether the software works or ever worked. The available documentation describes a set of separate programs, each of which performs a prescribed set of transformations on an embedded set of objects in response to MIDI.

Each different performance is loaded and executed separately, and has its own unique tool set for making specific adjustments to the objects in that scene. There is an API that invites others to develop their own performances, each with its own unique set of objects and tools, which cannot be edited at that point. Unfortunately, there is no convenient user interface available for connecting computer graphics with musical instrument digital data. Conventional methods generally require cumbersome specification of input sources, scene description parameters and data objects, and linking of input sources to scene description objects. As a result, a relatively high level of computer skill is necessary for creating graphical presentations in conjunction with music input.
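The linking of input sources to scene description objects can be sketched in ordinary code. The following Python sketch is illustrative only: the names (Scene, Trigger, MidiEvent, scale_cube) are assumptions for the example, not SoundStroll's actual API. It shows the pattern of matching an incoming MIDI event against a scene's triggers and executing the matched actions before the next render pass.

```python
# Hypothetical sketch of trigger-event matching: MIDI events are matched
# against a scene description's triggers, and each matched trigger's action
# updates the scene before it is re-rendered. All names are illustrative.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class MidiEvent:
    kind: str      # e.g. "note_on", "note_off", "cc"
    channel: int
    data1: int     # note number or controller number
    data2: int     # velocity or controller value

@dataclass
class Trigger:
    kind: str
    note: int
    action: Callable[["Scene", MidiEvent], None]

@dataclass
class Scene:
    objects: dict = field(default_factory=dict)
    triggers: list = field(default_factory=list)

    def handle(self, event: MidiEvent) -> None:
        # Match the incoming MIDI event against every trigger and execute
        # the actions of each matched trigger to update the scene.
        for trig in self.triggers:
            if trig.kind == event.kind and trig.note == event.data1:
                trig.action(self, event)

def scale_cube(scene, event):
    # Map velocity (0-127) onto a cube's size before the next render pass.
    scene.objects["cube"]["size"] = 1.0 + event.data2 / 127.0

scene = Scene(objects={"cube": {"size": 1.0}})
scene.triggers.append(Trigger("note_on", 60, scale_cube))
scene.handle(MidiEvent("note_on", channel=0, data1=60, data2=127))
print(scene.objects["cube"]["size"])  # 2.0
```

A convenient user interface would, in effect, let the user build the `triggers` list graphically instead of by hand.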
It would improve creative output if users could create scenes with objects and change both the objects and the nature of the interaction between the video graphics and the MIDI music data.

Because of these difficulties and this increased complexity, there is a need for a graphical user interface that supports integration with digital musical instruments and vocal recognition, and it is possible to provide this through a Fourier transform (FT). It should be possible to create this interface as a three-dimensional audio sequencer and spatializer in MaxMSP, using a speech processing application together with Fast Fourier Transform (FFT), Inverse Fast Fourier Transform (IFFT), and Discrete Fourier Transform (DFT) analyses to filter for keywords, find objects, and create a scene that can be traversed.
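The FFT/IFFT analysis chain mentioned above can be illustrated with a minimal NumPy sketch. This only shows the kind of spectral filtering involved (transform, zero out unwanted frequency bins, transform back); a real speech or keyword detector would be far more involved, and the sample rate and frequencies here are arbitrary choices for the example.

```python
# Minimal FFT -> filter -> IFFT round trip using NumPy, illustrating the
# kind of spectral analysis the text proposes. Parameters are arbitrary.

import numpy as np

sr = 8000                                    # sample rate in Hz (assumed)
t = np.arange(sr) / sr                       # one second of samples
# A test signal: a 440 Hz tone mixed with an unwanted 3000 Hz tone.
x = np.sin(2 * np.pi * 440 * t) + np.sin(2 * np.pi * 3000 * t)

X = np.fft.rfft(x)                           # DFT computed via the FFT algorithm
freqs = np.fft.rfftfreq(len(x), d=1 / sr)    # frequency of each bin

X[freqs > 1000] = 0                          # crude low-pass: zero bins above 1 kHz
y = np.fft.irfft(X, n=len(x))                # IFFT back to the time domain

# The dominant remaining component should be the 440 Hz tone.
peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(y)))]
print(round(peak_hz))  # 440
```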
II. HISTORY OF MIDI

The MIDI (Musical Instrument Digital Interface) protocol has become the dominant method of connecting pieces of electronic musical equipment, and when you consider the previous standard, you have to say that MIDI arrived at just the right time.

The control voltage (CV) and gate trigger system used on early analogue synths was severely limited in its scope and flexibility. Analogue synths tended to have very few features that could be controlled remotely, relying as they did on physical knobs and sliders, patch cables, and manual programming.

Furthermore, there was no universal standard for the way CV control should work, complicating the process of interfacing products from different manufacturers. The majority of vintage CV-controlled synths can now be adapted with a CV-to-MIDI converter, so you can use MIDI to control them.

Dave Smith, founder of Californian synth legend Sequential Circuits and now head of Dave Smith Instruments, anticipated the demand for a more powerful universal protocol and developed the first version of the MIDI standard, which was released in 1983. With the increasing complexity of synths, and as the music industry shifted towards digital technology and computer-based studios, MIDI took off and became the standard for connecting equipment.

A. How It Works

Absolutely no sound is sent via MIDI, just digital signals known as event messages, which instruct pieces of equipment. The most basic example of this can be illustrated by considering a controller keyboard and a sound module. When you push a key on the keyboard, the controller sends an event message which corresponds to that pitch and tells the sound module to start playing the note. When you let go of the key, the controller sends a message to stop playing the note.

Of course, the MIDI protocol allows for control over more than just when a note should be played. Essentially, a message is sent each time some variable changes, whether it be note-on/off (including, of course, exactly which note it is), velocity (determined by how hard you hit the key), after-touch (how hard the key is held down), pitch-bend, pan, or modulation.

To control a System Exclusive (SysEx) function, a manufacturer-specific ID code is sent. Equipment which isn't set up to recognize that particular code will ignore the rest of the message, while devices that do recognize it will continue to listen. SysEx messages are usually used for tasks such as loading custom patches, and are typically recorded into a sequencer using a 'SysEx Dump' feature on the equipment.

MIDI information was originally sent over a screened twisted-pair cable (two signal wires plus an earthed shield to protect them from interference) terminated with 5-pin DIN plugs. However, this format has been superseded to some extent by USB connections, as we'll discuss later. No waves or varying voltages are transmitted: MIDI data is sent digitally, meaning that the signal pins either carry a voltage or none at all, corresponding to the binary logical values 1 and 0.

These binary digits (bits) are combined into 8-bit messages, and the protocol supports data rates of up to 31,250 bits per second. Each MIDI connection sends information in one direction only, meaning two cables are needed if a device is used both to send and receive data (unless you're working over USB, that is).

In addition to the expected IN and OUT connections, most MIDI devices also have a THRU port. This simply repeats the signal received at the IN port so it can be sent on to other devices further down the chain. Devices may be connected in series and, for the largest MIDI setups, an interface with multiple output ports may be used to control more than 16 separate chained devices.

B. Becoming a Standard

The key feature of MIDI when it was launched was its efficiency: it allowed a relatively significant amount of information to be transmitted using only a small amount of data. Given the limitations of early-'80s digital data transmission methods, this was essential to ensure that the reproduction of musical timing was sufficiently accurate.

Manufacturers quickly adopted MIDI, and its popularity was cemented by the arrival of MIDI-compatible computer hardware (most notably the built-in MIDI ports of the Atari ST, released in 1985). As weaknesses or potential extra features were identified, the MIDI Manufacturers Association updated the standard regularly following its first publication.
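The event messages described in the How It Works subsection are short byte sequences: a channel-voice message is a status byte (top bit set) followed by data bytes (top bit clear). The sketch below encodes and decodes the three-byte note-on/note-off messages as laid out in the MIDI 1.0 specification; the helper function names are our own.

```python
# Encoding and decoding MIDI 1.0 note-on/note-off event messages.
# Each message is three bytes: status, then two 7-bit data bytes.

def note_on(channel: int, note: int, velocity: int) -> bytes:
    # Status bytes 0x90-0x9F mean note-on for channels 0-15.
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel: int, note: int, velocity: int = 0) -> bytes:
    # Status bytes 0x80-0x8F mean note-off.
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def decode(msg: bytes):
    status, data1, data2 = msg
    kind = {0x90: "note_on", 0x80: "note_off"}.get(status & 0xF0, "other")
    # By convention, a note-on with velocity 0 is treated as a note-off.
    if kind == "note_on" and data2 == 0:
        kind = "note_off"
    return kind, status & 0x0F, data1, data2

msg = note_on(channel=0, note=60, velocity=100)   # middle C on channel 1
print(msg.hex())        # 903c64
print(decode(msg))      # ('note_on', 0, 60, 100)
```

At 31,250 bits per second with 10 bits on the wire per byte (8 data bits plus start and stop bits), one such three-byte message takes just under a millisecond to transmit, which is why MIDI's compactness mattered so much in the early days.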