Methods of System Synchronization and Interconnection for Live Performance By John Huntington

A Thesis Submitted to the Faculty of the Yale School of Drama Department of Technical Design and Production In Partial Fulfillment of the Requirements for the Degree of Master of Fine Arts in Drama From Yale University

May, 1990

© John Huntington 1990

ACKNOWLEDGEMENTS

There are many people who assisted me during the writing of this thesis, and I want to thank them here, in no particular order:

Glenn Birket, Founder of Kinetix - Helped define the direction of this thesis and got me access to systems at Disney.
Keith Pollock, Software Engineer at Kinetix - Supplied information about Disney's Indy show.
Steve Bartlett, Projectionist for Pink Floyd's Momentary Lapse of Reason Tour - Supplied information about the Pink Floyd SMPTE system and was one of my readers.
Associates and Ferren - Where I used to work, and where I was introduced to SMPTE Time Code, PLC's, and many of the other concepts discussed in this thesis.
Paul Lehrman, Composer/Author/Teacher - Read over the MIDI section and wrote an excellent MTC article in Keyboard.
Charlie Richmond, Richmond Sound Design - Supplied information about his systems on the Indiana Jones Stunt Spectacular, the Siegfried and Roy show, and his work on a MIDI standard.
Mark Koenig, at Lone Wolf (MediaLink) - Checked over and supplied information for the MediaLink section.
Joe Fitzpatrick, at ShowNet - Supplied information about and read over the ShowNet section, and hooked me up with:
Scott Cunningham, of LASER Design - Supplied me with information about and access to his ShowNet-controlled systems on the Mötley Crüe tour.
Fraser Bresnahan, Visual Production Supervisor for Laurie Anderson - Supplied me with information about and access to Laurie Anderson's Strange Angels tour.
Jonathon Deans, Sound Designer for Siegfried and Roy - Provided me with the information covered in the S&R section.

I want to thank the faculty of the Yale School of Drama for their input and guidance, and especially thank my faculty advisor, Alan Hendrickson, Associate Professor (Adjunct) of Technical Design and Production.

I also want to thank my readers:

Steve Bartlett, Product Manager, JBL Professional
Richard A. Gray, President, R.A. Gray, Inc.
Steve Terry, Executive Vice President, Production Arts Lighting

Finally, I want to thank my parents for giving me the freedom and the educational opportunities to pursue my own goals, whether they understood them or not.

TABLE OF CONTENTS

LIST OF FIGURES...... vii
LIST OF TABLES...... viii
PREFACE...... ix
  Scope...... ix
  Audience...... ix
  Possible Obsolescence...... x
INTRODUCTION...... 1
SYNCHRONIZATION CONCEPTS...... 4
  Master and Slave...... 5
  Event- or Time-Based...... 5
  Relative or Absolute...... 6
STANDARD INTERCONNECTION METHODS...... 9
  Contact Closures and PLC's...... 10
    Programmable Logic Controllers...... 10
    Disney/MGM Indiana Jones Epic Stunt Spectacular...... 13
      System Description...... 13
      Operation...... 15
      Other Control Inputs...... 15
      Sound System...... 16
      Reasons for Using Contact Closures...... 17
  SMPTE Time Code...... 18
    Description...... 19
      User Bits...... 19
      LTC...... 20
      VITC...... 22
    Types of Time Code...... 24
      Drop Frame Time Code...... 24
      European Time Code...... 25
    Standard Uses of Time Code...... 26
    Pink Floyd and SMPTE Time Code...... 29
      System Description...... 30
      Associates and Ferren Automated Film Projection System...... 31
      Time Code Scheme...... 33
      Did the Use of Time Code Limit the Performance?...... 34
      Expansions of the System...... 34
      Problems and Backup...... 35
      Previous Sync Attempts...... 36
  MIDI...... 37
    Description...... 38

    Differences Between SMPTE TC and MIDI...... 38
    MIDI Hardware...... 39
    MIDI Messages...... 40
      Channel Messages...... 40
      System Messages...... 42
    MIDI Sync...... 43
    MIDI Time Code...... 44
      MTC Messages...... 45
      SMPTE User Bits and Set Up Messages...... 46
    Siegfried and Roy and MIDI Control...... 48
      System Description...... 48
      Operation and Backup...... 50
      Master Control...... 50
      Advantages of the System...... 51
    Possible MIDI Expansions for Theatrical Control...... 52
LOCAL AREA NETWORKS...... 56
  Network Topologies...... 58
    Star Topology...... 58
    Bus Topology...... 59
    Ring Topology...... 59
  Problems with PC Networks for Entertainment Applications...... 61
  ShowNet...... 63
    Show Control Computer...... 64
    CUE Software Package...... 64
    Interfaces...... 65
    Mötley Crüe and ShowNet...... 66
      System Description...... 66
      Manual Operation...... 67
      Other Applications...... 68
  Strand's SMX Protocol...... 69
  MediaLink...... 70
    MediaLink LAN...... 70
    Datagrams...... 71
    Agents...... 71
    Operating Modes...... 72
    Routing...... 72
    MIDITap...... 72

CONCLUSION...... 74
  The Future...... 75
    System Description...... 76
    Console Design Philosophy...... 77
    Audio System...... 78

    Drawbacks of the System...... 78
  Conclusion...... 79
Appendix A ANSI/SMPTE 12M-1986...... 80
Appendix B SMPTE Recommended Practice 136-1986...... 90
Appendix C MIDI 1.0 Specification...... 97
Appendix D MIDI Time Code Specification...... 110
Bibliography...... 122
Contact Addresses...... 123

LIST OF FIGURES

Figure 1 Simplified Indiana Jones Stunt Spectacular Block Diagram...... 14
Figure 2 Bi-Phase Representation of LTC Frame 16:47:31:23...... 20
Figure 3 Bi-Phase Modulation...... 21
Figure 4 Simple SMPTE TC Synchronization Block Diagram...... 26
Figure 5 Pink Floyd Momentary Lapse of Reason Block Diagram...... 30
Figure 6 Simple MIDI System Block Diagram...... 39
Figure 7 MIDI Interface Schematic...... 39
Figure 8 Simplified Siegfried and Roy System Block Diagram...... 49
Figure 9 Star Topology...... 58
Figure 10 Bus Topology...... 59
Figure 11 Ring Topology...... 60
Figure 12 Token Frame Structure...... 60
Figure 13 Mötley Crüe Block Diagram...... 66
Figure 14 Simplified "Ideal" System Block Diagram...... 76

LIST OF TABLES

Table I Usage of LTC Bits...... 20
Table II MIDI Channel Messages...... 41
Table III MIDI System Messages...... 43
Table IV MIDI Time Code Messages...... 46
Table V Possible MIDI SysEx Scheme for Theatrical Control...... 53

PREFACE

As a child, I often spent my time drawing complex machines. When my mother asked how they would work, I would reply, "You just press a button!" I am still, these many years later, fascinated by systems which cause a series of events to happen automatically with the press of a button. This thesis is a continuation and extension of interests I have had in entertainment control systems for quite some time. I have really enjoyed writing this, and I have learned a great deal in the process.

Scope

Defining the scope of this thesis has not been an easy task, and it has taken me about a year to really whittle it down to a manageable size. I have limited discussion of synchronization methods and systems to those that have applications for productions in which live performers perform for a live audience. This thesis is an introduction, not a comprehensive guide, to some of the concepts and methods of control system synchronization and interconnection for the live performance industry.

Audience

In order to restrict the amount of explanation of terminology and concepts in the thesis, I have made some assumptions about my audience. If I had to provide the background information necessary to enable a layman to understand phrases such as

"asynchronous opto-isolated serial interface" or "16 channels of full bandwidth digital audio", this thesis would probably be 1000 pages long. Besides, those areas have been well covered in a variety of reference works. So, I'm making the following assumptions about my audience: 1) They are familiar with the technical side of the entertainment industry, and specifically have some knowledge of technical theatre. 2) They are computer literate. 3) They are familiar with basic electronics/electricity terminology and concepts. 4) They have some experience with electronic/electrical control systems.

Possible Obsolescence

Much of the technology discussed in this thesis, while certainly not new to better funded industries, is new to the entertainment business. As I write this in 1990, I am sure that some of the information included here will be outdated or obsolete in just a few years. I am making every effort to ensure that the information is as current as possible at the time of this writing.

INTRODUCTION

"Back in the old days, as a stage manager, I had a lot to do during the show. On big shows on Broadway, we had 40 or 50 people working backstage, all running around like crazy. I had to use flashing lights to signal cues, talk to people on headsets, and follow the script on paper! You know, in some ways, I almost miss those days, I didn't get quite as bored during the show then. Now we have these voice and music pattern recognition systems, and they interface on the network to the lighting, sound and automation systems. There's only a few technical people backstage, mostly in the wardrobe and props departments, because they haven't quite got the robotics cost effective yet. The only reason I'm even here is in case something goes wrong, and to press this 'Authorize' button on the dangerous cues, once the actor is in the right position. Other than that, I watch a lot of HDTV and play 3-D video games!"

Even today, this hypothetical quote from a stage manager of the future is not so far-fetched. Control systems are being synchronized and interconnected increasingly throughout the entertainment industry, and this is a trend I think will continue. Amusement parks like Walt Disney World and Universal Studios Tours have for years synchronized production control systems (lighting, sound, etc.) for their attractions. The musical acts Roger Waters, Pink Floyd, and Mötley Crüe all linked visual and sound systems on their last tours. Siegfried and Roy have interconnected the visual, sound-routing and playback systems on their new show in Las Vegas.

These systems were synchronized for a variety of reasons, including increased control precision, repeatability, and reduced labor costs. In the Pink Floyd example discussed later, the synchronization of a film image, audio tracks and musicians created a highly cohesive presentation which would probably not have been anywhere near as effective if done manually.

By adding computers to the traditional entertainment control loop, incredibly complex sequences that are difficult or impossible to implement manually can be repeated flawlessly night after night, provided that the systems get accurate cue trigger data. Humans will probably never be totally removed from the loop as long as there are still humans on stage, but the number of operators necessary can be greatly reduced.

SYNCHRONIZATION CONCEPTS

The operational concepts of synchronized systems are actually very simple. The complexity exists in the interconnection method. Synchronization concepts will be discussed in this section, and the methods used to implement these concepts will be described in subsequent sections.

Master and Slave

Perhaps the most basic synchronization concept is the "Master/Slave" relationship. In any situation where multiple elements are to be synchronized, one element must control another. The controlling element is the Master, and the element being controlled is the Slave. In traditional theatrical production, the master control source is the Stage Manager, and the various system operators (e.g. lights, sound) are slaved or "synched" to this master source.

Event- or Time-Based

Synchronized systems fall into one of three categories: event-based, time-based or a combination of the two. A theatrical production is generally event-based; the Stage Manager calls cues based on the occurrence of an event, which could be the speaking of a certain line of dialogue or an actor's action. Event-based systems give performers freedom to vary the timing in their performances, to improvise, or make mistakes.

The second major type of synchronized system is the time-based system. In these systems, all of the production elements, including the human performers, are synchronized by one means or another to a master clock. A time-based system is less forgiving to human performers, but has a major advantage -- once a time-based system is assembled and programmed, it can be run automatically. The performers are then responsible for locking themselves in sync with the control systems.

Other systems are a combination of the two types. Systems which are primarily event-based can have built in time-based sequences. Conversely, a time-based system can stop and wait for an event trigger before continuing. Musicals and operas are examples of hybrid time and event-based systems. The musical sections are time-based, with the conductor acting as a master clock. The clock stops for the non-musical sections, and starts again after some event, such as a particular line of dialogue, has transpired.

Relative or Absolute

Another important concept in synchronized systems is the difference between "absolute" and "relative". Absolute is defined as, "pertaining to measurements or units of measurement derived from fundamental relationships of space, mass, and time." Conversely, relative means "dependent upon or interconnected with something else for intelligibility or significance; not absolute."1

Rotary shaft encoders come in both absolute and relative varieties, and a comparison of the two types makes a good illustrative example of the differences between the two concepts. Both types encode the movement of the encoder's shaft through its rotation, and accomplish this through the use of a masked glass disk which interrupts an LED emitter-detector pair. However, the two types use different kinds of disks to generate absolute or relative positional information. A typical absolute encoder might send out a discrete 8 bit digital word which describes where in its rotation the encoder's shaft is. Each position of the shaft (to the resolution of the encoder), therefore, has a distinct digital value. If the power to the encoder goes off or the system suffers data corruption, the digital value representing the position is retained and can be easily reread. A disadvantage of these types of encoders is that, to get a useful signal with no repeated positional data, the entire movement of the measured system must be mechanically geared down to one revolution of the encoder's disk.

A relative encoder, on the other hand, simply generates a pulse train when its shaft is rotated. This is also a usable signal

1 William Morris, The American Heritage Dictionary of the English Language, American Heritage Publishing Co., Inc., New York, 1975.

for measuring angular position of the shaft, but only relative to a starting position. If the encoder's shaft is stationary, we get no signal. If we rotate the shaft 10 degrees clockwise, 200 pulses are generated. So we now know that, in relation to the starting position, the shaft is 200 pulses clockwise. These types of encoders need external counters in order to maintain an absolute positional value.
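The difference between the two encoder types can be sketched in code. The fragment below is purely illustrative (the class and method names are my own invention, not tied to any particular encoder hardware): an absolute encoder reports a meaningful position word at any time, while a relative encoder only reports pulses, so software must keep the count and must re-home after a power loss.

```python
class AbsoluteEncoder:
    """Reports an 8 bit word encoding shaft position directly."""
    def __init__(self):
        self._word = 0          # in hardware, read from the masked disk

    def read_position(self):
        return self._word       # 0-255: meaningful even right after power-up


class RelativeEncoder:
    """Generates pulses; position is only meaningful relative to home."""
    def __init__(self):
        self.count = None       # unknown until homed

    def home(self):
        self.count = 0          # find the mechanical starting point

    def pulse(self, direction=+1):
        # The external counter the text describes: without it (or after
        # power loss), the pulses alone tell us nothing absolute.
        if self.count is None:
            raise RuntimeError("must re-home after power loss")
        self.count += direction


enc = RelativeEncoder()
enc.home()
for _ in range(200):            # the 10-degree clockwise rotation above
    enc.pulse(+1)
print(enc.count)                # -> 200 pulses clockwise of home
```

The `home()` step is the "initialization sequence" the text mentions: it is the only way a relative system recovers an absolute reference.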

Data integrity is extremely important in any relative system, because if data is lost the positional count will be wrong. If the power supplying a relative system goes off, the system will not know where it is at power up without first re-initializing and finding its starting point. Since most relative systems have logic capabilities (more than likely they will be computer-controlled), this initialization sequence is simply programmed in.

STANDARD INTERCONNECTION METHODS

The interface or interconnection between synchronized systems is what allows the synchronization concepts detailed above to be implemented. The interconnection method is the platform upon which a synchronized system is built.

Contact Closures and PLC's

Probably the oldest and simplest method of control system interconnection is the use of contact closures. Contact closures can be generated by relays (mechanical or solid state), or transistors. Some older systems which used contact closures were based on hardware-implemented relay logic, but modern versions of these systems generally use a software version of relay logic running on a Programmable Logic Controller (PLC). Readers of this thesis may not be familiar with PLC's, so it's worth briefly describing them here.

Programmable Logic Controllers

A PLC is a low cost, heavy duty, industrial computer, with pre-wired optically isolated inputs, and with relay, transistor or solid state relay outputs. Originally designed to substitute for relay logic control systems, such as those found in control panels for elevators and other large machinery, PLC's have of late become remarkably sophisticated. They have been used for some time in factory automation, and are being used increasingly throughout the entertainment industry.

In a typical application, the inputs of a PLC are connected to control or limit switches, and the outputs are connected to motors, indicators, solenoid valves, contactors (which can drive heavy loads) or other devices that need to be controlled by the PLC. The PLC reads the condition of the inputs, and the internal program evaluates and executes the output states based on the status of the inputs.

This internal program is written by the user for each application, generally in a language called Ladder Logic, which is a sort of sequentially executed relay logic schematic. Graphically, the program looks somewhat like a ladder, hence the name. Programming is done using various types of "elements" which are analogous to relay contacts and coils (from its roots in relay logic). The program can be entered into the PLC either through the use of a dedicated programming panel, or through an interface with a larger computer such as an IBM PC.
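Ladder Logic itself is graphical, but its evaluation model, read every input, evaluate every rung, write every output, can be sketched in ordinary code. The Python fragment below is a rough illustration of one scan cycle (the rung data structure is invented for this example; real PLC instruction sets vary by vendor):

```python
def scan(inputs, rungs):
    """One PLC scan: evaluate each rung against the input image
    and return the resulting output image."""
    outputs = {}
    for coil, contacts in rungs.items():
        # A rung energizes its coil when every series contact passes.
        # ('NO', name) = normally open contact, ('NC', name) = normally closed.
        outputs[coil] = all(
            inputs[name] if kind == 'NO' else not inputs[name]
            for kind, name in contacts
        )
    return outputs


# One rung: the motor runs when the start switch is on AND the
# limit switch has not tripped.
rungs = {'motor': [('NO', 'start'), ('NC', 'limit')]}

print(scan({'start': True, 'limit': False}, rungs))   # motor energized
print(scan({'start': True, 'limit': True},  rungs))   # limit trips: motor off
```

A real PLC repeats this scan continuously, typically every few milliseconds, which is what makes the behavior feel like hard-wired relays.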

PLC's are incredibly reliable and almost impervious to electrical noise. A small $600 Mitsubishi PLC was used to control the local functions of the automated 35mm motion picture projector described later in the Pink Floyd section. The ignition pulse from the high power Xenon projection lamp used in these projectors stops most other computers dead, but didn't affect the PLC in any way. The PLC in that projector operated without any failures for the entire abusive duration of several world tours.

PLC's are also amazingly sophisticated, considering their price and size. Off-the-shelf fiber optic communication links, digital math capabilities, high speed counters, analog I/O's and motion control interfaces are available even for the smaller units.

Disney/MGM Indiana Jones Epic Stunt Spectacular

As the actor playing Indiana Jones tip-toes across the stage, six foot tall pungi sticks blast up out of the floor. Still reeling from the narrow escape, Indy climbs to the top of the rock ledge upstage, and grabs the treasure for which he has been searching. Suddenly, there is a loud rumble, and a giant boulder rolls straight towards him across the rocks. He tries furiously to outrun it, but it is just moving too fast! It rolls right over him. Did it really crush this poor actor? Is he alright?

Of course he's alright, this is Disney World! We have been witnessing the Indiana Jones Epic Stunt Spectacular at Disney/MGM Studios in Orlando, Florida. This show is a series of stunts, reenacted (supposedly) as they would have been done for the filming of one of the Indiana Jones motion pictures. The crew moves onto the next shot, and we see a fight scene which culminates in a machine gun battle and a flipping, exploding truck stunt. At the dramatic conclusion of the show, a nasty Nazi is chewed up in the prop of a plane, leaving only his clothes, and Indy and his woman narrowly escape a huge fuel explosion.

System Description

Figure 1 Simplified Indiana Jones Stunt Spectacular Block Diagram

This show is run 5 times a day with the aid of a synchronized, PLC-controlled, event-based system. The show has 4 human operators -- 1 Mechanics/Pyro, 2 Sound, and 1 Lighting -- who are cued by a Stage Manager. What makes this production different from more traditional theatrical productions (aside from the money spent) is that the mechanics, pyro and sound effects systems are all triggered by the Mechanics operator, or in some cases, by the actors themselves. While this is a fairly sophisticated system, the interconnection between the systems is accomplished through the use of a very simple method -- contact closures.

The Mechanics operator runs a large console full of controls and indicators which is connected to the inputs and outputs of an Allen-Bradley Model 525 Programmable Logic Controller. Each of the subsystems is run by a small, SLC series Allen-Bradley PLC, which is linked to the 525 by contact closures.

Operation

For each effect, the Mechanics operator presses and holds the appropriate "Arm" button. The program in the Master PLC then sends an Arm contact closure to the PLC running that effect's subsystem. The program in the sub PLC evaluates all the conditions of which it is in charge, and, if everything is ready, sends a "Ready" contact closure back to the Master Control PLC. A Ready light is then illuminated on the operator's console. The operator must continue to hold the Arm button and, on cue, press the "Go" button. The master PLC then sends a Go signal back to the sub PLC, and it triggers the valves, motors, etc. needed to make the effect happen.

In this way, at least three distinct events must occur before any effect is allowed to operate: 1) The Arm signal must be continuously sent by the operator, 2) The Ready signal must be returned by the sub PLC, and 3) The Go signal must be sent by the operator. Numerous safety features are also implemented in the system, but for legal reasons, Disney would not allow me to write about them.
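The three-condition interlock just described can be sketched as a single logic check. This is only an illustration of the principle (the names are mine, and the actual Disney implementation is PLC ladder logic with additional safety features not described here):

```python
def effect_may_fire(arm_held, sub_plc_ready, go_pressed):
    """An effect fires only when all three conditions hold at once:
    1) the operator is holding Arm,
    2) the sub-PLC has evaluated its conditions and reports Ready,
    3) the operator presses Go."""
    return arm_held and sub_plc_ready and go_pressed


print(effect_may_fire(True, True, False))   # Armed and Ready, but no Go yet
print(effect_may_fire(False, True, True))   # Arm released: Go is ignored
print(effect_may_fire(True, True, True))    # all three true: effect fires
```

Because Arm must be held continuously, releasing it at any moment vetoes the Go, which is the safety property the two-PLC handshake is built around.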

Other Control Inputs

For the Pungi Sticks system, the final event in the control sequence is the actor's depressing of a footswitch onstage. The Mechanics operator handles the Arm/Ready sequence, and the actor actually sends the Go signal by pressing the footswitch. The PLC then triggers the Pungi Stick subsystem, which operates the appropriate pungi stick, and the sound system, which plays the correct sound effect. In this manner, the actor is essentially operating the system, with authorization from the Mechanics operator. This approach allows very accurate timing of the effects.

Another actor-controlled effect is the machine gun in the fight sequence. The machine gun is actually a radio transmitter, and it transmits a signal to a radio receiver, which sends a contact closure to the Master PLC. If this signal is received along with the Mechanics operator's authorization, the Master PLC sends a contact closure to the Machine Gun subsystem PLC. This PLC triggers the mechanical bullet hit effects, and also triggers a sampled machine gun sound generator which is routed through the sound system. An identical control scheme was used several years earlier in the Miami Vice attraction at Universal Studios Tours in Hollywood.

Sound System

The heart of the sound system is a Richmond Sound Design COMMAND/CUE audio control system, which is based on an Amiga computer. The COMMAND/CUE system handles all audio routing and effect triggering. The Master PLC, via the contact closure, essentially presses the COMMAND/CUE's Go button. Based on a cue list entered into its software, the COMMAND/CUE then triggers specific audio samples stored in a custom ROM-based sample playback unit, starts tape decks, and does routing and gain manipulations.
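The Go-button pattern described above, a contact closure advancing a preset cue list, can be sketched generically. The COMMAND/CUE internals are proprietary, so everything below (class, cue names, actions) is invented purely to illustrate the idea:

```python
class CueList:
    """A minimal cue-list executor: each Go advances a pointer down
    a preset list and fires that cue's actions."""
    def __init__(self, cues):
        self.cues = cues        # list of (name, [actions]) pairs
        self.next = 0

    def go(self):
        """Called when the master PLC's contact closure arrives."""
        if self.next >= len(self.cues):
            return None         # end of list: closures are ignored
        name, actions = self.cues[self.next]
        self.next += 1
        for action in actions:
            print(f"{name}: {action}")   # stand-in for real hardware I/O
        return name


show = CueList([
    ("Q1", ["trigger sample 'boulder rumble'", "start tape deck 2"]),
    ("Q2", ["route mic 3 to mains", "raise gain on group A"]),
])
show.go()   # contact closure fires Q1
show.go()   # next closure fires Q2
```

The point of the pattern is that the PLC needs to know nothing about audio: it supplies one dumb trigger, and all the routing and playback intelligence lives in the cue list.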

The sound operators control routing and levels for the wireless mics used in the show, and use the COMMAND/CUE to manually trigger sampled sound effects during the fight sequences. If Indy punches someone in the face, one of the operators is simultaneously punching a button on a custom Richmond control panel, which causes the appropriate punch sound to be played back and correctly routed.

Reasons for Using Contact Closures

In this application, an extremely simple interconnection method is used by Disney to run a very complex system. While Disney is certainly capable of a more sophisticated and powerful interconnection method, they use contact closures because they are proven, reliable, and easy to maintain. Maintenance is an important concern for Disney's attractions, which are designed to run for many years. Neither complex test equipment nor highly trained electronic technicians are needed to maintain a contact closure based system.

SMPTE Time Code

In 1967, to facilitate electronic video tape editing, a company called EECO developed a method of encoding video frames with an absolute time and frame number. EECO's approach was based on a method used by NASA to synchronize telemetry tapes from its Apollo missions. The EECO code was very successful in the marketplace, and several other manufacturers developed their own, proprietary time coding methods. This situation became problematic, as equipment from one manufacturer would not work with equipment from another.

In 1969, the Society of Motion Picture and Television Engineers (SMPTE) formed a committee to formulate a standard video time coding scheme. The resulting standard was eventually adopted by the American National Standards Institute (ANSI) as a national standard. This standard, ANSI/SMPTE 12M-1986, is entitled "American National Standard For Television - Time and Control Code - Video and Audio Tape for 525-Line/60-Field Systems". Further definition for use with film is in SMPTE Recommended Practice RP 136-1986, "Time and Control Codes for 24, 25 or 30 Frame-Per-Second Motion-Picture Systems."2

2 Both standards are contained in the appendix.

Figure 2 Bi-Phase Representation of LTC Frame 16:47:31:23. From Audio Production Techniques for Video (See Bibliography).

Description

Because of its origin in video tape editing, the SMPTE Time Code (TC) "address" is broken down into Hours, Minutes, Seconds and Frames. A typical Time Code would be 16:47:31:23, or 16 Hours, 47 Minutes, 31 Seconds, and 23 frames. SMPTE is an absolute standard, which means that by reading a single frame of time code, one can determine where in time that frame exists. If a show is started at 00:00:00:00, and the 16:47:31:23 frame shown above is read, the system can derive that this frame belongs 16 hours, 47 minutes, 31 seconds and 23 frames from the beginning.

User Bits

To allow for flexibility in Time Code applications, SMPTE left 32 of the bits used to code a TC frame available for any kind of information a user might want to record. These are known as the "User Bits". In Recommended Practice RP 135-1986, SMPTE recommends a 6 bit character set for encoding characters in the user bits.

LTC

There are two kinds of Time Code -- Longitudinal Time Code (LTC) and Vertical Interval Time Code (VITC). Longitudinal Time Code is so named because it is recorded longitudinally in an audio track on video tape, not helically like the video signal. LTC is distributed and amplified using standard audio circuitry. Each LTC frame is composed of 80 bits, which are numbered 0-79. For a description of the function of each bit, see Figure 2 and Table I.

Table I Usage of LTC Bits

00-03  Units of Frames
04-07  First User Bit Group
08-09  Tens of Frames
10     Drop Frame Flag
11     Color Frame Flag
12-15  Second User Bit Group
16-19  Units of Seconds
20-23  Third User Bit Group
24-26  Tens of Seconds
27     Bi-phase mark phase correction bit
28-31  Fourth User Bit Group
32-35  Units of Minutes
36-39  Fifth User Bit Group
40-42  Tens of Minutes
43     Binary group flag bit
44-47  Sixth User Bit Group
48-51  Units of Hours
52-55  Seventh User Bit Group
56-57  Tens of Hours
58     Unassigned address bit (0 until assigned by SMPTE)
59     User Bit Group flag bit
60-63  Eighth User Bit Group
64-79  Synchronizing word:
       64-65 Fixed Zero
       66-77 Fixed One
       78    Fixed Zero
       79    Fixed One
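The bit assignments above can be exercised in code. The sketch below decodes only the time fields of one 80-bit frame, held as a list of 0/1 integers, and assumes (per the standard's transmission order) that the least significant bit of each BCD group comes first; the sync word, flags and user bits are ignored for brevity:

```python
def field(bits, start, length):
    """Decode a little-endian binary group into an integer."""
    return sum(b << i for i, b in enumerate(bits[start:start + length]))


def decode_ltc(bits):
    """Extract HH:MM:SS:FF from one 80-bit LTC frame per Table I."""
    frames  = field(bits, 0, 4)  + 10 * field(bits, 8, 2)
    seconds = field(bits, 16, 4) + 10 * field(bits, 24, 3)
    minutes = field(bits, 32, 4) + 10 * field(bits, 40, 3)
    hours   = field(bits, 48, 4) + 10 * field(bits, 56, 2)
    return hours, minutes, seconds, frames


# Build the 16:47:31:23 example frame and read it back.
bits = [0] * 80
for start, length, value in [(0, 4, 3), (8, 2, 2),    # 23 frames
                             (16, 4, 1), (24, 3, 3),  # 31 seconds
                             (32, 4, 7), (40, 3, 4),  # 47 minutes
                             (48, 4, 6), (56, 2, 1)]: # 16 hours
    for i in range(length):
        bits[start + i] = (value >> i) & 1

print(decode_ltc(bits))   # -> (16, 47, 31, 23)
```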

LTC is recorded using "bi-phase modulation" (see Figure 3). With this method, there is a transition from high to low or low to high at the start (and, because bits are trailed together, at the end) of every bit. A bit with no internal transitions is a 0, and a transition within a bit denotes a 1. This bi-phase coding scheme also means that the polarity of the LTC is irrelevant.

Figure 3 Bi-Phase Modulation
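The bi-phase mark scheme just described can be sketched in a few lines. In this illustration (my own simplified model, not a complete LTC generator), each bit is rendered as two half-bit signal levels: the level always flips at a bit boundary, and flips again mid-bit only for a 1.

```python
def biphase_encode(bits, level=0):
    """Bi-phase mark coding: one transition at every bit boundary,
    plus a mid-bit transition for each 1. Returns half-bit levels."""
    halves = []
    for b in bits:
        level ^= 1              # transition at the start of every bit
        halves.append(level)
        if b:
            level ^= 1          # extra mid-bit transition marks a 1
        halves.append(level)
    return halves


print(biphase_encode([0, 1, 1, 0]))   # -> [1, 1, 0, 1, 0, 1, 0, 0]

# Starting from the opposite level yields the exact complement, which is
# why the polarity of an LTC signal is irrelevant: only transitions matter.
print(biphase_encode([0, 1, 1, 0], level=1))
```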

Because bit timing in the reading of LTC is derived from the actual code read, and not by comparison to an external clock, LTC can be decoded in either direction and at speeds well above (100X) or below (1/50) its normal speed.

At the end of each LTC frame, there is a 16 bit "Sync Word", which is a distinct series of bits that cannot be repeated anywhere else in the Time Code word with any combination of time or user data. When the decoding system encounters a Sync Word, it knows that this point in the code is the end of one frame and the start of another. The Sync Word is also used to derive tape playback direction.

LTC has several disadvantages, however. If a tape containing LTC is moving extremely slowly or is stopped, the signal can be misinterpreted or not read, because the transitions in the signal carry the information. Since LTC is an analog audio signal, at high tape transport rates the frequency can exceed the frequency response of standard audio circuitry and become distorted. In addition, LTC generally becomes so distorted and noisy after 2 generations that it must be "jam-synced", which is a process where TC is read, regenerated, and re-recorded. LTC is generally recorded at a high level in an audio track, and because it is a square wave, cross talk with adjacent tracks or even low-level signals in adjacent cables is possible. Many of these problems are overcome with VITC.

VITC

VITC was developed in the late 70's after video tape recorders were developed which were capable of high quality still images. LTC, as mentioned above, cannot be read at extremely slow speeds. In VITC, the digital time coding information is inserted in the "vertical blanking interval" of the video frame itself. The vertical blanking interval is the time during which the electron beam is "blanked" as it rasters back to the top left-hand corner of a CRT after scanning a video field. Because VITC is part of the video signal itself, it can be read and decoded even in extreme slow motion or still-frame applications. Additionally, VITC does not need one of the precious audio tracks on the video tape as does LTC.

VITC uses 90 bits to encode the TC instead of LTC's 80, and most of these additional bits are used for Cyclic Redundancy Checking (a method of error detection). The VITC code is recorded in two non-adjacent video lines in the blanking interval, and is recorded once for each of a frame's two fields. Therefore, 4 copies of the VITC code exist in each frame, and this redundancy in conjunction with the CRC check means that VITC is much less subject to dropouts and misinterpretation than is LTC.

Because VITC is encoded within the video signal itself, it is used only in video applications. VITC must be recorded with the video signal, and therefore it cannot be changed without re-recording the video signal. Since video equipment is needed to read VITC, implementations of LTC are generally less expensive for non-video applications. None of the time-based example systems discussed in this thesis involved video, so they all used LTC.

Types of Time Code

Since different media and countries have different framing rates, there are different types of time code. Film is shot and projected at 24 frames per second (fps), so there is one type of TC for film. American video uses the 60 Hz AC line as the basis for its framing rate, and because it is interlaced, monochrome video's framing rate is 30fps. This is another type of TC, known as 30fps non-drop (we'll see why shortly).

Drop Frame Time Code

National Television Standards Committee (NTSC) color video, for reasons that are outside the scope of this thesis, has an approximate framing rate of 29.97fps. If regular 30fps TC is used to count video frames which run at a rate of 29.97fps, the time code second will fall behind an actual second by .03 frames every second. A cumulative error of 108 frames, or 3.6 seconds, per hour of running time is the result. This error may be a bit difficult to comprehend, but the key to understanding it lies in the fact that the standard defines a "color second" as ". . . the time elapsed during the scanning of [30 frames] in a color television system at a [framing] rate of approximately [29.97 frames] per second."3 There is no external clock against which to compare the frame count, because the TC is a clock. So, if we're using 30 non-drop to count NTSC 29.97fps frames, it will take about 1.001 actual seconds to reach a count of 30, or one time code second.

3 Society of Motion Picture and Television Engineers, ANSI/SMPTE 12M - 1986, White Plains, NY, 1986.

While an error of about .001 real seconds per Time Code second doesn't seem like much in theatrical terms, it accumulates to 3.6 seconds per hour, or 1 minute and 26.4 seconds per day, so it is critical in video applications. For this reason, another type of code, called 30 Drop Frame, was developed. Drop Frame TC is so named because it drops the extra 108 frame counts which would accumulate over the course of an hour. Since it is impossible to drop fractional frames, the first two frame numbers of every minute, with the exception of minutes 0, 10, 20, 30, 40, and 50, are dropped. (60 minutes - 6 minutes not dropped = 54 minutes; 54 minutes X 2 frames dropped/minute = 108 frames)
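The frame-dropping arithmetic above can be verified in a few lines. This Python sketch is my own illustration, not part of any standard; it counts how many frame numbers are skipped over a given number of minutes:

```python
def dropped_frames(minutes):
    """Count frame numbers skipped by 30 Drop Frame TC: the first two
    frame counts of every minute are dropped, except in minutes
    divisible by 10 (0, 10, 20, 30, 40, 50)."""
    return 2 * sum(1 for m in range(minutes) if m % 10 != 0)

# 54 dropping minutes per hour x 2 frames = 108 frames, or 3.6 seconds
print(dropped_frames(60))
```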

European Time Code

The video system in Europe is far more logical than the American system, and is based on the 50 Hz AC power line frequency. Color and monochrome video in Europe both run at exactly 25fps. For this reason, the European Broadcasting Union (EBU) has adopted a more or less identical Time Code (they have no need for a drop-frame version), with a framing rate of 25fps. Because of the parallel standards, Time Code is sometimes referred to as SMPTE/EBU time code.

Standard Uses of Time Code

SMPTE/EBU Time Code is used in a variety of ways in film, video and audio production and post-production. These methods deserve some explanation here, because they are similar in concept to those used for system synchronization for live performance.

The most basic use of TC is to lock one kind of tape deck to another. For instance, one might want to lock a multi-track Audio Tape Recorder (ATR) to a Video Tape Recorder (VTR). A TC synchronizer is used to accomplish this task. The video tape is coded with VITC, and the audio tape with LTC. The VITC from the VTR is sent to an input on the synchronizer. The synchronizer is interfaced to control the speed of the transport of the ATR. The LTC from the ATR is routed to another input on the synchronizer. Then, any time the VTR is rolled, the synchronizer compares the TC from the ATR and the VTR. The synchronizer allows the ATR to "chase" the VTR frame for frame, and the audio tracks can then be repeatedly used in perfect sync with the video image.

Figure 4 Simple SMPTE TC Synchronization Block Diagram

Since there is currently no convenient, high-quality method to record sound on film, film production has traditionally used a double system -- the images are recorded on film, and the sounds are recorded on an ATR, usually one made by the Swiss company Nagra. A slate, on which is written shooting information such as Scene and Take, is used to sync the two systems. When shooting, the film camera and Nagra are both rolled, and once they are both at speed, the clapper on the slate is closed to make the familiar clap sound. This provides the editors with an image of the clapper closing on film, and the sound of the clapper closing on the tape. The film and tape are then mechanically synched and run together.

With time code, this process is greatly simplified. If the camera is capable of generating TC (some even burn it optically on the film outside the frame's perforations), it is sent to a special Nagra which is capable of recording LTC on a small center track. This system provides the editors with time code for each medium, and they can synchronize the film and tape electronically as in the above video application.

The inverse can also be done. The time code Nagra is capable of simultaneously generating and recording time code, and this can be sent directly to the camera or to a slate with an LED readout which displays the TC, which is filmed for a few seconds when the camera is rolled. This approach gives the editors a visual image of the time code on film, and the LTC on the audio tape. In either of these examples, the script supervisor or sound mixer could record Scene and Take numbers or other information in the TC user bits.

Another similar application occurs during the production of a rock video. The music for the video is recorded on 1/4" audio tape with a center stripe of time code. When the tape is played back so that the musicians can lip sync, the time code is also played back. This time code is then sent either directly to the camera, or to a Time Code display slate.

Pink Floyd and SMPTE Time Code

In the darkness, we hear the sound of a cash register opening and change being scooped out. The large circular screen upstage springs to life with a rapid-fire montage of coins being counted and processed, perfectly in sync with the rhythmic, stereo, money sounds the crowd had heard so many times before. The bass player starts in, and soon is joined by the rest of the band in "Money". The familiar sound effects continue throughout the song, staying precisely in beat with the band onstage.

Later, we see an animated clock flipping over and over on the screen. The clock chimes and alarms of "Time" come over the sound system. The band starts in, and again plays in perfect sync with the images.

Was it practice, talent or luck that made all this possible? Well, actually a little of each was necessary, but this synchronization was primarily accomplished through the use of a SMPTE Time Code-based system. The rock group Pink Floyd, in their 1987 Momentary Lapse of Reason tour, linked 11 musicians, high quality 8-track audio playback and large-scale 35mm film projection to achieve some spectacular effects. With the system described below, the musicians could add the impact of film and sound effects to their performance, and, with some limitations, still maintain the ability to improvise. Pink Floyd was not the first to use SMPTE time code to run such a system, but is probably the most prominent group to have done so in the concert field at the time of this writing.

System Description

In this system, SMPTE is fed to an automated film projection system from one track of an 8-track open reel tape deck. This deck is rolled on a specific lyric or other cue by a sound engineer at the front-of-house mixing position. A click track4 was recorded on another track of the tape deck and fed through the musicians' onstage monitoring system. The remaining 6 tracks of the tape deck were used for sound effects and film audio. These sounds were routed to the multi-channel house audio system, which had speaker positions surrounding the audience in addition to the usual positions near the stage.

Figure 5 Pink Floyd Momentary Lapse of Reason Block Diagram

4 A click track is an audio track used to signal a tempo to musicians and let them know when to start playing in order to be in sync with the other synchronized elements. Sometimes it is several measures of a drum pattern, other times it is a vocal count. It is usually recorded on or triggered by the master source, and is sent to the musicians through stage monitors or headphones.

Associates and Ferren Automated Film Projection System

To get a sense of the way this system worked, one must understand the workings of the automated film projection system used on this tour. So, I'll explain the Associates and Ferren synchronized 35mm film projection system, which was developed for tours by Roger Waters (formerly a member of Pink Floyd), refined for a Coca-Cola industrial, and then used on this tour.

Standard 35mm motion picture projectors with high power (7 kW) xenon lamphouses are highly modified to create the basis for the film system. Three functions of the projector are automated and are therefore available for remote or computer control. These are the variable speed motor which drives the film through the projector; the dowser, a heavy, metal piece which acts as a crude dimmer by blocking the light from the lamp; and the change-over shutter (not the regular timing shutter used to block the light during film frame advances), a lightweight metal piece which also blocks the light but is capable of a quick blackout or opening.

This shutter cannot be left closed by itself for very long or the light will burn through it.

These functions are electrically controlled and are connected to a Mitsubishi F1 Series Programmable Logic Controller. A local control panel connected to the PLC enables manual operation of all the functions. The small PLC's programming language and input structure really aren't powerful enough to handle Time Code or offer a sophisticated user interface, so the TC is sent to an IBM PC interfaced to the PLC. In this configuration, the PC is essentially pushing the buttons on the projector's front panel. Focus, framing, and similar functions are still controlled manually.

In its idle state, the 35mm projector has its lamp on, motor off, and dowser and shutter closed. A standard series of cues for the projector is to start the motor, and, once the film is up to the proper speed (so that it won't be melted), open the dowser and then the shutter. The reverse sequence is used to shut down the projector after a film segment.

Custom A&F software is used in the PC to lock the film frame for frame to the incoming TC, and trigger dowser and shutter openings or closings on specific TC frames. Since, at present, there is no satisfactory way of directly coding 35mm film with TC, the software in the PC maintains an internal frame count based on a relative one pulse per frame signal from the projector. This relative count is then compared with the absolute TC. The system continually adjusts the motor speed, using a software-emulated servo system to lock the film to the Time Code based on a cue list constructed for each show. If the Time Code is ahead of the local frame count, the motor is sped up. If the TC is behind the film, the motor is slowed down.
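The compare-and-adjust logic can be sketched as a simple proportional control loop. This is my own illustration of the idea, not the actual A&F code; the function name, gain, and nominal speed are invented for the example:

```python
def adjust_motor_speed(tc_frame, film_frame, nominal_fps=24.0, gain=0.1):
    """Hypothetical sketch of a software servo: compare the absolute TC
    frame count with the projector's relative frame count, and nudge
    the motor speed in proportion to the error."""
    error = tc_frame - film_frame
    return nominal_fps + gain * error

# TC ahead of the film -> run faster than nominal to catch up
print(adjust_motor_speed(1005, 1000))
# TC behind the film -> run slower than nominal
print(adjust_motor_speed(1000, 1003))
```

A real implementation would also clamp the speed and smooth the error term, but the principle is the same: the film "chases" the Time Code.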

Once the PC is programmed with the show's cues, all the operator has to do (barring any problems) is thread the projector at the beginning of the night, align the film at its starting frame, and monitor the performance. The rest is handled by the computer based on the TC.

Time Code Scheme

To keep things logical in the control scheme used on this tour, each segment of film was given its own TC hour. Segment 1 started at 01:00:00:00, segment 2 at 02:00:00:00, etc. Some tape with time code was left at the beginning of each segment to ensure that there would be adequate code if the beginning of the tape broke or stretched. This also proved helpful whenever problems developed during a performance, as the beginning of the tape could be rolled to re-establish sync without getting into the actual segment.

In the "Time" example explained above, the projector cue sequence would be as follows (TC numbers for example purposes only, not actual or necessarily realistic):

--:--:--:--  The sound engineer starts the tape deck, feeding TC to the projector.
04:00:05:00  Motor ON - The projector accelerates to speed.
04:00:15:00  Dowser OPEN
04:00:17:00  Shutter OPEN
04:00:38:00  Chime and alarm clock sounds start.
04:00:45:00  Vocal count fed to band - "ONE two three four, TWO two three four, THREE two three four, FOUR"
04:06:16:00  Shutter CLOSE, Dowser CLOSE, Motor creeps to next frame count and stops.

The system then waits for TC frame 05:00:05:00.

Did the Use of Time Code Limit the Performance?

According to Steve Bartlett, the tour projectionist, the band didn't feel restricted by the use of time code. He said they felt that any limitations placed on them were worth it to get the desired effect. Additionally, for parts of the show, there was no film or TC, so they were free to play whatever they wanted. There were also numbers like "Money", explained at the start of this section, in which the film only rolled at the beginning of the song, so that the band was free to improvise at the end.

Expansions of the System

This system worked very well, but was brought in late in the production process, so there wasn't time to run other production elements from the TC. If there are to be future tours, Mr. Bartlett says that he would push to have the TC master the LASER system and possibly other elements such as the automated lighting. A minor difference in such a system would be that a TC source other than the audio deck would be needed, as there might be parts of the show with no audio playback. This would simply mean that the audio deck would be slaved to the new master.

Problems and Backup

Of course, when touring around the world with a system like this, problems are bound to occur. Mr. Bartlett said that most of the problems he encountered were caused by ground loops or other types of interference which corrupted the TC received at the projector. This is possible since the TC is simply treated as an analog audio signal until decoded by the computer. So, manual override and re-synchronization were sometimes necessary. If other problems had developed, a backup computer and interface were on line and receiving TC at all times during performance. Mr. Bartlett also enhanced the capabilities of the A&F operating program while on the road, adding several features which helped in running the show. He added the ability to add to or drop a few frames from the running count, allowing small synchronization adjustments to be made. He also modified the program to allow the projector to be started and stopped remotely. This was necessary because the control computer was run from the floor, and the projector itself was hoisted up to center screen height.

Previous Sync Attempts

The system described here was not the first attempt at a Time Code based motion picture projection control system for the concert industry. Associates and Ferren had previously tried a system on Roger Waters' 1985 Pros and Cons of Hitchhiking tour, where three 35mm film projectors were linked with each other and a multi-track tape deck. A relative encoder was placed on the master projector, and its signal was sent to a dubber5. The tape deck at the console was then synced to the dubber via time code. Unfortunately, the Time Code system worked too well, and the tape deck faithfully reproduced all the wow and flutter inherent in the projector/dubber link. A standard projection system is so low in fidelity that wow and flutter is not usually heard, but a high quality system reveals all of the defects. Efforts made to "smooth out" the time code were unsuccessful, so the system was synched manually for the tour. The system was eventually refined into the configuration discussed above and used successfully on Roger Waters' 1987 Radio KAOS tour. Pink Floyd, in the early stages of the Momentary Lapse of Reason tour, also tried a dubber sync based system, but it failed for the same reasons. They then acquired the system developed for Roger Waters.

5 There are standard systems to sync audio to a film projection system. These are "Electronic Interlock Magnetic Film Recorders and Reproducers" or "Dubbers". Dubbers record and play audio on magnetic 35mm film. This format is used to facilitate the editing process. Although TC based models are available, the standard units use a relative sync scheme, and are not known for their ability to lock in sync quickly.

MIDI

In the early 80's, musicians wanted the ability to link synthesizers to simplify complex studio and live performance setups. Several manufacturers developed proprietary interfaces to link their synthesizers, but these systems would not work with equipment from other manufacturers. In 1981, engineers at Sequential Circuits, a synthesizer manufacturer, began talks with their counterparts at Roland and Yamaha with the purpose of standardizing inter-synthesizer connections. These talks resulted in the Musical Instrument Digital Interface (MIDI) 1.0 Specification, promulgated in 1983, and in the formation of the MIDI Manufacturers Association (MMA), which maintains the standard. MIDI allows musical instruments to connect with, control, and be controlled by other instruments made by any manufacturer who complies with the standard. Even though MIDI was originally designed for synthesizer interconnection, its existence has spawned many new types of MIDI-controllable products including mixers, equalizers, processors, and even theatrical lighting consoles. A great deal of interest has recently been expressed in the use of MIDI as a theatrical control standard, and, as of this writing, work is under way in a subgroup of the MMA and in the US Institute for Theatre Technology (USITT) to create a standard way of using MIDI for this purpose.

Description

In its simplest configuration, MIDI allows instruments to be connected in a master/slave relationship. If the MIDI Out from Synthesizer 1 is connected to Synth 2's MIDI In, pressing the G3 key on Synth 1 simultaneously creates the electronic equivalent of a G3 keypress on Synth 2. Synthesizer sound program changes and other commands are transmitted in a similar fashion. If program number 1 is selected on Synth 1, program 1 is also selected on Synth 2.

Figure 6 Simple MIDI System Block Diagram

Differences Between SMPTE TC and MIDI

MIDI differs from SMPTE Time Code in a number of ways. TC is an absolute standard, and MIDI is essentially relative, because the data it deals with is not intrinsically time-based (e.g. a "Note C2 On" message gives no information about where in a musical composition that note belongs). TC is transmitted in an analog form and converted to a digital format in the receiving unit; MIDI is digital from start to finish. While SMPTE is essentially a software specification (it defines only the signal, not the hardware used to transmit the signal), MIDI specifies software and hardware.

MIDI Hardware

The MIDI interface is a 31.25 KiloBaud, asynchronous, opto-isolated serial data link, similar to those used by personal computers. Since MIDI is unidirectional, a MIDI instrument usually has a Receiver (MIDI In), a Transmitter (MIDI Out), and sometimes a MIDI Thru, which is an opto-isolated copy of the MIDI In. The interface circuit is relatively simple (See Figure 7) and the transmission of data is accomplished through the use of a Universal Asynchronous Receiver-Transmitter (UART). The circuit is a current-loop type, and a current on state is logical 0. Since the system was designed primarily for low-cost synthesizer interconnection, the maximum standard cable length is 50 feet.

Figure 7 MIDI Interface Schematic From the MIDI Specification. (See Bibliography)

MIDI Messages

Table II MIDI Channel Messages

Function               Status Byte   Data Bytes           Where
----------------------------------------------------------------------------
Channel Voice Messages
Note Off               1000cccc      0nnnnnnn 0vvvvvvv    cccc = Channel Number, 0-F or 1-16
                                                          nnnnnnn = Note Number, 00-7F or 0-127
                                                          vvvvvvv = Velocity, 00-7F or 0-127
Note On                1001cccc      0nnnnnnn 0vvvvvvv
Poly Key Pressure      1010cccc      0nnnnnnn 0vvvvvvv
Control Change         1011cccc      0xxxxxxx 0yyyyyyy    xxxxxxx = Control Number, 00-79h or 0-121
                                                          yyyyyyy = Control Value
Program Change         1100cccc      0ppppppp             ppppppp = Program Number, 00-7F or 0-127
Channel Pressure       1101cccc      0xxxxxxx             xxxxxxx = Pressure Value
Pitch Wheel            1110cccc      0lllllll 0mmmmmmm    lllllll = Least Significant Pitch Change Byte
                                                          mmmmmmm = Most Significant Pitch Change Byte
Channel Mode Messages
Local Control Off      1011cccc      01111010 00000000
Local Control On       1011cccc      01111010 01111111
All Notes Off          1011cccc      01111011 00000000
Omni Mode Off          1011cccc      01111100 00000000
Omni Mode On           1011cccc      01111101 00000000
Mono Mode On           1011cccc      01111110 0nnnnnnn    nnnnnnn = Number of Channels
Poly Mode On           1011cccc      01111111 00000000

MIDI's data is transmitted in a 10 bit word, which is composed of a start bit, an 8 bit data byte, and a stop bit. MIDI breaks musical data down into "messages". A message is composed of a status byte followed by zero or more data bytes. A status byte always has its Most Significant Bit (MSB) set to 1 (i.e. 1xxxxxxx) and a data byte has its MSB "reset" or set to 0 (i.e. 0xxxxxxx).

When a key is pressed on a MIDI keyboard, a message such as 90 24 3F (hex) is generated. In the first, or status, byte, the 9 means "Note On", and the 0 means channel 1. The second byte means note C2. The third byte means velocity6, and 3F is medium velocity. When the aforementioned key is released, an 80 24 3F (hex) message is generated, which is a Channel 1 Note Off C2 at medium velocity.
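The decoding of such a message can be sketched in a few lines of Python (the function and names are my own illustration; the byte values follow the MIDI 1.0 Specification):

```python
def parse_channel_message(msg):
    """Decode a MIDI Channel Voice message from a list of byte values.

    The status byte's high nibble names the message type, and its low
    nibble gives the channel (0-F internally, displayed as 1-16).
    Data bytes always have their Most Significant Bit clear.
    """
    status, *data = msg
    assert status & 0x80, "a status byte must have its MSB set"
    types = {0x8: "Note Off", 0x9: "Note On", 0xA: "Poly Key Pressure",
             0xB: "Control Change", 0xC: "Program Change",
             0xD: "Channel Pressure", 0xE: "Pitch Wheel"}
    return types[status >> 4], (status & 0x0F) + 1, data

# The example from the text: 90 24 3F hex is a Note On, channel 1,
# note number 24 hex (C2), velocity 3F hex (medium).
print(parse_channel_message([0x90, 0x24, 0x3F]))
```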

Channel Messages

There are two primary types of MIDI messages: Channel and System. Channel messages are used to address units set to receive a specific channel. These messages can address 16 discrete channels, and 4 bits of the status byte are used to denote for which channel a message is intended. Within the Channel classification, there are Voice messages, which control the instrument's voices, and Mode messages, which determine how the synthesizer will deal with the Voice messages. (See Table II for details.)

6 Velocity is used to recreate key presses of differing intensity. This is desirable because striking a piano key hard creates a different sound than striking one lightly. Some MIDI synthesizers can measure the velocity of the key press, and recreate different sounds for different velocities.

System Messages

Table III MIDI System Messages

Function               Status Byte   Data Bytes           Where
----------------------------------------------------------------------------
System Common Messages
Song Position Pointer  11110010      0lllllll 0mmmmmmm    lllllll = Least Significant Byte
                                                          mmmmmmm = Most Significant Byte
Song Select            11110011      0sssssss             sssssss = Song Number
Tune Request           11110110

System Real Time Messages (No Data Bytes)
Timing Clock           11111000
Start                  11111010
Continue               11111011
Stop                   11111100
Active Sensing         11111110
System Reset           11111111

System Exclusive Messages
System Exclusive       11110000      0iiiiiii             iiiiiii = Manufacturer's ID Number
                                     0xxxxxxx ...         xxxxxxx = Manufacturer's Data, Any Number of Bytes
EOX                    11110111

There are three types of system messages (See Table III). The first is the System Common message, which is to be read by all units connected to the system. The second type of system message is the Real-Time message. These messages contain no data bytes, and are used for timing and other important functions within MIDI-synced systems.

The third type of system message is the System Exclusive message, which allows manufacturers to send any kind of data over the MIDI line. The MIDI 1.0 spec set aside two status bytes, System Exclusive (SysEx) and End of System Exclusive (EOX), for this purpose. The SysEx byte is sent first, followed by a Manufacturer's ID number, which is assigned by the MMA. After the ID, the Manufacturer sends data according to a standard which they must publish, and then an EOX message is sent to return the system to normal operation. The MMA will assign any company a SysEx ID code, and if the manufacturer has used the code in a product and published all the technical details of their approach within a year, the code is assigned permanently.
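The framing of a System Exclusive message can be illustrated with a short sketch (my own, not from the specification; 41 hex happens to be Roland's published ID, used here only as an example, and the payload bytes are arbitrary):

```python
SYSEX, EOX = 0xF0, 0xF7

def make_sysex(manufacturer_id, payload):
    """Frame manufacturer-specific data as a System Exclusive message:
    the SysEx status byte, the MMA-assigned ID, the data bytes (each
    with MSB clear), and finally EOX."""
    assert all(b < 0x80 for b in [manufacturer_id, *payload]), \
        "ID and data bytes must have their MSB clear"
    return [SYSEX, manufacturer_id, *payload, EOX]

print(make_sysex(0x41, [0x10, 0x42]))
```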

MIDI Sync

As mentioned earlier, a MIDI Note On message contains no information about where in a musical composition that note belongs. A system based on "Song Position Pointers" was implemented in the original MIDI spec to communicate this information for sequencing and tape-sync applications. The Song Position Pointer (SPP) is a count of the number of 16th notes which have elapsed since the beginning of the song, or Start of the tape deck. If the MIDI system is locked to a tape deck, and the tape deck is fast forwarded, the system sends SPP messages only before and after shuttling. When the tape is playing, the sync device maintains its own SPP count, which is incremented every 6 "Timing Clock" messages (See Table III), which are sent out at a rate of 24 per Quarter Note. The Timing Clock is based on tempo, not real time, so its frequency varies with the tempo of the song.
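The counting scheme can be sketched as follows (a Python illustration of my own; the class and method names are invented, but the 6-clocks-per-16th-note and 24-clocks-per-quarter-note relationships come from the MIDI spec):

```python
class MidiSyncFollower:
    """Minimal sketch of a MIDI Sync slave: Song Position is counted
    in 16th notes, advancing one position for every 6 Timing Clock
    messages, of which 24 are sent per Quarter Note."""
    def __init__(self):
        self.clocks = 0

    def on_timing_clock(self):
        self.clocks += 1

    @property
    def song_position(self):
        return self.clocks // 6   # elapsed 16th notes

follower = MidiSyncFollower()
for _ in range(24):               # one quarter note's worth of clocks...
    follower.on_timing_clock()
print(follower.song_position)     # ...advances four 16th notes
```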

To enable this SPP system to be locked to SMPTE Time Code, SMPTE-to-MIDI converters were developed. These devices convert the incoming TC into MIDI Timing Clock and Song Position Pointer messages, based on a tempo map which is programmed by the user into the converter. The problem with this method is that when even simple real-time changes are made, it is extremely difficult to reprogram the tempo map because of all the real-time/tempo conversions that must be done.

MIDI Time Code

To facilitate the SMPTE/MIDI connection, MIDI Time Code (MTC) was developed in 1986 by Evan Brooks and Chris Meyer, of the synthesizer companies Digidesign and Sequential Circuits, respectively. MTC breaks the analog SMPTE TC frames into MIDI messages, and digitally transmits them directly down the MIDI line.

MTC Messages

MIDI Time Code is transmitted in two formats: Full Messages and Quarter Frame Messages. Full MTC Messages play basically the same role in MTC as Song Position Pointers do in MIDI Sync, and are used to transmit the complete TC frame number each time the system starts or stops.

Table IV MIDI Time Code Messages

Function                     Status Byte   Data Byte   Where
----------------------------------------------------------------------------
Full Frame Messages
Real Time Universal System
  Exclusive Header           11110000
Intended for Entire System                 01111111
Long Form Time Code ID                     00000001
Hours and TC Type                          0tthhhhh    hhhhh = Hours 0-23 (Decimal)
                                                       tt = TC Type:
                                                         00 = 24 FPS
                                                         01 = 25 FPS
                                                         10 = 30 Drop Frame
                                                         11 = 30 Non-Drop Frame
Minutes                                    00mmmmmm    mmmmmm = Minutes 0-59 (Decimal)
Seconds                                    00ssssss    ssssss = Seconds 0-59 (Decimal)
Frames                                     000fffff    fffff = Frames 0-29 (Decimal)
EOX                          11110111

Quarter Frame Messages
Frame Count LS Nibble        11110001      0000dddd    d = TC data
Frame Count MS Nibble        11110001      0001uuud    u = Undefined and reserved
Seconds LS Nibble            11110001      0010dddd        for future use; set to 0
Seconds MS Nibble            11110001      0011uudd
Minutes LS Nibble            11110001      0100dddd
Minutes MS Nibble            11110001      0101uudd
Hours LS Nibble              11110001      0110dddd
Hours MS Nibble/TC Type      11110001      0111uttd    tt = TC Type (See Above)

When the system is running synchronously (for example, when it is synched to a rolling tape deck), Quarter Frame messages are sent. Quarter Frame messages, as the name implies, are sent out at the rate of 4 per TC Frame. Eight Quarter Frame messages are used to communicate one TC Frame, so the Time Code count can only be fully updated every other frame. However, the system can still be synched 4 times per TC frame, through the use of the Quarter Frame message timing. A previously undefined Status Byte, 11110001 (F1 hex), is used as the MTC Quarter Frame System Common byte. After this MTC status byte is sent, a data byte is sent, whose high nibble is the message number and whose low nibble carries one of the TC digits or other data (see Table IV).
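The eight-message breakdown can be sketched as follows (a Python illustration of my own, with nibble layouts per the MTC format; the reserved bits are set to 0, and tc_type 3 is 30 fps non-drop):

```python
def mtc_quarter_frames(hours, minutes, seconds, frames, tc_type=3):
    """Break one TC frame number into the eight MTC Quarter Frame
    messages (status byte F1 hex). Each data byte's high nibble is
    the message number, 0-7; the low nibble carries a piece of the
    time. Message 7 also carries the TC type in bits 2-1."""
    pieces = [frames & 0xF,  frames >> 4,
              seconds & 0xF, seconds >> 4,
              minutes & 0xF, minutes >> 4,
              hours & 0xF,   (hours >> 4) | (tc_type << 1)]
    return [(0xF1, (n << 4) | piece) for n, piece in enumerate(pieces)]

# One TC frame (01:00:05:00) spreads across eight messages:
for status, data in mtc_quarter_frames(1, 0, 5, 0):
    print(f"{status:02X} {data:02X}")
```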

SMPTE User Bits and Set Up Messages

The MTC standard provides a method for encoding SMPTE TC user bits in the MTC, and also for transmitting "Set-Up" messages. Set-Up messages are used to transmit cuing information to various units in the system. A typical Set-Up message might transmit to an ATR, "Punch In Track #8 at 01:23:58:29". The ATR controller would then store this information and execute it when the appropriate Time Code was received. In this way, a single master control computer can be used to edit cue lists, transmit cue data, automate an entire system and lock it to TC.

Siegfried and Roy and MIDI Control

Siegfried and Roy's show at the Mirage Hotel in Las Vegas is one of the most elaborate stage productions ever done, and it has an incredibly sophisticated sound system. This system, like the one at Disney's Epic Theatre, is based around a Richmond COMMAND/CUE sound control computer, but interconnections here are done with MIDI.

System Description

Sound is used in a number of ways in the show. There are sound effects, sections of underscore music, live microphones and sync sounds for a film segment (and, therefore, a film projector).

Figure 8 Simplified Siegfried and Roy System Block Diagram

All of the pre-recorded sounds for the show are stored digitally, and most are stored on a New England Digital (NED) 8-Track film audio post-production system, which stores audio on hard disks. The show has 100 minutes of sound on 8 tracks, for a total of 800 minutes. Sixteen hard disks with a combined capacity of 5.5 gigabytes are used for storage. The control structure of the NED system doesn't allow a cue to be triggered while another cue is already running, so two stereo Akai S-1000 samplers with hard disks are also used for sound playback. This configuration gives the system three independently controllable digital sound sources (really 16 -- 8 channels from the NED, 4 from each of the Akais).

All of the audio is routed and controlled by the COMMAND/CUE. In addition, the COMMAND/CUE controls a "Spatial Sound Processor", reverb units and other processing gear. Since up to three different devices could possibly send MIDI to the samplers simultaneously (see Figure 8), a MIDI Switcher is used. This is a device which routes and merges MIDI messages together, avoiding any potential data collisions.

Operation and Backup

Two operators are needed to run the show. One runs the NED and the keyboard, the other runs the Richmond and backup systems. All music and sound effects are backed up on CD's, and if any part of the system goes down, the system can be run completely manually from the CD's and a Harrison mixer.

Master Control

The system doesn't have just one possible master control; it has five. The NED, the COMMAND/CUE and a Macintosh running "Cue Sheet" sequencing software are all capable of generating or reading SMPTE or MIDI; the film system generates Time Code, and the keyboard generates MIDI.

Jonathan Deans, the show's sound designer, broke the 1½ hour show down into roughly 200 "segments". For any given segment, one or more of the above systems can act as a master. During some parts of the show, the COMMAND/CUE or the NED is used as a master, with the operator pressing the appropriate Go button. Additionally, during many of the segments, the keyboard is used to trigger sound effects from the samplers directly.

When the film system acts as the master, it sends TC to the NED and to the Macintosh. The NED then locks itself to the film, and the Cue Sheet software running on the Mac handles triggering for the rest of the system through the COMMAND/CUE. The software in the Mac sends Song Position Pointer messages to the COMMAND/CUE, and this approach allows specific cues to be triggered via MIDI Sync.
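The Song Position Pointer message mentioned above has a simple, fixed format that can be sketched briefly. This is a Python illustration of the message bytes themselves, not the actual Cue Sheet software: status byte F2 followed by a 14-bit position, counted in MIDI beats (sixteenth notes), split into two 7-bit data bytes.

```python
# Sketch of a MIDI Song Position Pointer message (status 0xF2, then the
# 14-bit beat count as two 7-bit bytes, least significant byte first).
# The helper name is hypothetical; only the byte layout comes from the spec.

def song_position_pointer(midi_beats):
    """Return the 3-byte Song Position Pointer message for a beat count."""
    if not 0 <= midi_beats <= 0x3FFF:
        raise ValueError("position must fit in 14 bits")
    lsb = midi_beats & 0x7F          # low 7 bits
    msb = (midi_beats >> 7) & 0x7F   # high 7 bits
    return bytes([0xF2, lsb, msb])

# Position a slaved device at MIDI beat 200, then send Continue (0xFB):
print(song_position_pointer(200).hex())  # "f24801"
```

A sequencer would follow this message with a Continue real-time message so the receiver starts running from the given position.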

Advantages of the System

The system proved to be extremely powerful and flexible, and has run without major failures since the show opened. All of the music for the show was composed in the theatre using a NED Synclavier system, and its use, in conjunction with the tapeless nature of the rest of the system, allowed changes to be implemented very quickly during the rehearsal process. Mr. Deans said that at the beginning of the technical rehearsal process, music changes would be given at least a day in advance of when they were desired. The system allowed changes to be made so quickly that by the end of the tech process, the director would become impatient if the changes took more than about 5 minutes.

Possible MIDI Expansions for Theatrical Control

As of this writing, there is a great deal of work happening to arrive at some sort of standard way of using MIDI for Theatrical Control. Work is currently under way within both the Sound and Lighting commissions of USITT to develop a standard MIDI control scheme using SysEx messages. The subject is evolving so fast that I can only be as accurate as the information I can obtain as I "Go to Press".

Table V Possible MIDI SysEx Scheme for Theatrical Control

F0 7F = Real Time Universal System Exclusive Header
      = Individual device identifier: 0-126 (127 = all)
      = Theatrical message (number to be assigned)
      = Message format, i.e.:
            00 = Special
            01 = Lighting
            02 = Sound
            03 = Moving instruments
            04 = Rigging
            05 = Laser
            06 = Pyrotechnics
            07 = Intercom
            08 = Hydraulics
            09 = Gas
            0A = CD Player
            0B = Videodisc player
            0C = Video projector
            0D = Film projector
            0E = Slide projector
            etc.
      = Type of message, i.e.:
            00 = Special
            01 = Load cue No.
            02 = GO cue
            03 = GO cue and jam clock
            04 = Advance to next cue
            05 = Back to previous cue
            06 = Open new cue directory
            07 = Close cue path No.
            08 = MIDI Time Code chase ON
            09 = MIDI Time Code chase OFF
            0A = Start Clock
            0B = Stop Clock
            0C = Zero Clock
            0D = Zero All
            0E = Jump to next sequence
            0F = Controller data
            etc.
      = Additional info as required
F7    = EOX (end of system exclusive)

The most recent thinking, downloaded from the USITT Electronic Bulletin Board service known as the "CallBoard", is represented in Table V. This message-encoding method was promulgated by Charlie Richmond, of Richmond Sound Design, based on current implementations of MIDI in his products and a proposal by Andy Meldrum of VARI*LITE. This use of MIDI is currently being considered only for controller-level interconnection; it is not meant to replace any of the existing controller-to-device interconnection methods, such as DMX-512.

At the April 1990 USITT conference, a group of lighting and sound equipment engineers met, and the use of the SysEx approach was generally accepted. Debate is currently underway to determine what kind of lighting messages are needed, and then specific numbers will be assigned to each. Since Charlie Richmond is currently the only manufacturer of computer-based theatrical sound controllers, he will more than likely incorporate the sound messages he has been using into the standard. The use of this MIDI scheme to control "life threatening devices" such as automated scenery systems or pyro effects was discussed without resolution. Many of the people present thought that, since MIDI is not particularly robust and has no innate error correction, it would be better not to officially put forth any method of controlling these devices. Several schemes for error correction were briefly discussed, but the topic was never fully resolved.

In the current thinking on this implementation, one SysEx Manufacturer ID number would be used for all "theatrical" devices. Each type of controller would have its own number for its "Message Format". So the SysEx bytes would be sent first, then an "Individual Device Identifier", then the Theatrical Control SysEx number, then the Message Format. The actual cue function such as "Go" would be sent next, and finally an EOX message. This may seem like a lot of overhead to send a fairly simple message, but the amount of MIDI bandwidth a series of messages in this format would need in a typical theatrical application is fairly minimal compared to MIDI's capabilities, so there would be plenty of time to transmit all this information.
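The byte ordering described above can be sketched in code. This is only an illustration of the proposal, not a finished standard: the theatrical message number had not yet been assigned as of this writing, so the value used below is a placeholder, and the function name is my own.

```python
# A minimal sketch of the proposed theatrical SysEx byte ordering from
# Table V. THEATRICAL_SUB_ID is a stand-in -- the actual number was
# "to be assigned" at the time of writing.

THEATRICAL_SUB_ID = 0x02   # placeholder for the unassigned theatrical ID

def theatrical_sysex(device_id, msg_format, msg_type, data=()):
    """Build a Real Time Universal SysEx message per the proposed scheme."""
    return bytes([0xF0, 0x7F,         # Real Time Universal SysEx header
                  device_id,          # individual device: 0-126 (127 = all)
                  THEATRICAL_SUB_ID,  # theatrical message identifier
                  msg_format,         # e.g. 0x02 = Sound
                  msg_type,           # e.g. 0x02 = GO cue
                  *data,              # additional info as required
                  0xF7])              # EOX

# "GO cue 5" to a sound controller assigned device ID 3:
msg = theatrical_sysex(device_id=3, msg_format=0x02, msg_type=0x02, data=(5,))
```

Even with the header overhead, this whole message is only eight bytes, which supports the bandwidth argument made above.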

This approach is only in the proposal stages at the time of this writing, so really anything could happen with it, although a very similar approach is currently being used successfully by Charlie Richmond in installed, operating systems. MIDI is in no one's eyes the ultimate Entertainment Network Standard; it simply exists and has a huge installed base. Since there are so many products in so many industries already incorporating MIDI, it makes sense to have a way of using it for theatrical control. Only time will tell what becomes of this approach.

LOCAL AREA NETWORKS

All of the above interconnection methods have limitations of one kind or another. Contact closures are unidirectional and wire-intensive; SMPTE is good only for time-based systems; MIDI is unidirectional and is, in some implementations, currently being exploited close to the limits of its bandwidth.

Currently a new (for the entertainment industry, anyway) type of interconnection system is starting to appear: the Local Area Network (LAN). ". . . [A] local area network enables microcomputers to share information and resources within a limited (local) area, generally less than one mile."7 Modern LAN's solve many of the problems mentioned above: they are bi-directional, are capable of transmitting a huge amount of data in real time over a single cable, and have wide bandwidths and high data transfer rates.

As I write this, Contact Closures, SMPTE and MIDI are all well established and have large installed bases. ShowNet, one of the first companies to offer a full show control network, is only a few years old and at this point has a small but growing number of installations. In 1986, Strand Lighting developed and publicized its SMX Network, an open digital communication protocol, but it is currently implemented only in a few Strand products.

MediaLink is a high-speed fiber optic LAN, which is (literally as I write this) just becoming available. Since all these systems are so new, it is uncertain how any of them will fare in the marketplace. In a few years, everything that is written in this section may be totally obsolete, or one of the systems may have grown to be a de facto or formally adopted standard; only time will tell.

7 Stan Schatt, Understanding Local Area Networks, Second Edition, Howard W. Sams & Company, Carmel, Indiana, 1990, Pg 19.

Network Topologies

The way in which a LAN is structured is known as its "network architecture" or "topology". The three predominant LAN topologies are the Star, the Bus, and the Ring, and these are so named because of their schematic appearance.

Star Topology

Each device connected to a Star topology network has a cable which runs from its connection point, or node, to a central hub and control computer. The central computer controls all network traffic, and because of this, the Star topology has the advantage that priority levels can be assigned to the nodes. Star topology LAN's have the disadvantage that if the central network traffic control computer fails, the entire network will fail.

Figure 9 Star Topology

Bus Topology

The Bus topology is a simple scheme in which all nodes are connected to a common network cable, and each node checks the network to see if it is busy before sending data. This is essentially the way MIDI works. Bus topology networks are easy to wire, but suffer some distance limitations because the signal is not amplified or repeated at any point. A variation of the Bus topology is the Tree topology, where several independent busses are connected together via a "Backbone", schematically giving the topology a tree appearance.

Figure 10 Bus Topology

Ring Topology

In the Ring topology, all nodes are connected together with a loop of network cable. Messages are passed from one node to another around the ring until the desired destination is reached. Once a message is received by the correct node, a copy is sent back to the sending station to verify reception. One problem with this traffic scheme is that if two stations on the network are very busy, other nodes may not be able to get sufficient access to the network. So, Ring topology networks generally use "Token Ring" control software, and are referred to as Token Ring networks.

Figure 11 Ring Topology

Network control in a Token Ring LAN is accomplished by the sequential circulation of a "Token". The token acts as a sort of constantly circulating mailman, and one of the bits in its header is set or reset to identify whether or not the Token is available to carry data. If a node needs to send data, it waits until it receives the token in an available state, and then it inserts its data into the token frame (see Figure 12). The token is then passed from node to node around the network until the destination is reached. The destination node controller recognizes data which is intended for it, copies the data out of the frame and sends the token back to the sender for verification and re-circulation. The sender then removes the destination address and other header information, and sends the token to the next node with the token set to the available state. If the next node needs to send data, the process is repeated; otherwise the token is passed on. In this manner, each node has equal access to the network, because the free token goes sequentially from node to node around the network.

Figure 12 Token Frame Structure
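The token-passing cycle described above can be illustrated with a toy simulation. This is my own sketch, not any manufacturer's implementation, and it simplifies one detail: the sender's acknowledge-and-release step is checked once per trip around the ring rather than at the exact moment the token returns.

```python
# Toy token ring: one token circulates; a node with pending data claims it,
# the destination node copies the data out and marks it acknowledged, and
# the sender then frees the token for the next node.

class Node:
    def __init__(self, name):
        self.name = name
        self.outbox = []      # (destination, payload) pairs waiting to send
        self.received = []    # payloads delivered to this node

def circulate(nodes, laps=2):
    token = {"busy": False}   # the single token shared around the ring
    for _ in range(laps):
        for node in nodes:
            if token["busy"]:
                if node.name == token["dest"]:       # destination copies data
                    node.received.append(token["data"])
                    token["acked"] = True
            elif node.outbox:                        # free token: claim it
                dest, data = node.outbox.pop(0)
                token.update(busy=True, dest=dest, data=data, acked=False)
        if token.get("acked"):                       # sender sees the ack
            token = {"busy": False}                  # and frees the token

nodes = [Node("board"), Node("dimmer"), Node("effects")]
nodes[0].outbox.append(("effects", "GO cue 12"))
circulate(nodes)
print(nodes[2].received)  # ['GO cue 12']
```

Because every node gets the free token in turn, no single busy pair can starve the others, which is the fairness property claimed above.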

One of the node controllers acts as the "token monitor", which determines the way in which the token is handled. If anything goes wrong with this node, another can take over this function. For this reason, the Token Ring topology has the advantage of being better able to deal with hardware failures than the Star topology. Ring networks can also cover greater distances than the other topologies, because each node repeats and amplifies the token as it is passed around the ring. Token Ring networks have the disadvantages of being expensive and wire-intensive. Also, it is difficult to add stations once a Token Ring LAN is permanently installed, because the entire network must be shut down to do so.

Problems with PC Networks for Entertainment Applications

LAN's come in many varieties and configurations. Most commercially available PC LAN's, such as ArcNet, Ethernet, or LocalTalk, were engineered for standard Personal Computer network applications, and as such suffer some limitations for entertainment applications. If, for instance, a PC network is sending a great deal of word processing data, it might get clogged up and slow down. If a word processing file arrives at a user's workstation a few seconds late, it probably wouldn't matter. However, in a live performance situation, this delay could be disastrous or even life-threatening. In response to this, several manufacturers have been developing networks that are specifically engineered to the real-time demands of the entertainment business.

ShowNet

ShowNet is a proprietary LAN-based combination of computer hardware and software developed in 1988 by Joe Fitzpatrick and Steve Atchley. It is capable of ". . . controlling virtually any element in the show environment."8 ShowNet was developed after Mr. Fitzpatrick observed how many operators were used on the Pink Floyd tour discussed previously. It seemed to him that many of the tour's control functions could be combined into a single system, controlled by a single operator, and so ShowNet was born. Since the ShowNet system is proprietary, I am limited in how much I can find out and write about it.

ShowNet can be used either for controller level interconnection, or for device level control for any sub-system (i.e. special effects, lighting or LASERs). The system can be configured to generate or read SMPTE, MIDI, Contact Closures, Serial Data, or DMX-512. ShowNet's communications are accomplished through the use of a proprietary LAN which uses either a standard 5 wire cable, or a duplex fiber optic cable. The system can have up to 1000 nodes, and supports several topologies. System response time is the same regardless of how many nodes are connected.

8 ShowNet Literature, ShowNet Corp., Glendale, CA.

Show Control Computer

The system is based on the Show Control Computer (SCC), which serves as the master controller for the system. The SCC has MIDI, SMPTE, Dry Contact Closure, and RS-232 interfaces. ShowNet systems are operated from the "CUE" software package, which runs on the SCC. The CUE software can be run using the mouse, monitor and keyboard, or from a manual control console which interfaces to the SCC.

CUE Software Package

Instead of trying to rewrite ShowNet's description of the CUE package, I'm going to quote directly from their literature:

The 'CUE' software package is the fundamental link between the user and the rest of the system. It is conceptually similar to a modern lighting board. Users who are already familiar with such terms as 'XY split-dipless crossfaders' and 'submasters' will be able to master the system easily. Simply stated, the user creates 'cues', or collections of level settings, etc. for the various elements being controlled. These cues can then be used in a variety of ways:

- Cues can be assigned to any of the 24 submasters (16 variable, 8 on/off) and mixed and blended manually.

- Cues can be placed in any of the 50 available chases, each of which can be up to 50 steps in length and played back on either of the 2 available chasers. Each chaser has separate level, speed, and direction controls.

- Cues can be placed in an 'Event List' and assigned optional execution and fade times. This list can be executed manually, run off an internal timer, or driven by a variety of external sources such as:

- An external switch closure.
- SMPTE time code.
- MIDI start, stop, clock, and song position pointer.

One significant deviation from the 'lighting board model' used in designing the software is that execution and fade times are stored separately from cues. This scheme frees the user from having to create cues in any particular order or from having to create a new cue each time the same 'look' is used in a show, even if different fade times are used each time.9

Interfaces

The ShowNet system can be configured with Analog, Dry Contact Closure, TTL, and DMX-512 outputs, and bi-directional Analog and serial RS-232 interfaces. The serial interface is very flexible, as it can either transmit pre-defined ASCII sequences, or be operated as a "Virtual Terminal", where the user can communicate bi-directionally with any serial node on the network.

A useful feature of the ShowNet interfaces is that local control is provided for most functions. For instance, if the interface is connected to a dimmer, the dimmer can easily be brought up locally for focusing without having an operator at the Show Control Computer.

9 From ShowNet literature, ShowNet Corporation, Glendale, CA.

Mötley Crüe and ShowNet

The house lights go down, and the crowd, largely composed of those under 20, goes wild. On one of the screens stretched inside a triangular truss section over the stage, an animated, multi-colored, LASER drawn character starts speaking, its voice booming out of the huge sound system. The character welcomes us to the show, and introduces the first song, "Dr. Feelgood." The band appears amid some massive pyro effects, and the crowd once again goes wild.

System Description

The visual and audio systems used to create this sequence are synchronized through the use of a ShowNet-based LASER control system, which is mastered from SMPTE Time Code. The system's TC source is a Fostex DAT machine, which is capable of recording TC independently of the stereo digital audio. At the opening of the show, the sound engineer starts the DAT, the audio is sent to the sound system, and the TC is sent to the ShowNet system. The SCC then runs through an Event List, which contains cues for each of the various scan patterns, and the specific TC frames on which they should be triggered.

Figure 13 Mötley Crüe Block Diagram
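An Event List of this kind can be sketched as a simple frame-keyed dispatcher. This is a hypothetical illustration (ShowNet's actual internals are proprietary), assuming a 30-frame non-drop time code rate.

```python
# Toy time-code-driven event list: cues are stored against absolute frame
# counts and fired as the incoming TC passes them, so a late chase still
# catches up on any cues it missed.

FPS = 30  # SMPTE non-drop frame rate assumed for this sketch

def to_frames(h, m, s, f, fps=FPS):
    """Convert hours:minutes:seconds:frames to an absolute frame count."""
    return ((h * 60 + m) * 60 + s) * fps + f

# Cues sorted by trigger frame; run_frame consumes them as TC advances.
event_list = sorted([
    (to_frames(0, 0, 4, 12), "trigger scan pattern 1"),
    (to_frames(0, 0, 9, 0),  "trigger scan pattern 2"),
])

def run_frame(current, pending=event_list):
    """Fire every cue whose frame has been reached; return the fired cues."""
    fired = []
    while pending and pending[0][0] <= current:
        fired.append(pending.pop(0)[1])
    return fired

print(run_frame(to_frames(0, 0, 5, 0)))  # ['trigger scan pattern 1']
```

A real controller would call something like run_frame once per incoming TC frame, thirty times a second at this rate.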

The LASERs for this tour were supplied by LASER Design, and two LASERs are used to create the images -- a Krypton for the reddish colors and an Argon for the green/blue. The beams are steered by two scan heads, which are run by microprocessor based scan controllers. The scan controllers store the animation patterns, and play them back when triggered by serial data received over the ShowNet network from the Show Control Computer.

Manual Operation

After the show's opening sequence, the system is run manually by the operator using a ShowNet manual console. This manual console allows the various preset scan patterns to be manipulated via actual buttons and faders instead of the mouse and video screen. This creates a much more responsive user interface, and allows the operator to select different scan patterns extremely quickly and in time to the music.

As is typical of all network-based systems, a tremendous number of steps must be completed in order to accomplish a relatively simple task, but because the system runs so fast, there is no noticeable lag. For instance, if the operator wants to bump on a specific scan pattern, he or she presses the appropriate button on the manual console. This sends a message to the SCC, which determines which button has been pushed, and which cue this represents in the CUE software package. The corresponding serial messages, one for each scan controller, are then sent down the network to the ShowNet serial interface. The interface unit converts the network signals into RS-232 serial data, and routes each message to the appropriate scan controller. The two scan controllers then play back the scan patterns synchronously.
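The chain of steps above might be sketched like this. All names and message strings here are hypothetical, since the real system's protocol is proprietary; the point is only the button-to-cue-to-fan-out structure.

```python
# Toy version of the manual-console path: a button maps to a cue, and the
# cue fans one serial message out to each scan controller's RS-232 port
# (represented here as plain lists).

def button_pressed(button_id, cue_table):
    """Map a console button to its cue, then send each message to its port."""
    for port, message in cue_table[button_id]:
        port.append(message)          # stands in for an RS-232 write

head_a, head_b = [], []               # the two scan controllers' ports
cue_table = {7: [(head_a, b"RUN PATTERN 12\r"),
                 (head_b, b"RUN PATTERN 12\r")]}

button_pressed(7, cue_table)
print(head_a)  # [b'RUN PATTERN 12\r']
```

Both controllers receive their message in the same dispatch, which is why the two scan heads play back in step.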

Other Applications

Even though the system is used on this tour to control the LASERs only, ShowNet is powerful enough to control all the production elements of an entire show. It is also flexible and modular enough that it can be configured to run a totally automated theme park attraction, a rock and roll show, or anything in between.

Strand's SMX Protocol

Strand Lighting introduced an open entertainment communication protocol, called SMX, at a session of the 1989 USITT conference in Calgary. The SMX protocol is:

. . . intended to provide a structured and open communication method between various elements of lighting and lighting related control processing and outputting. . . . The primary [functions] are:

a. Provide a point-point and multi-drop digital dimmer link with expandability to intelligent dimmers.
b. Provide a link to motion control equipment (eg automated luminaires) normally point to multi-drop.
c. Provide a link between control processor(s) and control stations for the transfer of operator command and display data.
d. Maintain a minimum cost of manufacture consistent with high reliability and an acceptable development cost.
e. Provide maximum compatibility with current equipment whilst remaining expandable for future development.10

The protocol is, as of this writing, only implemented in some Strand Lighting equipment, but other manufacturers are currently reviewing the protocol for possible inclusion in their products. The potential exists in "c." above for control system interconnection, and that's why I've mentioned it here.

10 Strand Lighting SMX Data Communications Protocol, Strand Lighting, Rancho Dominguez, CA, 1988.

MediaLink

As of this writing, a proprietary entertainment LAN, known as MediaLink, is just becoming available. Because it is so new, there aren't yet any implemented live performance synchronization applications, but I feel that this system has such potential as a LAN standard for the entire entertainment industry that it should be included in this thesis.

MediaLink is the brainchild of two musician/engineers, Mark Lacas and David Warman, who developed it to overcome MIDI's bandwidth and channel limitations, and to make the interface between instruments more transparent to the user. The LAN is not yet fully implemented, and is currently only capable of dealing with MIDI. Eventually it will be able to transmit SMPTE, 16 simultaneous channels of digital audio, SCSI, or any other information which can be broken down digitally. MediaLink itself does not replace MIDI or any other protocol; it is network hardware and control software, and as such is only a transmission medium.

MediaLink LAN

MediaLink's only currently available implementation (the MIDITap, described below) operates at a data transmission rate of 2 megabits per second (Mb/s). With faster hardware the MediaLink LAN should be capable of up to 100 Mb/s, because it uses fiber optic cable as the transmission medium. The network can be configured in Ring, Star, Bus or Tree topologies with up to 253 nodes. Data transfer is accomplished through the use of the proprietary MediaLink protocol.

Datagrams

MediaLink transmits data in "datagrams", which are similar to the Token frames discussed in the general networking section. "MediaLink datagrams are acknowledged by the receiver with a special message that is returned to the sender."11 (Sounds a bit like token passing, doesn't it?) If a datagram is not acknowledged within a timeout period, the datagram is sent 4 more times, and if there is still no response, the destination device is electronically removed from the network. Each datagram carries the destination address and a "Real-Time Vector" which allows sub-addressing within a node. This would allow something like MIDI port #3 connected to Node #198 to be accessed.
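The retransmit-then-remove policy described for datagrams can be sketched as follows. The transmit function here is a stand-in for the network hardware, not Lone Wolf's actual interface, and the function name is my own.

```python
# Sketch of the datagram retry policy described above: one try plus up to
# four retransmissions after timeouts; if none is acknowledged, the caller
# treats the destination node as dead and removes it from the network.

def send_datagram(transmit, datagram, max_retries=4):
    """transmit() returns True when the receiver acknowledges the datagram."""
    for attempt in range(1 + max_retries):
        if transmit(datagram):
            return True               # acknowledged -- done
    return False                      # no response: drop the node

# Simulated link that only acknowledges on the third attempt:
attempts = []
flaky = lambda d: (attempts.append(d), len(attempts) == 3)[1]
print(send_datagram(flaky, "voice dump"))  # True
```

The acknowledgement-per-datagram behavior is what makes this feel "a bit like token passing," as noted above.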

Agents

Every MediaLink-compatible device contains a piece of software known as an "agent", which contains configurable parameter information about the device. For instance, a synthesizer's agent might contain information about voices, timbre parameters, etc. This allows any "MediaLink-aware" device to be configured remotely, and this means that an entire system's configuration can be stored and recalled with the touch of a button.

11 Jeff Rona, "Does The Future Belong To The MediaLink LAN?", Keyboard, December, 1989, pg 58.

Operating Modes

MediaLink runs in two modes: Setup and Performance. Setup mode allows large amounts of non time-critical data, such as sample dumps, SysEx data or ASCII files, to be transmitted efficiently through the network. In Performance mode, the datagram size is limited to make sure that important information arrives in time. Synchronization data is also given a higher priority.

Routing

MediaLink data can be transmitted throughout the network in a number of ways, the simplest of which is direct node-node transfer. Data can also be "Broadcast" to all nodes in the network. A third and extremely powerful method of transfer is the "Multicast", which allows any of 64000 user-definable groups to transmit to any other groups simultaneously.

MIDITap

The only currently available MediaLink device is the "MIDITap". MIDITap is a MediaLink network interface which has 4 MIDI In's, 4 Out's, and an RS-232/422 port. Instruments connected to the MIDI ports can be software configured as "Devices". A specific synth or group of synths can be identified in this manner using information about the node, port, channel and program number. In complex traditional MIDI setups, the system interconnections exist in hardware and can get very complex and messy. MIDITap's user definable device names allow the entire system configuration to be software implemented, and this greatly facilitates changes or re-configuration. Another interesting possibility is that the MediaLink network can contain Modem nodes, which allows remote installations to be accessed. Obviously, a standard modem doesn't have the bandwidth necessary to transmit a huge amount of data in real time, but this would certainly give the ability to transmit sample dumps, etc. to a remote location.

MediaLink is really in its infancy here in 1990. It, like MIDI 1.0, was designed with an eye towards the future. The company hopes to license the system for direct, inexpensive inclusion in products, and this should help to bring the system into widespread use. The possibilities of using MediaLink as an extremely flexible and powerful system interconnection platform are tremendous. As more and more devices are made MediaLink aware, I think that the system will gain a large installed base.

CONCLUSION

I hope that I have accomplished my goal of providing an introduction to some of the many ways in which entertainment control systems are being interconnected and synchronized. In this section, I am going to detail my version of the "ideal" interconnected/synchronized system of the near future. This system wouldn't be quite as advanced as the one described in the opening paragraph of the thesis; it would be perhaps one generation earlier. When readers look back on this section in a few years, the system may seem laughably naïve, or parts of it might actually exist.

The Future

In my opinion, there are two real keys to the future of system interconnection: the use of fiber optic technology and the standardization of control communication. MediaLink and the current attempts to develop a theatrical MIDI standard are both significant steps in this direction. Imagine that someday, all control and communication for an entire show could take place on a loop of fiber optic cable run around the building. This could include lighting interconnection, digital audio sends from an audio workstation to integrated amp/speaker cabinets, intercom and paging systems, etc. All of this technology is available now, but is cost-prohibitive to implement at this point. An expanded version of MediaLink, since it is fiber-based, might someday be capable of handling a complex system such as the one shown in Figure 14.

Figure 14 Simplified "Ideal" System Block Diagram

System Description

In this system, I envision a single Master Control console, run by a sort of hybrid Stage Manager/Show Controller. This operator would control the entire show from a touch screen panel, buttons and faders. The Master Control console would be linked to every device in the system via the fiber optic cable. In this manner, anything from a lighting console to a specific communication station could be addressed and accessed individually by the system.

In general, to minimize network traffic, a distributed processing approach would be taken throughout the system. Every device on the network, from an individual dimmer to a communication station, should contain some intelligence. This would allow, for example, the system to send a dimmer or group of dimmers fade times and go commands rather than continuously updating each dimmer's level. In general, if a device is inactive, it should not use any network time except for an occasional status check. Diagnostics would be incorporated into all devices to allow for quick and easy troubleshooting. Separate control computers would be used for each individual subsystem, and each of these controllers would be triggered by an internationally standardized digital show control protocol.
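The dimmer example above can be sketched as follows. Everything here is hypothetical; it illustrates only the principle that the network sends a single cue command while the device computes its own fade on a local clock.

```python
# Distributed-processing sketch: the console sends one "load cue" message
# (target level plus fade time) and the intelligent dimmer ramps itself,
# so no per-step level updates cross the network.

class IntelligentDimmer:
    def __init__(self):
        self.level = 0.0
        self.target = 0.0
        self.step = 0.0

    def load_cue(self, target, fade_seconds, updates_per_second=30):
        """One network message replaces a continuous stream of levels."""
        self.target = target
        steps = max(1, int(fade_seconds * updates_per_second))
        self.step = (target - self.level) / steps

    def tick(self):
        """Driven by the dimmer's own local clock, not by the network."""
        if abs(self.target - self.level) <= abs(self.step):
            self.level = self.target   # snap to target at the end of the fade
        else:
            self.level += self.step

dimmer = IntelligentDimmer()
dimmer.load_cue(target=1.0, fade_seconds=3.0)   # the only network traffic
for _ in range(200):                            # local clock runs the fade
    dimmer.tick()
print(dimmer.level)  # 1.0
```

After the fade completes, further ticks leave the level parked at the target, so an idle dimmer consumes no network time at all.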

Console Design Philosophy

In general, all operator consoles would be based around a touch screen display in conjunction with real faders and buttons. I think that the now common approach of using a mouse, video display and keyboard to control show parameters is sorely lacking. I don't believe that anything can be manipulated as quickly and intuitively as actual buttons and faders. The faders need only be encoders; what they actually control would be determined in software and be totally user-configurable.

Life Threatening Devices

The only devices removed from the main network loop would be those which could be life threatening. In order to trigger any dangerous system, several levels of authorization would be needed. This might be the operator pressing an "Authorize" button on the subsystem controller in conjunction with an actor pressing a footswitch or interrupting a light beam. At this point I believe that if a human is at risk, a human should make the final decision whether or not to let the effect operate. Intelligent emergency stop systems would be located throughout the theatre.

Audio System

All audio would be transmitted and processed digitally, from the digital playback unit or microphone to the intelligent amplifier, which is integrated in the speaker cabinet. Advanced data compression techniques and the huge bandwidth of the fiber system would allow the audio to be transmitted on the same loop as dimmer control and other data. The audio control console itself would basically be a computer front end to intelligent faders and a patching matrix. These would be controlled in a manner similar to that discussed previously for dimmers -- only fade times and routing information would be sent over the network. The intelligent fader modules themselves would handle all routing and gain manipulations. The use of a totally fiber optic-based system and all digital audio would mean that ground loops would be a thing of the past.

Drawbacks of the System

Fiber optic interconnection is not without its drawbacks, however. Connectors are difficult to install, and installation requires expensive equipment. Splicing is similarly difficult. This would be an incredibly complex system, and sophisticated test equipment would be needed to do component-level diagnostics. The solution to this would be to make the system as intelligent and modular as possible, to allow quick service by a show technician with no test equipment.

Conclusion

This system is totally in the realm of possibility using current techniques and technology. The future holds new advances and new techniques that I can't even imagine.

Appendix A ANSI/SMPTE 12M-1986 Longitudinal and Vertical Time Code Standard

Appendix B SMPTE Recommended Practice 136-1986 Film Time Code Recommendations

Appendix C MIDI 1.0 Specification

From The MIDI Resource Book. (See Bibliography)

Appendix D MIDI Time Code Specification

From The MIDI Resource Book. (See Bibliography)

Bibliography

Blake, Larry; Film Sound Today -- An Anthology of Articles from Recording Engineer/Producer, Reveille Press, Hollywood, CA, 1984. A good collection of articles, many of which include SMPTE synching applications.

De Furia, Steve and Joe Scacciaferro; The MIDI Resource Book, Third Earth Publishing, Inc., Pompton Lakes, NJ, 1988. This is a very strangely organized book, but it does contain some useful information including the entire MIDI specification.

Huber, David Miles; Audio Production Techniques For Video, Knowledge Industry Publications, Inc., White Plains, NY, 1987. Contains an excellent section on standard applications of SMPTE Time Code.

Lehrman, Paul D.; "MIDI Time Code: Secrets Of Syncing To Film & Video In The Smaller Studio," Keyboard, March, 1990, pp 90-98. Contains a very good description of MIDI Time Code.

Rona, Jeff; "Does the Future Belong to the MediaLink LAN?," Keyboard, December, 1989, pp 54-61.

Schatt, Stan; Understanding Local Area Networks, 2d ed., Howard W. Sams & Company, Carmel, IN, 1990.

Scheirman, David; "Audio system for Disney World's Epic Theater," Sound and Video Contractor, November 20, 1989, pp 80-86.

Westfall, Lachlan; "The Local Area Network -- MIDI's Next Step?," Electronic Musician, November, 1989, pp 64-67.

Wilkinson, Scott; "MediaLink -- Integrating Media Systems With the Touch of a Button," Music Technology, August, 1989.

------, NAGRA IV-S Time Code Literature, Kudelski, S.A., NAGRA Tape Recorders Manufacture, CH-1033 Cheseaux/Lausanne, Switzerland. I found this at a trade show, and it contains one of the best explanations of standard film Time Code applications that I've seen.

Contact Addresses

Many of the manufacturers and associations discussed in this thesis will provide information about their systems, reprints of some of the articles detailed above, or copies of their standards.
International MIDI Association
11857 Hartsook Street
North Hollywood, CA 91607

Lone Wolf (MediaLink)
1505 Aviation Boulevard
Redondo Beach, CA 90278
(213) 379-2036

MIDI Manufacturer's Association
2265 Westwood Blvd. Box 2223
Los Angeles, CA 90064

Richmond Sound Design, Ltd.
1234 West Sixth Avenue
Vancouver, Canada V6H 1A5
(604) 734-1217

ShowNet
P.O. Box 9677
Glendale, CA 91226
(818) 500-7265

Society of Motion Picture and Television Engineers
595 West Hartsdale Avenue
White Plains, NY 10607
(914) 761-1100