THE MOBILE CSOUND PLATFORM

Victor Lazzarini, Steven Yi, Joseph Timoney
Sound and Digital Music Technology Group
National University of Ireland, Maynooth, Ireland
[email protected], [email protected], [email protected]

Damian Keller
Nucleo Amazonico de Pesquisa Musical (NAP)
Universidade Federal do Acre, Brazil
[email protected]

Marcelo Pimenta
LCM, Instituto de Informatica
Universidade Federal do Rio Grande do Sul, Brazil
[email protected]

ABSTRACT
This article discusses the development of the Mobile Csound Platform (MCP), a group of related projects that aim to provide support for sound synthesis and processing under various new environments. Csound is itself an established computer music system, derived from the MUSIC N paradigm, which allows various uses and applications through its Application Programming Interface (API). In the article, we discuss these uses and introduce the three environments under which the MCP is being run. The projects designed for the mobile operating systems iOS and Android are discussed from a technical point of view, exploring the development of the CsoundObj toolkit, which is built on top of the Csound host API. In addition to these, we also discuss a web deployment solution, which allows for Csound applications on desktop operating systems without prior installation. The article concludes with some notes on future developments.

1. INTRODUCTION

Csound is a well-established computer music system in the MUSIC N tradition [1], developed originally at MIT and then adopted as a large community project, with its development base at the University of Bath. A major new version, Csound 5, was released in 2006, offering completely re-engineered software in the form of a programming library with its own application programming interface (API). This allowed the system to be embedded and integrated into many applications. Csound can interface with a variety of programming languages and environments (C/C++, Objective-C, Python, Java, Lua, Pure Data, Lisp, etc.). Full control of Csound compilation and performance is provided by the API, as well as software bus access to its control and audio signals, and hooks into various aspects of its internal data representation. Composition systems, signal processing applications and various frontends have been developed to take advantage of these features. The Csound API has been described in a number of articles [4], [6], [7].

New platforms for Computer Music have been brought to the fore by the increasing availability of mobile devices for computing (in the form of mobile phones, tablets and netbooks). With this, we have an ideal scenario for a variety of deployment possibilities for computer music systems. In fact, Csound has already been present as the sound engine for one of the pioneering portable systems, the XO-based computer used in the One Laptop per Child (OLPC) project [5]. The possibilities allowed by the re-engineered Csound were only partially exploited in this system, but its development sparked the idea of a Ubiquitous Csound, which is now steadily coming to fruition with a number of parallel projects, collectively named the Mobile Csound Platform (MCP). In this paper, we introduce these projects and discuss the implications and possibilities they provide.

2. THE CSOUND APPLICATION ECOSYSTEM

Csound originated as a command-line application that parsed text files, set up a signal processing graph, and processed score events to render sound. In this mode, users hand-edit text files to compose, or use a mix of hand-edited text and text generated by external programs. Many applications, whether custom programs for individual use or publicly shared programs, were created that could generate text files for Csound. However, the usage scenarios were limited, as applications could not communicate with Csound except through what they could put into the text files prior to starting rendering.

Csound later developed realtime rendering and event input, with the latter primarily coming from MIDI or standard input, as Csound score statements could also be sent to a realtime-rendering Csound via pipes. These features allowed the development of Csound-based music systems that could accept events in realtime at the note level, such as Cecilia [8].
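To illustrate the text-generation workflow described above, the sketch below builds a minimal Csound orchestra and score as plain text, the way an external program might prepare input for the command-line renderer. The one-oscillator instrument and the note values are invented for illustration (though `oscili` is a real Csound opcode); a host would hand the resulting files to the `csound` command.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Locale;

public class ScoreGen {
    // Build a minimal orchestra: one sine-oscillator instrument.
    static String orchestra() {
        return String.join("\n",
            "sr = 44100",
            "ksmps = 64",
            "nchnls = 2",
            "0dbfs = 1",
            "instr 1",
            "  asig oscili p5, p4", // p5 = amplitude, p4 = frequency
            "  outs asig, asig",
            "endin");
    }

    // Generate "i" score statements: instrument 1, staggered starts,
    // rising chromatic pitches.
    static String score(int notes) {
        StringBuilder sb = new StringBuilder();
        for (int n = 0; n < notes; n++) {
            double start = n * 0.5;
            double freq = 220.0 * Math.pow(2.0, n / 12.0);
            sb.append(String.format(Locale.ROOT,
                "i1 %.2f 1 %.2f 0.3%n", start, freq));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        Path orc = Files.createTempFile("gen", ".orc");
        Path sco = Files.createTempFile("gen", ".sco");
        Files.writeString(orc, orchestra());
        Files.writeString(sco, score(3));
        // A host would now invoke: csound <orc-file> <sco-file>
        System.out.println(Files.readString(sco));
    }
}
```

This one-way, file-based coupling is exactly the limitation noted above: once rendering starts, the generating program has no further channel into Csound.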
These developments extended the use cases for Csound to realtime application development. However, it was not until Csound 5 that a full API was developed and supported that allowed finer-grained interaction with Csound [3]. Applications using the API could now directly access memory within Csound, control rendering frame by frame, and use many other low-level features. It was at this time that desktop development of applications grew within the Csound community, and it is also at this level of development that Csound has been ported to mobile platforms.

Throughout these developments, the usage of the Csound language, as well as its exposure to users, has changed too. In the beginning, users were required to understand Csound syntax and coding to operate Csound. Today, applications are developed that expose varying degrees of Csound coding, from full knowledge of Csound required to none at all. Applications such as those created for the XO platform highlight where Csound was leveraged for its audio capabilities while a task-focused interface was presented to the user. Other applications, such as Cecilia, show where users are primarily presented with a task-focused interface, but the capability to extend the system is available to those who know Csound coding. The Csound language has thus grown from a means to express a musical work into a domain-specific language for audio engine programming.

Today, these developments have allowed many classes of applications to be created. With the move from desktop platforms to mobile platforms, the range of use cases that Csound can satisfy has achieved a new dimension.

3. CSOUND FOR iOS

A CsoundObj object controls Csound performance and provides the audio input and output functionality via the CoreAudio AuHAL mechanism. MIDI input is also handled either by the object, by allowing direct pass-through to Csound for standard Csound MIDI handling, or by routing MIDI through a separate MIDIManager class to UI widgets, which in turn send values to Csound. Additionally, a number of sensors found on iOS devices come pre-wrapped and ready to use with Csound through CsoundObj.

To communicate with Csound, an object-oriented callback system was implemented in the CsoundObj API. Objects that are interested in communicating values, whether control data or audio signals, to and from Csound must implement the CsoundValueCacheable protocol. These CsoundValueCacheables are then added to CsoundObj, and values will be read from and written to them on each control cycle of performance (fig. 1). The CsoundObj API comes with a number of CsoundValueCacheables that wrap hardware sensors as well as UI widgets, and examples of creating custom CsoundValueCacheables accompany the Csound for iOS Examples project.

Figure 2. Csound for iOS SDK sample app.

4. CSOUND FOR ANDROID

Developers who learn Java on Android can take their experience and work on standard Java desktop applications. The two versions of Java do differ, however, in some areas, such as the classes for accessing hardware and the user interface libraries. Similarly to iOS, in order to ease development, a CsoundObj class, here written in Java, was developed to provide straightforward solutions for common tasks.

As with iOS, some issues with the Android platform have motivated internal changes to Csound. One such problem was related to difficulties in handling temporary files on the system. As Csound depended on these in the compilation/parsing stage, a modification to use core (memory) files instead of temporary disk files was required.

Two options have been developed for audio IO. The first involves using pure Java code through the AudioTrack API provided by the Android SDK. This is, at present, the standard way of accessing the DAC/ADC, as it appears to provide slightly better performance on some devices. It employs the blocking mechanism given by AudioTrack to push audio frames to the Csound input buffer (spin) and to retrieve audio frames from the output buffer (spout), sending them to the system sound device. Although low latency is not available in Android, this mechanism works satisfactorily.

As a future low-latency option, we have also developed a native-code audio interface. It employs the OpenSL API offered by the Android NDK and is built as a replacement for the usual Csound IO modules (portaudio, alsa, jack, etc.), using the provided API hooks.

(performed by screen or MIDI input), signal processing
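The CsoundValueCacheable round trip described above can be sketched in plain Java. The interface and class names below mirror the ones in the text, but the Csound side is replaced by a simple mock engine so the control-cycle flow is runnable self-contained; method names such as updateValuesToCsound and performControlCycle are illustrative assumptions, not the toolkit's actual signatures.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative stand-in for the CsoundValueCacheable protocol: objects
// exchange values with Csound once per control cycle.
interface CsoundValueCacheable {
    void updateValuesToCsound(Map<String, Double> channels);   // UI/sensor -> Csound
    void updateValuesFromCsound(Map<String, Double> channels); // Csound -> UI
}

// A mock slider widget that publishes its value to a named channel.
class SliderCacheable implements CsoundValueCacheable {
    final String channel;
    double value;
    SliderCacheable(String channel, double value) {
        this.channel = channel;
        this.value = value;
    }
    public void updateValuesToCsound(Map<String, Double> ch) { ch.put(channel, value); }
    public void updateValuesFromCsound(Map<String, Double> ch) { /* write-only widget */ }
}

// Minimal stand-in for CsoundObj: services all registered cacheables
// around each (mocked-out) audio control cycle.
class MockCsoundObj {
    final List<CsoundValueCacheable> cacheables = new ArrayList<>();
    final Map<String, Double> channels = new HashMap<>();
    void addCacheable(CsoundValueCacheable c) { cacheables.add(c); }
    void performControlCycle() {
        for (CsoundValueCacheable c : cacheables) c.updateValuesToCsound(channels);
        // ... the real CsoundObj would run one ksmps block of audio here ...
        for (CsoundValueCacheable c : cacheables) c.updateValuesFromCsound(channels);
    }
}
```

A sensor wrapper would follow the same shape as SliderCacheable, sampling the hardware in updateValuesToCsound; a level meter would instead read from the channel map in updateValuesFromCsound.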
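The blocking AudioTrack scheme described for Android reduces to a simple loop: fill a buffer's worth of input into spin, run one control cycle, then drain spout to the device. The sketch below makes that plumbing runnable without the SDK by using a mock engine (a gain stage standing in for Csound's processing); the buffer names spin and spout follow the text, everything else is an assumption.

```java
// Mock engine: ksmps-sized spin/spout buffers and a trivial gain stage
// standing in for Csound's per-cycle processing.
class MockEngine {
    final int ksmps;
    final double[] spin;   // Csound input buffer, as named in the text
    final double[] spout;  // Csound output buffer
    double gain = 0.5;
    MockEngine(int ksmps) {
        this.ksmps = ksmps;
        spin = new double[ksmps];
        spout = new double[ksmps];
    }
    // Stand-in for one performKsmps() call.
    void performKsmps() {
        for (int i = 0; i < ksmps; i++) spout[i] = spin[i] * gain;
    }
}

class AudioLoop {
    // Process input the way the blocking AudioTrack loop would: chunk it
    // into ksmps-sized frames, run the engine, collect the output. In the
    // real loop the AudioTrack write() call is the blocking point that
    // paces rendering against the device clock.
    static double[] render(MockEngine eng, double[] input) {
        double[] out = new double[input.length];
        for (int off = 0; off < input.length; off += eng.ksmps) {
            int n = Math.min(eng.ksmps, input.length - off);
            System.arraycopy(input, off, eng.spin, 0, n);
            eng.performKsmps();
            System.arraycopy(eng.spout, 0, out, off, n);
        }
        return out;
    }
}
```

The OpenSL variant mentioned above replaces this Java-side loop with a native callback-driven module, but the spin/spout hand-off is the same idea.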