[Nullpointer]

Plug-In : An arts council technology residency

Introduction | Research | Project Stage 1 | Project Stage 2 | Project Stage 3 | Results & Links

Introduction

Today's digital media and videogame companies work with highly specialised new media tools to produce cutting-edge products. At the forefront of exploration are the creative tools used in the development of new-generation game consoles, mobile phone entertainment and interactive TV applications.

A three-month residency provides an artist with the opportunity to develop his/her own creative art practice by exploring such unique digital tools. Working alongside digital media industry professionals, the aim of the project is to investigate and produce innovative ideas on new digital platforms. The project focussed on developing an abstract series of visual toys for the mobile platform (Nokia 3650, 7650, N-Gage, etc.).

The following stages are documented (also listed above). Certain sections may be overly technical, but some discussion of code and hardware in such a project is obviously necessary.

Research: The process of identifying the appropriate mobile platform.
Project Stage 1: Preparation for programming.
Project Stage 2: Development issues and details.
Project Stage 3: Final project outcome.
Results & Links: Conclusions and pointers to further information.

The initial project proposal is also detailed here; however, the final project represents an inevitable development of, and deviation from, this document. iFone acted as the industry partner for the project.

Previous Work

Nullpointer has produced several previous projects that contribute to the background from which the Plug-In project was developed: BitmapSequencer / Audiopool / AvSeq / PixelMap (follow the links for more information hosted on this site).



Research

Which Platform?

In the first stage of research I investigated the different options available for developing applications on mobile devices. Although other systems are available, I chose to focus on the two leading platforms, J2ME and Symbian. A brief description of each follows.

J2ME: Java 2 Micro Edition, the mobile-phone edition of the interpreted Java language. Java-enabled handsets are in wide and increasing deployment, and virtually all phone manufacturers (including Nokia) have committed to producing J2ME-enabled mobile phones. Although other interpreted languages are available on some handsets, J2ME is an open standard, easily learnable by programmers with Java experience, and more widely supported by handset manufacturers than the others.

Symbian OS: An operating system for powerful smartphones that runs applications written in the C++ language and compiled to machine code. Smartphones are less widely deployed than Java-enabled phones, but are more fully featured, capable of running more sophisticated applications, and will see further market penetration over time. As with J2ME, there are competing smartphone operating systems in deployment, but phone manufacturers representing more than 70% of the market (including Nokia, Sony Ericsson, Siemens and Panasonic) have committed to the use of Symbian OS for their smartphones, making it more widely supported than other operating systems. Additionally, Symbian OS is an open standard, available for license by any hardware manufacturer, and is specifically designed for use with wireless communication devices, thus providing superior functionality.

The following chart provides a high−level comparison of the capabilities of the two technologies:


Footnotes to the comparison chart:

* Except for some older legacy phones.
1. However, Symbian OS applications are typically so large that OTA installation is impractical.
2. J2ME can display video on phones that support the Java Mobile Media API (such as the Nokia 3650).
3. Access to SMS from J2ME is possible using the Nokia SMS API (supported on the Nokia 3410) or the Wireless Messaging API (WMA, supported on the Nokia 3650 and newer Nokia 3410s).

Chosen Platform: Symbian OS

After examining the development environments for each system and looking through existing software, I decided that the Symbian platform would be more appropriate for the project. The following factors led to this decision.

1. Previous experience in C++.
2. Improved sound support and 12-bit graphics.
3. Ability to produce 'cutting edge' audiovisuals on a mobile device.
4. Support for Nokia Series 60 phones (plus some Sony Ericsson and Siemens models).
5. Symbian is a next-generation mobile OS with increasing support (N-Gage, etc.).

More information on Symbian can be found at www.symbian.com, and more details on Nokia's Symbian support at www.forum.nokia.com; see also the references.

Conclusion

Although further research would be beneficial, it is obvious from the initial findings that the Symbian platform offers a familiar programming system and (for a mobile device) a powerful graphics engine. Control would be a problem when writing for any mobile device (except one specifically designed to play games, the GBA for example), but a simple and elegant solution should be definable with a minimal set of keys. In terms of sound, the application may have to deal with unstable timing, but there is the possibility of using MIDI instruments to reduce filesize (only supported on the 3650). All these concerns will be taken into account when defining a technical & design document for the chosen project idea.

Reference Points

There are a number of artistic and commercial projects that act as inspiration to the Plug−In project. The intention is to learn from these existing products/methods to make the project a synthesis of new ideas and good practice. A small selection is detailed below.

Toshio Iwai:

Toshio Iwai has created many beautiful yet simple audiovisual games where users can play in an open-ended environment to create constantly changing patterns of sound and light. His work has been exhibited worldwide and used as performance tools by prominent composers.

Ixi Software / Golan Levin / SoundToys / Processing

These four sites provide free tools and documentation on small scale computer audiovisual composition toys and tools. The individual projects range from complex synthesis to simple atmospheric music.

Mr Driller / Chu Chu Rocket / Puzzle Bobble

These three games have little to do with audio, but they base their addictive and innovative gameplay on very simple rules that create emergent behaviour and wide variation (much in the vein of Tetris). They also work in a scalable manner, performing well on handheld devices, consoles and PCs.

Spheres of Chaos / rRootage / Turux.org

These two games and the website Turux.org use simple but effective algorithmic graphics techniques to generate abstract generative images. Often derived from the 'demo scene', such techniques are designed to work within the strict resource constraints of most mobile/handheld devices.

Next: Project Stage 1 >>>


Initial Proposition

Synesthesia : An audiovisual sequencing game

1. Outline
2. References
3. Techniques
4. Development
5. Conclusion

1. Outline

Synesthesia is an audiovisual environment where users can compose music and visuals in a reflexive generative system. The possibilities offered by digital technology to synchronize the generation of audio and video processes are rarely explored in either sequencing software or games. In this project users would engage with a sequencing system that is also part of an abstract puzzle game. As the player progresses, the environment evolves and reacts to the developing audio track, which will likewise be affected by changes in the game world. This reflexive process between audio and video is guided and manipulated by the player, allowing them to explore a vast range of different sonic environments.

2. References

Some video games and art works use audio-to-video synchronisation techniques. Vib-Ribbon (PS1) and Rez (PS2, DC) present abstract visions of such relationships, whereas 'Bemani' games provide a more mainstream approach. It is rare that such projects venture far into the realm of creating a generative and holistic a/v system. Media artist Toshio Iwai has created art works that allow users to 'play' with sound and visuals, but most 'interactive' a/v art only provides a very brief and shallow experience of this kind.

3. Techniques

a) Realtime sequencing. Using a pool of audio samples selected specifically for a particular environment, the player will be involved in the process of sequencing rhythms and melodies in realtime. The progress of the player through a level will relate to their interaction with these systems. This sequencing will be represented in a visual form integral to the game landscape.

b) Digital Signal Processing (DSP). Realtime DSP effects and synthesis could be used to ensure that the audio generated is always unique. Audio-art applications such as MAX/MSP, Nato and Jitter have shown how realtime processing can create a myriad of experimental and atmospheric effects. Sounds could be filtered according to player energy levels or modulated to represent certain game states.

c) Image Synthesis. Using audio to generate imagery is a common technique in PC sound players (iTunes, WinAmp, Windows Media Player, etc.). A combination of FFT analysis and image quantising would allow the visuals of the game to relate more intrinsically to the evolving audio track. This would result in a visual environment that reflects the player's progress and technique.

4. Development

Synesthesia could be developed in a number of ways and has the facility to be a scalable system. Obviously the most visually lush incarnations would be based on high-end technology such as consoles and PCs. However, handheld systems and mobile platforms also offer a good opportunity to create such an a/v sequencing game. There is also the ambition of creating networked versions of the software where players 'jam' with each other, either competitively or collaboratively.

5. Conclusion

Synesthesia is an ambitious project that aims to create an interactive environment true to its name. Melodies and images interweave in a virtual space where the user plays to conduct and direct the flow of audio-visual data.


Project Stage 1

Developer Environment

I chose to work in C++, using the Nokia Series 60 SDK, the Symbian SDK and the EPOC emulator. The initial target device is the Nokia 3650. Applications that are validated on the emulator can be recompiled for the ARM CPU on the Symbian device. The resulting application is then packaged into an installation file (.SIS), which is transferred via Bluetooth to the target device. During the research stage I investigated the following areas of programming that are of key importance to the project.

1. Graphics

The 7650/3650 have a resolution of 176x208 and a 12-bit display (grouped 444). It is tricky but not impossible to access the screen buffer directly and perform per-pixel operations. Unfortunately there is no floating point processor, and floating-point emulation in Symbian runs very slowly. The graphics engine supports sprite masks and different source bitmap depths. After writing a basic grid-pixel manipulator I managed to build a simple water simulation with bilinear filtering. Using the same basic template I wrote a quick voxel landscape renderer (pictured right). In terms of speed, a program can achieve quite a fast rate (approx 60fps) by bypassing the window server and drawing straight to the screen. The difficulty is that the program has to be aware of other functions the phone may require (incoming phone calls, etc.), so some additional handling code will need to be developed.
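To illustrate the kind of integer-only per-pixel work described above, here is a minimal sketch of the classic two-buffer 'water' ripple algorithm (a demo-scene staple). The buffer layout, function names and damping constant are my own assumptions for illustration, not the project's actual source:

```cpp
#include <cassert>
#include <utility>

// Height-field water ripple, sized for the 3650's 176x208 screen.
const int W = 176, H = 208;

static int bufA[H][W], bufB[H][W];
static int (*prevH)[W] = bufA;   // heights from the previous frame
static int (*curH)[W]  = bufB;   // heights being computed this frame

// Poke the surface, e.g. where a particle or 'raindrop' lands.
void Disturb(int x, int y, int strength) { prevH[y][x] += strength; }

// Read back a height (from the most recently completed frame).
int HeightAt(int x, int y) { return prevH[y][x]; }

// One simulation step: each cell becomes half the sum of its four
// neighbours in the previous frame minus its own current value, then
// is damped by 1/32 so ripples fade. Integer-only: no FPU required.
void RippleStep() {
    for (int y = 1; y < H - 1; ++y)
        for (int x = 1; x < W - 1; ++x) {
            int v = (prevH[y][x-1] + prevH[y][x+1] +
                     prevH[y-1][x] + prevH[y+1][x]) / 2 - curH[y][x];
            curH[y][x] = v - (v >> 5);
        }
    std::swap(prevH, curH);      // the computed frame becomes 'previous'
}
```

Calling Disturb() wherever something touches the surface and RippleStep() once per frame before blitting gives exactly the kind of spreading, fading wave trails a water simulation needs, without a single floating-point operation.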

2. Sound

The chosen system supports the playback of .wav (wave) audio files. However, these files can become quite large (particularly if recorded at high quality) and can bloat the size of the final application. The Nokia Symbian SDK also supports the playing of sine-based tones, but these functions do not offer any level of synthesis for generating complex sounds. It seems that the use of small wave files would suit the operating system better. The OS does not have a timer with decent accuracy (1/64th of a second is the best you'll get), so accurate audio quantization may be difficult to achieve.
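The 1/64-second limit can be made concrete with a little arithmetic. The sketch below (purely illustrative; the function names are mine) computes the closest interval the timer can actually deliver for a 16th note at a given tempo:

```cpp
#include <cassert>
#include <cmath>

// Timer resolution on the target OS: 1/64 second per tick.
const double kTickMs = 1000.0 / 64.0;          // 15.625 ms

// Ideal duration of one 16th note at a given tempo (BPM).
double SixteenthMs(double bpm) { return 60000.0 / (bpm * 4.0); }

// Best the timer can do: round to a whole number of ticks.
double QuantisedMs(double bpm) {
    return kTickMs * std::floor(SixteenthMs(bpm) / kTickMs + 0.5);
}
```

At 120 BPM a 16th note (125 ms) happens to be exactly 8 ticks, but at 140 BPM the nearest tick boundary gives 109.375 ms instead of the ideal 107.14 ms, an error of over 2 ms per step that accumulates audibly over a bar. This is why accurate sequencing on the device is hard regardless of how the audio itself is produced.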

3. Control

Control is a major problem when developing for mobile platforms. Primarily designed as communication devices, mobile phone keypads operate digitally (as opposed to discrete analogue controllers) and respond better to repeated presses than to keys held over time. To add to this issue, different devices have different key layouts; to account for this, an application must implement a simple but flexible control method. Such a method should involve only a minimal number of keys and not require absolute precision from the user.
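A control scheme along these lines might be sketched as follows (the names, step size and clamping are my assumptions for illustration): four digital keys, each press nudging a clamped cursor a fixed step, so repeated presses rather than held keys do the work.

```cpp
#include <cassert>

// Four-key directional control for a box cursor on a 176x208 screen.
enum Key { kUp, kDown, kLeft, kRight };

struct Cursor { int x, y; };

const int kStep = 4;                   // pixels moved per keypress
const int kMaxX = 175, kMaxY = 207;    // screen bounds

// Apply one keypress; repeated presses accumulate movement.
void Move(Cursor& c, Key k) {
    if (k == kUp)    c.y -= kStep;
    if (k == kDown)  c.y += kStep;
    if (k == kLeft)  c.x -= kStep;
    if (k == kRight) c.x += kStep;
    // Clamp so the crosshair always stays on screen.
    if (c.x < 0) c.x = 0;
    if (c.x > kMaxX) c.x = kMaxX;
    if (c.y < 0) c.y = 0;
    if (c.y > kMaxY) c.y = kMaxY;
}
```

Because movement is quantised and clamped, the scheme tolerates both imprecise key layouts and imprecise users, which matches the design constraints above.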


Setting up for compilation

The method of developing for Symbian-based systems is fairly logical, but can seem complex, especially when compared to traditional (single-system) developer environments. What the SDK actually does is create a Symbian operating system on your PC hard drive (residing in C:\Symbian....). However, because mobile phones use different processors and file storage, the SDK actually creates several versions of the Symbian OS on your PC. One is for development and testing with an emulator (i.e. all compilation and execution takes place on the PC), called WINS. The other is used for final compilation of code prior to uploading your application to the phone, called ARMI (there is also a reduced instruction set version, THUMB).

This is further complicated by the fact that Symbian applications often use resource files, which have to be in the right path for whichever system you are using. The SDK demonstration applications come with a few batch files (.bat) which extract, copy and compile the demo applications to the correct locations. There are several versions of these batch files, one for each build target (WINS, ARMI and THUMB).

Unfortunately this can lead to tediously long paths such as:

C:\Symbian\6.1\Series60\Epoc32\BUILD\SYMBIAN\6.1\SERIES60\SERIES60EX\HELLOWORLD\GROUP\HELLOWORLD\WINS

These hold the workspace project files (I am assuming Visual C++ is the compiler). These project files reference source code in a separate (higher) part of the tree:

C:\Symbian\6.1\Series60\Series60Ex\HelloWorld (notice that the workspace file creation duplicates the whole path within the Symbian home directory).

This can make it difficult to keep track of which files are current, and makes backups awkward. I found it easier to create a single directory containing both the project files and the source code (including resources). Backups are easier to make and files are more readily accessible. You can change the actual compilation directories too if you like, but since you never need to access them (they just contain compiler link objects, etc.) you can leave them as part of the Symbian build subtree. Now you should be able to compile and test on the emulator easily.

If you want to compile for the target device you must do so from the command-line compiler (Visual C++ won't compile for anything other than the WINS Windows platform). You can execute an 'abld build armi urel' command from the DOS prompt (you may need to check that the path in abld.bat is correct) or create a batch file to run this for you. This will generate your target-specific application in the appropriate directory (the ARMI subset of the Symbian tree, for instance):

\Symbian\6.1\Series60\Epoc32\release\armi\urel\testapp.app

This compilation destination is the path you should use in your package file when you want to make up an installation file (.sis) for the target device. SIS files package up all the files necessary for installing an application on the device. When making a .sis file you need to identify which files to include and where to install them on the phone. This is done in a small information file called a package file (.pkg). Since you know the location of your compiled files (in the appropriate tree of the Symbian OS path on your PC), you can use this path to tell the package file (and the resulting .sis) where to look and where to copy to.

Example .pkg instruction (copy from the PC into the .sis, then install to the target location):

"\Symbian\6.1\Series60\Epoc32\release\armi\urel\testapp.app"-"!:\system\apps\testapp\testapp.app"

Again, I found it easier to make a batch file to do this work for me. This means I do all testing and emulator development in Visual C++ without using command-line tools. When I want to compile and load to the device I can just run a batch file that compiles the code to the Symbian ARMI release directory, and then a second batch file that uses those files (via a .pkg file instruction) to create the resulting .sis file. Both batch files (and the .pkg file) reside in the same workspace directory. The final .sis file can be easily uploaded to the target device using Bluetooth, infrared, etc.

Next: Project Stage 2 >>>


Project Stage 2

Zones

I decided to create different 'play' zones within one single application, each accessible via a number key on the phone. I didn't want the program to be directed at competitive play or level-oriented goals, so there was to be no 'point scoring' or 'lives' in the traditional sense.

Key to the development of the zones was the use of a pixel engine. I found that one useful resource for coding and experimental ideas was the 'demo scene'. Popular in the 80s, this was a movement of young programmers and hackers writing advanced graphics and audio routines to try to overcome the limitations of contemporary machines (286, Amiga, etc.). They often focussed on optimizing code for small memory footprints and low-resolution, low-colour displays. Mobile platform development is at much the same stage of technological development at the moment, and as such demo-coding ideas apply very well to these devices.

Various graphical effects were designed and tested for each potential zone, and a simple particle system was developed. After trying out voxel-based engines and other pseudo-3D effects, it seemed better to allow the zones to share a common ground. In this spirit, the user interacts with each zone using only the cursor (a box shape centered on a crosshair). This provides the user with a familiar interface and reduces the complexity of input to just four directional keypresses. A brief description of the pixel engine design follows.

Pixel Engine

I decided to write my own pixel drawing code rather than stick with the built-in sprite bitmap blitting functions. However, there are a few obstacles to overcome when working this way.

Firstly, if you are writing data to the screen you are at the mercy of the 'window server', the part of the Symbian OS that updates the screen, and it can run very slowly. It can especially cause problems if it tries to render a screen area while you are writing to it. This can be solved by bypassing the window server and using direct screen access; this requires you to update the screen yourself (and to write some handling code to cope with application interruptions).

Secondly, the displays on most Symbian 6.1 devices (the version I am working with) use a 12-bit display depth. This is unusual, as most systems are based on 8-bit, 16-bit, 24-bit or 32-bit depths. We are trying to represent r,g,b values (which in 24-bit would be 8 bits per colour). In 12-bit mode there is a maximum of 4096 values, split into three 4-bit ranges as follows: blue [1 2 4 8], green [16 32 64 128], red [256 512 1024 2048]. An offscreen bitmap is used as a temporary 12-bit buffer in my application, and the whole scene is blitted to the screen after each frame of processing is finished.
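That bit layout translates directly into packing code. A sketch (standard bit arithmetic, not the project's actual routines): blue sits in bits 0-3, green in bits 4-7 and red in bits 8-11, i.e. red*256 + green*16 + blue.

```cpp
#include <cassert>

// Pack 4-bit r,g,b components into the 12-bit 444 layout described
// above: blue in bits 0-3, green in bits 4-7, red in bits 8-11.
unsigned Pack444(unsigned r, unsigned g, unsigned b) {
    return ((r & 0xFu) << 8) | ((g & 0xFu) << 4) | (b & 0xFu);
}

// Reduce a familiar 8-bit-per-channel colour to 12 bits by keeping
// only the top four bits of each component.
unsigned From888(unsigned r8, unsigned g8, unsigned b8) {
    return Pack444(r8 >> 4, g8 >> 4, b8 >> 4);
}
```

Pack444(15, 8, 0), for instance, yields 0xF80, a 12-bit orange; a per-pixel engine writes values like this straight into the offscreen buffer.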

Once a solid pixel drawing routine had been established and a common interface decided upon, the programming of the zones could commence.

Sound

Initially I assumed that mobile devices, often sold on the strength of polyphonic ringtones and great sound quality, would be a good platform for developing interesting uses of sound. However, Symbian offers little in the way of sound interfaces, and most of the target devices are monophonic and support only low-quality sound playback.

There are two main routes to sound creation on Symbian: CMdaAudioToneUtility for the generation of simple monophonic sine-wave tones, and CMdaAudioPlayerUtility for the playback of media types such as wav and midi files. Although an application can hold a number of audio files in memory, it can only play one file at a time. There is no easy way to multi-track audio without writing data directly to the sound output port and building an independent mixer (a substantial project in itself).

This level of simplicity only allows for basic event-triggered sounds (like menu selection noises and alert messages). I did create a parallel version of several zones that used basic audio such as this, but ultimately I found the existing system too restrictive to do anything very dynamic. It is also worth noting that the addition of even a few short wave (.wav) files to the project instantly doubled the overall filesize.

Next: Project Stage 3 >>>


Project Stage 3

Final Zones

For the final program I created 10 different 'play' zones in one single application, each accessible via a number key on the phone. Users can switch between zones at any time, allowing a 'bleed' effect where the residue from one zone fades into another. The zero key triggers a small and cryptic help page, listing the zones by a single keyword that hints at the mode of interaction for that area. The zones are discussed in more detail below.

0 − Help (Menu)

This page offers one-word hints as to what form of interaction is required in the subsequent zones. It also displays a 'demo-scene' style spirograph form oscillating dynamically in the background.

1 − Divide

In this zone small particles drift and bounce around the screen. If the user's cursor touches a particle, it splits into several more. Particles eventually turn red and die. The user has to keep splitting the existing particles to increase the complexity of the scene. The field is processed using a fake 'water' effect, allowing particles to generate wave-like trails across the surface. As the scene becomes more complex, incidental 'raindrops' are added, further perturbing the visual surface.
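The drifting, bouncing particles and the cursor 'touch' test can be sketched in a few lines of integer-only code. This is an illustrative reconstruction (the struct and names are my assumptions): fixed point with 8 fractional bits, since the handset has no floating-point unit.

```cpp
#include <cassert>

// A minimal fixed-point particle, advanced once per frame.
struct Particle {
    int x, y;    // position, 24.8 fixed point
    int vx, vy;  // velocity, 24.8 fixed point
};

const int kScreenW = 176 << 8, kScreenH = 208 << 8;

// Move one frame, reflecting the velocity at the screen edges.
void Update(Particle& p) {
    p.x += p.vx; p.y += p.vy;
    if (p.x < 0 || p.x >= kScreenW) { p.vx = -p.vx; p.x += p.vx; }
    if (p.y < 0 || p.y >= kScreenH) { p.vy = -p.vy; p.y += p.vy; }
}

// True if the particle lies inside the box cursor (centre cx,cy in
// pixels, half-size s): the 'touch' test that splits a particle.
bool InCursor(const Particle& p, int cx, int cy, int s) {
    int px = p.x >> 8, py = p.y >> 8;
    return px >= cx - s && px <= cx + s && py >= cy - s && py <= cy + s;
}
```

Each touched particle would then be replaced by several copies with perturbed velocities, and its position fed into the water surface as a disturbance.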

2 − Scatter

This zone is a classic implementation of Conway's Life (cellular automaton) algorithm. This CA uses a very simple set of rules: if an occupied cell has two or three occupied neighbours, it survives to the next generation; if an unoccupied cell has exactly three occupied neighbours, it becomes occupied; otherwise the cell is considered overcrowded or isolated and is removed. Even with such simple rules, very complex behaviour can emerge from a chaotic system. The user's cursor sprays random 'bacteria' data across the surface, scattering potential CA lifeforms.
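The quoted rules are short enough to implement directly. A sketch over a small wrapping grid (illustrative; not the project's source):

```cpp
#include <cassert>
#include <vector>

// One generation of Conway's Life: a live cell with two or three live
// neighbours survives; a dead cell with exactly three live neighbours
// is born; every other cell is dead in the next generation.
typedef std::vector<std::vector<int> > Grid;

Grid LifeStep(const Grid& g) {
    int h = (int)g.size(), w = (int)g[0].size();
    Grid next(h, std::vector<int>(w, 0));
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            int n = 0;                       // count live neighbours
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    if (dx || dy)
                        n += g[(y + dy + h) % h][(x + dx + w) % w];
            next[y][x] = (n == 3 || (n == 2 && g[y][x])) ? 1 : 0;
        }
    return next;
}
```

A horizontal row of three cells (the 'blinker') flips to a vertical row each generation, the simplest demonstration of the oscillating structures the zone produces.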


3 − Burst

In this zone firework-like particles shoot from the base of the screen and arc back to the ground, where they bounce back up again. If the user catches a particle in the cursor box, the firework explodes. After a certain number of explosions the number of fireworks is increased, making the scene more complex. With more fireworks the zone also increases the incidence of random sparks and fire-like forms at the base of the screen. Eventually the zone becomes saturated with colour and moving particles.

4 − Gather

Trailing lines move around the screen, heading vaguely in the direction of the cursor. The longer they spend inside the cursor box, the larger the trails become, also triggering new trails to appear on the screen. The user can gather these trails from one part of the screen to another, changing their size and effect.

5 − Sketch

This zone acts like a semi-automatic spirograph. A looping geometric figure is sketched on the screen, with the cursor position affecting the drawing parameters. This allows a large range of forms to be generated.

6 − Type

Letters fall at different rates from the top of the screen, leaving trails behind them. If a letter is caught within the cursor box, it is added to a word displayed in the center of the screen. Special characters such as delete and space allow the word to be edited further. If the word reaches 15 characters in length it is wiped and the process starts again.


7 − Avoid

In this zone a particle chases the user's cursor. If the user can avoid the particle for long enough, it grows in size and eventually produces another chasing particle. Collision with the cursor reduces the particle's size. Long-term successful avoidance greatly increases the number of pursuing objects, also increasing the visual complexity of the scene with oversaturating trails.

8 − Grow

Skinny plant-like growths move erratically up the screen. On entering the cursor box they split into two separate sprouts and continue on different routes. Repeated splitting causes purple buds to appear at these junctions, and eventually on the growing stalks themselves.

9 − Link

In this zone the user must link white particles together by collecting them one by one. If the user accidentally collects the red particle, the whole chain breaks and must be rebuilt. Each successfully built chain adds another white particle to be connected in the next chain. As more links are needed, the red particle speeds up and the scene becomes more complex.

Next: Results & Links >>>


Results

The final application installer (.sis) weighs in at a modest 27 KB. This is mainly because most of the work is done algorithmically, so there is no need to store large graphics files. The file transfers almost instantly via Bluetooth or infrared, and would not be excessive if received over the air either.

Although the limitations of working with a restrictive platform and OS can be frustrating, and inevitably reduce the range of expression available, I feel that the project has allowed for the development of something that is currently unique for Symbian devices. Mobile platforms are perhaps the most primitive working environment amongst entertainment/gaming devices (when compared to consoles, PCs and even Gameboy-level hardware), and as such initial artistic expectations had to be led as much by what is possible with the technology as by my own ambitions.

When working with new technology and software the learning curve will always be steep. This required the first half of the project to be focussed essentially on establishing a familiarity with the systems involved. Although some artists may have chosen to concentrate on using the technology as a preset medium for the dissemination of artistic work, I decided as a programmer to write 'within' the technology, producing something that actually ran inside the machine. This was a challenging task for many reasons, but I feel the final product justified the attempt.

Working with a technology/industry partner also required certain working practices to be adhered to. iFone provided technical support in terms of testing and development hardware, but as distributors (rather than developers) they could offer little programming advice. Luckily, mobile developers are keen to promote the use and development of their products, and so much information is available online to aid programmers and artists.

In conclusion, I would say that writing for mobile devices is a fascinating area which also leads to a whole new method of distribution and communication. Development in this area is only going to progress at a faster and faster rate, to the point where mobile devices become the most ubiquitous form of technology available for an artist/programmer to work with. There are, however, many restrictions to the medium, but if you bear those in mind you can still create unique work that represents an artistic vision and also pushes the envelope of mobile coding.

Reference Links

www.forum.nokia.com - Nokia developers' site
http://www.flipcode.com/articles/article_mobilegfx01.shtml - Symbian programming
www.newlc.com - Symbian programming resources
www.symbian.com - Symbian OS homepage
http://www.flipcode.com/voxtut - Good article on voxel mapping
http://www.flipcode.com/demomaking - Demo-scene programming techniques
Angelfire Nokia/Symbian page - a comprehensive list of links for both developers and users
