MASTER'S THESIS

Web-based collaboration through screen sharing

Catalina Tejada Pérez

Master of Science in Engineering Technology, Computer Science and Engineering

Luleå University of Technology
Department of Computer Science, Electrical and Space Engineering

Web-based Collaboration through Screen Sharing

Catalina Tejada Pérez

Dept. of Computer Science and Electrical Engineering
Luleå University of Technology

Luleå, Sweden

November, 2011

Supervisors: Adam Bergkvist, Kåre Synnes

ABSTRACT

Nowadays there are plenty of options for connecting users all around the world: video conferencing, audio conferencing, chat applications and more. Nevertheless, most of these require specific software, plug-ins or separate applications to be installed on the user's personal device. This can be avoided by using web-based technologies, so that the user does not have to accept any terms or conditions in order to communicate with the other end. In this area, the main aim of this thesis is to investigate, study and analyse the different approaches available today for sharing the screen, or part of it, natively in the browser, without the need for plug-ins or added software. An application with these characteristics has also been implemented, so that two users can share their screen, or part of it, directly from the browser. The implemented application captures images of the screen (encoded in JPEG format) using GStreamer. It uses the WebSocket protocol to connect both ends, through a server which pushes the data from one user to the other; both users connect with a password that identifies the session. The thesis also compares this implementation with another approach that uses different technology but follows the same idea of not requiring any added software. Finally, in case further work is done or a commercial product is intended, some ideas for improving the application are proposed at the end of the report.

To my family and friends.

ACKNOWLEDGEMENTS

This thesis work has taken place at Ericsson Luleå, from April 2011 to November 2011. I would like to thank all the people who have contributed their effort and time to the carrying out of this thesis. First of all, my supervisor at Ericsson, Adam Bergkvist, for his dedication, time and effort, guiding me and giving advice whenever I needed it. To Kåre Synnes, for encouraging me to take on this challenge, and to Stefan Håkansson for trusting me with this thesis. To everyone at Ericsson who made the work easier for me with their support and company. Finally, I would like to thank everyone who has supported and taught me throughout these years: family and friends. Thank you.

Catalina Tejada

CONTENTS

1 Introduction ...... 8
  1.1 Today's communication ...... 8
  1.2 Problem Statement and Approach ...... 8
  1.3 Thesis Purpose ...... 9
  1.4 Delimitation ...... 9
  1.5 Related Work ...... 10

2 Theory ...... 11
  2.1 Desktop sharing ...... 11
    2.1.1 Protocols ...... 11
    2.1.2 Comparison table of protocols ...... 13
    2.1.3 Web Standards ...... 13
  2.2 GStreamer ...... 14
  2.3 WebKit ...... 14

3 Design Decisions ...... 16
  3.1 Sending images versus sending instructions ...... 16
  3.2 Communication ...... 16
  3.3 Websockets versus XMLHttpRequest ...... 17
  3.4 Image encoding ...... 18

4 Results ...... 19
  4.1 Image provider ...... 19
  4.2 Link images ...... 20
  4.3 Image sender ...... 20
  4.4 Server ...... 20
  4.5 Image painter ...... 21
  4.6 Total ...... 21
    4.6.1 Final implementation ...... 21
    4.6.2 Final application behaviour ...... 23
    4.6.3 Final code diagram ...... 23
  4.7 User application ...... 25
    4.7.1 Use cases ...... 25
    4.7.2 Requirements ...... 26

5 Application Screenshots ...... 28
  5.1 General ...... 28
  5.2 Sharer ...... 28
  5.3 Viewer ...... 30
  5.4 Sharing ...... 31

6 Testing ...... 33
  6.1 Image provider testing ...... 33
  6.2 Image sender testing ...... 33
  6.3 Server testing ...... 34
  6.4 General testing ...... 34

7 Discussion ...... 35
  7.1 General Implementation ...... 35
  7.2 Base64 encoded images and Blob Images ...... 36
  7.3 Image encoding ...... 36
  7.4 Server ...... 36
  7.5 Viewing remote desktop ...... 36
  7.6 Identifying the "Viewer" ...... 37
  7.7 Comparison with video stream Implementation ...... 37
  7.8 Working flow ...... 37
  7.9 Problems encountered ...... 38
  7.10 Future work ...... 38

8 Conclusion ...... 40

References ...... 41

LIST OF FIGURES

1 Remote sharing Desktop [4] ...... 10
2 VNC architecture protocol [10] ...... 12
3 Xpra Protocol ...... 12
4 Three elements linked together [25] ...... 14
5 Different sending possibilities ...... 17
6 Difference between websockets and XHR [31] ...... 17
7 Linked GStreamer pipeline ...... 19
8 WebSocket server handler ...... 21
9 Final implementation image ...... 22
10 Application flow chart ...... 23
11 Code diagram from the final application ...... 24
12 Share the entire desktop Diagram ...... 25
13 Sharing part of the screen Diagram ...... 26
14 Cancel sharing the screen Diagram ...... 26
15 View Diagram ...... 27
16 General view of the application ...... 28
17 Share the entire desktop image ...... 29
18 Share only one rectangle image ...... 30
19 Id message ...... 30
20 Confirmation message ...... 31
21 View remote desktop menu ...... 31
22 Viewer password ...... 32
23 Viewing one fourth of clientA's desktop ...... 32
24 Viewing one half of clientA's screen ...... 33

LIST OF TABLES

1 Comparison between protocols [13] ...... 13


1 Introduction

This section begins with an introduction to today's communication, its trends and differences, and continues with the problem area in general and how this thesis tries to solve part of it. This leads to the purpose and delimitations that characterize the thesis, and the section ends with some related work in the area.

1.1 Today’s communication

Communication plays an important role in everybody's life nowadays. In the last thirty years it has developed enormously: from telephone exchanges to cellphones, from sending postcards to sending e-mails, from owning PDAs to having them communicate with each other. It not only helps in everyday life, but also in staying informed: news about what is happening all over the world arrives instantaneously, as do changes in trends, in the needs of users, and in how markets are doing.

The definition of communication is the exchange of information by speaking, writing, or using some other medium [1]. It can therefore be divided into several different groups: by the type of device used (smartphone, computer, tablet...), the type of channel (satellite, optical fibre, copper line...), the number of transmitters and receivers, and so on. Nevertheless, we will focus on two divisions: the type of communication according to the needs of the users, and according to how it is presented on the users' devices.

If we consider users' needs, people communicate for personal or professional reasons. In the personal area, now more than ever, communication among people from all over the world is important. As travelling becomes cheaper and easier, more and more people live away from their home countries and want to maintain their relationships with family and friends. This communication is usually preferred to be as close to real life as possible and not just telephone calls, as can be seen in the constant growth of users of Skype or GTalk, which both support video conferencing [2] [3]. For companies, on the other hand, due to globalization and their wish to expand their market and reach all types of users in different cities and countries, communication among offices, workers and clients is essential. Most software products focus their market on companies, since companies will pay for the service in order to be guaranteed good quality in their communication with the client (no interruptions, high video and audio quality...). Private users, in contrast, tend to look for other alternatives, perhaps less capable or with fewer features, to save money, as their use of the service is more intermittent.

If we consider how this communication is presented to the user, there are different options too. It can be presented as hardware, such as telephones or computers; as software to be installed on the devices, such as Skype; or as software without any installed application on the computer, using web browsers, such as e-mail or some chat applications (GTalk).

1.2 Problem Statement and Approach

As has been said, there are many ways of making this communication possible: as it has been done for many years, with traditional telephone calls or fax, or as it has been done in recent years using new technologies such as video conferencing, text chat, audio conferencing, or screen sharing. For today's purposes, telephone calls and fax lack many functionalities, although they complement the other technologies very well. On the other hand, video conferencing and audio conferencing, among others, all have one drawback in common: most of them need software to be installed on the final devices. This implies that the user has to accept certain conditions in order to use the application, even if he/she does not fully agree with them (giving an e-mail address, receiving newsletters...). This is the main reason why this thesis focuses on connecting users without any installation or agreement. This leaves the final user freer in his/her decisions when choosing how to share information, widening the range of applications and technologies available. So, looking at the division written above, this thesis focuses on professional use, and on not needing any installation on the devices.

1.3 Thesis Purpose

Moreover, this thesis focuses on screen sharing, which makes sharing information easier and quicker and communication more immediate and fluent. Its purpose is to study, investigate and analyse how to support sharing the screen, or part of it, between two users natively in the browser, with no need for plug-ins or added software on the final devices. That is, to make the experience of sharing information easier, without requiring any agreement between a software company and the client. In the end, one of the studied options, the one that best fits our requirements, is implemented. The application could also be used for personal purposes; nevertheless, it has been decided to tailor it to a work environment, e.g. sharing slides in a meeting. To fulfil this purpose, some research problems have been identified:

• How should the screen be captured?
• Should the captured images be encoded in some specific format (PNG, JPEG...)?
• How should these be rendered on the "Viewer" side?
• How should the "Viewer" be identified by the "Sharer"?
• How should the images be sent?

These questions will have to be investigated, comparing the different options, in order to achieve our goal.

1.4 Delimitation

This thesis will be implemented in WebKit, so supporting other web browsers will not be a matter of study. Encrypting the data will not be considered either: as the application will first be used in a lab environment, point-to-point security is not at risk at this stage. Furthermore, interacting with the sharer's computer will not be considered, only watching what is on its screen. Another delimitation is that the communication will be held from one source to one destination, so it will only connect two users, one sharing and one viewing. Nevertheless, these features could be implemented later and would be good requirements for future work.


Figure 1: Remote sharing Desktop [4]

1.5 Related Work

Ericsson Labs (https://labs.ericsson.com/) began, in the spring of 2010, a set of experimental implementations of Web-RTC (Web Real-Time Communication) in WebKit, using the emerging web standards to support real-time voice and video communication without the need for plug-ins [5]. In this thesis, a modified part of this work is used to compare with our final implementation (see 7.7).


2 Theory

This section introduces the reader to the main technologies used and studied during the thesis, as well as the terminology used in the rest of the report. It begins with the different options that exist for sharing the screen, followed by an explanation of the framework used to take screenshots (GStreamer), and ends with the web browser engine, WebKit.

2.1 Desktop sharing

There are different ways to view a remote desktop: on the one hand, using established protocols, which is the most common approach among software products, and on the other hand, using today's web standards, the WebSocket protocol or XMLHttpRequest. In this thesis both paths are studied, and it is decided which way to take in order to fit all the requirements as well as possible.

2.1.1 Protocols

In this section some new and specific terms will be used repeatedly; these are:

Rootless: programs run under the remote session show up on your desktop as regular programs, instead of being trapped in a box.

Seamless mode: the ability to launch only one application remotely and have it forwarded to your local display [6].

Session shadowing: the ability to view, and optionally interact with, an existing session or local display, so that two users sitting at different computers can view or interact with the same application [7].

There exist many different protocols for sharing the desktop between final users; nevertheless, most of them are proprietary, so very little is known about them, as no information is given by the developers. In this category we have the well-known software application Skype, and other proprietary protocols such as ICA (Independent Computing Architecture), designed by Citrix Systems [8], or ALP (Appliance Link Protocol), developed and distributed by Sun Microsystems [8]. There are, however, some open source protocols, which are the ones we will study in order to compare them with the web standards. The four most important are: RFB, NX, Xpra and RDP.

• RFB (VNC): VNC (Virtual Network Computing) uses the RFB (Remote FrameBuffer) protocol for remote desktop access. This is a very simple protocol: the client sends keyboard and mouse events (event messages), and the server sends back the screen updates. One of the main advantages of this protocol is that it works at the framebuffer level, so it is suitable for all applications and windowing systems (X11, Windows or Macintosh) [9]. Nevertheless, its main drawback is that in its simplest form it uses a lot of bandwidth. As it is an open source protocol, several different software applications that use it can be found; the most important are RealVNC, TightVNC and UltraVNC. None of them support seamless mode.


Figure 2: VNC architecture protocol [10]

• NX: NX is a protocol designed by NoMachine. NX is rootless, and it supports seamless mode, sharing the entire desktop, and session shadowing. The highlights which make NX different are: compression and proxying of the X11 messages, reduction of the round trips to nearly zero, advanced caching methods, and adaptation of the bandwidth in real time according to the network [11] [12]. Thanks to this, it works over slow connections or high-latency networks. To reduce the round trips, NX splits each message into two parts: an identification, which is the "fingerprint" of the message and different every time, and a data part which might be the same in different messages. This data part is stored in cache memory and restored each time there is a match [12].

• Xpra: Xpra (X Persistent Remote Application) is rootless and supports seamless mode, but not session shadowing; sharing the full desktop is allowed but disabled by default [13]. It allows you to run X programs on a remote host and direct their display to your local machine without losing any state [14]. Xpra works by connecting to an Xvfb server (which performs all graphical operations in memory, without showing any screen output) as a compositing manager, but instead of combining the window images to present them on the screen, it takes the window images and sends them through the network to the Xpra client, which then displays them on its local screen [15], as can be seen in Figure 3.

Figure 3: Xpra Protocol

• RDP: RDP (Remote Desktop Protocol) is a proprietary protocol developed by Microsoft. It provides remote display over network connections for Windows-based applications running on a server [16]. It supports full desktop sharing; session shadowing, however, is not supported: when the session is initiated from the client, it is not possible to view or interact on the server (the remote computer), where only a screensaver is shown. Seamless mode is available but limited and not very practical. RDP transfers graphics display information from the remote computer (server) to the user, and transports input commands from the user to the remote computer [17]. The main feature of RDP is its reduced bandwidth usage, achieved by data compression and persistent caching of bitmaps [16].

2.1.2 Comparison table of protocols

Table 1 shows the comparison between the different protocols: Xpra, NX, VNC and RDP.

                                         Xpra    NX      VNC      RDP
Stability                                Low     Medium  High     High
Performance over slow links              Low     High    Medium   Medium
Ability to suspend and resume sessions   Yes     Yes     Yes      Yes
Speed of suspend                         High    Low     Medium   Medium
Seamless mode                            Yes     Yes     Partial  Partial
Full desktop mode                        No      Yes     Yes      Yes
Session shadowing                        No      Yes     Yes      No
Session pre-loading                      Yes     No      Yes      No
Typical server port number               n/a     n/a     5900     3389

Table 1: Comparison between protocols [13]

2.1.3 Web Standards

In the beginning of the World Wide Web, web pages were static and served only as a way of showing data, with the content kept separate from the user. Nowadays this is changing: the web is developing towards a more dynamic way of use, where the interaction between web pages and users stands out and users take part in the content.

HTML5: the markup language used for creating web pages, structuring and presenting their content. This new standard is the evolution of HTML4 and is still under development. It unifies HTML, CSS and JavaScript features, making web pages dynamic, active and powerful. It supports most of the HTML4 elements and introduces some new features: among others, geolocation, integration of SVG (Scalable Vector Graphics), and new elements such as canvas and video.

WebSocket: the WebSocket protocol provides a persistent, bidirectional connection between the client and the server. Once the client is connected to the server, it does not have to re-establish the connection for every new message. This technology eliminates network overhead and allows the connection to remain idle until the client or the server initiates a request [20]. To sum up, once the connection with the server is opened, the client can receive data messages, error messages or closing messages directly in the JavaScript code, which simplifies its use. The protocol consists of an opening handshake followed by basic message framing, layered over TCP [21]. One of the drawbacks is that images cannot be sent as raw binary data but have to be encoded in base64; without this encoding, the image data could interfere with the WebSocket protocol, resulting in corrupted data [20]. Nevertheless, on the receiver side the image can be displayed in base64 without needing to decode it.

XMLHttpRequest (XHR): XMLHttpRequest, like the WebSocket protocol, provides bidirectional communication between the client and the server. It is used to send HTTP or HTTPS requests directly to a web server and load the server response data directly back into the script [22]. Every time the client side wants to receive a message, it has to poll the server for a reply (through an HTTP request).

The main difference between the protocols mentioned before and these web standards is that, with the latter, everything is web-based. No client or server installation is needed on the devices, which simplifies the implementation as well as the final user (client) experience. The main software product which partially uses these technologies is ThinVNC, although it also needs a client to be installed on one of the devices [23].

2.2 GStreamer

GStreamer is a multimedia framework based on pipelines, used to create multimedia streaming applications such as video editors or media players [24]. The framework is based on plug-ins (filters, codecs, sources...) linked together in a pipeline, which defines the flow of data for a particular purpose [25]. In the pipeline, each element has its own purpose: receiving the data, processing it, and passing it to the next element. At the beginning of the pipeline there has to be a source, an element which generates data to be processed in the pipeline (reading from a file, taking screenshots...). At the end of the pipeline there has to be a sink, which only accepts data but does not produce it (writing to a file, sound card playback...). In between, there can be different elements that operate on the data, such as a video scaler (converter) or a volume element (filter), which are linked together in the pipeline. Every in-between element has two pads, one on which it receives data and one on which it sends it. For good communication between elements, both pads have to be connected and be compatible. This process of linking two pads and checking whether they are compatible is called "caps negotiation". If they are not compatible, the data will not flow and the pipeline will be broken.

Figure 4: Three elements linked together [25]
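As a minimal illustration of linked elements and caps negotiation (the thesis code itself lives inside WebKit and is not reproduced here), the following Python sketch uses the GStreamer 0.10 bindings to build a three-element pipeline like the one in Figure 4; the element choice and names are illustrative, not taken from the thesis.

# Minimal GStreamer 0.10 sketch (pygst): a source, a filter and a sink
# linked into one pipeline, as in Figure 4. Names are illustrative.
import gobject
import pygst
pygst.require("0.10")
import gst

pipeline = gst.Pipeline("example")

source = gst.element_factory_make("videotestsrc", "source")    # produces data
filt = gst.element_factory_make("ffmpegcolorspace", "filter")  # operates on data
sink = gst.element_factory_make("fakesink", "sink")            # only consumes data

pipeline.add(source, filt, sink)

# Linking the pads triggers caps negotiation; if two neighbouring pads cannot
# agree on a common format, the link fails and no data flows.
gst.element_link_many(source, filt, sink)

pipeline.set_state(gst.STATE_PLAYING)
loop = gobject.MainLoop()
try:
    loop.run()
except KeyboardInterrupt:
    pipeline.set_state(gst.STATE_NULL)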

2.3 WebKit

WebKit is an open source web browser engine [26]. It is used in some new web browsers such as Safari, Chrome, Chromium, or the Android web browser [27] [28].


In this thesis a modified WebKit library is used, the one used at Ericsson Labs for their work on Web Real-Time Communication (the GTK browser).


3 Design Decisions

Throughout the Master's thesis there have been times when decisions had to be made in order to continue the work, choosing from several options which would determine the final application. In this section these decisions are presented and motivated, although the final discussion is written in section 7.

3.1 Sending images versus sending instructions

When sharing images (in this case, the desktop) between two devices, there are two different ways of doing it: one is sending the whole image (or part of it), and the other is sending the instructions to draw the image directly on the remote computer. After studying the protocols used for desktop sharing, we realized that most of them send the image instead of the instructions to draw it. Of the protocols studied, only RDP has an extension which encodes the drawing operations that produce an image, instead of encoding the actual image [29]. The main advantage of sending instructions is the bandwidth reduction; on the other hand, the complexity is higher. Moreover, sending an image (or part of it) is safer: if something is lost, only a small part of the image will not be shown, whereas with instructions, if one is lost, no image at all may be shown on the client. In this Master's thesis, the images are sent, for the reasons written above.

3.2 Communication

At first, the only option considered for connecting the two users was the established protocols. They were all studied in depth, investigating how they worked and their differences and similarities (see subsections 2.1.1 and 2.1.2). It was decided to use the VNC protocol, as it fulfilled the requirements best and is much better documented. Once the implementation began, and after creating a server and a client and allowing communication between them (see section 7.9), we had to think about how they would communicate over the network. At that point, we realized that it was not so straightforward. It was also unclear how we would identify the "Viewer", as, until then, both users were held on the same device. Moreover, having to run both a client and a server on every device (as each could share or view) complicated the situation, since one of the main requirements is that the web application should be an extension of the browser (and not a plug-in or a separate application). We began to wonder whether there was any way of varying this protocol, or adapting it to our purpose, but in the end we thought it would be better to change our point of view and take other possibilities into account. It was then that we widened our options for connecting the two ends, introducing the WebSocket protocol and XMLHttpRequest as possibilities. To sum up, figure 5 shows all the options which have been studied for sending the screenshots.


Figure 5: Different sending possibilities

3.3 Websockets versus XMLHttpRequest

Once using the traditional protocols was dismissed, it had to be decided which of the other possibilities to use. A study of both options was made; they turned out to be very similar, although some differences set them far apart. XMLHttpRequest (XHR) and WebSockets are both bidirectional web-based communications between the client and the server. Nevertheless, WebSockets are much quicker when sending data and use less bandwidth because, unlike XHR, no headers are exchanged once the single connection has been established [30]. Moreover, WebSockets do not need to poll the server for a reply: the client receives it without any request, unlike XHR, which has to ask the server for new messages every time one is needed. The latter configuration might be sufficient for static web pages, but it can limit today's dynamic web applications. For these reasons, it was decided in the end to use the WebSocket protocol to connect both clients.

Figure 6: Difference between websockets and XHR [31]


3.4 Image encoding

When implementing the pipeline for taking the screenshots in GStreamer, the question arose of which type of encoding should be used. The two considered were PNG (Portable Network Graphics) and JPEG (Joint Photographic Experts Group). Although JPEG uses lossy compression (it compresses the data by discarding part of it), in the end it was used instead of PNG. This was decided because, at the time the pipeline had to be implemented in GStreamer, some errors appeared that forced us to rethink and restructure our first plans.


4 Results

Once all these decisions were made and the investigation and study concluded, the implementation of the final application began. In general terms: the screenshots are taken with GStreamer; these images are then sent over the WebSocket protocol to a server; the server pushes the data to the other client, where it is displayed as an image directly from the JavaScript code. For the connection between the two users, a WebSocket handler written in Python is used. For testing this part, a standalone WebSocket server, pywebsocket, has been used, which is provided by Google for testing and experimental purposes [32]. In the rest of this section, every part of the project is explained in detail.

4.1 Image provider

As mentioned before, the image provider in this case is GStreamer. It has been implemented using the GStreamer Application Development Manual as the main reference [33]. In general, its functionality is to take a screenshot of the desktop every second and save it in a buffer. For this, it first takes the screenshot every second, then encodes the image in JPEG format, and finally emits a signal when data is available so that we can pull out the data and save it in a buffer. Figure 7 shows all the plug-ins used to implement the image provider.

Figure 7: Linked GStreamer pipeline

The pipeline for taking the images consists of:

ximagesrc: the source. It creates a screenshot video stream [34].

videorate: adjusts the timestamps on video frames between its source and its sink to produce a perfect stream [35].

video/x-raw-rgb: as explained in 2.2, caps negotiation is the process where elements configure themselves and each other to be compatible in their formats [36]. This caps filter indicates that the data will be raw video in the RGB colour space. It is here that we configure the frame rate, to tell videorate what to adjust it to.

jpegencoder: encodes the image in JPEG format.

appsink: the sink. It is responsible for emitting a signal when there is data available, so that another method is called which saves the data in a buffer.
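For illustration only, the sketch below builds an equivalent pipeline with the GStreamer 0.10 Python bindings; the actual image provider is integrated into WebKit rather than written like this, jpegenc is the standard element name for the "jpegencoder" step above, and the ffmpegcolorspace element is an assumption added here in case the JPEG encoder does not accept RGB input directly.

# Sketch of the section 4.1 pipeline with the GStreamer 0.10 Python bindings.
# Not the thesis code: the real image provider is compiled into WebKit.
import gobject
import pygst
pygst.require("0.10")
import gst

gobject.threads_init()
frames = []  # stand-in for the buffer where the screenshots are stored


def on_new_buffer(appsink):
    # Emitted by appsink once per encoded JPEG frame (here, one per second).
    buf = appsink.emit("pull-buffer")
    frames.append(buf.data)


pipeline = gst.parse_launch(
    "ximagesrc ! videorate "
    "! video/x-raw-rgb,framerate=1/1 "   # one screenshot per second
    "! ffmpegcolorspace ! jpegenc "      # colourspace conversion is an assumption
    "! appsink name=sink emit-signals=true")

sink = pipeline.get_by_name("sink")
sink.connect("new-buffer", on_new_buffer)

pipeline.set_state(gst.STATE_PLAYING)
gobject.MainLoop().run()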


4.2 Link images

Once the pipeline was set up, the code was introduced into the WebKit environment used during this thesis. When the method was called, it provided the image data every second. There were two different ways of handing these images to JavaScript: in the first, the images were strings already encoded in base64; in the second, the images were given as binary data (Blob). As WebSockets support both ways of sending data, both were tested in order to compare them and decide which to choose. If we decide to send a Blob, there are two possible approaches: the first is to send the image in its binary form, as a Blob image; the second is to create a Blob and append the base64 encoded image to it. In the latter case, on the receiver side, the JavaScript client only has to read what is inside the Blob and paint it directly into the image tag. These three options were studied and analysed; nevertheless, when sending binary data, the image could not be displayed on client B's side. After investigating and testing, it was discovered that the GTK browser used for the implementation of this thesis did not support binary data. Although other WebKit browsers (Chrome, for example) support it, it is not yet implemented in the GTK port, as sending binary data is a very new feature. So, the only working option available for sending the data was base64 encoding, with the problems it causes (see subsection 7.2). Nevertheless, it should not be forgotten that sending binary data with large images worked well when tested in Chrome, so if this limitation in the GTK port is solved in the future, it would be better to send binary messages.
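To illustrate the string option that was finally used, the following sketch encodes one JPEG frame in base64 so that it can travel as a WebSocket text message; the file name is made up, and the real encoding happens inside the WebKit/JavaScript code rather than in a separate script.

# Base64-encoding a JPEG frame for a WebSocket text message (illustrative
# sketch; 'frame.jpg' is a made-up file standing in for a pulled buffer).
import base64

with open("frame.jpg", "rb") as f:
    jpeg_bytes = f.read()

b64_text = base64.b64encode(jpeg_bytes)

# Base64 produces 4 output bytes for every 3 input bytes, so the message is
# roughly a third larger than the raw frame (cf. the overhead discussed in 7.2).
print("raw size: %d bytes" % len(jpeg_bytes))
print("b64 size: %d bytes" % len(b64_text))

# On the viewer side the string can be used directly, e.g. as a data URI
# ("data:image/jpeg;base64," + the string) assigned to an image element.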

4.3 Image sender

When client A wants to share the desktop, he/she introduces a password, which is the identification that allows the communication between the two users. When this id is sent, it is saved on the server. On the other side, when client B wants to view the remote screen, he/she has to introduce the same password as client A, to allow the server to identify both ends and connect them. Before sending any images, the browser asks the user for permission; for security reasons, it never takes screenshots or sends them without it. When the user accepts, the WebSocket connection to the server is opened (sending the id). Once the "Viewer" is connected, it begins to receive the images through the WebSocket connection.
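The message sequence of the "Sharer" (first the password, then one frame per second) can be sketched as follows. The real sender is JavaScript running in the browser, so this Python sketch, based on the third-party websocket-client package, is only an illustration; the URL, path and password are made up.

# Illustrative sender: connect, identify with the shared password, then push
# base64 frames once per second. Not the thesis's client code.
import base64
import time
from websocket import create_connection

ws = create_connection("ws://localhost:8880/share")   # made-up server address
ws.send("my-secret-id")                               # first message: the password

try:
    for _ in range(10):                               # pretend we share ten frames
        with open("frame.jpg", "rb") as f:            # stand-in for a screenshot
            ws.send(base64.b64encode(f.read()).decode("ascii"))
        time.sleep(1)                                 # one frame per second
finally:
    ws.close()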

4.4 Server

For the server part, the standalone server pywebsocket, provided by Google, has been used together with a WebSocket handler written in Python. In the end, this was implemented with a Python dictionary, with every id identifying a request. Every time there is an incoming request (identified by an id), the server looks for a match among the ids already saved in the dictionary. If there is a match, it connects both ends; if not, it adds the id with the request to the dictionary.
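Figure 8 shows the handler that was used; the sketch below illustrates the same dictionary idea for the mod_pywebsocket standalone server. The file, dictionary and variable names are made up, and the locking and clean-up of closed connections that a real handler needs are left out.

# share_wsh.py -- minimal sketch of a pywebsocket push handler built around
# the id dictionary described above. Not the thesis's actual handler.

_waiting = {}   # id (password) -> request of the end that connected first
_partner = {}   # request -> request of its paired peer


def web_socket_do_extra_handshake(request):
    # Accept every handshake; no origin or resource checks in this sketch.
    pass


def web_socket_transfer_data(request):
    # The first message on every connection is the identification password.
    peer_id = request.ws_stream.receive_message()
    if peer_id is None:
        return

    if peer_id in _waiting:
        # Second end with the same id: connect both ends.
        other = _waiting.pop(peer_id)
        _partner[other] = request
        _partner[request] = other
    else:
        # First end with this id: save the request and wait for a match.
        _waiting[peer_id] = request

    # Push every incoming message to the paired peer, if there is one.
    while True:
        message = request.ws_stream.receive_message()
        if message is None:
            return
        peer = _partner.get(request)
        if peer is not None:
            peer.ws_stream.send_message(message)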


Figure 8: WebSocket server handler

4.5 Image painter

Once client B has decided to view the remote screen, introduced the password, and the connection has been made, he/she begins to receive the data from the server. As written in 2.1.3, no polling is needed to receive it, so every time there is an incoming message, the onmessage function inside the JavaScript code is called. There, the image is painted as a normal HTML image, as JavaScript supports painting an image encoded in base64.

4.6 Total

4.6.1 Final implementation

Putting all the modules together creates the final application. As can be seen in figure 9, the server only connects those ends which have the same id.

Figure 9: Final implementation image


4.6.2 Final application behaviour

The following flow chart explains the application behaviour step by step, so that it can be followed easily. Orange represents the behaviour of the sharer part and yellow represents the viewer part.

Figure 10: Application flow chart

4.6.3 Final code diagram

Figure 11 shows a simple code diagram, with straightforward images, to make the code easy to follow. As in the previous image, orange corresponds to the sharer part, yellow to the viewer part, and blue to both.


Figure 11: Code diagram from the final application


4.7 User application

This subsection presents the different use cases and requirements which motivated the final layout and behaviour of the application. They were thought out and written at the beginning of the thesis, but they have undergone changes and have been adapted to fit the new delimitations that appeared.

4.7.1 Use cases

For the use cases shown below there are two different actors: client A, whose desktop will be shared (the "Sharer"), and client B, who will view the remote desktop (the "Viewer").

• Sharing the whole desktop

Actor: "Sharer". When the "Sharer" wants to share the entire desktop, he/she selects the corresponding option. A first confirmation dialog appears, which the user accepts. Then, the "Sharer" has to introduce a password, which is the identification used to connect both clients. When the password has been introduced and accepted, a last confirmation message pops up, which the user has to accept in order to share the screen. Once the connection with the "Viewer" is made, an icon appears indicating that the screen is being shared.

Figure 12: Share the entire desktop Diagram

• Sharing part of the screen

Actor: "Sharer". If the user prefers to share only one part of the screen, he/she selects this option in the main menu. Afterwards, the user has to select the portion of the screen to be shared, or introduce its coordinates. Then, after introducing a password, a confirmation message pops up, summarizing his/her choice. The user has to accept this message in order to begin sharing the screen. Once the connection is made, the sharing icon appears and the application begins to share the selected rectangle.

• Cancel


Figure 13: Sharing part of the screen Diagram

Actor: "Sharer" or "Viewer". At any time during the process of sharing the screen, either of the users can cancel the communication and stop sharing/viewing the desktop by clicking the "Stop Sharing" button. If this button is clicked, the screen sharing communication stops immediately, and the sharing indicator disappears.

Figure 14: Cancel sharing the screen Diagram

• View

Actor: "Viewer". Every time the "Sharer" wants to share the desktop, he/she introduces a password, as this is the identification used for making the connection. The "Viewer" first has to select in the menu that he/she wants to view the remote screen, and then introduce the same password as the "Sharer" did. Once this is done, the connection is made and the screen is shared.

4.7.2 Requirements

As time goes by, communication develops faster than people can get used to it, so an important feature is to create an application that is easy and friendly to use. As the main purpose of this application is use in a working environment, high video quality is not necessary, only enough for sharing slides, that is, a frame rate of about 1 fps (one frame per second).


Figure 15: View Diagram

On the other hand, good quality when sharing text (letters and words) is required. Security plays a very important role in this application: the system should not share the user's desktop without permission, and no one should be able to enter or interact with the user's desktop without permission either. File sharing is not a requirement in itself: other applications already do this, and the users can use them at the same time as they share their screen.


5 Application Screenshots

This section shows the main screenshots of the application, as well as the steps the user has to follow in order to share or view the screen.

5.1 General

The final layout of the application was not made in one go. First, a draft was made on paper with the first ideas; then this was implemented and, little by little, changed to fit the new requirements. An important requirement that is fulfilled is that the application is very user friendly: easy and efficient.

Figure 16: General view of the application

Figure 16 shows the general view of the application. Although some of the features are not available at this stage (the "Options" and "Share application" menus), it was considered a good idea to introduce them in order to facilitate future work. Three different groups of buttons can be seen: the first is for the "Sharer", the second for the "Viewer", and the last one for both.

5.2 Sharer

The user can choose between sharing the entire desktop, sharing one rectangle, or sharing one application (although at this point only the first two are available). If the first option is selected, a menu appears on the right asking the user for approval (figure 17).


Figure 17: Share the entire desktop image

If only one rectangle is to be shared, the user has to choose the corresponding option, and another menu appears on the right with three options (figure 18). The first one, not implemented, would be to choose the rectangle to share (from a smaller preview of the screen, for example). The second is to enter the coordinates, and the last one is to select one of several predefined regions. This last option was added to make it easier for the user to choose which part of the screen to share without needing the coordinates, which can be less friendly. The interval for taking the images is one second by default; nevertheless, it is possible to change it and let the user decide. After selecting the coordinates or the desired region, a window appears asking for the password, as shown in figure 19. Once it is introduced, another window appears (figure 20), confirming to the user the coordinates selected and the password introduced, and asking for permission to begin sharing the screen.


Figure 18: Share only one rectangle image

Figure 19: Id message

5.3 Viewer

From the "Viewer" point of view, the steps to end up viewing the remote desktop are the same: selecting the corresponding button from the menu and introducing the same password as the "Sharer" did.


Figure 20: Confirmation message

Figure 21: View remote desktop menu

5.4 Sharing

If both passwords are the same, the communication takes place, and the "Viewer" receives the images of the "Sharer"'s desktop. From the beginning, a "Stop Sharing" button appears, which is visible at all times. If it is pressed, the WebSocket is closed and the communication suspended. Figures 23 and 24 show how the image is displayed on the "Viewer" side when one fourth or one half of the desktop is shared.


Figure 22: Viewer password

Figure 23: Viewing one fourth of clientA’s desktop


Figure 24: Viewing one half of clientA’s screen

6 Testing

Before all the modules were joined in the final implementation, they were tested separately, because in case there was an error it could be narrowed down easily and, therefore, solved. This section explains how these tests took place.

6.1 Image provider testing

For the GStreamer part, a pipeline was first implemented in which the images were saved to disk, so that it could be checked whether the pipeline was well structured. To test whether it took the screenshots every second (as configured in the pipeline), a simple counter was implemented which printed one number every second. Looking at the saved images, we could see that the pipeline was configured correctly, so the only thing remaining was to change the last plug-in, from saving the screenshots to disk to saving them in a buffer.
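Such a disk-based test pipeline can be sketched as follows with the GStreamer 0.10 bindings, writing one numbered JPEG file per second so the output can be inspected by hand; the file name pattern is chosen here only for illustration and this is not the thesis's test code.

# Sketch of the disk-based test pipeline (GStreamer 0.10).
import gobject
import pygst
pygst.require("0.10")
import gst

pipeline = gst.parse_launch(
    "ximagesrc ! videorate ! video/x-raw-rgb,framerate=1/1 "
    "! ffmpegcolorspace ! jpegenc "
    "! multifilesink location=screenshot-%05d.jpg")
pipeline.set_state(gst.STATE_PLAYING)
gobject.MainLoop().run()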

6.2 Image sender testing

When it was decided to use WebSockets for sending the images, an echo server was first used to learn how they worked, sending plain text messages at first and then images already encoded in base64. As mentioned in 4.2, sending the images as binary data was also tested. This was tested in Chrome with different image sizes, up to 1.4 MB, which is much more than the size of a whole screen image. It worked in Chrome, so it was assumed that it would also work in the GTK browser; but it did not, as explained in 4.2.


6.3 Server testing

When this part was finished, the stage of implementing the desired WebSocket handler began. We wanted it to push the data from the source to the destination. For this, a very simple web page was created which had only some buttons to connect to the server and open the WebSocket, introduce the password and send written messages. When both clients could communicate with text messages, sending different images was tested. For this, a fake image provider was created, simulating GStreamer: a three-image array with pre-selected, saved images, which changed every second, simulating the GStreamer behaviour. These images were taken from the array and sent through the WebSocket protocol to the server, which pushed the data to the other client, which received the images, changing every second, in the simple web page created before.

6.4 General testing

The next step was to introduce the previous communication part between the server and the client into the main application. At this point, we had the image provider (GStreamer) with the screenshots taken every second on one hand, and the communication between the two users on the other, so the only remaining thing was to link both. Once everything was linked, another small and simple application was created to test sending the images through the WebSocket connection: it took a small rectangle from the screen and sent it through the WebSocket connection. As stated in subsection 4.2, two ways of sending the images were studied: as binary data or as string data. Both were tested with the same application, first with the echo server, and then with the push WebSocket handler created for the final application.


7 Discussion

This section discusses the technology and methods used, looking at their advantages and drawbacks, and comparing the result with what could have been obtained with the other available technologies. Our final implementation is also compared with another way of sharing the screen, using a modified application already implemented at Ericsson Labs. The section ends with suggestions and ideas for future work in this area, as well as the problems encountered while carrying out the thesis.

7.1 General Implementation

As written before (section 4), the final implementation was done using web-based technologies: the WebSocket protocol and HTML5. This approach has its advantages and drawbacks; nevertheless, it was considered the most suitable for our purpose. One of its main advantages is that, as HTML5 is a new and emerging technology, it has a long future ahead, and further work can be done without the risk of it becoming deprecated.

As written in subsection 3.1, in this thesis it was decided to send the images instead of the instructions to paint them. Although the reasons have been stated before, it has to be said that, as an image is always sent (every second) regardless of whether anything on it has changed, a lot of useless data may be sent. This could be avoided if instructions were sent instead of the images themselves, simply indicating that nothing needs to be repainted when there is no change. Although in general terms a lot of bandwidth could be saved this way, we only send one image per second, so we are not using much bandwidth anyway, and the effort would not pay off in this particular case. Nevertheless, it can be proposed as future work (a sketch of this idea is given at the end of this subsection).

The frame rate initially chosen in GStreamer for taking the snapshots was one frame per second. This is because our application is intended for sharing slides, so a higher frame rate is not needed and would considerably increase the bandwidth used. Nevertheless, it is possible to change this frame rate and let the user decide (in the same menu as choosing the rectangle to share); one frame per second is simply the default.

WebSockets are layered over the TCP protocol, and this has its drawbacks and advantages. The main advantage is that TCP is connection-oriented, which means it guarantees that no data is lost during the communication. The main drawback is that it only supports one-to-one communication; to support more destinations the UDP protocol would be needed, which is not supported by WebSockets. In our case, as the purpose is to connect only two users, WebSockets work well; nevertheless, this should be studied in future work in case more people want to view the remote screen.

Another advantage of our application is that, as it is web-based, the user is not tied to any software product and does not have to accept any conditions, such as receiving newsletters, being inserted in a database or becoming a member of a group. He/she just has to click the right buttons and, under his/her own responsibility, the communication will be held.


Moreover, being web-based follows the current trend, as can be seen in several web applications, such as GTalk, which makes it possible to chat or call without installing any software. For the final user this means that the communication is easier and more comfortable, and sharing information more natural, immediate and clean.
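As a sketch of the "skip unchanged frames" idea mentioned earlier in this subsection (proposed as future work, not part of the implemented application), each captured frame could be hashed and only sent when the hash differs from the previous one; send_frame below is a placeholder for the real WebSocket send.

# Sketch of the frame-change optimisation proposed above (future work, not
# implemented in the thesis). send_frame() is a placeholder for the real send.
import hashlib

_last_digest = None


def maybe_send(frame_bytes, send_frame):
    """Send the frame only if it differs from the previously sent one."""
    global _last_digest
    digest = hashlib.md5(frame_bytes).hexdigest()
    if digest == _last_digest:
        return False          # identical screenshot: skip it, save bandwidth
    _last_digest = digest
    send_frame(frame_bytes)
    return True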

7.2 Base64 encoded images and Blob Images

In subsection 4.2 it was said that there were two options for sending the images: as binary data (Blob) or as string data (images encoded in base64). In the end, sending the images encoded in base64, as string data, was implemented, despite its drawbacks. Base64 encoding creates more data (25%-30% more) than Blobs do, so the delay when sending the images is bigger; the bigger the image, the bigger the delay. For small images this delay is not very large, approximately one second, but when sharing the whole desktop it can be huge and far from optimal (up to 20 seconds). This is due to the pywebsocket server, which is not able to handle so much data in such a short time (see 7.4). When a large image is sent, the data that has not been sent yet is stored in a buffer, waiting. If we try to send large images every second, they queue up in the buffer, making it bigger, until they are sent. So, the larger the images, the larger the delay. This was avoided by taking and sending an image only when the buffer is empty. In this case, for sharing the whole desktop (the worst case), only about one image in ten was taken and sent, no matter what interval the user chose in the menu. For small images, on the other hand, all the images are sent with little delay between the sender and the receiver.

7.3 Image encoding

GStreamer supports both JPEG and PNG encoding. Although at the beginning the idea was to use PNG encoding, as it does not lose any data, we then realized that for our purpose this was not so important. The thesis is focused on work purposes, and in particular on sharing slides, that is, large images and large letters, and comparing the final images with PNG and JPEG encoding showed that losing part of the data did not noticeably affect the images. Nevertheless, if the purpose of the thesis changed, to, for example, sharing maps, this choice should probably be changed or, at least, reconsidered.
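The trade-off between the two encodings can be checked with a small script like the following (Python Imaging Library / Pillow; the input file name is made up), which encodes the same screenshot both ways so that the resulting sizes, and the visual result if the buffers are written to disk, can be compared.

# Rough JPEG-vs-PNG check for a screenshot (illustrative sketch;
# 'screenshot.png' is a made-up input file).
import io
from PIL import Image

img = Image.open("screenshot.png").convert("RGB")

jpeg_buf = io.BytesIO()
img.save(jpeg_buf, format="JPEG", quality=85)   # lossy encoding

png_buf = io.BytesIO()
img.save(png_buf, format="PNG")                 # lossless encoding

print("JPEG: %d bytes" % len(jpeg_buf.getvalue()))
print("PNG:  %d bytes" % len(png_buf.getvalue()))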

7.4 Server

As said before in subsection 4.4, pywebsocket was used for the server part. After studying why there was such a big delay when sending big images, we realized that the problem was in the server. This standalone server provided by Google is meant for experimental purposes, so it is probably not designed to handle so much data. The throughput of the server is much lower than we expected, so if another WebSocket server were used, the general performance of the application would probably improve. Nevertheless, this has not been studied in this thesis, and should be the first thing to study in further work.

7.5 Viewing remote desktop

One of the main advantages of this application is that sharing the desktop and viewing it are available in the same application. So, if both roles are swapped (viewer and sharer), no added complication is introduced for the users.


Nevertheless, in that case, the users have to start from the beginning and introduce a password again to open a new WebSocket connection. This can be seen as a drawback; nevertheless, it was designed this way for security reasons, as wanting to view a remote desktop does not mean that the user wants to share his/her own, so another confirmation is needed.

7.6 Identifying the “Viewer”

Once it was decided to use WebSockets, and that there had to be a server to push the data from one client to another, we had to think about how both ends could be identified with each other. The idea of introducing a password turned up quickly, although at that point we did not know how it would be implemented (see 4.4 for how it was finally done). Before this decision, however, connecting both ends by having each know the IP address of the other device was considered. This option was dismissed as it would not be very user friendly: normally the final user does not know his/her IP address, and looking for it is not very comfortable for non-experienced computer users. With the implementation finally chosen, anyone, even without any computer skills, can share their screen and communicate with the other end.

7.7 Comparison with video stream Implementation

Ericsson Labs has been experimenting with video and audio real-time communication in WebKit. It is a modified library which supports audio and video communication within the web browser with no need for any plug-ins. It works by taking the image from the camera and the voice from the microphone and sending both through a peer connection to the other end. For this experiment, part of this code has been modified to work in a similar way, but taking the images from the screen instead of from the camera. First of all, a server is needed to send the signalling messages so that both users can identify the other end. For this, the same idea as in our application is used: a WebSocket server and a password, which identifies both ends. Once the peer connection is established, the video stream is sent through it to the other user. Once client B receives the stream, it displays it using the HTML5 video element.

7.8 Working flow

Defining the working methodology has been essential for carrying out the Master's thesis, as it is fundamental in any project to have everything organized with specific objectives and milestones. The work has been separated into stages, each one with different objectives, to create continuity in the work done. Although in the middle of the thesis the path taken to send the images was changed (from the VNC protocol to WebSockets), the working flow has been the same in both cases. Firstly, research was done on the different options for sharing the desktop over the network (see 2.1.1). They were analysed and compared in order to select, in the end, the VNC protocol, which was the one that best fit our requirements for the application. Then the implementation began, with the VNC server and client, and when we realised that this was not the path to follow, we started from the beginning, looking for other options to connect both ends (see 3.2). Once we had widened our options for connecting the two users, it was decided to take the web standards as the means of communication. Both the WebSocket protocol and XMLHttpRequest were studied and analysed, and in the end WebSockets were chosen (see 3.3). Once the theoretical study of the Master's thesis was finished, the implementation phase began. First, the image provider was implemented, using GStreamer as the main framework. Once it worked as desired, the main HTML application page was implemented. In the beginning only the layout was done, that is, a static page which did not do anything; then the WebSocket protocol was introduced and tested with an echo server. The next stage was to create a WebSocket handler on the server side which acted as we wanted, that is, pushing the image data from one client to another. The last phase was to finish the main application so that it fulfilled all the requirements.

7.9 Problems encountered

The main problem that appeared during this thesis was deciding which way to choose for sharing the desktop. As written in subsection 3.2, the first option chosen was the VNC protocol, as, compared with the rest of the protocols, it was the one that best fit our requirements. A server and a client were created and made to communicate, first with simple text messages and then by sending images from the server to the client. This image was a static one, first in BMP (Bitmap Image File) format, studying the specific library for this type of image. Then, as BMP images were considered too big to send, using PNG instead was considered. Its library was then studied in order to create, in the end, a server which could take the image and send it through the VNC protocol to the client, which was able to display it. Nevertheless, this implementation was dismissed for the reasons stated in subsection 3.2. Another problem was encountered when testing how to send the screen images. When sending the images as a string (base64 encoded), the delay got bigger as the image got bigger. When trying to solve this problem by sending the images as binary data, we found that client B could not display them. In the beginning it was thought to be a server problem. As the GTK port and Chrome both build on WebKit and share the same WebCore code, it was difficult to realize that the problem was inside the GTK port. Nevertheless, when we found it, we could not do much to solve it, so we had to adapt our application and try to reduce the delay as much as possible. Moreover, finding out that the server was not able to handle that amount of data was not easy either: we thought the problem lay in the images sent, but it is actually a combination of both problems. Nevertheless, once the problems are determined and delimited, it is easier to find a solution (proposed in the next section).

7.10 Future work

The idea behind this thesis allows a wide range of implementations; however, only a small part has been put into practice here. It should not be forgotten that this thesis is research work, and a real commercial product cannot be implemented in six months. For this reason, some ideas are presented below to improve the application in future work, in case this topic is continued.


The main and most important improvement would be the WebSocket server: with a server that can handle large amounts of data, the application would improve considerably. Another important improvement would be support for sending binary data; once the GTK port supports it, the images could be sent faster, the delay would be much smaller, and the user experience would improve greatly. These two characteristics are the most important ones for improving this application.

As future work, I would suggest improving the features of the application, such as being able to share only one application instead of the entire desktop. Since capturing the desktop image is done in a different layer from sending it, only the capture part would have to be changed, leaving the sending part as it is. Another improvement concerns the user interface: instead of opening a new web page (with the different sharing options) every time the desktop is to be shared, a semi-transparent overlay could appear in front of the user's working environment (e.g. Gmail), disappearing when the sharing begins and leaving only a small button to stop sharing. This improvement has nothing to do with the sending or capturing parts, only with the layout of the application. As written before, allowing the “Viewer” to interact with the “Sharer's” desktop would be another improvement; nevertheless, this would increase the security risk, so guaranteeing security would become a major requirement. Another good improvement would be to support other web browsers, to expand the market and reach more end users. Not all web browsers currently support the WebSocket protocol or HTML5, as they are still under development, but these technologies are expected to mature and eventually be supported by the rest, since they greatly improve the user experience.


Conclusion

It has been possible to achieve the goal of implementing a web-based application, with no need for plug-ins, that supports sharing the screen between two users. Although the final result is not what we expected, it fulfils all the requirements. The application performs best when small rectangles are shared; for large images the delay between the two users grows, making the application less useful than first intended. As has been said, this could probably be solved by changing the WebSocket server, as it is the main cause of the delay. If that solution is not adopted, the peer connection implementation would be recommended for sharing the entire desktop. In fact, the best implementation would be to unify both of the compared approaches, making the final application not only effective but also efficient. Some future work to improve the current application is proposed in the previous section. There is much that could still be worked on, as this application is based on new standards that are only now emerging.

REFERENCES

[1] Definition of communication. Oxford Dictionary. http://english.oxforddictionaries.com/definition/communication (19th October 2011)

[2] Skype growing chart (until Q4 2009). http://gigaom.com/2010/04/20/skype-q4-2009-number/ (18th October 2011)

[3] “Skype's changing traffic growth”. Financial Times. 10th May 2011. http://www.ft.com/cms/s/2/e858ad1c-7b1f-11e0-9b06-00144feabdc0,dwp_uuid=9a36c1aa-3016-11da-ba9f-00000e2511c8.html#axzz1LtCJH6ph

[4] http://webtoolsandtips.com/wp-content/uploads/2010/01/free-remote-desktop-sharing-application.png (27th April 2011)

[5] Web Real-Time Communication. Ericsson Labs. https://labs.ericsson.com/apis/web-real-time-communication/ (19th October 2011)

[6] Seamless mode. http://winswitch.org/documentation/seamless.html (18th April 2011)

[7] Shadowing. http://winswitch.org/documentation/shadow.html (18th April 2011)

[8] Comparison of remote desktop software. Wikipedia. http://en.wikipedia.org/wiki/Comparison_of_remote_desktop_software (15th April 2011)

[9] The RFB Protocol. Tristan Richardson. RealVNC Ltd. Version 3.8. November 2010.

[10] Virtual Network Computing. Tristan Richardson, Quentin Stafford-Fraser, Kenneth R. Wood and Andy Hopper. IEEE Internet Computing, Volume 2, Number 1. January/February 1998.

[11] Introduction to NX Technology. http://www.nomachine.com/documents/intr-technology.php (19th April 2011)

[12] The arrival of NX. Part 1. Kurt Pfeifle. 29th July 2005. http://www.linuxjournal.com/article/8477?page=0,2 (19th April 2011)

[13] Protocol Comparison. http://winswitch.org/documentation/protocols/choose.html (15th April 2011)

[14] Xpra. Wikipedia. http://en.wikipedia.org/wiki/Xpra (19th April 2011)

[15] FAQ for Xpra. http://code.google.com/p/partiwm/source/browse/README.xpra

[16] Remote Desktop Protocol. Microsoft Developer Network. http://msdn.microsoft.com/en-us/library/aa383015%28VS.85%29.aspx (15th April 2011)

[17] Remote Desktop Protocol: Basic Connectivity and Graphics Remoting Specification. [MS-RDPBCGR] v20110318. Copyright © 2011 Microsoft Corporation. Release: Friday, March 18, 2011.

[18] HTML5. Wikipedia. http://en.wikipedia.org/wiki/Html5 (25th August 2011)

[19] WebSockets. https://developer.mozilla.org/en/WebSockets (3rd August 2011)

[20] Remote Data Visualization through WebSockets. 2011 Eighth International Conference on Information Technology: New Generations (ITNG). IEEE, April 2011.

[21] The WebSocket protocol. http://tools.ietf.org/html/draft-ietf-hybi-thewebsocketprotocol-10#section-1

[22] XMLHttpRequest. http://en.wikipedia.org/wiki/XHR (3rd August 2011)

[23] ThinVNC. http://www.thinvnc.com/thinvnc/html5-vnc.html (4th May 2011)

[24] GStreamer. http://en.wikipedia.org/wiki/Gstreamer (30th June 2011)

[25] GStreamer Application Development Manual. http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/index.html (18th July 2011)

[26] WebKit home page. http://www.webkit.org/ (10th October 2011)

[27] Applications which use WebKit. http://trac.webkit.org/wiki/Applications%20using%20WebKit (10th October 2011)

[28] WebKit. Wikipedia. http://en.wikipedia.org/wiki/WebKit (10th October 2011)

[29] Remote Desktop Protocol: Graphics Device Interface (GDI) Acceleration Extensions. [MS-RDPEGDI] v20110318. Copyright © 2011 Microsoft Corporation. Release: Friday, March 18, 2011.

[30] WebSockets. http://jmesnil.net/stomp-websocket/doc/ (3rd August 2011)

[31] WebSockets vs HTTP. http://code.google.com/p/websocket-sample/wiki/Tips (26th August 2011)

[32] Pywebsocket project. http://code.google.com/p/pywebsocket/ (22nd August 2011)

[33] GStreamer Application Development Manual. http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/index.html (18th July 2011)

[34] Ximagesrc in GStreamer. http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good-plugins/html/gst-plugins-good-plugins-ximagesrc.html (19th July 2011)

[35] Videorate in GStreamer. http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-videorate.html (19th July 2011)

[36] Caps negotiation. http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/chapter-negotiation.html (18th July 2011)
