LOCAL POINTS OF INTEREST USING AUGMENTED REALITY

______

A Thesis

Presented to the

Faculty of

San Diego State University

______

In Partial Fulfillment

of the Requirements for the Degree

Master of Science

in

Computer Science

______

by

Sudeshna Mukherjee

Spring 2013

iii

Copyright © 2013 by Sudeshna Mukherjee All Rights Reserved

iv

DEDICATION

I dedicate this thesis work to my parents and teachers. The constant encouragement of my parents has given me direction in life. The guidance and support of my teachers has helped me in my educational journey of enlightenment.

v

ABSTRACT OF THE THESIS

Local Points of Interest Using Augmented Reality
by
Sudeshna Mukherjee
Master of Science in Computer Science
San Diego State University, 2013

The worldwide smartphone market continues to expand: by the third quarter of 2012, Android held a 75% share of the global smartphone market, according to the research firm IDC. One reason that smartphones are so popular is the wide range of applications they provide; these applications are so useful because they can integrate intimately with our personal lives. Augmented reality is a new technology that integrates the virtual world of a smartphone with the real world of nearby locations as seen by the user. This thesis implements a number of augmented reality features by blending visual, map-based, and non-map-based elements, such as live projection of nearby landmarks on a camera preview. The main screen of the proposed application gives the address of the current location and provides two buttons. The first button is for a map view and gives a representation of the current location on the map, with menu options. The second button on the main screen is for showing a camera preview. When this button is clicked a camera preview is shown, and as the phone is held vertically, various nearby landmarks are projected as icons on the phone screen. Again there is a menu bar with useful options.

vi

TABLE OF CONTENTS

PAGE

ABSTRACT ...... v
LIST OF TABLES ...... viii
LIST OF FIGURES ...... ix
ACKNOWLEDGEMENTS ...... xi
CHAPTER
1 INTRODUCTION ...... 1
1.2 Motivation ...... 1
1.3 Thesis Organization ...... 2
2 USE OF AUGMENTED REALITY FOR MOBILE PHONE ...... 3
2.1 Introduction to Augmented Reality ...... 3
2.2 Hardware Components of a Phone Used for Mobile Augmented Reality ...... 3
2.3 Popular Mobile Augmented Reality Applications ...... 3
2.4 Introduction to the Unity Software Development Kit ...... 5
2.4.1 Features of Unity Software Development Kit ...... 5
2.4.2 A Sample Augmented Reality Application Developed Using Unity ...... 6
3 TECHNOLOGY DETAILS ...... 8
3.1 Android Architecture ...... 8
3.2 Android Application Life Cycle ...... 9
3.3 Composition of Android Application ...... 9
4 ANDROID APPLICATION PROGRAMMING INTERFACE DETAILS ...... 12
4.1 Location Services ...... 12
4.1.1 Location Manager ...... 12
4.1.2 Location Class ...... 12
4.1.3 Geocoder Class ...... 12
4.1.4 Address Class ...... 13
4.2 Google Maps Android API ...... 13

vii

4.2.1 Google Maps Android API Map Types ...... 14
4.2.2 Obtaining Google Map API Key ...... 14
4.3 Data Source ...... 15
4.4 Android Package ORG.JSON ...... 15
4.5 Android Camera API ...... 15
4.6 Usage of Device Sensors ...... 16
4.6.1 Android Sensor API Methods ...... 17
4.6.2 Pitch, Roll, and Azimuth for an Android Phone ...... 19
5 IMPLEMENTATION DETAILS ...... 22
5.1 Permissions to Access Device Hardware and Other Services ...... 22
5.2 Main Activity and Map View Details ...... 22
5.3 Usage of Google Map API Key in XML File ...... 25
5.4 Obtaining Information About Landmarks Using JSONObject ...... 25
5.5 Plotting Icons for Nearby Landmarks on Camera Preview ...... 26
5.5.1 Relationship Between the Plotted Points and Radius ...... 30
5.5.2 Usage of Thread Pool for Dynamic Update of Phone Screen ...... 33
6 EXECUTION RESULTS ON ANDROID PHONE ...... 35
6.1 Android Logging System ...... 36
6.2 Procedure for Recording Execution Times ...... 38
6.3 Conclusion ...... 40
7 CONCLUSIONS AND OBSTACLES ...... 41
8 FUTURE SCOPE AND IMPROVEMENT ...... 42
BIBLIOGRAPHY ...... 43
APPENDIX: SETTING UP THE DEVELOPMENT AND TEST ENVIRONMENT ...... 46

viii

LIST OF TABLES

PAGE

Table 4.1. Table Describing Function Names and Their Descriptions ...... 13
Table 4.2. Table Describing the Parameters for the getRotationMatrix Method ...... 19
Table 4.3. Table Describing the Parameters for the remapCoordinateSystem Method ...... 20
Table 6.1. Table Describing Execution Times at Different Radii for Samsung Galaxy S3 ...... 35
Table 6.2. Table Describing Execution Times at Different Radii for HTC Thunderbolt ...... 36
Table 6.3. Table Showing Summarized Data of Execution Times at Different Radii for HTC Thunderbolt and Samsung Galaxy S3 (From Tables 6.1 and 6.2) ...... 37
Table 6.4. Table Describing Phone Configurations Used for Running the Application ...... 38

ix

LIST OF FIGURES

PAGE

Figure 2.1. Snapshot of translated text. ...... 4
Figure 2.2. Snapshot of Google Sky Map application. ...... 4
Figure 2.3. User interface of Unity. Snapshot taken after downloading, installing and launching the trial version of the Unity IDE. ...... 5
Figure 2.4. Snapshot of menu items of Unity. ...... 6
Figure 2.5. Augmentation of GMCS building on map using Unity IDE. ...... 7
Figure 2.6. Snapshot of phone screen after augmentation. ...... 7
Figure 3.1. Android architecture. ...... 8
Figure 3.2. Android application lifecycle. ...... 10
Figure 4.1. Flowchart describing the order of calls to various camera API methods needed to use the phone camera. ...... 17
Figure 4.2. Default orientation of phone. ...... 18
Figure 4.3. Sensor coordinate system is coincident with the screen coordinate system in default orientation. ...... 18
Figure 4.4. Sensor coordinate system is not coincident with the screen coordinate system after rotation. ...... 19
Figure 4.5. World coordinate system as used by Android API methods. ...... 20
Figure 4.6. Representation of azimuth, pitch and roll as used by Android API methods. ...... 21
Figure 5.1. Main screen displayed on application launch. ...... 23
Figure 5.2. Instruction dialog displayed on clicking the button labeled as instructions. ...... 23
Figure 5.3. On clicking the map view button, street view of the map will be displayed. ...... 23
Figure 5.4. On clicking the map button, satellite view will be displayed by default with a menu. ...... 24
Figure 5.5. Snapshot of the toast message displayed on clicking the save to SD card menu option. ...... 24
Figure 5.6. Zoomed state of the map view after using pinch and zoom gesture on the map. ...... 25
Figure 5.7. Flowchart explaining how the data is retrieved from the internet. ...... 26
Figure 5.8. Flowchart illustrating the plotting of landmarks on camera preview. ...... 27

x

Figure 5.9. Main screen displayed on application launch after tapping back button. ...... 28
Figure 5.10. On clicking the camera preview button. ...... 28
Figure 5.11. Snapshot of screen displayed on clicking on show plot menu option. ...... 29
Figure 5.12. Snapshot of list view displayed on clicking on list view menu item from the menu. ...... 29
Figure 5.13. Snapshot of plot when the radius on the slider is 25 km. ...... 30
Figure 5.14. Snapshot of plot when the radius on the slider is 50 km. ...... 30
Figure 5.15. Snapshot of plot when the radius on the slider is 75 km. ...... 31
Figure 5.16. Snapshot of plot when the radius on the slider is between 75 km and 100 km. ...... 31
Figure 5.17. Diagram showing the relationship between the slider and the circle on which points are plotted. ...... 32
Figure 5.18. Diagram illustrating the direction on the circle. ...... 32
Figure 6.1. Graphical representation of the execution time from Table 6.3. ...... 37

xi

ACKNOWLEDGEMENTS

I am heartily thankful to Dr. Carl Eckberg, Dr. Marko Vuskovic and Dr. Ming-Hsiang Tsou for their guidance and support. I would like to profusely thank Dr. Eckberg for his invaluable time in reviewing and guiding me during my work. My deepest gratitude goes to Dr. Ming-Hsiang Tsou for his guidance during the preparation of the manuscript, which helped me in completing the document. The coursework for advanced robotics taken under Dr. Marko Vuskovic was very helpful for this thesis work, and I heartily thank him for his guidance.

1

CHAPTER 1

INTRODUCTION

The objective of this Android application for smartphones is to give information about local landmarks in the vicinity of the current location. It is a location-aware application: it incorporates location-based services to receive location updates, and uses this information to populate a list of nearby landmarks. It gives an accurate map view representation and also shows the user's current location on the map. The application also uses the knowledge of the current location and nearby landmarks to augment them on the camera preview, thereby bringing a more contextual experience to users. The application has the ability to overlay the view obtained through the device camera with useful information about places.

1.2 MOTIVATION
As smartphones explode in popularity, augmented reality is starting to move from novelty to utility [1]. Currently there are several augmented reality applications available for various platforms. These applications provide users with interactive information on digital displays. In most mobile phones the navigation applications detect the user's current location, and the user has to enter the destination. This application detects nearby landmarks and populates them in a list view; once the landmark information is available, it can be used to enter a destination address in a navigation application. My enrollment in a course on spatial databases during spring 2012 gave me insight into the uses of location data in various fields. My enrollment in a course on advanced robotics during fall 2012 equipped me with the mathematical knowledge needed to complete the project. All these are my motivation behind creating a location-aware application using augmented reality.

2

1.3 THESIS ORGANIZATION
The document has been divided into chapters as follows.
 Chapter 2, Use of Augmented Reality for Mobile Phones: gives a brief description of some of the popular augmented reality applications available on different platforms.
 Chapter 3, Technology Details: illustrates the Android architecture, the Android application life cycle, and the components of an Android application.
 Chapter 4, Android Application Programming Interface Details: explains the various functions that have been used to develop this augmented reality application.
 Chapter 5, Implementation Details: demonstrates the working of the application and also gives an insight into the usage of augmented reality in the application.
 Chapter 6, Execution Results on Android Phones: gives a graphical representation of the execution results obtained after running the application on two types of Android phones.
 Chapter 7, Conclusions and Obstacles: briefly describes the various obstacles faced during the implementation of this application.
 Chapter 8, Future Scope and Improvement: talks about the future scope and potential of this application.

3

CHAPTER 2

USE OF AUGMENTED REALITY FOR MOBILE PHONE

2.1 INTRODUCTION TO AUGMENTED REALITY
This application uses augmented reality to project nearby landmarks as icons on a camera preview. Augmented reality, abbreviated as AR, is a type of virtual reality that generates a composite view for the user: a combination of the real scene viewed by the user and a virtual scene, generated by a computer or a device like a mobile phone or tablet, that augments the scene with additional information [2].

2.2 HARDWARE COMPONENTS OF A PHONE USED FOR MOBILE AUGMENTED REALITY
The hardware components for augmented reality are a processor, a display, sensors and input devices. Modern mobile computing devices like smartphones contain these elements, and often include a camera and sensors such as an accelerometer, GPS, and solid state compass, making them suitable for use in augmented reality [3].

2.3 POPULAR MOBILE AUGMENTED REALITY APPLICATIONS
Word Lens for iPhone: Word Lens is an Apple application that uses character recognition to instantly translate text from one language to another when the camera is focused on a piece of text written in another language. The letters are recognized by character recognition, and the translated version of the same text appears on the phone screen in the same font and style as the original. The example in Figure 2.1 [4] shows how this could be important [5].
Google Sky Map: Google Sky Map can identify objects that appear in the sky and allow users to search for them. The user interface of Google Sky Map is shown in Figure 2.2 [6]. Sky Map enables users to identify stars and planets by pointing devices towards these objects in the sky. Sky Map automatically adjusts to identify on the device's screen the

4

Figure 2.1. Snapshot of translated text. Source: ACCUFORM SIGNS, Spanish bilingual sign, danger high voltage/peligro alto voltaje, 14" X 10", vinyl. Amazon, http://www.amazon.com/Accuform-Signs-Spanish-Bilingual-Voltage/dp/B001836WFK, accessed December 2012, n.d.

Figure 2.2. Snapshot of Google Sky Map application. Source: GOOGLE, Google Sky map. Google, http://www.google.com/mobile/skymap/, accessed December 2012, n.d.

objects it is facing. Users can zoom in and out, and switch various layers such as constellations, planets, grids, and deep sky objects on and off, choosing to make these elements visible or not. Users can also determine the locations of planets and stars relative to their own current locations with the search function: it allows inputting the name of a planet or star and will direct users towards this object [6].
Nokia Point & Find is a mobile application which lets you point your Nokia smartphone camera at objects and images you want to know more about, to find more

information. It is a visual search technology that uses the phone's camera to obtain information, using image recognition to identify objects, images and places in the physical world in real time. One current use of this app is scanning barcodes: by pointing the camera at a barcode, Nokia Point & Find retrieves information about product pricing and availability [7].

2.4 INTRODUCTION TO THE UNITY SOFTWARE DEVELOPMENT KIT
Unity is a cross-platform game engine and IDE developed by Unity Technologies, targeting web plugins, video game consoles and mobile devices. It allows the creation of augmented reality applications [8]. The user interface of Unity is shown in Figure 2.3 [8] and Figure 2.4 [8].

Figure 2.3. User interface of Unity. Snapshot taken after downloading, installing and launching the trial version of the Unity IDE. Source: UNITY TECHNOLOGIES, Game engine, tools, and multiplatform. Unity Technologies, http://unity3d.com/unity/, accessed December 2012, n.d.

2.4.1 Features of Unity Software Development Kit
Unity has been used for augmented reality application development for Android phones and for 3D game development. It supports the following features.

6

Figure 2.4. Snapshot of menu items of Unity. Source: UNITY TECHNOLOGIES, Game engine, tools, and multiplatform. Unity Technologies, http://unity3d.com/unity/, accessed December 2012, n.d.

 Provides features to overlay target markers on top of a reference object.
 Provides a camera source to focus on 3D text or a 3D object from different angles.
 Provides a framework for compiling the software code as an Android application.

2.4.2 A Sample Augmented Reality Application Developed Using Unity
The snapshots of the sample project created using Unity are shown in Figure 2.5 and Figure 2.6.

7

Figure 2.5. Augmentation of GMCS building on map using Unity IDE. Snapshot taken after developing a sample application that augments a 3D text and 3D object on a pdf file.

Figure 2.6. Snapshot of phone screen after augmentation.

8

CHAPTER 3

TECHNOLOGY DETAILS

3.1 ANDROID ARCHITECTURE
Figure 3.1 [3] shows the Android architecture; its layers are described below.

Figure 3.1. Android architecture. Source: R. METZ, Augmented reality is finally getting real. Technology Review, http://www.technologyreview.com/news/428654/augmented-reality-is-finally-getting-real/, accessed December 2012, last modified August 2012.

Linux Kernel: The kernel is the part of the stack that acts as an abstraction layer between the hardware and the rest of the software stack. It controls memory management, process management, the network stack and the driver model [9].
Android Runtime: Each Android application runs in its own process with its own instance of the Dalvik virtual machine. Dalvik is the process virtual machine (VM) in Google's Android operating system; it is the software that runs the applications on Android devices. Programs

9

are commonly written in Java and compiled to bytecode. They are then converted from Java Virtual Machine compatible .class files to Dalvik-compatible .dex (Dalvik Executable) files before installation on a device. The compact Dalvik Executable format is designed to be suitable for systems that are constrained in terms of memory and processor speed. The Dalvik VM uses a register-based architecture [10].
Libraries: Android includes a set of C/C++ libraries used by various components of the Android system.
Application Framework: Android provides an open development platform. Developers can use device hardware, access location information, run background services, and add dialogs. The application developed here uses many of these features [9].

3.2 ANDROID APPLICATION LIFE CYCLE
Figure 3.2 illustrates the paths an activity may take between states [11]. The colored ovals are the major states the activity can be in. The rectangles represent the callback methods that can be implemented to perform operations when the activity transitions between states [11].

3.3 COMPOSITION OF ANDROID APPLICATION
User Interface: Android provides a variety of pre-built UI components, such as structured layout objects and UI controls, that allow you to build the graphical user interface for your app. Android also provides other UI modules for special interfaces such as dialogs, notifications, and menus.
Application Resources: Resources are the additional files and static content that your code uses, such as bitmaps, icons as png files, layout definition files which define the base layout used by various activities, and user interface strings.
Sensor Data: Most Android-powered devices have built-in sensors that measure motion, orientation, and various environmental conditions. These sensors are capable of providing raw data with high precision and accuracy, and are useful if we want to monitor three-dimensional device movement or positioning, or changes in the ambient environment near a device. For example, a game might track readings from a device's gravity sensor to infer complex user gestures and motions, such as tilt, shake, rotation, or swing. Likewise, a weather application might use a device's temperature sensor

10

Figure 3.2. Android application lifecyccle. Source: GOOGLE, Activity. Android, http://developer.android.com/reference/android/ app/Activity.html, accessed December 2012, last modified April 2013.

11

and humidity sensor, or a travel application might use the geomagnetic field sensor and accelerometer to report a compass bearing. The Android platform supports the following sensors.
Motion sensors: These sensors measure acceleration forces and rotational forces along three axes. This category includes accelerometers, gravity sensors, gyroscopes, and rotational vector sensors.
Position sensors: These sensors measure the physical position of a device, like orientation sensors, which can tell a device's orientation.
Location and Map Data: Android provides a framework that an application can use to determine the device's location and register for updates. A Google Maps external library is available that lets developers display and manage map data [12].

12

CHAPTER 4

ANDROID APPLICATION PROGRAMMING INTERFACE DETAILS

4.1 LOCATION SERVICES
Android gives your applications access to the location services supported by the device through classes in the android.location package. The central component of the location framework is the LocationManager system service, which provides APIs to determine the location of the underlying device. LocationManager is not instantiated directly; an instance of the class is obtained from the system by calling getSystemService(Context.LOCATION_SERVICE). The method returns a handle to a new LocationManager instance [13].

4.1.1 Location Manager
This class provides access to the system location services. These services allow applications to obtain periodic updates of the device's geographical location, or to fire an application-specified Intent when the device enters the proximity of a given geographical location. The class cannot be instantiated directly but is retrieved through Context.getSystemService(Context.LOCATION_SERVICE) [14].
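A minimal sketch of acquiring the service and registering for updates is given below; the listener body, provider choice, and update intervals are illustrative values, not the application's actual ones.

// Inside an Activity (classes from android.location and android.os assumed imported).
LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);

LocationListener listener = new LocationListener() {
    public void onLocationChanged(Location location) {
        double lat = location.getLatitude();   // always valid per the Location class
        double lon = location.getLongitude();
        // ... trigger a refresh of nearby-landmark data here ...
    }
    public void onStatusChanged(String provider, int status, Bundle extras) {}
    public void onProviderEnabled(String provider) {}
    public void onProviderDisabled(String provider) {}
};

// At most one update per 30 s or 10 m of movement (illustrative values).
lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 30000, 10, listener);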

4.1.2 Location Class
Location: A data class representing a geographic location. A location can consist of latitude, longitude, timestamp, and other information such as altitude and velocity. All locations generated by the LocationManager are guaranteed to have a valid latitude, longitude, and timestamp [15].

4.1.3 Geocoder Class
Geocoder: A class for handling geocoding and reverse geocoding. Geocoding is the process of transforming a street address or other description of a location into a latitude, longitude coordinate. Reverse geocoding is the process of transforming a latitude, longitude

13

coordinate into a partial address. The amount of detail in a reverse geocoded location description may vary, for example one might contain the full street address of the closest building, or one might contain only a city name and postal code [16].

4.1.4 Address Class
Address: A class representing an address as a set of strings describing a location. The android.location package contains classes that define Android location-based and related services; Address is one of the classes from this package. This class provides various methods to retrieve the country, latitude, longitude, locality, and postal code. Table 4.1 illustrates some of the most useful functions [17].

Table 4.1. Table Describing Function Names and Their Descriptions

getLatitude(): returns the latitude of the address if known. Return type: double.
getLongitude(): returns the longitude of the address if known. Return type: double.
getAddressLine(int index): returns a line of the address numbered by the given index (starting at 0), or null if no such line is present. Return type: String.
getLocality(): returns the locality of the address. Return type: String.
getPostalCode(): returns the postal code of the address. Return type: String.
getCountryName(): returns the localized country name of the address, for example "Iceland", or null if it is unknown. Return type: String.
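To make the interplay of the Geocoder and Address classes concrete, here is a minimal reverse-geocoding sketch; the coordinates are illustrative, and the call should run off the UI thread since it may hit the network.

Geocoder geocoder = new Geocoder(context, Locale.getDefault());
try {
    // Ask for at most one result for the given latitude/longitude.
    List<Address> results = geocoder.getFromLocation(32.77, -117.07, 1);
    if (results != null && !results.isEmpty()) {
        Address address = results.get(0);
        String line = address.getAddressLine(0);  // may be null if absent
        String city = address.getLocality();
        String zip  = address.getPostalCode();
    }
} catch (IOException e) {
    // The backend geocoding service may be unavailable; handle the failure.
}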

4.2 GOOGLE MAPS ANDROID API
The Google Maps Android API allows developers to add maps to the application. These maps are based on Google Maps data. The API automatically handles access to Google Maps servers, data downloading, and map display. It also supports touch gestures like zoom in and zoom out on the map. API calls allow one to add markers, polygons and overlays, and to change the user's view of a particular map area. The key class in the Google Maps Android API is MapView. A MapView displays a map with data obtained from the Google Maps

14

service. When the MapView has focus, it will capture keypresses and touch gestures to pan and zoom the map automatically, including handling network requests for additional map tiles. It also provides all of the UI elements necessary for users to control the map. Your application can also use MapView class methods to control the map programmatically and draw a number of overlays on top of the map [13].
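As a hedged illustration of such programmatic control with the (since-deprecated) Maps v1 external library, the sketch below uses MapView and MapController; the view id, zoom level, and coordinates are illustrative, not taken from the application.

MapView mapView = (MapView) findViewById(R.id.mapview); // assumes a MapView in the layout
mapView.setBuiltInZoomControls(true);   // show the standard zoom UI
mapView.setSatellite(true);             // switch to satellite imagery

MapController controller = mapView.getController();
controller.setZoom(15);                                    // illustrative zoom level
controller.animateTo(new GeoPoint(32770000, -117070000));  // GeoPoint takes microdegrees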

4.2.1 Google Maps Android API Map Types
There are many types of maps available within the Google Maps Android API. A map's type governs the overall representation of the map; for example, road maps show all of the roads for a city or region. The Google Maps Android API offers four map types, plus an option to display no map tiles at all [18].
 Normal: Typical road map. Roads, some man-made features, and important natural features such as rivers are shown. Road and feature labels are also visible.
 Hybrid: Satellite photograph data with road maps added. Road and feature labels are also visible.
 Satellite: Satellite photograph data. Road and feature labels are not visible.
 Terrain: Topographic data. The map includes colors, contour lines and labels, and perspective shading. Some roads and labels are also visible.
 None: The map will be rendered as an empty grid with no tiles loaded [18].

4.2.2 Obtaining Google Map API Key
MapView gives access to Google Maps data, so it needs to be registered with the Google Maps service, and the Terms of Service have to be agreed to before MapView will be able to obtain data from Google Maps. This applies whether you are developing your application on the emulator or preparing your application for deployment to mobile devices. Registering for a Google Maps Android API key has two parts.
 Registering the MD5 fingerprint of the certificate that you will use to sign your application. The Maps registration service then provides a Google Maps Android API key that is associated with your application's signer certificate.
 Adding a reference to the Google Maps Android API key in each MapView, whether declared in XML or instantiated directly from code. You can use the same Google Maps Android API key for any MapView in any Android application, provided that the application is signed with the certificate whose fingerprint you registered with the service [18].

15

4.3 DATA SOURCE
GeoNames is a geographical database available and accessible through various web services, under a Creative Commons attribution license. The GeoNames database contains over 10,000,000 geographical names corresponding to over 7,500,000 unique features. Besides names of places in various languages, the data stored includes latitude, longitude, elevation, population, administrative subdivision and postal codes. All coordinates use the WGS 84 datum [19]. The web services include direct and reverse geocoding, finding places through postal codes, and finding places next to a given place [20].

4.4 ANDROID PACKAGE ORG.JSON
The classes mentioned below are part of the org.json package. This application uses them to convert information obtained from geonames.org into class objects.
JSONObject: A modifiable set of (name, value) mappings. The names are unique, non-null strings. The corresponding values can be any mix of JSONObjects, JSONArrays, Strings, Booleans, Integers, Longs, Doubles or NULL.
JSONArray: A dense indexed sequence of values. Values may be any mix of JSONObjects, other JSONArrays, Strings, Booleans, Integers, Longs, Doubles or NULL. In this application, information about all the nearby landmarks is stored as a JSONArray [21].
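A sketch of unpacking such a response is shown below; the field names ("geonames", "title", "lat", "lng") follow the GeoNames findNearbyWikipedia JSON schema and should be treated as assumptions here.

// JSONException handling omitted for brevity.
JSONObject root = new JSONObject(jsonString);        // jsonString: raw response body
JSONArray geonames = root.getJSONArray("geonames");  // one entry per nearby landmark
for (int i = 0; i < geonames.length(); i++) {
    JSONObject entry = geonames.getJSONObject(i);
    String title = entry.getString("title");
    double lat   = entry.getDouble("lat");
    double lng   = entry.getDouble("lng");
    // ... construct a landmark holder from these fields ...
}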

4.5 ANDROID CAMERA API
The Android framework supports capturing images and video through the Camera API. The Camera class is used to set image capture settings, start/stop the preview, snap pictures, and retrieve frames for encoding video. This class is a client for the Camera service, which manages the actual camera hardware; it belongs to the android.hardware package. Some of the methods of the Camera API are described below.
 public Camera.Parameters getParameters(): This method returns the current settings for this Camera service.

1 WGS 84 is the reference coordinate system used by the Global Positioning System. The coordinate origin of WGS 84 is meant to be located at the Earth's center of mass. WGS is an abbreviation for World Geodetic System which is a standard for use in cartography and navigation.

16

 public static Camera open(): This method creates a new Camera object to access the first back-facing camera on the device. If the device does not have a back-facing camera, this returns null.
 public void setParameters(Camera.Parameters params): This method changes the settings for this Camera service.
 public final void startPreview(): This method starts capturing and drawing preview frames to the screen. Preview will not start until a surface is supplied with setPreviewDisplay(SurfaceHolder).
 public final void setPreviewDisplay(SurfaceHolder holder): This sets the Surface to be used for a live preview. A surface or surface texture is necessary for a preview.
 SurfaceHolder Interface: An abstract interface to someone holding a display surface. It allows controlling the surface size and format, editing the pixels in the surface, and monitoring changes to the surface. This interface is available through the SurfaceView class.
 SurfaceView Class: The SurfaceView class is used to present a live camera preview to the user. This class has a method getHolder() which gives access to the underlying surface via the SurfaceHolder interface. Figure 4.1 [22] illustrates the call flow.
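Pulling these methods together, a minimal preview sketch is given below (error handling trimmed; it assumes a back-facing camera exists and uses the android.hardware.Camera API of this thesis's era).

public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
    private Camera camera;

    public CameraPreview(Context context) {
        super(context);
        getHolder().addCallback(this);  // be notified when the surface is ready
    }

    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();                // first back-facing camera, or null
        try {
            camera.setPreviewDisplay(holder);  // a surface is required before preview
        } catch (IOException e) {
            camera.release();
            camera = null;
            return;
        }
        camera.startPreview();                 // begin drawing preview frames
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {}

    public void surfaceDestroyed(SurfaceHolder holder) {
        if (camera != null) {
            camera.stopPreview();
            camera.release();                  // free the hardware for other apps
        }
    }
}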

4.6 USAGE OF DEVICE SENSORS
A normal cell phone might know its location in latitude and longitude from the phone GPS, but a smartphone, in addition, can be aware of its orientation: for example whether it is facing west, whether it is in landscape or portrait mode, and whether it is tilted up or down. In a smartphone, this information is available from the sensor framework. A mobile augmented reality application often uses the sensor framework to access sensors and acquire sensor data. The sensor framework uses a standard three-axis coordinate system to express data values. For most sensors, the coordinate system is defined relative to the device's screen when the device is held in its default orientation [23]. If the application matches sensor data to the on-screen display shown in Figure 4.2 [23], then the getRotation() method is used to determine the screen rotation, and the remapCoordinateSystem() method is used to map sensor coordinates to screen coordinates. Figure 4.3 [24] and Figure 4.4 [24] illustrate the relationship between the sensor coordinate system and the screen coordinate system.

17

Figure 4.1. Flowchart describing the order of calls to various camera API methods needed to use the phone camera. Source: GOOGLE, Camera. Android, http://developer.android.com/reference/android/hardware/Camera.html, accessed January 2013, last modified April 2013.

4.6.1 Android Sensor API Methods
 public int getRotation(): This method returns the rotation of the screen from its natural orientation. The returned value may be one of the four values Surface.ROTATION_0, Surface.ROTATION_90, Surface.ROTATION_180 or Surface.ROTATION_270, depending on the amount of rotation.
 public static boolean getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic): The parameters of the method are described in Table 4.2 [23]. It computes the inclination matrix I as well as the rotation matrix R transforming a vector from the device coordinate system to the world's coordinate system, which, as shown in Figure 4.5 [23], is defined as a direct orthonormal basis, where:
o X is defined as the vector product Y x Z. It is tangential to the ground at the device's current location and roughly points east.
o Y is tangential to the ground at the device's current location and points towards the magnetic North Pole.
o Z points towards the sky and is perpendicular to the ground.

Figure 4.2. Default orientation of phone. Source: GOOGLE, SensorManager. Android, http://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix, accessed January 2013, n.d.

Figure 4.3. Sensor coordinate system is coincident with the screen coordinate system in default orientation. Source: T. BRAY, One screen turn deserves another. Blogspot, http://android-developers.blogspot.com/2010/09/one-screen-turn-deserves-another.html, accessed January 2013, last modified September 2010.

19

Figure 4.4. Sensor coordinate system is not coincident with the screen coordinate system after rotation. Source: T. BRAY, One screen turn deserves another. Blogspot, http://android-developers.blogspot.com/2010/09/one-screen-turn-deserves-another.html, accessed January 2013, last modified September 2010.

Table 4.2. Table Describing the Parameters for the getRotationMatrix Method

R: an array of 9 floats holding the rotation matrix R when this function returns. R can be null.
I: an array of 9 floats holding the inclination matrix I when this function returns. I can be null.
gravity: an array of 3 floats containing the gravity vector expressed in the device's coordinates. You can simply use the values returned by a SensorEvent of a Sensor of type TYPE_ACCELEROMETER.
geomagnetic: an array of 3 floats containing the geomagnetic vector expressed in the device's coordinates. You can simply use the values returned by a SensorEvent of a Sensor of type TYPE_MAGNETIC_FIELD.

 public static boolean remapCoordinateSystem(float[] inR, int X, int Y, float[] outR): This method rotates the supplied rotation matrix to express it in a different coordinate system. This method is used when an application needs to compute the three orientation angles of the device in a different coordinate system. Table 4.3 illustrates the parameters required to call this method [23].
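A sketch of the typical call sequence combining these methods is given below; the remapping axes chosen (AXIS_X, AXIS_Z) are the common choice for a phone held upright with the camera facing outward, and getOrientation is the companion SensorManager method that extracts the angles.

float[] R = new float[9], I = new float[9], outR = new float[9];
float[] orientation = new float[3];
// gravity and geomagnetic hold the latest TYPE_ACCELEROMETER and
// TYPE_MAGNETIC_FIELD SensorEvent values, respectively (assumed available).
if (SensorManager.getRotationMatrix(R, I, gravity, geomagnetic)) {
    // Remap so the device's Y axis is mapped onto the world Z axis.
    SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X,
            SensorManager.AXIS_Z, outR);
    SensorManager.getOrientation(outR, orientation);  // azimuth, pitch, roll in radians
    float azimuthDeg = (float) Math.toDegrees(orientation[0]);
}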

4.6.2 Pitch, Roll, and Azimuth for an Android Phone
The diagrammatic representations of the azimuth, pitch and roll angles are shown in Figure 4.6. Azimuth is the rotation around the Z axis of the phone; it varies between 0° and 360°. Pitch is the rotation around the X axis of the phone; it varies between -180° and 180°, with positive

20

Figure 4.5. World coordinate system as used by Android API methods. Source: GOOGLE, SensorManager. Android, http://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix, accessed January 2013, n.d.

Table 4.3. Table Describing the Parameters for the remapCoordinateSystem Method

inR: the rotation matrix to be transformed. It is usually the matrix returned by getRotationMatrix(float[], float[], float[], float[]).
X: defines on which world axis and direction the X axis of the device is mapped.
Y: defines on which world axis and direction the Y axis of the device is mapped.
outR: the transformed rotation matrix. inR and outR can be the same array, but this is not recommended for performance reasons.

values when the Z axis moves towards the Y axis. Roll is the rotation around the Y axis; it varies between -90° and 90°, with positive values when the Z axis moves towards the X axis [25].

21

Figure 4.6. Representation of azimuth, pitch and roll as used by Android API methods.

22

CHAPTER 5

IMPLEMENTATION DETAILS

5.1 PERMISSIONS TO ACCESS DEVICE HARDWARE AND OTHER SERVICES
All Location API methods require the ACCESS_COARSE_LOCATION or ACCESS_FINE_LOCATION permission. This project has functionality which allows one to save a snapshot of the screen to the SD card present in the phone; the contents of the SD card can later be viewed when a data cable is used to access the phone as a disk drive (see Appendix). To allow the application to write to external storage, the WRITE_EXTERNAL_STORAGE permission needs to be declared in the Android package manifest. The application also needs access to the camera hardware and the internet to collect information about nearby landmarks. These permissions are declared in the AndroidManifest.xml file using the uses-permission tag.
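A sketch of the corresponding AndroidManifest.xml declarations is shown below; the permission names are the standard Android ones, and the exact set must match what the application actually requests.

<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />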

5.2 MAIN ACTIVITY AND MAP VIEW DETAILS
The main screen of the project uses geocoding to display the current address using the Android API's Geocoder and Address classes. This screen, shown in Figure 5.1, is displayed when the application is launched; it has three buttons. The first button acts as a help button and displays instructions about the application, as shown in Figure 5.2. The second button, when clicked, starts the activity that displays a map and overlays the current location of the user on the map using the Google Maps API, as shown in Figure 5.3. The satellite view of the map is displayed by default, as shown in Figure 5.4. This screen has menu options that allow

23

Figure 5.1. Main screen displayed on application launch.

Figure 5.2. Instruction dialog displayed on clicking the button labeled as instructions.

Figure 5.3. On clicking the map view button, street view of the map will be displayed.

24

Figure 5.4. On clicking the map button, satellite view will be displayed by default with a menu.

the user to toggle between street view and satellite view of the map. The menu also contains an option that allows the user to save a snapshot of the map on the phone SD card. Figure 5.5 shows the toast message displayed after saving the file.

Figure 5.5. Snapshot of the toast message displayed on clicking the save to SD card menu option.

The map view allows zooming and panning. Zooming out of the map view can be done by pinching with two fingers, and zooming in, to see more detail in a small part of the map, can be done by the opposite finger stretch. Panning is done by dragging on the map view with a finger. The panning gesture helps reveal hidden parts of the map which are not visible when the map view is in a zoomed-in state, as shown in Figure 5.6.

25

Figure 5.6. Zoomed state of the map view after using pinch and zoom gesture on the map.

5.3 USAGE OF GOOGLE MAP API KEY IN XML FILE
Using the Google Maps Android API, a map can be embedded into an activity with an XML snippet. To use the Google Maps API, a Google Maps API key has to be obtained, and that key is referenced in the XML.
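The fragment below is a representative Maps v1 MapView declaration; the apiKey value is a placeholder, not the project's actual key, and the layout attributes are illustrative.

<com.google.android.maps.MapView
    android:id="@+id/mapview"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:clickable="true"
    android:apiKey="YOUR_MAPS_API_KEY" />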

5.4 OBTAINING INFORMATION ABOUT LANDMARKS USING JSONOBJECT
This application needs information about the landmarks located near the current location. This information is obtained from the internet as shown in Figure 5.7. The code for the method that builds the request URL is given below.

private static final String BASE_URL =
        "http://ws.geonames.org/findNearbyWikipediaJSON";

public String createRequestURL(double lat, double lon, double alt,
        float radius, String locale) {
    return BASE_URL + "?lat=" + lat + "&lng=" + lon + "&radius=" + radius
            + "&maxRows=40" + "&lang=" + locale;
}
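A sketch of fetching the resulting URL is given below; it is a minimal blocking read that would run on a background thread, with the variable names and error handling kept illustrative.

HttpURLConnection conn =
        (HttpURLConnection) new URL(requestUrl).openConnection();
StringBuilder body = new StringBuilder();
try {
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(conn.getInputStream()));
    String line;
    while ((line = reader.readLine()) != null) {
        body.append(line);          // accumulate the JSON response
    }
    reader.close();
} finally {
    conn.disconnect();              // always release the connection
}
String jsonString = body.toString();  // handed to the org.json parsing step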

26

Figure 5.7. Flowchart explaining how the data is retrieved from the internet.

5.5 PLOTTING ICONS FOR NEARBY LANDMARKS ON CAMERA PREVIEW
Once the Camera Preview button is clicked, a camera preview will be started. As the phone is held vertical to the ground, the phone's accelerometer sensor will detect the change and initiate a routine to detect nearby landmarks. Once nearby landmarks have been detected, another routine will draw the icons corresponding to the detected landmarks on the camera preview. The flowchart of Figure 5.8 depicts this process.

27

Figure 5.8. Flowchart illustrating the plotting of landmarks on camera preview.

A large number of landmarks may be detected within the specified radius, but since all of these cannot be drawn on the limited camera preview surface, only the nearest ones lying within the 45 degree view angle will be drawn. Rotating the phone will reveal all landmarks. Figure 5.9 and Figure 5.10 show snapshots of the phone screen when icons are being projected on the camera preview.
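A sketch of that visibility test is given below; the variable names are illustrative, and the half-angle of 22.5 degrees corresponds to the 45-degree view cone described above.

// bearingTo: compass bearing from the user to the landmark, degrees in [0, 360)
// azimuth:   direction the camera is pointing, from the sensor pipeline
float delta = Math.abs(bearingTo - azimuth) % 360f;
if (delta > 180f) delta = 360f - delta;   // shortest angular distance
boolean inView = delta <= 22.5f;          // inside the 45-degree cone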

28

Figure 5.9. Main screen displayed on application launch after tapping back button.

Figure 5.10. On clicking the camera preview button.

At times, two or more landmark icons can have the same x and y coordinate values on the phone screen. In this case the icons will overlap if their screen coordinate values are not adjusted. In order to prevent overlap between icons, the y coordinate value of one of the overlapping icons is adjusted. The code for the method is provided below.

private static void adjustForOverlap(Canvas canvas, List<LandmarkHolder> collection) {
    updated.clear();
    for (LandmarkHolder marker1 : collection) {
        // Move to next marker, if
        // (a) The marker is not in view
        // (b) The marker has been already updated
        if (updated.contains(marker1) || !marker1.isInView()) continue;

        int collisions = 1;
        for (LandmarkHolder marker2 : collection) {
            // Do not compare a marker with itself
            if (marker1.equals(marker2) || updated.contains(marker2)
                    || !marker2.isInView()) continue;

            if (marker1.isMarkerOnMarker(marker2)) {
                marker2.getLocation().get(locationArray);
                float y = locationArray[1];
                float h = collisions * OVERLAP_ADJUSTMENT;
                locationArray[1] = y + h;
                marker2.getLocation().set(locationArray);
                marker2.update(canvas, 0, 0);
                collisions++;
                updated.add(marker2);
            }
        }
        updated.add(marker1);
    }
}

Figure 5.11 shows the usage of the 45 degree view angle. It ensures that only the landmarks in front of the user holding the phone are shown on the camera preview as icons.

Figure 5.11. Snapshot of screen displayed on clicking on show plot menu option.

There is a menu bar which gives the option to show the nearby places as a list view, as shown in Figure 5.12.

Figure 5.12. Snapshot of list view displayed on clicking on list view menu item from the menu.

30

5.5.1 Relationship Between the Plotted Points and Radius
The slider on the user interface allows the user to change the radius within which landmarks will be detected. Figure 5.13, Figure 5.14, Figure 5.15 and Figure 5.16 show snapshots of the plot on the camera preview at different radii.

Figure 5.13. Snapshot of plot when the radius on the slider is 25 km.

Figure 5.14. Snapshot of plot when the radius on the slider is 50 km.

When the radius is between 0 and 25 km, all the landmarks corresponding to points on the circle are constrained within the full region, represented by the outer circle P4 in Figure 5.17. Similarly, when the radius is between 26 and 50 km, all the points on the circle are constrained within the region represented by circle P3 in Figure 5.17. When the radius is between 51 and 75 km, all the points on the circle are constrained within the region represented by circle P2 in Figure 5.17. When the radius is between 76 and 100 km, all the landmarks corresponding to points on the circle are constrained within the region represented by the innermost circle P1 in Figure 5.17.
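The radar-style plot places each landmark on a circle from its bearing and its distance band; a sketch of the coordinate computation is given below, with all names illustrative.

// r:       pixel radius of the band selected from the slider (P1..P4 above)
// bearing: bearing to the landmark in degrees, clockwise from north
double rad = Math.toRadians(bearing);
float x = centerX + (float) (r * Math.sin(rad));  // east maps to +x on screen
float y = centerY - (float) (r * Math.cos(rad));  // north maps to -y on screen
canvas.drawCircle(x, y, 3f, paint);               // draw the landmark dot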

31

Figure 5.15. Snapshot of plot when the radius on the slider is 75 km.

Figure 5.16. Snapshot of plot when the radius on the slider is between 75 km and 100 km.

The code for the method that calculates the direction in which the phone is pointing and draws the text on the phone screen is given below; it is also illustrated diagrammatically in Figure 5.18.

private void drawDirectionText(Canvas canvas) {
    if (canvas == null) throw new NullPointerException();
    int value = (int) (ViewData.getAzimuth() / (360f / 16f));
    String dirTxt = "";
    if (value == 15 || value == 0) dirTxt = "N";
    else if (value == 1 || value == 2) dirTxt = "NE";
    else if (value == 3 || value == 4) dirTxt = "E";
    else if (value == 5 || value == 6) dirTxt = "SE";
    else if (value == 7 || value == 8) dirTxt = "S";
    else if (value == 9 || value == 10) dirTxt = "SW";
    else if (value == 11 || value == 12) dirTxt = "W";
    else if (value == 13 || value == 14) dirTxt = "NW";
    int directionAngle = (int) ViewData.getAzimuth();
    // (char) 176 is the degree sign
    circleText(canvas, "" + directionAngle + ((char) 176) + " " + dirTxt,
            (PAD_X + RADIUS), (PAD_Y - 5), true);
}

32

Figure 5.17. Diagram showing the relationship between the slider and the circle on which points are plotted.

Figure 5.18. Diagram illustrating the direction on the circle.

33

5.5.2 Usage of Thread Pool for Dynamic Update of Phone Screen
The screen needs to be redrawn and the data about landmarks needs to be updated under three circumstances.
 First, if the location data changes, that is, if the current location or address has changed.
 Second, if the orientation of the phone has changed.
 Third, when the user has moved the slider on the camera preview activity, thereby increasing or decreasing the radius to be used for landmark detection.
Many of these calls can take place at the same instant of time, so a BlockingQueue and a ThreadPoolExecutor have been used to allow the queuing of the calls.

private static final BlockingQueue<Runnable> queue =
        new ArrayBlockingQueue<Runnable>(1);

A BlockingQueue, as declared in the code snippet above, has a fixed capacity of one, as specified by the argument. It waits for the queue to become non-empty when retrieving an element, and waits for space to become available in the queue when storing an element. A BlockingQueue may be used to transfer and hold submitted tasks. If corePoolSize or more threads are running, the Executor always prefers queuing a request rather than adding a new thread [26].

private static final ThreadPoolExecutor exeService =
        new ThreadPoolExecutor(1, 1, 20, TimeUnit.SECONDS, queue);

exeService.execute(new Runnable() {
    public void run() {
        for (BaseDataSource source : sources.values())
            download(source, lat, lon, alt);
    }
});

The declaration and usage of the ThreadPoolExecutor is shown in the code snippet above. This ThreadPoolExecutor has a corePoolSize of one, as specified by the first argument, and a maximumPoolSize of one, as specified by the second argument. ThreadPoolExecutor is an ExecutorService that executes each submitted task using one of possibly several pooled threads. Thread pools provide improved performance when

34

executing large numbers of asynchronous tasks, due to reduced per-task invocation overhead [27]. Once the updated data is available, to initiate the drawing of the icons on the camera

surface, which is represented by an instance of the CameraSurface class, the invalidate() method is called to trigger redrawing of the camera preview, as shown in the code snippet below.

protected static CameraSurface cam = null;
cam = new CameraSurface(this);
cam.invalidate();

35

CHAPTER 6

EXECUTION RESULTS ON ANDROID PHONE

The execution time for collecting and plotting information about the landmarks on the camera preview has been recorded at four different radii, as shown in Table 6.1 and Table 6.2. Table 6.3 contains the summarized data, which has been used to plot a comparative graph in Figure 6.1.

Table 6.1. Table Describing Execution Times at Different Radii for Samsung Galaxy S3

36

Table 6.2. Table Describing Execution Times at Different Radii for HTC Thunderbolt

Radius: 25 km    Radius: 50 km    Radius: 75 km    Radius: 100 km
Landmarks = 2    Landmarks = 26   Landmarks = 27   Landmarks = 29
Time (ms)        Time (ms)        Time (ms)        Time (ms)
17               83               107              76
6                38               71               75
8                49               61               47
21               56               61               146
28               42               58               54
25               56               80               40
28               59               91               86
43               33               65               43
32               78               73               65
5                53               54               60
27               36               66               64
19               49               104              55
18               64               49               54
7                27               35               54
29               33               77               34
17               33               37               44
20               37               36               54
13               63               67               61
8                69               81               96
26               28               90               144
27               33               46               41
24               42               59               58
5                57               37               54
18               44               130              119
Average: 19.625  47.72            67.52            67.84

6.1 ANDROID LOGGING SYSTEM
The Android logging system provides a mechanism for collecting and viewing debug output (see Appendix). This logging system has been used to record the execution time at different radii for two different phones, namely the Samsung Galaxy S3 and the HTC Thunderbolt. The logs can be exported to a text file and studied for further analysis. The configuration of the phones used for running the application is given in Table 6.4. The address where the execution times were recorded is 4375 Derrick Drive, San Diego, CA 92117, United States.
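For example, a log filtered to this application's tag can be captured to a text file with adb; the output file name is illustrative, and the tag matches the Log.v calls shown in Section 6.2.

adb logcat -s AugumentedView:V > execution_times.txt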

37

Table 6.3. Table Showing Summarized Data of Execution Times at Different Radii for HTC Thunderbolt and Samsung Galaxy S3 (From Tables 6.1 and 6.2)

Distance (km)   Number of Landmarks   Average Time in ms (HTC)   Average Time in ms (Samsung)
25              2                     19.625                     5.708333
50              22                    47.72                      28.875
75              26                    67.52                      44.45833
100             29                    67.84                      49.91667

Figure 6.1. Graphical representation of the execution time from Table 6.3.

38

Table 6.4. Table Describing Phone Configurations Used for Running the Application

Phone: HTC ThunderBolt 4G | Samsung Galaxy S3
Date of Release: Mar-11 | May-12
Network: 4G (LTE/HSPA+), Edge/2G (GSM/GPRS) | 4G (LTE/HSPA+), Edge/2G (GSM/GPRS)
Operating System Version: Android 4.0 | Android 4.0
Sensors: Accelerometer, Digital Compass, GPS, Gyroscope | Accelerometer, Digital Compass, GPS, Gyroscope
Screen Size: 4.3 inch | 4.8 inch
Display Technology: TFT | Super AMOLED
Screen Resolution: 800 x 480 pixels | 1,280 x 720 pixels
Camera Options: Autofocus, Digital Zoom, Front Facing Camera, Rear Facing Camera, Video Recording | Autofocus, Digital Zoom, Front Facing Camera, Rear Facing Camera, Video Recording
Wireless Connectivity: 3G, 4G, Bluetooth | 3G, 4G, Bluetooth
CPU Speed: 1.00 GHz | 1.40 GHz
Processor Cores: Single Core | Quad Core
RAM: 768 MB | 2,048 MB
Dimensions: 4.75" x 2.44" x 0.56" | 5.38" x 2.78" x 0.34"
Weight: 177 grams | 133 grams
Storage: 8.00 GB | 16.00 GB
Removable Storage (included): 32 GB | -
Removable Storage (maximum): - | 64 GB

6.2 PROCEDURE FOR RECORDING EXECUTION TIMES
Whenever the sensor data changes, postInvalidate() is called on the view class, which in turn calls the onDraw method shown below and re-renders the screen. The time required to execute the onDraw method is recorded using the Android logging system. This method is responsible for calling the methods that draw the landmark icons on the camera preview, adjust the

landmark icons if they overlap on the phone screen, and populate the list view with the details about nearby landmarks.

protected void onDraw(Canvas canvas) {
    if (canvas == null) return;
    long start = System.currentTimeMillis();
    if (drawing.compareAndSet(false, true)) {
        List<LandmarkHolder> collection = ViewData.getMarkers();
        cache.clear();
        // All the markers that are around are added to the cache variable
        for (LandmarkHolder m : collection) {
            m.update(canvas, 0, 0);
            if (m.isOnRadar()) cache.add(m);
        }
        collection = cache; // cache is duplicated to collection
        // Adjusting markers to prevent collision
        if (AugmentedActivity.useCollisionDetection)
            adjustForOverlap(canvas, collection);
        // Drawing of markers
        ListIterator<LandmarkHolder> iter =
                collection.listIterator(collection.size());
        int i = 0;
        while (iter.hasPrevious()) {
            LandmarkHolder marker = iter.previous();
            marker.draw(canvas);
            i++;
            // For populating 20 nearest items in the list
            if (i <= 20) {
                double distance = marker.getDistance();
                String name = marker.getName();
                if (distance < 1000.0) {
                    PlaceList.places.add(name);
                    PlaceList.distances.add(
                            (new DecimalFormat("@#").format(distance)) + "m");
                } else {
                    double d = distance / 1000.0;
                    PlaceList.places.add(name);
                    PlaceList.distances.add(
                            (new DecimalFormat("@#").format(d)) + "km");
                }
            }
        }
        // Updating plot after drawing of markers
        if (AugmentedActivity.showPlot) plot.draw(canvas);
        drawing.set(false);
        long end = System.currentTimeMillis();
        long timeElapsed = end - start;
        Log.v("AugumentedView", Long.toString(timeElapsed) + " Time To Draw");
        Log.v("AugumentedView", (collection.size()) + " no of landmarks detected");
    }
}

6.3 CONCLUSION
The conclusion from this test activity is that the application performs optimally on both phones. The graph in Figure 6.1 conveys that the execution time for the Samsung Galaxy S3 is better than for the HTC Thunderbolt, primarily because of the superior configuration of that phone [28, 29]. The configuration details of the phones are listed in Table 6.4. The execution times for the Samsung Galaxy S3 and HTC Thunderbolt are listed in Table 6.1 and Table 6.2, respectively. The Y axis represents the time taken to record the landmarks in milliseconds; the X axis represents the radius in km around the current location within which the landmarks were recorded. The graph was plotted using MATLAB. A script file was created in MATLAB, in which the data of Table 6.3 has been recorded into two arrays. The script file projectPlot.m, when run as a MATLAB command, generates the graph shown in Figure 6.1. The contents of the script file are given below.

% projectPlot.m
x = [25;50;75;100];
y = [19.625,5.708333; 47.72,28.875; 67.52,44.45833; 67.84,49.91667];
figure;
bar(x,y);

41

CHAPTER 7

CONCLUSIONS AND OBSTACLES

This application performs optimally on an Android phone, as is also deduced from the execution results presented in the previous chapter. This chapter illustrates some of the obstacles faced during the implementation phase of the application. One obstacle faced while developing this application concerned the usage of the magnetic field sensor of the phone. This application requires the use of two types of sensors: the accelerometer sensor and the magnetic field sensor. It is possible to use the Android sensor simulator plugin to simulate an accelerometer in the Android emulator, but it is not possible to simulate the magnetic field sensor of the phone in the emulator. The output of the magnetic field sensor is crucial, as it is used as an argument to the getRotationMatrix method (see Table 4.2), so all the code testing and execution relating to this functionality had to be done on the phone. Another obstacle concerned the usage of multithreading in this application. The application contains a data class holding details like the current location, the radius of detection, and so on. The methods that set and get these data values are prefixed with the synchronized keyword, so only one thread is allowed to access these data values at any instant of time, in order to maintain the integrity of the data values. Unbounded multithreading was therefore causing issues in the application, so a blocking queue with a maximum thread pool size of one was used.

42

CHAPTER 8

FUTURE SCOPE AND IMPROVEMENT

This application uses location information obtained from GeoNames.org together with augmented reality to present local points of interest on a camera preview. In the future, the concept of augmentation on a camera preview could be combined with Twitter and Facebook web services to create interactive, useful applications. The Unity software development kit is used in industry to create 3D games, but it can also be used to create 3D educational models. Support for opening a more detailed web view when a landmark icon is touched on the phone screen could be added to the application in the future. Currently the application performs optimally when the user is walking or standing still with the phone, but the performance and accuracy of the application are impeded when it is used in a fast-moving car.

43

BIBLIOGRAPHY

WORKS CITED
[1] IDC, Android marks fourth anniversary since launch with 75.0% market share in third quarter, according to IDC. IDC, http://www.idc.com/getdoc.jsp?containerId=prUS23771812#.UPNpKOQ82Ds, accessed December 2012, n.d.
[2] QUIN STREET INC., Augmented reality. Webopedia, http://www.webopedia.com/TERM/A/Augmented_Reality.html, accessed December 2012, n.d.
[3] R. METZ, Augmented reality is finally getting real. Technology Review, http://www.technologyreview.com/news/428654/augmented-reality-is-finally-getting-real/, accessed December 2012, last modified August 2012.
[4] APPLE, Word Lens. Apple, https://itunes.apple.com/us/app/word-lens/id383463868?mt=8, accessed December 2012, n.d.
[5] ACCUFORM SIGNS, Spanish bilingual sign, danger high voltage/peligro alto voltaje, 14" X 10", vinyl. Amazon, http://www.amazon.com/Accuform-Signs-Spanish-Bilingual-Voltage/dp/B001836WFK, accessed December 2012, n.d.
[6] GOOGLE, Google Sky map. Google, http://www.google.com/mobile/skymap/, accessed December 2012, n.d.
[7] G. STERLING, Will ‘Point & Find’ get Nokia back in the game? i2G, http://internet2go.net/news/local-search/will-point-find-get-nokia-back-game, accessed December 2012, n.d.
[8] UNITY TECHNOLOGIES, Game engine, tools, and multiplatform. Unity Technologies, http://unity3d.com/unity/, accessed December 2012, n.d.
[9] S. MAHANTI, Java with Android updates: Android architecture. Blogspot, http://java4freshers.blogspot.com/2012/02/android-architecture.html, accessed December 2012, n.d.
[10] WIKIPEDIA, Dalvik (software). Wikipedia, http://en.wikipedia.org/wiki/Dalvik_(software), accessed December 2012, n.d.
[11] GOOGLE, Activity. Android, http://developer.android.com/reference/android/app/Activity.html, accessed December 2012, last modified April 2013.
[12] GOOGLE, Application fundamentals. Android, http://developer.android.com/guide/components/fundamentals.html, accessed December 2012, n.d.

44

[13] GOOGLE, Location and maps. Android, http://developer.android.com/guide/topics/location/index.html#maps, accessed December 2012, n.d.
[14] GOOGLE, LocationManager. Android, http://developer.android.com/reference/android/location/LocationManager.html, accessed December 2012, n.d.
[15] GOOGLE, Location. Android, http://developer.android.com/reference/android/location/Location.html, accessed December 2012, n.d.
[16] GOOGLE, GeoCoder. Android, http://developer.android.com/reference/android/location/Geocoder.html, accessed December 2012, last modified April 2013.
[17] GOOGLE, Address. Android, http://developer.android.com/reference/android/location/Address.html, accessed December 2012, last modified April 2013.
[18] GOOGLE, Google Maps Android v1 API (deprecated). Google, https://developers.google.com/maps/documentation/android/v1/mapkey, accessed December 2012, last modified March 2013.
[19] WIKIPEDIA, World Geodetic System. Wikipedia, http://en.wikipedia.org/wiki/World_Geodetic_System, accessed January 2013, n.d.
[20] WIKIPEDIA, GeoNames. Wikipedia, http://en.wikipedia.org/wiki/GeoNames, accessed January 2013, n.d.
[21] GOOGLE, Org.json. Android, http://developer.android.com/reference/org/json/package-summary.html, accessed January 2013, n.d.
[22] GOOGLE, Camera. Android, http://developer.android.com/reference/android/hardware/Camera.html, accessed January 2013, last modified April 2013.
[23] GOOGLE, SensorManager. Android, http://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix, accessed January 2013, n.d.
[24] T. BRAY, One screen turn deserves another. Blogspot, http://android-developers.blogspot.com/2010/09/one-screen-turn-deserves-another.html, accessed January 2013, last modified September 2010.
[25] GOOGLE, SensorListener. Android, http://developer.android.com/reference/android/hardware/SensorListener.html, accessed January 2013, last modified April 2013.
[26] GOOGLE, BlockingQueue. Android, http://developer.android.com/reference/java/util/concurrent/BlockingQueue.html, accessed January 2013, last modified April 2013.

45

[27] GOOGLE, ThreadPoolExecutor. Android, http://developer.android.com/reference/java/util/concurrent/ThreadPoolExecutor.html, accessed January 2013, last modified April 2013.
[28] VERIZON WIRELESS, Samsung Galaxy S III. Verizon Wireless, http://www.verizonwireless.com/b2c/store/controller?item=phoneFirst&action=viewPhoneDetail&selectedPhoneId=5987, accessed December 2012, n.d.
[29] PHONEARENA, HTC Thunderbolt. PhoneArena, http://www.phonearena.com/phones/HTC-ThunderBolt_id4985, accessed December 2012, n.d.

WORKS CONSULTED
R. SOOD, Pro Android Augmented Reality, Apress, New York, New York, 2012.
S. VAN EVERY, Pro Android Media: Developing Graphics, Music, Videos and Rich Media Apps for Smartphones and Tablets, Apress, New York, New York, 2009.
M. VUSKOVIC, Advanced Robotics, Lecture Notes, San Diego State University, San Diego, California, 2012.

46

APPENDIX

SETTING UP THE DEVELOPMENT AND TEST ENVIRONMENT

47

SETTING UP THE DEVELOPMENT AND TEST ENVIRONMENT
 Introduction.
 Setting up the development environment.
 Installation of the ADB driver for the phone.
 Setting up USB debugging on the phone.
Introduction: Android offers a custom plugin for the Eclipse IDE, called Android Development Tools (ADT). This plugin provides a powerful, integrated environment in which to develop Android apps. It extends the capabilities of Eclipse to let us set up new Android projects, build an application UI, and debug applications.
Setting up the development environment: To download the Eclipse IDE: http://www.eclipse.org/downloads/. To download and install the ADT plugin, start Eclipse, then select Help > Install New Software. Click Add, and in the Add Repository dialog enter "ADT Plugin" for the Name and the following URL for the Location: https://dl-ssl.google.com/android/eclipse/. Click OK. In the Available Software dialog, select the checkbox next to Developer Tools and click Next.

In the next window, you will see a list of the tools available for downloading. Click Next. Click Finish after accepting the license agreement to complete installation.

48

Installation of the ADB driver for the phone: This project allows saving a png file obtained after taking a camera snapshot. To be able to access the phone as a disk drive and to allow USB debugging, ADB drivers need to be installed for each type of phone. The ADB driver for the HTC Thunderbolt phone is called HTC Sync; similarly, the ADB driver for the Samsung Galaxy S3 is called Kies. These are free desktop applications that make it easy to sync media between the computer and the phone. Download link for the HTC Thunderbolt driver: http://www.htc.com/www/software/htc-sync-manager/. Download link for the Samsung Galaxy S3 driver: http://www.samsung.com/us/kies/

Setting up USB debugging on the phone: 1. Open Settings from the phone menu, then go to the Applications option.

49

2. Choose the Development suboption in the next window.

3. Click the check box to enable USB debugging on the phone.