
A Novel Approach to Home Automation: Application Development with Glass

Rupesh Chinta (rupesh.ch97@.com), Roland Fong ([email protected]), Isabel Murdock ([email protected]),
William Kang ([email protected]), Quinn Williamson ([email protected])

ABSTRACT

Recently developed wearable devices have opened up new capabilities in human interactions with the world. Home automation is an area of interest for the application of wearable devices. Currently, home automation is performed with home control panels or mobile devices. This paper, on the other hand, presents an implementation of home automation using the Google Glass. With a few simple voice commands or finger swipes, the user can monitor the home and control home appliances, thus bringing a new level of interaction with the home. The Glass communicates with an online server that relays commands to a microcontroller to execute certain tasks. The microcontroller is, in turn, able to communicate sensor information to the Glass through the same server. This framework serves as a prototype that demonstrates the potential for wearable technology to simplify the user's in-home experience. In the future, it can be expanded to include more features and more devices for the user to interact with in the home.

1 INTRODUCTION

The rapid advancement of modern devices has led to an outgrowth of new ways for people to interact with the world. Among these advancements is Google Glass, which combines the computing power of a smartphone with convenience and accessibility. Wearable technology consists of computing devices that can be worn, like a belt or a backpack, and have a form of input, such as a touchpad [1]. The rise of wearable technology opens up a new wealth of possibilities for potential applications, including sensing the user's immediate environment [2], navigating [1], and assisting medical professionals [3]. One application of wearable devices is in home automation. A primary goal of home automation is to optimize the efficiency and comfort of home residents. However, current efforts to fulfill these desires face limitations ranging from restricted portability, such as that provided by wall panels, to a lack of a nearly hands-free experience when using smartphones and tablets [4].

Wearable devices solve these problems with constant accessibility and awareness of their context, or the user's surrounding area [1]. Because they are constantly accessible, the interfaces of most wearable devices are designed to accommodate constant usage [5]. Google Glass, which focuses on simplicity and efficiency, best fulfills these necessities. It allows users to easily complete daily tasks through advanced voice and motion recognition. In addition, Google Glass is comfortable because it can be worn like ordinary glasses, providing an unobtrusive yet novel experience.

This research seeks to design an innovative approach to home automation through Glass application development. The key features of our application, "Start Home Auto", incorporate the Android Development Kit (SDK), Arduino, and the Google App Engine.

2 BACKGROUND

2.1 Why Google Glass?

Google Glass is a wearable device with an optical head-mounted display and a touchpad on the right temple of the glasses. The user can interact with Google Glass both by using voice commands and by tapping or swiping the touchpad. Glass displays information in the top-right of the user's field of view, in a location that is easily accessible without obstructing the user's sight. Additionally, Glass can connect to the Internet through Wi-Fi and Bluetooth, and can use this to access online devices and servers.

2.2 Components

The development of this application relies on several programs, tools, and hardware. The Android Development Toolkit (ADT) allows development of Android mobile applications. The Google App Engine, a "Platform as a Service" (PaaS), allows the Android application to access an online server. Finally, the Arduino microcontroller, an open-source electronics platform, controls most of the hardware.

2.2.1 Android Development Toolkit

Google Glass runs on the latest Android operating system, 4.4.2, also known as "Kit Kat." The Android Development Kit (SDK) includes a debugger, libraries, a handset emulator, sample code, tutorials, and documentation. The Java programming language is used to write Android applications. The Extensible Markup Language (XML) is also used for design in the ADT. XML is a common markup language designed to carry data. In the ADT, XML is used to declare static structures like those in the Android Manifest. The Manifest holds the application name, version, and icon (not applicable with this application), as well as permissions from the operating system, used to request additional features of a device's hardware or access personal information and services.

The officially supported Integrated Development Environment (IDE) is Eclipse. Its annotation-based code gives it ease of use from a developer's perspective. In order to code specifically for the Google Glass, the Glass Development Kit (GDK), a revision to the ADT, is used. The GDK handles specific Google Glass functionality, such as voice commands. Additionally, the entire project is compiled with the GDK and targets the Google Glass instead of the Android phone operating system.

2.2.2 Google App Engine

This PaaS allows developers to build and run applications through Google. A PaaS is a category of cloud computing that allows for the building of applications and services over the Internet. The App Engine is used for applications that have web service components, applications that keep a profile or sync across devices, and applications that need authentication. The App Engine Endpoints API parses parameters as JSON (JavaScript Object Notation), a lightweight data-interchange format, to translate data between servers. In the case of this application, Google App Engine was used to create servers. The servers allow the Arduino to communicate with the application via the Internet and vice versa.

The Google App Engine is especially tailored to this application because it is designed specifically for interfacing between servers and Android or Google products. The App Engine API also allows flexibility for controlling the input and output to and from the server.

2.2.3 Arduino

The Arduino is a single-board microcontroller that senses the environment by receiving inputs from many sensors. It can also output data and be programmed, allowing it to control devices. While there are many different Arduino models, the Arduino Yún is the microcontroller of choice because it is optimized for wireless connections, which are crucial to the application. The Yún also contains a USB-A port, a micro-SD card slot, 20 digital input/output pins (of which 7 can be used as PWM outputs and 12 as analog inputs), a 16 MHz crystal oscillator, a micro-USB connection, an ICSP header, and 3 reset buttons. The Arduino Yún is programmed with a C/C++-based language using the Arduino IDE.

3 DEVELOPING A GOOGLE GLASS APPLICATION

3.1 Overview

Start Home Auto was created to demonstrate the potential of wearable devices in home automation. The application ultimately allows the user to remotely control home utilities. The project consists of three components: the Google Glass application, an online server hosted on App Engine, and hardware controlled by an Arduino microcontroller.
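Section 2.2.2 notes that the Endpoints API exchanges parameters as JSON. As a rough illustration, a Glass-side toggle command might be serialized as follows; the class and field names are invented for this sketch and are not the project's actual code.

```java
// Hypothetical sketch: how a Glass-side toggle command might be
// serialized into the flat JSON relayed through the App Engine server.
// Class and field names are illustrative, not the project's code.
public class ToggleCommand {
    private final String device; // e.g. "lights" or "ac"
    private final String state;  // "ON" or "OFF"

    public ToggleCommand(String device, String state) {
        this.device = device;
        this.state = state;
    }

    // Build the flat JSON object a server endpoint could parse.
    public String toJson() {
        return "{\"device\":\"" + device + "\",\"state\":\"" + state + "\"}";
    }

    public static void main(String[] args) {
        // A "lights on" command as it might travel to the server.
        System.out.println(new ToggleCommand("lights", "ON").toJson());
        // prints {"device":"lights","state":"ON"}
    }
}
```

A real Endpoints backend would generate and parse such payloads automatically; the point here is only that each command reduces to a small, self-describing JSON object.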

3.2 Glass Application

The Google Glass runs on the Android operating system and is programmed using the Android Development Toolkit (ADT) in the Eclipse development environment. Using the voice command "Start Home Auto," the user can open the application and view the home screen, a menu that allows the user to access the toggle menu, the program menu, or live-streamed video. These options can be used to wirelessly control hardware and monitor the home through live streaming and motion detection.

3.3 App Engine Server

The Google App Engine server allows the Google Glass application and the Arduino to interact. The communication flows in both directions. The Glass application can send a command, such as "lights on," to the server. The command is then relayed to the Arduino, which performs the actual command. When the Arduino detects motion, it prompts the server, and the server provides a timestamp for when the motion was detected. The Glass application polls the server periodically, and when it finds that motion has been detected, it gets the timestamp and notifies the user.

3.4 Arduino Microcontroller and Hardware

The Arduino microcontroller toggles a switch, which is directly connected to the hardware, namely the lights and the air conditioner. With the output received from the Google App Engine, the Arduino turns the devices on or off. In addition, the Arduino is connected to an infrared sensor, which acts as a motion detector. When the values of the infrared sensor fluctuate, the Arduino notifies the server and then pauses before checking the sensor again.

4 RESULTS AND DISCUSSION
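The fluctuation check described in Section 3.4 boils down to watching for large changes between consecutive infrared readings. The sketch below models that logic in Java for consistency with the paper's application code; the real check runs on the Arduino in C/C++, and the threshold value here is an arbitrary assumption.

```java
// Illustrative Java model of the Section 3.4 motion check; the real
// logic runs on the Arduino in C/C++. The threshold is an assumption,
// not a value taken from the project.
public class MotionDetector {
    private int lastReading;
    private final int threshold;

    public MotionDetector(int firstReading, int threshold) {
        this.lastReading = firstReading;
        this.threshold = threshold;
    }

    // Returns true when consecutive infrared readings fluctuate by
    // more than the threshold, i.e. when motion is inferred.
    public boolean update(int reading) {
        boolean motion = Math.abs(reading - lastReading) > threshold;
        lastReading = reading;
        return motion;
    }

    public static void main(String[] args) {
        MotionDetector d = new MotionDetector(512, 40);
        System.out.println(d.update(515)); // small drift -> false
        System.out.println(d.update(700)); // large jump  -> true
    }
}
```

On detection, the Arduino would notify the server and pause, exactly as the paper describes, to avoid flooding the server with repeated alerts.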

Figure 1. Project Schematic
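The command path in the schematic (Glass to App Engine server to Arduino) can be reduced to a small simulation: the Glass side records a desired state on the server, and the Arduino side polls that state and drives its output accordingly. All names here are illustrative stand-ins, not the project's code.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for the project pipeline: the Glass app writes a
// desired state to the server, and the Arduino polls the server and
// sets its output to match. All names are illustrative.
public class RelayDemo {
    // The "server" simply stores the latest desired state per device.
    static Map<String, String> server = new HashMap<>();

    // Glass side: e.g. the result of the "lights on" command.
    static void glassSend(String device, String state) {
        server.put(device, state);
    }

    // Arduino side: read the stored state and report the pin level.
    static boolean arduinoPoll(String device) {
        return "ON".equals(server.get(device));
    }

    public static void main(String[] args) {
        glassSend("lights", "ON");
        System.out.println(arduinoPoll("lights")); // true: pin driven high
        glassSend("lights", "OFF");
        System.out.println(arduinoPoll("lights")); // false: pin driven low
    }
}
```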

Table 1: Program Classes

MainActivity: Home page. Allows access to the Toggle, Task, and Streaming Activities.
LiveStreamingActivity: Displays the live stream feed from a camera connected to the Arduino.
MotionNotificationActivity: Not on the UI thread. Notifies the user about motion detected by the device on the Arduino.
TaskActivity: Allows the addition of programmable tasks (actions that will occur at a certain time).
ManageActivity: Allows the user to delete existing tasks.
ToggleActivity: Allows access to the light switch and AC switch.
ToggleGetService: Starts a background service that checks whether motion has been detected.
States: Holds the state of each toggle-able activity as well as the motion timestamp.
ToggleGetTask: Receives information from the Arduino through the server.
ToggleJsonParser: Parses the string edited in ToggleActivity so the server can understand individual commands.
TogglePutTask: Sends commands to the server through JSON.
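To illustrate the role a class like ToggleJsonParser plays, the toy parser below turns a flat JSON object such as {"lights":"ON","ac":"OFF"} into key/value pairs. It handles no nesting or escaped quotes; it is a sketch of the idea, not the project's actual parser.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy parser for a flat JSON object of string fields, e.g.
// {"lights":"ON","ac":"OFF"}. No nesting or escapes are handled;
// this sketches the role of ToggleJsonParser, not its real code.
public class FlatJson {
    public static Map<String, String> parse(String json) {
        Map<String, String> out = new LinkedHashMap<>();
        String body = json.trim();
        body = body.substring(1, body.length() - 1); // strip { and }
        if (body.isEmpty()) return out;
        for (String pair : body.split(",")) {
            String[] kv = pair.split(":", 2);
            out.put(unquote(kv[0]), unquote(kv[1]));
        }
        return out;
    }

    private static String unquote(String s) {
        s = s.trim();
        return s.substring(1, s.length() - 1); // strip surrounding quotes
    }

    public static void main(String[] args) {
        Map<String, String> cmd = parse("{\"lights\":\"ON\",\"ac\":\"OFF\"}");
        System.out.println(cmd.get("lights") + " " + cmd.get("ac")); // ON OFF
    }
}
```

In the actual application, a parser of this kind would let the server and the Arduino agree on individual commands carried in a single payload.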

4.1 Application Activities and Classes

The finished application has a simple user interface that includes few activities: a Toggle Activity, an Add Tasks Activity, a Manage Tasks Activity, and a Live Streaming Activity. In the actual code for the application, there are a total of eleven classes. The Notification Activity informs the user about detected motion; otherwise, there is no separate screen for this notification. The remaining classes communicate with the Google App Engine server. Table 1 describes the use of each class in the application.

4.2 Application Expandability

The current prototype of Start Home Auto and its associated hardware includes capabilities to toggle the lights and air conditioning and to notify the user of motion. The user can also create a task program that will automatically toggle the lights or the air conditioning at a preconfigured time. When the user navigates to the Toggle Menu, he or she can tap the Google Glass to bring up the menu, which gives the current status of the lights and air conditioning and allows the user to toggle between states by tapping. Additionally, while the Google Glass is on, whenever the Arduino detects motion, Google Glass will alert the user and give the time and date it was detected. The user can then take appropriate action.

Figure 2. Glass User Interface Diagram
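The motion alert described above amounts to comparing the server's latest motion timestamp with the last one already shown to the user. A minimal sketch of that decision, with invented names:

```java
// Sketch of the poll-and-notify decision made by the background
// service: alert only when the server reports a motion timestamp
// newer than the last one shown to the user. Names are illustrative.
public class MotionPoller {
    private long lastNotified = 0; // epoch millis of last alert

    // Called once per polling interval with the server's latest
    // motion timestamp (0 if no motion has ever been recorded).
    public boolean shouldNotify(long serverTimestamp) {
        if (serverTimestamp > lastNotified) {
            lastNotified = serverTimestamp;
            return true; // new motion: show the time and date on Glass
        }
        return false; // nothing new since the last alert
    }

    public static void main(String[] args) {
        MotionPoller p = new MotionPoller();
        System.out.println(p.shouldNotify(1000)); // true: first event
        System.out.println(p.shouldNotify(1000)); // false: already seen
        System.out.println(p.shouldNotify(2500)); // true: newer event
    }
}
```

Keeping the comparison on the Glass side means the server only has to store one timestamp, and repeated polls for the same event do not generate duplicate alerts.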

The current framework for sharing information between Google Glass and the Arduino in both directions can be expanded to include a wide variety of tasks and capabilities, greatly simplifying the user experience. For example, the Google Glass can be programmed to tell the Arduino to open the garage. The Google Glass can then track the user's location, and the user can configure the application to open the garage or turn on the lights when the user is within a certain radius of his or her home. The Arduino, or any microcontroller, can also receive data from sensors around the home. With appropriate hardware, and utilizing the convenience of Google Glass's voice commands, the user can ask Google Glass to do even more. For instance, a coffee maker could be configured to brew coffee every time the user asks Google Glass, "Glass, make me coffee."

4.3 Using "Start Home Auto"

When the user turns on the Google Glass, he or she can view the applications using the voice command "ok glass." Upon reaching the menu, the user can say "start home auto" to run the application. This command takes the user to the main activity, a list including three options: Toggle, Program, and Stream. Using voice commands again, or taps on the touchpad, allows the user to choose an option.

If Toggle is chosen, the user has two screens, which he or she modifies to send information to the server. The first is the light toggle, with the option to change the status of the "Lights" to "ON" or "OFF" with a tap of the touchpad. The second option on Toggle is the AC toggle, which functions the same way.

If Program is chosen, the user again has two options: Add Tasks or Delete Tasks. Tasks include programming the application to automatically toggle either the lights or the AC at certain times. In the future, such an activity could also be programmed to toggle certain tasks when the user is a certain radius away from the home.

If Stream is chosen at the main activity, the live stream from the security camera feed is displayed. The application also checks the server for motion detection every minute. When motion is detected, Glass notifies the user with the time of detection. Figure 2 summarizes the Glass User Interface (UI).

4.4 Challenges Faced

Several factors made designing and implementing home automation using Google Glass difficult. Recent updates to Android's operating system introduced bugs in the latest version of the Eclipse IDE, which meant that it had to be reverted to an older version. The Google Glass, being a relatively recent development, did not have all the tools available for ease of development. For example, the lack of an emulator made testing Glass code difficult.

Additionally, the Arduino Yún, designed specifically for Wi-Fi connections, experienced problems accessing the Internet and the secure server.

Another challenge was using Glass's simple interface to program some complicated aspects of the user interface. For instance, selecting a time on a phone is trivial due to the screen size (which is larger than that of Glass) and its touchscreen. Glass, on the other hand, uses a smaller screen, only a touchpad, and a simplistic design philosophy. As such, to select a time when programming tasks, a user must navigate through four menus: first a menu containing the hours, then a menu containing the "tens minutes" (the tens digit of the minutes), then the "ones minute," and finally whether the time is AM or PM.

4.5 Future Employment

The concept of home automation using wearable devices is easily expandable, and this project demonstrates just a fraction of its capabilities. "Start Home Auto" was created as a model to be built upon by future applications. Its capabilities are minimal as a result of time constraints and limited access to home devices. Home automation systems should maximize user customization, and "Start Home Auto" in some ways limits the extent to which the user may personalize the application for his or her specific needs. The application allows the user to toggle only two devices. Applications that build upon these features should allow the user to add devices and program more features with ease.

Home automation application systems will eventually become more accessible to the public. Because the application depends on its users having some background in code to understand its workings and use it efficiently, it is not entirely practical. Future applications will build upon the functions and current capabilities of this first Glass home automation application. The next generation of applications of its kind will make home automation more accessible and user-friendly to consumers.

5 CONCLUSION

"Start Home Auto" serves as a prototype for future projects involving the integration of the home with wearable devices. The Google Glass worked as an excellent medium for exhibiting the potential of wearable devices because it incorporates simple convenience with powerful computing unmatched by other kinds of wearable devices to date. The simple interface, which includes a menu to toggle activities, program tasks, and stream video, captures three of the most basic aspects of the integration of such technology with the home. The user can control his or her home utilities and monitor the surroundings without so much as moving a finger.

Additionally, the motion detection serves as an example of how appliances in the home can communicate back to the wearable device. The flexibility inherent in the program allows for easy expansion of functionality toward more practical applications.

ACKNOWLEDGEMENTS

The authors would like to thank Mr. Victor Kaiser-Pendergrast for his guidance throughout the Android development process and his insights on the Google Glass application. The reference resources provided by Mr. Kaiser-Pendergrast also contributed greatly to determining the most accurate and efficient process for setting up a network for this research. The authors would also like to acknowledge their Residential Teaching Assistant, Anthony Yang, for overseeing the project and providing support and guidance throughout its duration. The research group is also grateful to Rutgers University, the Governor's School of Engineering and Technology (GSET) program, Director Dean Ilene Rosen, Assistant Director Jean Patrick Antoine, the program sponsors, and all Residential Teaching Assistants for the support crucial to the group's success. Finally, the research group would like to thank the sponsors of the GSET program: The State of New Jersey, Morgan Stanley, Lockheed Martin, Silver Line Windows, South Jersey Industries, Inc., The Provident Bank Foundation, and Novo Nordisk.

REFERENCES

[1] Billinghurst, M., and T. Starner. "Wearable Devices: New Ways to Manage Information." Computer 32, no. 1 (1999): 57-64. doi:10.1109/2.738305.

[2] Farringdon, J., Moore, A. J., Tilbury, N., Church, J., and Biemond, P. D. "Wearable Sensor Badge and Sensor Jacket for Context Awareness." Third International Symposium on Wearable Computers, 1999. Accessed July 18, 2014.

[3] Hung, K., Y. T. Zhang, and B. Tai. "Wearable Medical Devices for Tele-Home Healthcare." 26th Annual International Conference of the IEEE EMBS, September 1-5, 2004. Accessed July 16, 2014. doi:10.1109/IEMBS.2004.1404503.

[4] Yamazaki, Tatsuya. "Ubiquitous Home: Real-life Testbed for Home Context-Aware Service." International Journal of Smart Home 1, no. 1 (January 2007). Accessed July 14, 2014.

[5] Lumsden, J., and S. Brewster. "A Paradigm Shift: Alternative Interaction Techniques for Use with Mobile & Wearable Devices." Proceedings of the 2003 Conference of the Centre for Advanced Studies on Collaborative Research, 2003, 197-210. Accessed July 16, 2014.
