LOW-COST EMBEDDED SECURITY SYSTEMS

A Degree Thesis Submitted to the Faculty of the Escola Tècnica d'Enginyeria de Telecomunicació de Barcelona Universitat Politècnica de Catalunya by Javier Sánchez Buitrago

In partial fulfilment of the requirements for the degree in ELECTRONIC SYSTEMS ENGINEERING

Advisor: Sergi Bermejo

Barcelona, January 2017

Abstract

This project presents a low-cost device for the IoT (Internet of Things), developed using open-source tools, which could be used in the field of security since it has features that make life safer for people. In addition, an Android app is provided to manipulate the device easily.


Resum

Aquest projecte presenta un dispositiu de baix cost per les IoT (Internet de les Coses) desenvolupat utilitzant eines de codi obert, el qual podria ser utilitzat en el camp de la seguretat, ja que té algunes característiques que faran la vida dels éssers humans més segura. A més, comptarà amb una APP per un smartphone Android, proporcionada per manipular fàcilment la interfície.


Resumen

Este proyecto presenta un dispositivo de bajo coste para las IoT (Internet de Cosas) desarrollado utilizando herramientas de código abierto, el cual podría ser utilizado en el campo de la seguridad, ya que tiene algunas características que harán la vida de los seres humanos más segura. Además, contará con una APP para un Smartphone Android, proporcionada para manipular fácilmente la interfaz.


Revision history and approval record

Revision Date Purpose

0 1/10/2016 Document creation

1 28/12/2016 Document revision

DOCUMENT DISTRIBUTION LIST

Name e-mail

Javier Sánchez Buitrago [email protected]

Sergi Bermejo [email protected]

Written by: Javier Sánchez, Project Author. Dates: 14/9/2016 - 8/1/2017

Reviewed and approved by: Sergi Bermejo, Project Supervisor. Date: 12/1/2017


Table of contents

Abstract ...... 1
Resum ...... 2
Resumen ...... 3
Revision history and approval record ...... 4
Table of contents ...... 5
List of Figures ...... 8
List of Tables ...... 10
1. Introduction ...... 11
1.1. Aims of the work ...... 11
1.2. Scope ...... 11
1.3. Requirements and Specifications ...... 12
1.4. Work Plan, Packages and Task Milestones ...... 12
1.4.1. Work Packages ...... 13
1.4.2. Milestones ...... 16
1.4.3. Time plan and Gantt diagram ...... 17
1.4.4. Deviations from the plan ...... 18
1.5. Organization of this document ...... 18
2. Security embedded systems: a review of the state of the art ...... 19
2.1. Background ...... 19
2.2. Origin of the IoT ...... 20
2.3. What is IoT ...... 20
2.3.1. IoT elements ...... 21
2.3.2. Communication models ...... 22
2.3.3. IoT communication standards ...... 24
2.3.3.1. Application protocols ...... 24
2.3.3.2. Service Discovery Protocols ...... 24
2.3.3.3. Infrastructure Protocols ...... 24
2.4. Fundamental characteristics and requirements for IoT ...... 25
2.4.1. Characteristics ...... 25
2.4.2. Requirements ...... 26
2.5. Applications of IoT ...... 27


2.6. Safety and Safety System ...... 27
2.7. A low Cost Embedded Security System ...... 28
2.7.1. How works MQTT ...... 29
2.7.1.1. Architecture ...... 29
3. Methodology / project development ...... 31
3.1. Design ...... 31
3.1.1. Architecture of the project ...... 31
3.2. Development ...... 36
3.2.1. List of equipment ...... 36
3.2.2. Device 1 ...... 36
3.2.2.1. Raspberry pi zero ...... 36
3.2.2.2. Movement infrared sensor, HC-SR501 ...... 37
3.2.2.3. Ultrasonic sensor HC-SR04 ...... 39
3.2.2.4. Battery Charge & Power Boost ...... 40
3.2.3. Device 2 ...... 41
3.2.3.1. Camera ...... 41
3.2.4. Device 3 ...... 42
3.2.4.1. GPS Module Ublox M6 ...... 42
3.3. Engineering & implementation ...... 43
3.3.1.1. Raspbian ...... 44
3.3.1.2. Broker ...... 44
3.3.1.3. Device 1 ...... 45
3.3.1.4. Device 2 ...... 53
3.3.1.5. Device 3 ...... 55
3.4. Uses Cases ...... 57
3.4.1. Device 1 ...... 57
3.4.2. Device 2 ...... 57
3.4.3. Device 3 ...... 58
3.5. Specification ...... 58
4. Frontend ...... 59
4.1. APP - Android ...... 59
4.2. Design ...... 59
4.2.1. Login Page ...... 60
4.2.2. Options & use ...... 61

5. Results ...... 66
6. Budget ...... 71
7. Conclusions and future work ...... 73
8. Bibliography ...... 76
Glossary ...... 78
Annexes ...... 79
Annex I. Comparison about different application layer protocols ...... 79
Conclusions ...... 87
Annex II. Code ...... 88
PIR sensor ...... 88
HCSR4 sensor ...... 89
Camera module ...... 90
GPS module ...... 91
Algorithms of the device 1 ...... 92
Algorithms of the device 2 ...... 98
Android APP ...... 106
LAYOUTS ...... 106
Colors ...... 130
Dimensions ...... 130
Strings ...... 131
Styles ...... 132
JAVA code for LCESS ...... 132
Web ...... 164
GET created to pass the values in the DB ...... 169


List of Figures

Figure 1. Time Plan ______ 17
Figure 2. Gantt ______ 17
Figure 3. Operation mode of typical security system companies ______ 19
Figure 4. Meeting like an equation of IoT ______ 21
Figure 5. IoT elements ______ 21
Figure 6. Standardization layers in support of IoT (comparison with an ISO model) ______ 25
Figure 7. Global share of IoT application in projects ______ 27
Figure 8. Comparison between the IoT application protocols ______ 29
Figure 9. Clients MQTT subscribed to a topic ______ 30
Figure 10. The behaviour of the MQTT clients when one publishes ______ 30
Figure 11. Client libraries comparison ______ 33
Figure 12. First part of LCESS architecture ______ 33
Figure 13. Second part of LCESS architecture ______ 34
Figure 14. Third part of LCESS architecture ______ 35
Figure 15. LCESS architecture ______ 35
Figure 16. Raspberry pi Zero module ______ 37
Figure 17. HC-SR501 PIR sensor chosen ______ 37
Figure 18. Working of a PIR motion sensor ______ 37
Figure 19. PIR motion sensor electrical configuration ______ 38
Figure 20. Raspberry Pi PIR motion sensor connection ______ 38
Figure 21. Ultrasonic Sensor ______ 39
Figure 22. Ultrasonic function ______ 39
Figure 23. 5V to 3.3V Voltage Divider ______ 40
Figure 24. Raspberry Pi Ultrasonic motion sensor connection ______ 40
Figure 25. Battery and Power Boost of the LCESS ______ 41
Figure 26. Connection between battery, power boost and raspberry ______ 41
Figure 27. Raspberry Pi Camera module ______ 41
Figure 28. Six-inch adapter cable connected in both modules ______ 42
Figure 29. Ublox M6 GPS module ______ 42
Figure 30. Commands received from the GPS ______ 43
Figure 31. Raspberry Pi GPS module connection ______ 43
Figure 32. Test of a message using the MQTT application protocol ______ 45
Figure 33. MQTT publish message scheme ______ 46
Figure 34. MQTT subscribe topic scheme ______ 46
Figure 35. Calibrated algorithm scheme of PIR sensor ______ 47
Figure 36. Detect movement algorithm of PIR sensor ______ 48
Figure 37. Counter algorithm of PIR sensor ______ 48
Figure 38. Switch algorithm of PIR sensor ______ 48
Figure 39. Calibrate algorithm scheme of HCSR4 sensor ______ 49
Figure 40. Distance algorithm scheme of HCSR4 sensor ______ 50
Figure 41. Configuration distance algorithm scheme of HCSR4 sensor ______ 50
Figure 42. Algorithm of detect distance when movement is detected ______ 51
Figure 43. Algorithm of not exceed the distance condition when movement is detected ______ 51
Figure 44. Algorithm of not exceed the time condition when movement is detected ______ 52
Figure 45. Algorithm of counter when movement is detected ______ 52
Figure 46. Algorithm of distance detect when all conditions are accomplished ______ 53
Figure 47. Take a picture scheme camera module ______ 53
Figure 48. Take number specific of picture scheme camera module ______ 54
Figure 49. Video scheme camera module ______ 54
Figure 50. Video scheme camera module with choosing a time ______ 54

Figure 51. Video scheme camera module with transparency ______ 55
Figure 52. Algorithm of detect distance when movement is detected and take a photo ______ 55
Figure 53. Algorithm of GPS module ______ 56
Figure 54. Log in page of LCESS app ______ 60
Figure 55. Start page of LCESS app ______ 60
Figure 56. Introduction & Start up page of LCESS ______ 61
Figure 57. Devices page of the LCESS app ______ 61
Figure 58. Functions of JsB LCESS ______ 62
Figure 59. Functions of JsB-LCESS ______ 62
Figure 60. The image and video received from the device ______ 63
Figure 61. Functions of JsB-CG LCESS ______ 63
Figure 62. Notification of LCESS app ______ 64
Figure 63. Log in screen of LCESS Web ______ 64
Figure 64. Start device screen of LCESS Web ______ 65
Figure 65. Inserted and saved values of LCESS Web ______ 65
Figure 66. The different components needed to test the device ______ 66
Figure 67. Message controlled for the broker ______ 67
Figure 68. Schematic of sensor hardware ______ 67
Figure 69. PCB of sensors hardware ______ 68
Figure 70. Part I of LCESS hardware ______ 68
Figure 71. Part II of LCESS hardware ______ 69
Figure 72. 3D Solid Work chassis design ______ 69
Figure 73. Result of the 3D printing ______ 70
Figure 74. JsB-CG or Device 3 ______ 70
Figure 74. Standardization of IoT ______ 79
Figure 75. Functionality of MQTT, we can see the publishers, subscribers and the broker ______ 79
Figure 76. MQTT Architecture diagram ______ 80
Figure 77. MQTT message format ______ 80
Figure 78. Functionality CoAP ______ 81
Figure 79. CoAP Architecture diagram ______ 82
Figure 80. CoAP message format ______ 82
Figure 81. Functionality XMPP ______ 83
Figure 82. Structure of XMPP stanza ______ 84
Figure 83. XMPP Architecture diagram ______ 84
Figure 84. Functioning of AMQP ______ 85
Figure 85. AMQP message format ______ 85
Figure 86. AMQP frame format ______ 86
Figure 87. Comparison between the IoT application protocols ______ 88


List of Tables:

Table 1. IoT elements example ______ 23
Table 2. The IoT elements of LCESS ______ 28
Table 3. LCESS architecture ______ 35
Table 4. LCESS system specifications ______ 58
Table 5. Price of different components of LCESS ______ 71
Table 6. Cost for units ______ 72
Table 7. Cost total of each device ______ 72


1. Introduction

Smart security systems currently lag behind the technology the market offers. The security systems of most homes have remained at the same point for many years (for reasons unrelated to the objectives of this work). Although some automated houses have been built, it is difficult to find a house that offers all the possibilities technology gives us in our daily life (sensors, social communication, apps, etc.). As progress in security has been rather limited, this project intends to begin applying these opportunities and features in this field.

1.1. Aims of the work

The main goal of this project is the development of a low-cost embedded security system that implements some degree of smartness in order to make several aspects of daily life easier. Additionally, and as a secondary goal, this embedded security system will have small dimensions, the highest attainable versatility and ease of use.

Accordingly, users will not only be able to know whether there is something "at this site": it will be a smart security system capable of establishing rules, communicating with other devices, informing the user through different means or platforms, etc.

This system will be based on a Raspberry Pi Zero and different sensors that allow us to obtain and send information. Starting from a movement sensor and a distance sensor, we want to obtain a basic system with smart additions, such as a camera and GPS, that provide the device with important functionalities and give it more value. Finally, to use it as an embedded device we also include a battery that makes it portable, so we obtain a versatile device that is competitive in the market.

To meet the expectations explained above, we need to be able to configure the device from "another site"; in other words, it must be configurable over the internet. For this reason, this device is part of the Internet of Things, and we need an application or platform that allows us to manipulate it. In this case, an Android application will be developed. In accordance with the aims of the work, the scope is described below.

1.2. Scope

Thus, a detailed list of goals is the following:

1.- To build a low-cost security system.

2.- To build an embedded security system.

3.- To build a security system able to detect movement.

4.- To build a security system able to measure distance.

5.- To build a security system able to take photos.

6.- To build a security system able to record video.

7.- To build a security system able to report its GPS location.

8.- To build a security system that notifies the customer of detected movement in some form.

9.- To build a security system that notifies the customer of the measured distance in some form.

10.- To build a security system able to take and save photos or videos and send them.

11.- To build an APP to configure the device from anywhere (Wi-Fi).

12.- To build an APP that allows establishing rules on the device.

1.3. Requirements and Specifications

Product requirements:

- The LCESS should detect movement at no more than 6 meters.
- The LCESS should measure distances of no more than 4 meters.
- The LCESS should run for a minimum of 4 hours on battery.
- The LCESS should accomplish the proposed objectives.

1.4. Work Plan, Packages and Task Milestones

In the preliminary stages of development, the first ideas were motivated by providing a simple security system for the author's surfboard. In subsequent stages, these preliminary ideas were refined by current research practices and topics in the related areas, resulting in the six work packages detailed below.


1.4.1. Work Packages:

Project: Low-cost embedded security system WP ref: (WP1)

Major constituent: Information Sheet 1 of 2

Short description: Search information and choose the suitable environment for each part of the project. Decide the material, architecture, protocols, hardware and software that will be used.
Planned start date: 5/09/2016
Planned end date: 30/09/2016
Start event: 5/09/2016
End event: 30/09/2016

Internal task T1: Choose the instruments that will be used.
Internal task T2: Choose the OS and the broker; define and draw the schematic of all communications and the protocols that will be used.
Deliverables: Final Design. Date: 30/09/2016

Project: Low-cost embedded security system WP ref: (WP2)

Major constituent: hardware prototype Sheet 1 of 2

Short description: In each device, connect the corresponding sensors of the prototype. Check that the electrical parameters are correct. Test whether the sensors perform their function well.
Planned start date: 7/10/2016
Planned end date: 25/11/2016
Start event: 3/10/2016
End event: 25/11/2016

Internal task T1: Check sensors of device 1. Date: 7/10/2016
Internal task T2: Test if the sensors will do their function well. Date: 14/10/2016
Internal task T3: Check sensors of device 2. Date: 21/10/2016
Internal task T4: Test if the sensors will do their function well. Date: 28/10/2016
Internal task T5: Check sensors of device 3. Date: 4/11/2016
Internal task T6: Test if the sensors will do their function well. Date: 11/11/2016
Internal task T7: Define technical specifications and the expected margins. Date: 18/11/2016
Deliverables: Sensor configurations and connections.


Project: Low-cost embedded security system WP ref: (WP3)

Major constituent: software and testing Sheet 1 of 2

Short description: Define and build the different functions in the Python language to provide the different "smart" functionalities designed for each device. Test the different functions and the final algorithm.
Planned start date: 17/10/2016
Planned end date: 6/12/2016
Start event: 17/10/2016
End event: 6/12/2016

Internal task T1: Check algorithm for device 1. Date: 17/10/2016
Internal task T2: Check algorithm for device 2. Date: 31/10/2016
Internal task T3: Check algorithm for device 3. Date: 20/11/2016
Deliverables: Algorithms explication.

Project: Low-cost embedded security system WP ref: (WP4)

Major constituent: front-end Sheet 1 of 2

Short description: Configure a broker and check the different points of communication. Build an Android APP to test the device.
Planned start date: 14/10/2016
Planned end date: 10/1/2017
Start event: 14/10/2016
End event: 10/1/2017

Internal task T1: Check the communication with the broker from the different points. Date: 14/10/2016
Internal task T2: Build a first version of the APP to configure device 1; check the communication between the phone and the broker. Date: 20/10/2016
Internal task T3: Build a second version of the APP to configure device 2. Date: 4/11/2016
Internal task T4: Build a third version of the APP to configure device 3. Date: 30/11/2016
Internal task T5: Build a fourth version of the APP able to configure the device with Wi-Fi Direct. Date: 20/12/2016
Deliverables: App design and functionalities.


Project: Low-cost embedded security system WP ref: (WP5)

Major constituent: Hardware Sheet 1 of 2

Short description: Connect the different devices with their battery and charger to obtain the final version of the devices and check their electrical parameters.
Planned start date: 1/12/2016
Planned end date: 6/1/2017
Start event: 1/12/2016
End event: 6/1/2017

Internal task T1: Join the devices. Date: 30/11/2016
Internal task T2: Check the consumption and the guarantees we will offer. Date: 15/12/2016
Internal task T3: Do functional tests. Date: 28/12/2016

Project: Low-cost embedded security system WP ref: (WP6)

Major constituent: Chassis and final prototype Sheet 1 of 2

Short description: Build a chassis to have a protected and portable device.
Planned start date: 4/01/2017
Planned end date: 10/01/2017
Start event: 4/01/2017
End event: 10/01/2017

Internal task T1: Build a PCB for the different sensors and solder the components. Date: 4/01/2017
Internal task T2: Build a chassis and get the product finished. Date: 7/01/2017
Internal task T3: Write the thesis. Date: 10/01/2017


1.4.2. Milestones

WP# Task# Short title Milestone / deliverable Date (week)

1 1 Instrumentation final Design1 30/09/2016

1 2 Configurations final Design1_1 30/09/2016
2 1 Device 1 sensor conf. & conn. 1 7/10/2016

2 2 Testing sensor conf. & conn. 1_1 14/10/2016

2 3 Device 2 sensor conf. & conn. 1_2 21/10/2016

2 4 Testing sensor conf. & conn. 1_3 28/10/2016

2 5 Device 3 sensor conf. & conn. 1_4 4/11/2016

2 6 Testing sensor conf. & conn. 1_5 11/11/2016

2 7 Technical specifications sensor conf. & conn. 1_6 18/11/2016

3 1 Algorithm 1 algorithms explication1 17/10/2016

3 2 Algorithm 2 algorithms explication1_1 31/10/2016

3 3 Algorithm 3 algorithms explication1_2 20/11/2016

4 1 Checked communication app design and functionalities1 14/10/2016
4 2 Android APP 1 app design and functionalities1_2 20/10/2016

4 3 Android APP 2 app design and functionalities1_3 4/11/2016

4 4 Android APP 3 app design and functionalities1_4 30/11/2016

4 5 Android APP 4 app design and functionalities1_5 20/12/2016

5 1 Join together 1/12/2016

5 2 Checked technical specifications 15/12/2016
5 3 Tests 28/12/2016

6 1 PCB schematic 4/01/2017

6 2 Chassis schematic 7/01/2017

6 3 Write the thesis Write final document. 10/01/2017


1.4.3. Time plan and Gantt diagram

The time plan and Gantt diagram are the following:

Figure 1. Time Plan

Figure 2. Gantt


1.4.4. Deviations from the plan

In the second package, the hardware prototype, we had problems with the antenna of the GPS sensor, but with additional work this problem was resolved. In the same line, we had some problems sending images using the MQTT (Message Queuing Telemetry Transport) protocol. We implemented other solutions, although we would still like to reach that solution too. In the fourth package we had some problems, but the app is complete enough to later be given a better look-and-feel. When we delivered the Project Plan & Work Plan document, we did not know the delivery dates, so the final dates were approximate. The last two packages were therefore completed some days early to ensure the final device was ready before the deadline.
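One common workaround for sending images over MQTT is to wrap the raw bytes in a text-safe envelope. The following is a minimal sketch of that idea, not the project's actual code: the helper names and the topic in the comment are invented for illustration, and the actual publish would be done with an MQTT client library such as paho-mqtt.

```python
import base64
import json


def make_image_payload(image_bytes, device_id, fmt="jpeg"):
    """Wrap raw image bytes in a JSON envelope with base64 text, so the
    payload survives clients or tooling that mishandle binary data."""
    return json.dumps({
        "device": device_id,
        "format": fmt,
        "data": base64.b64encode(image_bytes).decode("ascii"),
    }).encode("utf-8")


def read_image_payload(payload):
    """Inverse of make_image_payload: recover the device id and raw bytes."""
    msg = json.loads(payload.decode("utf-8"))
    return msg["device"], base64.b64decode(msg["data"])

# With an MQTT client (e.g. paho-mqtt) the envelope would then be sent as:
#   client.publish("lcess/device2/photo", make_image_payload(data, "JsB"))
```

Note that MQTT itself treats the payload as opaque bytes, so base64 is not strictly required by the protocol; it simply makes the message easier to pass through JSON-based tooling at the cost of roughly 33% overhead.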

1.5. Organization of this document

This document describes the development of an IoT project that could be launched to the market, and it is divided into the following sections:

 State of the art provides an overview of the general concepts of this thesis, enabling a deeper understanding of the material at hand. It starts with a description of the IoT and continues with the essential theory behind all the components needed to realize this technology.

 Methodology defines the device, protocols and software used to develop it. It holds technical details about the environment, the set-up, and the integration in the device.

 Frontend shows how the device works, how to use it, and some information about it.

 Results contains information about the results of the project and a short demonstration of basic use cases.

 Budget contains the information from the point of view of cost management.

 Conclusions and future directions states the most important features of the work done and points out further developments.


2. Security embedded systems: a review of the state of the art

2.1. Background

For many years, human beings have kept an extra eye on their safety, a necessity/vulnerability to satisfy/cover. Where in the past this meant placing snares, nowadays it means installing advanced security systems. These systems can be used in business premises, private property, public property, etc. Although the functionality of security systems has changed over the years, recent technology advances have allowed an important increase in their complexity.

Security systems have been offered as a service by companies specialized in this sector, which has advantages and some handicaps. Some Spanish companies in this sector are, among others, Securitas Direct, Prosegur and Verisure. The common point these companies share is their mode of operation: when clients acquire their systems, the companies install everything necessary, including some extra features like 24/7 monitoring, notification of the police, etc. In particular, they have assistants who, in case of any unexpected event, report it.

Figure 3. Operation mode of typical security system companies

Considering all of the above, it is inevitable to wonder about the money we have to pay every month, and here lies the handicap mentioned before. This is one of the reasons for the creation of the LCESS device, with which we will be able to perform all the functions that these companies offer without the need of any assistant or operator. Clients will then pay only once, since it does not require a monthly payment. Accordingly, we will try to obtain the most competitive price possible and to develop a system whose installation, use, effectiveness and manageability are as intuitive as possible.

Another big advantage, and one of the reasons why we decided to build this device, is the capacity to keep an eye on any physical element or object at any moment. Not just to prevent a robbery, but also for fear of leaving it behind, to prevent anybody from touching it at an inappropriate moment (formatting laptops, downloads, material drying, etc.), and to warn anybody against entering a dangerous site (it is wet, being cleaned, under construction, etc.).


These different applications open the mind to the multiple capabilities this device could offer: control of persons, vigilance and control of different aspects in an industry, establishing rules for vigilance, and more. Once we have decided on the device we will build and the technology we will use, we need to study all the theory required to produce an elaborate IoT device.

2.2. Origin of the IoT

Nowadays, the IoT concept is becoming popular, although it is not new in terms of technology. The idea of the IoT can be traced back to the article "The 'Only' Coke Machine on the Internet" (Carnegie Mellon University, 1982), which described in the first person how an internet-connected Coke machine functions. Later on, more studies from different approaches shaped the vision in which things start to think.

It was in 1999 when Kevin Ashton introduced a new paradigm named Internet of things (IoT) [1]. He presented the imagination of an internet connected physical world based on real-time feedbacks through new technology. The idea of IoT is “the pervasive presence around us of a variety of things or objects which, through unique addressing schemes, are able to interact with each other and cooperate with their neighbours to reach common goals” [2].

At the beginning, the IoT was focused on RFID but, currently, this concept has been changing. Besides RFID, thing identification can be achieved through other technologies, as we will see later.

2.3. What is IoT

There are many definitions of this concept: all big companies like Google, IBM, Intel, etc. have their particular definition. The definition of IoT is still rather fuzzy and remains subject to philosophical debate.

McKinsey & Company defines it as: "As objects become embedded with sensors and gain the ability to communicate, the new information networks promise to create new business models, improve business processes, and reduce costs and risks." [3]

Recommendation ITU-T Y.2060 defines it as: "A global infrastructure for the information society, enabling advanced services by interconnecting (physical and virtual) things based on existing and evolving interoperable information and communication technologies". [4]

The IoT consists in taking advantage of all the data that "things" can provide in order to improve the interaction between them and their surroundings. Through the exploitation of identification, data capture, processing and communication capabilities, the IoT makes full


use of things to offer services to all kinds of applications, whilst ensuring that security and privacy requirements are fulfilled.

Figure 4. Meeting like an equation of IoT

When we talk about things, we mean physical or virtual objects that can be identified and integrated into communication networks. More concretely, physical objects can be sensed, actuated and connected, while virtual things exist in the information world and can be stored, processed and accessed.

We can also talk about devices, which are not the same as things. Devices are pieces of equipment with capabilities of communication, sensing, actuation, data capture, data storage and data processing.

From a broader perspective, the IoT can be perceived as a vision with technological and societal implications, and the elements that compose it are described below.

2.3.1. IoT elements

Understanding the IoT building blocks helps to gain better insight into the real meaning and functionality of the concept.

Figure 5. IoT elements.

IoT elements are the following:

Identification: It is crucial for the IoT. Objects have to be identified by their name or ID, and they must have an address so they can be reached over a communication network.

Sensing: It means gathering data from related objects within the network and sending it back to a database or cloud. The collected data is analysed to take specific actions based on required services.


Communication: The IoT communication technologies connect heterogeneous objects together to deliver specific smart services. Typically, the IoT nodes should operate using low power in the presence of lossy and noisy communication links.

Computation: The processing units and application software that act as the brain or computational ability of the IoT (e.g., microcontrollers, microprocessors, SoCs, FPGAs). These brains are directly related to the OS.

Services: There are four different services: Identity-related Services, Information Aggregation Services, Collaborative-Aware Services and Ubiquitous Services.

Semantics: The ability to extract knowledge smartly by using resources, modelling information, and recognizing and analysing data in order to make the right decision and provide the required services.

In Table 1, we can see different samples of IoT elements to understand them better.
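As an illustration only, the six building blocks above can be written down for a hypothetical sensor node similar to the LCESS. All names and values below are invented examples for this sketch, not taken from the project:

```python
# Illustrative only: a hypothetical sensor node described in terms of the
# six IoT elements. Every identifier and value here is an example.
iot_elements = {
    "identification": {"id": "lcess-node-01", "address": "2001:db8::42"},  # example IPv6
    "sensing":        ["PIR motion sensor", "ultrasonic distance sensor"],
    "communication":  ["Wi-Fi", "MQTT over TCP"],
    "computation":    {"hardware": "Raspberry Pi Zero", "os": "Raspbian"},
    "services":       ["identity-related", "information aggregation"],
    "semantics":      "JSON",
}


def describe(elements):
    """Render the element map as one line per IoT building block."""
    return [f"{key}: {value}" for key, value in sorted(elements.items())]
```

Writing the elements out this way makes it easy to check that a device design covers all six blocks before choosing concrete hardware and protocols.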

2.3.2. Communication models

From an operational perspective, it is useful to think about how IoT devices connect and communicate in terms of their technical communication models [5]1.

Device-to-Device Communications: It represents direct communication between two or more devices that are connected to each other. They communicate through Z-Wave, Bluetooth, ZigBee, etc.

Device-to-Cloud Communications: It represents direct communication between the device and an internet cloud service. The devices connect directly to the cloud, such as an application service provider, to exchange data and control message traffic. They communicate through Ethernet, Wi-Fi, SigFox, etc.

Device-to-Gateway Model: The device connects to an application-layer gateway (ALG) service; that is to say, application software operating on a local gateway device acts as an intermediary between the device and the cloud service, providing security and other functionality such as data or protocol translation. It

1 In March 2015, the Internet Architecture Board (IAB) released a guiding architectural document for the networking of smart objects (RFC 7452), which outlines a framework of four common communication models used by IoT devices. The discussion presents this framework and explains the key characteristics of each model.


communicates through Ethernet or Wi-Fi to the local gateway and through IPv4/IPv6 to the application server provider.

Back-End Data-Sharing Model: It represents an architecture of communication that permits users to export and analyse data from the devices that are connected directly to the cloud or locally. This architecture allows the user to grant access to the uploaded sensor data by third parties for their particular use.

The four basic communication models demonstrate the underlying design strategies used to allow IoT devices to communicate.
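To make the difference between two of these models concrete, the following plain-Python sketch (no networking; all helper and topic names are invented for illustration) shows how a device-to-cloud message might be built by the device itself, and how a device-to-gateway deployment would have a local gateway translate and tag the same message before forwarding it:

```python
import json
import time


def cloud_message(device_id, sensor, value):
    """Device-to-cloud: the device itself builds the message it would
    send straight to the cloud service."""
    return {"device": device_id, "sensor": sensor,
            "value": value, "ts": int(time.time())}


def gateway_translate(msg, gateway_id):
    """Device-to-gateway: a local gateway enriches/translates the device
    message before forwarding it to the cloud. Protocol translation is
    simulated here by tagging the gateway and re-encoding to JSON."""
    out = dict(msg)
    out["via_gateway"] = gateway_id
    return json.dumps(out)
```

In the device-to-cloud model the device bears the full cost of the cloud connection, while in the device-to-gateway model constrained devices can use a cheap local link and let the gateway handle internet connectivity.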

IoT elements Samples

Identification: Identifier: EPC2, uCode3. Addressing: IPv4, IPv6.

Sensing: Smart sensors, wearable sensing devices, embedded sensors, actuators, RFID tags.

Communication: RFID, NFC, UWB, Bluetooth, BLE, IEEE 802.15.4, Z-Wave, WiFi, WiFi-Direct, LTE-A, ZigBee, SigFox.

Computation: Hardware: SmartThings, Arduino, Intel Galileo, Raspberry Pi, Phidgets, smartphones. Software: OS (Contiki, TinyOS, LiteOS, Android); Cloud (Hadoop, Nimbits, etc.).

Service: Identity-related, information aggregation, smart home, smart security, smart safety.

Semantic: XML, JSON, EXI, RDF.

Table 1. IoT elements example

2 The Electronic Product Code (EPC) is designed as a universal identifier that provides a unique identity for every physical object anywhere in the world, for all time.
3 uCode: an identification number system that can be used to uniquely identify things in the real world.

2.3.3. IoT communication standards

The IoT device is always communicating with the "gateway" to send the data to a real-time service, so we need a communication protocol able to save as much battery as possible. For this reason, we have carried out a study to choose the standard protocol we will use (Annex I).

The common standard protocols for IoT, which aim to ease and simplify the work of application and service providers, are the following.

2.3.3.1. Application protocols

 Constrained Application Protocol (CoAP)

 Message Queue Telemetry Transport (MQTT)

 Extensible Messaging and Presence Protocol (XMPP)

 Advanced Message Queuing Protocol (AMQP)

 Data Distribution Service (DDS)

 Hypertext Transfer Protocol (HTTP)

These protocols make use of the service-discovery and infrastructure protocols defined below.

2.3.3.2. Service Discovery Protocols

The most used protocols are multicast DNS (mDNS) and DNS Service Discovery (DNS-SD), which are able to discover resources and services offered by IoT devices.

2.3.3.3. Infrastructure Protocols

 IEEE 802.15.4: A standard that specifies the physical layer and media access control for low-rate wireless personal area networks (LR-WPANs). It is maintained by the IEEE 802.15 working group and is the basis for further specifications, each of which extends the standard by developing the upper layers, which IEEE 802.15.4 does not define.

 IPv6: An Internet-layer protocol for packet-switched internetworking that provides end-to-end datagram transmission across multiple IP networks.

 6LoWPAN: An acronym for IPv6 over Low-Power Wireless Personal Area Networks, an adaptation layer for IPv6 over IEEE 802.15.4 links.


 Bluetooth: It works in the 2.4 GHz ISM band and uses frequency hopping, with a data rate of up to 3 Mbps and a maximum range of 100 m. Each application type that can use Bluetooth has its own profile.

 ZigBee: This protocol uses the 802.15.4 standard and operates in the 2.4 GHz frequency range at 250 kbps. The maximum number of nodes in the network is 1024, with a range of up to 200 meters. ZigBee can use 128-bit AES encryption.

 Z-Wave: A wireless communication protocol for smart-home automation.

 LTE-A: A mobile communication standard, an improved version of LTE with extended coverage, higher throughput and lower latency.

 WiFi-Direct: Used for peer-to-peer communication without the need for an access point.

Figure 6. Standardization layers in support of IoT (comparison with an ISO model)

2.4. Fundamental characteristics and requirements for IoT

The IoT device must fulfil the following characteristics and requirements:

2.4.1. Characteristics

 Interconnectivity: The IoT must ensure interconnection with the global information and communication infrastructure through the communication models described above.

 Things-related services: The IoT provides thing-related services within the constraints of devices, such as privacy protection and semantic consistency between physical things and their associated virtual things.

 Heterogeneity: The IoT is heterogeneous because it is based on different hardware platforms and networks. Devices can interact with other devices or service platforms through different networks.


 Dynamic changes: The state of IoT devices changes dynamically (connected and/or disconnected), as does their context, including location and speed.

 Enormous scale: Within a few years, billions of things are expected to need management and mutual communication; the management of the generated data and its interpretation for application purposes will then be even more critical.

2.4.2. Requirements

 Identification-based connectivity: The IoT must support connectivity between a "thing" and the IoT based on the thing's identifier (ID), including the ability to process possibly heterogeneous IDs of different things in a unified way.

 Interoperability: Interoperability must be guaranteed; in other words, IoT systems need to "speak the same language" of protocols and encodings.

 Autonomic networking: The network control functions of the IoT must adapt to different application domains, different communication environments and a large number and variety of devices, so that the system is usable by any user in any place. (Autonomous networks include self-management, self-configuration, self-healing, self-optimization and self-protection techniques and/or mechanisms.)

 Provision of regional services: These services cover capture, automatic data processing and communication, all based on rules configured by the IoT device manufacturer.

 Location-based capabilities: They must be IoT-compliant and able to sense and track location information automatically.

 Security: Each connected 'thing' must not give rise to serious security threats; the confidentiality, authenticity and integrity of data and services must be protected.

 Privacy: The IoT has to support privacy protection during data transmission, aggregation, storage, mining and processing, as well as authentication of the data source.

 High-quality and highly secure human-body-related services: These must be kept in mind because laws and regulations differ from country to country.

 Plug and play: On-the-fly generation, composition or acquisition of information must be achievable for a good integration of interconnected things with applications.

 Manageability: Normal network operations must be ensured, even though most of the IoT works automatically.


2.5. Applications of IoT

The applications of the IoT are distributed across different scenarios and, because the range of devices is so broad, it is difficult to classify them all (e.g. smart home, smart city, connected car, connected worker, connected industry, etc.).

The next figure shows the most important segments in which the IoT can be used. This analysis, made for Q3 of 2016, shows that the number of these devices is increasing.

Figure 7. Global share of IoT application in projects

2.6. Safety and Safety System

We can understand safety as: "The state of being safe; the condition of being protected from harm or other non-desirable outcomes."

We want to use the IoT to cover this necessity and protect us from such outcomes, so we can understand a safety system as: "A set of devices strategically placed around the perimeter of a specific site to detect the presence, irruption or invasion of an unknown person or of an individual who does not have proper access."

Having seen the main definitions needed to understand the theory behind the idea, we can start to describe what the device will look like and which sensors, communications, gateways and particularities it must include.


2.7. A low Cost Embedded Security System

This section establishes a link between the previous theoretical framework and the product specifications: the components, architecture, protocols and formats needed to build a low-cost embedded security system are defined.

After a careful analysis of the specifications, the following functions and hardware needs can be derived:

- Movement or presence detection.

- Distance detection.

- Camera to capture pictures in real time. (The system also permits live viewing of the camera whenever the customer wants.)

- Camera to capture videos in real time.

- Localization.

Table 2 shows the different components necessary to build the system.

IoT elements | LCESS

Identification | Identifier: LCESS; Addressing: IPv4

Sensing | Embedded security system, smart security system

Communication | Wi-Fi

Computation | Hardware: Raspberry Pi Zero; Software: Linux Debian 8.0

Service | Smart home, smart security, smart safety

Semantic | JSON

Table 2. The IoT elements of LCESS

After the study of the application protocols (Annex I), depicted in Figure 8, we conclude:


Figure 8. Comparison between the IoT application protocols

Finally, only two protocols offer the best solutions on account of their advantages: MQTT and CoAP. MQTT gives flexibility in communication patterns and acts purely as a pipe for binary data, while CoAP is designed for interoperability with the web. We have chosen the MQTT protocol to implement the solution for the following reasons:

 There are more Python libraries for MQTT.

 It is open source.

 It is one of the most used protocols in the IoT.

 There is good documentation on the web to help us.

 It is a robust protocol.

 Most importantly, battery usage: the message size in this protocol is smaller.

As our project will be controlled by an app, battery life is an important point, if not the most important one.

2.7.1. How MQTT works

MQTT is a messaging protocol, originally developed by IBM, that allows us to publish and subscribe to a topic, also known as a message queue.

2.7.1.1. Architecture

MQTT follows a client/server model in which each device using the protocol acts as a client that connects to a server, known as a broker, over TCP.

Each client message is published to a topic created on the broker. Each device subscribes to one or more topics and thus receives every message that any device publishes to them.

As depicted below, we can see the different clients subscribed to a temperature topic.

Figure 9. MQTT clients subscribed to a topic.

For example, if client A publishes a value of 22.5 for the topic temperature, the broker forwards the message to all subscribed clients.

Figure 10. The behaviour of the MQTT clients when one publishes.

This functionality allows MQTT clients to communicate one-to-one, one-to-many and many-to-one. It is also important to note that MQTT has hierarchical topics, like a filing system (e.g. Temperature/plant1/room1). Moreover, to ensure privacy, the TCP connection may be encrypted with SSL/TLS.
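Hierarchical topics can be matched with the wildcards "+" (one level) and "#" (all remaining levels). As a broker-independent illustration, the following sketch shows, in simplified form, how a broker decides whether a published topic matches a subscription filter (the function is ours, not code from the project):

```python
def topic_matches(filter_, topic):
    """Return True if an MQTT topic filter matches a published topic.

    '+' matches exactly one level, '#' matches all remaining levels.
    Simplified: $-prefixed system topics are not treated specially.
    """
    f_levels = filter_.split("/")
    t_levels = topic.split("/")
    for i, f in enumerate(f_levels):
        if f == "#":                       # multi-level wildcard: match the rest
            return True
        if i >= len(t_levels):             # filter has more levels than topic
            return False
        if f != "+" and f != t_levels[i]:  # literal level must match exactly
            return False
    return len(f_levels) == len(t_levels)

print(topic_matches("Temperature/+/room1", "Temperature/plant1/room1"))  # True
print(topic_matches("Temperature/#", "Temperature/plant2/room3"))        # True
print(topic_matches("Temperature/plant1", "Temperature/plant1/room1"))   # False
```

This is the rule that lets one subscription such as Temperature/# receive the readings of every plant and room at once.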


3. Methodology / project development:

This chapter explains and describes the different pieces and components that are involved in the project: hardware, software, sensors, material, architecture, protocols, frontend, backend and formats used to build and represent the LCESS.

3.1. Design

To cover all the goals defined in the introduction, we need to apply the state of the art with the correct architecture, protocols and formats. For the broker, the device, the app and the proposed solution, the following questions must be answered:

1. How do we get a low-cost and embedded security system?

2. Which are the best sensors to meet the needs or goals?

3. Which is the best camera to meet the requirements?

4. Which is the best GPS to meet the requirements?

5. How will the device communicate, and why?

6. How will the mobile app communicate, and why?

7. What is the best way to fit together all the pieces of the LCESS puzzle?

By answering these questions we will define our design, covering all its parts and thus building the architecture.

3.1.1. Architecture of the project

1. To fulfil the first two goals, we need to use an SBC (Single Board Computer) that allows us to deploy all code, software and scripts. The best option is the cheapest SBC on the market: the Raspberry Pi Zero.

2. To fulfil the third and fourth goals, we need a sensor able to detect movement and compute distance. There are many sensors on the market that could be used, but our decision was based on two requirements: the sensor must be compatible with the SBC and must have a low price. We have chosen the PIR (passive infrared) HC-SR501 sensor for movement detection and the HC-SR04 ultrasonic sensor for distance computation.

3. To fulfil the next two goals, we need a camera to capture and reproduce images. Unlike the sensors, the choice of camera is more limited, because few low-priced models are compatible with the Raspberry Pi. Accordingly, we have chosen the Raspberry Pi camera Rev 1.3.

4. To fulfil the GPS goal, we need a low-cost sensor. For compatibility reasons, we have chosen the low-cost GPS module known as Ublox M6, which has the smallest antenna on the market.

5. To fulfil the communication specifications of the device and the mobile app, we need a protocol that minimizes power consumption and works with images. Consequently, the MQTT application-layer protocol and Wi-Fi at the network layer were chosen.

7. To make all the above components operate together as an embedded system, a PCB (Printed Circuit Board) will be designed; a chassis has also been included to protect the board and to improve the final presentation of the product.

Having explained everything we will use, we can start to join the pieces and explain each step in more detail. First, we need to prepare the SBC so that it can build and run software, scripts and deployments. To do this, we installed NOOBS [16], the out-of-the-box software, with the Raspbian OS (a Debian Linux distribution) [17] on the Raspberry Pi. With this set-up we are able to communicate with MQTT [18]; we need a broker that lets us publish and subscribe to the topics through which we communicate with other devices. In [19] we can see that there are several public brokers, such as Eclipse, Mosquitto, RabbitMQ, etc., but they are often useful only for testing and prototyping, not for running a project and offering a service. Moreover, the MQTT foundation warns: "none of these test brokers carry any guarantee of service. Be sensible when using them and don't break things for others". We can therefore test the device with these brokers, but to ship a product and give a robust service we need to run our own broker, with which we can control the messages and offer a reliable system to the clients. There are other brokers or servers [20] that work with this protocol to guarantee M2M and M2D communication, but we finally chose the Mosquitto broker [19] because it is open source and has many client libraries. All the scripts deployed on the device are written in Python, for which Mosquitto provides the Eclipse Paho library.


Figure 11. Client libraries comparison

The Mosquitto message broker performs the server function and manages and controls all information coming from the security device, which acts as the client in this case. Every device or smart object that we connect, such as the Android app, also acts as a client. Step by step we construct the architecture of the project, as shown in the figure below.

Figure 12. First part of LCESS architecture.

In the second part, we prepare the Raspberry Pi so that it can act as the device and fulfil all the requirements of the LCESS. We can use a different OS from the server, since we only have the following requirements:

1. A Python IDLE.

2. A text editor.

3. Network interfaces.

4. USB 2.0 (Universal Serial Bus) port interfaces.

5. A CSI (Camera Serial Interface) interface, using I2C to control the camera module.

6. GPIO (general-purpose input/output) interfaces.


The best solution4 is to use the same OS as the server, Raspbian/Debian, and in the future to strip out as many packages and programs as possible so that the OS runs with the minimum resources (this step would be needed only for commercializing the product).

Figure 13. Second part of LCESS Architecture.

The last part of the project is the second client of the broker, the mobile app, which permits us to manage, control and manipulate the device.

A mobile app is a software application designed for smartphones, tablets or other mobile devices that allows the user to perform a particular task of any type.

There are three ways to implement it: native app, hybrid app and web app. We need a native app because we need access to all the system's resources, including the hardware, and this is the only model that guarantees this.

It only remains to choose the mobile OS. We choose Android because, as seen in the comparison of Mosquitto broker clients, a client library is already implemented for it.

4 We have studied the possibility of using the TinyOS, Contiki or LiteOS operating systems (open software), but they do not support the libraries we need. We could build these libraries from scratch, but that would be another TFG.


Figure 14. Third part of LCESS Architecture.

We summarize all the relevant information in a table so that it can be consulted at any moment and the document is easier to follow.

SERVER: SBC Raspberry Pi B | OS Debian | Broker Mosquitto | Architecture Server

DEVICE: SBC Raspberry Pi Zero | OS Debian | Application protocol MQTT | Architecture Client

APP: Device Smartphone | OS Android | Application protocol MQTT | Architecture REST

Table 3. LCESS architecture

Once the architecture of our project is clarified, we also capture it all in figure 15 to keep it in mind.

Figure 15. LCESS architecture.

Now we have established all the parts needed in the architecture of the device. The last aspect we have not yet taken into account is the use of IoT platforms, also known as IoT PaaS (Platform as a Service [22]), such as Clayster [40], Oracle [41], SAP HANA [42], Thethings [43], etc. These are very useful and can be divided into two groups: platforms for big projects, such as a smart city or the connected workers of an industrial plant, and more basic ones, useful for connecting a few sensors and offering a platform, usually a customizable dashboard in a web app, which is attractive for using sensors in an easy way, mainly to prototype a device concept or to test at user level. In this project we want to decide every part ourselves, without additional software layers or restrictions at the user layer, whereas with a platform we could only manipulate the dashboard or interface provided by the distributor. One of the most important points of this work is the versatility of the device, and this is only achieved if we configure every alternative ourselves. A platform could make the work easier, but the design must have the minimum possible restrictions, without ties to a specific software stack or brand. Therefore, the proposed solution does not use a platform, so as not to be tied to its protocols (good or bad for our project), its design decisions, its architecture, etc. Next, let us define how we deploy it and the tools used.

3.2. Development

3.2.1. List of equipment

- Raspberry Pi Zero.
- SD card (needed to host the OS and all generated data).
- PIR sensor.
- HC-SR04 ultrasonic distance sensor.
- Battery charger & power boost.
- Camera.
- Six-inch adapter cable.
- GPS.
- Chassis.

3.2.2. Device 1

3.2.2.1. Raspberry pi zero

The Raspberry Pi Zero measures only 65 mm long by 30 mm wide and 5 mm deep. It uses mini connectors to save space, and the 40-pin GPIO header is unpopulated, providing the flexibility to use only the connections your project requires.

 1Ghz, Single-core CPU

 512MB RAM

 Mini HDMI and USB ports

 Micro USB power

 40-pin header (GPIO, UART, SPI, I2C)

 Composite video and reset headers

Figure 16. Raspberry pi Zero module.

3.2.2.2. Movement infrared sensor, HC-SR501

 Definition

The infrared movement sensor "PIR" detects whether there is movement in its detection area. All objects emit a small quantity of infrared radiation; the hotter an object is, the more radiation it emits.

Figure 17. HC-SR501 PIR sensor chosen.

 Operation

Figure 18 shows the operation of the sensor, which uses a Fresnel lens and covers a detection area with a radius of 6 or 7 meters.

Figure 18. Working of a PIR motion sensor.

 Tuning & Electrical wiring diagram


PIR motion sensors can also adjust the delay during which the output stays high, at the expense of some accuracy, by turning the two potentiometer knobs on the sensor counter-clockwise5. The detection range can vary from 7 meters down to 3 meters, and the delay from 300 seconds down to 5 seconds.

The sensor also has a jumper that selects repeatable trigger mode, but we only need single trigger mode.

Figure 19. PIR motion sensor electrical configuration.

Finally, to test the sensor and adapt it to the behaviour we want the device to have, the connection with the Raspberry Pi is the following:

Figure 20. Raspberry Pi PIR motion sensor connection.

Once this is done, the next step is to check the algorithm responsible for detecting movement and to adjust it both in software and in hardware.

5 In our case we set the knobs to low delay and high sensitivity.

3.2.2.3. Ultrasonic sensor HC-SR04

 Definition

The ultrasonic ranging module provides non-contact distance measurement from 2 cm to 400 cm, with a ranging accuracy of up to 3 mm. The module includes an ultrasonic transmitter, a receiver and a control circuit.

Figure 21. Ultrasonic Sensor.

 Operation

It works by sending a high-level signal of at least 10 µs to the IO trigger; one of the two cylinders that compose the sensor emits an ultrasonic burst, and the module waits for the sound to bounce off an object and return. The echo is captured by the other cylinder.

Since the speed of sound in air is 340 meters per second, if we measure the time elapsed between the transmitted pulse and the returned signal, we can apply a simple equation to obtain the distance between the sensor and the object.

Summarizing, the operation is as follows (compare with figure 22):

1. A high signal of at least 10 µs is applied to the TRIG input to provide the trigger.

2. This causes the module to transmit a burst of eight 40 kHz ultrasonic pulses.

3. If there is an obstacle in front of the module, it reflects the ultrasonic waves.

4. When the signal comes back, the ECHO output of the module stays high for the time taken to send and receive the ultrasonic signals. The pulse width ranges from 150 µs to 25 ms depending on the distance of the obstacle from the sensor.

Figure 22. Ultrasonic function.

Equation to obtain the range:

range = time (s) × 340 (m/s) / 2
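As a quick sketch, the equation above can be turned into a small Python helper (the speed of sound, 340 m/s, is the value used in the text; GPIO handling is omitted here):

```python
SPEED_OF_SOUND = 340.0  # m/s, as assumed in the text

def echo_time_to_distance(echo_seconds):
    """Convert the ECHO pulse width (seconds) into distance (meters).

    The sound travels to the obstacle and back, so we divide by 2.
    """
    return echo_seconds * SPEED_OF_SOUND / 2.0

# A 10 ms echo corresponds to 1.7 m:
print(echo_time_to_distance(0.010))  # 1.7
```

On the real device, echo_seconds would be the measured high time of the ECHO pin.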

 Tuning & Electrical wiring diagram

Voltage divider: the ECHO output is 5 V. The Raspberry Pi GPIO input pins are rated at 3.3 V, so 5 V cannot be applied directly to the unprotected 3.3 V input pin. Therefore, we use a voltage divider circuit with appropriate resistors to bring the voltage down to 3.3 V.

Figure 23. 5V to 3.3V Voltage Divider.

The following equation can be used to calculate the resistor values:

Vout = Vin × R2 / (R1 + R2), with Vin = 5 V and Vout = 3.3 V
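A minimal check of this equation in Python, assuming for illustration the commonly used pair R1 = 1 kΩ and R2 = 2 kΩ (these concrete resistor values are our assumption, not taken from the text):

```python
def divider_out(v_in, r1, r2):
    """Output voltage of a resistive divider: Vout = Vin * R2 / (R1 + R2)."""
    return v_in * r2 / (r1 + r2)

# 5 V through R1 = 1 kOhm, R2 = 2 kOhm gives roughly 3.33 V,
# safely close to the 3.3 V the GPIO pin expects.
print(round(divider_out(5.0, 1000.0, 2000.0), 2))  # 3.33
```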

Finally, to test the sensor and adapt it to the behaviour we want the device to have, the connection with the Raspberry Pi is the following:

Figure 24. Raspberry Pi Ultrasonic motion sensor connection.

Once this is done, the next step is to check the algorithm responsible for measuring the distance, adjusting it both in software and in hardware.

3.2.2.4. Battery Charge & Power Boost

 Definition

Battery: a lithium-polymer battery produced by SHENZHEN PKCELL BATTERY CO., LTD [26].

The LP-803860 (3.7 V, 2000 mAh) with PCM (protection circuit module), which protects against over-charge and over-discharge, supplies the device with electricity and makes it autonomous.


Power boost: to be able to charge the device while it is operating, we need a power boost module. Without it, the device can run only while plugged into the mains or from a previously charged battery. To run and charge at the same time, the power boost is required.

Figure 25. Battery and Power Boost of the LCESS.

 Tuning & Electrical wiring diagram

Figure 26. Connection between battery, power boost and raspberry

3.2.3. Device 2

3.2.3.1. Camera

 Definition

A camera is an optical instrument to record or to capture images, which may be stored locally, transmitted to another location, or both. The images may be photographs or sequences of images constituting videos. The camera is a remote sensing device, as it senses subjects without physical contact.

Figure 27. Raspberry Pi Camera module.


 Operation

The camera board is a small PCB that connects to the Raspberry Pi's CSI camera interface using a short ribbon cable. It provides connectivity for a camera capable of capturing still images or video recordings. The camera connects to the Image System Pipeline (ISP) in the Raspberry Pi, where the incoming camera data is processed and eventually converted to an image or video on the SD card (or other storage).

 Tuning & Electrical wiring diagram

A custom six-inch adapter cable converts the fine-pitch connector format to the coarser pitch used by the camera board:

Figure 28. Six-inch adapter cable connected in both modules.

3.2.4. Device 3

3.2.4.1. GPS Module Ublox M6

 Definition

A GPS (Global Positioning System) tracker or GPS module is a device able to track "anything" (a person, a vehicle or an object) through its geographic coordinates (latitude and longitude) in real time, thanks to a receiver that communicates with a network of 24 satellites orbiting at about 20,200 km above our heads, with trajectories synchronized to cover the whole surface of the Earth.

These satellites use trilateration to determine the position of the locator with an accuracy of a few meters.

Figure 29. Ublox M6 GPS module.

 Operation

The NEO-6M GPS is compatible with the NMEA (National Marine Electronics Association) protocol [25]. Once the GPS is connected to a serial interface, it sends a series of sentences every second, following the protocol depicted in figure 30. The user has to write software able to recognize these sentences.

Figure 30. Commands received from the GPS.

Every sentence starts with the symbol $ and ends with a checksum and a line break. It is important to note that the module only provides valid data when the green LED blinks at intervals of approximately one second.
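As an illustration of this format, the sketch below computes and validates the NMEA checksum (XOR of the characters between "$" and "*") and extracts the raw latitude/longitude fields of a $GPGGA sentence. The sample sentence is a textbook-style example built here, not data captured from our module:

```python
def nmea_checksum(body):
    """XOR of all characters between '$' and '*' (NMEA checksum), as 2 hex digits."""
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return f"{calc:02X}"

def nmea_checksum_ok(sentence):
    """Validate a full NMEA sentence of the form $<body>*<checksum>."""
    body, _, checksum = sentence.strip().lstrip("$").partition("*")
    return nmea_checksum(body) == checksum.upper()

def parse_gga(sentence):
    """Return the raw latitude and longitude fields of a $GPGGA sentence."""
    fields = sentence.split(",")
    return fields[2] + " " + fields[3], fields[4] + " " + fields[5]

# Build a sample sentence with a self-consistent checksum:
body = "GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,"
gga = f"${body}*{nmea_checksum(body)}"
print(nmea_checksum_ok(gga))  # True
print(parse_gga(gga))         # ('4807.038 N', '01131.000 E')
```

The real software on the device would apply this kind of validation to every sentence read from the serial port before trusting its coordinates.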

 Tuning & Electrical wiring diagram

To connect the Ublox M6 to the Raspberry Pi, and to be able to test the sensor and adapt it to the desired behaviour, the connection is the following:

Figure 31. Raspberry Pi GPS module connection.

Once done, the next step is to check the algorithm responsible for reading the position and to adjust it both in software and in hardware.

3.3. Engineering & implementation.

We now have all the components connected, with their protocols and compatibility established. Therefore, all that remains is to start developing on the device to make it behave as defined in the objectives.


As we do not use any platform, the system will be built step by step. In the end, a platform is simply a base that makes compatible hardware and software modules work together, and that is what we are going to build ourselves.

3.3.1.1. Raspbian

Before all, we need to install our OS both for the server and for the device, therefore, here we explain how we have done it:

1. We downloaded the official Raspbian image [17].

2. We wrote it to the SD card, which has to be formatted as FAT32 (File Allocation Table).

3. We started to use it.

Then, we can install the broker, which will act as a server.

3.3.1.2. Broker

As commented before, we will use Mosquitto to control our messages, so we proceed to install it.

First, we need to import the repository package signing key:

wget http://repo.mosquitto.org/debian/mosquitto-repo.gpg.key
sudo apt-key add mosquitto-repo.gpg.key

Next, we make the repository available to apt and download the Jessie (Debian) version of the Mosquitto repository list:

cd /etc/apt/sources.list.d/
sudo wget http://repo.mosquitto.org/debian/mosquitto-jessie.list

Then, we update the apt information and install it:

sudo apt-get update
sudo apt-get install mosquitto

Once the broker is installed, we install mosquitto-clients, the command-line clients (very useful for debugging), and python-mosquitto, the Python language bindings:

sudo apt-get install mosquitto mosquitto-clients
sudo apt-get install python-mosquitto

Finally, we only need to verify the correct functioning of the broker. We open two terminals: in one we subscribe to a topic, and in the other we publish a message.


Figure 32. Test of a message using the MQTT application protocol.

The server and the device are now prepared for the logic programming and communication in Python. With this language it is easy to use the Mosquitto broker: all that is needed is a script responsible for sending a message to a topic, which we launch whenever we need it.

3.3.1.3. Device 1

We start by specifying the steps to implement the first device. We need a script that publishes to the broker; in our case, we publish to a topic called "Device1", to which every device that needs the data of the first device (to establish behaviour rules, predictions, etc.) will subscribe.

We use the paho-mqtt and json libraries to implement the script6, which allow us to communicate with the server and to send data in JSON format.

As depicted in figure 33, we use as host the IP of our local MQTT broker, but we could also use the "iot.eclipse.org" public broker, as explained above. Moreover, we include an example message to make the JSON format clear.

6 Note that this script will also be used for publishing to other topics such as "Device2" or "Device3".


Figure 33. MQTT publish message scheme.
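The publish script sketched in figure 33 can be approximated as follows. The broker address, topic name and message fields are placeholders based on the text, not the exact code of the project:

```python
import json

BROKER_HOST = "192.168.1.10"  # placeholder: IP of our local Mosquitto broker
TOPIC = "Device1"

def build_message(movement, distance_cm):
    """Pack the sensor readings into the JSON format sent to the broker."""
    return json.dumps({"movement": movement, "distance": distance_cm})

def publish_reading(movement, distance_cm):
    """Connect to the broker, publish one reading on TOPIC and disconnect."""
    import paho.mqtt.client as mqtt  # Eclipse Paho client (external dependency)
    client = mqtt.Client()           # paho-mqtt 2.x also takes a CallbackAPIVersion
    client.connect(BROKER_HOST, 1883, 60)
    client.publish(TOPIC, build_message(movement, distance_cm))
    client.disconnect()

# Example payload (actually publishing requires a reachable broker):
print(build_message(1, 57.3))  # {"movement": 1, "distance": 57.3}
```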

Second, we need to implement the script that subscribes to the broker; in our case, we subscribe to the topic called "Device1".

Figure 34. MQTT subscribe topic scheme.
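Similarly, a minimal subscriber in the spirit of figure 34; the on_message callback is written so that it can be exercised without a broker (host and topic are again placeholders, not the project's exact code):

```python
import json

received = []  # readings collected from the topic

def on_message(client, userdata, msg):
    """Paho-style callback: decode the JSON payload published on the topic."""
    received.append(json.loads(msg.payload.decode("utf-8")))

def run(host="192.168.1.10", topic="Device1"):
    """Connect to the broker and process messages forever (needs a broker)."""
    import paho.mqtt.client as mqtt  # Eclipse Paho client (external dependency)
    client = mqtt.Client()
    client.on_message = on_message
    client.connect(host, 1883, 60)
    client.subscribe(topic)
    client.loop_forever()

# The callback itself can be exercised with a stand-in message object:
class FakeMsg:
    topic = "Device1"
    payload = b'{"movement": 1, "distance": 57.3}'

on_message(None, None, FakeMsg())
print(received)  # [{'movement': 1, 'distance': 57.3}]
```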

Once done, we can start to implement the first sensor, the PIR part.


 PIR sensor

As the PIR works with 5 V, we connect VCC to pin 2 of the Raspberry Pi header for the voltage supply, GND to pin 6 and the output to pin 7. Since this pin provides 3.3 V when movement is detected and 0 V otherwise, we can use the GPIO library to build the algorithm.

The main function of this sensor is to detect whether there is someone in the room or site; we use it to build the following specific functions:

1. Sensor calibration: we need to calibrate the sensor to ensure its correct functioning. The datasheet recommends waiting one minute after powering up the sensor before using it, the time it is expected to take to reach its standby state. However, after testing its operation for more than 2 months, always using this function, we reduced this time to 20 seconds, since it has always worked correctly. The function returns 1 if the sensor is calibrated and 0 if not.

Figure 35. Calibrated algorithm scheme of PIR sensor.

2. Movement detection: detects whether any person enters an indoor or outdoor location; it returns 1 if movement is detected and 0 if not.


Figure 36. Detect movement algorithm of PIR sensor.

3. Object counting: counts every movement detected. The function is launched with a time parameter, which is the duration of the counting; it returns the value of the counter.

Figure 37. Counter algorithm of PIR sensor.

4. Switch: this function acts as a switch, returning 1 when movement is detected and 0 when not.

Figure 38. Switch algorithm of PIR sensor.
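The PIR functions above can be sketched as follows. To keep the code testable without hardware, the GPIO read is passed in as a callable; on the real device it would wrap RPi.GPIO's input() on the PIR pin. The 20-second calibration time comes from the text; everything else (names, sampling step) is our assumption:

```python
import time

CALIBRATION_SECONDS = 20  # reduced from the datasheet's 60 s, as in the text

def calibrate(sleep=time.sleep):
    """Wait for the PIR to settle; return 1 once calibrated."""
    sleep(CALIBRATION_SECONDS)
    return 1

def detect_movement(read_pin):
    """Return 1 if the PIR output pin is high (movement detected), else 0."""
    return 1 if read_pin() else 0

def count_movements(read_pin, duration, sleep=time.sleep, step=0.1):
    """Count rising edges of the PIR output during `duration` seconds."""
    count, last, elapsed = 0, 0, 0.0
    while elapsed < duration:
        level = read_pin()
        if level and not last:  # rising edge = one new detection
            count += 1
        last = level
        sleep(step)
        elapsed += step
    return count

# Dry run with a fake pin that fires twice:
samples = iter([0, 1, 1, 0, 1, 0])
fake_pin = lambda: next(samples, 0)
print(count_movements(fake_pin, 0.6, sleep=lambda s: None))  # 2
```

The switch function of point 4 is behaviourally the same as detect_movement, so it is not repeated here.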

The second sensor of device 1 measures the distance.

 Distance sensor

As the distance sensor works with 5 V, we connect VCC to pin 4 of the Raspberry Pi header for the voltage supply, GND to pin 8, the trigger to pin 11 and the echo signal, through the voltage divider described before, to pin 13. In this way we can use the HC-SR04 sensor library [27] implemented for the Raspberry Pi, together with the GPIO library, to build a complete algorithm.

The main function of this sensor is to measure the distance to someone in the room or site; we use it to build the following specific functions:

1. Distance calibration: we need to calibrate the sensor to ensure its correct functioning. The datasheet recommends taking 5 distance readings before starting to measure, i.e. between powering up the sensor and using it. Since the library takes into account parameters such as the ambient temperature and the unit of measurement, eleven readings are recommended when high precision is wanted. However, after testing its operation for more than 2 months, always using this function, we kept the number of readings recommended in the datasheet: obtaining the five readings takes only a few seconds, and the uncertainty has always been of a very few centimetres. The function returns 1 if the sensor is calibrated and 0 if not.

Figure 39. Calibration algorithm scheme of the HC-SR04 sensor.


2. Distance measurement: measures the distance, saves it and returns the distance variable.

Figure 40. Distance algorithm scheme of the HC-SR04 sensor.

3. HCSR04 configuration: this function measures the distance like the previous one, but it receives two input parameters, a time condition and a distance condition. A measurement is accepted only when both conditions hold: it was obtained within the specified time, and it is smaller than the distance condition.

Figure 41. Configuration distance algorithm scheme of the HC-SR04 sensor.
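The calibration and averaging steps above can be sketched as follows. Here read_cm is a stand-in for one raw HC-SR04 measurement (on the device it would come from the hcsr04sensor library); warm-up and tolerance values follow the description above, and the return codes match the 1/0 convention used by the other functions.

```python
def calibrate(read_cm, warmup=5, tolerance_cm=2.0):
    # Discard the first readings taken after power-up (the datasheet's
    # five-reading warm-up), then report the sensor as calibrated (1)
    # when a short burst of readings agrees to within a few centimetres.
    for _ in range(warmup):
        read_cm()                       # throwaway settling readings
    burst = [read_cm() for _ in range(5)]
    return 1 if max(burst) - min(burst) <= tolerance_cm else 0

def measure_cm(read_cm, samples=11):
    # Average several raw readings (eleven, per the high-precision
    # recommendation above) to smooth single-shot noise.
    return sum(read_cm() for _ in range(samples)) / samples
```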

Once all the functions are defined, we can build an algorithm that lets us implement every rule or behaviour we want to carry out. We will describe the algorithms using the function names given above. Note first that the JSON messages the functions return always carry the same variables, but only the ones designated in each call are updated.
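That fixed-schema convention can be sketched like this. The exact key names are our assumption (the thesis fixes one shared schema but does not list the keys); every publication carries the full set of variables, and a call overwrites only the fields it is responsible for.

```python
import json

# Hypothetical field set standing in for the thesis' shared JSON schema.
BASE_MESSAGE = {
    "device": None,         # "device1", "device2" or "device3"
    "functionality": None,  # which function produced/requested the message
    "movement": 0,
    "distance": None,       # centimetres
    "count": 0,
}

def build_message(device, functionality, **updates):
    # Start from the full schema and overwrite only the designated fields,
    # so every message carries the same variables.
    msg = dict(BASE_MESSAGE, device=device, functionality=functionality)
    msg.update(updates)
    return json.dumps(msg)
```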

Specifically, we will implement:

1. Distance detection: this function returns the distance measured to the physical element detected; in other words, when the PIR sensor detects movement, its output goes high and the HC-SR04 sensor measures the distance.

Figure 42. Algorithm of detect distance when movement is detected.

2. Distance condition detection: this function returns the distance measured to the detected element, subject to the condition that a given distance is not exceeded: when the PIR sensor detects movement, its output goes high and the HC-SR04 measures the distance; if the measured distance is longer than the condition, the condition is considered violated.

Figure 43. Algorithm for not exceeding the distance condition when movement is detected.

3. Time condition detection: this function returns the distance measured to the detected element, with the condition that the detection must happen within a given time window: when the PIR sensor detects movement, its output goes high and the HC-SR04 measures the distance, but the measurement is only considered valid, and the condition not violated, if it occurs within the specified time.

Figure 44. Algorithm of not exceed the time condition when movement is detected.

4. Movement counting: this function returns the number of detections of physical elements over a specified time.

Figure 45. Counter algorithm when movement is detected.

5. Distance detection based on distance, time and count conditions: this function returns the distance measured to the detected element, combining the distance, time and count conditions. When the PIR sensor detects movement, its output goes high and the HC-SR04 measures the distance, which is compared against the distance condition, checked against the time window, and counted against the count condition.

Figure 46. Algorithm of distance detection when all conditions are accomplished.
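The combined rule can be condensed into one testable function. Here the PIR+HC-SR04 pipeline is abstracted (our assumption, for illustration) as a list of (seconds_since_start, measured_distance_cm) pairs, one per movement detection; following the description above, a detection breaks the distance condition when it is longer than the threshold, only detections inside the time window count, and the overall rule is violated once such detections exceed the count condition.

```python
def guarded_detection(events, max_distance_cm, window_s, max_count):
    # events: list of (t_seconds, distance_cm) pairs, one per PIR trigger.
    # A hit is a detection inside the time window whose distance exceeds
    # the distance condition; the rule is violated when the number of hits
    # exceeds the count condition.
    hits = [d for t, d in events if t <= window_s and d > max_distance_cm]
    return len(hits) > max_count, hits
```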

3.3.1.4. Device 2

The second model of the device includes the camera module, which adds to our product the ability to capture images and video.

Keeping in mind this functionality and the PiCamera library [44] implemented for the Raspberry Pi, we use this module to build the following specific functions:

1. Take a picture: takes a photo at 1024 x 768 resolution when launched and saves it on the Raspberry Pi.

Figure 47. Take-a-picture scheme of the camera module.

2. Take a number of pictures: same purpose as the previous function, but it takes a number of photos given as a parameter.

Figure 48. Take number specific of picture scheme camera module.

3. Take a video: records a video in h264 format when launched and saves it on the Raspberry Pi. The default duration is 10 seconds.

Figure 49. Video scheme camera module.

4. Take a timed video: same purpose as the previous function, but the video duration is given as a parameter.

Figure 50. Video scheme camera module with choosing a time.

5. Camera transparency: changes the transparency of the video, useful for example when using the camera module at night.


Figure 51. Video scheme camera module with transparency.
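On the device, the actual captures go through the PiCamera API; as a testable stand-in, the sketch below only prepares the parameters the five functions take: burst filenames (the naming scheme is our invention), the 10-second h264 default, the 1024 x 768 resolution, and a transparency value in the 0-255 range PiCamera uses for preview alpha.

```python
from datetime import datetime

def photo_filenames(n, prefix="capture"):
    # Timestamped .jpg names for a burst of n photos; the naming scheme
    # is illustrative, not the thesis' actual one.
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return ["%s_%s_%03d.jpg" % (prefix, stamp, i) for i in range(1, n + 1)]

def video_settings(seconds=10, alpha=255):
    # Default 10-second h264 clip at the resolution used for stills;
    # alpha is the preview transparency (0 = invisible, 255 = opaque).
    if not 0 <= alpha <= 255:
        raise ValueError("alpha must be in 0..255")
    if seconds <= 0:
        raise ValueError("duration must be positive")
    return {"format": "h264", "seconds": seconds,
            "resolution": (1024, 768), "alpha": alpha}
```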

With all new modules defined, we can again build an algorithm implementing the rules or behaviours we want. We basically reuse the functions described for device 1, with one particularity: whenever a distance is detected (the return variable used by every module), we also trigger the camera, for example taking a photo to document the object detected.

More specifically, we use the "detect distance", "detect distance condition", "detect distance time conditions" and "detect distance with all conditions" functions, with the following addition: when the function's condition is met, we take a photo, a video, a capture, etc.

The first function detects distance and adds the camera step shown in the next scheme. If we want to take a video, or use any other camera-module function described above, we proceed as depicted below.

Figure 52. Algorithm of detect distance when movement is detected and take a photo.

3.3.1.5. Device 3

The third model of the device includes the GPS module, which adds versatility and the ability to visualize tracking. Moreover, this module is intended to support the most intelligent functions of the device, for example establishing intelligent routes between devices.

To use this module, we rely on gpsd and its Python client module [55], which gpsd-aware applications use to encapsulate all communication with gpsd and to handle the NMEA 0183 sentences mentioned before. gpsd makes all location/course/velocity data from the sensors available for querying on TCP port 2947 of the host computer.

This module only provides the location of the device, which is enough for the functionality we want to give it. When there is more than one device, each one communicates with the others, so each one knows the others' positions.

The module is integrated and working, but it needs considerably more work: after setting it up and verifying its operation, we still need to represent its location visually, for example on Google Maps.

First of all, we set up the Raspberry Pi to read the latitude and longitude from the GY-NEO6MV2 GPS module.

We need to unblock the serial port ttyAMA07, since it is configured from the factory as an input/output console. Therefore, we open the /boot/cmdline.txt and /etc/inittab files and change them to free the port.

Once configured, we install the gpsd library and are then ready to try the script responsible for obtaining the latitude and longitude. The GPS algorithm writes its output to a txt file whose data can then be used, as depicted in figure 53.

Figure 53. Algorithm of GPS module.

7 This is the name used in Linux to refer to the UART (Universal Asynchronous Receiver-Transmitter) serial port.
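In the project, gpsd performs the sentence parsing for us; purely as an illustration of the NMEA 0183 format the module emits on the serial port, a GGA position fix can be decoded like this (coordinates arrive as degrees and decimal minutes, ddmm.mmmm / dddmm.mmmm):

```python
def parse_gpgga(sentence):
    # Extract (latitude, longitude) in decimal degrees from an NMEA 0183
    # GGA sentence. Fields 2-5 are latitude, N/S, longitude, E/W.
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0   # ddmm.mmmm
    if fields[3] == "S":
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0   # dddmm.mmmm
    if fields[5] == "W":
        lon = -lon
    return lat, lon
```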

3.4. Use Cases

Since we now know all the functionalities the different devices offer, we can deduce the following use cases.

3.4.1. Device 1

This device covers the basic functionality needed to offer security. Whenever we want to watch over an object, or someone such as a baby, for a certain time, we can use this device.

We can protect an object by placing the device in the same room, for example on the wall; if we also want the distance to a possible intruder, we place the device in front of the object. Moreover, we can count the people who enter a site: for example, if we have a baby and want to ensure that nobody, or only the parents, enters the room at night, this module can guarantee it.

Another everyday case: if you wash the floor of a room and need nobody to walk on it, you can use the device to ensure that nobody passes through for the next 20 minutes and ruins your work.

With these examples we want to show that the device fits many everyday cases and that its possible uses are scalable.

3.4.2. Device 2

With the camera module, the device offers advanced functionality and can be used in more scenarios. It keeps the uses described above but adds the capacity to take photos and videos, and goes beyond them: if we place the device at the entrance of a house or a shop, we can identify the people responsible for a robbery or for violating the condition we have chosen. The main functionality added by this module is the possibility of watching the photos or videos in real time when the condition is met. The module also broadens the device's market, since it is useful not only in everyday cases but also in a store, an industrial unit, etc. In the future, with recognition algorithms, we will be able to offer employee control (behaviour monitoring and check-in/check-out access control). Also, if devices are installed in several shops, they can count the people who enter, enabling analytics about the target customers of the business and business rules.


3.4.3. Device 3

The GPS module only adds location data to the device, yet it enables many interesting functionalities, although taking full advantage of it requires more than one device. Its uses in an industrial or business setting are worth a look. For example, with 3 or 4 devices installed in an industrial unit, we can monitor the employees' work, optimize it, and create work orders when a problem appears. We can also support plant maintenance, since the device can collect data on machine faults (if an employee's machine breaks down, the device helps create a work order to repair it). Another use case is establishing rules about a robber's possible escape route when a burglary occurs: thanks to the GPS, the modules communicate with each other, establish the possible escape routes and prepare to guard the plant, the store, etc. If such a system operated in the street, the devices could communicate and it would be possible to follow the robber.

3.5. Specification

Having tested everything, the specifications of the device are the following:

LCESS DEVICE SPECIFICATION

Battery
  Capacity: nominal 2000 mAh, minimum 1900 mAh
  Nominal voltage: 3.7 V
  Voltage at end of discharge: 3.0 V
  Charging voltage: 4.2 V

PIR sensor
  Delay time: 0.2 to 0.5 min
  Sensing range: 120 degrees, up to 7 meters
  Operating temperature: -15 ºC to 70 ºC

HC-SR04 sensor
  Working frequency: 40 kHz
  Sensing range: 12 degrees, 2 centimeters to 4 meters
  Operating temperature: 0 ºC to 50 ºC

GPS sensor
  Working frequency: 0.25 Hz to 1 kHz
  Sensing range: 50,000 meters
  Operating temperature: -40 ºC to 85 ºC

Table 4. LCESS system specifications.

4. Frontend

We need the user to understand the device rapidly and use it easily. For this reason, we build a mobile app.

4.1. APP -Android

To develop the app we use Android Studio, the official IDE (Integrated Development Environment) for the Android platform. Using the app requires a phone with an OS version above 4.0. We use the following repository and dependency versions:

android {
    compileSdkVersion 23
    buildToolsVersion "23.0.3"

    defaultConfig {
        applicationId "LCESS"
        minSdkVersion 16
        targetSdkVersion 23
        versionCode 1
        versionName "1.0"
    }
}

repositories {
    maven { url "https://repo.eclipse.org/content/repositories/paho-snapshots/" }
}

dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    testCompile 'junit:junit:4.12'
    compile 'com.android.support:appcompat-v7:23.4.0'
    compile 'com.android.support:cardview-v7:23.4.0'
    compile 'com.android.support:recyclerview-v7:23.4.0'
    compile 'org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.3-SNAPSHOT'
    compile 'org.eclipse.paho:org.eclipse.paho.android.service:1.0.3-SNAPSHOT'
    compile 'com.android.support:support-v4:23.4.0'
    compile 'com.android.support:design:23.4.0'
}

4.2. Design

To design the app, we divide it into different layouts or screens, which lets us build a functional app to control and manage the LCESS device.


4.2.1. Login Page

We need a page to control who uses our app. This point is important: if this product is someday commercialized, the registration data will give us valuable information, e.g. about the target users of our device, when it is most used, etc.

As depicted below in fig. 54, we can see the login and register8 layout of the MQTT-LCESS app. In this case, only one password is valid9, because this is a PoC (Proof of Concept) and we only need it for software and hardware tests.

Figure 54. Log in page of LCESS app.

Clicking the login button takes us to a start page, where we can see the use instructions and the set-up of the device.

Note that, to give the app more commercial names, Device 1 appears as JsB, Device 2 as JsB-C and Device 3 as JsB-CG.

Figure 55. Start page of LCESS app.

8 We can also register in the app's database. 9 User: admin / Password: admin


If you access the instructions, you can see a short explanation covering "What is the device?", "What are its functions?", etc., as shown in figure 56.

And if you access "Start up", you can see how to configure the device easily and start using it right away.

Figure 56. Introduction & Start up page of LCESS.

Scrolling to the left brings up an important screen listing the different types of device, as depicted in figure 57.

Figure 57. Devices page of the LCESS app.

Accessing any of them opens the usability page, where you can manage and manipulate the corresponding device, e.g. the JsB device.

4.2.2. Options & use

To see how this layout works: as depicted in figure 58, we have the five functionalities explained before in the Engineering & implementation section for each device.


We have three text inputs for the conditions: the distance condition (centimetres), the time (minutes) and the count condition (the number of people we do not want to exceed).

If we click the detect-distance button, the app turns the device on, and it waits for somebody to pass in front of the device or enter the room. When that happens, the app informs us by means of a pop-up notification.

The same happens when we click the other buttons, with the difference that those functions need the user to write their parameters in the text boxes, as depicted in figure 58.

Figure 58. Functions of JsB LCESS

The next type of device offers the same functionalities, plus the ability to see the image taken by the camera module when the functionality is triggered, as depicted in figure 59. Streaming video in the app is a feature we would like to implement in the future.

Figure 59. Functions of JsB-C LCESS.

The device can also send images and videos by email to the user's account, as depicted below.


Figure 60. The image and video received from the device.

The last model, the JsB-CG, includes a clickable text to consult the GPS features, where we can see the location10, etc.

Figure 61. Functions of JsB-CG LCESS.

All notifications have the same structure, as depicted below. They are produced whenever any functionality is triggered; e.g. we start the "detect distance" functionality with different conditions and, if they are met, the device notifies this scenario with its respective data, image, video, etc.

10 This module also needs much more work, as it could benefit substantially from being implemented for different uses, as commented before.

Figure 62. Notification of LCESS app.

As seen in the use cases of devices 2 and 3, two important functionalities cannot be leveraged well on the mobile, but can on the web.

We decided to implement a web page to control the devices like the app does. An important point of the web will be video streaming, since, once implemented, we will only have to pass the link to the app and thereby kill two birds with one stone. The other important point is the GPS: we need to visualize the whole map to implement the use cases, so here the web is more useful than the app. Moreover, if we want to commercialize our product, it is important to have both the app and the web11. Like the app, the web also has a login and register screen, but in this case we can access with any user and password (user names cannot repeat), as we can see below.

Figure 63. Log in screen of LCESS Web.

11 It is worth mentioning that this part was not an objective of the work, but we decided to start it because it is important in IoT technology projects.


Like the app, the web also allows us to enter the parameters or conditions, as we can see in figure 64.

Figure 64. Start device screen of LCESS Web.

When we insert the conditions and turn the device on, we can see that the conditions have been inserted and saved12.

Figure 65. Inserted and saved values of LCESS Web.

The other goal of the web is to implement a dashboard to control all device functionalities and prototype the use cases explained before, to present them to clients in the future.

12 With this information, we can start the analytics explained in the use cases.

5. Results

The results obtained with the developed prototype are discussed and analysed in this section. First, we tested the device on a prototyping shield, soldering the sensors to it so they could start sending values, and verified that all sensors worked as expected. After the hardware was checked, the algorithms were implemented according to the considerations given before.

Figure 66. The different components needed to test the device.

To inspect the JSON messages and test the device, we can watch the different messages in the broker, as depicted below in figure 67, with messages from different devices and under different topics.

In the first part, two messages published with the value 25 check the communication; after that come messages in the same well-known JSON format, under two topics: measurement and device. The topic used officially is "deviceX"13, depending on which device we are using. The other important field is the functionality: when the variable "functionality" is higher than five, the message travels from the device to the app to report whether the functionality has been accomplished, and it is surfaced to the user as a notification, as commented before. Messages with "functionality" smaller than five travel from the app to the device to start its operation.

13 Device 1, device 2 or device 3.
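That routing convention can be captured in a few lines. This is our reading of the description above, not the thesis' actual broker code: values above five flow device to app as result notifications, lower values flow app to device as start commands, and the topic identifies the device.

```python
import json

def route(topic, payload):
    # Decide a message's direction from its "functionality" field:
    # > 5  -> device-to-app result, surfaced to the user as a notification;
    # <= 5 -> app-to-device command that starts a function.
    if not topic.startswith("device"):
        raise ValueError("unexpected topic: %s" % topic)
    msg = json.loads(payload)
    return "notify_user" if msg["functionality"] > 5 else "start_device"
```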

Figure 67. Message controlled for the broker.

Once all algorithms were built and tested, we could build the PCB to give the device more robustness. We need the PCB to hold all the hardware together with its components and solder joints. First of all, we build the schematic, as depicted below in figure 68.

Figure 68. Schematic of sensor hardware.

More than one PCB design iteration is needed: if even a minimal detail is overlooked, e.g. the dimensions of each sensor, then when the hardware is placed on the board it probably will not fit, or one component will collide with another, and the board must be redone. Figure 69 shows the design of the PCB.


Figure 69. PCB of sensors hardware.

Finally, as depicted in the following figures, we put all the parts of the device together: the battery, buck-boost converter and Raspberry Pi, as we can see in figure 70, and then the movement sensor, distance sensor, camera module, GPS module and Wi-Fi module, as we can see in figure 71.

Figure 70. Part I of LCESS hardware.


Figure 71. Part II of LCESS hardware.

Once all algorithms were tested on the complete device, we only lacked a chassis to cover the hardware and protect it from handling. We have a basic design, done with SolidWorks14 [56].

To produce it, we took advantage of the Aula RepRap [28], the university's 3D-printing room, and made a 3D design of the chassis with a CAD (computer-aided design) program.

Figure 72. 3D Solid Work chassis design.

14 This design was carried out without a previous design or marketing study, which would be necessary if we wanted to commercialize the device; in that case, we would engage a designer or run a marketing study to create the chassis.


Next, figure 73 shows the result of printing the 3D design.

Figure 73. Result of the 3D printing.

Finally, the image depicted in figure 74 is the final physical product, the result of putting all the components of the project together.

Figure 74. JsB-CG or Device 3.


6. Budget

The following table lists the components with their approximate costs.

UNIT PRICES

Component                  Unit price
Raspberry Pi Zero          5,00 €
SD Card                    1,99 €
PIR sensor                 1,79 €
Distance sensor            1,69 €
Wi-Fi module               12,50 €
USB adapter                2,95 €
Battery                    12,50 €
Power Boost                14,50 €
Chassis (example)          12,00 €
Camera                     9,00 €
Camera cable               5,95 €
Ublox NEO-6M GPS module    11,00 €

Table 5. Unit prices of the different components of LCESS; the discounted per-device costs for 20- and 100-unit orders are summarized in Table 6.


Once we have the list of prices, we can estimate the price of the device without design and prototyping costs, as depicted below in table 6.

Device 1 Device 2 Device 3

Cost per unit 64,92 € 79,86 € 90,86 €

Cost for 20 units 58,97 € 72,92 € 83,92 €

Cost for 100 units 52,97 € 66,92 € 77,92 €

Table 6. Cost for units.
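As a sanity check on the tables, the component prices should reproduce the per-device unit costs: device 1 carries the base hardware, device 2 adds the camera and its cable, and device 3 adds the GPS module. The exact price-to-component assignment below is our assumption read off Table 5, validated against Table 6 to within a cent of rounding; the last lines check the amortized design cost used in Table 7.

```python
# Assumed unit prices (EUR) per component, read off Table 5.
unit_prices = {
    "raspberry_pi_zero": 5.00, "sd_card": 1.99, "pir_sensor": 1.79,
    "distance_sensor": 1.69, "wifi_module": 12.50, "usb_adapter": 2.95,
    "battery": 12.50, "power_boost": 14.50, "chassis": 12.00,
    "camera": 9.00, "camera_cable": 5.95, "gps_module": 11.00,
}

BASE = ["raspberry_pi_zero", "sd_card", "pir_sensor", "distance_sensor",
        "wifi_module", "usb_adapter", "battery", "power_boost", "chassis"]

device1 = sum(unit_prices[c] for c in BASE)                       # PIR + distance only
device2 = device1 + unit_prices["camera"] + unit_prices["camera_cable"]
device3 = device2 + unit_prices["gps_module"]

# Design/prototyping cost amortized over 1000 units (Table 7):
# 400 h x 10 EUR/h = 4000 EUR, i.e. 4 EUR per unit.
design_per_unit = 400 * 10 / 1000
```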

Continuing the analysis, the exact number of hours dedicated to the project is difficult to know, but it is roughly 400-500 hours. Including the time for design and prototyping, the cost of the device is the following.

Cost per unit (1000 units) 50,47 € 64,42 € 75,42 €

Number of employees 1 1 1

Hours 400 400 400

Salary/ hour 10 € 10 € 10 €

Price 4.000 € 4.000 € 4.000 €

Price / 1000 units 4 € 4 € 4 €

Cost 54,47 € 68,42 € 79,42 €

Table 7. Total cost of each device.

Another important point is that we have used only open-source software to build this project, so there is no software cost in the analysis. We think an economic and financial analysis is not necessary here; it would be needed if we wanted to build a business responsible for launching this device to the market, in which case our company or start-up would offer more products (all related to IoT) to secure a position in the market.


7. Conclusions and future work

We start by recalling the personal objectives for this work. They were:

- To learn new and current concepts on my own, applicable to any project, and to gain experience.

- To deepen my knowledge of IoT technologies and their implementation.

- To learn how to carry out a "big project"; a first contact is always needed to learn what was done well and what was not.

- To learn to develop in other programming languages widely used in this era of Industry 4.0, which we consider essential knowledge for anyone who wants to develop projects in this field.

- To learn to face a problem head-on, in this case this device, and carry it forward.

We believe we have accomplished all the personal objectives: after dedicating so many hours, we have learned a great deal, and we can now be critical enough to know that, if we had to start the project again, we would do some parts differently to guarantee a better result, although such things are learned with experience. Accordingly, we are happy with the personal outcome, since the work has stayed true to the original idea. Nevertheless, as development, deployment and construction progressed, we kept finding new things that could be added to improve the device, and these practically only required time, since they were software features with little or no cost; but time is limited and, as in any project, there are deadlines to meet. These possible additions are commented below. Regarding the objectives of the project, the outcome has been satisfying: all the objectives presented in the project plan and work plan have been achieved, and in one form or another we have solved the problems encountered, including complications that appeared when the problems were hard to resolve and their solution required expert knowledge. Finally, another important outcome of this work, not yet mentioned, is the ideas that appeared while carrying it out, thanks to a passion for technology and everything learned here about IoT: more ideas have come to us about developing IoT-related devices and software, and without this work we would not have the motivation to continue working along this line. For future work, we suggest the following:


 To build an OS especially for this type of device, to optimize resources and make it faster; this would be the best solution. Alternatively, we can remove all software and packages we do not need from the current OS, Raspbian with PIXEL, to optimize it; or, as a last option, take Contiki or TinyOS and adapt it to our device, installing the required software and packages.

 To include Wi-Fi Direct communication between the device and the Android app, also called P2P (peer-to-peer) connections; Android already provides a service [38] to start building this part. However, the Wi-Pi module used in the device requires us to implement the whole interface, or its respective scripts, to enable this communication. If the Wi-Pi turns out not to be compatible with the Wi-Fi Direct protocol, we can use the CF-WU720N Wi-Fi module (also obtained for tests in the project), which is compatible with it.

 Once this PoC is done, knowing the cost of the device and the difficulties of building it, to redesign all parts and optimize dimensions, costs and fabrication, so that the whole device fits on a single two-layer PCB, for example: we can design the power boost, design the SoB, keep the same choice of sensors, and send everything to fabrication together, thus optimizing all the parts.

 To build all the GPS use cases explained in the frontend chapter, both on the web and in the app, to have a showcase and a business case to present to clients.

 To build a database, e.g. MySQL [39], to collect all the data from the device.

 To build analytics from the information the device can collect (people entering, detections, person counts, etc.) and their respective graphics.

 To include video streaming on the web and, on request, also in the app.

 To include machine learning: for example, once streaming is implemented, we can recognize the people who enter different shops or companies, take their data, analyse it and build algorithms for better marketing and business rules, or for the predictive maintenance of industries, as commented in the use cases.

 To include notifications via Twitter when you bind your account, the same as we do with email.

We see that there are many use cases that add functionality and improve the device (adding targets, features and benefits).

We decided to develop these basic features because, as stated, this is a proof of concept, and we think it is enough given that each feature carries much work.


8. Bibliography

[1] K. Ashton, "That 'internet of things' thing: in the real world things matter more than ideas," RFID Journal, 2009. [Online]. Available: http://www.rfidjournal.com/articles/view?4986
[2] L. Atzori, A. Iera, G. Morabito, "The internet of things: a survey," Computer Networks, vol. 54, pp. 2787-2805, 2010.
[3] McKinsey & Company, "Internet of things (IoT)." [Online]. Available: http://www.mckinsey.com/industries/high-tech/our-insights/the-internet-of-things
[4] Recommendation ITU-T Y.2060, "Overview of the internet of things," 06/2012.
[5] K. Rose, S. Eldridge and L. Chapin, "The Internet of Things: An Overview. Understanding the Issues and Challenges of a More Connected World," The Internet Society (ISOC), October 2015.
A. Al-Fuqaha, M. Guizani, M. Mohammadi, M. Aledhari and M. Ayyash, "Internet of Things: A Survey on Enabling Technologies, Protocols, and Applications," IEEE Communication Surveys & Tutorials, vol. 17, no. 4, pp. 2352-2357, 2015.
[6] J. Polastre, R. Szewczyk, D. Culler, "Telos: enabling ultra-low power wireless research," in Proceedings of the Fourth International Symposium on Information Processing in Sensor Networks (IPSN 2005), 25-27 April 2005, Los Angeles, USA, pp. 364-369. doi: 10.1109/IPSN.2005.1440950.
[7] V. C. Gungor, B. Lu, G. P. Hancke, "Opportunities and challenges of Wireless Sensor Networks in Smart Grid," IEEE Transactions on Industrial Electronics, vol. 56, no. 10, pp. 3557-3564, October 2010. doi: 10.1109/TIE.2009.2039455.
[8] R. Faludi, Building Wireless Sensor Networks: with ZigBee, XBee, Arduino, and Processing, 1st ed. Sebastopol, USA: O'Reilly Media, 2010.
[9] Internet Protocol, Version 6 (IPv6) Specification, IETF RFC 2460, December 1998.
[10] IEEE Standard for Information technology. Telecommunications and information exchange between systems. Local and metropolitan area networks. Specific requirements. Part 11: Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications. IEEE Std 802.11-2012.
[11] T. Tarun, P. Viswanathan, S. Suman, "Wireless Sensor Network White Paper," Tetcos Engineering, 2012. [Online]. Available: http://www.tetcos.com/Enhancing_Throughput_of_WSNs.pdf [Accessed: 23 October 2012].
[12] J. P. Wilkinson, "Nonlinear resonant circuit devices," U.S. Patent 3 624 125, July 16, 1990.
[13] M. V. Alvarez Fernández, "Feasibility study and design for Wireless Sensor Networks in a space environment," M.S. thesis, Faculty of Electrical Engineering, Mathematics and Computer Science, Delft University of Technology, Delft, The Netherlands, 2011.
[14] J. O. Williams, "Narrow-band analyzer," Ph.D. dissertation, Department of Electrical Engineering, Harvard University, Cambridge, MA, USA, 1993.
[15] IoT Analytics, "The top 10 IoT applications areas." [Online]. Available: https://iot-analytics.com/top-10-iot-project-application-areas-q3-2016/
[16] Raspberry Pi, "Download NOOBS for Raspberry Pi." [Online]. Available: https://www.raspberrypi.org/downloads/noobs/
[17] Raspberry Pi, "Download Raspbian for Raspberry Pi." [Online]. Available: https://www.raspberrypi.org/downloads/raspbian/
[18] MQTT foundation, "MQTT." [Online]. Available: http://mqtt.org/
[19] GitHub, "Public MQTT brokers." [Online]. Available: https://github.com/mqtt/mqtt.github.io/wiki/public_brokers
[20] GitHub, "MQTT Servers/Brokers." [Online]. Available: https://github.com/mqtt/mqtt.github.io/wiki/servers
[21] Eclipse Mosquitto, "An Open Source MQTT v3.1/v3.1.1 Broker." [Online]. Available: https://mosquitto.org/
[22] Wikipedia, "Platform as a service." [Online]. Available: https://en.wikipedia.org/wiki/Platform_as_a_service
[23] "HC-SR501 PIR motion detector," datasheet. [Online]. Available: https://www.mpja.com/download/31227sc.pdf
[24] ELECFreaks, "Ultrasonic Ranging Module," datasheet. [Online]. Available: http://www.micropik.com/PDF/HCSR04.pdf
[25] Wikipedia, "NMEA 0183." [Online]. Available: https://en.wikipedia.org/wiki/NMEA_0183


[26] SHENZHEN PKCELL BATTERY CO., LTD., "Li-Polymer Battery Technology Specification," June 3, 2014 [Online]. Available: https://cdn-shop.adafruit.com/product-files/1570/1570datasheet.pdf
[27] Python Package Index, "hcsr04sensor 1.2.0" [Online]. Available: https://pypi.python.org/pypi/hcsr04sensor/1.2.0
[28] Escola Tècnica Superior d'Enginyeria Industrial de Barcelona (ETSEIB), "Aula Reprap" [Online]. Available: https://etseib.upc.edu/ca/serveis/aula-reprap
[29] Adafruit, "Raspberry Pi Zero - Version 1.3" [Online]. Available: https://www.adafruit.com/products/2885
[30] Adafruit, "USB Micro to USB" [Online]. Available: https://www.adafruit.com/products/2910
[31] Adafruit, "Wi-Fi module" [Online]. Available: https://www.adafruit.com/products/2638
[32] Adafruit, "Lithium Ion Battery - 3.7V 2000mAh" [Online]. Available: https://www.adafruit.com/products/2011
[33] Adafruit, "Raspberry Pi Zero v1.3 Camera Cable" [Online]. Available: https://www.adafruit.com/products/3157
[34] Adafruit, "PowerBoost 500 Charger" [Online]. Available: https://www.adafruit.com/products/1944
[35] Amazon, "Ublox NEO-6M GPS Module" [Online]. Available: https://www.amazon.com/niceEshop-NEO-6M-Module-Aircraft-Controller/dp/B00S4RLICU/ref=sr_1_7?s=toys-and-games&ie=UTF8&qid=1482878564&sr=1-7&keywords=NEO-6M+GPS+Module
[36] eBay, "8 GB Kingston Micro SD" [Online]. Available: http://www.ebay.es/itm/like/282253725369?lpid=115&chn=ps
[37] Amazon, "Camera video 1.5MP Module" [Online]. Available: https://www.amazon.es/dp/B01N6EIQTZ/ref=sr_1_1?ie=UTF8&qid=1482876371&sr=8-1&keywords=C%C3%A1mara+5MP+para+Raspberry+PI
[38] Android, "Android developers" [Online]. Available: https://developer.android.com/training/connect-devices-wirelessly/index.html
[39] MySQL, "MySQL" [Online]. Available: https://www.mysql.com/
[40] Clayster, "Products that give physical a secure digital life" [Online]. Available: http://www.clayster.com/
[41] Oracle, "Internet of Things | Oracle" [Online]. Available: https://www.oracle.com/solutions/internet-of-things/
[42] SAP HANA, "What is SAP HANA" [Online]. Available: http://www.sap.com/product/technology-platform/hana.html
[43] Thethings, "IoT platform" [Online]. Available: https://thethings.io/
[44] Picamera, "Picamera 1.12 documentation" [Online]. Available: http://picamera.readthedocs.io/en/release-1.12/
[45] G. Fortino and P. Trunfio, Internet of Things Based on Smart Objects. Italy: Springer, 2014.
[46] S. Mukhopadhyay, Internet of Things - Challenges and Opportunities. New Zealand: Springer, 2014.
[47] P. Waher, Learning Internet of Things with Raspberry Pi. Birmingham: Packt, 2015.
[48] Internet of Things: A Survey on Enabling Technologies, Protocols, and Applications.
[49] A. Al-Fuqaha, M. Guizani, M. Mohammadi and M. Ayyash, "Internet of Things: A Survey on Enabling Technologies, Protocols, and Applications," 2015;17(4):5-12.
[50] D. Thangavel, X. Ma, A. Valera, H. Tan, and C. K. Tan, "Performance evaluation of MQTT and CoAP via a common middleware," in Proc. IEEE 9th Int. Conf. ISSNIP, 2014, pp. 1-6.
[51] W. Colitti, K. Steenhaut, N. De Caro, B. Buta, and V. Dobrota, "Evaluation of constrained application protocol for wireless sensor networks," in Proc. 18th IEEE Workshop LANMAN, 2011, pp. 1-6.
[52] M. Laine and K. Säilä, "Performance evaluation of XMPP on the Web," Aalto Univ. Tech. Rep., Aalto, Finland, 2012.
[53] J. L. Fernandes, I. C. Lopes, J. J. P. C. Rodrigues, and S. Ullah, "Performance evaluation of RESTful web services and AMQP protocol," in Proc. 5th ICUFN, 2013, pp. 810-815.
[54] C. Esposito, S. Russo, and D. Di Crescenzo, "Performance assessment of OMG compliant data distribution middleware," in Proc. IEEE IPDPS, 2008, pp. 1-8.
[55] GPSd, "Put your GPS on the net" [Online]. Available: http://www.catb.org/gpsd/#downloads
[56] SolidWorks, "3D CAD design software" [Online]. Available: http://www.solidworks.es/


Glossary

A list of all acronyms and their meanings.

LCESS - Low Cost Embedded Security System

OS - Operating System

IoT - Internet of Things.

MQTT - Message Queue Telemetry Transport

CoAP - Constrained Application Protocol

QoS - Quality of Service

GPS - Global Positioning System

XMPP - Extensible Messaging and Presence Protocol

AMQP - Advanced Message Queuing Protocol

DDS - Data Distribution Service

HTTP - Hypertext Transfer Protocol

PCB - Printed Circuit Board

PoC - Proof of Concept

Wi-Fi - Wireless Fidelity

JSON - JavaScript Object Notation

TCP - Transmission Control Protocol

PIR - Passive Infrared (sensor)

M2M - Machine to Machine

M2D - Machine to Device

PaaS - Platform as a Service

IO - Input/Output

IBM - International Business Machines Corporation

LP - Lithium Polymer

CSI - Camera Serial Interface

ISP - In-circuit Serial Programming

SD - Secure Digital

NMEA - National Marine Electronics Association

FAT32 - File Allocation Table (32-bit)

GPIO - General Purpose Input/Output

IDE - Integrated Development Environment

Annexes

Annex I. Comparison of different application layer protocols

To decide which application layer protocol the project will use, we first need to study the different possibilities available. We will analyse them in depth and decide which protocol is best for the device. Which of them adapts better? Which offers more facilities? Which is the most used in IoT? Which is open source? Which is the best documented, so that we can rely on it?

We analyse the application layer protocols because the other layers (service discovery, infrastructure and influential protocols) are determined by the device we use. The next figure shows the different layers of the IoT.

Figure 75. Standardization of IoT.

MQTT: Message Queue Telemetry Transport

MQTT utilizes the publish/subscribe pattern to provide transition flexibility and simplicity of implementation, as depicted in Figure 76. Moreover, MQTT is suitable for resource-constrained devices that use unreliable or low-bandwidth links.

Figure 76. Functionality of MQTT: publishers, subscribers and the broker.

MQTT is built on top of the TCP protocol, as depicted in Figure 77. It delivers messages through three levels of QoS (Quality of Service).

A variant, MQTT-SN, is defined specifically for sensor networks; it defines a UDP mapping of MQTT and adds broker support for indexing topic names. The specifications provide three elements: connection semantics, routing, and endpoint. MQTT simply consists of three components: subscriber, publisher, and broker. An interested device registers as a subscriber for specific topics in order to be informed by the broker when publishers publish topics of interest. The publisher acts as a generator of interesting data and transmits the information to the interested entities (subscribers) through the broker.

Figure 77. MQTT Architecture diagram.

Numerous applications utilize MQTT, such as health care, monitoring, energy metering, and Facebook notifications. Therefore, the MQTT protocol represents an ideal messaging protocol for IoT and M2M communications, able to provide routing for small, cheap, low-power and low-memory devices in vulnerable and low-bandwidth networks. Figure 78 shows the message format used by the MQTT protocol. The first two bytes of the message are a fixed header. In this format, the value of the Message Type field indicates one of a variety of messages, including CONNECT, CONNACK, PUBLISH, SUBSCRIBE and so on. The DUP flag indicates that the message is a duplicate and that the receiver may have received it before. The three levels of QoS for delivery assurance of Publish messages are identified by the QoS Level field. The Retain field tells the server to retain the last received Publish message and submit it to new subscribers as a first message.

Figure 78. MQTT message format.
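To make the fixed header concrete, the following sketch (our own illustration, not code from the device) packs and unpacks the first header byte, which carries the Message Type, DUP, QoS Level and Retain fields described above:

```python
# First byte of the MQTT fixed header:
# bits 7-4 = Message Type, bit 3 = DUP, bits 2-1 = QoS Level, bit 0 = Retain.
CONNECT, PUBLISH, SUBSCRIBE = 1, 3, 8  # a few of the message type codes

def pack_fixed_header_byte1(msg_type, dup=0, qos=0, retain=0):
    return (msg_type << 4) | (dup << 3) | (qos << 1) | retain

def unpack_fixed_header_byte1(b):
    return {"type": b >> 4, "dup": (b >> 3) & 1,
            "qos": (b >> 1) & 3, "retain": b & 1}

# A PUBLISH message with QoS 1 and the Retain flag set:
b = pack_fixed_header_byte1(PUBLISH, dup=0, qos=1, retain=1)
print(unpack_fixed_header_byte1(b))
```

This compact header is one of the reasons MQTT suits low-bandwidth links: a complete control packet can start with just these two fixed-header bytes.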


CoAP: Constrained Application Protocol

CoAP defines a web transfer protocol based on REST (REpresentational State Transfer) on top of HTTP functionalities. REST represents a simple way to exchange data between clients and servers over HTTP. REST can be seen as a cacheable connection protocol that relies on a stateless client-server architecture. It is used within mobile and social network applications, and it eliminates ambiguity by using the HTTP get, post, put, and delete methods. REST enables clients and servers to expose and consume web services, like the Simple Object Access Protocol (SOAP), but in an easier way, using Uniform Resource Identifiers (URIs) as nouns and the HTTP get, post, put, and delete methods as verbs. REST does not require XML for message exchanges. Unlike REST, CoAP is bound to UDP (not TCP) by default, which makes it more suitable for IoT applications. Furthermore, CoAP modifies some HTTP functionalities to meet IoT requirements such as low power consumption and operation in the presence of lossy and noisy links. However, since CoAP has been designed based on REST, conversion between these two protocols in REST-CoAP proxies is straightforward. The overall functionality of the CoAP protocol is demonstrated in Figure 79.

Figure 79. Functionality of CoAP.

CoAP aims to enable tiny devices with low power, computation and communication capabilities to utilize RESTful interactions. CoAP can be divided into two sub-layers, namely the messaging sub-layer and the request/response sub-layer. The messaging sub-layer detects duplications and provides reliable communication over the UDP transport layer using exponential backoff, since UDP does not have a built-in error recovery mechanism.

The request/response sub-layer, on the other hand, handles REST communications. CoAP utilizes four types of messages: confirmable, non-confirmable, reset and acknowledgement.

Reliability in CoAP is accomplished by a mix of confirmable and non-confirmable messages. It also employs four modes of response, as illustrated in Figure 80. The separate response mode is used when the server needs to wait a specific time before replying to the client. In CoAP's non-confirmable response mode, the client sends data without waiting for an ACK message, while message IDs are used to detect duplicates.

Figure 80. CoAP Architecture diagram.

The server side responds with a RST message when messages are missed or communication issues occur. CoAP, as in HTTP, utilizes methods such as GET, PUT, POST and DELETE to achieve Create, Retrieve, Update and Delete operations. For example, the GET method can be used by a server to inquire about the client's temperature using the piggybacked response mode. The client sends back the temperature if it exists; otherwise, it replies with a status code indicating that the requested data was not found. CoAP uses a simple and small format to encode messages. The first and fixed part of each message is a four-byte header. Then a token value may appear, whose length ranges from zero to eight bytes. The token value is used for correlating requests and responses. The options and payload are the next optional fields. A typical CoAP message can be between 10 and 20 bytes. The message format of CoAP packets is depicted in Figure 81.

Figure 81. CoAP message format.

The fields in the header are as follows: Ver is the version of CoAP, T is the type of Transaction, OC is Option count, and Code represents the request method (1–10) or response code (40–255). For example, the code for GET, POST, PUT, and DELETE is 1, 2, 3, and 4, respectively. The Transaction ID in the header is a unique identifier for matching the response.
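As an illustration of this layout, the short sketch below packs and unpacks the four header bytes exactly as the text describes them (Ver, T, OC, Code and Transaction ID). Note that this follows the draft-style header given here, not the final RFC 7252 field names:

```python
import struct

# Four-byte CoAP header as described above:
# Ver (2 bits) | T (2 bits) | OC (4 bits) | Code (8 bits) | Transaction ID (16 bits)
GET = 1  # request method code for GET, per the mapping in the text

def pack_coap_header(ver, t, oc, code, transaction_id):
    byte0 = (ver << 6) | (t << 4) | oc
    return struct.pack("!BBH", byte0, code, transaction_id)

def unpack_coap_header(data):
    byte0, code, tid = struct.unpack("!BBH", data[:4])
    return {"ver": byte0 >> 6, "t": (byte0 >> 4) & 3,
            "oc": byte0 & 0xF, "code": code, "tid": tid}

hdr = pack_coap_header(ver=1, t=0, oc=0, code=GET, transaction_id=0x1234)
print(unpack_coap_header(hdr))
```

The Transaction ID packed here is what lets the client match a (possibly separate) response back to its request.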


XMPP: Extensible Messaging and Presence Protocol

XMPP is an IETF instant messaging (IM) standard that is used for multi-party chatting, voice and video calling and telepresence. XMPP was developed by the Jabber open source community to support an open, secure, spam-free and decentralized messaging protocol. XMPP allows users to communicate with each other by sending instant messages on the Internet no matter which operating system they are using. XMPP allows IM applications to achieve authentication, access control, privacy measurement, hop-by-hop and end-to-end encryption, and compatibility with other protocols. Figure 82 illustrates the overall behavior of the XMPP protocol, in which gateways can bridge between foreign messaging networks. Many XMPP features make it a preferred protocol for most IM applications and relevant within the scope of the IoT.

Figure 82. Functionality of XMPP.

It runs over a variety of Internet-based platforms in a decentralized fashion. XMPP is secure and allows the addition of new applications on top of the core protocols. XMPP connects a client to a server using a stream of XML stanzas. An XML stanza represents a piece of code that is divided into three components: message, presence, and iq (info/query) (see Figure 83). Message stanzas identify the source (from) and destination (to) addresses, types, and IDs of XMPP entities that utilize a push method to retrieve data. A message stanza fills the subject and body fields with the message title and contents. The presence stanza shows and notifies customers of status updates as authorized. The iq stanza pairs message senders and receivers. The text-based communication in XMPP, which uses XML, imposes a rather high network overhead. One solution to this problem is compressing XML streams using EXI.


Figure 83. Structure of an XMPP stanza.

Figure 84. XMPP Architecture diagram.
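A message stanza of the kind described above can be assembled with Python's standard XML tools. In this sketch (our own illustration, not code from the device), the JIDs and text are made-up placeholders:

```python
import xml.etree.ElementTree as ET

# Build a minimal XMPP message stanza: source (from), destination (to),
# type and id attributes, plus subject and body children.
msg = ET.Element("message", {
    "from": "device@example.org",   # placeholder JIDs
    "to": "user@example.org",
    "type": "chat",
    "id": "msg1",
})
ET.SubElement(msg, "subject").text = "Alert"
ET.SubElement(msg, "body").text = "Movement detected by the device."

print(ET.tostring(msg))
```

In a real deployment the stanza would travel inside an authenticated XML stream between the client and its XMPP server, which is exactly the XML verbosity the EXI compression mentioned above tries to reduce.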

HTTP/REST: The Hypertext Transfer Protocol.

We do not include HTTP/REST in this comparison because CoAP offers the same functionalities as HTTP/REST while improving on them (HTTP is not well adapted to an IoT world). For this reason we dismiss this protocol and do not explain its theory.

AMQP: Advanced Message Queuing Protocol

AMQP is an open standard application layer protocol for the IoT focusing on message-oriented environments. It supports reliable communication via message delivery guarantee primitives, including at-most-once, at-least-once and exactly-once delivery. AMQP requires a reliable transport protocol, such as TCP, to exchange messages.

By defining a wire-level protocol, AMQP implementations are able to interoperate with each other. Communications are handled by two main components, as depicted in Figure 85: exchanges and message queues. Exchanges are used to route the messages to the appropriate queues. Routing between exchanges and message queues is based on pre-defined rules and conditions. Messages can be stored in message queues and then be sent to the receivers.

Figure 85. Functioning of AMQP.

Beyond this type of point-to-point communication, AMQP also supports the publish/subscribe communications model.

AMQP defines a layer of messaging on top of its transport layer, where messaging capabilities are handled. AMQP defines two types of messages: bare messages, which are supplied by the sender, and annotated messages, which are seen at the receiver. The message format of AMQP is shown in Figure 86. The header in this format conveys the delivery parameters, including durability, priority, time to live, first acquirer, and delivery count.

Figure 86. AMQP message format.

The transport layer provides the required extension points for the messaging layer. In this layer, communications are frame oriented. The structure of AMQP frames is illustrated in Figure 87. The first four bytes indicate the frame size. DOFF (Data Offset) gives the position of the body inside the frame. The Type field indicates the format and purpose of the frame: for example, type code 0x00 shows that the frame is an AMQP frame, while 0x01 represents a SASL frame.


Figure 87. AMQP frame format.
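To make the frame layout concrete, this sketch (our own illustration) parses the frame header fields just described: a four-byte size, the DOFF byte and the Type byte. The trailing two-byte channel field is an assumption based on the AMQP 1.0 specification, since the text does not detail the bytes after Type:

```python
import struct

# AMQP frame header: bytes 0-3 = frame size, byte 4 = DOFF (data offset),
# byte 5 = frame type, bytes 6-7 = channel (type-specific, per AMQP 1.0).
AMQP_FRAME, SASL_FRAME = 0x00, 0x01

def parse_amqp_frame_header(data):
    size, doff, ftype, channel = struct.unpack("!IBBH", data[:8])
    return {"size": size, "doff": doff,
            "type": "AMQP" if ftype == AMQP_FRAME else "SASL",
            "channel": channel}

# A hypothetical 29-byte frame with DOFF 2 on channel 0:
header = struct.pack("!IBBH", 29, 2, AMQP_FRAME, 0)
print(parse_amqp_frame_header(header))
```

Because the size comes first, a receiver can read these eight bytes, learn how long the frame is, and then read exactly that many bytes from the TCP stream.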

DDS: Data Distribution Service

Data Distribution Service (DDS) is a publish-subscribe protocol for real-time M2M communications that has been developed by the Object Management Group (OMG). In contrast to other publish-subscribe application protocols like MQTT or AMQP, DDS relies on a broker-less architecture and uses multicasting to bring excellent Quality of Service (QoS) and high reliability to its applications.

Its broker-less publish-subscribe architecture suits well to the real-time constraints for IoT and M2M communications. DDS supports 23 QoS policies by which a variety of communication criteria like security, urgency, priority, durability, reliability, etc. can be addressed by the developer.

DDS architecture defines two layers: Data-Centric Publish-Subscribe (DCPS) and Data-Local Reconstruction Layer (DLRL). DCPS is responsible for delivering the information to the subscribers. DLRL on the other hand, is an optional layer and serves as the interface to the DCPS functionalities. It facilitates the sharing of distributed data among distributed objects.

Five entities are involved in the flow of data in the DCPS layer: (1) the Publisher, which disseminates data; (2) the Data Writer, which the application uses to interact with the publisher about the values and changes of data of a given type (the association of Data Writer and Publisher indicates that the application is going to publish the specified data in the provided context); (3) the Subscriber, which receives published data and delivers it to the application; (4) the Data Reader, which the Subscriber employs to access the received data; and (5) the Topic, which is identified by a data type and a name. Topics relate Data Writers to Data Readers. Data transmission is allowed within a DDS domain, a virtual environment for connected publishing and subscribing applications.
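As a very loose illustration of these DCPS roles, the toy sketch below wires Data Writers to Data Readers through a Topic without any broker. It is not a real DDS implementation and all class names are our own:

```python
# Toy, in-process sketch of the DCPS roles (not real DDS):
# a Topic relates Data Writers to Data Readers without any central broker.
class Topic(object):
    def __init__(self, name, data_type):
        self.name, self.data_type = name, data_type
        self.readers = []

class DataReader(object):
    def __init__(self, topic):
        self.received = []
        topic.readers.append(self)

class DataWriter(object):
    def __init__(self, topic):
        self.topic = topic
    def write(self, sample):
        assert isinstance(sample, self.topic.data_type)
        for reader in self.topic.readers:   # "multicast" to every reader
            reader.received.append(sample)

temperature = Topic("Temperature", float)
reader = DataReader(temperature)
DataWriter(temperature).write(21.5)
print(reader.received)
```

The point of the sketch is the broker-less shape: the writer delivers samples directly to every reader of its Topic, which is what lets real DDS meet tight real-time constraints.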


Conclusions

Firstly, the theory of these protocols reported in the literature has been compared and assimilated. For example, [50] compares the performance of MQTT and CoAP in terms of end-to-end transmission delay and bandwidth usage. Based on their results, MQTT delivers messages with lower delay than CoAP. In contrast, when the packet loss rate is high, CoAP outperforms MQTT. In the case of small messages and a loss rate under 25%, CoAP outperforms MQTT by generating less extra traffic.

The performance comparison between CoAP and HTTP is investigated for energy consumption and response time in [51]. Due to its condensed header and small packet size, CoAP is more efficient than HTTP in transmission time and energy usage. The authors in [52] present an evaluation of XMPP to verify its applicability to real-time communications on the web. They assessed the performance of XMPP over HTML5 WebSocket, and their results show that XMPP is an efficient option for web applications that require real-time communication. A performance evaluation of AMQP and REST is reported in [53]. To carry out their study, the authors used the average number of messages exchanged between the client and the server in a specific interval to measure performance. Under a high volume of message exchanges, AMQP demonstrated better results than RESTful web services.

An experimental evaluation of two implementations of DDS [54] points out that this protocol scales well when the number of nodes is increased.

To the best of our knowledge, there is no comprehensive evaluation of all these protocols together. However, each of these protocols may perform well in specific scenarios and environments.

So it is not feasible to provide a single prescription for all IoT application protocols.

Figure 88 provides a brief comparison between the common IoT application protocols. The last column in the table indicates the minimum header size required by each protocol.

To sum up, a project may need different types of protocols, because each protocol has particular strengths that make it the better solution for some specific parts.

87

Figure 88. Comparison between the IoT application protocols.

Finally, two protocols, MQTT and CoAP, offer the best solutions because of their advantages, and we choose the MQTT protocol to implement our solution for the following reasons:

 There are more Python libraries for MQTT.
 It is open source.
 There is good documentation on the web to support the development.
 It is a robust protocol.
 Most importantly, for battery life, the message size of this protocol is smaller.
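Since the device publishes and subscribes on topics such as "Device1", it is useful to recall how MQTT topic filters match. The sketch below is our own illustration of the standard `+` and `#` wildcard rules, not code from the device:

```python
# Minimal sketch of MQTT topic-filter matching:
# '+' matches exactly one topic level, '#' matches any number of trailing levels.
def topic_matches(filt, topic):
    f, t = filt.split("/"), topic.split("/")
    for i, part in enumerate(f):
        if part == "#":
            return True                 # matches the rest of the topic
        if i >= len(t):
            return False                # filter is longer than the topic
        if part != "+" and part != t[i]:
            return False
    return len(f) == len(t)

print(topic_matches("home/+/pir", "home/livingroom/pir"))  # True
print(topic_matches("home/#", "home/livingroom/pir"))      # True
print(topic_matches("Device1", "Device2"))                 # False
```

With one filter per device (as in "Device1"), no wildcards are needed; the wildcard rules matter if the system later groups several sensors under a topic hierarchy.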

Annex II. Code

PIR sensor

# import libraries
import RPi.GPIO as GPIO
import time
from time import gmtime, strftime

# configuration of the Raspberry Pi GPIOs
PIR_PIN = 4
GPIO.setmode(GPIO.BCM)
GPIO.setup(PIR_PIN, GPIO.IN)

# functions
def calibrated_pir(timeseg):
    calibrated = 0
    count = 0
    time1 = strftime("%d-%m-%Y %H:%M:%S", gmtime())  # build a string with the time
    # time.time() returns the time in seconds, so it can be compared later
    t1 = time.time() + timeseg
    if GPIO.input(PIR_PIN):
        count = count + 1
    while time.time() <= t1:
        if (count != 0) and (GPIO.input(PIR_PIN) == 0):
            calibrated = 1
    return calibrated

# The sensor needs to be calibrated before it can detect a thing or person.
# After a detection it needs to wait some seconds before becoming active again;
# otherwise it would keep sending 1 all the time even when nothing is there.
def detect_pir():
    detect = 0
    if GPIO.input(PIR_PIN):
        timex = strftime("%d-%m-%Y %H:%M:%S", gmtime())  # build a string with the time
        print timex + " MOVEMENT DETECTED"
        detect = 1
    return detect

def switch_pir():
    if GPIO.input(PIR_PIN):
        switch = 1
    else:
        switch = 0
    return switch

def count_pir():
    count = 0
    while True:
        if detect_pir():
            count = count + 1
    return count

HC-SR04 sensor

from hcsr04sensor import sensor

'''Script using the hcsr04sensor module for Raspberry Pi'''
trig_pin = 17
echo_pin = 27

def calibrated_hcsr04(num):
    # num is the number of readings needed for a stable measurement.
    # The sample median for an accurate reading is 11; for testing we use 5.
    val = 0
    ok = 0
    for i in range(num):
        value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
        raw_measurement = value.raw_distance(5)
        distance = value.distance_metric(raw_measurement)
        if distance > 0:
            val = val + 1
    if val == num:
        ok = 1
    return ok

def configuration_hcsr04(distance, timeseg, countCond):
    configurated = 0
    value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
    raw_measurement = value.raw_distance(5)
    metric_distance = value.distance_metric(raw_measurement)
    t1 = time.time() + float(timeseg * 60)
    if distance <= metric_distance:
        if t1 <= time.time():
            if count_pir() >= countCond:
                configurated = 1
    return configurated

def distance():
    # Create a distance reading with the hcsr04 sensor module
    value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
    raw_measurement = value.raw_distance(5)
    # Calculate the distance in centimeters
    metric_distance = value.distance_metric(raw_measurement)
    print("The Distance = {} centimeters".format(metric_distance))
    return metric_distance

Camera module

from time import sleep
from picamera import PiCamera

def take_picture():
    # Pi camera module. Maximum photo resolution: 2592 x 1944
    camera = PiCamera()
    camera.resolution = (1024, 768)
    camera.start_preview()
    sleep(2)
    camera.capture('/home/pi/Desktop/image.jpg')
    camera.stop_preview()

def taking_number_pictures(number):
    camera = PiCamera()
    camera.resolution = (1024, 768)
    camera.start_preview()
    for i in range(number):
        sleep(2)
        camera.capture('/home/pi/Desktop/images/image%s.jpg' % i)
    camera.stop_preview()

def take_video():
    # Maximum video resolution: 1920 x 1080; the format is .h264
    # Default recording length: 10 seconds
    camera = PiCamera()
    camera.start_preview()
    camera.start_recording('/home/pi/Desktop/videos/video.h264')
    sleep(10)
    camera.stop_recording()
    camera.stop_preview()

# ----------------------------------------------------------------------
# You can alter the transparency of the camera preview
def transparency_camera(alpha):
    camera = PiCamera()
    camera.start_preview(alpha=alpha)
    sleep(10)
    camera.stop_preview()

GPS module

import os
from gps import *
import time
import threading

gpsd = None
os.system('clear')

class GpsPoller(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        global gpsd
        gpsd = gps(mode=WATCH_ENABLE)
        self.current_value = None
        self.running = True

    def run(self):
        global gpsd
        while gpsp.running:
            gpsd.next()  # continue the loop and fetch the data

if __name__ == '__main__':
    gpsp = GpsPoller()  # polling thread
    try:
        gpsp.start()
        while True:
            os.system('clear')  # clear the terminal
            print('Latitude: ' + str(gpsd.fix.latitude))
            print('Longitude: ' + str(gpsd.fix.longitude))
            if gpsd.fix.latitude == 0.0 and gpsd.fix.longitude == 0.0:
                print "Waiting for GPS..."
            else:
                print "GPS OK"
                data = open("locations.txt", "a")
                data.write("%s,%s\n" % (gpsd.fix.latitude, gpsd.fix.longitude))
                data.close()
            time.sleep(5)
    except (KeyboardInterrupt, SystemExit):
        print "\nDisconnecting GPS..."
        gpsp.running = False
        gpsp.join()  # wait for the thread to finish
        print "Ok.\nExit..."

Algorithms of the device 1

########################################################################
# import libraries
import paho.mqtt.client as mqtt
import sys
import RPi.GPIO as GPIO
import time
from time import gmtime, strftime
import json

# configuration of the Raspberry Pi GPIOs
PIR_PIN = 4
GPIO.setmode(GPIO.BCM)
GPIO.setup(PIR_PIN, GPIO.IN)

# functions
def calibrated_pir(timeseg):
    calibrated = 0
    count = 0
    time1 = strftime("%d-%m-%Y %H:%M:%S", gmtime())  # build a string with the time
    # time.time() returns the time in seconds, so it can be compared later
    t1 = time.time() + timeseg
    if GPIO.input(PIR_PIN):
        count = count + 1
    while time.time() <= t1:
        if (count != 0) and (GPIO.input(PIR_PIN) == 0):
            calibrated = 1
    return calibrated

# The sensor needs to be calibrated before it can detect a thing or person.
# After a detection it needs to wait some seconds before becoming active again;
# otherwise it would keep sending 1 all the time even when nothing is there.
def detect_pir():
    detect = 0
    if GPIO.input(PIR_PIN):
        timex = strftime("%d-%m-%Y %H:%M:%S", gmtime())  # build a string with the time
        print timex + " MOVEMENT DETECTED"
        detect = 1
    return detect

def switch_pir():
    if GPIO.input(PIR_PIN):
        switch = 1
    else:
        switch = 0
    return switch

def count_pir():
    count = 0
    while True:
        if detect_pir():
            count = count + 1
    return count

# ----------------------------------------------------------------------
from hcsr04sensor import sensor

'''Script using the hcsr04sensor module for Raspberry Pi'''
trig_pin = 17
echo_pin = 27

def calibrated_hcsr04(num):
    # num is the number of readings needed for a stable measurement.
    # The sample median for an accurate reading is 11; for testing we use 5.
    val = 0
    ok = 0
    for i in range(num):
        value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
        raw_measurement = value.raw_distance(5)
        distance = value.distance_metric(raw_measurement)
        if distance > 0:
            val = val + 1
    if val == num:
        ok = 1
    return ok

def configuration_hcsr04(distance, timeseg, countCond):
    configurated = 0
    value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
    raw_measurement = value.raw_distance(5)
    metric_distance = value.distance_metric(raw_measurement)
    t1 = time.time() + float(timeseg * 60)
    if distance <= metric_distance:
        if t1 <= time.time():
            if count_pir() >= countCond:
                configurated = 1
    return configurated

def distance():
    # Create a distance reading with the hcsr04 sensor module
    value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
    raw_measurement = value.raw_distance(5)
    # Calculate the distance in centimeters
    metric_distance = value.distance_metric(raw_measurement)
    print("The Distance = {} centimeters".format(metric_distance))
    return metric_distance

# ----------------------------------------------------------------------
def json_send(Topic, ok, Functionality, Distance, Time, Count):
    data = {"Topic": Topic, "ok": ok, "Functionality": Functionality,
            "Distance": Distance, "Time": Time, "Count": Count}
    u = json.dumps(data)
    a = json.loads(u)
    return a

def json_receive(a):
    u = json.loads(a)
    return u

# ----------------------------------------------------------------------

# Return the distance when the PIR sensor detects movement.
def detect_distance():
    ok = 0
    mqttTopic = "Device1"
    Functionality = 1
    while detect_pir() == 0:
        inf = json_send(mqttTopic, ok, Functionality, distance(), 0, 0)
    ok = 1
    inf = json_send(mqttTopic, ok, Functionality, distance(), 0, 0)
    return inf

# Return ok and the distance when the PIR sensor detects that the movement
# condition is fulfilled.
def detect_distance_condition(distanceCondition):
    # pir_calibrated(15)  # calibration
    # hcsr04_calibrated(15)
    ok = 0
    count = 0
    mqttTopic = "Device1"
    Functionality = 2
    while distance() >= distanceCondition:
        while (detect_pir() == 0) and (distance() >= distanceCondition):
            # take the maximum range of the sensor into account
            inf = json_send(mqttTopic, ok, Functionality, distance(), 0, count)  # save the distance
        count = count + 1
    ok = 1
    inf = json_send(mqttTopic, ok, Functionality, distance(), 0, count)
    return inf

# TODO: handle with alarms the case where the person has violated the distance.

# Return ok and the distance when the PIR sensor detects that the time
# condition is fulfilled.
def detect_time_condition(timeCond):
    # pir_calibrated(15)  # calibration
    # hcsr04_calibrated(15)
    ok = 0
    mqttTopic = "Device1"
    Functionality = 3
    t1 = time.time() + float(int(timeCond) * 60)
    while (detect_pir() == 0) and (time.time() <= t1):
        # take the maximum range of the sensor into account
        inf = json_send(mqttTopic, ok, Functionality, distance(), timeCond, 0)  # save the distance
    if time.time() <= t1:
        ok = 1
    inf = json_send(mqttTopic, ok, Functionality, distance(), timeCond, 0)
    return inf

# Return the distance if the conditions (time, number of PIR detections) are
# fulfilled.
def detect_distanceIfConditions(distanceCondition, timeCond, countCond):
    # pir_calibrated(15)  # calibration
    # hcsr04_calibrated(15)
    ok = 0
    cond1 = 0
    cond2 = 0
    count = 0
    mqttTopic = "Device1"
    Functionality = 4
    t1 = time.time() + float(int(timeCond) * 60)
    while (time.time() <= t1) and (distance() >= distanceCondition) and (count <= countCond):
        while detect_pir() == 0:
            inf = json_send(mqttTopic, ok, Functionality, distance(), timeCond, count)
        count = count + 1
    if (time.time() <= t1) and (distance() <= distanceCondition):
        cond1 = 1
    if (time.time() <= t1) and (count == countCond):
        cond2 = 1
    if cond1 and cond2:
        ok = 1
    inf = json_send(mqttTopic, ok, Functionality, distanceCondition, timeCond, countCond)
    return inf

# How many persons enter the room (or are detected by this sensor) in a time.
# Return the number of persons/objects detected within the time condition.
def count_movementDetect(timeCond):
    counter = 0
    mqttTopic = "Device1"
    Functionality = 5
    t1 = time.time() + float(int(timeCond) * 60)
    while time.time() <= t1:
        if detect_pir():
            counter = counter + 1
    ok = 1
    print counter
    count = counter / 20
    inf = json_send(mqttTopic, ok, Functionality, 0, timeCond, count)
    return inf

##################################################################### ####################################################################### def mqqtt_subscribe(): # Define Variables MQTT_BROKER = "192.168.1.38" #iot.eclipse.org MQTT_PORT = 1883 MQTT_KEEPALIVE_INTERVAL = 45 MQTT_TOPIC = "Device1"

# Define on_connect event Handler def on_connect(mqttc, userdata, rc): if rc == 0: print("MQTT Connect") else: print("MQTT not Connect, produced wrong with code:"+str(rc)) #Subscribe to a Topic mqttc.subscribe(MQTT_TOPIC, 0)

# Define on_subscribe event Handler def on_subscribe(mqttc, userdata, mid, granted_qos): print ("Subscribed to MQTT Topic with mid: "+ str(mid))

# Define on_message event Handler def on_message(mqttc, userdata, msg): print("Received message: " + str(msg.payload)) print("On Topic: "+ str(msg.topic)) print("With QoS: " + str(msg.qos)) print("------") value = json_receive(msg.payload) if value["Functionality"]==1: value = detect_distance() if value["ok"]: value["Functionality"]=6 a = json.dumps(value) publish_mqtt(a) #print ("Detect object/person in " + str(value["Distance"]) + " cm") #print ("Publish distance: " +str(value["Distance"])) print ('####################### PUBLICATED #######################')

        elif value["Functionality"] == 2:
            value = detect_distance_condition(float(value["Distance"]))
            if value["ok"]:
                value["Functionality"] = 7
                a = json.dumps(value)
                publish_mqtt(a)
                print("Detect object/person at " + str(value["Distance"]) + " cm")
                print("Publish distance: " + str(value["Distance"]))

        elif value["Functionality"] == 3:
            value = detect_time_condition(int(value["Time"]))
            if value["ok"]:
                value["Functionality"] = 8
                a = json.dumps(value)
                publish_mqtt(a)
                print("Detect object/person in " + str(value["Time"]) + " mins")
                print("Publish time: " + str(value["Time"]) + " with topic: " + str(value["Topic"]))

        elif value["Functionality"] == 4:
            value = detect_distanceIfConditions(value["Distance"], value["Time"], value["Count"])
            if value["ok"]:
                value["Functionality"] = 9
                a = json.dumps(value)
                publish_mqtt(a)
                print("Detect object/person in " + str(value["Time"]) + " mins and")
                print("Detect object/person at " + str(value["Distance"]) + " cm")
                print("Publish distance: " + str(value["Time"]) + " and " + str(value["Distance"]) + " with topic: " + str(value["Topic"]))

        elif value["Functionality"] == 5:
            value = count_movementDetect(int(value["Time"]))
            if value["ok"]:
                value["Functionality"] = 10
                a = json.dumps(value)
                publish_mqtt(a)
                print("Detect " + str(value["Count"]) + " objects/persons")
                print("Publish count: " + str(value["Count"]) + " with topic: " + str(value["Topic"]))
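The handler above dispatches on the request codes 1–5 and answers with the codes 6–10, a constant offset of 5. As a minimal sketch of that convention (the table and function names below are ours, not part of the thesis code), the same mapping can be written as a lookup table:

```python
# Mapping used by the on_message dispatcher: each request Functionality
# code is answered with a response code offset by 5.
REQUEST_TO_RESPONSE = {1: 6, 2: 7, 3: 8, 4: 9, 5: 10}

def response_code(functionality):
    """Return the response Functionality code for a request code."""
    return REQUEST_TO_RESPONSE[functionality]
```

Keeping the mapping in one place avoids the per-branch magic numbers when new functionalities are added.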

    # Initiate MQTT client
    mqttc = mqtt.Client(client_id="LCESS", clean_session=False)

    # Register event handlers
    mqttc.on_message = on_message
    mqttc.on_connect = on_connect
    mqttc.on_subscribe = on_subscribe

    # Connect with the MQTT broker
    #mqttc.username_pw_set(username, password)    #optional authentication
    mqttc.connect(MQTT_BROKER, MQTT_PORT, MQTT_KEEPALIVE_INTERVAL)

    # Run the network loop forever
    mqttc.loop_forever()

def publish_mqtt(message):
    # Define variables
    MQTT_HOST = "192.168.1.38"    #iot.eclipse.org
    MQTT_PORT = 1883
    MQTT_KEEPALIVE_INTERVAL = 45
    #MQTT_TOPIC = "device1"
    #MQTT_MSG = 25

    def on_connect(mqttc, userdata, flags, rc):
        #message is a JSON string here, so it cannot be indexed for a topic
        print("Connection returned result: " + mqtt.connack_string(rc))


    # Define on_publish event handler
    def on_publish(mqttc, userdata, mid):
        print "Message published..."
        print("mid: " + str(mid))

    # Initiate MQTT client
    mqttc = mqtt.Client(client_id="LCESS", clean_session=False)

    # Register callback functions
    mqttc.on_publish = on_publish
    mqttc.on_connect = on_connect
    # Connect with the MQTT broker
    #mqttc.username_pw_set(username, password)    #optional authentication
    mqttc.connect(MQTT_HOST, MQTT_PORT, MQTT_KEEPALIVE_INTERVAL)

    # Publish the message to the MQTT broker
    mqttc.publish("Device1", message, 0, True)    #str(message["Topic"])
    mqttc.loop(2)    #let the network loop deliver the message
    # Disconnect from the MQTT broker
    mqttc.disconnect()

mqqtt_subscribe()

Algorithms of device 2

#import libraries
import sys
import RPi.GPIO as GPIO
import time
from time import gmtime, strftime
import json

#configure the Raspberry Pi GPIOs
PIR_PIN = 4
GPIO.setmode(GPIO.BCM)
GPIO.setup(PIR_PIN, GPIO.IN)

#functions
def calibrated_pir(timeseg):
    count = 0
    time1 = strftime("%d-%m-%Y %H:%M:%S", gmtime())    #build a string with the time
    #we need the time as a number so it can be compared later
    calibrated = 0
    t1 = time.time() + timeseg
    if GPIO.input(PIR_PIN):
        count = count + 1
    while time.time() <= t1:
        if (count != 0) and (GPIO.input(PIR_PIN) == 0):
            calibrated = 1
    return calibrated

#You need to calibrate the sensor before it can detect anything or anyone.

#After a detection the PIR needs to settle for a few seconds before it can
#trigger again; otherwise it keeps reporting 1 even when nothing is there.
#time.time() returns the time in seconds.
def detect_pir():
    detect = 0
    if GPIO.input(PIR_PIN):
        timex = strftime("%d-%m-%Y %H:%M:%S", gmtime())    #build a string with the time
        print timex + " MOVEMENT DETECTED"
        detect = 1
    return detect

def switch_pir():
    if GPIO.input(PIR_PIN):
        switch = 1
    else:
        switch = 0
    return switch

def count_pir(count):
    while detect_pir():
        count = count + 1
        print count
    return count

#------------------------------------------------------------------
from hcsr04sensor import sensor

'''Script using the hcsr04sensor module for the Raspberry Pi'''
trig_pin = 17
echo_pin = 27

def calibrated_hcsr04(num):
    #num is the number of readings needed for a stable measurement.
    #The recommended sample size for an accurate reading is 11; for the tests 5 is used.
    val = 0
    ok = 0
    for i in range(num):
        value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
        raw_measurement = value.raw_distance(5)
        distance = value.distance_metric(raw_measurement)
        if distance > 0:
            val = val + 1
    if val == num:
        ok = 1
    return ok

def configuration_hcsr04(distance, timeseg, countCond):
    configurated = 0
    value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
    raw_measurement = value.raw_distance(5)
    metric_distance = value.distance_metric(raw_measurement)
    t1 = time.time() + float(timeseg*60)
    if distance <= metric_distance:
        if time.time() <= t1:
            if count_pir(0) >= countCond:
                configurated = 1
    return configurated

def distance():
    # Create a distance reading with the hcsr04 sensor module
    value = sensor.Measurement(trig_pin, echo_pin, 20, 'metric', 1)
    raw_measurement = value.raw_distance(5)

    # Calculate the distance in centimeters
    metric_distance = value.distance_metric(raw_measurement)
    print("The distance = {} centimeters".format(metric_distance))
    return metric_distance
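The hcsr04sensor library hides the underlying arithmetic behind raw_distance() and distance_metric(). As a rough sketch of the conversion such a sensor requires (the constant and function name below are ours): the HC-SR04 reports the round-trip time of an ultrasonic pulse, so the one-way distance is half the elapsed time multiplied by the speed of sound.

```python
SPEED_OF_SOUND_CM_S = 34300.0  # speed of sound in dry air at about 20 degrees C

def echo_time_to_cm(elapsed_s):
    """Convert an HC-SR04 echo round-trip time (seconds) to centimeters."""
    # The pulse travels to the obstacle and back, hence the division by 2.
    return elapsed_s * SPEED_OF_SOUND_CM_S / 2.0
```

Averaging several such readings, as the library does with its sample size parameter, smooths out the jitter of individual echoes.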

#------

#------------------------------------------------------------------
from time import sleep
from picamera import PiCamera

def take_picture():
    #Pi camera module.
    #maximum resolution: 2592 x 1944 for photos and 1920 x 1080 for video
    camera = PiCamera()
    camera.resolution = (1024, 768)
    camera.start_preview()
    sleep(2)
    camera.capture('/home/pi/Desktop/images/image.jpg')
    camera.stop_preview()

def taking_number_pictures(number):
    camera = PiCamera()
    camera.resolution = (1024, 768)
    camera.start_preview()
    for i in range(number):
        sleep(2)
        camera.capture('/home/pi/Desktop/images/image%d.jpg' % i)
    camera.stop_preview()

def take_video():
    #maximum resolution: 1920 x 1080 video; the format is .h264
    #default duration: 10 seconds
    camera = PiCamera()
    camera.start_preview()
    camera.start_recording('/home/pi/Desktop/videos/video.h264')
    sleep(10)
    camera.stop_recording()
    camera.stop_preview()

#You can alter the transparency of the camera preview
def transparency_camera(alpha):
    camera = PiCamera()
    #alpha = 200
    camera.start_preview(alpha=alpha)
    sleep(10)
    camera.stop_preview()

#------------------------------------------------------------------
def json_send(Topic, ok, Functionality, Distance, Time, Count, Camera=0):
    #Camera defaults to 0 so the callers that omit it still work
    data = {"Topic": Topic,
            "ok": ok,
            "Functionality": Functionality,
            "Distance": Distance,
            "Time": Time,
            "Count": Count,
            "Camera": Camera}
    u = json.dumps(data)
    a = json.loads(u)
    return a

def json_receive(a):
    return json.loads(a)
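The dict built by json_send survives a dumps/loads round trip unchanged, which is what lets the app on the other side of the broker recover the same fields. A small self-contained illustration of the message schema (the field values are examples only):

```python
import json

# The message schema exchanged between the devices and the app,
# using the same field names as json_send.
message = {"Topic": "Device1", "ok": 1, "Functionality": 2,
           "Distance": 150.0, "Time": 0, "Count": 0, "Camera": 0}

wire = json.dumps(message)   # the string that actually travels over MQTT
received = json.loads(wire)  # what json_receive recovers on the other side
assert received == message
```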

#Return the distance when the PIR sensor detects movement.
#camera = 1 take a photo, camera = 2 record a video, camera = 3 take x pictures
def detect_distance():
    ok = 0
    camera = 0
    mqttTopic = "Device1/Detect"
    Functionality = 1
    while detect_pir() == 0:
        inf = json_send(mqttTopic, ok, Functionality, distance(), 0, 0, camera)
    ok = 1
    #in the Python tests set the mode to "camera"/"video"
    camera = "camera"
    inf = json_send(mqttTopic, ok, Functionality, distance(), 0, 0, camera)
    return inf

#Return ok and the distance when the PIR sensor detects that the movement
#condition is fulfilled.
def detect_distance_condition(distanceCondition):
    #pir_calibrated(15)    #calibration
    #hcsr04_calibrated(15)
    ok = 0
    mqttTopic = "Device1"
    Functionality = 2
    while (detect_pir() == 0) and (distance() <= distanceCondition):
        #note: take the sensor's maximum range into account
        inf = json_send(mqttTopic, ok, Functionality, distance(), 0, 0)    #save the distance
    if distance() <= distanceCondition:
        ok = 1
        inf = json_send(mqttTopic, ok, Functionality, distance(), 0, 0)
    inf = json_send(mqttTopic, ok, Functionality, distance(), 0, 0)
    return inf

#TODO: handle the case where the person has violated the distance condition with alarms

#Return ok and the distance when the PIR sensor detects movement within the
#time condition.
def detect_time_condition(timeCond):
    #pir_calibrated(15)    #calibration
    #hcsr04_calibrated(15)
    ok = 0
    mqttTopic = "Device1"
    Functionality = 3
    t1 = time.time() + float(int(timeCond)*60)
    while (detect_pir() == 0) and (time.time() <= t1):
        #note: take the sensor's maximum range into account
        inf = json_send(mqttTopic, ok, Functionality, distance(), timeCond, 0)    #save the distance
    if time.time() <= t1:
        ok = 1
        inf = json_send(mqttTopic, ok, Functionality, distance(), timeCond, 0)
    inf = json_send(mqttTopic, ok, Functionality, distance(), timeCond, 0)
    return inf

#Return the distance if the conditions (time limit, distance threshold and
#number of PIR detections) are fulfilled.
def detect_distanceIfConditions(distanceCondition, timeCond, countCond):
    #pir_calibrated(15)    #calibration
    #hcsr04_calibrated(15)
    ok = 0
    cond1 = 0
    count = 0
    distancia = 0
    mqttTopic = "Device1"
    Functionality = 4
    t1 = time.time() + float(int(timeCond)*60)
    inf = json_send(mqttTopic, ok, Functionality, distance(), timeCond, 0)
    while time.time() <= t1:
        if distance() <= int(distanceCondition):
            cond1 = cond1 + 1
            distancia = distance()
            print cond1
        while detect_pir():
            #note: take the sensor's maximum range into account
            count = count_pir(count)
            print count
    print cond1
    print count
    print countCond
    if (cond1 > 0) and (count >= countCond):
        ok = 1
    inf = json_send(mqttTopic, ok, Functionality, distancia, timeCond, 0)
    print inf
    return inf

#Count how many persons enter the room (or how often the sensor detects
#movement) during the time condition; return the number of detections.
def count_movementDetect(timeCond):
    ok = 0
    counter = 0
    mqttTopic = "Device1"
    Functionality = 5
    t1 = time.time() + float(int(timeCond)*60)
    while time.time() <= t1:
        if detect_pir():
            counter = counter + 1
            ok = 1
            print counter
    count = counter / 20    #the raw trigger count is scaled down by 20
    inf = json_send(mqttTopic, ok, Functionality, 0, timeCond, count)
    return inf

#------------------------------------------------------------------
import paho.mqtt.client as mqtt

def publish_mqtt(message):
    # Define variables
    MQTT_HOST = "192.168.1.35"    #iot.eclipse.org
    MQTT_PORT = 1883
    MQTT_KEEPALIVE_INTERVAL = 45
    #MQTT_TOPIC = "device1"
    #MQTT_MSG = 25

    def on_connect(mqttc, userdata, flags, rc):
        #message is a JSON string here, so it cannot be indexed for a topic
        print("Connection returned result: " + mqtt.connack_string(rc))

    # Define on_publish event handler
    def on_publish(mqttc, userdata, mid):
        print "Message published..."
        print("mid: " + str(mid))
        mqttc.disconnect()

    # Initiate MQTT client
    mqttc = mqtt.Client(client_id="LCESS", clean_session=False)

    # Register callback functions
    mqttc.on_publish = on_publish
    mqttc.on_connect = on_connect
    # Connect with the MQTT broker
    #mqttc.username_pw_set(username, password)    #optional authentication
    mqttc.connect(MQTT_HOST, MQTT_PORT, MQTT_KEEPALIVE_INTERVAL)

    # Publish the message to the MQTT broker
    mqttc.publish("Device2", message, 0, True)    #publish image
    mqttc.loop(2)    #let the network loop deliver the message

    # Disconnect from the MQTT broker
    mqttc.disconnect()

#------

########################################################################

#------FUNCTIONALITIES------


###### EMAIL

# Import smtplib for the actual sending function
import smtplib

# Here are the email package modules we'll need
from email.mime.image import MIMEImage
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.base import MIMEBase
from email import Encoders

def send_email(Email, opcion):
    strFrom = '[email protected]'
    strTo = "[email protected]"    #put the user's email here
    Subject = 'Advise of your LCESS Security system'
    body = 'Your device detected an object or person; you can see the attached image or video'

    # Create the container (outer) email message.
    msg = MIMEMultipart()
    msg.attach(MIMEText(body))
    msg['Subject'] = Subject
    msg['From'] = strFrom
    msg['To'] = strTo

    # Open the files in binary mode and let the MIME classes handle the
    # content type.
    if opcion == "camera":
        fp = open("/home/pi/Desktop/images/image.jpg", 'rb')
        img = MIMEImage(fp.read())
        fp.close()
        msg.attach(img)
    elif opcion == "video":
        part = MIMEBase('application', "octet-stream")
        fp = open("/home/pi/Desktop/videos/video.h264", 'rb')
        part.set_payload(fp.read())
        Encoders.encode_base64(part)
        fp.close()
        msg.attach(part)

    # Send the email via Gmail's SMTP server.
    s = smtplib.SMTP('smtp.gmail.com', 587)
    s.ehlo()
    s.starttls()
    s.ehlo()
    s.login(strFrom, 'toshibaE1vision')
    s.sendmail(strFrom, strTo, msg.as_string())
    s.quit()
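The assembled message can be checked without contacting an SMTP server by inspecting its serialized form, which is useful when testing send_email offline. A minimal sketch using the same standard-library classes (the addresses are dummies):

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# Build the same kind of container send_email uses, then inspect it
# offline instead of logging in to smtp.gmail.com.
msg = MIMEMultipart()
msg.attach(MIMEText('Your device detected an object or person'))
msg['Subject'] = 'Advise of your LCESS Security system'
msg['From'] = '[email protected]'
msg['To'] = '[email protected]'

raw = msg.as_string()
assert 'Subject: Advise of your LCESS Security system' in raw
```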

#####################
def send_mqttImage():
    f = open("/home/pi/Desktop/images/image.jpg", "rb")
    filecontent = f.read()
    byteArray = bytearray(filecontent)
    publish_mqtt(byteArray)

#######################################################################
def main(argv):
    while True:
        function = raw_input('Configure your Security System:\n'
                             'What do you prefer?\n\n'
                             'detect_distance()\n'
                             'detect_distance_condition(distanceCondition)\n'
                             'detect_time_condition(TimeCondition)\n'
                             'count_movementDetect(timehour)\n'
                             'detect_distanceIfConditions(distanceCondition,timeseg,count)\n\n')
        if function == 'detect_distance()':
            value = detect_distance()
            if value["ok"]:
                value["Functionality"] = 6
                a = json.dumps(value)
                publish_mqtt(a)
                if value["Camera"] == "camera":    #image
                    take_picture()
                    send_email('[email protected]', value["Camera"])
                    #send_mqttImage()
                    print "image received"
                elif value["Camera"] == "video":    #video
                    take_video()
                    send_email('[email protected]', value["Camera"])
                #try to send the picture with MQTT
                #f = open("/home/pi/Desktop/images/image.jpg", "rb")
                #filecontent = f.read()
                #byteArr = bytearray(filecontent)
                #publish_mqtt(0, byteArr)
                print a
                print ("Detect object/person at " + str(value["Distance"]) + " cm")
                print ("Publish distance: " + str(value["Distance"]) + " with topic: " + str(value["Topic"]))
        elif function == 'detect_distance_condition()':
            arg = raw_input('Distance Condition = ')
            value = detect_distance_condition(float(arg))
            if value["ok"]:
                value["Functionality"] = 7
                a = json.dumps(value)
                publish_mqtt(a)
                print a
                print ("Detect object/person at " + str(value["Distance"]) + " cm")
                print ("Publish distance: " + str(value["Distance"]) + " with topic: " + str(value["Topic"]))
        elif function == 'detect_time_condition()':
            arg = raw_input('Time Condition = ')
            value = detect_time_condition(int(arg))
            if value["ok"]:
                value["Functionality"] = 8
                a = json.dumps(value)
                publish_mqtt(a)
                print a
                print ("Detect object/person in " + str(value["Time"]) + " mins")
                print ("Publish distance: " + str(value["Distance"]) + " with topic: " + str(value["Topic"]))
        elif function == 'detect_distanceIfConditions()':
            arg = raw_input('Distance Condition = ')
            arg1 = raw_input('\nTime Condition = ')
            arg2 = raw_input('\nCount Condition = ')
            value = detect_distanceIfConditions(float(arg), arg1, int(arg2))
            if value["ok"]:
                value["Functionality"] = 9
                a = json.dumps(value)
                publish_mqtt(a)
                print ("Detect object/person at " + str(value["Distance"]) + " cm")
                print ("Detect object/person in " + str(value["Time"]) + " mins")
                print ("Publish distance: " + str(value["Distance"]) + " with topic: " + str(value["Topic"]))
        elif function == 'count_movementDetect()':
            arg = raw_input('Time Condition = ')
            value = count_movementDetect(int(arg))
            if value["ok"]:
                value["Functionality"] = 10
                a = json.dumps(value)
                distancia = value["Distance"]
                publish_mqtt(a)
                print ("Detect " + str(value["Count"]) + " objects or persons")
                print ("Publish distance: " + str(distancia) + " with topic: " + str(value["Topic"]))

if __name__ == "__main__":
    main(sys.argv[1:])

Since the full code of the app and of the web has not been included, we have also decided not to include the code of device 3: it is the same as that of device 2, but adds the GPS module code, which sends the longitude and latitude in a message to the app.
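The device-3 extension described above would add longitude and latitude fields to the same JSON message format used by device 2. A hypothetical sketch (the function and field names are ours; the actual device-3 code is not reproduced in the thesis):

```python
import json

def gps_json_send(Topic, ok, Functionality, longitude, latitude):
    """Build a device-3 message: the device-2 schema plus GPS fields."""
    data = {"Topic": Topic, "ok": ok, "Functionality": Functionality,
            "Longitude": longitude, "Latitude": latitude}
    # Round-trip through JSON, like json_send does for device 2
    return json.loads(json.dumps(data))

# Example with the coordinates of Barcelona
msg = gps_json_send("Device3", 1, 1, 2.1734, 41.3851)
```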

Android APP

LAYOUTS

[The layout XML listings of the Android app are not reproduced here: only scattered attribute fragments (margins, hints, text sizes) of the original files survived the document conversion.]