International Research Journal of Engineering and Technology (IRJET) | e-ISSN: 2395-0056 | p-ISSN: 2395-0072 | Volume: 07 Issue: 09 | Sep 2020 | www.irjet.net

Integration of ‘Google Assistant’ and ‘AdafruitIO’ via ‘IFTTT’ to Implement an IoT based Wireless Multi-Sensor Network Surveillance and Multi-Appliance Control System

Ms. Divya Agarwal1, Mr. Santosh Kumar Srivastava2
1M.Tech. Scholar, Department of Computer Science Engineering, B.R.C.M.C.E.T., Bahal, Haryana, India
2Assistant Professor, Department of Computer Science Engineering, B.R.C.M.C.E.T., Bahal, Haryana, India

Abstract - In this work, efforts were made to implement an IoT (Internet-of-Things) based multi-purpose system. One purpose was the remote control of multiple electrical appliances via a VPA (Virtual Personal Assistant); the other was the remote monitoring of a multi-sensor network via an MQTT broker. The VPA used was ‘Google Assistant’ by Google and the MQTT broker used was ‘AdafruitIO’ by Adafruit. A further purpose was to integrate and connect these two different services via an ‘IFTTT’ applet so that they could communicate with each other. In the IFTTT applet created here, ‘Google Assistant’ was used as the Trigger service whereas ‘AdafruitIO’ was used as the Action service. This IoT based system was designed, developed and implemented as a hardware prototype around a high-performance, low-cost NodeMCU development board. The multi-sensor network consists of a PIR motion sensor, an LDR light sensor, a DHT11 temperature and humidity sensor and an MQ135 gas sensor. All the sensors’ data and the status of the relays could be observed over the AdafruitIO dashboard on a laptop or mobile phone from anywhere in the world.

Key Words: MQTT, IFTTT, IoT, VPA, Google Assistant, AdafruitIO, WiFi, NodeMCU, etc.

1. INTRODUCTION

Human communication with machines is evolving day by day. This human-machine interaction process started with buttons, evolved to touch-pads, and now humans can give commands to intelligent machines simply by talking to them. One of the goals of Artificial Intelligence (AI) is the realization of natural dialogue between humans and machines. In recent years, voice systems, also known as interactive conversational systems, have been the fastest growing area in AI. A Voice Assistant (VA) is defined as a “voice-based interface that relies on voice commands supported by artificial intelligence to have verbal interaction with the end users for performing a variety of tasks”. VAs are gaining in popularity and getting integrated into our daily lives at a rapid pace. Besides, the progress of technology has made it easier to integrate them into devices, which greatly improves user experience and makes everyday life much easier, e.g. the user gives a voice command and the smart device executes it. Virtual voice assistants already handle many tasks like answering questions, turning off the lights, playing music and the like, and they are still learning with every interaction the users make. This intuitive way of interacting with technical devices without the need for haptic contact makes verbal communication the new interface to technology. Voice commands in an IoT-based smart home are simpler to apply than typed text messages, so users gain convenience compared to text. Dialogue systems, or conversational systems, can support a wide range of applications. Spoken dialogue systems or voice-controlled assistants are devices that can respond to multiple voices regardless of accent, execute several commands or provide answers, thus imitating a natural conversation. Speech recognition, also called speech-to-text translation, transforms the human voice into a string of data that can be interpreted by the computer. In recent years, cloud-based speech recognition systems have developed considerably; in this approach, all elements of a voice-controlled assistant are placed in the cloud. “Siri” from Apple, “Google Assistant” from Google, “Cortana” from Microsoft, “Alexa” from Amazon, and “Bixby” from Samsung are the five major VAs commercially available in the market at this point of time. They are present in most smartphones and are based on artificial intelligence elements such as deep learning and neural networks.

2. MOTIVATION

The motivation behind this work came from the consistently rising demand for Artificial Intelligence and IoT devices in our day-to-day life. These cutting-edge technologies have the power to strongly influence the working methodologies of humans as well as machines. The Human Machine Interface (HMI) can do a lot in some specific areas like robotics, smart homes, smart cars, etc. Further, for developers, easy access to platforms like Google Assistant, Alexa, Adafruit IO, IFTTT, etc. has widened the innovation spectrum worldwide. Even hobbyists can now use these platforms to develop their own gadgets. Although these platforms are limited in terms of their free usage, they provide an option to enhance one’s knowledge about the practicality of such applications.
3. AIM & OBJECTIVE

The aim of the project was to control electrical appliances and monitor a multi-sensor network using an Artificial Intelligence based Virtual Personal Assistant (VPA) by Google, i.e. ‘Google Assistant’. The developed IoT system was NodeMCU-centric, and the communication medium between the user’s mobile handset and the device to be controlled was wireless only, using the Wi-Fi protocol. The objective was to empower the user to issue a voice command over a mobile handset with VPA facility in order to control a smart device. The VPA should respond to the user by converting this voice command to text and revert to the user by replying in both voice and text, while concurrently performing the assigned task or transferring the command to the device.

The objective of the study was thus the implementation of an IoT based VPA (Virtual Personal Assistant) controlled home automation system. The proposed system should be capable of saving data to the cloud and should enable the user to perform remote surveillance of multi-sensor environmental parameters on a phone or laptop over the internet from anywhere in the world. It should also enable the user to control different appliances remotely over the internet through voice commands using ‘Google Assistant’, which is a VPA by Google.

4. SYSTEM WORKING PRINCIPLE

This project uses two services to make the system controllable through Google Assistant from anywhere in the world: Adafruit MQTT and IFTTT. The Adafruit MQTT broker allows the message feed/topic to be changed from any internet-connected device globally. Here, the NodeMCU acts as an MQTT client and therefore constantly listens to the Adafruit MQTT broker; whenever a change occurs on the server side, the same change is observed on the client side, i.e. on the NodeMCU board. To change the message on the MQTT broker side via Google Assistant, a further service called IFTTT was used. In IFTTT, an applet was created that connects the two services, Google Assistant and Adafruit MQTT. With a properly configured applet, the message feed/topic on the Adafruit broker side could be successfully updated through Google Assistant on the phone. If This Then That, also known as IFTTT, is a freeware web-based service that creates chains of simple conditional statements, called applets. An applet is triggered by changes that occur within other web services such as Gmail, Facebook, Telegram, Instagram, or Pinterest. IFTTT pairs two different services/devices so that they can talk to each other; in other words, IFTTT is a free way to get apps and devices talking to each other.
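As an illustration of this working principle (and not the exact firmware of this work), the minimal Arduino sketch below shows how the NodeMCU can act as an MQTT client of the Adafruit broker: it subscribes to an assumed AdafruitIO feed named ‘relay1’ and switches one relay channel whenever the IFTTT applet, triggered by a Google Assistant phrase, writes “ON” or “OFF” to that feed. The Wi-Fi credentials, AdafruitIO username/key, feed name and GPIO pin are placeholders, not values taken from the paper.

#include <ESP8266WiFi.h>
#include "Adafruit_MQTT.h"
#include "Adafruit_MQTT_Client.h"

// All credentials and names below are placeholders, not values from the paper.
#define WIFI_SSID     "your-ssid"
#define WIFI_PASS     "your-password"
#define AIO_SERVER    "io.adafruit.com"
#define AIO_PORT      1883
#define AIO_USERNAME  "your-adafruit-username"
#define AIO_KEY       "your-adafruit-io-key"
#define RELAY1_PIN    D1               // assumed GPIO for relay channel 1

WiFiClient espClient;
Adafruit_MQTT_Client mqtt(&espClient, AIO_SERVER, AIO_PORT, AIO_USERNAME, AIO_KEY);

// Assumed feed name; the IFTTT applet writes "ON"/"OFF" to this feed.
Adafruit_MQTT_Subscribe relay1Feed(&mqtt, AIO_USERNAME "/feeds/relay1");

void connectMqtt() {
  if (mqtt.connected()) return;
  while (mqtt.connect() != 0) {        // connect() returns 0 on success
    mqtt.disconnect();
    delay(5000);                       // retry every 5 seconds
  }
}

void setup() {
  pinMode(RELAY1_PIN, OUTPUT);
  digitalWrite(RELAY1_PIN, HIGH);      // typical relay boards are active-LOW, so HIGH = load off
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  mqtt.subscribe(&relay1Feed);         // the library (re)subscribes on every connect
}

void loop() {
  connectMqtt();
  Adafruit_MQTT_Subscribe *sub;
  // Wait up to 5 s for a message pushed by the broker.
  while ((sub = mqtt.readSubscription(5000))) {
    if (sub == &relay1Feed) {
      const char *msg = (const char *)relay1Feed.lastread;
      if (strcmp(msg, "ON") == 0)  digitalWrite(RELAY1_PIN, LOW);   // energise relay
      if (strcmp(msg, "OFF") == 0) digitalWrite(RELAY1_PIN, HIGH);  // de-energise relay
    }
  }
}

Additional relay channels would follow the same pattern, with one AdafruitIO feed, one subscription and one GPIO per channel.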
5. SYSTEM ARCHITECTURE

Fig.-1: System Architecture

6. METHODOLOGY ADOPTED

In this section, the methodologies followed to design and implement this system using different hardware and software tools are discussed.

6.1 System Components

To build this application, the following hardware components were used:

- NodeMCU (ESP8266) development board with Wi-Fi SoC
- Multiple analog and digital sensors, including PIR, DHT-11, LDR and MQ-135
- +5V DC power supply

Also, to build this application, three different software platforms were used:

- ‘Google Assistant’ Virtual Personal Assistant
- ‘AdafruitIO’ MQTT broker
- ‘IFTTT’ applet

To use the above services, each of these software platforms has to be configured individually.

6.2 Experimental Set-up

The experimental setup was developed as per the circuit design. All the sensor modules, i.e. the LDR light sensor module, the IR proximity sensor module, the MQ-135 gas sensor module, the PIR motion sensor module and the DHT-11 temperature and humidity sensor module, were deployed in the circuit along with a small buzzer and a four-channel relay module. The outputs of the relay modules were further extended to switch the AC load. As the AC load, four LED bulbs were used here for demonstration purposes only; otherwise, any single-phase AC electrical appliance could be connected to the relay outputs.

7. FIRMWARE COMPONENTS

The firmware was edited and compiled in the Arduino IDE (Integrated Development Environment). The very first step before writing the code is to work out an algorithm.
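As a sketch of such an algorithm (with assumed feed names, placeholder credentials and an assumed pin mapping, not the authors’ published code), the firmware could first connect to Wi-Fi and to the Adafruit broker, and then periodically read the sensors and publish their values to the corresponding AdafruitIO dashboard feeds, for example:

#include <ESP8266WiFi.h>
#include <DHT.h>                       // Adafruit DHT sensor library
#include "Adafruit_MQTT.h"
#include "Adafruit_MQTT_Client.h"

// Placeholders, not values from the paper.
#define WIFI_SSID    "your-ssid"
#define WIFI_PASS    "your-password"
#define AIO_USERNAME "your-adafruit-username"
#define AIO_KEY      "your-adafruit-io-key"

// Assumed pin mapping. The NodeMCU has a single ADC input (A0), used here for
// the LDR; the PIR and MQ-135 modules are read via their digital outputs.
#define DHTPIN  D4
#define PIRPIN  D5
#define MQPIN   D6
#define LDRPIN  A0

WiFiClient espClient;
Adafruit_MQTT_Client mqtt(&espClient, "io.adafruit.com", 1883, AIO_USERNAME, AIO_KEY);

// Assumed feed names for the AdafruitIO dashboard.
Adafruit_MQTT_Publish tempFeed(&mqtt, AIO_USERNAME "/feeds/temperature");
Adafruit_MQTT_Publish humFeed(&mqtt, AIO_USERNAME "/feeds/humidity");
Adafruit_MQTT_Publish lightFeed(&mqtt, AIO_USERNAME "/feeds/light");
Adafruit_MQTT_Publish motionFeed(&mqtt, AIO_USERNAME "/feeds/motion");
Adafruit_MQTT_Publish gasFeed(&mqtt, AIO_USERNAME "/feeds/gas");

DHT dht(DHTPIN, DHT11);

void connectMqtt() {
  if (mqtt.connected()) return;
  while (mqtt.connect() != 0) {        // connect() returns 0 on success
    mqtt.disconnect();
    delay(5000);                       // retry every 5 seconds
  }
}

void setup() {
  pinMode(PIRPIN, INPUT);
  pinMode(MQPIN, INPUT);
  dht.begin();
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(500);
}

void loop() {
  connectMqtt();

  float t = dht.readTemperature();     // degrees Celsius
  float h = dht.readHumidity();        // percent relative humidity
  if (!isnan(t)) tempFeed.publish(t);
  if (!isnan(h)) humFeed.publish(h);

  lightFeed.publish((int32_t)analogRead(LDRPIN));    // raw light level, 0-1023
  motionFeed.publish((int32_t)digitalRead(PIRPIN));  // HIGH while motion is detected
  gasFeed.publish((int32_t)digitalRead(MQPIN));      // MQ-135 module comparator output

  delay(10000);   // publish roughly every 10 s to stay within AdafruitIO's free-tier rate limit
}

On the AdafruitIO dashboard, each of these feeds can then be attached to a gauge, chart or indicator block, which is how the sensor data and relay status become visible on a laptop or mobile phone.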