
Article

Autonomous Indoor Scanning System Collecting Spatial and Environmental Data for Efficient Indoor Monitoring and Control

Dongwoo Park and Soyoung Hwang *

Department of Software, Catholic University of Pusan, Busan 46252, Korea; [email protected]
* Correspondence: [email protected]; Tel.: +82-51-510-0644

Received: 23 July 2020; Accepted: 7 September 2020; Published: 11 September 2020

Abstract: As various activities related to entertainment, business, shopping, and conventions are increasingly done indoors, the demand for indoor spatial information and indoor environmental data is growing. Unlike in outdoor environments, obtaining spatial information in indoor environments is difficult. Given the absence of GNSS (Global Navigation Satellite System) signals, various technologies for indoor positioning, mapping and modeling have been proposed. Related business models for indoor space services, safety, convenience, facility management, and disaster response have, moreover, been suggested. An autonomous scanning system for collection of indoor spatial and environmental data is proposed in this paper. The proposed system can be utilized to collect spatial dimensions suitable for extracting a two-dimensional indoor drawing and obtaining spatial images, as well as indoor environmental data on temperature, humidity and particulate matter. For these operations, the system has two modes, manual and autonomous. The main function of the system is the autonomous mode; the manual mode is implemented additionally. The system can be applied in facilities without infrastructure for indoor data collection, such as for routine indoor data collection purposes, and it can also be used for immediate indoor data collection in cases of emergency (e.g., accidents, disasters).

Keywords: autonomous scanning; indoor spatial data; environmental data; IoT (Internet of Things); mobile application; sensor

1. Introduction

People are known to spend more than 90% of their lives indoors. As this time increases, the various elements and environments of indoor spaces inevitably impact human life. Additionally, the expression and understanding of indoor spaces have grown more complicated as commercial buildings and large-scale facilities have increased in number. Accordingly, the demand for location-based services that can process space and location information is increasing. In order to realize such services, basic indoor spatial information must be provided. This includes detailed indoor maps and models that can be used for route planning, navigation guidance, and many other applications, as well as location information for pedestrians and goods in indoor spaces. Unlike in outdoor environments, obtaining spatial information in indoor environments is difficult. In the absence of GNSS (Global Navigation Satellite System) signals, various technologies for indoor positioning, mapping and modeling have been proposed. An indoor positioning technology is one that determines an object's or person's location and then tracks its movement to the next location. Indoor mapping and modeling entail the generation of spatial information based on data that already exists or is obtained by surveying, followed by detection of changes and updating accordingly.

Related business models for indoor space services, convenience, facility management, safety, and disaster response have been suggested. In order to meet the market demand for indoor applications, techniques for deriving indoor spatial information and performance analysis data for the related systems are presented in [1,2]. Usually, they suggest schemes for deriving indoor maps and models based on collected point cloud data. In addition, smart home or building infrastructures integrating IoT technology based on sensors and actuators for monitoring and control of indoor spaces such as homes or buildings have been proposed [3,4]. Therefore, we consider a system that can autonomously collect indoor data for real-time monitoring and control of indoor spaces where there is no infrastructure for the purpose. The data include indoor spatial data as well as environmental data.

In these pages, an autonomous scanning system for collection of indoor spatial and environmental data, and thereby efficient monitoring and control of indoor spaces, is proposed. The system can be utilized to collect indoor environmental data on temperature, humidity and particulate matter, as well as spatial dimensions suitable for extracting a two-dimensional indoor drawing and obtaining spatial images. For these operations, the system has two modes, manual and autonomous. It comprises three main components: a user terminal, a gateway, and an autonomous scanner. The user terminal controls the autonomous scanner component and immediately checks scanned information that comes through the gateway. The gateway receives control commands from the user terminal and sends monitoring information to the user. Finally, the autonomous scanner comprises the following components: a mobile robot, a lidar sensor, a camera sensor, a temperature/humidity sensor, and a particulate matter sensor. It is used to overcome the distance limitation of the lidar sensor in collecting indoor spatial information and to perform autonomous scanning. The proposed system can be applied in facilities without infrastructure for indoor data collection, such as for routine indoor data collection purposes, and it can also be used for immediate indoor data collection in cases of emergency (e.g., accidents, disasters).

The remainder of this paper is organized as follows. In Section 2, the background and motivation of the present study are discussed. Section 3 introduces the autonomous scanning system's design and prototype implementation. Section 4 presents the system's experimental results. Section 5 concludes the paper.

2. Background and Motivation

As an indoor space is a deliberately designed space, it can have many characteristics according to its purpose. Therefore, various related information is available. In this section, we survey the research on indoor mapping and modeling, indoor positioning and localization, and application services utilizing indoor spatial and environmental information. In addition, we present the motivation for the present research.

2.1. Indoor Mapping and Modeling

Various techniques have been developed for the efficient construction of indoor spatial data, expressing and visualizing it in the form of indoor maps or models, for example. Among those techniques are automatic or semi-automatic extraction from architectural blueprints, Building Information Management (BIM) systems that use specialized software, and scanning-based indoor map construction. Autonomous mobile robots are widely employed in many areas. A robot with a Laser Range Finder (LRF) developed for environment-scanning purposes is presented in [5,6]. In that study, the mobile robot was operated in a known environment; as such, path following was the easiest navigation mode to adopt. Path following allows a robot's movement and navigation according to a preprogrammed path. When the robot moves through the black points, it stops in order to scan the environment; then, after the scanned data has been completely saved in the computer, the robot continues on its path to the end point [6]. Adán et al. present a critical review of current mobile scanning systems in the field of automatic 3D building modeling in [7]. Díaz-Vilariño et al. propose a method for the 3D modeling of indoor spaces from point cloud data in [8]. Some work has evaluated and analyzed the performance of conventional indoor scanning and mapping systems [9,10] that utilize hand-held devices or backpack or trolley forms. A simultaneous localization and mapping approach based on data captured by a 2D laser scanner and a monocular camera is introduced in [11].

2.2. Indoor Positioning and Localization

Indoor positioning technology identifies and determines the location of an object or person and tracks it to a new location. Indoor positioning technology is a key element in indoor spatial information utilization services. In outdoor scenarios, the mobile terminal position can be obtained with a high degree of accuracy thanks to GNSS. However, GNSS encounters problems in indoor environments and in scenarios involving deep shadowing effects. Various indoor and outdoor technologies and methodologies, including Time of Arrival (ToA), Time Difference of Arrival (TDoA), Received Signal Strength (RSS)-based fingerprinting as well as hybrid techniques, are surveyed in [12], which emphasizes indoor methodologies and concepts. Additionally, it reviews various localization-based applications to which location-estimation information is critical. Recently, a number of new techniques have been introduced, as well as wireless technologies and mechanisms that leverage the Internet of Things (IoT) and ubiquitous connectivity to provide indoor localization services to users. Other indoor localization techniques, such as Angle of Arrival (AoA), Time of Flight (ToF), and Return Time of Flight (RTOF), as well as the above-noted RSS, are analyzed in [13–16], all of which are based on WiFi, Radio Frequency Identification Device (RFID), Bluetooth, Ultra-Wideband (UWB), and other technologies proposed in the literature. For indoor positioning, various algorithms utilizing WiFi signals are investigated in [17–22] to improve localization accuracy and performance. With regard to UWB ranging systems, an improved target detection and tracking scheme for moving objects is discussed in [23], and a novel approach to precise TOA estimation and ranging error mitigation is presented in [24]. Thanks to the widespread deployment of WiFi and the support of the IEEE 802.11 standard by the majority of mobile devices, most proposed indoor localization systems are based on WiFi technologies. However, the inherent noise and instability of wireless signals usually degrade accuracy and robustness in a dynamically changing environment. A recent novel approach is the use of deep learning-based frameworks to improve localization accuracy. Abbas et al. propose an accurate and robust WiFi fingerprinting localization technique based on a deep neural network in [25]. A deep learning framework for joint activity recognition and indoor localization using WiFi channel state information (CSI) fingerprints is presented in [26]. Hoang et al. describe RNNs (Recurrent Neural Networks) for WiFi fingerprinting indoor localization, focusing on trajectory positioning, in [27].

2.3. Applications Utilizing Indoor Spatial and Environmental Information

Applications utilizing indoor spatial information facilitate indoor living and can be defined as services provided through wired/wireless terminals by acquiring and managing information directly or indirectly related to the indoor space. Services using indoor spatial information can be categorized into those enhancing people's security and convenience, facility management and disaster response, marketing, and other businesses based on basic services provided by locating their client occupants. Security services in high-rise and large-sized buildings rely on the effective utilization of indoor spatial information. These services include software and smart equipment to maintain the security of occupants in the building and to assist in firefighting and rescue activities. Convenience enhancement services include user location-based road guidance services that guide visitors through complex buildings, such as large-scale buildings, underground facilities, and complex space facilities. Facility management and disaster response services can be classified into space and asset management, BEMS (Building Energy Management System), and PSIM (Physical Security Information Management).

BEMS is an integrated building energy control and management system that enables facility managers to take advantage of rational energy consumption by using ICT (Information and Communication Technology) to efficiently maintain a pleasant and functional work environment for users. PSIM is a service for responding to disaster situations by identifying such situations and assisting evacuees [28].

As one research field for efficient energy control of buildings using ICT technologies, video/image processing technologies have been explored to monitor human thermal comfort status in a contactless way, providing feedback signals for BEMS to create energy efficient, pleasant and functional work environments [29–31]. Martinez-Sala et al. describe an indoor navigation system for visually impaired people in [32]. Plageras et al. propose a system for collecting and managing sensor data in a smart building operating in an IoT environment in [33]. The importance of indoor environmental monitoring for human safety and health is emphasized in [34]. Mujan et al. review the state-of-the-art literature and establish a connection between the factors that influence health and productivity in any given indoor environment, whether residential or commercial, in [35]. A systematic review of the relevance between indoor sensors and managing optimal energy saving, thermal comfort, visual comfort, and indoor air quality in the built environment is presented in [36]. Therefore, indoor environment scanning can be useful for investigating, monitoring and studying related indoor applications.

This paper proposes an autonomous scanning system for acquisition of indoor environmental data such as temperature, humidity, particulate matter and pictures of the indoor space, as well as spatial data, for awareness of the indoor environment status and for indoor visualization.
The proposed system can be applied in facilities without infrastructure for indoor data collection, such as for routine indoor data collection purposes, and it can also be used for immediate indoor data collection in cases of emergency (e.g., accidents, disasters).

3. Autonomous Scanning System

This section presents the architecture, functional design and prototype implementation of the proposed autonomous scanning system.

3.1. Architecture of Autonomous Scanning System

The conceptual architecture of the autonomous scanning system is defined as shown in Figure 1. It is composed of three main parts: a user terminal, a gateway, and an autonomous scanner.

Figure 1. Conceptual architecture of proposed autonomous scanning system. It is composed of three main parts: a user terminal, a gateway, and an autonomous scanner.

The user terminal is configured to receive and confirm the results of the indoor spatial data and environmental data, and to input control commands. The gateway is configured to relay control commands input from the user terminal and the spatial and environmental data collected by the autonomous scanner. The autonomous scanner is configured to facilitate indoor data collection and is designed with autonomous driving capability. In addition, various sensor equipment is considered for indoor environment data collection.


3.2. Functional Design of Autonomous Scanning System

In the design of the autonomous scanning system, we utilized prototyping platforms because, for the purpose of detailed function definition, it was necessary to first confirm the components to be used. Figure 2 shows the proposed system's components.

Figure 2. Components of proposed autonomous scanning system. We utilized prototyping platforms such as Raspberry Pi and iRobot Create 2.

A smartphone (Galaxy J6) is used for user-terminal monitoring and control. Raspberry Pi 3 Model B, a single-board computer, is used for the gateway. It has a 64-bit quad core processor, on-board WiFi, Bluetooth and USB boot capabilities. Raspberry Pi's various communication and interface support and computing capabilities are suitable for rapid prototyping of the gateway implementation and experimentation. The autonomous scanner component is composed of a mobile robot, a lidar sensor, a camera sensor, a temperature/humidity sensor and a particulate matter sensor. iRobot Create 2, based on Roomba 600, is used as the mobile robot. This is a programmable robot platform designed to enable the user to set various functions such as motion, sound and light. As such, it is equipped with LEDs and various sensors and can be programmed for specific sounds, behavior patterns, LED light, and so on [37].

Raspberry Pi Camera V2 is used as the camera sensor. It is attached to Raspberry Pi 3 via a Camera Serial Interface (CSI). DHT22 is used as a low-cost digital temperature/humidity sensor. It uses a capacitive humidity sensor and a thermistor to measure the surrounding air and outputs a digital signal on the data pin. Plantower PMS7003 is used as the particulate matter sensor. It is a digital, universal particle-concentration sensor that can be used to obtain the number of suspended particles in the air, i.e., the concentration of particles, and output them through a digital interface. The main components of the output are the quality and number of particles of different sizes per unit volume, where the particle-number unit volume is 0.1 L and the unit of mass concentration is µg/m³. Lidar is an acronym for light detection and ranging. It is a surveying method that measures distance to a target by illuminating the target with pulsed laser light and measuring the reflected pulses with a sensor. RPLidar A2, used as the lidar sensor, is a 360-degree 2D laser scanner that can take up to 4000 samples of laser-ranging data per second. Its scanning range is 12 m [38].
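As a brief, hedged illustration of how scan data might be read from the RPLidar A2 on the gateway side, the following sketch uses the open-source rplidar Python package and assumes the sensor appears as /dev/ttyUSB0; the paper does not state which driver library the authors used.

```python
# Minimal sketch (not the authors' code): grabbing one 360-degree scan from
# RPLidar A2, assuming the open-source `rplidar` package and /dev/ttyUSB0.
from rplidar import RPLidar

PORT = '/dev/ttyUSB0'  # assumed serial port of the RPLidar A2

def grab_single_scan(port=PORT):
    """Return one full scan as a list of (angle_deg, distance_mm) tuples."""
    lidar = RPLidar(port)
    try:
        # iter_scans() yields lists of (quality, angle, distance) measurements.
        for scan in lidar.iter_scans():
            return [(angle, dist) for _, angle, dist in scan]
    finally:
        lidar.stop()
        lidar.stop_motor()
        lidar.disconnect()

if __name__ == '__main__':
    points = grab_single_scan()
    print('measurements in scan:', len(points))
```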


3.2.1. Function Definitions

The user terminal has two modes for indoor data collection, manual and autonomous. In the manual mode, the movement of the mobile robot, as well as the lidar and camera sensors, is manually controlled. When the autonomous mode is selected, the mobile robot performs autonomous driving, avoiding obstacles to collect and provide sensed data periodically.

The gateway receives and interprets a user's command to execute a program that drives the autonomous scanner accordingly. It stores sensed data collected by the autonomous scanner. It generates spatial data by converting the data collected from the lidar sensor to a two-dimensional drawing, and stores temperature/humidity and particulate matter data that is measured periodically. These data are transmitted to the user in response to the user's request. Additionally, a program that controls the movement of the robot is executed according to the driving mode of the mobile robot. It drives forward, backward, and makes left and right turns according to the user's movement command. The gateway also defines and operates the robot's autonomous driving for autonomous scanning.

The autonomous scanner is defined as a driving vehicle that receives commands from the user through the gateway. The mobile robot equipped with sensors moves in the indoor space according to the user's command, collects spatial data and environmental data, and transmits the data to the gateway. The lidar sensor collects indoor spatial data and transmits it to the gateway. The temperature/humidity sensor and particulate matter sensor collect the corresponding data and transmit them to the gateway. The mobile robot operates in the manual mode or the autonomous mode according to the command received through the gateway. The robot can move forward, backward and turn left and right.
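To make the two-dimensional drawing step concrete, the sketch below shows one plausible way the gateway could convert a scan of (angle, distance) pairs into Cartesian points and render them as a simple scanned image; matplotlib and the sample values are assumptions, not the authors' implementation.

```python
# Illustrative sketch only: converting a lidar scan (angle in degrees,
# distance in mm) into 2D points and saving them as a simple scanned image,
# roughly the kind of two-dimensional drawing described above.
import math
import matplotlib
matplotlib.use('Agg')          # headless rendering on the Raspberry Pi gateway
import matplotlib.pyplot as plt

def scan_to_points(scan):
    """Convert [(angle_deg, distance_mm), ...] to [(x_mm, y_mm), ...]."""
    points = []
    for angle_deg, dist_mm in scan:
        theta = math.radians(angle_deg)
        points.append((dist_mm * math.cos(theta), dist_mm * math.sin(theta)))
    return points

def save_scan_image(scan, path='scan.png'):
    xs, ys = zip(*scan_to_points(scan))
    plt.figure(figsize=(5, 5))
    plt.scatter(xs, ys, s=2)
    plt.gca().set_aspect('equal')
    plt.savefig(path)
    plt.close()

# Example with two fabricated sample points (angle deg, distance mm):
save_scan_image([(0.0, 1000.0), (90.0, 1500.0)])
```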

3.2.2. Autonomous Scanning Algorithm

For the autonomous scanning operation, it is necessary to define the autonomous driving technique of the mobile robot. For the autonomous driving of the robot, we utilize the spatial data collected by the lidar sensor. The scanned data of the lidar sensor consists of distance and angle data from the current position of the mobile robot to surrounding obstacles. From the collected data, we extract the direction and distance to the next destination of the robot. The detailed algorithm is as follows. We rearrange the scanned data in descending order according to the distance, extract a certain amount of high-ranked data and average the angles. The amount of extracted data is decided in the implementation considering the sampling rate and scanning frequency of the lidar sensor. The average of the angles is obtained using the mean of circular quantities, given by Equation (1).

\theta =
\begin{cases}
\arctan\left(\dfrac{s}{c}\right) & s > 0,\ c > 0 \\
\arctan\left(\dfrac{s}{c}\right) + 180^{\circ} & c < 0 \\
\arctan\left(\dfrac{s}{c}\right) + 360^{\circ} & s < 0,\ c > 0
\end{cases}
\qquad (1)

In Equation (1), s is the mean sine of the angles, c is the mean cosine of the angles, and θ is the mean of the angles. The angle thus obtained becomes the direction of movement toward the next destination. The moving distance should be set in consideration of the distance value used in the calculation and the measurement range of the lidar sensor. The detailed operations of the autonomous driving algorithm are presented in Figures 3–5. Additionally, as shown in Figure 4, the algorithm works so that the proper movement direction can be extracted even if the mobile robot's position is not centered. Once the initial direction is defined, to prevent the robot from going back, the back part of the angle corresponding to the rotation angle from the scan is excluded. The next direction is calculated by excluding the backward angle based on the current direction, as shown in Figure 5 (left, middle).

Even when the robot approaches the wall, the next direction can be derived according to the proposed autonomous driving algorithm, as presented in Figure 5 (right).
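For reference, Equation (1) can be computed compactly with atan2, which covers the quadrant cases in one call; the helper below is a minimal sketch of that computation, not the authors' script.

```python
# A direct transcription of Equation (1): the mean of circular quantities,
# implemented with atan2 so the quadrant cases are handled in one call.
import math

def mean_angle_deg(angles_deg):
    """Mean of angles in degrees, treating them as circular quantities."""
    s = sum(math.sin(math.radians(a)) for a in angles_deg) / len(angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg) / len(angles_deg)
    # atan2(s, c) is equivalent to the case analysis in Equation (1).
    return math.degrees(math.atan2(s, c)) % 360.0

# Example: angles around 0 degrees average correctly instead of to 180.
print(mean_angle_deg([350.0, 10.0]))   # ~0.0 (or ~360.0 from rounding), not 180.0
```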

Figure 3. Derivation of direction for next destination in proposed autonomous driving algorithm.

Figure 4. Derivation of direction for next destination in proposed autonomous driving algorithm when mobile robot is at corner.



Figure 5. Exclusion of back part of angle corresponding to rotation angle from scan to derive direction for next destination in proposed autonomous driving algorithm.


3.3. Prototype Implementation

In this section, we present the implementation of the autonomous scanning system. The prototype architecture of the autonomous scanning system is defined as shown in Figure 6. It is composed of three main parts: a user terminal, a gateway, and an autonomous scanner.

Figure 6. Prototype architecture of proposed autonomous scanning system. The user terminal exchanges data with the gateway via TCP/IP communication. The autonomous scanner components are connected to the gateway directly.


The user terminal controls the mobile robot as well as the lidar and camera sensors. It monitors data in the forms of scanned images from the lidar sensor, captured images from the camera sensor, and data from the temperature/humidity sensor and particulate matter sensor. It exchanges this data with the gateway via TCP/IP communication. The gateway receives control commands from the user terminal and sends monitoring data back to the user terminal via TCP/IP communication. The autonomous scanner components are connected to the gateway directly. The mobile robot, particulate matter sensor and lidar sensor are connected and communicate via a USB serial interface. The camera sensor is attached to the gateway via a CSI (Camera Serial Interface), and the temperature/humidity sensor is connected to the gateway via a GPIO (General Purpose Input and Output) port. The gateway controls the autonomous scanner according to user commands and obtains scanned data.
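As a rough sketch of the TCP/IP relay role described above, the following gateway-side loop receives newline-delimited text commands from the user terminal and dispatches them; the port number, command names, and framing are assumptions, since the paper does not specify the protocol details.

```python
# Hedged sketch of the gateway's TCP/IP command loop; command names and the
# port are illustrative placeholders, not the authors' actual protocol.
import socket

HOST, PORT = '0.0.0.0', 5000   # assumed listening address/port

def handle_command(cmd):
    # Placeholder dispatch: map command strings to scanner actions.
    if cmd == 'SCAN':
        return 'scanned image stored'
    if cmd == 'PHOTO':
        return 'captured image stored'
    if cmd in ('FWD', 'BACK', 'LEFT', 'RIGHT'):
        return 'moved ' + cmd
    return 'unknown command'

def serve():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            buf = b''
            while True:
                data = conn.recv(1024)
                if not data:
                    break
                buf += data
                while b'\n' in buf:
                    line, buf = buf.split(b'\n', 1)
                    reply = handle_command(line.decode().strip())
                    conn.sendall((reply + '\n').encode())

if __name__ == '__main__':
    serve()
```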

3.3.1. Mobile Application for User Terminal

For the purposes of the user terminal, we developed a mobile application to monitor and control the system. Some screenshots of the mobile application are presented in Figure 7.

Figure 7. Screenshots of mobile application in proposed system. Intro screen of application (left), and control screen of application (right).

In the right of the figure, the control screen is shown. When the user touches the ‘click to release’ menu, temperature, humidity and particulate matter data are updated periodically. The operation mode of the autonomous scanner can be selected between the autonomous mode and the manual mode. The main function of the system is the autonomous mode; the manual mode is implemented additionally. When the autonomous mode is set, the mobile robot moves according to the autonomous scanning algorithm and collects indoor spatial data, image data, temperature, humidity and particulate matter data. The initial state of the operation mode is manual, which means that the mobile robot is controlled by the user using the four directional buttons below. The upward button is for forward movement; the downward button is for backward movement; the leftward button is for left turns, and the rightward button is for right turns.

If the user touches the camera icon, a picture-taking command is transmitted to the gateway, and the gateway controls the camera sensor and stores the captured image. If the user touches the gallery icon, the gateway transmits the captured image to the user terminal. The scan icon controls the lidar sensor. When the gateway receives a scan command, it controls the lidar sensor and obtains and draws 2D point coordinates as a scanned image. If the user touches the map icon, the gateway transmits the scanned image to the user.

The main difference between the autonomous mode and the manual mode is the control of the mobile robot. In the manual mode, the user must control the movement of the mobile robot, and in the autonomous mode, the mobile robot is driven to avoid obstacles according to the autonomous driving algorithm. In the autonomous mode, indoor scanning and camera control are automatically performed at every movement step in the collection of sensor data, whereas in the manual mode, they must be performed manually by the user. Even so, both temperature and humidity data are collected periodically and sent to the user terminal in the manual mode.

3.3.2. Gateway

The gateway operates in direct connection with the mobile robot and sensors. According to the user's command, it operates the associated control program and stores various data collected by the autonomous scanner. In Raspberry Pi, three main Python scripts are implemented. The first script includes processing of the user's commands and operation of iRobot in the manual or autonomous mode. The second script entails control of RPLidar A2 and Pi Camera V2. The third script includes control of the temperature/humidity and particulate matter sensors. Data collected from each sensor is stored as a file in Raspberry Pi. The format of a temperature/humidity sensor data file is as follows.

Temp = 23.6 °C Humidity = 29.1% Fri Mar 20 16:53:27 2020

The format of a particulate matter sensor data file is as follows.

PM 1.0: 15 PM 2.5: 20 PM 10.0: 22 Fri Mar 20 17:55:46 2020

For the purposes of tracing and analysis, the measured date and time information is added to each item of sensed data. The following Table 1 shows sample angle and distance data obtained from the RPLidar A2 lidar sensor. The data of the lidar sensor is provided to the user by converting it to a scanned image in the form of a two-dimensional drawing, and is also used to calculate the movement direction and distance of the iRobot when it is operating in the autonomous mode.

Table 1. Sample angle and distance data obtained from RPLidar A2 lidar sensor.

Angle          Distance       Angle          Distance
108.203125     1243.0         111.34375      1264.0
109.765625     1249.75        112.984375     1278.25
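As an illustrative sketch of the gateway-side logging that produces the temperature/humidity file format shown above, the following uses the Adafruit_DHT package on GPIO pin 4 and a one-minute interval; all three choices are assumptions rather than details taken from the paper.

```python
# Hedged sketch of temperature/humidity logging in the file format shown above.
# Adafruit_DHT, GPIO pin 4 and the 60 s interval are assumptions.
import time
import Adafruit_DHT

SENSOR = Adafruit_DHT.DHT22
PIN = 4                      # assumed GPIO pin for the DHT22 data line

def log_temp_humidity(path='temp_humidity.txt'):
    humidity, temperature = Adafruit_DHT.read_retry(SENSOR, PIN)
    if humidity is None or temperature is None:
        return               # sensor read failed; skip this cycle
    line = 'Temp = {:.1f} °C Humidity = {:.1f}% {}\n'.format(
        temperature, humidity, time.ctime())
    with open(path, 'a') as f:
        f.write(line)

if __name__ == '__main__':
    while True:
        log_temp_humidity()
        time.sleep(60)       # periodic collection; the interval is an assumption
```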

For an autonomous scan operation, autonomous driving of iRobot is implemented as follows. First, the gateway collects angle and distance data from the current position of iRobot to the surrounding obstacles using RPLidar A2 and sorts the data in descending order based on the distance values. Then, it extracts the 550–1100 highest-ranked data points and averages the angles using the mean of circular quantities formula. Here, to prevent the robot from going back, we added code that excludes the back part of the angle corresponding to the rotation angle from the scan. The angle of the current direction of the robot is always 0°; therefore, to prevent the robot from going back, the back part of the angle to be excluded was set from 120° to 260° in this implementation. The angle thus obtained becomes the direction of movement toward the next destination. The moving distance should be set in consideration of the distance value used in the calculation and the measurement range of the lidar sensor. In this implementation, we choose the top distance value used in the calculation, divide it by 160, which is determined heuristically, and set the result as the movement distance.
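Putting the pieces of this paragraph together, one autonomous-driving step could be sketched as follows; the parameter values (550–1100 samples, the 120°–260° backward sector, and the divisor of 160) follow the text, while the function and variable names are illustrative only.

```python
# Sketch of one autonomous-driving step as described above: drop the backward
# sector, sort by distance, keep the highest-ranked samples, take the circular
# mean of their angles, and scale the top distance by the heuristic divisor.
import math

TOP_MAX = 1100                      # roughly 550-1100 high-ranked samples kept
BACK_MIN, BACK_MAX = 120.0, 260.0   # backward sector to exclude (degrees)
DIST_DIVISOR = 160.0                # heuristic divisor for the move distance

def next_move(scan):
    """scan: list of (angle_deg, distance_mm) measured from the robot.

    Returns (heading_deg, distance_mm) for the next destination, or None."""
    # Exclude the backward sector so the robot does not turn back
    # (the current heading is always 0 degrees in the scan frame).
    forward = [(a, d) for a, d in scan if not (BACK_MIN <= a <= BACK_MAX)]
    # Sort by distance, farthest first, and keep the high-ranked samples.
    ranked = sorted(forward, key=lambda p: p[1], reverse=True)
    top = ranked[:TOP_MAX]
    if not top:
        return None
    # Circular mean of the selected angles (Equation (1)).
    s = sum(math.sin(math.radians(a)) for a, _ in top) / len(top)
    c = sum(math.cos(math.radians(a)) for a, _ in top) / len(top)
    heading = math.degrees(math.atan2(s, c)) % 360.0
    # Move distance: the top distance value scaled by the heuristic divisor.
    distance = ranked[0][1] / DIST_DIVISOR
    return heading, distance
```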

3.3.3. Autonomous Scanner

The autonomous scanner is a driving vehicle that actually operates according to a command issued by the user via the gateway. The mobile robot equipped with sensors moves in the indoor space, collects spatial data and environmental data, and transmits the data to the gateway. The movement of iRobot and the control of sensors are implemented using Python scripts in Raspberry Pi, as mentioned in Section 3.3.2.

Figure 8 shows iRobot's sensors that are mainly related to driving. The wall sensor allows the robot to identify the right wall and continue to move along it. The light bumper sensor detects an obstacle in advance while the robot is driving so that it can be avoided. The bumper sensor detects a collision when the robot collides with obstacles that it has not detected beforehand and enables the robot to continue to move after handling the collision.

Figure 8. Sensors of iRobot mainly related to driving: light bumper sensors, wall sensor, bumper sensors.

In this implementation of the autonomous scanning algorithm, when the bumper sensor is activated, it is judged as a collision with an unexpected obstacle, and the robot stops and moves backward a pre-defined distance. Then, the next destination is recalculated according to the autonomous scanning algorithm.

4. Experiments

In this section, we present the experimental results for the proposed autonomous scanning system. In order to verify the operation of the proposed system, we selected some indoor spaces and performed experiments. The experiments were performed in two parts, manual scanning and autonomous scanning.

4.1. Manual Scanning Experiment

The manual scanning experiment was carried out in the lobby shown in Figure 9. The initial state of the mobile robot was the manual-movement mode. The user controlled the lidar and camera sensors.


Figure 9. Indoor space (lobby) wherein experiment was performed. This was for the manual operation experiment.

Figure 10 presents the environmental data and captured image for the indoor lobby along with the scanned result. The user could check temperature/humidity, particulate matter data, captured images and the scanned image immediately via smartphone. The indoor spatial and environmental data was also maintained in the gateway.

Figure 10. Environmental data result (left), captured image (middle) and scanned result (right) at initial position in lobby.

In the second experiment, the user controlled the mobile robot's movement in the manual mode. Figure 11 shows the results of movement.



Figure 11. Indoor space (lobby) wherein experiment was performed. This is the movement result of iRobot in the manual operation mode.

Figure 12 presents the environmental data and captured image and scanned result after movement of the robot.

Figure 12. Environmental data result (left), captured image (middle) and scanned result (right) for movement in lobby.

4.2. Autonomous Scanning Experiment

The autonomous scanning experiment was carried out in the corridor. Figure 13 shows an example of the autonomous mode operation of the proposed system. The user activates the autonomous mode using a smartphone, and then the autonomous scanner operates in the autonomous mode and the user obtains the scanned result.

Figure 13. Autonomous mode operation example. Activation of autonomous mode (left), autonomous scanning operation of system (middle), and scanned result in autonomous mode (right).

The first autonomous scanning experiment was carried out in the corridor shown in Figure 14.

Figure 14. Indoor space (corridor) wherein first autonomous scanning experiment was performed (panoramic view).

Figure 15 shows the 2D drawing result obtained by integrating the scan data collected by the autonomous scanner. Each number represents a point moved to by autonomous driving from the starting position of the autonomous scanner. In the process of integrating the scan data, we found that a calibration scheme is needed. For example, there is some noise between numbers 5 and 6 in the following scanned result. This issue will be held over for future research.
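Although the paper does not detail the integration (or calibration) procedure, a simple way to merge scans taken at successive stops into one drawing is to transform each scan by the robot's pose at that stop; the sketch below assumes such a pose estimate is available from the commanded motion.

```python
# One simple way to integrate scans taken at successive stops into a single
# 2D drawing, assuming the robot's pose (x, y, heading) at each stop is
# tracked from the commanded motion. This is only a sketch of the idea; it is
# not the authors' integration or calibration procedure.
import math

def scan_to_global(scan, pose):
    """scan: [(angle_deg, distance_mm), ...] in the robot frame.
    pose: (x_mm, y_mm, heading_deg) of the robot in the global frame."""
    x0, y0, heading = pose
    points = []
    for angle_deg, dist_mm in scan:
        theta = math.radians(heading + angle_deg)
        points.append((x0 + dist_mm * math.cos(theta),
                       y0 + dist_mm * math.sin(theta)))
    return points

def integrate(scans_with_poses):
    """Merge [(scan, pose), ...] into one global point list."""
    merged = []
    for scan, pose in scans_with_poses:
        merged.extend(scan_to_global(scan, pose))
    return merged
```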



Figure 15. Integrated image of scanned result in first autonomous scanning experiment. The numbers indicate the positions after movement from the starting position of the autonomous scanner (left); also, the trajectory of the autonomous scanner is indicated (right).

In the autonomous operation mode, the mobile robot moves according to the autonomous scanning algorithm and collects indoor spatial data, image data, temperature, humidity and particulate matter data at each moving position. The gateway stores the collected data from each sensor. The initial position in the indoor space (corridor) for the first autonomous scanning experiment is shown in Figure 16.

Figure 16. Initial position in indoor space where first autonomous scanning experiment was performed.

Figure 17 presents the environmental data and captured image for the indoor corridor along with the scanned result at the initial position. The user could check temperature/humidity, particulate matter data, the captured image and the scanned image immediately via smartphone. The indoor spatial and environmental data was also maintained in the gateway.



Figure 17. Environmental data result (left), captured image (middle), and scanned result (right) at initial position from which first autonomous scanning experiment was performed.

Figure 18 shows the position (marked 4 in Figure 15) in the corridor space where the first autonomous scanning experiment was performed.

Figure 18. Position (marked 4 in Figure 15) in indoor space where first autonomous scanning experiment was performed.

Figure 19 presents the environmental data and captured image and scanned result at the position (marked 4 in Figure 15) from which the first autonomous scanning experiment was performed.


Figure 19. Environmental data result (left), captured image (middle), and scanned result (right) at position (marked 4 in Figure 15) from which first autonomous scanning experiment was performed.

Figure 20 shows the corridor space at the position (marked 7 in Figure 15) from which the first autonomous scanning experiment was performed.

Figure 20. Position (marked 7 in Figure 15) in indoor space where first autonomous scanning experiment was performed.

Figure 21 presents the environmental data, captured image and scanned result for the position (marked 7 in Figure 15) from which the first autonomous scanning experiment was performed.


Figure 21. Environmental data result (left), captured image (middle), and scanned result (right) at position (marked 7 in Figure 15) from which first autonomous scanning experiment was performed.

The second and the third autonomous scanning experiments were carried out in the corridor shown in Figure 22.

Figure 22. Indoor space (corridor) wherein second and third autonomous scanning experiments were performed (panoramic view).

Figure 23 shows the 2D drawing result obtained by integrating the scan data collected by the autonomous scanner. Each number represents the point moved to by autonomous driving from the starting position of the autonomous scanner.



Figure 23. Integrated image of scanned result in second autonomous scanning experiment. The numbers indicate the positions after movement from the starting position of the autonomous scanner (left); also, the trajectory of the autonomous scanner is indicated (right).


In the autonomous operation mode, the mobile robot moves according to the autonomous scanning algorithm and collects indoor spatial data, image data, temperature, humidity and particulate matter data at each moving position. The gateway stores collected data from each sensor. The initial position in the indoor space for the second autonomous scanning experiment is shown in Figure 24.
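To make this data flow concrete, the following sketch outlines one collection cycle at a single stopping position. The gateway address, the JSON-over-TCP payload format, and the placeholder sensor readers are illustrative assumptions for this sketch, not the prototype's actual code.

# Illustrative sketch of one cycle of the autonomous operation mode: at each
# stopping position the scanner collects lidar, image, and environmental data
# and hands the record to the gateway for storage. The sensor helpers below are
# placeholders returning dummy values; the prototype reads the RPLidar A2,
# Raspberry Pi Camera, DHT22, and PMS7003 instead.
import json
import socket
import time

GATEWAY_ADDR = ("192.168.0.10", 5000)  # assumed gateway IP address and port


def read_lidar():
    return [(127.27, 1068.25), (128.98, 1079.75)]  # (angle, distance) placeholders


def read_environment():
    return {"temperature": 19.4, "humidity": 36.9, "pm1": 17, "pm25": 20, "pm10": 22}


def capture_image_path():
    return "/tmp/scan_0001.jpg"  # placeholder path to the captured camera frame


def scan_cycle(position_id):
    """Collect all sensor data at one position and send it to the gateway."""
    payload = {
        "position": position_id,
        "timestamp": time.time(),
        "lidar": read_lidar(),
        "image": capture_image_path(),
        "environment": read_environment(),
    }
    # The gateway stores the record and makes it available to the user terminal.
    with socket.create_connection(GATEWAY_ADDR, timeout=5) as sock:
        sock.sendall(json.dumps(payload).encode("utf-8"))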

Figure 24. Initial position in indoor space where second autonomous scanning experiment was performed. Front view of autonomous scanner (left); backward view of autonomous scanner (right).


Figure 25 presents the environmental data and captured image for the indoor corridor along with the scanned result at the initial position in the second autonomous scanning experiment. The user could check temperature/humidity, particulate matter data, the captured image and the scanned image immediately via smartphone. The indoor spatial and environmental data was also maintained in the gateway.

Figure 25. Environmental data result (left), captured image (middle), and scanned result (right) at initial position from which second autonomous scanning experiment was performed.

Figure 26 shows the corridor space at the position (marked 6 in Figure 23) where the second autonomous scanning experiment was performed.

Figure 26. Indoor space at position (marked 6 in Figure 23) from which second autonomous scanning experiment was performed.

Figure 27 presents the environmental data, the captured image and the scanned result at position (marked 6 in Figure 23) from which the second autonomous scanning experiment was performed.






Figure 27. Environmental data result (left), captured image (middle), and scanned result (right) at position (marked 6 in Figure 23) from which second autonomous scanning experiment was performed.

Figure 28 shows the corridor space and scanned result at the position (marked 12 in Figure 23) from which the second autonomous scanning experiment was performed.

Figure 28. Indoor space (left) and scanned result (right) at position (marked 12 in Figure 23) from which second autonomous scanning experiment was performed.

The third autonomous scanning experiment was carried out in the corridor shown in Figure 22 again. Figure 29 shows the 2D drawing result obtained by integrating the scan data collected by the autonomous scanner. Each number represents a point moved to by autonomous driving from the starting position of the autonomous scanner. As can be seen in the results, the movement path of the robot is different even though the second and the third experiments were performed in the same indoor space. The direction for the next destination is derived dynamically based on the scanned data in the proposed autonomous driving algorithm.
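To illustrate how such a direction can be derived from a single sweep, the sketch below groups the (angle, distance) samples into coarse sectors and heads toward the sector with the largest clearance. The sector width and clearance threshold are assumed values, and this rule is a simplified stand-in for the proposed algorithm, not its actual implementation.

# Simplified illustration of deriving the next movement direction from one
# lidar sweep: keep the nearest obstacle distance per angular sector and head
# toward the sector whose nearest obstacle is farthest away, provided it
# exceeds a minimum clearance. Thresholds and sector width are assumptions.
import math

MIN_CLEARANCE_MM = 800   # assumed minimum free distance to accept a heading
SECTOR_WIDTH_DEG = 30    # assumed angular width of each candidate sector


def next_heading(scan):
    """scan: list of (angle_deg, distance_mm) samples from one lidar rotation."""
    sectors = {}
    for angle, distance in scan:
        sector = int(angle // SECTOR_WIDTH_DEG)
        # keep the smallest distance per sector: the nearest wall or obstacle
        sectors[sector] = min(distance, sectors.get(sector, math.inf))
    best_sector, clearance = max(sectors.items(), key=lambda item: item[1])
    if clearance < MIN_CLEARANCE_MM:
        return None  # no safe direction; stop or rotate in place instead
    return best_sector * SECTOR_WIDTH_DEG + SECTOR_WIDTH_DEG / 2  # sector center

# Example with a few of the Table 3 samples (all fall into one sector here):
# next_heading([(127.27, 1068.25), (135.72, 1127.75), (142.42, 1207.25)])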






Figure 29. Integrated image of scanned result in third autonomous scanning experiment. The numbers indicate the positions after movement from the starting position of the autonomous scanner (left); also, the trajectory of the autonomous scanner is indicated (right).

Figure 30 shows the position (marked 2 in Figure 29) in the corridor space where the third autonomous scanning experiment was performed.

Figure 30. Position (marked 2 in Figure 29) in indoor space where third autonomous scanning experiment was performed.

Figure 31 presents the environmental data and captured image for the indoor corridor along with the scanned result at the position (marked 2 in Figure 29). The user could check temperature/humidity, particulate matter data, the captured image and the scanned image immediately via smartphone. The indoor spatial and environmental data was also maintained in the gateway.





Figure 31. Environmental data result (left), captured image (middle), and scanned result (right) at position (marked 2 in Figure 29) from which third autonomous scanning experiment was performed.

Figure 32 shows the position (marked 5 in Figure 29) in the corridor space where the third autonomous scanning experiment was performed.

Figure 32. Position (marked 5 in Figure 29) in indoor space where third autonomous scanning experiment was performed.

Figure 33 presents the environmental data, captured image, and scanned result at the position (marked 5 in Figure 29) from which the third autonomous scanning experiment was performed.





Figure 33. Environmental data result (left), captured image (middle), and scanned result (right) at position (marked 5 in Figure 29) from which third autonomous scanning experiment was performed.

An additional autonomous scanning experiment was carried out in the corridor with an obstacle; the 2D drawing result obtained by integrating the scan data collected by the autonomous scanner is shown in Figure 34. Each number represents the point moved to by autonomous driving from the starting position of the autonomous scanner. As described in the algorithm, the direction and distance to the next destination were extracted in consideration of not only the walls but also other obstacles.

Figure 34. Indoor space (corridor with an obstacle) wherein autonomous scanning experiment was performed (left) and integrated image of scanned result in experiment. The numbers indicate the positions after movement from the starting position of the autonomous scanner (middle); also, the trajectory of the autonomous scanner is indicated (right).

4.3. Results and Discussion




As in the above experiment, we selected an indoor space and performed experiments on manual scanning and autonomous scanning in order to verify the operation of the proposed system. Table 2 shows portions of the temperature, humidity and particulate matter data collected in the previous autonomous scanning experiment.

Table 2. Temperature, humidity and particulate matter data collected in experiment.

Temperature (°C)   Humidity (%)   PM 1.0   PM 2.5   PM 10.0
19.4               36.9           17       20       22
19.2               37.6           17       22       23
19.1               37.6           19       22       23
19.1               37.8           17       21       21
19.1               37.7           16       25       21

Table 3 shows partial data for angle and distance obtained from the RPLidar A2 lidar sensor. Such data is provided to the user by converting it to a scanned image in the form of a two-dimensional drawing and is also used to calculate the movement direction and distance of the iRobot when it is operating in the autonomous mode.

Table 3. Lidar sensor data collected in experiment.

Angle (degrees)   Distance (mm)   Angle (degrees)   Distance (mm)
127.265625        1068.25         135.718750        1127.75
128.984375        1079.75         137.421875        1146.50
130.734375        1088.75         139.078125        1162.75
132.343750        1100.75         140.734375        1182.75
134.031250        1116.25         142.421875        1207.25

Figure 35 shows a Python script for generation of a two-dimensional drawing using the angle and distance data obtained from the RPLidar A2 lidar sensor. The data was divided into four cases according to the angle value. In each case, the angle was converted to radians, and the X and Y coordinates were generated from the distance and the radian value. The two-dimensional drawing was generated by plotting those X and Y coordinates.
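Because the script itself appears only as an image in Figure 35, the sketch below is a reconstruction of the described procedure: it splits the samples into four cases by angle, converts each angle to radians, and plots the resulting X and Y coordinates. The exact case boundaries, the axis orientation, and the use of matplotlib are assumptions of this reconstruction, not the authors' original code.

# Reconstruction (not the exact Figure 35 script) of the polar-to-Cartesian
# conversion: each (angle, distance) sample is assigned to one of four cases
# by its angle, converted to radians, and mapped to X/Y coordinates that are
# then plotted as the two-dimensional drawing.
import math
import matplotlib.pyplot as plt


def to_xy(angle_deg, distance):
    if angle_deg < 90:            # case 1
        rad = math.radians(angle_deg)
        return distance * math.sin(rad), distance * math.cos(rad)
    elif angle_deg < 180:         # case 2
        rad = math.radians(angle_deg - 90)
        return distance * math.cos(rad), -distance * math.sin(rad)
    elif angle_deg < 270:         # case 3
        rad = math.radians(angle_deg - 180)
        return -distance * math.sin(rad), -distance * math.cos(rad)
    else:                         # case 4
        rad = math.radians(angle_deg - 270)
        return -distance * math.cos(rad), distance * math.sin(rad)


def draw_scan(scan):
    """scan: list of (angle_deg, distance_mm) samples from the RPLidar A2."""
    xs, ys = zip(*(to_xy(a, d) for a, d in scan))
    plt.scatter(xs, ys, s=2)
    plt.axis("equal")
    plt.show()

# Example with the first samples of Table 3:
# draw_scan([(127.265625, 1068.25), (128.984375, 1079.75), (130.734375, 1088.75)])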








Figure 35. Python code for generation of two-dimensional drawing using angle and distance data obtained from RPLidar A2 lidar sensor.

As interest in data on indoor spaces has increased, research on scanning and positioning methods for indoor space information collection and location recognition has been actively carried out, and relevant systems have been proposed. Such systems incorporate indoor mapping and modeling, positioning, and platform-related technologies. In addition, smart home or building infrastructures integrating IoT technology based on sensors and actuators for monitoring and control of indoor spaces have been proposed. In this paper, we consider a system that can collect indoor data autonomously for real-time monitoring and control of indoor spaces where there is no infrastructure for that purpose. The data include indoor spatial data as well as environmental data.

In the design and implementation of the proposed system, we utilized prototyping platforms and performed experiments in order to verify the operation of the system. As shown in the results, the proposed system functions properly: it provides temperature, humidity, particulate matter, and image data as well as spatial data to the user in real time, and it can drive in an indoor space while avoiding obstacles in the autonomous mode. The mobile robot and each sensor are connected directly to the gateway. The user terminal exchanges control commands and sensor data with the gateway via TCP/IP communication. In the proposed system, the data from each sensor is stored in the gateway; therefore, in cases where the communication environment is inadequate, the data can be checked against the stored records in addition to real-time checking. The iRobot Create 2 used as a prototyping tool has a battery that runs for 2 h when fully charged, so the prototype has limitations in operating in more complex environments such as disaster areas. It is expected that if the prototype's robot and sensor functions and performance are improved, the system will find use in various applications.
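As a minimal sketch of this exchange, the gateway below answers TCP requests from the user terminal with its most recently stored sensor record and acknowledges control commands. The port number, the line-based command format, and the JSON reply are assumptions made for illustration; they are not the protocol actually used by the prototype.

# Minimal sketch of the gateway side of the TCP/IP exchange: the user terminal
# connects, sends a one-line command, and receives either the latest stored
# record ("LATEST") or an acknowledgement for a control command that would be
# forwarded to the mobile robot. Port and message format are assumed.
import json
import socketserver

latest_record = {"position": 0, "temperature": 19.4, "humidity": 36.9}  # updated by the scanner


class GatewayHandler(socketserver.StreamRequestHandler):
    def handle(self):
        command = self.rfile.readline().decode("utf-8").strip()
        if command == "LATEST":
            # real-time check from the smartphone application
            self.wfile.write(json.dumps(latest_record).encode("utf-8"))
        else:
            # e.g., a movement command to forward to the autonomous scanner (omitted)
            self.wfile.write(b"ACK " + command.encode("utf-8"))


if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 5000), GatewayHandler) as server:
        server.serve_forever()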

5. Conclusions

People spend most of their time indoors, and the indoor environment influences their health, wellbeing, and productivity. Accordingly, the demand for location-based services that can process space and location information, as well as for smart indoor environments with an emphasis on safe, healthy, comfortable, affordable, and sustainable living, is increasing. In order to meet the demand for indoor applications, first, techniques for deriving indoor space information must be provided. Usually, schemes for deriving indoor maps and models are based on collected point cloud data. In addition, keeping indoor data up to date and checking it is problematic, although it is critical for supporting rapid intervention in emergency situations. Information about building layouts and the occupancy of indoor objects is a critical factor for efficient and safer emergency response in disaster management.

Second, the convergence of various new technologies, such as sensing and actuation, advanced control, and data analytics, has been proposed for smart indoor environments. Therefore, we consider a system that can collect indoor data autonomously for monitoring and control of indoor spaces in real time where there is no infrastructure for the purpose. The data include indoor spatial data as well as environmental data. We proposed herein an autonomous indoor scanning system that can be employed for acquisition of indoor environmental data, including temperature, humidity and particulate matter information, along with indoor imaging and spatial data for the purposes of indoor environment status awareness and indoor visualization. The system collects indoor data autonomously, monitoring and controlling indoor spaces in real time where there is no infrastructure for such purposes. For the design and implementation of the proposed system, we utilized prototyping platforms including iRobot Create 2 and Raspberry Pi 3, along with the RPLidar A2 lidar sensor, the Raspberry Pi Camera V2, the DHT22 temperature/humidity sensor, and the Plantower PMS7003 particulate matter sensor. A smartphone was used as the user terminal for monitoring and control of the system, to which end we implemented a mobile application.

The results of our implementation and experimentation indicated proper functioning of the proposed system. It provides temperature, humidity, particulate matter, and image data as well as spatial data to the user in real time, and it can drive in indoor spaces while avoiding obstacles in the autonomous mode. In addition, in the proposed system, the data from each sensor is stored in the gateway; therefore, in cases where the communication environment is inadequate, the data can be checked against the stored records in addition to real-time checking. In the process of integrating the scan data, however, we found that a calibration scheme is needed. This issue will be held over for future research. The proposed system can be applied in facilities without infrastructure for indoor data collection, such as for routine indoor data collection purposes, and it can also be used for immediate indoor data collection in cases of emergency. Based on the proposed prototype, the system has limitations in operating in more complex environments such as disaster areas. It is expected that if the prototype's robot and sensor functions and performance are improved, the system will find use in various applications. In future work, we will improve system performance by means of an advanced lidar sensor and mobile robot, and we will improve system functionality by installing additional sensors. Additionally, a correction method for improved accuracy of the scanned data and an upgrade of the autonomous-driving algorithm are required.

Author Contributions: S.H. prepared the paper structure, surveyed the related research, designed the proposed idea and research, and wrote the paper; D.P. implemented the prototype of the proposed system and performed the experiments. Both authors have read and agreed to the published version of the manuscript.

Funding: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2017R1A2B4009167).

Acknowledgments: The first draft of this paper was presented at the 15th IEEE International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob 2019) [39], Barcelona, Spain, 21–23 October 2019.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Chen, Y.; Tang, J.; Jiang, C.; Zhu, L.; Lehtomäki, M.; Kaartinen, H.; Kaijaluoto, R.; Wang, Y.; Hyyppä, J.; Hyyppä, H.; et al. The Accuracy Comparison of Three Simultaneous Localization and Mapping (SLAM)-Based Indoor Mapping Technologies. Sensors 2018, 18, 3228. [CrossRef][PubMed]
2. Nikoohemat, S.; Diakité, A.A.; Zlatanova, S.; Vosselman, G. Indoor 3D reconstruction from point clouds for optimal routing in complex buildings to support disaster management. Autom. Constr. 2020, 113, 103109. [CrossRef]

3. Dakheel, J.A.; Pero, C.D.; Aste, N.; Leonforte, F. Smart buildings features and key performance indicators: A review. Sustain. Cities Soc. 2020, 61, 102328. [CrossRef] 4. Jia, R.; Jin, B.; Jin, M.; Zhou, Y.; Konstantakopoulos, I.C.; Zou, H.; Kim, J.; Li, D.; Gu, W.; Arghandeh, R.; et al. Design Automation for Smart Building Systems. Proc. IEEE 2018, 106, 1680–1699. [CrossRef] 5. Markom, M.A.; Adom, A.H.; Tan, E.S.M.M.; Shukor, S.A.A.; Rahim, N.A.; Shakaff, A.Y.M. A mapping mobile robot using RP Lidar scanner. In Proceedings of the 2015 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS), Langkawi, Malaysia, 18–20 October 2015. 6. Markom, M.A.; Shukor, S.A.A.; Adom, A.H.; Tan, E.S.M.M.; Shakaff, A.Y.M. Indoor Scanning and Mapping using Mobile Robot and RP Lidar. Int’l J. Adv. Mech. Autom. Engg. (IJAMAE) 2016, 3, 42–47. 7. Adán, A.; Quintana, B.; Prieto, S.A. Autonomous Mobile Scanning Systems for the Digitization of Buildings: A Review. Remote Sens. 2019, 11, 306. [CrossRef] 8. Díaz-Vilariño, L.; Khoshelham, K.; Martínez-Sánchez, J.; Arias, P. 3D modeling of building indoor spaces and closed doors from imagery and point clouds. Sensors 2015, 15, 3491–3512. [CrossRef] 9. Lehtola, V.V.; Kaartinen, H.; Nüchter, A.; Kaijaluoto, R.; Kukko, A.; Litkey, P.; Honkavaara, E.; Rosnell, T.; Vaaja, M.T.; Virtanen, J.-P.; et al. Comparison of the Selected State-Of-The-Art 3D Indoor Scanning and Point Cloud Generation Methods. Remote Sens. 2017, 9, 796. [CrossRef] 10. Masiero, A.; Fissore, F.; Guarnieri, A.; Pirotti, F.; Visintini, D.; Vettore, A. Performance Evaluation of Two Indoor Mapping Systems: Low-Cost UWB-Aided Photogrammetry and Backpack Laser Scanning. Appl. Sci. 2018, 8, 416. [CrossRef] 11. Oh, T.; Lee, D.; Kim, H.; Myung, H. Graph structure-based simultaneous localization and mapping using a hybrid method of 2D laser scan and monocular camera image in environments with laser scan ambiguity. Sensors 2015, 15, 15830–15852. [CrossRef] 12. Yassin, A.; Nasser, Y.; Awad, M.; Al-Dubai, A.; Liu, R.; Yuen, C.; Raulefs, R.; Aboutanios, E. Recent Advances in Indoor Localization: A Survey on Theoretical Approaches and Applications. IEEE Commun. Surv. Tutor. 2017, 19, 1327–1346. [CrossRef] 13. Zafari, F.; Gkelias, A.; Leung, K.K. A Survey of Indoor Localization Systems and Technologies. IEEE Commun. Surv. Tutor. 2019, 21, 2568–2599. [CrossRef] 14. Chow, J.C.K.; Peter, M.; Scaioni, M.; Al-Durgham, M. Indoor Tracking, Mapping, and Navigation: Algorithms, Technologies, and Applications. J. Sens. 2018, 2018, 3. [CrossRef] 15. Huh, J.-H.; Seo, K. An Indoor Location-Based Control System Using Bluetooth Beacons for IoT Systems. Sensors 2017, 17, 2917. [CrossRef] 16. Huh, J.-H.; Bu, Y.; Seo, K. Bluetooth-tracing RSSI sampling method as basic technology of indoor localization for smart homes. Int. J. Smart Home 2016, 10, 9–22. [CrossRef] 17. Caso, G.; de Nardis, L.; di Benedetto, M.-G. A mixed approach to similarity metric selection in affinity propagation based WiFi fingerprinting indoor positioning. Sensors 2015, 15, 27692–27720. [CrossRef] [PubMed] 18. Castañón–Puga, M.; Salazar, A.; Aguilar, L.; Gaxiola-Pacheco, C.; Licea, G. A novel hybrid intelligent indoor location method for mobile devices by zones using Wi-Fi signals. Sensors 2015, 15, 30142–30164. [CrossRef] [PubMed] 19. Ma, L.; Xu, Y. Received signal strength recovery in green WLAN indoor positioning system using singular value thresholding. Sensors 2015, 15, 1292–1311. [CrossRef] 20. Ma, R.; Guo, Q.; Hu, C.; Xue, J. 
An improved WiFi indoor positioning algorithm by weighted fusion. Sensors 2015, 15, 21824–21843. [CrossRef] 21. Zhou, M.; Zhang, Q.; Xu, K.; Tian, Z.; Wang, Y.; He, W. Primal: Page rank-based indoor mapping and localization using gene-sequenced unlabeled WLAN received signal strength. Sensors 2015, 15, 24791–24817. [CrossRef] 22. Zou, H.; Lu, X.; Jiang, H.; Xie, L. A fast and precise indoor localization algorithm based on an online sequential extreme learning machine. Sensors 2015, 15, 1804–1824. [CrossRef][PubMed] 23. Nguyen, V.-H.; Pyun, J.-Y. Location detection and tracking of moving targets by a 2D IR-UWB radar system. Sensors 2015, 15, 6740–6762. [CrossRef][PubMed] 24. Yin, Z.; Cui, K.; Wu, Z.; Yin, L. Entropy-based TOA estimation and SVM-based ranging error mitigation in UWB ranging systems. Sensors 2015, 15, 11701–11724. [CrossRef][PubMed]

25. Abbas, M.; Elhamshary, M.; Rizk, H.; Torki, M.; Youssef, M. WiDeep: WiFi-based Accurate and Robust Indoor Localization System using Deep Learning. In Proceedings of the IEEE International Conference on Pervasive Computing and Communications (PerCom 2019), Kyoto, Japan, 11–15 March 2019. 26. Wang, F.; Feng, J.; Zhao, Y.; Zhang, X.; Zhang, S.; Han, J. Joint Activity Recognition and Indoor Localization with WiFi Fingerprints. IEEE Access 2019, 7, 80058–80068. [CrossRef] 27. Hoang, M.-T.; Yuen, B.; Dong, X.; Lu, T.; Westendorp, R.; Reddy, K. Recurrent Neural Networks for Accurate RSSI Indoor Localization. IEEE Internet Things J. 2019, 6, 10639–10651. [CrossRef] 28. Kim, M.-C.; Jang, M.-K.; Hong, S.-M.; Kim, J.-H. Practices on BIM-based indoor spatial information implementation and location-based services. J. KIBIM 2015, 5, 41–50. [CrossRef] 29. Cheng, X.; Yang, B.; Olofsson, T.; Liu, G.; Li, H. A pilot study of online non-invasive measuring technology based on video magnification to determine skin temperature. Build. Environ. 2017, 121, 1–10. [CrossRef] 30. Yang, B.; Cheng, X.; Dai, D.; Olofsson, T.; Li, H.; Meier, A. Real-time and contactless measurements of thermal discomfort based on human poses for energy efficient control of buildings. Build. Environ. 2019, 162, 106284. [CrossRef] 31. Cheng, X.; Yang, B.; Hedman, A.; Olofsson, T.; Li, H.; Gool, L.V. NIDL: A pilot study of contactless measurement of skin temperature for intelligent building. Energy Build. 2019, 198, 340–352. [CrossRef] 32. Martinez-Sala, A.; Losilla, F.; Sánchez-Aarnoutse, J.; García-Haro, J. Design, implementation and evaluation of an indoor navigation system for visually impaired people. Sensors 2015, 15, 32168–32187. [CrossRef] 33. Plageras, A.P.; Psannis, K.E.; Stergiou, C.; Wang, H.; Gupta, B.B. Efficient IoT-based sensor BIG Data collection–processing and analysis in smart buildings. Future Gener. Comput. Syst. 2018, 82, 349–357. [CrossRef] 34. Yang, C.-T.; Chen, S.-T.; Den, W.; Wang, Y.-T.; Kristiani, E. Implementation of an Intelligent Indoor Environmental Monitoring and management system in cloud. Future Gener. Comput. Syst. 2019, 96, 731–749. [CrossRef] 35. Mujan, I.; Anđelković, A.S.; Muncan, V.; Kljajic, M.; Ruzic, D. Influence of indoor environmental quality on human health and productivity—A review. J. Clean. Prod. 2019, 217, 646–657. [CrossRef] 36. Dong, B.; Prakash, V.; Feng, F.; O'Neill, Z. A review of smart building sensing system for better indoor environment control. Energy Build. 2019, 199, 29–46. [CrossRef] 37. iRobot® Create® 2 Open Interface (OI) Specification Based on the iRobot® Roomba® 600. Available online: https://www.irobotweb.com/-/media/MainSite/Files/About/STEM/Create/2018-07-19_iRobot_Roomba_600_Open_Interface_Spec.pdf (accessed on 10 September 2018). 38. RPLidar A2 Development Kit User Manual. Shanghai Slamtec. Co. Ltd. Available online: http://bucket.download.slamtec.com/a7a9b856b9f8e57aad717da50a2878d5d021e85f/LM204_SLAMTEC_rplidarkit_usermanual_A2M4_v1.1_en.pdf (accessed on 11 September 2018). 39. Kim, H.; Shin, B.; Lee, Y.; Hwang, S. Design and Implementation of Mobile Indoor Scanning System. In Proceedings of the 15th IEEE International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob 2019), Barcelona, Spain, 21–23 October 2019.

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).