
Assistant Personal Robot (APR): Conception and Application of a Tele-Operated Assisted Living Robot

Eduard Clotet, Dani Martínez, Javier Moreno, Marcel Tresanchez and Jordi Palacín *

Department of Computer Science and Industrial Engineering, Universitat de Lleida, Jaume II, 69, 25001 Lleida, Spain; [email protected] (E.C.); [email protected] (D.M.); [email protected] (J.M.); [email protected] (M.T.) * Correspondence: [email protected]; Tel.: +34-973-702-760; Fax: +34-973-702-702

Academic Editor: Gonzalo Pajares Martinsanz Received: 31 December 2015; Accepted: 24 April 2016; Published: 28 April 2016

Abstract: This paper presents the technical description, mechanical design, electronic components, software implementation and possible applications of a tele-operated robot designed as an assisted living tool. This robotic concept has been named Assistant Personal Robot (or APR for short) and has been designed as a remotely telecontrolled robotic platform built to provide social and assistive services to elderly people and those with impaired mobility. The APR features a fast high-mobility motion system adapted for tele-operation in plain indoor areas, which incorporates a high-priority collision avoidance procedure. This paper presents the mechanical architecture, electrical fundamentals and software implementation required in order to develop the main functionalities of an assistive robot. The APR uses a tablet in order to implement the basic peer-to-peer videoconference and tele-operation control combined with a tactile graphic user interface. The paper also presents the development of some applications proposed in the framework of an assisted living robot.

Keywords: Assistant Personal Robot; telecontrol; mobile robot; peer-to-peer video; assisted living

1. Introduction

The Assistant Personal Robot (or APR) is proposed as a remotely telecontrolled mobile robotic platform with videoconference capabilities. The APR is designed to provide telepresence services [1] primarily for the elderly and those with mobility impairments. Reports from diverse institutions, such as the United Nations [2] and the World Health Organization [3], project that the proportion of people aged 60 or over will rise from 12% to 21% during the next 35 years as a result of a clear increase in human life expectancy. According to these predictions, elderly care is one of the fields where new technologies have to be applied in the form of passive monitoring systems [4,5] and also as autonomous and tele-operated robots. Such robots can be used to provide assistance to the elderly and people with mobility problems in homes and institutions [6]. The development of assistive robots can provide benefits through the development of optimized solutions for some specific problems that appear in typical household domains [7,8]. A review of the technical description of some mobile robots proposed for telepresence applications is available in [8]. As examples of these applications: in [9] a robotic assistant was used in order to help the residents of a geriatric institution perform their daily routine; in [10] an assistive robot was proposed to help elderly people with mild cognitive impairments; in [7] a telepresence robot showed positive results in domestic elder care assistance; in [11] a telepresence robot was used to improve communication and interaction between people with dementia and their relatives; finally, in [12] a home care monitoring system was proposed, based on a fixed sensor network infrastructure and some physiological sensors, offering the possibility of a virtual visit via a tele-operated mobile robot.

Sensors 2016, 16, 610; doi:10.3390/s16050610 www.mdpi.com/journal/sensors

In the context of this paper, a typical example of the application of a household robot is the development of functionalities focused on detecting human inactivity, falls or other accidental situations. In this case, one of the goals is the reduction of the reaction time for external assistance. This aspect is very important because approximately one third of the population aged over 65 falls every year [13] and, in most cases, they are unable to recover by their own means or to ask for assistance [14]. The consequences for elderly people are not limited to physical injuries because a loss of self-confidence also affects mobility drastically [15]. Therefore, telecontrolled mobile robots can be used for social interaction by decreasing isolation and promoting positive emotions and psychological reinforcement.

The APR is designed to provide remote general assistance and help to elderly people and those with impaired mobility in conventional unstructured environments. Figure 1 shows an image of the APR next to some volunteers to indicate its size. Mobility is provided by omnidirectional wheels that can move the APR in any direction without performing intermediate trajectory maneuvers. The head of the APR is based on an Android tablet, can be rotated horizontally and vertically, and implements the videoconference capabilities for telecontrol and telepresence. The APR includes an onboard Light Detection and Ranging (LIDAR) sensor which detects the distance to the surrounding objects on a two-dimensional plane by measuring the time-of-flight of a rotating laser beam. The main electronic board of the APR uses the information of the LIDAR sensors directly to implement a high-priority collision avoidance procedure which is able to stop the mobile robot. The APR also has two mobile arms with one degree of freedom which can be rotated clockwise and counterclockwise through 360° and placed at any desired angular orientation. The arms can be used for gestural interaction or as a soft handle for walking support.
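The paper describes the high-priority collision avoidance procedure only at this level of detail; as an illustration, it can be sketched as a simple guard over the raw LIDAR ranges that overrides any velocity command. The function names and the 0.35 m threshold below are illustrative assumptions, not values taken from the APR firmware.

```python
def collision_stop(ranges_m, stop_distance_m=0.35):
    """Return True if any valid LIDAR range is closer than the stop distance.

    ranges_m: distances (metres) from one 2-D LIDAR sweep; invalid readings
    are often reported as 0.0 and are ignored here. The 0.35 m threshold is
    an illustrative value, not the APR's actual one.
    """
    return any(0.0 < r < stop_distance_m for r in ranges_m)

def command_velocity(requested_v, ranges_m):
    """High-priority guard: zero the velocity command when an obstacle is near."""
    if collision_stop(ranges_m):
        return (0.0, 0.0, 0.0)  # stop translation and rotation
    return requested_v
```

Because the guard runs on the main electronic board directly from the sensor data, it can stop the robot even if the remote tele-operation link is delayed.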

Figure 1. Comparative image of the APR.

The new contribution of this paper is the complete technical description of a tele-operated mobile robot designed for assisted living. Comparing the capabilities of the APR with the 16 commercial and prototype mobile robots reviewed in [8], the APR includes common devices and functionalities available in other mobile robots designed for telepresence. The APR is not designed with adjustable height, a capability included in four of the 16 robots reviewed. The APR includes an


omnidirectional motion system, which is included in two of the 16 robots reviewed. The APR includes ambient sensors, a capability not included in any of the mobile robots reviewed. Finally, the APR includes soft arms that can be used as a support for walking, a feature which is not usually available in telepresence mobile robots.

2. Mechanical Design

The mechanical design of the APR is steered by the aim to provide personal assistance services in households or institutions without interfering with the inhabitants. The overall weight of the APR is 30 kg. All the heavy elements in the APR are in the base, close to ground level to ensure a lower center of gravity and stable displacement. The definitive height of the APR was determined by performing an ergonomic test with volunteers of different ages. In this test, the volunteers agreed on a robot height of 1.70 m to ensure a good view of the screen included in the head of the APR. The width was limited to 40 cm to simplify the remote telecontrol of the mobile robot when passing through doorways. The head of the APR contains a panoramic screen that can be moved with two degrees of freedom (up/down and left/right). The torso of the APR has two shoulders with one degree of freedom in order to move the arms forward and backward. The motion of the APR is accomplished with three omnidirectional wheels. The physical design of the APR is inspired by, and includes several resemblances to, humans in order to operate, maneuver and move the head and the arms in a similar way in a typical household or institutional scenario. In [1], a low velocity was considered a drawback when moving long distances with a telecontrolled mobile robot. This drawback could potentially deter people from utilizing this kind of robot. The APR has a maximum forward velocity of 1.3 m/s, which is comparable with the speed of a person and considered adequate for long indoor displacements, such as along a corridor.

The mechanical structure of the base of the APR (Figure 2a) is made of stainless steel for durability and resistance. This mechanical structure supports three battery shafts for three 12 Ah sealed lead acid batteries, three P205 DC motors from Micromotor (Micromotor, Verderio, Italy) and a sloping surface to hold the LIDAR sensors, the main electronic control board and the battery charger system. The APR has three omnidirectional wheels (Figures 1 and 2b) shifted 120° and attached to the motors by conical mechanical connectors. This motion system maintains the engine axle, provides proper torque transmission, and simplifies the assembly of other internal components.


Figure 2. CAD design of the base of the APR: (a) mechanical structure and (b) plastic cover.


The omnidirectional wheels contain multiple rollers on the external edge of each wheel, creating a holonomic mobile platform where each wheel has two degrees of freedom, allowing wheel spin and perpendicular displacements from the wheel's forward direction and thus direct displacement in any desired direction (Figure 3). The external white cover of the base is made of ABS plastic (Figure 2b) using a fast prototyping 3D printer. This provides a cheap method of fabrication, a smooth-finished appearance and easy replacement of damaged parts. The circular outline of the base of the robot provides an external space-efficient shape and the absence of protruding and potentially harmful elements. This circular design also minimizes the probability of it becoming accidentally hooked on such furnishings as mats, curtains or clothing.
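The direct-motion capability of three omnidirectional wheels shifted 120° can be made concrete with the standard inverse-kinematics relation for a three-wheel holonomic base. The wheel angles, base radius and function name below are illustrative assumptions, not the APR's measured geometry.

```python
import math

def wheel_speeds(vx, vy, omega, base_radius=0.20,
                 wheel_angles_deg=(90.0, 210.0, 330.0)):
    """Inverse kinematics of a three-wheel omnidirectional base (a sketch).

    vx, vy: desired body velocity (m/s); omega: rotation rate (rad/s).
    Each wheel's rollers let it slide sideways, so each motor only has to
    produce the component of body motion along its wheel's drive direction:
        v_i = -sin(a_i) * vx + cos(a_i) * vy + base_radius * omega
    The 120°-spaced angles and 0.20 m base radius are illustrative values.
    """
    speeds = []
    for a_deg in wheel_angles_deg:
        a = math.radians(a_deg)
        speeds.append(-math.sin(a) * vx + math.cos(a) * vy + base_radius * omega)
    return speeds
```

With this relation any combination of translation and rotation maps to three wheel speeds, which is why no intermediate trajectory maneuvers are needed.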

Figure 3. Representation of the direct motion capabilities of the APR.

The chest and shoulders of the APR are located at a height of approximately 1.3 m, slightly lower than the average for human shoulders. This position was chosen to enable elderly people to hold the arms of the robot directly. The shoulders of the APR contain two heavy Micromotor DC motors, which are connected to two soft arms with a 35 cm separation between them. The arms are 55 cm long, just for aesthetic reasons, and can be used as a support by elderly people when walking or for gesture interaction. The arms are moved periodically to mimic the natural movements of humans when the APR moves forward.

The head of the APR (Figure 4) is designed as the main interface between the tele-operator, the mobile robot and the user. The head is based on the Cheesecake 10.1" XL QUAD tablet (Approx Iberia, Gelves, Spain). This has four processors, a ten-finger touch interface, network connectivity, and multimedia capabilities in order to create and reproduce audio and video in videoconference communication between the APR and a remote tele-operator. The head of the APR has two small Micromotor DC motors (BS138F-2s.12.608) that provide two degrees of freedom.
The head can rotate 120° (60° to each side) and also tilt from 0° to 90°: 0° is with the camera pointing at the ground to offer a zenithal view of people lying down and the contour of the mobile robot, while 90° is the case with the camera pointed to the front. The APR is complemented with a low-cost 180° fish eye lens magnetically attached to the camera of the tablet that can provide a panoramic view of the surroundings.
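A telecontrol layer would typically clamp requested head poses to these mechanical limits before driving the motors. The following minimal sketch uses the ranges quoted above (±60° pan, 0° to 90° tilt); the function names are hypothetical.

```python
def clamp(value, low, high):
    """Constrain value to the closed interval [low, high]."""
    return max(low, min(high, value))

def head_command(pan_deg, tilt_deg):
    """Clamp a requested head pose to the APR's stated mechanical limits:
    pan ±60° (120° total sweep) and tilt 0° (camera down) to 90° (camera
    forward). Names and signature are illustrative, not from the APR code.
    """
    return clamp(pan_deg, -60.0, 60.0), clamp(tilt_deg, 0.0, 90.0)
```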


Figure 4. Detail of the head of the APR: (a) lateral view, (b) frontal view.







3. Electronic Components

The electronic components of the APR are placed at the base and are composed of a motor control board (MCB) and three battery charger modules. The MCB (Figure 5) has an ARM (Cambridge, England) Cortex-M4 based Microcontroller Unit (MCU) with a STM32F407VGT6 processor from STMicroelectronics (Geneva, Switzerland), operating at 168 MHz. This MCU was selected because it is a low power device that is able to control up to seven DC motors with dual-channel encoders while including support for different serial interfaces. This MCU provides a Full Speed (12 Mb/s) USB 2.0 On-The-Go (OTG) wired interface with the tablet device located at the head of the APR. The motors are controlled by four H Bridges on a single Quad-Dual-Bridge Driver integrated circuit (TB6555FLG from Toshiba, Tokyo, Japan), two H Bridges to control the robot's arms (HIP4020IBZ from Intersil, Milpitas, CA, United States), and three H Bridges to control the motors that drive the omnidirectional wheels (VNH2SP30 from STMicroelectronics). The MCU implements all the specifications of the Android Accessory Development Kit (ADK) so as to be recognized by any Android tablet or smartphone through a Full-speed USB wired connection. The MCB communicates directly with different LIDAR sensors (such as the small and compact URG and UTM models from Hokuyo, Osaka, Japan), laser range sensors that provide a two-dimensional planar description of a predefined planar space around the robot. The URG model obtains a new raw data scan approximately every 100 milliseconds and provides an angle of view of 240° with a maximum operative range of 5.6 m and a maximum current consumption of 0.5 A at 5 V. The UTM model provides a higher sampling rate and operative range but, in this case, the maximum current consumption is 1.0 A at 12 V.
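As an illustration of how a URG-style sweep can be consumed, the sketch below converts one scan of equally spaced range samples into Cartesian points in the sensor frame, using the 240° field of view and 5.6 m maximum range quoted above. The sampling layout and function name are assumptions, not the Hokuyo driver API.

```python
import math

def scan_to_points(ranges_m, fov_deg=240.0, max_range_m=5.6):
    """Convert one 2-D LIDAR sweep to Cartesian (x, y) points (a sketch).

    ranges_m: equally spaced distance samples across the field of view,
    assumed to sweep symmetrically about the sensor's forward (x) axis.
    fov_deg and max_range_m default to the URG figures quoted in the text;
    samples outside (0, max_range_m] are treated as invalid and skipped.
    """
    n = len(ranges_m)
    points = []
    for i, r in enumerate(ranges_m):
        if not 0.0 < r <= max_range_m:
            continue
        # angle of sample i across the sweep, centred on the forward axis
        angle = math.radians(-fov_deg / 2.0 + fov_deg * i / (n - 1))
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```

A point set like this is the natural input for the collision avoidance guard and for presenting the robot's surroundings to the tele-operator.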

Figure 5. Schematic representation of the motor control board (MCB) of the APR.


Finally, the MCB is connected to three complementary battery charger boards that provide information about the state of the internal batteries, allowing remote battery management without user intervention. The batteries are charged when the APR is manually plugged in to recharge. Additionally, the tablet of the APR has its own internal battery that is always fully charged in normal APR operation. Thanks to this internal battery, the tablet (and hence the audio and video communication capabilities of the APR) remains active and available for calls for several hours after full APR discharge. Table 1 shows the description of the connectors on the MCB (Figure 5).

Table 1. Description of the MCB connectors.

TAG    Description                                  TAG     Description
CN1    Connector for battery 1.                     M6      Connector for the Head Pan motor.
CN2    Connector for battery 2.                     M7      Connector for the Head Tilt motor.
CN3    Connector for battery 3.                     M8      Unused.
M1     Connector for the front left wheel motor.    USB1    Micro-USB "On the Go" connector.
M2     Connector for the front right wheel motor.   LASER   Connector for a Hokuyo LIDAR device.
M3     Connector for the back wheel motor.          BMON1   Connector to monitor battery 1.
M4     Connector for the Left Arm motor.            BMON2   Connector to monitor battery 2.
M5     Connector for the Right Arm motor.           BMON3   Connector to monitor battery 3.

The base of the APR (Figure 6) has three high-capacity batteries and three battery charger boards based on a UC3906 Linear Lead-Acid Battery Charger combined with a MJD2955T4 PNP bipolar transistor that regulates the charging process. The three battery charger boards are complemented with a main AC/DC switching adapter (220 V AC/18 V DC) for battery charging and a secondary AC/DC switching adapter (220 V AC/12 V DC) for simultaneously powering the MCB during recharging of the internal batteries.

Figure 6. Batteries (B1, B2 and B3) and main DC motors (M1, M2 and M3) at the base of the APR.

Table 2 shows the electrical current provided by the batteries of the APR for a set of predefined robot motions. In this prototype, the batteries operate independently in order to power different parts of the APR directly. Battery 1 (Figure 6-B1) is exclusively dedicated to powering motor 1 (Figure 6-M1) and battery 2 (Figure 6-B2) to powering motor 2 (Figure 6-M2). These two motors are used exclusively for forward (and backward) displacement, which is usually performed at full speed and for long periods of time. Battery 3 (Figure 6-B3) is used to power motor 3 (Figure 6-M3), the MCB, the LIDAR sensors, and the tablet, which also has its own battery. Table 2 shows that the maximum energy consumption is reached when the APR goes forward or backward at full speed; the rotation and the transversal displacement of the APR are performed with the three motors in simultaneous operation. Most of the current provided by Battery 3 is dedicated to powering the LIDAR sensors. These can be electrically disconnected in order to reduce power consumption when the APR is in standby operation. The APR has been tested in periods of up to four hours but there is currently still no estimation of battery life as this will largely depend on usage.
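Although no battery-life figure is given, the Table 2 currents and the 12 Ah battery capacity quoted in Section 2 allow a naive upper-bound calculation, sketched below. It ignores lead-acid discharge effects (Peukert) and mixed usage, so it is only indicative; the function name is illustrative.

```python
def runtime_hours(capacity_ah, current_ma):
    """Naive runtime upper bound: capacity (Ah) divided by draw (A).
    Ignores lead-acid discharge behaviour, so real runtime is shorter.
    """
    return capacity_ah / (current_ma / 1000.0)

# Battery 3 feeds motor 3, the MCB and the LIDAR sensors; Table 2 gives
# 1202 mA when going forward and 480 mA in standby. With a 12 Ah battery:
forward = runtime_hours(12.0, 1202.0)   # roughly 10 h of continuous forward drive
standby = runtime_hours(12.0, 480.0)    # roughly 25 h in standby
```

These figures are consistent with the tested four-hour sessions having ample margin, while confirming that real battery life depends heavily on usage.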


Table 2. Average electrical current of the batteries when performing different predefined motions.

Motion               Battery 1 (mA)   Battery 2 (mA)   Battery 3 (mA)   Total (mA)
Standby              5                5                480              490
Stopped              5                5                890              900
Go Forward           700              803              1202             2705
Go Backward          934              951              880              2765
Rotate (left/right)  120              97               976              1193
Move right           176              194              1582             1952
Move left            154              181              1447             1782

4. Software Implementation

The APR is defined as a remotely operated platform that has to be compatible with environments designed for humans rather than for robots. The APR has to offer reliable remote control operation with a stable remote videoconference system in order to ensure its application as a tele-operated assisted-living robot. The transmission of audio, video and control orders is carried out by using direct peer-to-peer (P2P) or client-to-client communication to reduce delay in the remote control system.

4.1. Transmission of Motion Control Orders

The remote control of the APR is performed with an Android application (APP) running in a remote smartphone or tablet. Figure 7 shows an image of the remote control interface. This is based on the touch-screen capabilities of such mobile devices. The telecontrol APP is composed of an on-screen joystick that is used to control the position of the robot, two sliders to control the pan and tilt of the robot head, two sliders to control the robot arms (left and right arm), and an extra slider used to rotate the APR over its vertical axis. The telecontrol APP can be used with one or two fingers by trained and non-trained people. The remote control of the mobile robot is not a challenging task because the high-mobility capabilities of the APR do not usually require intermediate trajectory maneuvers. Future implementations of the telecontrol APP will include such additional features as direct control of the trajectory of the APR with small head displacements [16] in order to allow remote robot control by people with impaired mobility.

Figure 7. Android APP used for the remote control of the APR.


Table 3 shows the structure of the commands sent in a single seven-byte user datagram protocol (UDP) packet where each byte has a specific interpretation. The main on-screen joystick codifies the desired input motion orders in two bytes: robot angle (Angle) and movement velocity (Module). Other controls, such as the rotation of the APR, the orientation of the head, and the rotation of the arms, are coded and submitted simultaneously. The complete control of the mobile robot is performed with network messages containing only seven bytes.

Table 3. Structure of a UDP motor control packet.

ID     Head Pan   Head Tilt   Left Arm   Right Arm   Module   Angle   Rotation
Byte      0           1           2          3          4        5        6
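As an illustration of the seven-byte layout of Table 3, the following Java sketch packs the control values into a datagram and sends it in fire-and-forget fashion. The class and method names are ours (the APR telecontrol APP source is not published in the paper); only the byte order follows the table.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class ControlPacket {
    // Packs the seven control values into the byte layout of Table 3.
    static byte[] encode(int headPan, int headTilt, int leftArm, int rightArm,
                         int module, int angle, int rotation) {
        return new byte[] {
            (byte) headPan,   // byte 0: head pan
            (byte) headTilt,  // byte 1: head tilt
            (byte) leftArm,   // byte 2: left arm
            (byte) rightArm,  // byte 3: right arm
            (byte) module,    // byte 4: joystick module (velocity)
            (byte) angle,     // byte 5: joystick angle
            (byte) rotation   // byte 6: rotation over the vertical axis
        };
    }

    // Single-packet UDP send, with no dedicated channel, as described in Section 4.2.2.
    static void send(DatagramSocket socket, InetAddress host, int port,
                     byte[] payload) throws Exception {
        socket.send(new DatagramPacket(payload, payload.length, host, port));
    }
}
```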

The UDP protocol provides fast transmission rates at the expense of omitting error detection procedures and with a lack of knowledge about the status of the communication. The use of the UDP protocol in a real-time remote control system is justified since there is no interest in recovering lost or delayed packets containing outdated information. In this implementation, a control command is sent at least every 100 ms according to the operator’s tactile gestures. Finally, the APR automatically stops after a time-lapse of 300 ms with no new control commands as a way to avoid the execution of unsupervised movements. Currently, the APR automatically stops in case of network connection failure and remains inactive waiting for a new connection. In the future, this state will be complemented with the possibility of returning to the starting point or another predefined point.

The tablet located in the head of the APR receives the UDP control packets and generates specific low-level orders for the MCB through the Fast USB wired connection. Table 4 shows the basic instruction set orders interpreted by the MCB. There are specific motion orders (starting with MF and MB) that are used for long straight forward and backward displacements. These displacements require the application of an electric brake to the unused (and unpowered) M3 motor to prevent free rotation and the continuous rectification of the angle of the wheel.
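The 100 ms command cadence and the 300 ms stop timeout amount to a simple watchdog, sketched below. This is a minimal illustration under our own naming; the actual APR implementation is not detailed in the paper.

```java
// Safety watchdog: the robot keeps moving only while fresh control
// commands keep arriving within the 300 ms window described in the text.
public class CommandWatchdog {
    static final long TIMEOUT_MS = 300;
    private long lastCommandMs;

    CommandWatchdog(long nowMs) { lastCommandMs = nowMs; }

    // Called each time a 7-byte UDP control packet is received.
    void onCommandReceived(long nowMs) { lastCommandMs = nowMs; }

    // True when the motors must be stopped because no command arrived in time.
    boolean mustStop(long nowMs) { return nowMs - lastCommandMs > TIMEOUT_MS; }
}
```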

Table 4. Instruction set orders interpreted by the MCB.

Command      Parameters                Description
AL d°        d: from −45° to 45°       Moves the left arm to the specified position in degrees.
AR d°        d: from −45° to 45°       Moves the right arm to the specified position in degrees.
HP d°        d: from −60° to 60°       Rotates the head of the robot to the specified position in degrees.
HT d°        d: from 0° to 90°         Tilts the head of the robot to the specified position in degrees.
M S1 S2 S3   S1: PWM from 0 to 60%;    Fixes the Pulse Width Modulation (PWM) applied to motors M1, M2
             S2: PWM from 0 to 20%;    and M3 of the APR.
             S3: PWM from 0 to 20%
MF S         S: PWM from 0 to 60%      Applies electric braking to M3 and fixes the same PWM to M1 and M2
                                       to generate a forward displacement.
MB S         S: PWM from 0 to 60%      Applies electric braking to M3 and fixes the same PWM to M1 and M2
                                       to generate a backward displacement.
TL S         S: PWM from 0 to 20%      Fixes the same PWM to M1, M2 and M3 to rotate the robot to the left.
TR S         S: PWM from 0 to 20%      Fixes the same PWM to M1, M2 and M3 to rotate the robot to the right.
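For illustration, the textual orders of Table 4 can be built with each parameter clamped to its documented range. This is a hedged sketch with our own helper names, not the APR firmware interface code.

```java
// Builds MCB order strings from Table 4, clamping parameters to their ranges.
public class McbCommands {
    static int clamp(int v, int lo, int hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    static String headPan(int degrees)  { return "HP " + clamp(degrees, -60, 60); }
    static String headTilt(int degrees) { return "HT " + clamp(degrees, 0, 90); }
    static String leftArm(int degrees)  { return "AL " + clamp(degrees, -45, 45); }
    // Forward displacement: M3 is electrically braked by the MCB (Table 4).
    static String forward(int pwmPercent) { return "MF " + clamp(pwmPercent, 0, 60); }
}
```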

4.2. Transmission of Audio and Video Streaming

The bidirectional transmission of audio and video data streaming is a fundamental task in a mobile robot designed for tele-operation and remote interaction. This section presents a detailed description of the generic bidirectional audio and video transmission system developed for the APR.

This system is specifically designed to connect two Android devices: the tablet that controls the APR and the remote operator’s smartphone or tablet.

4.2.1. Videoconference Architecture

The videoconference system defined in an Android device is controlled from the upper-class CommunicationsManager. This class contains all the methods required to configure and initiate a new videoconference with a remote device. The videoconference system is made up of four different classes: UdpRxImages, CameraSurfaceView, Recorder, and UdpRxAudio. Each of these classes is executed in an independent thread and an independent network socket to limit the possibility of blocking the device in case of unexpected network problems when performing long operations. Figure 8 shows the hierarchy of the videoconference system.

Figure 8. Depiction of the videoconference system hierarchy.

The exchange of data between classes is performed through custom event listeners (from now on, listeners) and by creating callback functions that are triggered by specific events and notified to all the classes registered to that listener. The main reason that justifies the use of listeners instead of handlers is that listeners are immediately executed once received, while handler callbacks are queued for future execution.

4.2.2. Network Communications Protocol

The audio and video data is exchanged by using UDP packets based on sending single packets to a specific address and port instead of creating a dedicated communication channel. Since UDP does not perform any control on the packets, it offers a time-efficient communication protocol. This lack of control makes UDP one of the best transmission methods for communications where delivery time is more important than data integrity. The main reason for using UDP in audio/video communications is the prevention of communication delays caused by errors produced in a single packet. Then, the lost data packets are ignored and communication is restored when the next packet is received. Unlike the video and audio data, the control parameters for the communication are simultaneously transmitted by using the transmission control protocol (TCP). The use of TCP sockets provides more reliable communication channels, which are used to exchange essential configuration data such as the videoconference parameters that each device has to know in order to successfully decode the video and audio data received from the remote device.


4.2.3. Video Communications

Video transmission/reception is an indispensable feature for a tele-operated assisted living robot. Video communication is a computationally expensive application that needs to be efficient, reliable, robust, and with low delay in the transmissions. Currently, one of the most commonly used encoders to perform live streams is H.264/MPEG-4 due to its high-compression capabilities. This encoder is designed to generate stable and continuous transmissions with low-bandwidth requirements but has several drawbacks. On one hand, this encoder can have a variable communication delay from 3 to 5 s and high-motion scenes can appear blurred because of the dependence between consecutive frames [17]. On the other hand, this encoder requires highly optimized libraries which are difficult to integrate into a custom application.

The bidirectional video communication method implemented in the APR overcomes these drawbacks by transmitting video as a sequence of compressed images in JPEG format instead of using a common video streaming format. This approach simplifies the implementation and control of the communication. Therefore, the bidirectional video communication implemented in the APR has to acquire, compress and submit surrounding images while receiving, decompressing and showing the images submitted by the remote device.

The procedure for image acquisition in the Android environment starts in the CameraSurfaceView class, which extends the Android class SurfaceView. The SurfaceView class is used as a canvas providing a dedicated drawing area where the preview of the camera is displayed. Once the camera has been initialized, the onPreviewFrame(byte[] data, Camera camera) callback is triggered each time a new image is obtained, retrieving an array containing the YUV image data encoded with the NV21 Android native format for camera preview as well as the camera instance that has captured the image.
The images are then compressed to the standard JPEG format (which reduces the file size at the expense of quality and computing time) to be submitted as an image packet. Table 5 shows an example of typical common image resolutions with the relationship between size, quality and time required for the JPEG compression of the image, and the theoretical maximum rate of frames per second (fps) that can be obtained in communication with the tablet of the APR. Video communication can use any resolution available in the telecontrol device. Under normal conditions, the time required to compress one image is similar to the time needed to decompress the image. The large size reduction achieved with the JPEG compression (Table 5) is because the original YUV-NV21 image format is already a compressed image format with averaged and grouped U and V color planes, so this image has less spatial variability than a conventional raw RGB color image and can be described with less information and size after compression.

The implementation of a videoconference system requires image acquisition, image compression, transmission, reception, and image decompression in a bidirectional way. The real frame rate of the image acquisition devices is usually 15 fps. Some images may be lost during this process: (a) because a new image is available while the previous image is still being compressed, so it must be discarded; (b) because of UDP transmission losses on the network; and (c) because a new image has been received while the previous image is still being decompressed, so it must be discarded. However, current smartphones and tablets usually include multiple CPUs and use different isolated threads for transmission and reception, so they can achieve higher videoconference frame rates than when only one CPU is available for processing.

Figure 9 shows a visual example of the impact of the quality parameter on a JPEG compression and decompression: the face of the remote operator, acquired with a smartphone, compressed as JPEG, submitted to the network, received by the tablet of the APR, decompressed, and shown on the screen of the APR for visual interaction. In this example with small images, the difference between 100% and 60% qualities is almost imperceptible to the human eye while the image size is 12 times smaller. There are also small differences between 60% and 30% qualities in terms of compressed image size, processing time and expected frames per second. The APR uses a video transmission procedure configured to operate with a starting default resolution of 320 × 240 pixels and a JPEG quality factor of 60%. Then, the expected average image submitted to the network is 56 times smaller than the original YUV-NV21 images acquired by the device, and 84 times smaller than the RGB version of the original image. This image resolution can be changed manually during the videoconference or adapted dynamically to the network bandwidth capabilities.

Table 5. Relationship between image size and JPEG quality.

Resolution       RGB Size    YUV-NV21      JPEG         JPEG Size   Compression   Fps (max)
(Pixels W × H)   (Bytes)     Size (Bytes)  Quality (%)  (Bytes)     Time (ms)
176 × 144           76,032       50,688        30          1233        12.46          80
176 × 144           76,032       50,688        60          1391        11.74          85
176 × 144           76,032       50,688       100        11,911        16.50          60
320 × 240          230,400      153,600        30          2280        19.96          50
320 × 240          230,400      153,600        60          2718        19.51          51
320 × 240          230,400      153,600       100        33,617        24.74          40
640 × 480          921,600      614,400        30          6351        29.90          33
640 × 480          921,600      614,400        60          7526        30.26          33
640 × 480          921,600      614,400       100       115,491        55.14          18
720 × 480        1,036,800      691,200        30          7618        32.54          30
720 × 480        1,036,800      691,200        60          9316        31.86          31
720 × 480        1,036,800      691,200       100       126,430        61.68          16
1280 × 720       2,764,800    1,843,200        30        17,810        72.47          13
1280 × 720       2,764,800    1,843,200        60        21,230        72.87          13
1280 × 720       2,764,800    1,843,200       100       311,409       131.72           7
1280 × 960       3,686,400    2,457,600        30        23,699        96.58          10
1280 × 960       3,686,400    2,457,600        60        28,647        98.02          10
1280 × 960       3,686,400    2,457,600       100       426,045       177.25           5

Figure 9. Sample 320 × 240 color image showing the small differences obtained after a JPEG compression and decompression procedure with quality settings of (a) 100%; (b) 60%; (c) 30%.
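The quality/size tradeoff of Table 5 and Figure 9 can be reproduced with a short desktop-Java sketch. The APR itself compresses NV21 camera frames on Android; here javax.imageio and a synthetic frame are used as stand-ins, so the absolute sizes will differ from the table.

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.MemoryCacheImageOutputStream;

public class JpegQuality {
    // Compresses an image to JPEG at the given quality factor (0.0 to 1.0).
    static byte[] compress(BufferedImage img, float quality) {
        try {
            ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
            ImageWriteParam param = writer.getDefaultWriteParam();
            param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
            param.setCompressionQuality(quality);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            writer.setOutput(new MemoryCacheImageOutputStream(out));
            writer.write(null, new IIOImage(img, null, null), param);
            writer.dispose();
            return out.toByteArray();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // Builds a synthetic 320x240 gradient image to stand in for a camera frame.
    static BufferedImage testFrame() {
        BufferedImage img = new BufferedImage(320, 240, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 240; y++)
            for (int x = 0; x < 320; x++)
                img.setRGB(x, y, ((x & 0xFF) << 16) | ((y & 0xFF) << 8) | ((x ^ y) & 0xFF));
        return img;
    }
}
```

As in Table 5, lowering the quality factor shrinks the compressed size far more than it degrades the perceived image.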

The compressed image is sent to the UdpTxImages thread, which creates a new UDP packet containing the image. Once an image has been received at the UdpTxImages thread, a flag is activated to notify the CameraSurfaceView class that there is a transmission in progress, so newly generated images are not transmitted. This flag is used to indicate to the CameraSurfaceView that there is no need to compress and send new images to the UdpTxImages thread until the current transmission is completed. The main goal of this procedure is to avoid wasting CPU cycles by compressing images that will not be sent. However, this method slightly increases the video delay since the next image to be transmitted is not compressed until the last transmission ends.

In general, a video transmission of compressed JPEG images of 320 × 240 pixels obtained at a frame rate of 15 fps and compressed with a quality factor of 60% with an average size of 2720 bytes requires a network bandwidth of 0.318 Mb/s per video stream (0.881 Mb/s for a color image of 640 × 480 pixels). This image-based video streaming gives shorter delays in communication but may require higher network bandwidth than alternative buffer-based streaming systems. In general, typical domestic networks have enough upload and download bandwidth for two-directional video streaming communication with 640 × 480-pixel color images processed at 15 fps.

Video reception is carried out by the UdpRxImages execution thread. Once initialized, this thread starts receiving UDP packets containing the frames transmitted by the remote device. Each time an image packet is received, the data is decoded into a bitmap using the Android method BitmapFactory.decodeByteArray(). The new bitmap is then sent to the application main activity using the OnImageReceived() listener. Once the new image has been received at the main activity, the interface view is updated by the User Interface (UI) thread. In order to improve the memory usage at the reception side of the application, the inMutable flag from the BitmapFactory class is set to true, forcing the BitmapFactory class to always use the same memory space to store new decoded images by avoiding the creation of a new bitmap each time a new image is received. When using the inMutable flag, the bitmap used to decode the received image must have the same resolution. If an image with a different resolution is received, the bitmap memory used by the BitmapFactory class is reallocated using the new image dimensions, so the only situation where a new bitmap is created is when the image resolution of the images is changed.

There is no specific optimum resolution suitable for all situations and networks. The resolution of the local and remote images used in the video communication can be changed dynamically. If required, the video communication sends a TCP packet to the remote device to request a list containing the resolutions at which the remote camera can operate. In order to obtain such information, the getSupportedRes() function from the CameraSurfaceView class is called. This function uses the getParameters() method from the camera class that contains the instance of the camera that is being used.
The getSupportedPreviewSizes() method from the camera instance parameters retrieves a list of the supported resolutions for live video preview. This list is then sent back to the device that initially made the query and all the resolutions are shown on the screen. This system allows a fast change or selection of the best image resolution required for each specific situation. Once the image resolution has been changed, a new TCP packet containing the new image width and height is sent to the remote device. When this packet is received, the remote device calls the changeResolution() method from the CameraSurfaceView class. This method stores the new resolution and enables a flag that notifies the onPreviewFrame() callback to change the configuration of the camera. In this case, the camera is stopped and the method setNewConfiguration() is called in order to change the video configuration and restart the video transmission.
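The "transmission in progress" flag described earlier (frames are compressed only when the sender thread is idle, otherwise they are dropped) can be sketched with an atomic compare-and-set flag. The class and method names below are ours, not the APR source.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Gate between the camera callback and the image sender thread: at most one
// frame is compressed and in flight at a time; newer frames are dropped.
public class FrameGate {
    private final AtomicBoolean sending = new AtomicBoolean(false);

    // Called from the camera callback: true when this frame should be
    // compressed and handed to the sender, false when it must be dropped.
    boolean tryAcquire() {
        return sending.compareAndSet(false, true);
    }

    // Called by the sender thread once the UDP transmission completes.
    void release() {
        sending.set(false);
    }
}
```

This trades a slight extra delay (the next frame waits for the current transmission) for never spending CPU cycles compressing frames that would be discarded, matching the behavior described in the text.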

4.2.4. Audio Communications

The transmission of audio during a remote interaction with the APR is a natural, direct and efficient communication channel between the person in charge of the telecontrol of the robot and the person assisted by the robot. The main problem that arises when establishing bidirectional audio communication is the delay between audio transmission and reproduction at the remote device when using buffer-based libraries such as VoIP/SIP. This problem is produced because the Android VoIP requirements force the application to ensure that a minimum amount of data is received (and buffered) before starting the audio playback.

This audio transmission is similar to the P2P procedure used for video transmission. The implementation uses the Android AudioRecord and AudioTrack classes to record and reproduce audio data in the two devices respectively (telecontrol device and APR head device). Although AudioRecord is not designed to perform audio streaming, the functionality of this class has been modified to generate small audio chunks that can be sent through the network immediately and reproduced at the remote device.

First, an instance of the AudioRecord class is initialized with the following specifications: sample rate of 8000 Hz, encoding format PCM 16-bit, 1 channel (mono). Once initialized, the AudioRecord instance stores all the data obtained by the device microphone in an internal buffer. When the internal buffer is filled with 3840 bytes of voice data, the buffer is sent to the UdpAudioTx thread, wrapped inside an UDP packet, and transmitted to the remote device. The UDP packet is received at the

UdpAudioRx execution thread of the remote device and sent to the AudioTrack class running inside the Playback thread. This system has been initialized with the same audio parameters as the AudioRecord instance and it queues the received audio data for reproduction. In order to avoid an increasing delay on the reception side of the application, packets received with a time difference greater than 50 ms, or empty audio streams, are dropped thus ensuring that the delay of a single packet does not accumulate into a whole communication delay.
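As a back-of-envelope check of the chunking above: 3840 bytes of 16-bit mono PCM at 8000 Hz correspond to 1920 samples, i.e. 240 ms of audio per UDP packet, well above the 50 ms drop threshold used on the receiving side. A minimal sketch (constants taken from the text, helper name ours):

```java
// Converts an audio chunk size into its playback duration for the
// 8000 Hz, PCM 16-bit, mono format used by the APR audio stream.
public class AudioChunk {
    static final int SAMPLE_RATE_HZ = 8000;
    static final int BYTES_PER_SAMPLE = 2; // PCM 16-bit, 1 channel (mono)

    // Duration in milliseconds of a chunk of the given size in bytes.
    static long chunkDurationMs(int chunkBytes) {
        return 1000L * chunkBytes / (SAMPLE_RATE_HZ * BYTES_PER_SAMPLE);
    }
}
```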

4.3. External Network Server

The communication of the APR with a remote telecontrol device not located in the same local area network (LAN) is a specific problem that requires the development of an external network server. The main function of this server is to provide a common control space where telecontrol clients and robots are managed in order to establish a P2P communication between two devices. In this common space, the server is continuously waiting for a new incoming connection. Each new connection is associated with an event listener executed in an independent execution thread. Once a new thread has been started, the server requests the mandatory information related to the connection itself (Table 6). The communication can be initiated by the APR or by a remote telecontrol device according to a predefined usage policy.

Table 6. Required data to connect with the external server.

Tag                Type          Provided by        Description
Role               String        Device APP         A string containing the role of the device connected to the
                                                    server (service or robot).
Name               String        Configuration      A string containing the name of the robot or service. If no name
                                                    is specified, the string “Default” is used as a name.
NAT Port           Integer       Network protocol   Network Address Translation (NAT) port assigned to
                                                    the communication.
IP Address         InetAddress   Network protocol   The IP address of the device that requested the connection
                                                    (external IP).
Local IP Address   String        Device APP         The IP address of the device that requested the connection
                                                    (internal IP).

The connection data is analyzed to determine if the device is connected under the same network domain as the server. This procedure is realized by checking the domain of the IP address provided during the connection process. If the IP address provided exists inside the private address ranges (Table 7), the device is under the same network domain as the server and thus the internal IP address will be used to send packets to the device. If the device fails at providing the requested information the communication will be closed; otherwise, the communication will be accepted and registered by the server. Once a device is connected, the specified role determines the accessible functions for the device.

Therefore, when an operator wants to initiate the control of a specific APR, a packet containing the robot name is sent to the server and, once received, the server sends a request to both devices to fire multiple UDP packets to the server through specific ports. The objective of these UDP packets is to open a direct path between the client and the robot on both network firewalls by using the UDP hole punching technique [18,19]. This protocol allows direct P2P communication between different registered APRs and different registered telecontrol devices using the ports opened when starting the communication with the external server. When P2P communication starts, the server closes the UDP communications and uses the TCP socket to send and receive specific packets from both devices. As a final task, the external server keeps an updated list of on-line APRs ready for new connections.

In general, the use of an external server allows long-range internet communications between a client and a robot regardless of whether they are connected at the same network domain or not.

However, when using UDP packets this method can fail to establish a communication channel if any of the devices is connected to the internet through a mobile network (such as 3G or 4G) and sits behind multiple levels of the network address translation (NAT) protocol; see [19] for additional details. This problem can be solved by using TCP packets instead of UDP packets or by changing the configuration of the communication channel in order to accept UDP packets.

Table 7. Ranges of private IP addresses.

RFC1918 Name   First Available IP   Last Available IP
24-bit block   10.0.0.0             10.255.255.255
20-bit block   172.16.0.0           172.31.255.255
16-bit block   192.168.0.0          192.168.255.255
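The membership check against the ranges of Table 7 can be sketched as follows, assuming dotted-quad IPv4 strings; the class and method names are ours, not the APR server code.

```java
// Tests whether an IPv4 address belongs to the RFC 1918 private
// ranges of Table 7 (10/8, 172.16/12, 192.168/16).
public class PrivateIp {
    static boolean isPrivate(String ip) {
        String[] p = ip.split("\\.");
        if (p.length != 4) return false;
        int a = Integer.parseInt(p[0]);
        int b = Integer.parseInt(p[1]);
        if (a == 10) return true;                        // 24-bit block
        if (a == 172 && b >= 16 && b <= 31) return true; // 20-bit block
        return a == 192 && b == 168;                     // 16-bit block
    }
}
```

When this test is positive, the server concludes that the device shares its network domain and addresses it by its internal IP, as described above.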

5. Tests

This section presents the preliminary usability tests performed with the APR and also the specific test of the high-priority collision avoidance system implemented in the mobile robot. Future work will center on developing usability tests focused on the experience of the users such as the one proposed in [1] and on improving the human-robot interaction experience according to the guidelines proposed in [20].

5.1. Preliminary Usability Test

The preliminary usability tests were performed with 12 volunteers aged from 12 to 50 years during prototype development in order to evaluate the degree of satisfaction with the different features implemented in the APR according to three classification levels: “poor”, “acceptable” and “good”. Future exhaustive tests will use additional classification levels, such as those proposed in [1]. The results obtained in these preliminary tests have been used to refine the design and implementation of the APR.

The first test consisted of introducing the robot to each of the 12 volunteers, who were then invited to rate the appearance of the APR. The volunteers gave an average score of “good”, showing that the participants liked the APR design. Future in-depth usability tests will be performed with target users in order to evaluate the usability of the APR but, at this stage, the focus was on obtaining the subjective impression caused by the design of the mobile robot. The assumption was that a good impression is required to facilitate the continuous use of a mobile robot at home.

The second test consisted of evaluating the videoconference capabilities implemented in the APR and the remote telecontrol APP under static conditions. During this test, the 12 volunteers were grouped into pairs and placed in different rooms in the same building and network. Each volunteer participating in the experiment played both roles, as a robot tele-operator and as a passive user, performing a total of 12 experiments. The volunteers rated this static videoconference as “acceptable” and “good”. Some of the volunteers complained about a small but sometimes noticeable desynchronization between the audio and the images in the video communication. The cause of this desynchronization is that the audio and the images of the video are sent in different network UDP packets that may not be received at the same cadence at which they are sent.
This problem was partially mitigated by including synchronization information in the audio packets to allow automatic discarding of delayed audio packets with no audio information.

The third experiment consisted of evaluating the maneuverability of the APR. In this test, each of the 12 volunteers had to control the APR as a remote tele-operator. Each volunteer was trained for ten minutes in the use of the remote telecontrol APP and then had to displace the APR between different starting points and destinations. The volunteers rated this telecontrol as “good”, expressing increasing confidence during the experiment.

The last test consisted of evaluating the videoconference capabilities implemented in the APR and the remote telecontrol APP under dynamic conditions. During this test, the 12 volunteers were grouped into pairs. Each volunteer participating in the experiment played both roles, as a static robot tele-operator and as a passive user who had to walk along a corridor with the mobile robot, performing a total of 12 experiments. This test was performed in the same building and network. The volunteers rated this dynamic videoconference as “poor” and “acceptable”. The cause of this low score was a loud background noise from a mechanical vibration generated in the APR during displacements. This problem was addressed and partially solved by adding rubber bands to the structure that supports the tablet to minimize the transmission of mechanical vibrations to the microphone embedded in the tablet. The dynamic experiment was then repeated with these improvements and the volunteers rated the videoconference as “acceptable”.
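The delayed-audio discarding described earlier in this subsection can be sketched as a sequence-number filter; the packet representation below is an illustrative assumption, since the paper does not specify the APR's audio packet format.

```python
def filter_audio_packets(packets, start_seq=0):
    """Drop audio packets that arrive behind the playback position.

    Each packet is modeled as a (sequence_number, samples) pair. A packet
    whose sequence number is not ahead of the last one accepted arrives
    too late to be useful, so it is discarded instead of stalling or
    desynchronizing playback. This is a simplified sketch of the
    synchronization fix, not the APR's actual implementation.
    """
    playable, last_played = [], start_seq - 1
    for seq, samples in packets:
        if seq > last_played:
            playable.append((seq, samples))
            last_played = seq
        # else: stale packet, discard silently
    return playable
```

For example, a packet arriving after a later-numbered one has already been accepted is simply skipped.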

5.2. High-Priority Collision Avoidance System

The APR is a tele-operated mobile robot designed to operate in domestic environments with limited space and its operation must be compatible with the presence of static objects such as walls and furniture, and also dynamic objects such as people and doors [21]. In general, control of a tele-operated robot is strongly conditioned by external factors such as network stability and the operator’s experience. The APR has a maximum forward velocity of 1.3 m/s and, for this reason, includes a high-priority collision avoidance system implemented in the MCB. This high-priority procedure processes the information from the LIDARs directly in order to stop the mobile robot automatically without delay when a collision situation is estimated or detected, regardless of any other consideration. Figure 10 depicts the planar information gathered by the main LIDAR placed parallel to the ground at a height of 38 cm. The high-priority procedure is based on the delimitation of different semicircular safety areas around the APR [22]. Currently, other alternatives, such as an automatic change in the trajectory [23], are not implemented as they may cause confusion to the operator in charge of the telecontrol of the mobile robot.


Figure 10. Depiction of the APR position and direction (red arrow), distance data points detected with the LIDAR (blue points) and depiction of the frontal (brownish) and lateral (greenish) safety areas around the mobile robot.

The collision avoidance system automatically stops the APR if the LIDAR detects anything inside one of the different safety areas. In this emergency situation, the tele-operator of the APR can rotate the robot or perform backward or transversal displacements. The radius and orientation of the safety areas around the APR are adjusted dynamically according to the trajectory and velocity of the mobile robot. For example, the APR stops if it tries to pass through a doorway at maximum speed but does not stop when passing at a reduced speed.

A set of different experiments was carried out to verify the dynamic definition of the safety areas around the APR. Figure 11 shows a sequence of images while conducting one experiment where the APR goes forward at full speed in the direction of an obstacle. The depiction in Figure 10 corresponds approximately to sequence 3 in Figure 11, in which an object is detected in front of the mobile robot. The results of the different collision avoidance experiments have shown that the APR can avoid collisions except in the case of objects with transparent glass or mirrors, because these objects are not properly detected by laser LIDAR sensors.

Figure 11. Sequence of images (1–4) obtained in a collision avoidance experiment with the APR moving forward.
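The per-scan safety check that triggers the emergency stop can be sketched as follows. The scan format (angle, distance pairs in the robot frame) and the lateral radius value are illustrative assumptions; the frontal radius would come from the speed-dependent calibration discussed with Table 8.

```python
import math

def emergency_stop_needed(scan, frontal_radius_cm, lateral_radius_cm=35.0):
    """Check LIDAR returns against the semicircular safety areas.

    `scan` is a list of (angle_rad, distance_cm) pairs in the robot
    frame, with angle 0 pointing forward. A return inside the frontal
    semicircle, or inside the smaller band covering the rest of the
    perimeter, triggers an immediate stop, mirroring the high-priority
    procedure run in the MCB.
    """
    for angle, dist in scan:
        frontal = abs(angle) <= math.pi / 2
        limit = frontal_radius_cm if frontal else lateral_radius_cm
        if dist <= limit:
            return True
    return False
```

Because the check is a single pass over the raw scan, it can run on every LIDAR revolution ahead of any other motion command.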






Table 8 summarizes the collision avoidance experiments performed. In each experiment, the APR is configured to go forward at a fixed speed. The first column in Table 8 shows the forward speed relative to the maximum forward speed of 1.3 m/s. The second column in Table 8 shows the radius of the frontal safety area (see Figure 10), which is fixed according to the forward speed. The last column shows the final APR-obstacle distance measured when the robot has been effectively stopped. Table 8 shows the proposed calibration between the robot velocity and the radius of the frontal safety area, which has a value of 120 cm when the APR reaches the maximum forward velocity of 1.3 m/s. The minimum gap-distance between the APR and the object after braking at full speed was 22.5 cm. The procedure for stopping the mobile robot is based on using the motors as electrical brakes by forcing a short circuit through the H-bridges.

Table 8. Collision avoidance test results.

Relative APR Forward Speed (%)    Radius of the Frontal Safety Area (cm)    APR-Obstacle Distance When Stopped (cm)
100                               120.0                                     22.5
88                                108.6                                     23.0
75                                96.4                                      26.3
66                                86.7                                      25.5
50                                72.3                                      30.8
33                                56.3                                      29.0
25                                48.9                                      30.2

In the future, this high-priority collision avoidance system will be improved in order to estimate the trajectory of dynamic obstacles around the mobile robot and reduce the braking distance.
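The speed-dependent radius of the frontal safety area can be obtained at runtime by interpolating the calibration points of Table 8; the interpolation scheme and the clamping outside the calibrated range are illustrative choices, not necessarily the APR's firmware behavior.

```python
# Calibration points from Table 8: relative forward speed (%) vs.
# radius of the frontal safety area (cm).
CALIBRATION = [(25, 48.9), (33, 56.3), (50, 72.3), (66, 86.7),
               (75, 96.4), (88, 108.6), (100, 120.0)]

def frontal_safety_radius(speed_pct):
    """Linearly interpolate the frontal safety radius for a given
    relative forward speed, clamping outside the calibrated range."""
    pts = CALIBRATION
    if speed_pct <= pts[0][0]:
        return pts[0][1]
    if speed_pct >= pts[-1][0]:
        return pts[-1][1]
    for (s0, r0), (s1, r1) in zip(pts, pts[1:]):
        if s0 <= speed_pct <= s1:
            return r0 + (r1 - r0) * (speed_pct - s0) / (s1 - s0)
```

At full speed this reproduces the 120 cm radius of Table 8, and the radius shrinks roughly linearly as the robot slows down, which is why the APR can pass a doorway at reduced speed without triggering a stop.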

6. Applications of the APR

This section proposes some sample applications for the APR that take advantage of its capabilities as a tele-operated assisted living robot.

6.1. Mobile Videoconference Service

The APR has been designed as a telecontrolled mobile videoconference service prepared to provide social and assistive services to elderly people and persons with impaired mobility.


In the case of this application, the APR is located at home and the mobile robot can be configured to automatically accept calls from registered remote operators or services that will take control of the mobile robot during a mobile videoconference. The APR can be turned on by pressing a single button (Figure 12) so as to make it easy for users with low technological skills to use.


Figure 12. (a) Starting the APR; (b) APR initializing the onboard sensors and the tablet of the head.

This unique button starts the MCB, which configures its main USB as a host interface. The tablet is also configured to start automatically when activity is detected in a USB host interface. The tablet of the APR always operates as a slave device connected to the USB host interface of the MCB. Once started, the tablet of the APR automatically launches the client APR application, which also connects to a global server to register the robot as an available remotely operated APR ready for a new connection (Figure 12). The APR only requires specific operational configuration the first time it is started (global server IP and robot name); after that, the robot remembers the parameters used for the configuration and uses them as its default parameters.

On one hand, the APR operates like a basic phone or videoconference device that can use the information of the onboard LIDAR sensors automatically to generate a call or an alarm for relatives and preconfigured registered services. This call can also be generated simply by touching the screen of the tablet. On the other hand, the people or services registered can use the telecontrol APR APP to initiate a remote communication with the APR as audio only, audio and video, or audio and video combined with remote telecontrol operation. The remote tele-operation APP establishes a connection with a global external server and shows a list of all the authorized and available APRs registered for this service. Figure 13 shows the screen of the tablet of the APR with a small window that previews the local image acquired by the APR and submitted during the videoconference, the image received during the videoconference, and four main buttons used to control the communication. The buttons enable or disable the submission of video, the submission of audio and the reception of audio, and also change the resolution of the images used in the videoconference. Currently, the complete videoconference starts automatically, but the automatic response of the APR can be configured depending on the authorization level of the service in charge of the call.

One of the objectives of the APR is to enable direct bidirectional communication with relatives and technical or medical services to be established. On one hand, this communication can be used to generate social interaction with elderly people. On the other hand, the sensors on the APR can detect problems or inactivity and automatically generate alarms. In such a context, the APR can be used by a remote tele-operated service to explore a typical household scenario visually, evaluate the situation and reduce the reaction time if external assistance is needed.


Figure 13. Screenshot of the Tablet of the APR showing the face of the person calling.
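The bookkeeping that the external server performs, keeping an updated list of on-line APRs and showing each operator only the robots it is authorized for, could be organized along the following lines. The class name, the connection-info fields and the authorization scheme are illustrative assumptions, not the paper's implementation.

```python
class APRRegistry:
    """Minimal sketch of the external server's list of on-line APRs."""

    def __init__(self):
        self._robots = {}   # robot name -> connection info

    def register_robot(self, name, address, authorized_operators):
        """Record a robot that has connected and passed the checks."""
        self._robots[name] = {"address": address,
                              "authorized": set(authorized_operators)}

    def unregister_robot(self, name):
        """Drop a robot that has gone off-line."""
        self._robots.pop(name, None)

    def available_robots(self, operator):
        """Names of the robots the given operator may connect to,
        i.e. the list shown by the remote tele-operation APP."""
        return sorted(name for name, info in self._robots.items()
                      if operator in info["authorized"])
```

When an operator requests a robot from this list, the server would then use the stored address to coordinate the UDP hole punching step described in Section 4.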

6.2. Mobile Telepresence Service

Alternatively, the concept of telepresence [8] implemented in the APR can be used directly by people with impaired mobility [24] located in their home in order to connect with another place. In the case of this application, the APR is located at one facility and the mobile robot is configured to accept calls from registered users who will take control of the mobile robot as a telepresence service. Figure 14 shows a simulated example of the APR used as a telepresence platform during an informal talk between three colleagues. In this simulated case, the APR is only used as a tool for social interaction. This is an alternative application demanded by people with impaired mobility and which will be deployed in future work.

Figure 14. Depiction of the telepresence capabilities of the APR.

This application will be installed in a typical industry, office or institution to offer telepresence contact between employees. Telepresence is achieved by offering a physical and mobile depiction of the person who is controlling the APR. The telecontrol application of the APR enables direct control of the robot’s arms and head as a way of simulating the creation of non-verbal communication between groups of people who usually collaborate with each other. In the future, this non-verbal communication will be enhanced with the inclusion of configurable buttons with predefined and personalized combinations of small arm and head movements.

Future work based on this application will focus on deploying the APR as a telepresence service for industry as an informal way of maintaining contact between groups of employees in case someone has impaired mobility that impedes their presence and interaction with the group. The hypothesis is that frequent and informal telepresence contact can improve productivity while contributing to reducing the effort spent on unpleasant, long or difficult journeys. In this context, the APR will be tested as a telepresence tool that may contribute to enhancing the employability of people with impaired mobility.

Figure 15 shows a screenshot of the telecontrol application of the APR. This application is used by a trained operator and the interface is operated with one or two fingers. The interface includes a circular motion joystick where the red circle depicts the current direction of the APR and a rotation joystick used specifically to rotate the mobile robot. The external circle of the motion joystick is also used to show the relative location of obstacle warnings detected by the APR. There are additional sliders to control the head and the arms of the robot. This screen shows the status of the battery and the wireless communication as text messages at the top center of the screen, but this information will be replaced with small graphic indicators in the future.

Figure 15. Screenshot of the remote telecontrol application.


6.3. Walking Assistant Tool

The APR can be used as a walking assistant tool [25] and to avoid the use of additional devices in limited spaces [26]. Figure 16 shows a sample image of a person using the soft arms of the APR as a walking support. In this mode, the velocity of the APR is automatically limited while the arms are set at an angle of 35° to the back. The functionality of the mobile robot as a walking support can be remotely supervised by using the front or rear camera of the tablet and by rotating the head. The orientation of the motors of the arms has a closed control loop and any disturbance (force applied by the user) [26] can be used to refine the displacement of the robot automatically. Finally, in this mode, the information gathered by the LIDAR sensors is used automatically to center the robot when passing through doorways or along a narrow corridor.


Figure 16. Applications (a–d) of the APR as a support for walking.
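The automatic centering between the two walls of a doorway or corridor can be sketched as a proportional correction on the difference between the left and right lateral LIDAR distances. The gain, the saturation limit and the function interface are illustrative assumptions, not the APR's actual tuning.

```python
def centering_turn_rate(left_dist_cm, right_dist_cm, gain=0.01, max_rate=0.5):
    """Proportional steering correction that keeps the robot centered
    between two walls.

    `left_dist_cm` and `right_dist_cm` are the lateral LIDAR distances;
    a positive result steers toward the side with more free space. The
    output is clipped to `max_rate` so the correction stays gentle while
    the robot is supporting a walking user.
    """
    error = left_dist_cm - right_dist_cm   # > 0 means more room on the left
    rate = gain * error
    return max(-max_rate, min(max_rate, rate))
```

When the robot is exactly centered the correction vanishes, so this term can be added on top of the speed-limited forward motion of the walking-assistant mode.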

6.4. Scheduling Tool

The APR can be used to help elderly people to remember their daily routines [27,28]. This task is performed by means of individual programmable alarms that show a visual just-in-time representation of a daily routine on the screen of the APR, combined with additional acoustic signals or recorded advice. In this application, the tactile screen of the tablet of the APR is used for direct feedback. Figure 17 shows a simulation of the APR operating as a scheduling tool. In our region, there is a public service for elderly people that involves a daily call to remind them of all their daily routines, but the hypothesis is that a visual indication at the scheduled moment of, for example, taking a medicine will be more effective than a daily phone call.

Figure 17. Simulation of the APR providing a daily or regular routine notification.
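The individual programmable alarms could be evaluated with a simple periodic check of the current time against the scheduled times; the data layout, the tolerance window and the example messages below are illustrative assumptions, not taken from the paper.

```python
from datetime import datetime, time

def due_alarms(alarms, now, tolerance_min=5):
    """Return the alarms whose scheduled time falls within
    `tolerance_min` minutes before `now`, i.e. the reminders that
    should currently be shown just-in-time on the APR's screen.

    `alarms` maps a reminder message to a datetime.time.
    """
    due = []
    now_min = now.hour * 60 + now.minute
    for message, at in alarms.items():
        at_min = at.hour * 60 + at.minute
        if 0 <= now_min - at_min <= tolerance_min:
            due.append(message)
    return sorted(due)
```

Each due reminder would then be rendered on the tablet together with the acoustic signal, and confirmed by the user through the tactile screen.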

Sensors 2016, 16, 610 19 of 23

6.3. Walking Assistant Tool The APR can be used as a walking assistant tool [25] and to avoid the use of additional devices in limited spaces [26]. Figure 16 shows a sample image of a person using the soft arms of the APR as a walking support. In this mode, the velocity of the APR is automatically limited while the arms are set at an angle of 35° to the back. The functionality of the mobile robot as a walking support can be remotely supervised by using the front or rear camera of the tablet and by rotating the head. The orientation of the motors of the arms has a closed control loop and any disturbance (force applied by the user) [26] can be used to refine the displacement of the robot automatically. Finally, in this mode, the information gathered by the LIDAR sensors is used automatically to center the robot when passing through doorways or along a narrow corridor.


Figure 16. Applications (a–d) of the APR as a support for walking.
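The automatic centering behavior described above can be illustrated as a simple proportional controller over the side-facing LIDAR distances. The function below is a hypothetical sketch; the gain, the speed limit and the distance inputs are illustrative assumptions, not values taken from the APR firmware:

```python
def centering_velocity(left_m, right_m, gain=0.5, v_max=0.2):
    """Lateral velocity command (m/s) to keep the robot centered between
    two walls; positive values move the robot toward its right side.

    left_m, right_m: nearest LIDAR range (meters) on each side of the robot.
    """
    error = (right_m - left_m) / 2.0   # >0: robot sits closer to the left wall
    v = gain * error                   # proportional correction toward the midline
    return max(-v_max, min(v_max, v))  # respect the reduced walking-support speed
```

In practice this correction would be added to the user-driven displacement, so the force disturbance measured by the arm motors and the centering term act together on the omnidirectional motion system.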

6.4. Scheduling Tool

The APR can be used to help elderly people to remember their daily routines [27,28]. This task is performed by means of individual programmable alarms that show a visual just-in-time representation of a daily routine on the screen of the APR, combined with additional acoustic signals or recorded advice. In this application, the tactile screen of the tablet of the APR is used for direct feedback. Figure 17 shows a simulation of the APR operating as a scheduling tool. In our region, there is a public service for elderly people that involves a daily call to remind them of all their daily routines, but the hypothesis is that a visual indication at the scheduled moment of, for example, taking a medicine will be more effective than a daily phone call.

Figure 17. Simulation of the APR providing a daily or regular routine notification.
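The programmable-alarm mechanism described above can be sketched minimally as a store of time/message pairs polled once per minute by the tablet application. The class and method names below are hypothetical, not part of the APR software:

```python
import datetime


class ReminderScheduler:
    """Minimal just-in-time reminder store: each alarm pairs a time of day
    with the message to display (and optionally announce) on the tablet."""

    def __init__(self):
        self._alarms = []  # list of (datetime.time, message)

    def add(self, hour, minute, message):
        """Register a daily alarm at the given hour and minute."""
        self._alarms.append((datetime.time(hour, minute), message))

    def due(self, now):
        """Messages scheduled for the current hour:minute (poll once a minute)."""
        return [msg for t, msg in self._alarms
                if (t.hour, t.minute) == (now.hour, now.minute)]
```

Each returned message would then be shown full-screen, accompanied by the acoustic signal or recorded advice, until the user acknowledges it on the tactile screen.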

6.5. Fall Detection Tool

In this context, the last APR application proposed in this paper is the detection of falls or inactivity [4,5]. This functionality requires a sequential analysis of the raw data provided by the onboard LIDAR sensors, the detection of the background objects and abnormal user movements. Figure 18 shows a simulation of the APR used as a fall detector that generates a warning when the volume of the user detected in the information provided by the LIDARs decreases or increases suddenly. Then, an automatic call is generated if the user does not press a button that appears on the screen of the tablet on the APR.


Figure 18. Simulation of the APR used as a fall detector (a) with activity and (b) without activity.
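The sudden-volume-change test described above can be sketched as a check on consecutive LIDAR scans, assuming the per-scan user volume has already been segmented from the background objects. The relative threshold is an illustrative assumption, not a tuned parameter of the APR:

```python
def sudden_volume_change(volumes, rel_threshold=0.4):
    """Return True when the user's estimated volume changes by more than
    rel_threshold between consecutive LIDAR scans (a possible fall).

    volumes: sequence of per-scan volume estimates (same units throughout).
    """
    for prev, cur in zip(volumes, volumes[1:]):
        if prev > 0 and abs(cur - prev) / prev > rel_threshold:
            return True  # abrupt change: trigger the on-screen confirmation button
    return False
```

When the function fires, the warning and confirmation button would be displayed, and the automatic call placed only if the button is not pressed within a timeout.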

6.6. Mobile Ambient Monitoring Platform

The APR can be used as a mobile ambient monitoring platform that gathers information from the environment. For example, in [29], a mobile robot with specialized onboard sensors was used to measure such environmental parameters as air velocity and gas concentrations in the air. Other ambient parameters, such as temperature, humidity, and luminance, were also measured using low-cost sensors. These environmental values are directly correlated with living conditions and were used to detect areas or rooms with abnormal conditions automatically. Alternatively, ambient information can be gathered with a fixed-sensor network infrastructure. For example, the GiraffPlus project [12] allowed remote examination of body temperature, blood pressure and electrical usage based on a sensor network infrastructure and some physiological sensors. This system offers the possibility of a virtual visit via a tele-operated robot to discuss the physiological data and activities occurring over a period of time, and the automatic generation of alarms in the case of instantaneous problems or warnings based on long-term trend analysis. Figure 19 shows the ambient information gathered by the APR during a telecontrol experiment based on the methodologies presented in [29,30], with a common sensor sampling time of 1 s. Figure 19 shows the temperature registered by the APR, with extreme values of 26 °C in one room and 12 °C in another room with an open window (in winter). The humidity reached abnormal values when the APR was in the room with the open window. The luminance level was poor, with the lowest values in an indoor corridor. The air velocity was always low and probably originated from the displacement of the APR.
Finally, the measurement of volatile gas concentrations in the air can be performed by using one sensor per gas type or by using a versatile sensor [29], which can be configured to measure different gases. In this case, a photo-ionization detector (PID) was configured to measure acetone because this was a substance used in the area explored [30]. The results showed a low concentration of acetone in the air during the experiment. The APR can thus be used as a supporting tool to implement a mobile ambient monitoring platform and to develop new assisted-living applications.
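The automatic flagging of rooms with abnormal conditions can be illustrated as a per-parameter range check applied to each 1-s sample. The "normal" ranges below are illustrative assumptions, not the thresholds used in [29,30]:

```python
# Illustrative "normal" indoor ranges per ambient parameter (assumed values).
NORMAL_RANGES = {
    "temperature_C": (18.0, 25.0),
    "humidity_pct": (30.0, 60.0),
    "luminance_lx": (100.0, 2000.0),
}


def abnormal_parameters(sample):
    """Return the names of the parameters in one sample that fall outside
    their normal range (parameters without a defined range are ignored)."""
    flagged = []
    for name, value in sample.items():
        lo, hi = NORMAL_RANGES.get(name, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            flagged.append(name)
    return flagged
```

Combined with the position of the APR, such per-sample flags would let the platform report which room exhibits the abnormal condition, as in the open-window case above.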


Figure 19. Example of ambient information automatically gathered by the APR.

7. Conclusions

This paper presents the conception and application of a tele-operated mobile robot as an assisted living tool. The proposed Assistant Personal Robot (APR) is described in terms of mechanical design, electronics and software implementation. The APR uses three omnidirectional wheels in order to offer high-mobility capabilities for effective indoor navigation. The APR has a maximum speed of 1.3 m/s and implements a high-priority collision avoidance system in the electronic board that controls the motors of the mobile robot, which can stop the APR regardless of the current motion orders. The APR is described in terms of applications proposed to develop the concept of a tele-operated assisted living tool, mainly focused on assistive applications for elderly people and those with impaired mobility. The preliminary experiments performed in this paper have been used to refine the design of the APR as a tele-operated assisted living robot. Future work will focus on improving the design of the APR and on the development of the assistive services proposed in this paper.

Acknowledgments: This work is partially funded by Indra, the University of Lleida, and a RecerCaixa 2013 grant. The authors would like to thank Marius David Runcan and Mercè Teixidó for their collaboration in the initial stage of the mobile robot development. The authors would also like to thank the anonymous reviewers for their helpful and constructive comments, which greatly contributed to improving the final version of the paper.

Author Contributions: Eduard Clotet and Dani Martínez designed and developed the software applications and edited most parts of this manuscript. Javier Moreno designed and implemented the mechanical adaptation of the mobile robot. Marcel Tresanchez designed and implemented the electronic boards. Eduard Clotet and Dani Martínez also designed and performed the experimental tests described in the paper. Jordi Palacín coordinated the design and implementation of the software, the electronics and the mechanical design of the APR and also coordinated the final edition of the manuscript.

Conflicts of Interest: The authors declare no conflict of interest.


References

1. Desai, M.; Tsui, K.M.; Yanco, H.A.; Uhlik, C. Essential features of telepresence robots. In Proceedings of the 2011 IEEE Conference on Technologies for Practical Robot Applications, Woburn, MA, USA, 11–12 April 2011; pp. 15–20.
2. United Nations. World Population Ageing 2013. Available online: http://www.un.org/en/development/desa/population/publications/pdf/ageing/WorldPopulationAgeing2013.pdf (accessed on 15 December 2015).
3. World Health Organization. World Report on Ageing and Health. 2015. Available online: http://apps.who.int/iris/bitstream/10665/186463/1/9789240694811_eng.pdf (accessed on 18 February 2016).
4. Delahoz, Y.S.; Labrador, M.A. Survey on Fall Detection and Fall Prevention Using Wearable and External Sensors. Sensors 2014, 14, 19806–19842. [CrossRef] [PubMed]
5. Palmerini, L.; Bagalà, F.; Zanetti, A.; Klenk, J.; Becker, C.; Cappello, A. A Wavelet-Based Approach to Fall Detection. Sensors 2015, 15, 11575–11586. [CrossRef] [PubMed]
6. Vermeersch, P.; Sampsel, D.D.; Kelman, C. Acceptability and usability of a telepresence robot for geriatric primary care: A pilot. Geriatr. Nurs. 2015, 36, 234–238. [CrossRef] [PubMed]
7. Bevilacqua, R.; Cesta, A.; Cortellessa, G.; Macchione, A.; Orlandini, A.; Tiberio, L. Telepresence Robot at Home: A Long-Term Case Study. In Ambient Assisted Living; Springer International Publishing: New York, NY, USA, 2014; pp. 73–85.
8. Kristoffersson, A.; Coradeschi, S.; Loutfi, A. A Review of Mobile Robotic Telepresence. Adv. Hum.-Comput. Interact. 2013. [CrossRef]
9. Pineau, J.; Montemerlo, M.; Pollack, M.; Roy, N.; Thrun, S. Towards robotic assistants in nursing homes: Challenges and results. Robot. Auton. Syst. 2003, 42, 271–281. [CrossRef]
10. Schroeter, C.; Muller, S.; Volkhardt, M.; Einhorn, E.; Bley, A.; Martin, C.; Langner, T.; Merten, M. Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), San Francisco, CA, USA, 25–30 September 2011; pp. 2430–2437.
11. Moyle, W.; Jones, C.; Cooke, M.; O'Dwyer, S.; Sung, B.; Drummond, S. Connecting the person with dementia and family: A feasibility study of a telepresence robot. BMC Geriatr. 2014, 14. [CrossRef] [PubMed]
12. Palumbo, F.; Ullberg, J.; Štimec, A.; Furfari, F.; Karlsson, L.; Coradeschi, S. Sensor Network Infrastructure for a Home Care Monitoring System. Sensors 2014, 14, 3833–3860. [CrossRef] [PubMed]
13. Tromp, A.M.; Pluijm, S.M.F.; Smit, J.H.; Deeg, D.J.H.; Boutera, L.M.; Lips, P. Fall-risk screening test: A prospective study on predictors for falls in community-dwelling elderly. J. Clin. Epidemiol. 2001, 54, 837–844. [CrossRef]

14. Sadasivam, R.S.; Luger, T.M.; Coley, H.L.; Taylor, B.B.; Padir, T.; Ritchie, C.S.; Houston, T.K. Robot-assisted home hazard assessment for fall prevention: A feasibility study. J. Telemed. Telecare 2014, 20, 3–10. [CrossRef] [PubMed]
15. Vellas, B.J.; Wayne, S.J.; Romero, L.J.; Baumgartner, R.N.; Garry, P.J. Fear of falling and restriction of mobility in elderly fallers. Age Ageing 1997, 26, 189–193. [CrossRef] [PubMed]
16. Palleja, T.; Guillamet, A.; Tresanchez, M.; Teixido, M.; Fernandez del Viso, A.; Rebate, C.; Palacin, J. Implementation of a robust absolute virtual head mouse combining face detection, template matching and optical flow algorithms. Telecommun. Syst. 2013, 52, 1479–1489. [CrossRef]
17. Khan, A.; Sun, L.; Ifeachor, E.; Fajardo, J.O.; Liberal, F. Video Quality Prediction Model for H.264 Video over UMTS Networks and Their Application in Mobile Video Streaming. In Proceedings of the IEEE International Conference on Communications (ICC), Cape Town, South Africa, 23–27 May 2010; pp. 1–5.
18. Ford, B.; Srisuresh, P.; Kegel, D. Peer-to-Peer Communication across Network Address Translators. In Proceedings of the 2005 USENIX Annual Technical Conference, Anaheim, CA, USA, 10–15 April 2005; pp. 179–192.
19. Halkes, G.; Pouwelse, J. UDP NAT and Firewall Puncturing in the Wild. In Networking 2011, Proceedings of the 10th International IFIP TC 6 Conference on Networking, Valencia, Spain, 9–13 May 2011; pp. 1–12.
20. Tsui, K.M.; Dalphond, J.M.; Brooks, D.J.; Medvedev, M.S.; McCann, E.; Allspaw, J.; Kontak, D.; Yanco, H.A. Accessible Human-Robot Interaction for Telepresence Robots: A Case Study. Paladyn J. Behav. Robot. 2015, 6, 1–29. [CrossRef]
21. Pai, N.-S.; Hsieh, H.-H.; Lai, Y.-C. Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory. Sensors 2012, 12, 13947–13963. [CrossRef] [PubMed]
22. Clotet, E.; Martínez, D.; Moreno, J.; Tresanchez, M.; Palacín, J. Collision Avoidance System with Deceleration Control Applied to an Assistant Personal Robot. Trends Pract. Appl. Agents Multi-Agent Syst. Sustain. 2015, 372, 227–228.
23. Almasri, M.; Elleithy, K.; Alajlan, A. Sensor Fusion Based Model for Collision Free Mobile Robot Navigation. Sensors 2016, 16. [CrossRef] [PubMed]
24. Tsui, K.M.; McCann, E.; McHugh, A.; Medvedev, M.; Yanco, H.A. Towards designing telepresence robot navigation for people with disabilities. Int. J. Intell. Comput. Cybern. 2014, 7, 307–344.
25. Wandosell, J.M.H.; Graf, B. Non-Holonomic Navigation System of a Walking-Aid Robot. In Proceedings of the 11th IEEE International Workshop on Robot and Human Interactive Communication, Berlin, Germany, 25–27 September 2002; pp. 518–523.
26. Shi, F.; Cao, Q.; Leng, C.; Tan, H. Based on Force Sensing-Controlled Human-Machine Interaction System for Walking Assistant Robot. In Proceedings of the 8th World Congress on Intelligent Control and Automation, Jinan, China, 7–9 July 2010; pp. 6528–6533.
27. Pollack, M.E.; Brown, L.; Colbry, D.; McCarthy, C.E.; Orosz, C.; Peintner, B.; Ramakrishnan, S.; Tsamardinos, I. Autominder: An intelligent cognitive orthotic system for people with memory impairment. Robot. Auton. Syst. 2003, 44, 273–282. [CrossRef]
28. De Benedictis, R.; Cesta, A.; Coraci, L.; Cortellessa, G.; Orlandini, A. Adaptive Reminders in an Ambient Assisted Living Environment. Ambient Assist. Living Biosyst. Biorobot. 2015, 11, 219–230.
29. Martinez, D.; Teixidó, M.; Font, D.; Moreno, J.; Tresanchez, M.; Marco, S.; Palacín, J. Ambient Intelligence Application Based on Environmental Measurements Performed with an Assistant Mobile Robot. Sensors 2014, 14, 6045–6055. [CrossRef] [PubMed]
30. Martinez, D.; Moreno, J.; Tresanchez, M.; Clotet, E.; Jiménez-Soto, J.M.; Magrans, R.; Pardo, A.; Marco, S.; Palacín, J. Measuring Gas Concentration and Wind Intensity in a Turbulent Wind Tunnel with a Mobile Robot. J. Sens. 2016, 2016. [CrossRef]

© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).