<<

IEEE Robotics and Automation Letters (RAL) paper presented at the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 25-29, 2020, Las Vegas, NV, USA (Virtual). Copyright ©2020 IEEE.

Lio - A Personal Assistant for Human-Robot Interaction and Care Applications

Justinas Mišeikis, Pietro Caroni, Patricia Duchamp, Alina Gasser, Rastislav Marko, Nelija Mišeikienė, Frederik Zwilling, Charles de Castelbajac, Lucas Eicher, Michael Früh, Hansruedi Früh

Authors are with F&P Robotics AG, Zürich, Switzerland, [email protected], www.fp-robotics.com

Abstract— Lio is a platform with a multi-functional arm explicitly designed for human-robot interaction and personal care assistant tasks. The robot has already been deployed in several health care facilities, where it is functioning autonomously, assisting staff and patients on an everyday basis. Lio is intrinsically safe by having full coverage in soft artificial-leather material as well as collision detection, limited speed and forces. Furthermore, the robot has a compliant motion controller. A combination of visual, audio, laser, ultrasound and mechanical sensors is used for safe navigation and environment understanding. The ROS-enabled setup allows researchers to access raw sensor data as well as have direct control of the robot. The friendly appearance of Lio has resulted in the robot being well accepted by health care staff and patients. Fully autonomous operation is made possible by a flexible decision engine, autonomous navigation and automatic recharging. Combined with time-scheduled task triggers, this allows Lio to operate throughout the day, with a battery life of up to 8 hours and recharging during idle times. A combination of powerful computing units provides enough processing power to deploy artificial intelligence and deep learning-based solutions on-board the robot without the need to send any sensitive data to cloud services, guaranteeing compliance with privacy requirements. During the COVID-19 pandemic, Lio was rapidly adjusted to perform additional functionality like disinfection and remote elevated body temperature detection. It complies with ISO 13482 (Safety requirements for personal care robots), meaning it can be directly tested and deployed in care facilities.

I. INTRODUCTION

Recently, robots have been gaining popularity outside factory floors and entering unstructured environments such as homes, shops and hospitals. They range from small devices designed for internet-of-things (IoT) applications to larger physical robots which are capable of autonomously navigating in indoor and outdoor environments, sharing the workspace with people and even interacting with them.

Given the issue of an ageing population and the shortage of medical and nursing staff in many countries, this naturally leads to attempts to use robotics and automation to address the problem [1] [2]. For example, in Switzerland, the number of people over 80 years of age is expected to double from 2015 to 2040 [3]. This will result in nearly tripled nursing costs for the Swiss healthcare sector.

Furthermore, healthcare employees are experiencing severe working conditions due to stress, underpayment and overtime. For example, between 8% and 38% of health workers suffer physical violence at some point in their careers [4]. A possible staff shortage of 500'000 healthcare employees in Europe is estimated by the year 2030 [5].

Care robotics is not an entirely new field. There has been significant development in this direction, with robots spanning a number of categories. Starting with mobile robots with manipulation capabilities, one of the best known robots is Pepper by SoftBank Robotics. It was created for interaction and entertainment tasks and is capable of voice interaction with humans as well as face and mood recognition. In the healthcare sector, Pepper is used for interaction with dementia patients [6]. However, despite having two arms, Pepper is not designed to manipulate objects as a core functionality.

WALKER by UBTECH is a bipedal robot with a focus on walking capabilities, including going up and down stairs. It can manipulate objects; however, its exact capabilities and limitations have not been defined publicly [7].

The Care-O-bot 4 by Unity Robotics and Fraunhofer IPA is able to recognise faces, provide daily news and handle objects [8]. Despite the initial focus on the healthcare market, the existing applications are limited to pilot projects only, with current applications focused on shopping centers.

REEM-C and TIAGo by PAL Robotics are ROS-based research robot platforms which could be used in the healthcare sector. They have been used in a number of research projects; however, no permanent healthcare deployment is known [9].

One more healthcare-oriented robot is Moxi by Diligent Robotics. Compared to the previously described robots, the focus of Moxi is not on interacting with people but rather on hospital logistics. The robot can deliver medical samples, carry laundry, bring supplies and greet patients [10]. It uses a mobile platform, an arm, a gripper and an integrated lift to adjust its height. Moxi is mostly capable of completing tasks in an end-to-end manner. Door opening and closing procedures are not in the range of its capabilities; it relies on automatic door systems or places packages outside the room.

The RIBA robot by RIKEN is specifically designed to assist hospital staff with carrying patients who are unable to walk. The robot is capable of localising a voice source and lifting patients weighing up to 80 kg. It is operated using remote control and does not have extended functionality or an autonomy mode [11].

The other category comprises mobile robots without manipulation capabilities and social robots. The Sanbot Elf robot by QIHAN is a mobile robot for monitoring health conditions of patients and residents. It can indicate unusual conditions, detect fallen people and send warnings; however, it is not capable of directly assisting people in these situations [12].

Many platforms exist that focus on interaction with people without extended mobility or manipulation capabilities, such as PARO, KIKI, AIBO, Buddy, Kuri, Mykie and Cozmo. While such robots often achieve their goal of creating positive emotions, or even therapeutic progress, they are incapable of assisting with or solving tasks.

The COVID-19 crisis has led to an increase in robots applicable to the health care sector. Most of them are not able to manipulate objects; they are optimised for autonomous driving and delivery of objects. Robots like Peanut from Keenon Robotics Co., the SAITE Robot or the NOAH Robot have been deployed to assist with it. For disinfection purposes, UVD robots capable of navigating to and disinfecting a whole room have gained popularity [13].

In this paper, a personal robot platform, Lio, shown in Fig. 1, is presented. The robot is specifically designed for autonomous operation in health care facilities and home care by taking into account the limitations of other robot platforms and improving upon them. Lio is intrinsically safe for human-robot interaction. It is the next-iteration robot platform built upon the experience of creating P-Care, a personal robot developed for the Chinese market [14].

First, the system is described from the hardware point of view, explaining the functionality and capabilities of the robot. Then software, interfaces and existing algorithms are presented, followed by the challenges of operating a robot in hospital environments. Finally, existing use cases and deployments of Lio are presented, incorporating an evaluation and the current limitations of the system. The paper concludes with an overview of Lio in the context of personal care and research applications, followed by current developments and future work.

II. SYSTEM DESCRIPTION

Lio, a personal assistant robot for care applications, can complete a wide variety of complex tasks. It can handle and manipulate objects for applications in health care institutions and home environments. Furthermore, it operates autonomously in an existing environment without requiring significant adaptations for its deployment. The design of Lio evolved in an iterative process. During the deployments, observations and interactions with staff and patients were taken into account for further development.

A set of heterogeneous sensors is embedded in the robot. It provides Lio with the capabilities of navigating, understanding and interacting with the environment and people. A combination of the in-house developed robot control and programming software myP together with algorithms based on the Robot Operating System (ROS) introduces node-based system functionality. It enables the communication between all the modules and an easy overview, control and interchangeability of all the software modules. For custom development, full access to the sensor data and control of the robot is provided.

Fig. 1. Overview of Lio: sensor and hardware setup. Labelled components: soft fingers with sensors to grasp objects; 6 DoF arm with soft skin made of artificial leather; camera for person and object recognition; microphone; embedded computing units; holders for placing items and custom attachments; display to show the state and information; LED strip for direction indication; ultrasonic sensors; fisheye camera; LiDAR scanners; speakers; floor sensors; Intel RealSense cameras.
A. Robot Hardware Overview

Lio consists of a robot arm placed on top of a mobile platform. The robot was designed with the workspace of the robot arm in mind. This enables Lio to combine arm and mobile platform movements to grasp objects from the ground, reach table-tops and be at a comfortable height for interaction with people in wheelchairs. It was a necessary measure to ensure that the robot is not too tall, in order not to be intimidating to people who are sitting.

Padded artificial-leather covers ensure that Lio will not cause injuries in the event of a collision, but they also provide a significantly more cosy and friendly appearance compared to hard-shell robots. The majority of the body has a soft feel to it, which helps with the acceptance of the robot.

In any health care environment, data privacy is of the utmost importance. Lio was specifically designed to have enough processing power to run complex algorithms, including deep learning-based ones, entirely locally, without the need for cloud computing or transferring data outside of the robot. Lio has four embedded computing units: an Intel NUC, an Nvidia Jetson AGX Xavier, a Raspberry Pi and an embedded PC with an Atom processor.

Each of the computing units is responsible for running separate modules, with constant communication over an internal secure Ethernet network. Communication is based on ROS topics, meaning that additional modules can be integrated into the system by using standard ROS communication protocols and can make use of the sensor data.
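Because inter-module communication runs over standard ROS topics, an external module can consume Lio's sensor streams with an ordinary ROS node. The minimal sketch below illustrates this; the topic names (/lio/scan_merged, /lio/camera/front/color/image_raw) are assumptions for illustration only and are not specified in the paper, so the actual names would need to be checked on the robot (e.g. with rostopic list).

#!/usr/bin/env python
# Minimal ROS node subscribing to Lio sensor streams.
# Topic names are placeholders; check `rostopic list` on the robot for the real ones.
import rospy
from sensor_msgs.msg import LaserScan, Image


def on_scan(scan):
    # LaserScan from the merged 360 degree LiDAR view (assumed topic).
    rospy.loginfo("scan: %d ranges, min %.2f m", len(scan.ranges), min(scan.ranges))


def on_image(img):
    # Colour image from one of the front RealSense cameras (assumed topic).
    rospy.loginfo("image: %dx%d, encoding %s", img.width, img.height, img.encoding)


if __name__ == "__main__":
    rospy.init_node("lio_sensor_listener")
    rospy.Subscriber("/lio/scan_merged", LaserScan, on_scan, queue_size=1)
    rospy.Subscriber("/lio/camera/front/color/image_raw", Image, on_image, queue_size=1)
    rospy.spin()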
B. Mobile Platform and Navigation

The mobile platform allows Lio to safely navigate in the environment and plan movement trajectories even in cluttered places. The mobile platform is 790 x 580 mm in size, as shown in Fig. 2. Therefore, it is suitable for operation in any environment prepared for wheelchairs. The mobile platform uses a differential drive system with two caster wheels.

Fig. 2. Technical drawing of Lio (main dimensions in mm).

For safe navigation, different sensors are embedded in the mobile platform. Two LiDARs are placed behind the bumpers in the front and the back of the mobile platform for coverage around the robot. Their measurements are merged into a 360° scan. Additionally, distance sensors are distributed around the circumference of the mobile platform for redundant proximity detection of obstacles. In the scenario of Lio bumping into an obstacle, four mechanical bumpers ensure that the collision is detected and the robot will stop before any damage is caused. Four downward-facing infrared floor sensors were added to detect stairs or any edges. The robot moves at speeds of up to 1 m/s, ensuring a short stopping distance when needed. To provide visual information, two Intel RealSense D435 depth cameras are embedded in the front. One camera faces downwards to detect the floor plane, while the other one faces up for higher obstacle detection in front of the robot. The fields of view (FoV) of both cameras overlap and cover the frontal view. In addition to that, a wide-angle fisheye camera overviews the full frontal view with over 170° FoV.

The map of the environment is created from the data of the two LiDARs. Mapping uses gmapping from the OpenSLAM package, and the probabilistic localisation system is based on the adaptive Monte Carlo method [15] [16]. Both of these methods were specifically adapted for Lio. On the map of the facility, there is the possibility to add a layer of virtual obstacles to prevent the robot from entering certain areas. Lio can locate the charging station and navigate to it autonomously when running low on battery.

Custom positions can be saved on the map, such as specific waypoints or general areas like the kitchen, patient rooms and nursing stations. All the sensor information is used to ensure safe navigation through the facilities for Lio. Furthermore, the LED strip placed around the mobile platform indicates the current driving direction and state of the robot, allowing people to understand the intentions of Lio.
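The localisation described above is an adaptation of the adaptive Monte Carlo method; the exact implementation on Lio is not published. As a rough illustration of the underlying particle-filter idea only, the sketch below shows one generic predict, weight and resample cycle with a stand-in measurement model (the dummy map, noise values and particle count are invented).

import numpy as np

# Illustrative Monte Carlo localisation step: not Lio's implementation,
# just the generic predict / weight / resample cycle behind AMCL-style localisation.

rng = np.random.default_rng(0)
N = 500
particles = rng.uniform([0, 0, -np.pi], [10, 10, np.pi], size=(N, 3))  # x, y, theta


def predict(particles, d, dtheta, noise=(0.02, 0.01)):
    """Propagate particles with odometry (d metres forward, dtheta rad) plus noise."""
    theta = particles[:, 2] + dtheta + rng.normal(0, noise[1], N)
    x = particles[:, 0] + (d + rng.normal(0, noise[0], N)) * np.cos(theta)
    y = particles[:, 1] + (d + rng.normal(0, noise[0], N)) * np.sin(theta)
    return np.column_stack([x, y, theta])


def weight(particles, measured_range, expected_range_fn, sigma=0.2):
    """Weight each particle by how well a (stand-in) range measurement fits it."""
    expected = expected_range_fn(particles)          # a real filter would ray-cast into the map
    w = np.exp(-0.5 * ((measured_range - expected) / sigma) ** 2)
    return w / w.sum()


def resample(particles, w):
    """Systematic resampling: keep particles proportionally to their weights."""
    positions = (np.arange(N) + rng.random()) / N
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), N - 1)
    return particles[idx]


# One filter cycle with a dummy map model (flat 5 m expected range everywhere).
particles = predict(particles, d=0.1, dtheta=0.05)
w = weight(particles, measured_range=4.8, expected_range_fn=lambda p: np.full(len(p), 5.0))
particles = resample(particles, w)
print("pose estimate:", particles.mean(axis=0))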
C. Robot Arm

The robot arm placed on top of the mobile platform is a P-Rob 3. It is a six-degree-of-freedom (DoF) collaborative robot arm designed for human-robot interaction tasks with a maximum payload of 3-5 kg. Optical pseudo-absolute encoders simplify the calibration task upon startup. The calibration can be executed from any position, takes under 3 seconds, and each joint is moved by a maximum of 5°.

A trajectory planner differentiates between joint space and tool space paths. An extensive kinematics module handles singularities and all possible robot configurations. Both analytical and numerical calculations with various positional and directional constraints can be executed. High-frequency calculations are made possible by having the kinematics module as a standalone C++ library, which can also be imported into external projects.

For increased safety, the Motion Control Module (MCM) includes a compliant position control mode. A feed-forward proportional-derivative (PD) controller with adjustable gain and stiffness creates a soft motion behaviour. The mode is particularly useful when interacting with people as well as upon contact with rigid real-world objects such as doors. Compliant control is also used as an input option for human-robot interaction tasks. For example, users can push the head of the robot to initiate actions or provide a positive answer, while pushing the head side-to-side indicates a negative answer. Safety is ensured by limited joint accelerations and velocities, as well as limited forces and collision detection.
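The concrete control law of the compliant mode is not given in the paper; a generic per-joint form that is consistent with the "feed-forward PD with adjustable gain and stiffness" description is sketched below. All symbols and gain values are illustrative only.

# Illustrative per-joint compliant PD law (not Lio's actual controller):
# low stiffness lets an external push create tracking error instead of high torque.

def compliant_pd_torque(q_des, q, qd_des, qd, tau_ff, kp, kd):
    """Feed-forward PD: tau = tau_ff + kp * (q_des - q) + kd * (qd_des - qd).

    q_des, q   -- desired and measured joint position [rad]
    qd_des, qd -- desired and measured joint velocity [rad/s]
    tau_ff     -- feed-forward torque, e.g. gravity compensation [Nm]
    kp, kd     -- stiffness and damping gains; lowering kp makes the joint softer
    """
    return tau_ff + kp * (q_des - q) + kd * (qd_des - qd)


# Example: the same 0.1 rad deflection (a person pushing the head) with stiff vs. compliant gains.
err = 0.1
print("stiff torque:    ", compliant_pd_torque(0.0, -err, 0.0, 0.0, tau_ff=1.2, kp=200.0, kd=10.0))
print("compliant torque:", compliant_pd_torque(0.0, -err, 0.0, 0.0, tau_ff=1.2, kp=20.0, kd=2.0))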

P-Rob 3 is specifically designed to have easily interchangeable end-effectors to allow task-specific manipulation. A custom-made mechanical interface requires only one screw to lock or release the connection. Integrated electrical pins pass through electrical signals as well as 24 V power, which can then be adjusted using converters if needed. Additionally, a gigabit Ethernet cable is embedded in the robot arm. It allows establishing a connection between devices in the gripper and the internal computers of Lio. For additional input options, an interactive ring with several programmable buttons can be mounted at the gripper.

D. Gripper

Lio comes with a customised P-Grip gripper, which has a friendly appearance and detachable magnetic eyes. The gripper has two custom-designed fingers with soft covers for grasping, lifting and manipulating objects of up to 130 mm.

One proximity sensor is embedded at the base of the gripper, and each gripper finger has four infrared-based proximity sensors. They are used to detect whether an object is inside the gripper and to measure the distance to objects above and below the gripper fingers. This can be used for collision avoidance, surface and object detection, and for interactive tasks like handshakes and entertainment. Covering the finger sensors by hand can also be used to confirm specific actions. The gripper fingers can be controlled using position and force controllers. Lio is also capable of learning objects, using the sensor readings combined with the finger positions to classify them later on during the grasping motion.

The gripper fingers are easily interchangeable. Four digital inputs and four digital outputs, together with two 24 VDC power outputs, are available for the sensors. Furthermore, there is an interchangeable camera module attached below the gripper fingers. It provides a video feed at 30 frames per second (FPS). Also, Force Sensitive Resistor (FSR) sensors are placed below the artificial leather of the gripper to detect physical presses on the head.

E. Perception and Behaviour Algorithms

Lio utilises its sensors to enable behaviours used for assistance in everyday operations in health care institutions as well as for research purposes. Each of the algorithms can be enabled or disabled depending on the use case and overridden if needed. From the development perspective, this allows projects to reuse what is needed for the main operation of the robot while focusing on development or improvement in specific areas.

Lio is equipped with a variety of sensors to get information about the environment. The developed algorithms make use of them by fusing the necessary information to improve certain tasks such as grasping. To assist people by grasping an object from the floor, the following information is used (a minimal fusion sketch is given below):
• The object detection for classification of the object and its location in the camera image
• The 3D pointcloud of the RealSense camera for object localisation in space
• The mobile platform's position for knowledge about the position of the platform in the environment

By fusing all this information, Lio is able to accurately drive to the target and move the robotic arm to grasp it. To improve the success rate of grasping, information from the proximity sensors in the gripper fingers is used to fine-tune the grasping point and confirm a successful grasp.
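The actual fusion pipeline is not detailed in the paper. A minimal geometric sketch of the idea, back-projecting a detected object pixel into 3D with depth camera intrinsics and expressing it in the map frame using the platform pose, could look as follows; the intrinsics, frame conventions and poses are placeholders, not Lio's calibration.

import numpy as np

# Illustrative fusion of object detection, depth and platform pose for a floor grasp.
# Intrinsics, poses and frames are placeholders; the rotation between the camera optical
# frame and the platform frame is omitted for brevity.

FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0  # assumed pinhole intrinsics of the depth camera


def pixel_to_camera(u, v, depth_m):
    """Back-project a pixel (u, v) with measured depth into the camera frame."""
    x = (u - CX) / FX * depth_m
    y = (v - CY) / FY * depth_m
    return np.array([x, y, depth_m, 1.0])


def pose2d_to_T(x, y, yaw, z=0.0):
    """Homogeneous transform for a planar pose (e.g. the mobile platform on the floor)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0, x],
                     [s,  c, 0, y],
                     [0,  0, 1, z],
                     [0,  0, 0, 1.0]])


# Detector output: bounding-box centre of a bottle at 1.1 m depth (made-up numbers).
p_cam = pixel_to_camera(u=350, v=300, depth_m=1.1)
T_map_base = pose2d_to_T(x=4.2, y=1.5, yaw=0.3)           # platform pose from localisation
T_base_cam = pose2d_to_T(x=0.25, y=0.0, yaw=0.0, z=0.35)  # assumed camera mounting offset
target = (T_map_base @ T_base_cam @ p_cam)[:3]
print("grasp target in map frame [m]:", np.round(target, 2))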
Currently, special markers need to be placed next to certain objects, for example door handles, for their localisation in unstructured environments during the door opening task. However, the rest of the task, such as observing whether the door is open, calculating the trajectory and confirming that the opening action succeeded, is handled by fusing visual and LiDAR data.

Additionally, several AI algorithms are deployed and fully integrated on Lio. These include:
• Human Pose Estimation
• Object Detection and Recognition
• Object Grasping
• Face Detection and Recognition
• Door Opening and Closing
• Voice Recognition and Synthesis

F. Control Software and User Interfaces

P-Rob 3 is controlled by a C++ based low-level architecture called the Motion Control Module (MCM) and an optional Python-based high-level architecture, myP. The MCM allows immediate position and current control with a 100 Hz update cycle via a TCP socket and includes all important safety features. There are also Python and C++ APIs for the MCM. myP provides a robot pose and script database, path planning and machine learning features, which can be accessed through the robot's user-friendly browser interface. The browser interface also includes a Python IDE for application development. Control of myP is possible from any device using multiple interfaces such as TCP, ROS and Modbus (PLC). Vice versa, the robot can also control other machines via the same interfaces; a hedged example of such an external TCP client is sketched at the end of this subsection.

Different access levels can be set up on myP. If a multi-access configuration is needed, each user works according to the permissions granted. Any scripts developed in the Python IDE can be copied between robots and maintained using version control. The robot can be updated remotely for fixing bugs, updating functionality and adding new features.

The status of the robot can be monitored in real time on multiple levels. This allows easy reconfiguration of Lio not only before operation but also during runtime. For non-technical users, there are multiple simple interfaces with clearly visualised information. The Home Interface provides high-level functions like launching specific scripts, switching the autonomy mode on and off, and remote control. For more detailed control, the Nursing Interface is used to schedule tasks and monitor the system status in an easy-to-understand manner. It provides the following information:
• Location of the robot
• Battery level
• Current action and calendar
• History of actions and logs
• Real-time feeds of the cameras
• Patient data and teaching-in of face recognition
• Remote control and manual steering of the robot

Additional sensor status information, together with the script development and testing environment, is available on myP, which is specifically designed for advanced, technical users and developers. The connection between ROS and myP is established using messages and services, providing full control of the robot from both interfaces. The Node Monitor provides a summary of all the ROS nodes running on the robot, together with the ability to start, stop and restart each node individually and to dynamically adjust the parameters of the running processes. All of the interfaces can be accessed from any browser-enabled device. Additionally, all the standard ROS tools and interfaces fully function on Lio for more in-depth debugging and development. Examples of myP, the Nursing Interface, the Home Interface and the Node Monitor are shown in Fig. 3.

Fig. 3. Available software interfaces for Lio: myP, Nursing Interface, Home Interface and Node Monitor. All of the interfaces are web-based and can be accessed from any browser-enabled device.

For interaction purposes, a non-touch display is embedded in the front of the mobile platform. Use case studies have shown that voice and touch interactions with the robot arm are preferred over a touch screen interface [17]. Mounting touch-screen devices low on the platform could potentially be a hazard due to bending down and the risk of falling. The display shows the status of the robot, displays text during voice interaction and shows custom-designed visualisations that might be beneficial during the operation of the robot. Visualisation on the screen is fully customisable through the scripts. Additionally, robot camera feeds can also be visualised on the screen. Embedded loudspeakers, together with a multi-directional microphone, enable Lio to interact using voice and sound. Lio can understand voice commands as well as generate speech from text or play music.

For safety purposes, an emergency button is located on the back of the robot platform and is easily reachable in case the robot has to be stopped suddenly. When pressed, the robot arm and the mobile platform are released, making them easy to move by hand. Upon release of the emergency button, Lio returns to the normal operation mode.
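As mentioned above, myP can be controlled over TCP (as well as ROS and Modbus), but the wire format is not described in this paper. The snippet below therefore only shows the general shape of such an external client; the hostname, port and JSON command format are invented for illustration and are not the myP protocol, which must be taken from the F&P documentation.

import json
import socket

# Hypothetical TCP client for a robot command interface.
# Host, port and message format are made up for illustration and are NOT the myP protocol.

ROBOT_HOST = "lio.local"   # placeholder hostname
ROBOT_PORT = 9000          # placeholder port


def send_command(command, **params):
    """Send one JSON-encoded command and return the (assumed) JSON reply."""
    msg = json.dumps({"cmd": command, "params": params}).encode() + b"\n"
    with socket.create_connection((ROBOT_HOST, ROBOT_PORT), timeout=5.0) as sock:
        sock.sendall(msg)
        reply = sock.makefile().readline()
    return json.loads(reply)


if __name__ == "__main__":
    # Example: trigger a named script, the way a facility-specific routine might be started.
    print(send_command("run_script", name="bring_water", room="2.14"))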
G. Adaptations for COVID-19

In the face of the COVID-19 pandemic, additional functionality was introduced to assist health care professionals in these extenuating circumstances. In contrast to other COVID-19 specific robots, Lio carries out this functionality alongside its regular routines in health care institutions. Autonomous item delivery to staff and patients has already proven to be beneficial, as it can be carried out in a contact-less manner.

Room disinfection is an important measure to prevent the spread of the virus, especially in hospitals. Lio was adapted to carry out disinfection using an approved UV-C light capable of effectively killing exposed germs, bacteria and viruses on surfaces. The development focus was targeted at the disinfection of frequently touched surfaces like door handles, light switches, elevator buttons and handrails. A custom holder for the UV-C light was designed for Lio so that it can be carried on the back of the robot. At the current stage, ArUco markers are placed next to the items to be disinfected. During the disinfection routine, Lio drives to dedicated locations on the map, locates the marker, grasps the UV-C light and then places it over the object to be disinfected. During the operation, Lio shows warnings on the screen and the LED light strip, and gives a visual and verbal warning if a person is detected close to the robot. Exposure is limited to the object of interest by the covered design of the holder.

One indicator of possibly infected people is Elevated Body Temperature (EBT). In institutions where visitors are allowed, a common practice during the pandemic is to take body temperature measurements of people entering the building. To assist with that, remote measurement of passers-by was developed. The system is based on a thermal camera placed on the gripper, which is coupled with the colour gripper camera. Faces detected in the colour image are mapped to the thermal camera image. The point at the tear glands on the inside of the eye is located, as it has been proven to be a stable point with a temperature close to the actual body temperature [18]. For persons wearing non-transparent glasses, a second stable point, the maximum temperature in the area of the forehead, is detected. EBT detection is done by comparing the relative temperature differences between healthy people, who are recorded during the calibration process, and the temperature detected for the passer-by. If EBT is detected with high confidence, a member of staff is notified to take a manual control measurement of the particular person with an approved medical thermometer.
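The EBT screening thus compares the passer-by's reference point against temperatures of healthy people recorded during calibration rather than against an absolute threshold. A compact version of that relative check is sketched below; the calibration values, margins and confidence rule are invented, not Lio's parameters.

import statistics

# Illustrative relative Elevated Body Temperature (EBT) check.
# Calibration readings and decision margins are made-up values.

calibration_readings_c = [34.1, 34.3, 33.9, 34.2, 34.0]  # inner-eye readings of healthy people


def is_ebt(measured_c, calibration, margin_c=1.0, min_confidence_margin_c=0.3):
    """Flag EBT when the reading exceeds the healthy baseline by more than `margin_c`.

    Returns (flagged, high_confidence): a reading only just past the threshold gets a low
    confidence flag, so staff is only notified for clear cases.
    """
    baseline = statistics.mean(calibration)
    excess = measured_c - (baseline + margin_c)
    return excess > 0.0, max(0.0, excess) >= min_confidence_margin_c


flagged, high_confidence = is_ebt(35.6, calibration_readings_c)
if flagged and high_confidence:
    print("Notify staff: manual measurement with an approved thermometer required.")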
H. Error Handling

Achieving reliable performance of a complex system like Lio in an unpredictable environment, such as an elderly care home, requires sophisticated error handling. Error handling on Lio is divided into three main areas. Low-level and hardware-related error handling is provided by myP for cases like arm collisions, activation of an emergency stop and sensor failure. For arm collisions, multiple resolution options are available: pausing and resuming the movement, or stopping the movement so that the skill fails, followed by the next high-level skill resolving the issue. Software components for sensing and perception tasks are monitored and, if necessary, restarted by operating system services and, in the case of ROS nodes, by the Node Monitor. Higher-level, skill-specific errors are handled in the behavioural scripts. To explain the concept, consider the example use case of bringing a bottle of water to a person. If the path is blocked while navigating to the person, the robot will actively ask a person for help. After the bottle is grasped from the inventory, a sanity check is performed by verifying the angle of the gripper fingers in order to confirm that the object was grasped. Upon failure, either the same motion is tried again, or an alternative inventory slot is used to grasp the object. If a person fails to take an object from Lio during the object handover, the robot detects this and places the object back into the inventory.

III. ROBOT OPERATION

Lio is an autonomous robot proactively performing tasks. On a high level, this proactive autonomy is controlled by the decision engine system, which gathers information about the robot status and the environment to choose the best-suited action to perform at a given moment [19].

Operation of Lio is ensured by a combination of perception, memory and planning components. Perception of the environment ranges from low-level sensory inputs up to high-level information delivered by AI algorithms. This real-time feed, together with introspective information and the robot memory, serves as the information pool for the decision engine, the core component ensuring the proactive autonomous behaviour of the robot. It is a high-level component deciding which behaviour should be executed in a particular situation.

The decision engine uses the SWI-Prolog logic programming language to model relations in the environment of Lio [20]. In addition to pre-defined common-sense rules, users can define reasoning mechanisms tailored to their application. All information available to the robot is evaluated by the set of decision rules to produce the most suitable action proposal at the given moment. This proposal is compared with inputs from manual user commands and scheduled actions from the calendar. Based on a priority system, the most important action is selected for execution. This priority selection is also able to interrupt running skills to start more important ones.
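The real decision engine is rule-based and written in SWI-Prolog [20]; the priority arbitration between rule-based proposals, manual commands and calendar entries, including interruption of a running skill, can be pictured with the following simplified stand-in. Action names, priorities and the arbitration rule are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional

# Simplified stand-in for the priority arbitration described in Section III.
# Priorities and interruption policy are invented for illustration.


@dataclass
class Action:
    name: str
    priority: int          # higher value wins
    interruptible: bool = True


def select_action(rule_proposal: Optional[Action],
                  manual_command: Optional[Action],
                  scheduled_action: Optional[Action],
                  running: Optional[Action]) -> Optional[Action]:
    """Pick the highest-priority candidate; interrupt the running skill only if beaten."""
    candidates = [a for a in (rule_proposal, manual_command, scheduled_action) if a]
    if not candidates:
        return running
    best = max(candidates, key=lambda a: a.priority)
    if running and (not running.interruptible or running.priority >= best.priority):
        return running
    return best


# Example: a manual command from the Nursing Interface interrupts an entertainment skill.
chosen = select_action(rule_proposal=Action("return_to_charger", priority=2),
                       manual_command=Action("bring_water", priority=5),
                       scheduled_action=None,
                       running=Action("tell_story", priority=1))
print("execute:", chosen.name)   # -> bring_water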
Commonly, the deployment of Lio in a facility is connected with a set of project-specific routines. For example, in some institutions, Lio distributes mail around the clinic and collects blood samples to be delivered for analysis. A routine may require special equipment, material or tools, and Lio can have custom holders for such tasks. The robot requests the staff to place a specific holder before starting the routine.

A. Health Care Institution Requirements

Given that the target deployment area is health care institutions, followed by home environments in the future, it is critical to comply with their requirements. The main concern when discussing robot deployment in health care institutions is data protection. Hospitals, rehabilitation clinics and elderly homes do have concerns about having sensitive information in the institution. With several cameras on the robot, it has to be ensured that no unwanted observations take place. This includes remote monitoring of the sensor data and storing or sending out any visual information.

To comply with the privacy requirements, Lio was designed to have all the visual and navigation data processed by the on-board computers. If any information needs to be stored, for example face recognition training data, it is encoded in a way that the original images cannot be recovered. At the moment, only voice data is processed off-board, but a data protection contract is signed with the companies providing these services. Furthermore, it is possible to set up a secure Virtual Private Network (VPN) connection to access and support Lio remotely.

For full-feature operation, Lio needs a secure WiFi connection, which is typically present in many health care institutions. Lio connects to existing networks, so the IT department of the institution sets the security measures. The robot can operate without a wireless connection, but access to the interfaces will then not be possible.

Another common request is to prevent the robot from driving to restricted areas. Allowed and no-go areas can be defined after creating a floor map of the institution and adjusted as needed later on. It is also possible to define these areas for different times of the day, depending on the task Lio is executing; a minimal sketch of such time-dependent rules is given below.

Integration with existing hospital systems, like phones or the nursing call system, is also requested. However, the existing standards depend on the systems used, and it can be challenging to have a universal solution. Given that the system used has an existing API, integration over TCP or a scripting language can be done individually for each institution.
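One minimal way to model the time- and task-dependent no-go areas mentioned above, layered on top of the facility map, is sketched here; the zone names, time windows and task exceptions are examples, not a deployed Lio configuration.

from datetime import time

# Illustrative time- and task-dependent no-go zones on top of the facility map.
# Zones, windows and task exceptions are invented examples.

NO_GO_RULES = [
    # (zone, start, end, tasks that are still allowed inside the window)
    ("patient_rooms", time(22, 0), time(6, 0), {"night_round"}),
    ("dining_hall",   time(11, 0), time(13, 0), set()),
]


def in_window(now, start, end):
    """True if `now` falls in the window; handles windows that wrap past midnight."""
    return start <= now < end if start <= end else (now >= start or now < end)


def zone_allowed(zone, task, now):
    for rule_zone, start, end, allowed_tasks in NO_GO_RULES:
        if zone == rule_zone and in_window(now, start, end) and task not in allowed_tasks:
            return False
    return True


print(zone_allowed("patient_rooms", "deliver_mail", time(23, 30)))   # False
print(zone_allowed("patient_rooms", "night_round", time(23, 30)))    # True
print(zone_allowed("dining_hall", "deliver_mail", time(9, 0)))       # True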
B. Lio Deployment

Deployment of Lio in health care institutions consists of several steps, including preparation. First, the facility is analysed by indicating the areas of operation for the robot, evaluating the navigation and traversability capabilities and recording the areas, usually by taking photos of the floors, hallways, doors, rooms and handles that the robot will need to operate. Then, the specific tasks of the robot are discussed and decided upon with the client. The developer adapts the existing functionalities and develops new functions. The client prepares the WiFi network for Lio according to their privacy requirements, making sure the robot can connect and has good coverage along the operational routes.

Once Lio is delivered, the developer installs the robot. One of the first steps is to create a map of the facility and to find a designated charging spot for the robot. The training of the staff is done in two steps. The first is to introduce the staff members of the facility to the robot and explain its capabilities and limitations. Then, a selected number of employees are given more extensive training on how to operate the robot, restart it and manually charge it if needed, as well as a full introduction to the Nursing and Home Interfaces for controlling the robot. It is common to have a small "Welcome Lio" event involving residents and patients of the facility. From then on, Lio starts working autonomously in the facility.

Lio deployment in research institutions follows similar steps. However, depending on the project goals of the lab acquiring the robot, the robot is adapted to its needs and the training plan is tailored according to the development and use plans. For labs focusing on in-depth algorithm development, a thorough introduction to myP, scripting and ROS integration is provided.

IV. EXPERIMENTS AND USE CASES

Lio robots have been deployed in seven different health care institutions in Germany and Switzerland. Some robots have already been operating for over a year. The robots have daily routines, such as collecting lab samples from different wards and bringing them to the pick-up point at the reception, and then delivering the mail to those same wards in a rehabilitation clinic. In the remaining time, Lios entertain patients and visitors with stories of former patients or encourage them to join in on a few simple exercises. Robots also remind residents about their scheduled activities by knocking on their door, opening it, announcing information about upcoming events and closing the door when leaving. Staff can control Lio using a tablet to schedule reminders and start high-level commands like playing music and offering snacks. Robots are also capable of driving to the rooms of the residents and taking menu orders. This information is then sent to the catering service.

In one of the projects, Lio was deployed in the home of a person with paraplegia for several weeks. The robot allowed the woman to have more independence by assisting with her daily tasks, like taking and opening a bottle using a built-in automatic bottle opener and handing it over. She could manually control Lio over her smartphone to assist with more complex tasks such as taking off her jacket.

A. Usability Evaluation

Following the guidance document for medical devices by the FDA, the development of the behaviours of Lio was guided by human factors and usability engineering [21]. Contextual inquiries, interviews and formative evaluations were conducted to evaluate all the existing functions [22] [14] [23]. Following simulated testing in the lab, usability tests were conducted with the actual end-users in their everyday environment.

In a user study on co-robots, it was determined that the introduction of an innovative device such as a robot in health care leads to unexpected challenges [24]. One finding was that the head, which is sometimes used to interact with Lio, was not always reachable for a person in a wheelchair due to the distance. For safety reasons, the head of Lio was programmed not to extend beyond the base of the mobile platform, which impeded interaction with the robot [22].

Operational data was collected in one of the rehabilitation clinics where Lio was deployed. On a monthly basis, Lio drove on average 16.8 km for delivery tasks and an additional 25.2 km during entertainment tasks. Entertainment functions were triggered 96 times per month on average. The success rate of the autonomous daily delivery function was 85.5% in the period from February 1 to May 8, 2020. In total, there were 186 planned deliveries transporting mail and blood samples around the clinic. The most common reasons for failure were device failures of low-level sensors, charging issues and network connection problems. There were also errors such as a pressed emergency stop not being detected.

Generally, elderly people and health care staff are very curious and open towards the robot. People are polite to Lio even though they are aware that it is a machine [23]. Multiple usability tests and qualitative interview studies have also been conducted to improve the interaction patterns. Supporting findings from other home trials [25] [26] [27], patients expected the robot to have a certain personality. During an ethical evaluation of the robot's deployment, an elderly person described Lio as a play companion. He consciously engaged in a game with Lio multiple times, giving the impression that it is a sentient being. The robot often gets patted on the head by passers-by.

On the interaction side, additional studies were done on the topic of robot speech. Despite female voices being perceived as more friendly according to the studies, Lio is programmed to speak using a male voice [28], as a deeper male voice was easier for elderly people to understand. If the robot speaks in a realistic human voice, users are more likely to reply to the robot using voice over other types of inputs [23]. Typically, the pronunciation and conversations used by Lio are simplified, which in turn resulted in a higher response rate.

Fig. 4. Patients and staff interacting with Lio in a rehabilitation clinic in Switzerland.

B. Current Limitations

New challenges are constantly observed during the deployments of Lio in real-life situations. Even though combined voice and tactile inputs have proven to be a suitable option, in certain situations it is difficult for Lio to reliably understand some people with disabilities as well as some elderly people. Also, navigation in cluttered environments and in narrower places not adapted for wheelchairs can be challenging.

Furthermore, a depth camera is not currently present in the gripper, making precise picking of objects placed higher up, for example on a tabletop, more complicated compared to picking from the floor. For some specific manipulation actions, like door opening, markers are still needed. Language understanding is currently limited to keyword matching and still relies on online services; thus constant listening is not used due to privacy concerns.

Currently, Lio does not proactively approach people for interaction. Because of this, a reduction of interest in the robot was observed among some people as time passes. At the moment, additional functionality is gradually introduced as it is developed during the lifetime of the robot to keep people interacting with Lio and discovering new behaviours.
V. CONCLUSIONS AND FUTURE WORK

Lio is an all-in-one platform suitable for human-robot interaction and personal care assistance tasks. Its design has evolved, both in hardware and in software, to address the limitations of similar platforms and to ensure that the requirements of health care institutions, as well as home care, are met. Lio is intrinsically safe due to the padded artificial-leather covers covering the majority of the robot, limited forces and speed, the soft mode, and advanced navigation and behaviour algorithms. The robot complies with the ISO 13482 standard, the safety requirements for personal care robots. Lio has a robotic arm placed on a moving platform and is capable of both assisting and interacting with people.

Lio has proven itself during deployments in health care institutions and has received positive feedback regarding functionality as well as acceptance by patients, residents and medical staff. Currently, the majority of the robots were sold as products, while the remaining deployments are part of on-going projects. The autonomous operation capability ensures that Lio can be easily deployed with just a brief training time for staff on how the robot operates and how it should be used.

What differentiates Lio from other service robots are the manipulation capabilities of the large arm and gripper, which allow opening doors, grasping, bringing and handing over water bottles, and handling tools for UV-C disinfection. Combined with the large and customisable inventory space, this allows Lio to carry and operate a large variety of tools.

All the basic functionalities, as well as some advanced navigation, perception and AI algorithms, are deployed on the robot. Algorithms can be easily enabled and disabled depending on the project needs, making it easy for researchers and developers to use existing algorithms and focus on their field of expertise for improvements.

Having multiple user interfaces makes Lio usable by both tech-savvy and inexperienced users and allows them to easily observe the status of the robot and to schedule and adjust its behaviour according to their needs. With the possibility of TCP communication, integration can be implemented with external systems.
Based on customer feedback, Lio is constantly being updated, both in terms of hardware and software. The development is focused on improving upon identified issues and increasing the level of autonomy. Multi-floor mapping and elevator use are planned, as well as marker-less identification and localisation of objects like door handles, elevator buttons and light switches, to allow robot operation in larger areas of the facilities. Lio can manipulate a variety of objects and carry them in its inventory. The motion planner will be adapted to take items held in the gripper or on the back of the robot into consideration when calculating a collision-free path.

Another goal is to make Lio more proactive in terms of interaction. This means advanced behaviour models that allow the robot to find specific people around the facility and actively approach them, and scene understanding to determine whether certain expected actions occurred; for example, if a glass of water was handed to a person, ensuring that he or she drank the water to stay hydrated.

Additionally, hardware adjustments are planned to include an advanced 3D sensor in the gripper, as well as to update the appearance to convey emotional responses in user communication. Improvements in natural language processing are planned as well to enhance communication with wake-word recognition and chatbot capabilities.

To enhance robot behaviour development, improve testing and allow evaluating the capabilities of Lio before acquiring the robot, a full-feature realistic simulation is being developed with full support of ROS and ROS 2, including all the interfaces described in this paper.

ACKNOWLEDGMENT

The authors would like to acknowledge the hard work and dedication of all F&P Robotics AG employees and partners in developing and testing the personal assistant robot Lio.
REFERENCES

[1] P. Flandorfer, "Population ageing and socially assistive robots for elderly persons: the importance of sociodemographic factors for user acceptance," International Journal of Population Research, 2012.
[2] D. Oliver, C. Foot, and R. Humphries, Making our health and care systems fit for an ageing population. King's Fund London: UK, 2014.
[3] Credit Suisse, "Die Zukunft des Pflegeheimmarkts," Pfäffikon: Schellenberg Druck AG, 2015.
[4] World Health Organisation, "WHO — Violence against health workers," https://www.who.int/violence_injury_prevention/violence/workplace/en/, (Accessed on 05/13/2020).
[5] R. Heinz, M. Rolf, and U. Rainer, Themenreport "Pflege 2030": Was ist zu erwarten – was ist zu tun? Bertelsmann Stiftung, 2014.
[6] T. Ikeuchi, R. Sakurai, K. Furuta, Y. Kasahara, Y. Imamura, and S. Shinkai, "Utilizing social robot to reduce workload of healthcare professionals in psychiatric hospital: A preliminary study," Innovation in Aging, vol. 2, no. suppl 1, pp. 695–696, 2018.
[7] "UBTECH Shows Off Massive Upgrades to Walker - IEEE Spectrum," https://spectrum.ieee.org/automaton/robotics/humanoids/ubtech-upgrades-walker-humanoid-robot, (Accessed on 05/19/2020).
[8] R. Kittmann, T. Fröhlich, J. Schäfer, U. Reiser, F. Weißhardt, and A. Haug, "Let me introduce myself: I am Care-O-bot 4, a gentleman robot," in Mensch und Computer 2015 – Proceedings, 2015.
[9] S. Pfeiffer and C. Angulo, "Gesture learning and execution in a humanoid robot via dynamic movement primitives," Pattern Recognition Letters, vol. 67, pp. 100–107, 2015.
[10] E. Ackerman, "Moxi Prototype from Diligent Robotics Starts Helping Out in Hospitals," IEEE Spectrum, https://spectrum.ieee.org/automaton/robotics/industrial-robots/moxi-prototype-from-diligent-robotics-starts-helping-out-in-hospitals, 2018.
[11] M. Goeldner, C. Herstatt, and F. Tietze, "The emergence of care robotics—A patent and publication analysis," Technological Forecasting and Social Change, vol. 92, pp. 115–131, 2015.
[12] J. Bauer, L. Gründel, J. Seßner, M. Meiners, M. Lieret, T. Lechler, C. Konrad, and J. Franke, "Camera-based fall detection system with the Sanbot ELF," in Smart Public Building 2018 Conference Proceedings, Stuttgart, DE, 2018, pp. 15–28.
[13] "Autonomous Robots Are Helping Kill Coronavirus in Hospitals - IEEE Spectrum," https://spectrum.ieee.org/automaton/robotics/medical-robots/autonomous-robots-are-helping-kill-coronavirus-in-hospitals, (Accessed on 06/26/2020).
[14] M. Früh and A. Gasser, "Erfahrungen aus dem Einsatz von Pflegerobotern für Menschen im Alter," in Pflegeroboter. Springer, 2018, pp. 37–62.
[15] C. Stachniss, U. Frese, and G. Grisetti, "OpenSLAM," http://www.openslam.org, vol. 2, 2007.
[16] D. Fox, W. Burgard, F. Dellaert, and S. Thrun, "Monte Carlo localization: Efficient position estimation for mobile robots," AAAI/IAAI, vol. 1999, no. 343-349, pp. 2–2, 1999.
[17] D. Fischinger, P. Einramhof, K. Papoutsakis, W. Wohlkinger, P. Mayer, P. Panek, S. Hofmann, T. Koertner, A. Weiss, A. Argyros, et al., "Hobbit, a care robot supporting independent living at home: First prototype and lessons learned," Robotics and Autonomous Systems, vol. 75, pp. 60–78, 2016.
[18] "Thermal Imaging for Detecting Elevated Body Temperature — FLIR Systems," https://www.flir.com/discover/public-safety/thermal-imaging-for-detecting-elevated-body-temperature/, (Accessed on 05/05/2020).
[19] T. Niemueller, G. Lakemeyer, and A. Ferrein, "Incremental task-level reasoning in a competitive factory automation scenario," in 2013 AAAI Spring Symposium Series, 2013.
[20] J. Wielemaker, T. Schrijvers, M. Triska, and T. Lager, "SWI-Prolog," Theory and Practice of Logic Programming, vol. 12, no. 1-2, pp. 67–96, 2012.
[21] Food and Drug Administration et al., "Applying human factors and usability engineering to medical devices: Guidance for industry and Food and Drug Administration staff," Washington, DC: FDA, 2016.
[22] L. Wirth, J. Siebenmann, and A. Gasser, "Erfahrungen aus dem Einsatz von Assistenzrobotern für Menschen im Alter," in Mensch-Roboter-Kollaboration. Springer, 2020, pp. 257–279.
[23] A. Gasser, S. Steinemann, and K. Opwis, "A Qualitative View on Elders Interacting with a Health Care Robot with Bodily Movements," Master's thesis, Department of Psychology, University of Basel, 2017.
[24] O. Bendel, "Co-Robots as Care Robots," arXiv preprint arXiv:2004.04374, 2020.
[25] D. Fischinger, P. Einramhof, W. Wohlkinger, K. Papoutsakis, P. Mayer, P. Panek, T. Koertner, S. Hofmann, A. Argyros, M. Vincze, et al., "HOBBIT - The Mutual Care Robot," in RSJ International Conference on Intelligent Robots and Systems, Tokyo, November 2013.
[26] E. Broadbent, R. Stafford, and B. MacDonald, "Acceptance of healthcare robots for the older population: Review and future directions," International Journal of Social Robotics, vol. 1, no. 4, p. 319, 2009.
[27] D. S. Syrdal, K. Dautenhahn, M. L. Walters, and K. L. Koay, "Sharing Spaces with Robots in a Home Scenario – Anthropomorphic Attributions and their Effect on Proxemic Expectations and Evaluations in a Live HRI Trial," in AAAI Fall Symposium: AI in Eldercare: New Solutions to Old Problems, 2008, pp. 116–123.
[28] E. Broadbent, R. Tamagawa, A. Patience, B. Knock, N. Kerse, K. Day, and B. A. MacDonald, "Attitudes towards health-care robots in a retirement village," Australasian Journal on Ageing, vol. 31, no. 2, pp. 115–120, 2012.