Haptic-Enabled Mixed Reality System for Mixed-Initiative Remote Robot

Yuan Tian, Lianjun Li, Andrea Fumagalli, Member, IEEE, Yonas Tadesse, Member, IEEE, and Balakrishnan Prabhakaran, Member, IEEE

arXiv:2102.03521v2 [cs.RO] 7 Jun 2021

Abstract—Robots assist in many areas that are considered unsafe for humans to operate in. For instance, in handling pandemic diseases such as the recent Covid-19 outbreak and earlier outbreaks like Ebola, robots can reach areas dangerous for humans and perform simple tasks such as picking up the correct medicine (among a set of prescribed bottles) and delivering it to patients. In such cases, it may not be advisable to rely on fully autonomous operation of robots. Since many mobile robots are fully functional for low-level tasks such as grabbing and moving, we consider mixed-initiative control, where the user can guide the robot remotely to finish such tasks. In this paper, we propose a novel haptic-enabled mixed reality system that provides haptic interfaces to interact with the virtualized environment and give remote guidance to mobile robots for high-level tasks. The system testbed includes a local site with a mobile robot equipped with an RGBD sensor and a remote site with a user operating a haptic device. A 3D virtualized real-world static scene is generated using real-time dense mapping. The user can use a haptic device to "touch" the scene, mark the scene, add virtual fixtures, and perform physics simulation. The experimental results show the effectiveness and flexibility of the proposed haptic-enabled mixed reality system.

Index Terms—Haptic Rendering, Dense Mapping, Haptic Guidance, Mixed Reality, Mixed-initiative Control

Fig. 1: The system setup in our proposed work. Left is the local site: a KUKA youBot with a Kinect V2 placed on top. Right is the remote site, where the user interacts with the 3D reconstructed scene and guides the robot using a haptic device.

I. INTRODUCTION

Networked mixed reality has become popular for many kinds of applications, such as distributed collaboration [1], training [2], and video streaming [3]. Such a networked mixed reality system can merge the real and virtual worlds to produce new environments where physical and virtual objects interact with each other in real time. Many researchers have applied mixed reality to robot teleoperation [4], [5], human-robot interaction control [6], [7], [8], and mixed-initiative control [9], [10]. Among these robot controls, mixed-initiative control is a hot topic that has drawn much attention. [11], [12], [13] introduced different levels of autonomy: full autonomy (robot-initiative), mixed-initiative, and teleoperation (human-initiative). In practice, many control systems are designed with a "sliding autonomy" capability, meaning the system supports seamless transfer between different levels of control. Mixed-initiative control gives the robot high-level commands instead of teleoperation, as shown in Figure 1. It is particularly important for improving situational awareness, decreasing the workload of the human operator, and at the same time guaranteeing safe operation.

Some previous methods have introduced haptic feedback and mixed reality for mixed-initiative control [9], [10]; most of them applied the haptic device as a multidimensional teleoperation controller or used haptic guidance forces and environment forces as feedback for robot motions. In this paper, we consider how 3D haptic interaction in mixed reality can be expanded to help mixed-initiative robot control. Haptic interaction with a 3D virtual environment is very popular in computer graphics applications for providing immersive experiences: a haptic avatar explores freely in a 3D world, interacts with 3D objects (push, touch), and feels the force feedback. Combining haptic interaction with mixed-initiative control provides more flexibility in the control.

Several high-quality sensors can provide mapping and localization information for mobile robots, such as Light Detection and Ranging (LIDAR) and RGBD cameras. The sensor data can generate a virtualized real-world scene with dense geometry [14]. Users can use a mouse cursor, joystick, or any other input device to operate the virtualized objects in a mixed reality environment to realize some goals [15]. Introducing haptic interaction into the mixed reality environment adds more flexibility to these operations. Haptic devices provide more degrees of freedom for cursor motions and provide force feedback for more immersive experiences. Using haptic devices, users can remotely "touch" and mark the virtualized environment from the streaming data [16]. Furthermore, the haptic interface can be integrated with the physics simulation of objects, so that virtualized objects can be moved in the scene. These operations can provide more flexible control and guidance for the mobile robot, since the robot also uses dense mapping for localization and navigation [17].

In this paper, we assume the robots have full functionality for low-level tasks such as grabbing and moving based on the input 3D objects and positions. To build such a haptic-enabled mixed reality system for mixed-initiative remote control, there are several challenges:
(i) The first challenge is real-time streaming of virtualized data over a network that is susceptible to data loss and delays.
(ii) The second challenge comes from the requirement of haptic rendering with the 3D virtualized environment, which needs to be robust, efficient, and smooth.
(iii) Object segmentation from the 3D scene is needed. The segmented object will be the input for robot grabbing.
(iv) Furthermore, network latency will delay the guidance commands from the server and hurt the haptic interactions, which might lead to a disparity between the goal motions and the real-world motions of the robot.

To address these challenges, we propose a novel haptic-enabled mixed reality system for mixed-initiative remote control. The system provides a haptic interface to interact with the virtualized environment and gives remote guidance to mobile robots for high-level tasks. The system includes a local site with a mobile robot equipped with an RGBD sensor and a remote site with a user operating a haptic device. A 3D virtualized real-world static scene is generated using real-time dense mapping. The user can use a haptic device to remotely "touch" the scene, mark the scene, add virtual fixtures, and perform physics simulation. Specifically, the technical contributions of our method are as follows:
• A real-time, efficient, and robust mixed reality platform for mixed-initiative control is proposed to enable haptic interactions with streaming data.
• A TSDF-based haptic rendering method with the streaming surface is proposed to ensure smooth and robust haptic interaction with a virtualized static scene.
• A superpixel-enhanced [...]

II. RELATED WORK

[...] remote robot teleoperation visual interface. Son et al. [20] investigated three haptic cues for bilateral teleoperation of multiple mobile robots, and found that force cue feedback best supports maneuverability. In [14], dense geometry and appearance data were used to generate a photorealistic synthetic exterior line-of-sight view of the robot, including the context of its surrounding terrain. This technique converted remote teleoperation into a line-of-sight remote control with the capacity to remove latency. Chouiten et al. [18] proposed a distributed mixed reality system that implemented real-time display of a digital video stream to web users by mixing 3D entities with 2D live videos from a teleoperated ROV. Some methods introduced haptic feedback into mixed reality control platforms. A mixed reality system [9] was developed with GUI interfaces and force feedback, including path guidance forces, collision-preventing forces, and environmental forces, to improve the performance of high-level task operations. Later, Cacace et al. [10] proposed a mixed-initiative control system that adds a human loop to control the velocity of aerial service vehicles, with force feedback used to enhance the control experience. Different from these methods, our system introduces haptic interaction with 3D scenes into mixed reality, which brings more flexibility. Lee et al. [21] proposed a visual guidance algorithm that dynamically manipulates the virtual scene to compensate for the spatial discrepancies in a haptic augmented virtuality system. In [19], an interface based on Microsoft HoloLens is proposed, which can display the map, path, control commands, and other information related to the remote mobile robot, and also provides interactive ways to control the robot.

Recently, many robots have been equipped with depth sensors for localization and mapping [22], [17], [23]. KinectFusion [24], [25] is one of the most popular methods; it fuses the streaming RGBD data from the Kinect camera and saves it as a Truncated Signed Distance Function (TSDF). KinectFusion can provide the full-scene dense geometry needed to enable mixed reality. [16] first introduced real-time haptic rendering with a streaming deformable surface generated by KinectFusion. Besides the simulation of surface deformation, this method [...]

Yuan Tian is with OPPO Research Center, Palo Alto, CA 94303, USA. Email: [email protected]
Lianjun Li, Dr. Andrea Fumagalli, Dr. Yonas Tadesse, and Dr. Balakrishnan Prabhakaran are with the University of Texas at Dallas, Richardson, TX 75080. Email: lianjun.li1, andreaf, yonas.tadesse, [email protected]
