
Active Clothing Material Perception Using Tactile Sensing and Deep Learning

The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters.

Citation: Yuan, Wenzhen et al. "Active Clothing Material Perception Using Tactile Sensing and Deep Learning." IEEE International Conference on Robotics and Automation (ICRA), May 2018, Brisbane, Australia. Institute of Electrical and Electronics Engineers, September 2018. © 2018 IEEE
As Published: http://dx.doi.org/10.1109/icra.2018.8461164
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Version: Original manuscript
Citable link: https://hdl.handle.net/1721.1/126688
Terms of Use: Creative Commons Attribution-NonCommercial-ShareAlike
Detailed Terms: http://creativecommons.org/licenses/by-nc-sa/4.0/

Active Clothing Material Perception using Tactile Sensing and Deep Learning

Wenzhen Yuan1, Yuchen Mo1,2, Shaoxiong Wang1, and Edward H. Adelson1

Abstract— Humans represent and discriminate objects in the same category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that can autonomously perceive object properties through touch. We work on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor and squeezes the clothes with a GelSight tactile sensor, then recognizes 11 properties of the clothing from the tactile data. These properties include physical properties, like thickness, fuzziness, softness and durability, and semantic properties, like wearing season and preferred washing methods. We collect a dataset of 153 varied pieces of clothes, and conduct 6616 robot exploring iterations on them.
To extract useful information from the high-dimensional sensory output, we apply Convolutional Neural Networks (CNNs) to the tactile data to recognize the clothing properties, and to the Kinect depth images to select exploration locations. Experiments show that, using the trained neural networks, the robot can autonomously explore unknown clothes and learn their properties. This work proposes a new framework for active tactile perception that combines vision and touch, and it has the potential to enable robots to help humans with varied clothing-related housework.

arXiv:1711.00574v2 [cs.RO] 25 Feb 2018

I. INTRODUCTION

A core requirement for intelligent robots is to understand the physical world, which includes understanding the properties of physical objects in the real-world environment. Among common objects, clothing is an important part. Humans evaluate an article of clothing largely according to its material properties, such as thick or thin, fuzzy or smooth, stretchable or not, etc. Understanding the clothes' properties helps us to better manage, maintain and wash them. If a robot is to assist humans in daily life, understanding those properties will enable it to better understand human life and assist with daily housework such as laundry sorting, clothes maintenance and organizing, or choosing clothes.

Fig. 1. (a) The robotic system that automatically perceives clothes in their natural environment. The system includes a robot arm, a gripper, a GelSight sensor mounted on the gripper, and a Kinect sensor. (b) The Fingertip GelSight sensor. (c) The gripper with the mounted GelSight sensor gripping the clothes. (d)-(f) The tactile images from GelSight when gripping with increasing force.

For perceiving material properties, tactile sensing is important. Lederman and Klatzky [1] and Tiest [2] demonstrated that humans use different exploratory procedures to sense different properties of objects, such as roughness or compliance. Researchers have been trying to make robots learn material properties through touch as well. Chu et al. [3] and Xu et al. [4] developed setups to perceive the properties of general objects using tactile sensors and a set of pre-set procedures, like squeezing and sliding. However, making robots explore refined object properties in the natural environment remains a big challenge, and discriminating the subtle differences between objects in the same category, such as clothing, is even more difficult. The challenge comes from two major sides: how to obtain adequate information from a tactile sensor, and how to generate an effective exploration procedure to obtain that information.

At the same time, clothing-related tasks have long been a research interest, with the major focus on both manipulation and recognition. Most of the related works use only vision as sensory input, which measures the clothes' global shapes. Therefore, clothing recognition is mostly restricted to a rough classification of clothing type. The perception of fine-grained clothing properties, or the study of common clothes with a wide variety, is still undeveloped.

In this paper, we design a robotic system that perceives the material properties of common clothes using autonomous tactile exploration procedures. The hardware setup of the system is shown in Figure 1(a). We address the two challenges of tactile exploration of object properties: to collect and interpret the high-resolution tactile data, and to generate exploration procedures for data collection. The tactile sensor we apply is a GelSight sensor [5, 6], which senses the high-resolution geometry and texture of the contact surface. A GelSight sensor uses a piece of soft elastomer as the contact medium and an embedded camera to capture the deformation of the elastomer. The exploration procedure is to squeeze a part of the clothes, mostly a wrinkle, and record a set of tactile images with GelSight (see Figure 1(c)-(f)). Then we train a Convolutional Neural Network (CNN) for multi-label classification to recognize the clothing properties. To generate exploration procedures autonomously, we use an external Kinect sensor to get the overall shapes of the clothes, especially the positions of the wrinkles, and train another CNN to pick preferable points on the wrinkles. The robot then follows the Kinect detection for effective exploration. We also make the exploration closed-loop: if the tactile data is not good, meaning the neural network cannot recognize the properties with high confidence, the robot re-explores the clothing at another location until it gets good tactile data and confident results.

1 Computer Science and Artificial Intelligence Laboratory (CSAIL), MIT, Cambridge, MA 02139, USA
2 Department of Computer Science and Technology, Tsinghua University, Beijing, 100084, China

Fig. 2. Examples of GelSight images when the robot squeezes clothes (color rescaled for display purposes). Different clothes produce different textures on GelSight images, as well as different overall shapes and folding shapes. The top left example shows the case when there is no clothing in the gripper.

The 11 properties we study are physical properties, including thickness, fuzziness, smoothness, softness, etc., and semantic properties that relate more to the use of the clothes, including wearing season, preferred washing method, and textile type. The semantic properties could help robots sort the clothes for multiple house-chore tasks. To make the system robust to a wide range of common clothes, we collect a dataset of 153 pieces of clothes for training, covering different clothing types, materials and sizes. Experimental results show that the system can recognize the clothing properties of both seen and unseen items, as well as detect effective locations for tactile exploration. The robot can use the system to collect fine-grained clothing properties with tactile sensing. The methodologies of this work will enable robots to understand common clothes better, and to assist humans with more housework such as washing laundry and clothing sorting.

II. RELATED WORK

A. Clothes classification

The robotic community has been interested in clothing-related topics for years, especially for home-assistant robotic tasks. The major focus has been clothing manipulation and recognition/classification. Research on clothing manipulation is mostly about grasping, folding and unfolding. For clothing recognition or classification, most of the research uses vision as sensory input and classifies clothes according to their rough types, such as pants, t-shirts, coats, etc. Willimon et al. [7], Li et al. [8], and Gabas et al. [9] introduced methods for clothing classification by matching the 2D or 3D shape of the clothing to a clothing dataset. Sun et al. [10] proposed a method to recognize clothing type from stereo vision, where they applied more local features, such as the clothing's wrinkle shapes and textures.

On multi-modal clothing perception, Kampouris et al. [11] proposed a robotic system to classify clothes' general types and materials. They used an RGBD camera to capture the global shape, a photometric stereo sensor to record surface texture, and a fingertip tactile sensor to measure the dynamic force when rubbing the clothing. They showed that multi-modal input, especially the texture perception from the photometric stereo sensor, largely improves the precision of material recognition. However, recognizing fine-grained clothing properties of common clothes remains a challenge.

B. Tactile sensors and GelSight

Tactile sensing is an important sensory modality for robots. In the past decades, different kinds of tactile
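The multi-label property recognition described in Section I scores each of the clothing properties independently, so one squeeze yields a value (and a probability) for every property at once. A minimal sketch of the decoding step is below; the property names, class values, and the min-over-properties confidence rule are illustrative assumptions, not the paper's exact label space or implementation:

```python
import math

# Hypothetical property set with discrete class values per property.
PROPERTY_CLASSES = {
    "thickness": ["thin", "medium", "thick"],
    "fuzziness": ["smooth", "slightly fuzzy", "fuzzy"],
    "season":    ["summer", "all-season", "winter"],
}

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def decode_properties(logits_per_property):
    """Map each property's raw CNN scores to (value, probability).

    logits_per_property: dict property -> list of raw scores, one per class.
    Overall confidence is the lowest per-property probability, so a single
    uncertain property can trigger re-exploration in the closed loop.
    """
    decoded, overall = {}, 1.0
    for prop, logits in logits_per_property.items():
        probs = softmax(logits)
        best = max(range(len(probs)), key=probs.__getitem__)
        decoded[prop] = (PROPERTY_CLASSES[prop][best], probs[best])
        overall = min(overall, probs[best])
    return decoded, overall
```

Treating the prediction as one softmax per property (rather than one flat label set) keeps the 11 outputs independent, which matches the multi-label framing in the paper.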
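The closed-loop exploration policy from Section I — squeeze, classify, and re-explore at another wrinkle whenever the network is not confident — can be sketched as follows. The `squeeze_and_capture` and `classify_properties` callables and the threshold value are hypothetical placeholders standing in for the robot, the GelSight capture, and the trained CNN; this is a sketch of the control flow, not the authors' implementation:

```python
# Minimal sketch of confidence-gated tactile exploration (hypothetical interfaces).
# The robot tries candidate wrinkle locations (ranked by the Kinect-based CNN)
# until the property classifier reports a confident result or the budget runs out.

def explore_until_confident(candidates, squeeze_and_capture, classify_properties,
                            threshold=0.8, max_attempts=5):
    """Return (properties, confidence) from the first confident squeeze.

    candidates          -- wrinkle locations, best first
    squeeze_and_capture -- location -> tactile image (robot + GelSight stub)
    classify_properties -- tactile image -> (property labels, confidence)
    """
    best = (None, 0.0)
    for location, _attempt in zip(candidates, range(max_attempts)):
        tactile_image = squeeze_and_capture(location)
        labels, confidence = classify_properties(tactile_image)
        if confidence >= threshold:
            return labels, confidence      # confident: stop exploring
        if confidence > best[1]:
            best = (labels, confidence)    # remember the best attempt so far
    return best                            # budget exhausted: best effort
```

Capping the number of attempts keeps the robot from squeezing indefinitely on a difficult garment while still returning its best available estimate.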