Robot Skin

Draft notes: Did this stuff get in? Between the Nanoscribe, Protolaser U3, and current 3D printing projects, there's plenty we can do with rapid integration of polymers and microfluidics for soft complex systems. CGA TODO: Facilities Summary. The Project Summary consists of an overview, a statement on the intellectual merit of the proposed activity, and a statement on the broader impacts of the proposed activity, with 2 to 6 sets of keywords at the end of the overview (Keywords: xxx; yyy at end of overview). Science center stuff. Fix Dickey word stuff.

The introduction is almost done, so edit this seriously.

Our goal is to develop robot skin that is actually used by real robots. Skin is an integrated system that forms the mechanical interface with the world. It should be rich in sensors and support actuation. We take inspiration from human skin, where many types of sensors are embedded in a mechanical structure that maximizes their performance. There are several new elements of our approach:

• We will create and deploy many types of sensors, including embedded accelerometers, gyros, temperature sensors, vibration sensors, sound sensors, optical sensors sensing nearby objects, and optical sensors tracking skin and object velocity and movement. Previous skin and tactile sensing projects typically focused on one or only a few types of sensors.

• We will optimize the skin mechanics for manipulation and tactile perception. When the needs of manipulation and tactile perception conflict or are unclear, we will focus on optimizing performance on a set of benchmark tasks. Previous tactile sensing projects often place a tactile sensor on bare metal fingers, with little consideration of skin and tissue mechanics.

• We will explore a relatively thick soft skin, and consider soft tissue surrounding internal structure (bones) with a relatively human-scale ratio of soft tissue to bone volume, or structures that are completely soft (no bones/rigid elements).

• We will explore a wide variety of surface textures including arbitrary ridge patterns (fingerprints), hairs, posts, pyramids, and cones. These patterns may vary across the skin to provide a variety of contact affordances.

• We will explore superhuman sensing. For example, we will create vision systems (eyes) that look outward from the skin for a whole-body vision system. We will use optical tracking to estimate slipping and object velocity relative to the skin. We will explore embedding ultrasound transducers in the skin to image into soft materials in contact, such as parts of the human body.

• We will explore deliberately creating air and liquid (sweat) flows (both inward and outward) for better sensing (measuring variables such as pressure, conductivity, and temperature) and for controlling adhesion. We will explore humidifying the air for better airflow sensing, contact management, adhesion control, and ultrasound sensing.

• We will develop materials to make the skin rugged, and methods to either easily replace or repair damage.

• We will define a set of benchmark tasks to guide design and evaluation of our and others' work. The tasks include exploring and manipulating rigid and articulated (jointed) objects, and deformable objects: wire bending, paper folding, screen (2D surface) bending, and working with clay (kneading, sculpting with fingers and tools, and using a potter's wheel). The system we construct will recognize, select, and manipulate objects among a set of objects (finding keys in your pocket, for example). Our most difficult set of benchmarks will be mockups of tasks often found in caring for humans: wiping, combing hair, dressing, moving in bed, lifting, transfer, and changing adult diapers.

• We will explore a range of perceptual approaches, including object tracking based on contact types, forces, and distances; feature-based object recognition using features such as texture, stiffness, damping, and plasticity; feature-based event recognition using spatial and temporal multimodal features such as the frequency content of vibration sensors; and multimodal signature-based event recognition (a sketch of one such signature classifier follows this list).

• We will explore behavior and control based on explicit object trajectories and force control, discriminant- or predicate-based policies, and matching learned sensory templates.
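To make the signature-based event recognition idea concrete, here is a minimal, illustrative sketch for a single vibration channel: summarize each window by its energy in a few frequency bands, then match against learned event templates. The sample rate, band edges, window length, and template values are assumptions for illustration, not design commitments.

```python
# Minimal sketch of signature-based event recognition from one vibration
# sensor. FS and BANDS are illustrative assumptions, not final choices.
import numpy as np

FS = 1000.0  # assumed sample rate (Hz)
BANDS = [(0, 50), (50, 150), (150, 300), (300, 500)]  # assumed bands (Hz)

def band_energies(window: np.ndarray) -> np.ndarray:
    """Summarize a vibration window by its energy in each frequency band."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window)))) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    feats = [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in BANDS]
    return np.log1p(np.asarray(feats))  # log compression stabilizes scale

def classify(window: np.ndarray, templates: dict) -> str:
    """Match the window's signature to the nearest learned event template."""
    f = band_energies(window)
    return min(templates, key=lambda name: np.linalg.norm(f - templates[name]))

# Usage: templates would be learned offline, e.g. mean band energies of
# labeled contact events; the values below are made up for illustration.
templates = {"tap": np.array([2.0, 5.0, 3.0, 1.0]),
             "slip": np.array([1.0, 2.0, 4.0, 5.0])}
event = classify(np.random.randn(256), templates)
```

The same template-matching scheme extends to multimodal signatures by concatenating features from several sensor types into one vector.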
1 Research Plan

This section is almost done except for the Generation 3 skin section, which needs to reflect the Proposed Research writeup.

Our research plan has two thrusts: A: developing skin and skin sensors, and B: developing and evaluating perception, reasoning, and control algorithms that allow the skin to do desired tasks. We have a three-step plan for developing skin and skin sensors:

A.1 Generation 1: Use off-the-shelf sensors embedded in optical-grade silicone (near infrared, NIR). This skin includes optical sensing of marker movement to measure strain in all three directions (a toy sketch of marker-based strain estimation appears after this list). We will use range-finding sensors to sense objects at up to a 0.5 m distance. We will use embedded IMUs, which include an accelerometer, gyro, magnetometer, and temperature sensor. We will embed piezoelectric material to capture high-frequency vibration, and pressure sensors. We will embed induction loops to sense electric fields, and explore imposing local or global (possibly time-varying) electric fields to resolve orientation. We will glue hairs or whiskers to piezoelectric crystals to provide mechanical sensing at a (short) distance.

A.2 Generation 2: Integrate current benchtop prototypes into Generation 1 skin. Hyperelastic sensing elements will be composed of soft silicone elastomer embedded with microfluidic channels of nontoxic liquid metal alloy, e.g., eutectic gallium-indium (EGaIn). Strain, shear deformation, and/or applied surface pressure cause predictable changes in resistance or capacitance of the embedded liquid metal sensing elements (an idealized resistance-to-strain sketch also appears after this list). EGaIn can also be used for capacitive touch measurements. Likewise, we can pattern soft, conductive laminate films to create arrays of capacitive touch pixels (for example, graphene pastes or films separated by elastomer). Conductive elastomers will be used to interface liquid EGaIn sensors and circuits with a flexible PC board populated with rigid microelectronics, including a microcontroller, battery, and off-the-shelf (OTS) RF transmitter or transceiver.

A.3 Generation 3: Develop completely new skin sensing technology using novel concepts, components, and materials. For example, we will explore artificial hair cells with hairs and whiskers attached to five or six axes of force and motion sensing. Instead of liquid metal, these integrated sensors will be composed of insulating and conductive elastomers for detecting deformation through changes in electrical capacitance. They will be produced with UV laser micromachining or additive manufacturing through customized 3D printing or 2-photon polymerization. As before, the robot skin will interface with a flexible printed circuit containing a microcontroller and power. For Generation 3, the antenna in the transmitter/transceiver will be replaced with a soft, elastically deformable antenna integrated into the skin itself.
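To make the A.1 marker idea concrete, here is a toy sketch of recovering engineering strain from one tracked marker pair; a grid of such pairs, tracked in 3-D, yields strain components in all three directions. The tracker itself and the coordinates below are hypothetical, for illustration only.

```python
# Toy sketch: local strain from tracked marker positions in the skin,
# assuming a camera already reports 3-D marker coordinates (hypothetical).
import numpy as np

def engineering_strain(rest: np.ndarray, deformed: np.ndarray) -> float:
    """Strain of the segment between two markers: (d - d0) / d0."""
    d0 = np.linalg.norm(rest[1] - rest[0])
    d = np.linalg.norm(deformed[1] - deformed[0])
    return (d - d0) / d0

# Usage: markers 10 mm apart at rest, 11 mm apart under load -> 10% strain.
rest = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])      # mm
deformed = np.array([[0.0, 0.0, 0.0], [11.0, 0.0, 0.0]])  # mm
print(engineering_strain(rest, deformed))  # -> 0.1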
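The resistance change in A.2 follows from geometry alone, to first order: an incompressible liquid metal channel stretched by a factor (1 + strain) gets longer while its cross section shrinks by the same factor, so R = rho * L / A grows as R0 * (1 + strain)^2. A minimal sketch of this idealized model, with illustrative values rather than measured properties of our skin:

```python
# Idealized resistance-to-strain model for an EGaIn microchannel under
# uniaxial strain at constant volume. Values are illustrative assumptions.
import math

def resistance(r0_ohms: float, strain: float) -> float:
    """Predicted channel resistance: length scales by (1+strain), area by its inverse."""
    stretch = 1.0 + strain
    return r0_ohms * stretch ** 2

def strain_from_resistance(r0_ohms: float, r_ohms: float) -> float:
    """Invert the model: recover strain from a resistance reading."""
    return math.sqrt(r_ohms / r0_ohms) - 1.0

# Usage: a 2.0 ohm trace reading 2.42 ohms implies about 10% strain.
print(strain_from_resistance(2.0, resistance(2.0, 0.10)))  # -> ~0.10
```

Real channels deviate from this model at large strain, so we would calibrate against measured stress-strain data rather than rely on the ideal curve.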
We need a unified discussion of the use of imposed and inherent magnetic and electrical fields. At some abstract level the next two paragraphs are doing similar things. I can imagine imposing an AC electric field and using inductive sensors. Move the details of all of this to the proposed work section?

Co-PI Onal is studying the use of Hall effect sensing ICs to detect the deformation of soft materials using an embedded miniature magnet located at a precise location with respect to the Hall element. Our preliminary results have demonstrated accurate and high-bandwidth curvature measurements for bending elements. We propose to extend this work to develop distributed 6-D force/moment measurements using an array of multiple magnet-Hall pairs.
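As a first-order picture of a single magnet-Hall pair, the sketch below assumes the Hall element sits on the magnet's axis, where the on-axis dipole field B = mu0 * m / (2 * pi * r^3) applies; the magnetic moment and separations are illustrative assumptions, not calibrated values from our prototypes.

```python
# Sketch of one magnet-Hall pair under the on-axis dipole approximation.
# MOMENT and the separations below are illustrative assumptions.
import math

MU0 = 4e-7 * math.pi      # vacuum permeability (T*m/A)
MOMENT = 0.01             # assumed magnetic dipole moment (A*m^2)

def field_at(r_m: float) -> float:
    """On-axis flux density (tesla) of a dipole at distance r (meters)."""
    return MU0 * MOMENT / (2.0 * math.pi * r_m ** 3)

def distance_from_field(b_tesla: float) -> float:
    """Invert the dipole model to recover magnet-sensor separation."""
    return (MU0 * MOMENT / (2.0 * math.pi * b_tesla)) ** (1.0 / 3.0)

# Usage: deformation is the change in separation from the rest pose.
rest = 0.005                # assumed 5 mm rest separation
b = field_at(0.004)         # field after the skin compresses by 1 mm
print(rest - distance_from_field(b))  # -> ~0.001 m of compression
```

The strong 1/r^3 falloff is what makes each pair a sensitive local gauge, and an array of such pairs over-determines the 6-D force/moment estimate.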
For touch and force sensing we will explore distributing nanoparticles in a gel. When pressed, the particles form a local percolated network that changes the local conductivity of the gel.

In biology, touch generates an action potential that signals to the brain. We propose to use soft materials that generate a small potential when deformed. For example, droplets of EGaIn form an oxide skin that protects the underlying metal; when pressed, this skin cracks and exposes the metal. This could, in principle, be harnessed to generate a potential. I may even have some preliminary results to dig up. We can use hydrogels in place of EGaIn for touch sensing because the gels are transparent.

We plan to develop the "off the shelf" skin in Year 1, the EGaIn microchannel-based skin in Year 2, and the third-generation skin in Year 3. Skin sensing technology needs to remain functional under extreme deformation, and the skin with embedded sensors must remain mechanically suitable for the desired tasks.

The development and evaluation of perception, reasoning, and control algorithms for our three skin systems will use a series of test setups:

B.1 Evaluate a patch of skin on the laboratory bench.
B.2 Evaluate the skin on a simple hand.
B.3 Evaluate the skin on our current lightweight arm and hand.
B.4 Evaluate the skin on our Sarcos Primus humanoid (whole-body sensing).

2 Why This Matters

This section needs to be edited down to be more concise and actually flow.

We are motivated by the need to build better assistive robots and environments for people with disabilities, older adults trying to live independently, and people with strokes, ALS, and spinal cord and traumatic brain injury. These people need help. It is a tremendous burden on caregivers (typically a spouse) to provide 24/7 care, and many people would rather have a machine change their diapers than a stranger. Here are several ways better robot skin is on the critical path:

[Figure 1. Testbeds. Left: lightweight arm and hand. Right: SARCOS Primus hydraulic humanoid.]

• After decades of research, robot hands are nowhere close to human levels of performance. One huge problem is terrible sensing and perception. Better robot skin would make a huge difference.