
Motion Capture: History, Technologies and Applications

Advanced Computing Center for the Arts and Design Ohio State University

Vita Berezina-Blackburn

©2003-2016, The Ohio State University

Motion Capture

• motion capture (mocap) is the sampling and recording of the motion of humans, animals and inanimate objects as 3D data for analysis, playback and remapping

• performance capture is acting with motion capture in film and games
• motion tracking is real-time processing of motion capture data

History of Motion Capture

• Eadweard Muybridge (1830-1904)

• Etienne-Jules Marey (1830-1904)

• Nikolai Bernstein (1896-1966)

• Harold Edgerton (1903-1990)

• Gunnar Johansson (1911-1998)

Eadweard Muybridge

• the flying horse
• 20,000 photos of animal and human locomotion
• UK-USA, 1872

© Kingston Museum

Eadweard Muybridge

• zoopraxiscope

© Kingston Museum

Etienne-Jules Marey

• first person to analyze human and animal motion with film
• created the chronophotographic gun and fixed-plate camera
• France, 1880s

Modern Art

• Futurism (Boccioni, Balla and others)

• Marcel Duchamp

Rotoscoping

• allowed animators to trace cartoon characters over photographed frames of live performances

• invented in 1915 by Max Fleischer

• Koko the Clown

• Snow White

© Walt Disney

Nikolai Bernstein

• General Biomechanics – 1924, Central Institute of Labor, Moscow

• physiology of sport and labor activities, foundations of ergonomics
• cyclography
• concepts of degrees of freedom and hierarchical structure of motion control

Harold Edgerton

• electronic stroboscope and flash
• exposures of 1/1,000th to 1/1,000,000th of a second
• MIT, 1930s-1960

© Palm Press Inc.

Gunnar Johansson

• visual perception of biological motion; experimental psychology, 1970s, University of Uppsala, Sweden
• retro-reflective patches on joints
• video recording instead of film; a search light mounted very close to the camera lens so that light reflects from the patches into the lens
• computer modeling of motion variations

1980s

• military and medical research purposes

• first computer graphics use in research labs

• first production uses
o Brilliance by Robert Abel, brute-force technique (1985 Super Bowl ad)
o Waldo C. Graphic (1988), PDI for the Jim Henson tour
o Mike the Talking Head (SIGGRAPH 88)
o Don't Touch Me (1989)

Mocap Technologies

ACTIVE
• electromechanical
• optical fiber
• optical: strobing LEDs
• acoustic
• optical markerless based on structured light

PASSIVE
• optical: retroreflective markers
• acoustic
• optical markerless (video based)
• inertial
• optical markerless based on video

Optical motion capture systems

• lightweight, variable-size, retroreflective markers

• VGA to 16-megapixel cameras with strobing LEDs digitize different views of the performance

• up to 5,000 fps

• under 1 mm accuracy

• marker occlusion

• capture volume limits

VICON, NATURALPOINT, MOTION ANALYSIS, QUALISYS

Strobing LED marker system

• red or infrared LEDs

• unique strobing frequency for each marker

• no marker swapping

• limited volume

• limited capture time due to LED battery life

• wires running up and down capture subject

PHASESPACE

Electromechanical suits

• linked structures
• potentiometers determine the degree of rotation for each link
• no occlusion
• no magnetic or electrical interference
• unlimited capture volume
• low cost

• no global translation (see the sketch below)
• restricted movement
• fixed configuration of sensors
• low sampling rate
• inaccurate joints

GYPSY MOCAP SYSTEM
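The potentiometer readings above only give joint rotations, so a pose has to be rebuilt by chaining the links. The minimal 2D forward-kinematics sketch below (Python; link lengths and angles are made up for illustration) shows why such a suit knows limb configuration but has no global translation of its own.

```python
import math

def forward_kinematics_2d(joint_angles_deg, link_lengths):
    """Chain 2D links: each potentiometer angle rotates the next link
    relative to its parent. Returns joint positions relative to the root
    (the suit itself measures no global translation)."""
    x, y, heading = 0.0, 0.0, 0.0   # root at the origin
    positions = [(x, y)]
    for angle_deg, length in zip(joint_angles_deg, link_lengths):
        heading += math.radians(angle_deg)   # accumulate relative rotations
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        positions.append((x, y))
    return positions

# Hypothetical two-link arm: 30 deg at the shoulder, 45 deg at the elbow
print(forward_kinematics_2d([30, 45], [0.30, 0.25]))
```

Every returned position is relative to the root, which is exactly the "no global translation" limitation listed above.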

Inertial systems

• inertial trackers placed on joints
• measures orientation and position with accelerometers, gyroscopes and magnetometers on each segment
• UWB RF for position tracking
• unlimited capture volume
• no occlusion, multiple subjects

• positional drift (see the sketch below)
• translational data needs to be collected separately
• battery packs and wires on the performer's body

XSENS MOCAP SYSTEM
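As a rough illustration of the drift problem noted above, the Python sketch below simulates one biased gyroscope axis (sample rate, bias and noise values are assumptions, not vendor figures): pure integration of angular rate drifts steadily, while blending in an accelerometer-derived tilt estimate with a simple complementary filter keeps the orientation bounded. Position would require a double integration of acceleration, which drifts far faster, which is why translational data is often collected separately.

```python
import random

DT = 1.0 / 100.0          # assumed 100 Hz sample rate
GYRO_BIAS = 0.5           # deg/s of uncorrected bias (assumption)
ALPHA = 0.98              # complementary-filter weight on the gyro path

def simulate(seconds=60, true_angle=0.0):
    integrated, fused = 0.0, 0.0
    for _ in range(int(seconds / DT)):
        gyro = GYRO_BIAS + random.gauss(0, 0.2)          # deg/s, subject is still
        accel_angle = true_angle + random.gauss(0, 1.0)  # noisy but drift-free tilt
        integrated += gyro * DT                          # pure integration drifts
        fused = ALPHA * (fused + gyro * DT) + (1 - ALPHA) * accel_angle
    return integrated, fused

drifted, corrected = simulate()
print(f"raw integration after 60 s: {drifted:.1f} deg, fused estimate: {corrected:.1f} deg")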

Electromagnetic systems

• electromagnetic sensors placed on joints or other critical points
• measures orientation and position of the sensor relative to an electromagnetic field generated by the transmitter
• no sight line requirements

• no occlusion, multiple subjects
• electromagnetic interference; small volume if body translation tracking is needed

ASCENSION-TECH, NORTHERN DIGITAL

Optical fiber system

• fiber-optic sensor

• bend and twist sensors measure transmitted light

• no occlusion

• flexible capture volume

• adjustment to individual proportions is limited

• less accurate data

CYBERGLOVE

Acoustic system

• a set of transducers/transceivers generates and evaluates a high-frequency sound wave (see the sketch below)

• other sounds in frequency range can disrupt capture

• accuracy is not as high as in other systems

INTERSENSE
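A minimal sketch of how an acoustic system of this kind can turn timing into position, assuming a fixed speed of sound and four hypothetical receiver locations: each measured time of flight gives a distance, and the distances are combined into a 3D position by linear least squares.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 C (assumption)

# Assumed fixed receiver positions in the capture volume (metres)
receivers = np.array([[0, 0, 0], [3, 0, 0], [0, 3, 0], [0, 0, 2.5]], dtype=float)

def locate(times_of_flight):
    """Linearize ||p - r_i||^2 = d_i^2 by subtracting the first equation,
    then solve the resulting linear system for the emitter position p."""
    d = SPEED_OF_SOUND * np.asarray(times_of_flight)
    r0, d0 = receivers[0], d[0]
    A = 2 * (receivers[1:] - r0)
    b = d0**2 - d[1:]**2 + np.sum(receivers[1:]**2, axis=1) - np.sum(r0**2)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Synthetic check: times of flight generated from a known emitter position
true_p = np.array([1.0, 1.5, 1.2])
tof = np.linalg.norm(receivers - true_p, axis=1) / SPEED_OF_SOUND
print(locate(tof))   # recovers approximately [1.0, 1.5, 1.2]
```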

Markerless Motion Capture

Full Body
o Max Planck Institute research (3D scanner + silhouette analysis from video)
o Captury

Markerless Motion Capture

Full Body

o Kinect and other RGB-D sensor development

ORGANIC MOTION

o ILM and ManhattanMocap Group's Multitrack System (markers for )

Markerless Motion Capture

Face

FACS/Paul Ekman

Video based:

Original R&D: Digital Emily Project
Faceware

Medusa (Disney Zurich)

RGB-D based:

Faceshift

Markerless Motion Capture

Hands

Leap Sensor

Video-based Motion Analysis

Research Areas
o equipment and subject calibration
o motion tracking
o 3D movement reconstruction (markerless motion capture)
o skeletal solving
o action recognition
o 3D surface reconstruction (surface scanning)

Challenges:
o complex environment variability
o body segmentation
o occlusion
o data volume

Typical Marker Based Optical Motion Capture Pipeline

• planning (performers and actions, props, space requirements)

• recording point data (Vicon Blade)

• data processing, real-time or post: standard skeletal solving (Vicon Blade, MotionBuilder, IKinema)

• skeleton creation (3D animation software)

• remapping standard skeletal motion to customized characters (MotionBuilder); see the retargeting sketch below

• binding skeleton to a model (3D animation software)
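To illustrate the remapping step in this pipeline (which MotionBuilder automates far more robustly), here is a naive retargeting sketch with hypothetical joint names and data: local joint rotations are copied across a name map, and root translation is scaled by the ratio of hip heights so a shorter character's feet still reach the floor.

```python
def retarget(source_frames, source_hip_height, target_hip_height, name_map):
    """Naive retarget: copy local joint rotations across a name map and
    scale root translation by the ratio of hip heights."""
    scale = target_hip_height / source_hip_height
    target_frames = []
    for frame in source_frames:
        out = {"root_translation": tuple(scale * c for c in frame["root_translation"]),
               "rotations": {}}
        for src_joint, euler_deg in frame["rotations"].items():
            tgt_joint = name_map.get(src_joint)
            if tgt_joint is not None:      # skip joints the target rig lacks
                out["rotations"][tgt_joint] = euler_deg
        target_frames.append(out)
    return target_frames

# Hypothetical one-frame clip and a joint-name map between two rigs
clip = [{"root_translation": (0.0, 0.95, 0.1),
         "rotations": {"LeftUpLeg": (10, 0, 5), "LeftLeg": (25, 0, 0)}}]
print(retarget(clip, source_hip_height=0.95, target_hip_height=0.60,
               name_map={"LeftUpLeg": "thigh_L", "LeftLeg": "shin_L"}))
```

Real retargeting also has to compensate for differing joint orientations, limb proportions and foot contacts, which is exactly why gross proportional differences are flagged during planning.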

Optical Marker Based 3D Motion Reconstruction

• Single camera
o model assumptions required

• Multiple cameras
o require at least 2 cameras, unique with 3 (see the triangulation sketch below)
o camera calibration

• Motion capture with markers
o use retroreflective markers to simplify video information
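The multi-camera point above comes down to triangulation. The minimal linear (DLT) triangulation sketch below uses made-up camera projection matrices and one synthetic marker to show how two calibrated views of the same 2D marker centroid combine into a single 3D point.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: each view contributes two rows of a
    homogeneous system A X = 0; the solution is the right singular vector
    with the smallest singular value."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # back from homogeneous coordinates

# Two toy calibrated cameras (assumed intrinsics and a 0.5 m baseline), one marker
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=float)
P1 = K @ np.hstack([np.eye(3), np.array([[0.0], [0.0], [0.0]])])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 3.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, uv1, uv2))   # approximately [0.2, 0.1, 3.0]
```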

Problems Related to Marker Occlusion

Skeletal Solving (remapping mocap data to a character model)

• how to make markers move a skeleton
o photo reference or 3D scan of a performer
o CG model
o MotionBuilder or IKinema
o Vicon Blade
o other methods

• problems with detecting joint centers…

• organization of joint hierarchies

Planning

• shot list
• performance space dimensions
• interactions in shot
• shots to be blended or looped
• length of shots
• size and location of props
• gross proportional differences for retargeting
• camera motion

Planning

• Character/Prop setup
o target skeleton/character topology
o ready stance considerations
o space preparation / occlusion removal / camera stability

• Marker setup
o marker redundancy
o three markers per segment (see the sketch after these lists)
o place markers close to the bone
o asymmetry
o recognizable configuration

• output format
• file naming conventions
• target software platform
• database management
• potential technical issues
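The "three markers per segment" guideline exists because three non-collinear points pin down a rigid segment's full position and orientation. The sketch below (hypothetical marker coordinates) builds an orthonormal segment frame from three markers, which is the basic building block a skeletal solver works with per body segment.

```python
import numpy as np

def segment_frame(m1, m2, m3):
    """Build a right-handed orthonormal frame from three non-collinear
    markers on one rigid segment: origin at m1, X along m1->m2, Z normal
    to the marker plane. Returns a 4x4 segment-to-world transform."""
    m1, m2, m3 = map(np.asarray, (m1, m2, m3))
    x = m2 - m1
    x = x / np.linalg.norm(x)
    z = np.cross(x, m3 - m1)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)           # completes the right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, m1
    return T

# Hypothetical thigh markers (metres) from one captured frame
print(segment_frame([0.1, 0.9, 0.0], [0.1, 0.5, 0.05], [0.18, 0.7, 0.02]))
```

With only two visible markers the segment can still spin about the line joining them, which is why redundancy and asymmetric placement matter when occlusion is likely.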

Virtual Production

• pioneered for the production of James Cameron's "Avatar"
• virtual camera
• simulcam

Feature Film, Games and VR applications

• Avatar
• Dawn of the Planet of the Apes
• The Force Awakens
• Curious Case of Benjamin Button
• EA Sports football capture session
• EA Sports soccer capture session
• ILMxLab
• PrioVR
• Sixense
• VR news

Applications

• Biomedical and Physical Rehabilitation
o Rehabilitation
o Markerless Gait Analysis
o Tongue Capture for Speech Therapy

Applications

• Historical Preservation
o Native American Performance

• Arts
o Open Ended Group
o Walking City
o ACCAD Motion Lab
o Deakin University Motion Lab Projects
o Virtual in Landing Place
o Robotic Camera Choreography via Motion Capture

Applications

• Life Sciences

• Engineering

• Military and Law Enforcement
o VR weapon training with acoustic tracking system
o Virtual Crime Scene

• Sports
o Golf Training Simulator
o Various Sports Analysis and Training

Motion Technology and Integration Researchers

• University of Southern California (Paul Debevec)
• Chris Bregler (NYU, Stanford, Google)
• Carnegie Mellon (Jessica Hodgins)
• Max Planck Center (Christian Theobalt)
• Stanford (Vladlen Koltun)
• Synlab at Georgia Tech (Ali Mazalek)

Mocap Studios

Giant Studios

Capture Lab

WETA

House of Moves

Imaginarium Studios

Jim Henson Digital Studio

ILMxLab

References

1. Menache, Alberto. Understanding Motion Capture for Computer Animation and Video Games.
2. http://www.kingston.ac.uk/Muybridge/
3. http://www.anotherscene.com/cinema/firsts/marey.html
4. http://cmp1.ucr.edu/exhibitions/edgerton/edgerton.html
5. SIGGRAPH 2001 Course 51
6. http://www.metamotion.com
7. http://gvv.mpi-inf.mpg.de/files/pami2013/jgall_motioncapture_multiple_pami13.pdf
8. http://dl.acm.org/citation.cfm?id=2614176
9. http://www.utdallas.edu/~xxg061000/tongue.pdf
10. http://en.wikipedia.org/wiki/Nikolai_Bernstein
11. http://masgutovamethod.com/content/overlays/nikolai-bernstein.html
12. http://www.theartstory.org/movement-futurism.htm
13. Klette, Reinhard and Garry Tee. "Understanding Human Motion: A Historic Review."
14. Johansson, Gunnar. "Visual perception of biological motion and a model for its analysis." Perception & Psychophysics, 1973, Vol. 14, No. 2, 201-211.
