Animation
COM3404

Richard Everson
School of Engineering, Computer Science and Mathematics, University of Exeter
[email protected]
http://www.secamlocal.ex.ac.uk/studyres/COM304

References
Fundamentals of 3D Computer Graphics. Watt.
Computer Graphics: Principles and Practice. Foley et al. (1995).
Principles of Three-Dimensional Computer Animation. M. O'Rourke.

Outline
1 Animation
2 Key-framing
3 Skeletal animation
4 Motion capture

Animation

Animation means bringing images to life: moving graphics of any kind, from sequences of single images, flipbooks (kineographs) and Victorian devices such as the phenakistoscope and zoetrope, to cartoon films and computer animation.

Speed
About 6 frames per second are needed to give an impression of movement.
Films: 24 fps. TV: 30 fps. Computers: up to 100 fps.

Double buffering
Render each frame to an off-screen buffer, then swap it with the displayed buffer, so the viewer never sees a partially drawn frame.
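A minimal sketch of the double-buffering idea, using two in-memory framebuffers that are swapped each frame; the Framebuffer arrays, draw_scene and present functions are made-up stand-ins, not part of the lecture material.

```python
import numpy as np

WIDTH, HEIGHT = 640, 480

def draw_scene(buffer, t):
    """Hypothetical renderer: fill the off-screen buffer for time t."""
    buffer[:] = 0                      # clear the buffer
    x = int((t * 100) % WIDTH)         # a dot moving across the screen
    buffer[HEIGHT // 2, x] = 255

def present(buffer):
    """Hypothetical display call: hand a completed buffer to the screen."""
    pass

# Two buffers: one being displayed (front), one being drawn into (back).
front = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
back = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)

for frame in range(100):
    t = frame / 30.0                   # e.g. 30 fps
    draw_scene(back, t)                # render off-screen
    front, back = back, front          # swap the buffers
    present(front)                     # only ever display a complete frame
```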

Computer Animation

What can be animated
Objects: size, position, rotation, colour, texture, shape.
Camera: position, direction, field of view, focal length.
Lights: position, type, colour.

Methods
Key-framing/interpolation; motion paths; hierarchical systems; forward and inverse kinematics; motion dynamics; motion capture.

Key-framing

Specify positions in a few key frames and interpolate the intermediate frames ('in-betweening').

[Figure: successive key frames of a U.S. Air Force aircraft (markings "AMC 40602") interpolated along its motion path.]

Key frame interpolation

Linear interpolation between key frames at times t0 and t1:

    v(t) = v0 + (v1 - v0)(t - t0)/(t1 - t0)

applied component-wise to a position:

    x(t) = x0 + (x1 - x0)(t - t0)/(t1 - t0)
    y(t) = y0 + (y1 - y0)(t - t0)/(t1 - t0)
    z(t) = z0 + (z1 - z0)(t - t0)/(t1 - t0)

Linear interpolation is simple, but motion appears jerky at the key-frames.

Spline interpolation
Hermite splines or NURBS give fine control over location, and Hermite splines are guaranteed to pass through the control points. Most modelling packages allow interactive editing of the motion path, and linking of paths permits the camera to track a moving object, etc. Code sketches of linear and Hermite key-frame interpolation follow below.
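A minimal sketch of linear key-frame interpolation implementing the formulas above; the key-frame times and positions are illustrative values, not taken from the lecture.

```python
import numpy as np

# Key frames as (time in seconds, position) pairs -- illustrative values only.
keyframes = [
    (0.0, np.array([0.0, 0.0, 0.0])),
    (1.0, np.array([2.0, 1.0, 0.0])),
    (2.5, np.array([3.0, 1.0, -4.0])),
]

def interpolate_linear(keyframes, t):
    """Return the linearly interpolated value at time t ('in-betweening')."""
    times = [k[0] for k in keyframes]
    if t <= times[0]:
        return keyframes[0][1]
    if t >= times[-1]:
        return keyframes[-1][1]
    # Find the pair of key frames surrounding t.
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            s = (t - t0) / (t1 - t0)
            # v(t) = v0 + (v1 - v0)(t - t0)/(t1 - t0)
            # Note the velocity changes abruptly at each key frame,
            # which is why purely linear in-betweening looks jerky.
            return v0 + (v1 - v0) * s

# Generate the in-between frames at 24 fps.
frames = [interpolate_linear(keyframes, f / 24.0) for f in range(int(2.5 * 24) + 1)]
```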

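For motion that passes smoothly through the key frames, the linear blend can be replaced by a cubic Hermite one. This sketch estimates tangents with Catmull-Rom style central differences, which is one common choice rather than anything prescribed by the lecture; it assumes the same (time, position) key-frame format as the linear sketch.

```python
import numpy as np

def hermite(p0, p1, m0, m1, s):
    """Cubic Hermite blend of points p0, p1 with tangents m0, m1, s in [0, 1]."""
    h00 = 2 * s**3 - 3 * s**2 + 1
    h10 = s**3 - 2 * s**2 + s
    h01 = -2 * s**3 + 3 * s**2
    h11 = s**3 - s**2
    return h00 * p0 + h10 * m0 + h01 * p1 + h11 * m1

def interpolate_hermite(keyframes, t):
    """Hermite interpolation through the key frames (guaranteed to pass through them)."""
    times = np.array([k[0] for k in keyframes])
    values = np.array([k[1] for k in keyframes])
    t = float(np.clip(t, times[0], times[-1]))
    i = int(np.searchsorted(times, t, side="right")) - 1
    i = max(0, min(i, len(times) - 2))
    t0, t1 = times[i], times[i + 1]
    # Catmull-Rom style tangents from neighbouring key frames (uniform spacing assumed).
    prev = values[max(i - 1, 0)]
    nxt = values[min(i + 2, len(values) - 1)]
    m0 = (values[i + 1] - prev) / 2.0
    m1 = (nxt - values[i]) / 2.0
    s = (t - t0) / (t1 - t0)
    return hermite(values[i], values[i + 1], m0, m1, s)

# Usage with illustrative key frames:
kf = [(0.0, np.array([0.0, 0.0, 0.0])),
      (1.0, np.array([2.0, 1.0, 0.0])),
      (2.5, np.array([3.0, 1.0, -4.0]))]
pos = interpolate_hermite(kf, 1.7)
```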
Example: Blender

A fighter dives into a canyon, flies next to the water and rises again. The camera follows the motion and there is a reflection in the water.

Keyframing: collision avoidance

[Figures: key frames 1-16 of a key-framed lotus position.]

Key-framing of limbs requires significant additional effort to avoid collisions and unrealistic positions during interpolation. Inverse kinematics determines the joint motions necessary to achieve a goal (more in the following lectures), and additional constraints limit the joints to realistic motions only.

Lotus position keyframing from Jean-Christophe Nebel, "Keyframe Interpolation with Self-collision Avoidance", Computer Animation and Simulation '99, pages 77-86, Eurographics, 1999.

Key-frame orientation

Choose a rotation axis and interpolate the angle about that axis. The interpolation takes the shortest (geodesic) path on a unit sphere.

Key-frame colour

Interpolation can be done in RGB or HSV space.
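For orientation key frames, the shortest (geodesic) path on the unit sphere described above is commonly realised with quaternion slerp. The lecture does not fix a representation, so this is a sketch assuming unit quaternions stored as (w, x, y, z).

```python
import numpy as np

def slerp(q0, q1, s):
    """Spherical linear interpolation between unit quaternions q0 and q1, s in [0, 1]."""
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                 # take the shorter of the two great-circle arcs
        q1, dot = -q1, -dot
    if dot > 0.9995:              # nearly parallel: fall back to a normalised lerp
        q = q0 + s * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)        # angle between the two orientations
    return (np.sin((1 - s) * theta) * q0 + np.sin(s * theta) * q1) / np.sin(theta)

# Interpolate half-way between the identity and a 90-degree rotation about z.
q_a = np.array([1.0, 0.0, 0.0, 0.0])
q_b = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
q_mid = slerp(q_a, q_b, 0.5)      # roughly a 45-degree rotation about z
```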
Skeletal animation: rigging

To animate complex polygonal meshes:
Construct a skeleton of bones capturing the components and joints of the body.
Rig the skeleton by attaching the mesh vertices to it.
Animate the skeleton.
Each vertex location is then a weighted combination of bone locations: e.g. 1 part shin and 2 parts thigh for a vertex in the knee.

Matrix palette skinning

Each bone i of the skeleton has a transformation matrix Mi describing its animation from the untransformed location in world coordinates. Vertices are a linear combination (blend) of a palette of neighbouring bone transformations:

    v' = Σi wi Mi v

where
    v   untransformed vertex location
    v'  transformed vertex location
    Mi  transformation matrix describing the motion of bone i
    wi  weighting for the ith bone, with Σi wi = 1.

[Figures: untransformed and transformed bones with the skinned and texture-mapped mesh, from http://www.gup.uni-linz.ac.at/~gk/Praktika/meshskin webdata/]

Morph target animation

The vertices of the key frames are edited manually and the intermediate frames are interpolated, giving the artist finer control over vertex locations. Libraries of facial expressions corresponding to emotions or speech permit 'procedural' animation. It is expensive to define the vertex positions, and non-realistic distortions can appear because the vertices take different paths. (http://ivizlab.sfu.ca/research/iface)

Motion capture

Place markers on the human body and surround the body with (at least 8) cameras. The position of each marker can be inferred if it is visible to at least 2 cameras, giving rapid acquisition of motion. Active markers help to avoid 'marker swapping'.
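As a concrete illustration of the matrix palette skinning equation above, here is a minimal NumPy sketch using 4x4 homogeneous bone matrices; the bone transforms and weights are made-up values for a single knee vertex, not taken from the lecture.

```python
import numpy as np

def skin_vertex(v, bone_matrices, weights):
    """Blend a vertex over a palette of bone transforms: v' = sum_i w_i M_i v."""
    v_h = np.append(v, 1.0)                       # homogeneous coordinates
    blended = sum(w * (M @ v_h) for w, M in zip(weights, bone_matrices))
    return blended[:3]

def translation(tx, ty, tz):
    """Helper: a 4x4 homogeneous translation matrix (illustrative bone transform)."""
    M = np.eye(4)
    M[:3, 3] = [tx, ty, tz]
    return M

# A vertex near the knee, influenced 1 part by the shin bone and 2 parts by the thigh.
vertex = np.array([0.0, 0.5, 0.0])
bones = [translation(0.1, 0.0, 0.0),              # shin transform for this frame
         translation(0.0, 0.2, 0.0)]              # thigh transform for this frame
weights = [1.0 / 3.0, 2.0 / 3.0]                  # weights must sum to 1

skinned = skin_vertex(vertex, bones, weights)     # blended, transformed position
```

In practice the matrices would come from the animated skeleton, and each vertex would store only the indices and weights of the few bones that influence it.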