(12) Patent Application Publication (10) Pub. No.: US 2014/0378222 A1 Balakrishnan et al.

Total Pages: 16

File Type: PDF, Size: 1020 KB

US 20140378222 A1
(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2014/0378222 A1
     Balakrishnan et al.               (43) Pub. Date: Dec. 25, 2014

(54) MOBILE VIRTUAL CINEMATOGRAPHY SYSTEM

(71) Applicant: Drexel University, Philadelphia, PA (US)

(72) Inventors: Girish Balakrishnan, Santa Monica, CA (US); Paul Diefenbach, Collingswood, NJ (US)

(73) Assignee: DREXEL UNIVERSITY, Philadelphia, PA (US)

(21) Appl. No.: 14/309,833

(22) Filed: Jun. 19, 2014

Related U.S. Application Data

(60) Provisional application No. 61/836,829, filed on Jun. 19, 2013.

Publication Classification

(51) Int. Cl.: H04N 5/222 (2006.01); H04N 5/77 (2006.01); H04N 5/91 (2006.01); A63F 13/04 (2006.01)

(52) U.S. Cl.: CPC H04N 5/2224 (2013.01); H04N 5/91 (2013.01); H04N 5/77 (2013.01); A63F 13/04 (2013.01); A63F 2300/1093 (2013.01). USPC 463/31; 386/224

(57) ABSTRACT

A virtual cinematography system (SmartVCS) is disclosed, including a mobile tablet device, wherein the mobile tablet device includes a touch-sensor screen, a first hand control, a second hand control, and a motion sensor. The SmartVCS includes a motion tracking camera configured to track the movements of the motion sensor and generate motion tracking information, and a computer operatively coupled to the motion tracking camera configured to receive the motion tracking information and to transmit the motion tracking information to the first mobile tablet device, wherein the first mobile device is configured to overlay the motion tracking information onto a virtual landscape.

Patent Application Publication    Dec. 25, 2014    Sheet 1 of 14    US 2014/0378222 A1

[Drawing sheets 1-14 (FIGS. 1-15): OCR figure residue removed. Legible labels include real-world vs. virtual-world sensitivity controls ("decrease sensitivity" / "increase sensitivity", FIG. 8), a focal-length scale running from wide angle to telephoto with shorter to longer focal lengths (FIG. 9), a 4:3 Standard aspect-ratio frame (FIG. 10), and a real-time camera-position visualizer driven by a PS Move controller (FIG. 15).]

MOBILE VIRTUAL CINEMATOGRAPHY SYSTEM

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims the benefit of U.S. Provisional Application No. 61/836,829, filed Jun. 19, 2013, which is incorporated by reference as if fully set forth herein.

FIELD OF INVENTION

[0002] The embodiments described herein are related to a mobile virtual cinematography system and a method of producing films and game cinematics using commercially available equipment.

BACKGROUND

[0003] With the development of powerful computer processing equipment and graphics support, virtual cinematography has become a staple of the movie industry. Virtual content is assembled into a scene within a 3-D engine (e.g., by filming live-action elements or through computer generation). Afterwards, the scene can be composed, re-photographed, and edited from other angles by a virtual camera as if the action were happening for the first time. The rendered image can appear realistic.

[0004] Virtual cinematography with a new virtual camera system has a wide field of applications and opportunities to be useful, ranging at least from the film industry, architecture/interior/landscape design studios, city planning offices, and video game industries to the medical field, for example. However, in the current production environment, the ability to previsualize shots utilizing a virtual camera system requires expensive hardware and large motion capture spaces only available to large studios. There exists a need to develop a new virtual camera system that is more easily accessible in terms of space, cost, and availability.

SUMMARY

[0005] A Smart Virtual Cinematography System (SmartVCS) is designed for both amateur and professional users who wish to embrace the notion of virtual production for films, game cinematics, designs, and medical images without a big studio budget. A user will be able to compose and record camera motions in freespace or in fixed, steady cinema shooting motions, and manipulate scene elements such as characters and environments through a touch interface or hand controls. These interfaces permit previsualizing scenes and shot compositions without the need for expensive hardware or large motion capture volumes. SmartVCS has potential applications to other areas including game level design, real-time compositing and post-production, and architectural visualization.

[0006] The SmartVCS, a consumer-based previsualization platform as described herein, provides a solution that was developed to make production of virtual cinematics more efficient. Being an open-platform device, developers may begin to build on top of the existing virtual production experience, tailoring the software to their production needs and providing a system and method available to both professional and amateur content makers, including students, architects, designers, city planners, and medical professionals, and to small-production teams that cannot afford the cost of studio previsualization tools.

[0007] The SmartVCS uses affordable hardware, an intuitive user interface, the real-time benefits of game engines, and an intelligent camera system to provide professional directors, as well as a new market of amateur filmmakers, the ability to previsualize their films or game cinematics with familiar and accessible technology. The support of both free-space movement and controller-based navigation with adjustable scene scales permits the user to navigate the virtual space and record camera motion in a variety of ways. The user can additionally mark key frames for virtual dollies or booms, and control camera parameters such as focal length or aspect ratio. The SmartVCS can include cinematic principles for intelligent generation of shots. For example, dolly tracks and timings are generated based on physically-guided principles in order to produce realistic camera paths. SmartVCS also supports rudimentary scene editing, and has the ability to integrate into a professional production pipeline, such as by exporting camera paths or scene descriptions to industry software packages such as the one sold under the trademark Autodesk Maya.

[0008] In addition, the system has precision virtual camera controls mapped to analog joysticks, buttons, and scroll-wheel input. With the system's flexible design, operators may modify the form factor of the device through shoulder, tripod, dolly, and hand-held mounts.

[0009] A virtual cinematography system (VCS) is disclosed herein. The virtual cinematography system may comprise: a first mobile tablet device including a touch-sensor screen, a first hand control, a second hand control, and a motion sensor; a motion tracking camera configured to track and record movements of the motion sensor and generate motion tracking information to navigate a virtual camera; and a computer operatively coupled to the motion tracking camera that is configured to receive from the motion tracking camera data of the motion tracking information and transmit the data to the first mobile tablet device. In this system, the first mobile tablet device is configured to generate and overlay virtual camera images based on the motion tracking information data onto a virtual landscape and produce a virtual motion capture.

[0010] In another aspect, a method of producing a virtual motion capture using a SmartVCS is provided. The method may comprise the steps of: pre-loading and launching a standalone SmartVCS application into a mobile tablet device of the SmartVCS device; tracking a movement of the motion sensor of the SmartVCS device via a motion tracking camera connected to a game console to produce motion information data; transmitting the motion information data from the game console to the mobile tablet device; operating a virtual camera within the SmartVCS application based upon the motion information data so as to navigate the virtual camera in a virtual 3-D landscape according to the motion sensor; and recording a virtual motion capture including the virtual camera motion.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] A more detailed understanding may be had from the following description, given by way of example in conjunction with the accompanying drawings, wherein:

[0012] FIG. 1 shows a traditional virtual camera capture in a motion capture studio (1A) in comparison with a SmartVCS motion capture (1B);

[0013] FIG. 2 shows an example SmartVCS system architecture;

[0014] FIG. 3 shows an example hand control, motion sensor, and camera; [...] includes strong graphical processing power to display complex models, animations, lighting, etc.
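Paragraphs [0009] and [0010] describe the core loop: a tracked sensor pose is scaled into the virtual landscape and used to drive a virtual camera, with an adjustable scene scale corresponding to the sensitivity control of FIG. 8. The mapping can be sketched as follows. This is a minimal illustration only, not the patented implementation; all type and function names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A tracked position (metres) and orientation (degrees)."""
    x: float
    y: float
    z: float
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0

def to_virtual_camera(sensor: Pose, scene_scale: float, origin: Pose) -> Pose:
    """Map a real-world motion-sensor pose into the virtual landscape.

    scene_scale plays the role of the adjustable sensitivity in FIG. 8:
    values > 1 amplify real movement (one real step covers more virtual
    ground), values < 1 damp it for fine framing.
    """
    return Pose(
        x=origin.x + sensor.x * scene_scale,
        y=origin.y + sensor.y * scene_scale,
        z=origin.z + sensor.z * scene_scale,
        # Rotation is passed through 1:1 so the operator's aim is
        # preserved regardless of the translation sensitivity.
        yaw=sensor.yaw, pitch=sensor.pitch, roll=sensor.roll,
    )
```

Doubling `scene_scale` doubles how far the virtual camera travels for the same physical movement, while leaving the camera's orientation untouched, which matches the "decrease/increase sensitivity" control shown in FIG. 8.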
Recommended publications
  • Depth Assisted Composition of Synthetic and Real 3D Scenes
    ©2016 Society for Imaging Science and Technology. DOI: 10.2352/ISSN.2470-1173.2016.21.3DIPM-399. Depth Assisted Composition of Synthetic and Real 3D Scenes. Santiago Cortes, Olli Suominen, Atanas Gotchev; Department of Signal Processing, Tampere University of Technology, Tampere, Finland.

    Abstract: In media production, previsualization is an important step. It allows the director and the production crew to see an estimate of the final product during the filmmaking process. This work focuses on a previsualization system for composite shots, which involve real and virtual content. The system visualizes a correct perspective view of how the real objects in front of the camera operator look placed in a virtual space. The aim is to simplify the workflow, reduce production time, and allow more direct control of the end result. The real scene is shot with a time-of-flight depth camera, whose pose is tracked using a motion capture system. Depth-based segmentation is applied to remove the background.

    Prior work: An extensive description and formalization of MR systems can be found in [1]. After the release of the ARToolkit [2] by the Nara Institute of Technology, the research community got much better access to real-time blending of real and virtual data. Most of the AR applications running on web browsers produced during the early 2000s were developed using the ARToolkit. MR has evolved with the technology used to produce it, and the uses for it have multiplied. Nowadays, in the embedded-system era, a single device can have most of, if not all, the sensors and computing power to run a good MR application.
  • (12) United States Patent (10) Patent No.: US 9,729,765 B2 Balakrishnan et al.
    US009729765B2. (12) United States Patent; (10) Patent No.: US 9,729,765 B2; Balakrishnan et al.; (45) Date of Patent: Aug. 8, 2017. (54) MOBILE VIRTUAL CINEMATOGRAPHY SYSTEM. (71) Applicant: Drexel University, Philadelphia, PA (US). (72) Inventors: Girish Balakrishnan, Santa Monica, CA (US); Paul Diefenbach, Collingswood, NJ (US). (73) Assignee: Drexel University, Philadelphia, PA (US). (*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 651 days. (21) Appl. No.: 14/309,833. (22) Filed: Jun. 19, 2014. (65) Prior Publication Data: US 2014/0378222 A1, Dec. 25, 2014. (60) Provisional application No. 61/836,829, filed on Jun. 19, 2013. (51) Int. Cl.: H04N 5/222 (2006.01); A63F 13/65 (2014.01). (52) U.S. Cl.: CPC A63F 13/70 (2014.09); G06T 15/20 (2013.01); H04N 5/44504 (2013.01); A63F 2300/1093 (2013.01). (58) Field of Classification Search: None. See application file for complete search history. (56) References Cited; Publications: Lino, C. et al. (2011) The Director's Lens: An Intelligent Assistant for Virtual Cinematography. ACM Multimedia, ACM 978-1-4503-0616, 2011. Elson, D.K. and Riedl, M.O. (2007) A Lightweight Intelligent Virtual Cinematography System for Machinima Production. Association for the Advancement of Artificial Intelligence. Available from www.aaai.org. Primary Examiner: Maurice L. McDowell, Jr. (74) Attorney, Agent, or Firm: Saul Ewing LLP; Kathryn Doyle; Brian R. Landry. (57) ABSTRACT: A virtual cinematography system (SmartVCS) is disclosed, including a mobile tablet device, wherein the mobile tablet device includes a touch-sensor screen, a first hand control, a second hand control, and a motion sensor.
  • A Cost Effective, Accurate Virtual Camera System for Games, Media Production and Interactive Visualisation Using Game Motion Controllers
    EG UK Theory and Practice of Computer Graphics (2013), Silvester Czanner and Wen Tang (Editors). A Cost Effective, Accurate Virtual Camera System for Games, Media Production and Interactive Visualisation Using Game Motion Controllers. Matthew Bett (University of Abertay Dundee), Erin Michno (Quartic Llama Ltd, Dundee) and Dr Kenneth B. McAlpine (University of Abertay Dundee).

    Abstract: Virtual cameras and virtual production techniques are an indispensable tool in blockbuster filmmaking, but due to their integration into commercial motion-capture solutions they are currently out of reach of low-budget and amateur users. We examine the potential of a low-budget, high-accuracy solution to create a simple motion capture system using controller hardware designed for video games. With this as a basis, a functional virtual camera system was developed which has proven usable and robust for commercial testing. Categories and Subject Descriptors (according to ACM CCS): I.3.6 [Computer Graphics]: Methodology and Techniques, Interaction Techniques; I.3.7 [Information Interfaces and Presentations]: Three-Dimensional Graphics and Realism, Virtual Reality.

    1. Introduction: In recent years, advances in computer graphics, motion capture hardware and a need to give greater directorial control over the digital filmmaking process have given rise to the world of virtual production as a tool in film and media production [AUT09]. The result is a loss of truly 'organic' camera shots being created with ease in the production process. Virtual camera systems allow the user to apply conventional camera-craft within the CGI filmmaking process. A physical device similar to a conventional camera is used by the camera operator, with a monitor acting as the view-finder.
  • Download File
    Improvements in the Robustness and Accuracy of Bioluminescence Tomographic Reconstructions of Distributed Sources Within Small Animals. Bradley J. Beattie. Submitted in partial fulfillment of the requirements for the degree of Doctor of Engineering Science in the Fu Foundation School of Engineering and Applied Science, Columbia University, 2018. © 2018 Bradley J. Beattie. All rights reserved.

    Abstract: High quality three-dimensional bioluminescence tomographic (BLT) images, if available, would constitute a major advance and provide much more useful information than the two-dimensional bioluminescence images that are frequently used today. To date, high quality BLT images have not been available, largely because of the poor quality of the data being input into the reconstruction process. Many significant confounds are not routinely corrected for, and the noise in this data is unnecessarily large and poorly distributed. Moreover, many of the design choices affecting image quality are not well considered, including choices regarding the number and type of filters used when making multispectral measurements and choices regarding the frequency and uniformity of the sampling of both the range and domain of the BLT inverse problem. Finally, progress in BLT image quality is difficult to gauge owing to a lack of realistic gold-standard references that engage the full complexity and uncertainty within a small-animal BLT imaging experiment. Within this dissertation, I address all of these issues. I develop a Cerenkov-based gold-standard wherein a Positron Emission Tomography (PET) image can be used to gauge improvements in the accuracy of BLT reconstruction algorithms.
  • NOAO/NSO Newsletter Issue 86 June 2006
    NOAO/NSO Newsletter, Issue 86, June 2006. Contents include: Science Highlights (Using the 'NOAO System' of Small and Large Telescopes to Investigate Young Brown Dwarfs; The First Complete Solar Cycle of GONG Observations of Solar Convection Zone Dynamics; Discovery of the First Radio-Loud Quasar at z > 6; Reversed Granulation and Gravity Waves in the Mid-Photosphere; The Volume-Averaged Properties of Luminous Galaxies at z < 3); Director's Office (TMT Project Passes Conceptual Design Review; Taft Armandroff Moves from NGSC Director to Keck Director); Cerro Tololo Inter-American Observatory (Conociendo al Pueblo Atacameño; The El Peñón DIMM; SOAR Telescope Update; Robert Blum Moves from NOAO South to North); Kitt Peak National Observatory (The Tohono O'odham Nation, the NSF, VERITAS, and Kitt Peak National Observatory; The WIYN One-Degree Imager and its Precursor QUOTA; Thank You, Judy Prosser, for 18 Years of Service!); National Solar Observatory/GONG.
  • An Extensible System for Physically-Based Virtual Camera Control Using Rotational Motion Capture
    AN EXTENSIBLE SYSTEM FOR PHYSICALLY-BASED VIRTUAL CAMERA CONTROL USING ROTATIONAL MOTION CAPTURE. A Thesis by ROBERT SHELBY HUEBEL, submitted to the Office of Graduate and Professional Studies of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE. Chair of Committee: Philip Galanter. Committee Members: Tim McLaughlin, John Keyser. Head of Department: Tim McLaughlin. May 2014. Major Subject: Visualization. Copyright 2014 Robert Shelby Huebel.

    Abstract: An important characteristic of any well-designed system is user interface design. Even in extremely complex systems with high learning curves, a designer's goal is to create an interface that is simple to use and that feels natural, while at the same time allowing the user full access to all of the system's functionality. In the field of visual effects and computer animation, effective manipulation of the virtual camera is essential to any successful production. In most cases, the virtual camera is manipulated with standard input devices such as the mouse and keyboard. This thesis presents an alternative user interface design for virtual cameras using physical controls. This thesis describes an extensible system which offers physically-based, real-time virtual camera control in a commercial 3D software package by means of rotational motion capture. The system described is composed of both programmed physical hardware and a software application. The physical hardware consists of multiple analog sensors programmed to work in conjunction to detect rotation about the device's three axes. The software application interprets the data sent from the physical hardware into a form a 3D software package can use to manipulate a virtual camera, while also providing additional functionality for the motion capture and camera manipulation processes.
  • Eye Space: an Analytical Framework for the Screen-Mediated Relationship in Video Games
    Art and Design Review, 2017, 5, 84-101. http://www.scirp.org/journal/adr ISSN Online: 2332-2004, ISSN Print: 2332-1997. Eye Space: An Analytical Framework for the Screen-Mediated Relationship in Video Games. Yu-Ching Chang, Chi-Min Hsieh; Institute of Applied Arts, National Chiao Tung University, Taiwan. How to cite this paper: Chang, Y.-C., & Hsieh, C.-M. (2017). Eye Space: An Analytical Framework for the Screen-Mediated Relationship in Video Games. Art and Design Review, 5, 84-101. https://doi.org/10.4236/adr.2017.51007. Received: January 14, 2017; Accepted: February 25, 2017; Published: February 28, 2017.

    Abstract: This article explores the connections between players and game worlds through the screen-mediated space in games, i.e., eye space. Eye space is the crucial link between players and the game world; it is the decisive area where the gameplay takes place. However, the concept of eye space has received little attention and is frequently confused with game space, because the two are closely connected and sometimes can even interchange with each other. Thus, the study focuses on the basic building blocks and the structure of eye space in games. An analytical framework based on the existing literature and practice is proposed, with special attention to the interactive nature of video games, in order to examine the interplay between players and game spaces. The framework encompasses three aspects: the visual elements within the eye space, [...] Copyright © 2017 by the authors and Scientific Research Publishing Inc. This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).
  • Technology Title: Smart Virtual Camera System (Tech ID: 13-1563D)
    Technology Title: Smart Virtual Camera System. Tech ID: 13-1563D.

    Technology Summary: In the current video production environment, the ability to previsualize shots utilizing a virtual camera system requires expensive hardware and large motion capture spaces, making this technology accessible to only a limited audience. To overcome these limitations, a student/faculty Drexel team has developed an innovative low-cost alternative called the Smart VCS that offers the potential to make virtual cinematography accessible to the consumer market while offering an array of novel features. The Smart VCS achieves both advantages by integrating consumer technologies such as a multi-touch tablet (iPad) and a video game motion controller (PS3 Move) with openly accessible game engines. The Smart VCS system is designed for directors, both amateur and professional, who wish to embrace the notion of Virtual Production for films and game cinematics without a big studio budget. The Smart VCS enables a director to compose and record camera motions in freespace and manipulate scene elements, such as characters and environments, through a real-time intuitive touch interface that is guided by system intelligence and based on cinematic principles.

    Applications: Virtual cinematography; game level design; real-time compositing and post-production; architectural visualization.

    Advantages: Lower equipment costs. Lower production costs: supports an iterative production pipeline allowing directors and cinematographers to experiment with their shot compositions throughout the production process instead of simply at the previsualization stage. Tablet (iPad) integration: provides the opportunity to build intelligence into the tool, which may open up a new market of amateur content makers while also enabling higher-quality content. Smaller/lighter form factor.

    IP Status: Patent pending. Inventors: Girish Balakrishnan, Paul Diefenbach. Relevant Publications: SIGGRAPH 2013 Conference (SIGGRAPH Talk Site). Multi-media Links: Video Demonstration; Production Blog; Unity Awards 2013.
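The camera motions "guided by system intelligence and based on cinematic principles" mentioned above pair naturally with keyframed virtual dollies: between two marked key frames, a physically plausible dolly accelerates from rest and brakes to a stop rather than moving at constant speed. A minimal sketch of such ease-in/ease-out timing follows; this is illustrative only, the function names are hypothetical, and it is not the patented or shipped implementation:

```python
def smoothstep(t: float) -> float:
    """Ease-in/ease-out curve with zero velocity at both ends, like a
    real dolly that must accelerate from rest and brake to a stop."""
    t = max(0.0, min(1.0, t))  # clamp normalized time to [0, 1]
    return t * t * (3.0 - 2.0 * t)

def dolly_position(start, end, t):
    """Interpolate a camera position between two marked key frames.

    start and end are (x, y, z) tuples; t is normalized time in [0, 1].
    """
    s = smoothstep(t)
    return tuple(a + (b - a) * s for a, b in zip(start, end))
```

Because `smoothstep` has zero slope at t = 0 and t = 1, the generated dolly move starts and ends gently, one simple way to approximate the "physically-guided" camera paths the patent application describes.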
  • Computer Generation of Integral Images Using Interpolative Shading Techniques
    Computer Generation of Integral Images using Interpolative Shading Techniques. Graham E. Milnthorpe. A thesis submitted for the degree of Doctor of Philosophy, School of Engineering and Manufacture, De Montfort University, Leicester, December 2003. [Pages 151-164 have been excluded at the request of the university.]

    Summary: Research to produce artificial 3D images that duplicate human stereovision has been ongoing for hundreds of years. What has taken millions of years to evolve in humans is proving elusive even for present-day technological advancements. The difficulties are compounded when real-time generation is contemplated. The problem is one of depth. When perceiving the world around us, it has been shown that the sense of depth is the result of many different factors. These can be described as monocular and binocular. Monocular depth cues include overlapping or occlusion, shading and shadows, texture, etc. Another monocular cue is accommodation (and binocular to some extent), where the focal length of the crystalline lens is adjusted to view an image. The important binocular cues are convergence and parallax. Convergence allows the observer to judge distance by the difference in angle between the viewing axes of the left and right eyes when both are focussing on a point. Parallax relates to the fact that each eye sees a slightly shifted view of the image. If a system can be produced that requires the observer to use all of these cues, as when viewing the real world, then the transition to and from viewing a 3D display will be seamless.
  • 3GPP TR 26.928 V16.0.0
    ETSI TR 126 928 V16.0.0 (2020-11). TECHNICAL REPORT. 5G; Extended Reality (XR) in 5G (3GPP TR 26.928 version 16.0.0 Release 16). Reference: DTR/TSGS-0426928vg00. Keywords: 5G. ETSI, 650 Route des Lucioles, F-06921 Sophia Antipolis Cedex, FRANCE. The present document can be downloaded from: http://www.etsi.org/standards-search
  • Previsualization in Computer Animated Filmmaking THESIS
    Previsualization in Computer Animated Filmmaking. THESIS presented in partial fulfillment of the requirements for the degree Master of Fine Arts in the Graduate School of The Ohio State University. By Nicole Lemon, Graduate Program in Industrial, Interior and Visual Communication Design, The Ohio State University, 2012. Master's Examination Committee: Maria Palazzi, Advisor; Alan Price; Dan Shellenbarger. Copyright by Nicole Lemon 2012.

    Abstract: Previsualization (previs) is a pre-production process that uses 3D animation tools to generate preliminary versions of shots or sequences. This process is quickly gaining popularity in live-action film, and is beginning to be used in animation production as well. This is because it fosters creativity by allowing designers and artists to experiment more freely and intuitively with visual design choices, and ensures efficiency in production and post-production. Previs is also able to provide a means to communicate and test plans visually in the pre-production stage, which enhances clarity and understanding. The intention of this thesis is to make available information about previs that is, for the most part, unpublished or unknown by all but those already deeply involved in the process, and to explore and document the application of a previs process of my own in the production of my first short film. To begin, I will describe the previs process from several perspectives. Previs will be presented in historical context in order to provide insight into its development. Next I will present the results of a survey of industry professionals conducted in late 2011 and early 2012 as a way of revealing an insider's viewpoint on the use of previs in commercial computer animation production.
  • JUN-JULY- EDUSUP2011.Pdf
    Light a Fire for your Students with Film-Quality VFX Skills. Demand for Houdini talent is at an all-time high, and top film, broadcast, and gaming production studios are looking for graduates with strong VFX skills and a creative mind-set. Putting Houdini in your school curriculum will give your students the edge they deserve: a deeper understanding of computer graphics and the problem-solving skills essential for the reality of industry production. Schools that teach Houdini's node-based procedural workflow find that they produce smarter and more efficient visual effects and animation graduates, graduates that studios would love to meet. Visit us at Booth #712 at SIGGRAPH 2011 to learn how your institution can benefit from going procedural with Houdini. www.sidefx.com/education

    Beyond the Basics: Faculty and industry combine efforts and resources to deliver novel, nontraditional tools for an advanced educational experience. By Courtney E. Howard. A basic, or general, education is no longer enough. As the computer graphics industry and its various market segments (from visual effects, animation and its many nuances, and game development, to computer-aided design, digital art, motion graphics, and more) continue to evolve, educational facilities are keeping pace with those in the industry to provide unique opportunities and invest in future generations of digital artists. "We provide computer graphics training aimed at giving aspiring CG artists the skills and tools they need to get straight to work in the VFX and CG industries," explains Dominic Davenport, CEO and founder of Escape Studios.