
REDEFINING VFX ENTERTAINMENT

WHAT'S INSIDE

Introduction

Who uses Shōgun?

Key market use cases

Digital humans for all

What can you do with Shōgun?

What's new in 1.4?

BUILT FROM THE GROUND UP, SHŌGUN TAKES ADVANTAGE OF VICON’S 35 YEARS’ EXPERIENCE IN MOTION CAPTURE AND THE IMPROVED TECHNOLOGY AVAILABLE IN VICON VANTAGE AND VICON VERO CAMERAS.

Today's productions need to be achieved in real time and deliver the highest quality skeletal data in the shortest time possible. Shōgun Live and Post are designed to help studios of any size optimize capture and processing for maximum quality results.

"We consider Vicon the gold standard for production. It is unbelievably powerful."
- Dan Pack, Founder, Silver Spoon

WHO USES SHŌGUN?

FILM PRODUCTION COMPANIES AND STUDIOS
Disney, Digic Pictures, ILM, Digital Domain, Double Negative, Dreamworks, Warner Brothers

GAMES COMPANIES
Activision, Myrkur Games, Bandai, Plarium, Square Enix, EA, Ubisoft

SERVICE PROVIDERS
AudioMotion, Beyond Capture, House of Moves, Imaginarium, MOOV, Neoscape, Silver Spoon, The Capture Lab

UNIVERSITY & GAME DEPARTMENTS
DAVE School, Drexel University, FIA, NYU, Portsmouth University, Queen Mary University of London, Savannah College of Art & Design, Staffordshire University, University of Leeds, University of Westminster, USC

KEY MARKET USE CASES

SQUARE ENIX – VISUAL WORKS

Square Enix, a titan of the games industry, has built its reputation and its fan base by drawing gamers deep into immersive worlds such as those of the famous Final Fantasy series. One of the cornerstones of this worldbuilding is the use of lengthy, increasingly lifelike cut scenes that bring the games' stories and characters to life, and one of the most powerful tools for creating them is Square's Vicon capture stage.

Visual Works

Since the groundbreaking release of Final Fantasy VII in 1997, Square's Visual Works CGI studio has been the driving force behind the cutscenes illuminating the publisher's games. The studio has worked on multiple entries in blockbuster franchises such as Final Fantasy. Branching out, the production house began extending itself to standalone CGI in 2005 with Final Fantasy VII: Advent Children, and has taken on AAA Western series since Square's acquisition of British publisher Eidos in 2009.

Visual Works has used Vicon technology since 2006 and currently boasts a huge motion capture volume equipped with 100 Vantage cameras, making it one of the largest mocap stages in Japan. In keeping with the epic scale of Square's games, the team often uses the large space to capture casts of 10 or more actors, while at a more granular level they're investigating Shōgun's new high fidelity finger solver as an option for producing hand animation.

The director of Visual Works emphasizes the speed that the Vantage cameras and Shōgun software allow his team to work at. Being able to quickly process capture data in-house and output real-time animation in the studio is a powerful advantage, particularly when it comes to creating fantasy characters, which might take the form of anything from a realistic medieval knight to a bobble-headed character.

Whether Visual Works is subtly grounding and balancing fantastical characters or creating spectacular set-pieces, everything the studio does comes back to bringing beloved characters to life, such as Cloud in the recent Final Fantasy VII Remake.

FRAMESTORE – FINGER CAPTURE

"Most people talk with their hands a lot," says Richard Graham, CaptureLab Supervisor for VFX house Framestore. "There's a lot of gestural information that we give to each other when we speak through our hands." Despite the importance of hands in communication, finger tracking has long been a white whale in motion capture - one of the barriers stopping VFX artists from crossing the uncanny valley to create a complete 'digital human'.

"With a hand there's just so much occlusion going on at any pose," says Tim Doubleday, Product Manager for Vicon. Markers disappear from the view of cameras, getting tangled up in processing, and the sheer complexity of hands aggravates the problems further. The result would be messy data, with models often resembling pretzels more closely than they did hands.

To finally address the problem, Framestore collaborated with Vicon on a 2017 government-funded project through Innovate UK called ProDip. The plan was to come up with a working solution over an 18 month period.

Vicon's approach was to build a virtual model of a hand skeleton that Tim's team trained to understand what a 'real' pose looked like. To do this, they had 22 subjects each place 58 markers on their hand and perform a range of key actions, meticulously tracking them as they went along.

"You basically use that as training data to constrain the hand model. For each subject that came in, we knew where the joints should be placed and then also how their hand deformed during the range of motion," says Tim.

The model the team produced then became a reference point that Shōgun can use to interpret a hand movement tracked by a dramatically reduced number of markers.

"Using 10, six or three markers you're saying, 'okay, 10 are in this position, we think it's going to be one of these poses, and it's most likely to be this one. So let's put the hand in that pose'," says Richard. "So therefore it can't ever do something that the human hand can't do."

Framestore has wasted no time in putting the new process into action. The system has been used in realtime, directly out of Shōgun, on everything from blockbuster movies (including Mulan, Captain Marvel, Aeronauts, and Spider-Man: Far From Home) through to producing reference material for animating creatures and animals.

"The takeaway for us is that when we offer fingers to our clients they always say yes, and now it's much less of a headache. It's been a big advantage for the last two years that we've been using it," says Richard.
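The reduced-marker lookup Richard describes ("we think it's going to be one of these poses... so let's put the hand in that pose") can be pictured as a nearest-pose search against a library of known-good hand shapes. The sketch below is illustrative only, not Vicon's actual solver: the pose names, marker counts and coordinates are invented for the example.

```python
import math

# Hypothetical pose library: pose name -> reference marker positions.
# The real model is trained from 22 subjects wearing 58 markers; three
# markers are used here purely to keep the illustration small.
POSE_LIBRARY = {
    "open":  [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (10.0, 0.0, 0.0)],
    "fist":  [(0.0, 0.0, 0.0), (2.0, 1.0, 0.0), (3.0, 2.0, 0.0)],
    "point": [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (6.0, 2.0, 0.0)],
}

def closest_pose(markers):
    """Return the library pose whose reference markers best match the input."""
    best_name, best_cost = None, float("inf")
    for name, ref in POSE_LIBRARY.items():
        # Sum of per-marker Euclidean distances to this reference pose.
        cost = sum(math.dist(m, r) for m, r in zip(markers, ref))
        if cost < best_cost:
            best_name, best_cost = name, cost
    return best_name

# Markers near the "fist" reference snap to the fist pose, so the output
# can never be a shape a human hand cannot make.
print(closest_pose([(0.1, 0.0, 0.0), (2.1, 0.9, 0.0), (3.0, 2.1, 0.0)]))  # fist
```

Because the output is always one of the trained poses, occluded or sparse marker data still resolves to an anatomically possible hand, which is the property both Tim and Richard highlight.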

SILVER SPOON

Silver Spoon was originally conceived by founder Dan Pack as a one-stop shop for visual effects support to other businesses working in the field. Motion capture was initially a small part of the equation, but grew as part of Silver Spoon's business and evolved into real-time animation.

"We're being much more involved in the creative end, and taking our technology and years of experience working in this format, and applying that to these new types of opportunities and new types of engagements with viewers," says Pack.

He points to developments in finger tracking as especially important to Silver Spoon's work. "Finger tracking has always been a complex issue. They are small, they cover each other, they are complicated! Vicon has always been leading the pack in pushing mocap development and they were the first to really nail down proper finger tracking."

"So now, we're capturing unbelievable finger movement, which is such a big deal, especially when you're doing any type of real-time engagement with a client. It adds a depth and realism to characters that body language and facial expression alone can't offer," says Pack. Shōgun, plugged into a game engine, enables the turnaround speed that Silver Spoon needs to generate animation in real time.

Real-time animation on a national stage

Planters, VaynerMedia and Silver Spoon teamed up to introduce Planters' Baby Nut to the world during a 4.5-hour animated livestream running on Twitter during and after the 2020 Super Bowl. This was something that hadn't been seen at that scale before - an animated character responding live, in real time, to a worldwide audience following along through Twitter.

Silver Spoon's Vicon motion capture setup allowed game actress Erica Citrin, with direction from director Marcus Perry, to play, dance and delight viewers as Baby Nut throughout the entire performance. The team built a virtual, interactive bedroom for Baby Nut ahead of time, and then created physical props in the studio that were twice their normal size to reflect the fact that Baby Nut is only half the size of the actress. Vicon's ability to track multiple props made the integration between the two seamless.

"We can utilize this technology to tell engaging stories and to create rich interaction between viewers or consumers," says Pack. "And if we can do it in a way, like with any good VFX, that makes less of a spectacle of the technology and allows people to interact with characters in a way that's more seamless, that's what we're all about."

DREXEL UNIVERSITY ANIMATION

The Animation Capture & Effects Lab (ACE-Lab) at Drexel University's Westphal College of Media Arts & Design looks, on the surface, like a training program for VFX-led entertainment industries. Under the stewardship of Nick Jushchyshyn, Program Director for VR & Immersive Media, Drexel prepares students not only for the visual effects applications that exist right now but also those that are coming up five or even 10 years down the line.

The department's news page is full of headlines about alumni working on high-profile projects such as Star Wars and Frozen II, but the ACE-Lab takes its students down less well-trodden paths, too. In fact, it's had a wide-ranging mission from the outset.

Early adopters

Motion capture is a core part of the department's offering. ACE-Lab was an early adopter of Vicon's Vantage cameras, proud that its entertainment set-up was one of the first Vantage installations in the US. The lab upgraded their T-Series system when the department moved to a larger 40ft x 40ft stage, complete with a huge green screen cyclorama.

Nick points to the value the system offers. "Price-to-performance was hands-down the best for what we needed. There was nothing at that price point that would deliver that type of performance - five megapixel cameras, and several hundreds of frames per second. That was a whole new order of accuracy for us."

A versatile approach

Aiming to give students a dynamic mocap skillset, the department has brought in subjects ranging from martial artists to dancers to a troupe of performers in the mold of Cirque du Soleil for capture sessions. Collaborations have brought in the engineering college, the education college, the law school and nursing and medical students.

Virtual production is an increasingly crucial component of ACE-Lab's programs, both for entertainment and for other sectors. "Right now we're investigating and deploying virtual production technologies towards remote teaching and learning scenarios, incorporating motion capture into virtual production and leveraging that for teaching," Nick says.

"We're looking at ways of creating virtual learning experiences that are better than in-person. What can we do in these spaces that isn't even possible when you're here in person?"

The diversity of industries that the program feeds into means that Nick and his colleagues are constantly looking to the future. "The trajectory is then transferring these technologies that are being driven by entertainment production into non-entertainment fields," he says. "How would you use virtual production in aerospace design, automotive design? How do you use that in training law enforcement? How can you build up awareness and train people without putting them at risk? How can you create experiences that are genuinely visceral, that feel completely real, but they're completely safe and allow people to experience all sides of a scenario? Those are the directions that I see our practitioners moving towards in the years ahead."

DIGITAL HUMANS FOR ALL

Vicon has built a pipeline designed to be used by studios of any size looking to leverage technology used on the latest games and films. The pipeline works on a small 12 camera Vero system that makes use of Shōgun 1.3's new high fidelity finger solver and retargeting workflow. You can now stream your Digital Humans directly into a game engine and record the full performance directly in engine. Utilizing Apple's Face AR plugin, every nuance of the performance is captured and delivered in the game engine at 60fps.

REALTIME FACIAL CAPTURE USING APPLE AR KIT
AFFORDABLE, ENTRY LEVEL CAMERA SYSTEM
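Apple's ARKit face tracking reports each facial expression as a named blendshape coefficient between 0.0 and 1.0 (names such as `jawOpen` and `eyeBlinkLeft` come from ARKit's published blendshape set). In spirit, a facial retargeting step maps those coefficients onto a character rig every frame of the 60fps stream. The Python sketch below illustrates that idea only; the rig dictionary and channel names are invented for the example and are not Vicon's or Apple's actual API.

```python
# Illustrative mapping from ARKit-style blendshape names to rig channels.
# The channel names on the right are hypothetical.
ARKIT_TO_RIG = {
    "jawOpen": "mouth_open",
    "eyeBlinkLeft": "blink_L",
    "eyeBlinkRight": "blink_R",
}

def retarget_frame(coefficients, rig):
    """Copy each known ARKit coefficient onto its rig channel, clamped to [0, 1]."""
    for arkit_name, channel in ARKIT_TO_RIG.items():
        weight = coefficients.get(arkit_name, 0.0)
        rig[channel] = min(1.0, max(0.0, weight))
    return rig

# One frame of a 60fps stream: the actor blinks and half-opens the jaw.
# A noisy coefficient above 1.0 is clamped back into range.
frame = {"jawOpen": 0.5, "eyeBlinkLeft": 1.2, "eyeBlinkRight": 0.98}
print(retarget_frame(frame, {}))
```

Running this per frame at 60fps is what "every nuance of the performance is captured and delivered in the game engine" amounts to at the data level: a small dictionary of weights driving the face rig in real time.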

REALTIME CAPTURE WITHIN SHŌGUN LIVE
HIGH FIDELITY OPTICAL FINGERS

RUNS IN UNREAL ENGINE 4.24
CHARACTER RIGGING SUPPLIED BY A NEW DESIGN (www.anewdesign.studio)
FULL RETARGETING PIPELINE USING SHŌGUN POST


Senua, Ninja Theory


Cloud, Square Enix

Siren, Epic Games

WHAT CAN YOU DO WITH SHŌGUN LIVE?

- Realtime retargeting direct into game engines without using 3rd party software
- High fidelity finger solver allowing complex hand movement like sign language
- 4K SDI video camera calibration

WHAT CAN YOU DO WITH SHŌGUN POST?

- Automatic gap filling and data assessment, including innovative gap list feature
- Full retargeting pipeline direct onto character fbx
- Interactive solver that runs in real-time and gives complete instantaneous results with overlay
- Fully scriptable using Python or HSL

Kassandra, Ubisoft
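Automatic gap filling means reconstructing marker samples for frames where a marker was occluded. As a minimal stand-in for the idea (not Shōgun's actual algorithm, which is far more sophisticated), linear interpolation between the last and next good samples captures the essence:

```python
def fill_gaps(track):
    """Linearly interpolate None entries between known samples.

    `track` is a per-frame list of one marker coordinate, with None
    where the marker was occluded. Gaps at either end of the track,
    with no good sample on one side, are left unfilled.
    """
    filled = list(track)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            start = i - 1              # last good frame before the gap
            j = i
            while j < len(filled) and filled[j] is None:
                j += 1                 # j = first good frame after the gap
            if start >= 0 and j < len(filled):
                a, b = filled[start], filled[j]
                span = j - start
                for k in range(i, j):
                    filled[k] = a + (b - a) * (k - start) / span
            i = j
        else:
            i += 1
    return filled

# Two occluded frames are reconstructed from their neighbours.
print(fill_gaps([1.0, None, None, 4.0]))  # [1.0, 2.0, 3.0, 4.0]
```

A "gap list" feature in this picture would simply be the inventory of (start, end) ranges where the interpolation ran, so an operator can review each reconstructed stretch.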

Incredible Hulk, Marvel

SHŌGUN LIVE

- Unbreakable real-time solver that's best in class
- Fastest time to capture (including calibration and recording of 3D data direct to disk)

SHŌGUN POST

- Only mocap provider to support USD export for viewing animation on iOS devices
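USD stores animation as time-sampled attributes, and its `.usda` flavor is plain text. As a rough illustration of what exporting one solved joint to USD involves (this is not Shōgun's exporter; a real tool would normally use Pixar's USD libraries rather than emitting text by hand), the snippet below writes a minimal USDA layer with translation keyframes:

```python
def export_usda(joint_name, samples, fps=60):
    """Return a minimal USDA layer animating one transform.

    `samples` maps frame numbers to (x, y, z) translations. Emitting
    the text format directly keeps this sketch dependency-free.
    """
    frames = sorted(samples)
    keys = ",\n".join(f"        {f}: {samples[f]}" for f in frames)
    return (
        "#usda 1.0\n"
        f"(\n    startTimeCode = {frames[0]}\n"
        f"    endTimeCode = {frames[-1]}\n"
        f"    timeCodesPerSecond = {fps}\n"
        ")\n\n"
        f'def Xform "{joint_name}"\n'
        "{\n"
        "    double3 xformOp:translate.timeSamples = {\n"
        f"{keys}\n"
        "    }\n"
        '    uniform token[] xformOpOrder = ["xformOp:translate"]\n'
        "}\n"
    )

# Two frames of hip translation become a self-contained USD layer that
# any USD-aware viewer (including iOS AR Quick Look pipelines) can read.
print(export_usda("Hips", {1: (0, 90, 0), 2: (0, 91, 0)}))
```

The appeal for review workflows is exactly this portability: a captured take serialized to USD can be opened on devices that have never heard of the capture software.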

Snoke, ILM

WHAT'S NEW IN 1.4

IMPROVED CALIBRATED SKELETON

Live Subject Calibration skeleton hands now look more realistic and the shoulder position is a better fit. This makes the whole character more anatomically correct and helps deliver a more believable performance.

PYTHON 3 SUPPORT

We have added support for Python 3 in both the Live and Post APIs. This allows users to create their own applications that interact with Shōgun using the latest version of Python. It includes full support for commands based around capture, system calibration, subject calibration and MCP review.
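An automation script against such an API typically calibrates subjects and then triggers a capture. The sketch below only gestures at the shape of that flow: every class and method name here is hypothetical (the `CaptureClient` stub stands in so the example runs on its own), not Vicon's published API, whose actual modules ship with Shōgun.

```python
# Hypothetical automation flow: connect, calibrate subjects, record a take.
class CaptureClient:
    """Stand-in stub for an API client; real scripts would import the
    vendor's shipped Python modules instead of defining this."""

    def __init__(self, host):
        self.host = host
        self.log = []

    def calibrate_subject(self, name):
        self.log.append(f"calibrated {name}")

    def start_capture(self, take_name):
        self.log.append(f"capturing {take_name}")
        return take_name

def run_session(host, subjects, take_name):
    """Calibrate each subject in order, then record one take."""
    client = CaptureClient(host)
    for subject in subjects:
        client.calibrate_subject(subject)
    take = client.start_capture(take_name)
    return client.log, take

log, take = run_session("localhost", ["Actor_A"], "Take_001")
print(log, take)
```

The value of Python 3 support is that pipelines like this can live alongside a studio's other modern tooling rather than being pinned to end-of-life Python 2.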

NEW TRACKING PANEL

A brand new tracking panel makes it much easier to manage and create subjects and props. This panel combines features from the subject panel and subject calibration panel, bringing everything together in one simpler interface. Calibration, re-calibration, retargeting and color selection can all be set up there, and each subject, prop and cluster is listed in the tracking panel list.

OCCLUSION FIXING FOR HIGH FIDELITY FINGERS

Improved occlusion fixing in Shōgun now works better when capturing fingers. This allows users to perform motion where many of the markers get hidden yet the skeleton underneath still looks believable.

MCP REPROCESSING TO REMOVE DROPPED FRAMES

Reprocess an MCP file that has dropped frames using either the command line or Shōgun Post. This is especially useful for users with large systems looking to capture lots of subjects and props at once. If you open an MCP file with dropped frames in Post, it will quickly reprocess the data before opening the file.

IMPROVED CLUSTER ATTACHMENT

Improved cluster workflow - clusters are now labeled and attached more reliably. This means they don't become unattached and can be placed anywhere on the subject's body.

MULTI-MACHINE CUSTOM IP RANGE

You can now specify the IP range you want to use for multi-machine. This opens up its use on existing networks and allows slave machines to help with processing tasks like reconstruction, labeling or solving.

IMPROVED VIDEO PLAYBACK IN MCP REVIEW

We have drastically improved playback of video when reviewing data using MCP review. It's now possible to review 4 x 4K video streams alongside the overlaid motion capture data.
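Specifying an IP range typically means giving a CIDR block and having each processing machine check whether its address falls inside it. Python's standard `ipaddress` module expresses that check directly; this is a sketch of the concept only, not Shōgun's configuration format.

```python
import ipaddress

def machines_in_range(cidr, machine_ips):
    """Return the machines whose address falls inside the given CIDR block."""
    network = ipaddress.ip_network(cidr)
    return [ip for ip in machine_ips if ipaddress.ip_address(ip) in network]

# Only hosts on the 192.168.10.0/24 network join the processing pool;
# the machine on another subnet is left out.
pool = machines_in_range("192.168.10.0/24",
                         ["192.168.10.5", "192.168.10.20", "10.0.0.7"])
print(pool)  # ['192.168.10.5', '192.168.10.20']
```

Constraining the pool to a named range is what makes the feature usable on existing studio networks: machines outside the block are simply never recruited for reconstruction, labeling or solving work.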

For more information visit our website or contact us:
www.vicon.com/vfx
www.vicon.com/shogun
[email protected]

VICON DENVER
7388 S. Revere Parkway, Suite 901, Centennial, CO 80112, USA
T: +1.303.799.8686  F: +1.303.799.8690

VICON LA
3750 S. Robertson Boulevard, Suite 100, Culver City, CA 90232, USA
T: +1.310.437.4499  F: +1.310.388.3200

VICON OXFORD
6, Oxford Industrial Park, Yarnton, Oxford, OX5 1QU
T: +44.1865.261800

© Copyright 2020 Vicon Motion Systems Limited. All rights reserved. Vicon® is a registered trademark of Oxford Metrics plc. Other product and company names herein may be the trademarks of their respective owners.