Avid® Maestro | Virtual Set User Guide Version 2019.8

Legal Notices

Product specifications are subject to change without notice and do not represent a commitment on the part of Avid Technology, Inc.

This product is subject to the terms and conditions of a software license agreement provided with the software. The product may only be used in accordance with the license agreement.

This product may be protected by one or more U.S. and non-U.S. patents. Details are available at www.avid.com/patents.

This document is protected under copyright law. An authorized licensee of Avid Maestro | Virtual Set may reproduce this publication for the licensee’s own use in learning how to use the software. This document may not be reproduced or distributed, in whole or in part, for commercial purposes, such as selling copies of this document or providing support or educational services to others. This document is supplied as a guide for Avid Maestro | Virtual Set. Reasonable care has been taken in preparing the information it contains. However, this document may contain omissions, technical inaccuracies, or typographical errors. Avid Technology, Inc. does not accept responsibility of any kind for customers’ losses due to the use of this document. Product specifications are subject to change without notice.

Copyright © 2019 Avid Technology, Inc. and its licensors. All rights reserved.

The following disclaimer is required by Epic Games, Inc.: Unreal® is a trademark or registered trademark of Epic Games, Inc. in the United States of America and elsewhere.

Unreal® Engine, Copyright 1998 - 2019 Epic Games, Inc. All rights reserved.

The following disclaimer is required by Apple Computer, Inc.: APPLE COMPUTER, INC. MAKES NO WARRANTIES WHATSOEVER, EITHER EXPRESS OR IMPLIED, REGARDING THIS PRODUCT, INCLUDING WARRANTIES WITH RESPECT TO ITS MERCHANTABILITY OR ITS FITNESS FOR ANY PARTICULAR PURPOSE. THE EXCLUSION OF IMPLIED WARRANTIES IS NOT PERMITTED BY SOME STATES. THE ABOVE EXCLUSION MAY NOT APPLY TO YOU. THIS WARRANTY PROVIDES YOU WITH SPECIFIC LEGAL RIGHTS. THERE MAY BE OTHER RIGHTS THAT YOU MAY HAVE WHICH VARY FROM STATE TO STATE.

The following disclaimer is required by Sam Leffler and Silicon Graphics, Inc. for the use of their TIFF library: Copyright © 1988–1997 Sam Leffler  Copyright © 1991–1997 Silicon Graphics, Inc.

Permission to use, copy, modify, distribute, and sell this software [i.e., the TIFF library] and its documentation for any purpose is hereby granted without fee, provided that (i) the above copyright notices and this permission notice appear in all copies of the software and related documentation, and (ii) the names of Sam Leffler and Silicon Graphics may not be used in any advertising or publicity relating to the software without the specific, prior written permission of Sam Leffler and Silicon Graphics.

THE SOFTWARE IS PROVIDED “AS-IS” AND WITHOUT WARRANTY OF ANY KIND, EXPRESS, IMPLIED OR OTHERWISE, INCLUDING WITHOUT LIMITATION, ANY WARRANTY OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.

IN NO EVENT SHALL SAM LEFFLER OR SILICON GRAPHICS BE LIABLE FOR ANY SPECIAL, INCIDENTAL, INDIRECT OR CONSEQUENTIAL DAMAGES OF ANY KIND, OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER OR NOT ADVISED OF THE POSSIBILITY OF DAMAGE, AND ON ANY THEORY OF LIABILITY, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

The following disclaimer is required by the Independent JPEG Group: This software is based in part on the work of the Independent JPEG Group.

This Software may contain components licensed under the following conditions: Copyright (c) 1989 The Regents of the University of California. All rights reserved.

Redistribution and use in source and binary forms are permitted provided that the above copyright notice and this paragraph are duplicated in all such forms and that any documentation, advertising materials, and other materials related to such distribution and use acknowledge that the software was developed by the University of California, Berkeley. The name of the University may not be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED ``AS IS'' AND WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, WITHOUT LIMITATION, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.

Copyright (C) 1989, 1991 by Jef Poskanzer.

Permission to use, copy, modify, and distribute this software and its documentation for any purpose and without fee is hereby granted, provided that the above copyright notice appear in all copies and that both that copyright notice and this permission notice appear in supporting documentation. This software is provided "as is" without express or implied warranty.

Copyright 1995, Trinity College Computing Center. Written by David Chappell.

Permission to use, copy, modify, and distribute this software and its documentation for any purpose and without fee is hereby granted, provided that the above copyright notice appear in all copies and that both that copyright notice and this permission notice appear in supporting documentation. This software is provided "as is" without express or implied warranty.

Copyright 1996 Daniel Dardailler.

Permission to use, copy, modify, distribute, and sell this software for any purpose is hereby granted without fee, provided that the above copyright notice appear in all copies and that both that copyright notice and this permission notice appear in supporting documentation, and that the name of Daniel Dardailler not be used in advertising or publicity pertaining to distribution of the software without specific, written prior permission. Daniel Dardailler makes no representations about the suitability of this software for any purpose. It is provided "as is" without express or implied warranty.

Modifications Copyright 1999 Matt Koss, under the same license as above.

Copyright (c) 1991 by AT&T.

Permission to use, copy, modify, and distribute this software for any purpose without fee is hereby granted, provided that this entire notice is included in all copies of any software which is or includes a copy or modification of this software and in all copies of the supporting documentation for such software.

THIS SOFTWARE IS BEING PROVIDED "AS IS", WITHOUT ANY EXPRESS OR IMPLIED WARRANTY. IN PARTICULAR, NEITHER THE AUTHOR NOR AT&T MAKES ANY REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.

This product includes software developed by the University of California, Berkeley and its contributors.

The following disclaimer is required by Nexidia Inc.: © 2010 Nexidia Inc. All rights reserved, worldwide. Nexidia and the Nexidia logo are trademarks of Nexidia Inc. All other trademarks are the property of their respective owners. All Nexidia materials regardless of form, including without limitation, software applications, documentation and any other information relating to Nexidia Inc., and its products and services are the exclusive property of Nexidia Inc. or its licensors. The Nexidia products and services described in these materials may be covered by Nexidia's United States patents: 7,231,351; 7,263,484; 7,313,521; 7,324,939; 7,406,415, 7,475,065; 7,487,086 and/or other patents pending and may be manufactured under license from the Georgia Tech Research Corporation USA.

The following disclaimer is required by Paradigm Matrix: Portions of this software licensed from Paradigm Matrix.

The following disclaimer is required by Ray Sauers Associates, Inc.: “Install-It” is licensed from Ray Sauers Associates, Inc. End-User is prohibited from taking any action to derive a source code equivalent of “Install-It,” including by reverse assembly or reverse compilation, Ray Sauers Associates, Inc. shall in no event be liable for any damages resulting from reseller’s failure to perform reseller’s obligation; or any damages arising from use or operation of reseller’s products or the software; or any other damages, including but not limited to, incidental, direct, indirect, special or consequential Damages including lost profits, or damages resulting from loss of use or inability to use reseller’s products or the software for any reason including copyright or patent infringement, or lost data, even if Ray Sauers Associates has been advised, knew or should have known of the possibility of such damages.

The following disclaimer is required by Videomedia, Inc.: “Videomedia, Inc. makes no warranties whatsoever, either express or implied, regarding this product, including warranties with respect to its merchantability or its fitness for any particular purpose.”

“This software contains V-LAN ver. 3.0 Command Protocols which communicate with V-LAN ver. 3.0 products developed by Videomedia, Inc. and V-LAN ver. 3.0 compatible products developed by third parties under license from Videomedia, Inc. Use of this software will allow “frame accurate” editing control of applicable videotape recorder decks, videodisc recorders/players and the like.”

The following disclaimer is required by Altura Software, Inc. for the use of its Mac2Win software and Sample Source Code: ©1993–1998 Altura Software, Inc.

The following disclaimer is required by Ultimatte Corporation: Certain real-time compositing capabilities are provided under a license of such technology from Ultimatte Corporation and are subject to copyright protection.

The following disclaimer is required by 3Prong.com Inc.: Certain waveform and vector monitoring capabilities are provided under a license from 3Prong.com Inc.

The following disclaimer is required by Interplay Entertainment Corp.: The “Interplay” name is used with the permission of Interplay Entertainment Corp., which bears no responsibility for Avid products.

This product includes portions of the Alloy Look & Feel software from Incors GmbH.

This product includes software developed by the Apache Software Foundation (http://www.apache.org/).

© DevelopMentor

This product may include the JCifs library, for which the following notice applies: JCifs © Copyright 2004, The JCIFS Project, is licensed under LGPL (http://jcifs.samba.org/). See the LGPL.txt file in the Third Party Software directory on the installation CD.

Avid Interplay contains components licensed from LavanTech. These components may only be used as part of and in connection with Avid Interplay.

This product includes the Warlib library, for which the following notice applies: Copyright Jarle (jgaa) Aase 2000 - 2009

COPYRIGHT file which is included in the distribution:

warlib is copyright Jarle (jgaa) Aase 2000 - 2009

The warlib C++ Library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 3.0 of the License, or (at your option) any later version.

The warlib C++ Library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.

Portions copyright © 2012 Avid Technology, Inc.

Attn. Government User(s). Restricted Rights Legend

U.S. GOVERNMENT RESTRICTED RIGHTS. This Software and its documentation are “commercial computer software” or “commercial computer software documentation.” In the event that such Software or documentation is acquired by or on behalf of a unit or agency of the U.S. Government, all rights with respect to this Software and documentation are subject to the terms of the License Agreement, pursuant to FAR §12.212(a) and/or DFARS §227.7202-1(a), as applicable.

Trademarks 003, 192 Digital I/O, 192 I/O, 96 I/O, 96i I/O, Adrenaline, AirSpeed, ALEX, Alienbrain, AME, AniMatte, Archive, Archive II, Assistant Station, AudioPages, AudioStation, AutoLoop, AutoSync, Avid, Avid Active, Avid Advanced Response, Avid DNA, Avid DNxcel, Avid DNxHD, Avid DS Assist Station, Avid , Avid Liquid, Avid Media Engine, Avid Media Processor, Avid MEDIArray, Avid Mojo, Avid Remote Response, Avid , Avid Unity ISIS, Avid VideoRAID, AvidRAID, AvidShare, AVIDstripe, AVX, Beat Detective, Beauty Without The Bandwidth, Beyond Reality, BF Essentials, Bomb Factory, Bruno, C|24, CaptureManager, ChromaCurve, ChromaWheel, Cineractive Engine, Cineractive Player, Cineractive Viewer, Color Conductor, Command|24, Command|8, Control|24, Cosmonaut Voice, CountDown, d2, d3, DAE, D-Command, D-Control, Deko, DekoCast, D- Fi, D-fx, Digi 002, Digi 003, DigiBase, Digidesign, Digidesign Audio Engine, Digidesign Development Partners, Digidesign Intelligent Noise Reduction, Digidesign TDM Bus, DigiLink, DigiMeter, DigiPanner, DigiProNet, DigiRack, DigiSerial, DigiSnake, DigiSystem, Digital Choreography, Digital Nonlinear Accelerator, DigiTest, DigiTranslator, DigiWear, DINR, DNxchange, Do More, DPP-1, D-Show, DSP Manager, DS-StorageCalc, DV Toolkit, DVD Complete, D-Verb, Eleven, EM, Euphonix, EUCON, EveryPhase, Expander, ExpertRender, Fader Pack, Fairchild, FastBreak, Fast Track, Film Cutter, FilmScribe, Flexevent, FluidMotion, Frame Chase, FXDeko, HD Core, HD Process, HDpack, Home-to-Hollywood, HYBRID, HyperSPACE, HyperSPACE HDCAM, iKnowledge, Image Independence, Impact, Improv, iNEWS, iNEWS Assign, iNEWS ControlAir, InGame, Instantwrite, Instinct, Intelligent Content Management, Intelligent Digital Actor Technology, IntelliRender, Intelli-Sat, Intelli-sat Broadcasting Recording Manager, InterFX, Interplay, inTONE, Intraframe, iS Expander, iS9, iS18, iS23, iS36, ISIS, IsoSync, LaunchPad, LeaderPlus, LFX, Lightning, Link & Sync, ListSync, LKT-200, Lo-Fi, MachineControl, Magic Mask, Make Anything Hollywood, make manage move | media, Marquee, MassivePack, Massive Pack Pro, Maxim, Mbox, , MediaFlow, MediaLog, MediaMix, Media Reader, Media Recorder, MEDIArray, MediaServer, MediaShare, MetaFuze, MetaSync, MIDI I/O, Mix Rack, Moviestar, MultiShell, NaturalMatch, NewsCutter, NewsView, NewsVision, Nitris, NL3D, NLP, NSDOS, NSWIN, OMF, OMF Interchange, OMM, OnDVD, Open Media Framework, Open Media Management, Painterly Effects, Palladium, Personal , PET, Podcast Factory, PowerSwap, PRE, ProControl, ProEncode, Profiler, Pro Tools, Pro Tools|HD, Pro Tools LE, Pro Tools M-Powered, Pro Transfer, QuickPunch, QuietDrive, Realtime Synthesis, Recti-Fi, Reel Tape Delay, Reel Tape Flanger, Reel Tape Saturation, Reprise, Res Rocket Surfer, Reso, RetroLoop, Reverb One, ReVibe, Revolution, rS9, rS18, RTAS, Salesview, Sci-Fi, Scorch, ScriptSync, SecureProductionEnvironment, Serv|GT, Serv|LT, Shape-to-Shape, ShuttleCase, Sibelius, SimulPlay, SimulRecord, Slightly Rude Compressor, Smack!, Soft SampleCell, Soft-Clip Limiter, SoundReplacer, SPACE, SPACEShift, SpectraGraph, SpectraMatte, SteadyGlide, Streamfactory, Streamgenie, StreamRAID, SubCap, Sundance, Sundance Digital, SurroundScope, Symphony, SYNC HD, SYNC I/O, Synchronic, SynchroScope, Syntax, TDM FlexCable, TechFlix, Tel-Ray, Thunder, TimeLiner, Titansync, Titan, TL Aggro, TL AutoPan, TL Drum Rehab, TL Everyphase, TL Fauxlder, TL In Tune, TL MasterMeter, TL Metro, TL Space, TL Utilities, tools for storytellers, Transit, TransJammer, Trillium Lane Labs, TruTouch, 
UnityRAID, Vari-Fi, Video the Web Way, VideoRAID, VideoSPACE, VTEM, Work-N-Play, Xdeck, X-Form, Xmon and XPAND! are either registered trademarks or trademarks of Avid Technology, Inc. in the United States and/or other countries.

Avid Maestro | Virtual Set User Guide v2019.8 • Created 8/14/19 • This document is distributed by Avid in online (electronic) form only, and is not available for purchase in printed form.

Contents

Symbols and Conventions ...... 8
Chapter 1 Introduction ...... 9
What is Maestro | Virtual Set? ...... 9
Maestro | Virtual Set Workflow ...... 10
System Requirements ...... 11
Licensing ...... 11
Runtime ...... 11
Authoring ...... 12
Features ...... 12
Chapter 2 Working with Unreal Engine 4 ...... 13
Getting Started ...... 13
Unreal Engine 4 Tutorials ...... 13
Unreal Engine Game Framework ...... 13
What is Blueprint and its Advantage over C++ ...... 14
Learning Blueprint ...... 14
Instructions ...... 14
Blueprint Communication ...... 14
Editor Tips and Tricks ...... 15
Authoring Using Unreal Engine ...... 15
Chapter 3 Authoring with Unreal Editor ...... 16
Maestro | Virtual Set Authoring Workflow ...... 17
Avid Maestro | Virtual Set Solution for Unreal Editor ...... 19
Supporting Blueprint Projects ...... 19
Project resources hierarchy ...... 19
Changing Scenes on Air ...... 20
Preview for Scene Creator ...... 20
Way of Handling Clips ...... 20
Way of Handling Video Insertions ...... 20
Plugins Support ...... 20
Project Preparation ...... 20
New Scene with Starter Content ...... 22
Importing Content ...... 23
Creating Map (level) ...... 24
Preview ...... 24
Final Cooking ...... 24
Unreal Editor Extensions ...... 24
Working with Exports ...... 25
Example 1: Registering of text content for a TextRenderActor object ...... 26
Example 2: Registering of an X axis position for a simple Cone object ...... 26
Helper Functions ...... 27
Working with Animations ...... 28
Skeleton Animations ...... 32
Working with Alpha Matte ...... 34
Augmented Reality Objects ...... 34
Garbage Objects ...... 34
Alpha Matte Composition (Helpers) ...... 36
AR Objects ...... 36
Garbage Objects ...... 37
Alpha Matte Composition (Manual) ...... 40
Garbage Objects ...... 40
Generating Garbage Matte Objects ...... 40
Working with Depth of Field (DoF) ...... 43
DoF Example Scenarios ...... 47
Diagnostics ...... 48
Working with Clip Insertions ...... 49
Authoring Workflow ...... 49
Preparing the Virtual Screen ...... 49
Advanced Workflow ...... 50
Exports ...... 53
Runtime ...... 54
Working with Video Insertions ...... 55
Authoring Workflow ...... 55
Preparing the Virtual Screen ...... 55
Advanced Workflow ...... 56
Using Exports ...... 59
Runtime ...... 60
Working with NDI® Insertions ...... 61
Authoring Workflow ...... 61
Preparing the Virtual Screen ...... 61
Advanced Workflow ...... 64
Exports ...... 68
Runtime ...... 69
Working with Unreal Planar Reflections ...... 70
Enabling Planar Reflections ...... 70
Troubleshooting Planar Reflections ...... 71
Working with AvidTalent ...... 72
Export Examples ...... 75
Global Exports ...... 75
Working with Animated Camera Movement ...... 76
Blueprint ...... 76
Sequencer ...... 77
Chapter 4 Building an Unreal Engine Scene ...... 80
Building Scenes with the “Avid Base Template” ...... 80
Chapter 5 Troubleshooting ...... 85
How to Improve the Performance of Maestro | Virtual Set ...... 85
Frequently Asked Questions ...... 89
Appendix A Guidelines for exporting from 3Ds Max® to Unreal ...... 92
Introduction ...... 92
Animations ...... 92
Materials ...... 93
Baked Texture Method ...... 97
Appendix B Guidelines to Configure Mixing in Maestro | Virtual Set ...... 100
Introduction ...... 100
How Does It Work? ...... 100
Mixing FullScreen Options ...... 101
AR Objects ...... 101
Translucent Materials Issue ...... 103
Garbage ...... 103
Example Configuration ...... 104

Using This Guide

Symbols and Conventions

Avid documentation uses the following symbols and conventions:

Symbol or Convention Meaning or Action

n A note provides important related information, reminders, recommendations, and strong suggestions.

c A caution means that a specific action you take could cause harm to your computer or cause you to lose data.

w A warning describes an action that could cause you physical harm. Follow the guidelines in this document or on the unit itself when handling electrical equipment.

> This symbol indicates menu commands (and subcommands) in the order you select them. For example, File > Import means to open the File menu and then select the Import command.

This symbol indicates a single-step procedure. Multiple arrows in a list indicate that you perform one of the actions listed.

(Windows), (Windows only), (Macintosh), or (Macintosh only) This text indicates that the information applies only to the specified operating system, either Windows or Macintosh OS X.

Bold font Bold font is primarily used in task instructions to identify user interface items and keyboard sequences.

Italic font Italic font is used to emphasize certain words and to indicate variables.

Courier Bold font Courier Bold font identifies text that you type.

Ctrl+key or mouse action Press and hold the first key while you press the last key or perform the mouse action. For example, Command+Option+C or Ctrl+drag.

| (pipe character) The pipe character is used in some Avid product names, such as Interplay | Production. In this document, the pipe is used in product names when they are in headings or at their first use in text.

1 Introduction

In this section:
• Maestro | Virtual Set Workflow
• System Requirements
• Licensing

What is Maestro | Virtual Set?

A full Avid Maestro | Virtual Set system is made up of three software modules:
• Maestro | Virtual Set Authoring - a set of plugins and templates for the Unreal® Editor tool.
• Control Application - controls the Maestro | Virtual Set Engine and broadcasts the graphics created with Maestro | Virtual Set Authoring, with applications such as Maestro | News or Maestro | Live.
• Maestro | Virtual Set Engine - a real-time application for rendering the graphics to a video output using the powerful Unreal Engine on a dedicated Render Device.

The Maestro | Virtual Set Authoring modules together with Unreal Editor address all requirements for integrating your on-air graphics (OAG) applications into an automated studio. First, 3D graphic scenes are created in the Unreal Editor, where you can define animated sequences and, using additional plugins, determine what can be changed in real time by “exporting” functions to the Control module. Reporters can then insert or import current data into the templates using a Controller, which can be run from a remote station.

Maestro | Virtual Set Authoring takes advantage of the Unreal Editor and harnesses its power in a Virtual Studio creation tool.

Maestro | Virtual Set Workflow

The following workflow contains the main tasks performed when creating a scene in Maestro | Virtual Set.

Unreal Editor - New project from template:
Create assets → Import assets → Scene composition → Define Exports → Define Animations → Define interactions

Unreal Editor - Prepared scene:
Validate scene → Deploy to Runtime Machine

Unreal Engine Runtime - Controlled by Maestro | Live:
Load/Unload scene → Activate/Deactivate scene → Control Animations → Set Export values → Camera Tracking

Maestro | Virtual Set Studio Production

System Requirements

The optimum platform for Maestro | Virtual Set uses an HDVG4 with a GTX 1080 as the target rendering device, with authoring on Microsoft Windows®. To get started with creating or developing with Unreal Engine 4 right away, a powerful setup is recommended.

The platform for the authoring purpose should at least follow the specification for the Unreal Editor on the Windows platform.

Recommended hardware specifications for Maestro | Virtual Set:
• Microsoft® Windows® 10 64-bit
• Intel® i5 or Intel® i7 Quad Core™ processor or similar (2.5 GHz or faster for high-performance graphics)
• NVIDIA® GeForce® GTX 1080
• HDD: 1 TB (at least)
• RAM: minimum - 16 GB; recommended - 32 GB 3000 MHz
• 24" monitor capable of 1920x1080 resolution
• DVD-ROM/CD-Writer or DVD-Writer
• USB Keyboard and Mouse

Licensing

Install Maestro | Virtual Set Authoring using the Installation Wizard and the provided installation guide to help you through the various installation requirements.

After the installation, the HASP ID number or sysinfo of the Authoring Machine should be noted and sent to Avid support for licensing and registration.

Runtime

The table below lists the licenses required during runtime.

License Version Type Features

MaestroVirtualSetEngine 2019.8 Linux - Machine code -

Video_Inputs_MVSE 1.0 Linux - Machine code +

Video_Inputs_NDI_MVSE 1.0 Linux - Machine code +

RenderEngine 2018.12 Linux - Machine code +

Video_Inputs 1.0 Linux - Machine code +

Video_Outputs 1.0 Linux - Machine code +

Video_Mixers 1.0 Linux - Machine code +

Video_Standard_3G 1.0 Linux - Machine code -

VideoEngine 2019.6 Linux - Machine code +


MaestroTracker 2018.9 Linux - Machine code +

Authoring

The table below lists the licenses required for authoring purposes.

License Version Type Features

MaestroVirtualSetPlugins 2019.8 Dongle, Nodelock, Floating license -

MaestroDesigner 2018.12 Dongle, Nodelock, Floating license -

Features

The following features are available for the Maestro | Virtual Set rendering components:

System Functionality Unreal Engine Render Engine

Scene Editing Unreal Editor Maestro | Designer

Tracking Data X X

Animations X X

Exports X X

Imports from Designer tools X X

Garbage Matte X X

Alpha Matte for AR objects X1 X

Newsroom System Integration X X

Scene Preview X2 X

Depth of Field X X

Clips X3 X4

Video Insertions X X

Internal Mixing X5 X

Built-in Transitions X

Built-in Charts X

Audio X

Ticker X

1. Opacity elements are mixed with the Background and not with the Camera feed.
2. Scene preview is possible in the Unreal Editor prior to the scene deployment running on Linux. Preview can’t be seen during production mode.
3. Using VideoEngine.
4. Using ClipPlayer.
5. Using external chroma key which delivers Fill and Matte from studio camera utilizing two video insertions.

2 Working with Unreal Engine 4

In this section:
• Getting Started
• Unreal Engine 4 Tutorials
• Unreal Engine Game Framework
• What is Blueprint and its Advantage over C++
• Learning Blueprint
• Authoring Using Unreal Engine

Getting Started

Unreal Engine has extensive documentation, available online.
• To learn how to install the Engine, see Installing Unreal Engine.
• To find out about the Terminology, see Unreal Engine 4 Terminology.
• For a quick guide about the Editor, see Tools and Editors.
• When you first run Unreal Editor, the Project Browser appears. To learn more about it, see Project Browser.

Unreal Engine 4 Tutorials

On the Unreal Engine website, you can also find comprehensive tutorials describing the authoring process in detail:
• Level Designer Quick Start: https://docs.unrealengine.com/en-US/Engine/QuickStart
• Realistic Rendering Example: https://docs.unrealengine.com/en-US/Resources/Showcases/RealisticRendering
• Lighting and Shadows Quick Start: https://docs.unrealengine.com/en-US/Resources/Showcases/RealisticRendering
• Landscape Quick Start Guide: https://docs.unrealengine.com/en-US/Resources/Showcases/RealisticRendering
• Video Tutorials: https://www.youtube.com/user/UnrealDevelopmentKit/playlists?sort=dd&view=50&shelf_id=17

Unreal Engine Game Framework

If you are new to the editor, it will be useful to acquaint yourself with a few basic concepts regarding the user interface and general workflow. More information can be found in the Unreal Editor Interface guide (https://docs.unrealengine.com/en-us/Engine/UI).

What is Blueprint and its Advantage over C++

The Blueprints Visual Scripting system in Unreal Engine is a complete gameplay scripting system based on the concept of using a node-based interface to create gameplay elements from within Unreal Editor. As with many common scripting languages, it is used to define object-oriented (OO) classes or objects in the engine. Objects defined using Blueprint are referred to as just “Blueprints.”

This system is extremely flexible and powerful as it provides the ability for designers to use virtually the full range of concepts and tools generally only available to programmers.

For more information on Blueprint, see the Blueprints Visual Scripting Documentation (https://docs.unrealengine.com/en-us/Engine/Blueprints).

Learning Blueprint

Instructions

You can learn Blueprint from the following Unreal Engine tutorials:

Tutorial URL

- Glossary https://docs.unrealengine.com/en-us/Engine/Blueprints/Glossary

- Getting started https://docs.unrealengine.com/en-us/Engine/Blueprints/Scripting

- Level Blueprint Editor https://docs.unrealengine.com/en-US/engine/blueprints/editor/uibreakdowns/levelbpui

- Blueprint Class UI https://docs.unrealengine.com/en-us/Engine/Blueprints/Editor/UIBreakdowns/ClassBPUI

- Anatomy of Blueprint https://docs.unrealengine.com/en-us/Engine/Blueprints/Anatomy

- Workflow tools https://docs.unrealengine.com/en-us/Engine/Blueprints/Workflow

- Specialized Node Groups https://docs.unrealengine.com/en-us/Engine/Blueprints/UserGuide

Blueprint Communication

To learn how the Blueprints interact with one another, visit the following pages:

Tutorial URL

- Blueprint Communication Usage https://docs.unrealengine.com/en-us/Engine/Blueprints/UserGuide/BlueprintCommsUsage

- Direct Communication https://docs.unrealengine.com/en-us/Engine/Blueprints/UserGuide/BlueprintComms

- Event Dispatchers https://docs.unrealengine.com/en-US/engine/blueprints/editor/uibreakdowns/levelbpui

- Blueprint Interfaces https://docs.unrealengine.com/en-US/Engine/Blueprints/UserGuide/Types/Interface/UsingInterfaces


- Casting https://docs.unrealengine.com/en-US/Engine/Blueprints/UserGuide/CastNodes

Editor Tips and Tricks

The Blueprint Editor has a lot of productivity boosting shortcuts built-in, and while many will come naturally as you use the editor, others are a little bit hidden. The table below lists some useful Unreal Engine tips and tricks:

Tutorial URL

- Blueprint Editor Cheat Sheet https://docs.unrealengine.com/en-US/Engine/Blueprints/UserGuide/CheatSheet

- Best Practices https://docs.unrealengine.com/en-us/Engine/Blueprints/BestPractices

- Performance and Profiling https://docs.unrealengine.com/en-US/Engine/Performance

- Blueprint Interfaces https://docs.unrealengine.com/en-US/Engine/Blueprints/UserGuide/Types/Interface/UsingInterfaces

- Casting https://docs.unrealengine.com/en-US/Engine/Blueprints/UserGuide/CastNodes

Authoring Using Unreal Engine

The Artist Quick Start guide (https://docs.unrealengine.com/en-US/Engine/Content/QuickStart) provides comprehensive information on how to create content and export scenes to Unreal Engine.

n Make sure to select Blueprint instead of C++ when starting a project.

To learn about Asset Creation, visit https://docs.unrealengine.com/en-US/Engine/Content/AssetCreation

Unreal Engine features an FBX import pipeline which allows simple transfer of content from any number of digital content creation applications that support the format. To learn more about it, visit the FBX Content Pipeline (https://docs.unrealengine.com/en-US/Engine/Content/FBX) manual.

3 Authoring with Unreal Editor

In this section:
• Maestro | Virtual Set Authoring Workflow
• Avid Maestro | Virtual Set Solution for Unreal Editor
• Working with Exports
• Working with Animations
• Working with Alpha Matte
• Working with Depth of Field (DoF)
• Working with Clip Insertions
• Working with Video Insertions
• Working with NDI® Insertions
• Working with Unreal Planar Reflections
• Working with AvidTalent
• Working with Animated Camera Movement

Maestro | Virtual Set supports Epic’s Unreal Engine, giving you richer, sharper graphics capabilities than ever before. With Epic’s Unreal Editor, you can produce higher resolution, more photorealistic imagery with dynamic shadows, reflections, and lighting options.

Maestro | Virtual Set seamlessly integrates with Epic’s Unreal Editor to generate virtual environments.

Maestro | Virtual Set Authoring Workflow

Start a project from Avid template - Use one of the templates for a Blueprint-based project delivered in the Maestro | Virtual Set Authoring package.

Position objects in the Editor Viewport - To create your scene, first add objects, and then define their placement in a three-dimensional space.

Define groups and geometric properties - You can define unique characteristics for each object added to the scene, customizing the geometric properties of the objects according to your requirements.

Define surface attributes - You can give character to the scene by defining attributes such as color, material, texture and lighting for each object.

Export and edit real-time parameters - You can export most parameters to the Maestro | Live Controller, where data can be imported and manipulated in real time.

Create and edit animations - Unreal Editor enables you to animate almost anything in your scene and control it in real time by means of the Maestro | Live Controller.

In-Editor Testing (Play & Simulate) - Unreal Editor allows you to render a preview of a scene and test it.

Save Scene - The project can be saved on any storage for future use.

Avid Maestro | Virtual Set Solution for Unreal Editor

There are a few basic concepts which distinguish the Maestro | Virtual Set workflow from regular game development in Unreal Editor.

Supporting Blueprint Projects

Our solution supports only Blueprint-type projects. When starting a new project for authoring purposes, follow the Blueprint approach and avoid C++.

Project resources hierarchy

All resources related to the project need to be stored in a single folder inside Content. The structure below that folder is not important and can be organized freely. This also applies to all asset resources bound to the project, e.g. from the Marketplace.

Example 1:

Project VirtualStudioA was created in Unreal Editor. The whole tree contains resources (like levels, materials, textures, blueprints, etc.) which were stored under Content/VirtualStudioA.
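For illustration, a Content tree for such a project could be organized as follows (the folder names below are examples only, since the structure under the project folder is free):

Content/
    VirtualStudioA/
        Maps/
        Blueprints/
        Materials/
        Textures/
        Marketplace/   (assets migrated from the Marketplace)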

Changing Scenes on Air

Content prepared by the designer has to be cooked for the target platform and deployed to an available shared space on a HDVG machine (Control_Data).

To enable this, the Unreal Editor contains an additional button which triggers an automatic process of cooking and packing the scene for a Linux-based platform. It also deploys the scene for future use on an HDVG machine. The separation between the content and the launcher allows for free scene manipulation in runtime. This in turn gives the controllers the possibility to load/unload content and show it without having to restart the rendering process.

Preview for Scene Creator

Additionally, the Unreal Editor can still be employed as a real preview of the scene, using the tracked camera object available as an Add-on. This way, you can adjust final output parameters and settings in the Unreal Editor. Additional controls are built into the Unreal Editor to allow simulated control over the tracked camera for testing purposes. Other cameras can only be used in Mixing mode when using the Working with Animated Camera Movement functionality.

Way of Handling Clips

Currently, MediaPlayer is not supported directly from an Unreal scene. Our solution takes advantage of the mixing with Render Engine functionality, which handles all tasks related to Clips and Video Insertions.

Way of Handling Video Insertions

Video Insertions can be taken from the SDI/IP board (Working with Video Insertions) or in the form of NDI streams (Working with NDI® Insertions).

Plugins Support

Presently, our solution supports only content plugins from the Marketplace and external sources. There are no further restrictions, as long as they follow the resource structure mentioned above.

For plugins containing any C++ code, support is limited, as the cooked scene migrates between platforms and is controlled on air.

Project Preparation

A basic description of how to work with the Project Browser can be found in the Unreal Documentation.

Use the Blueprint project only and place all resources under an appropriate folder in Content.

The Avid template projects should be used as a starting point.

New Scene with Starter Content

Creating a new scene with the starter content option requires additional steps.

After creating a new scene using the Starter Content, you should move the content directory into the project directory:

n Because of caching inside the Unreal Editor, it is sometimes required to fix the references. To fix the references, right-click on the target folder and select the Fix Up Redirectors in Folder option.

After moving the content directory into the right place, you can delete the empty directory:

Importing Content

Supplementary Unreal assets can be found on the Marketplace and added directly to the project using the Epic Games Launcher (as illustrated in the picture below) or migrated from a different project directly to the appropriate structure inside Content, as described in the Unreal Engine Migrating Assets (https://docs.unrealengine.com/en-us/Engine/Content/Browser/UserGuide/Migrate) user guide.

External resources can be imported from FBX files which are created with applications such as Autodesk MotionBuilder, 3ds Max, and Maya. The approach supports particular objects (https://docs.unrealengine.com/latest/INT/Engine/Content/FBX) as well as full scenes (https://docs.unrealengine.com/en-us/Engine/Content/FBX/FullScene).

Creating Map (level)

After the project preparation, you focus on the Level creation. Using Unreal Engine 4 terminology, a Level is made up of a collection of Static Meshes, Volumes, Lights, Blueprints and more, all working together to create the desired virtual environment. All these elements are used afterwards to render a virtual studio on a dedicated rendering machine (HDVG). More information about Levels can be found in the Unreal Engine Levels (https://docs.unrealengine.com/en-us/Engine/Levels) manual. The actual Level name should be in line with the Project name to simplify controlling.

For preview purposes, a creator should add an AvidCamera Pawn instance to allow tracked camera preview.

Preview

Any time during the Authoring process, an author is able to play the level to see how it will be rendered. When using an AvidCamera Pawn actor on a scene, an author can take control over a tracked camera and simulate its operation directly from the editor. It may also be needed to adjust some additional settings and post-processing effects.

Final Cooking

After finishing the map preparation, an author triggers deploying the project into Control_Data and thus makes it available to the render machine and controller.

Unreal Editor Extensions

The following Unreal Editor extensions are available:
• 1-click solution to cook the map (level) package and deploy it to the Render Unit:
- cross-compilation for the Linux platform
- storing the cooked map (level) in Control_Data
- mimicking the scene_data.xml file to simplify controlling
• Tracked camera visualization
• Defining exports for level components and other parameters
• Additional content, e.g.:
- Actors
- Blueprints
- Textures

Item Description

1 Garbage matte object creation plugin

2 Map (level) cooking and deployment

3 Additional content inside the content tree

4 Additional content inside the content browser

5 Editor viewport

6 Play the level with fake tracking

Working with Exports

Maestro | Virtual Set provides maximum flexibility for manipulation of on-air graphics in real time. By exporting parameters from Maestro | Virtual Set Authoring to the Control module, data can be sent to any actor on a scene (e.g. animated sequences, object properties, and text strings), allowing them to be updated in real time. For example, a bar may grow and change color in real time to represent the score and colors of the winning team.

Defining Exports is straightforward thanks to the built-in Blueprint graph editor in the Unreal Editor. Avid Maestro | Virtual Set Authoring plugins allow you to connect any action and object property to a named Export, which is easy to control from an external controller.

An export is a predefined user function available in the Blueprint Editor. The name of the function is the actual name of the export, and an external controller can use it to send data through a common communication protocol (ReTalk3). The solution supports simple use cases as well as complex functionalities involving multiple objects and math, which allows for animated actions based on real-time data.

Example 1: Registering of text content for a TextRenderActor object

The below procedure describes how to register content for a particular TextRenderActor object.

To register the text content for a TextRenderActor object: 1. Click the Create Export button in the Blueprint Editor. A new function with a predefined signature is added to the list. 2. Change the function name to something meaningful. In this example, Text_Value is used.

n The actual name of the export function can be changed anytime. It will be set in a scene during the Deploy process.

3. Create logic inside the function. In our example, text for a particular TextRender is set.
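For readers who think in code, the export defined above is conceptually equivalent to the following C++ sketch. This is for illustration only - Maestro | Virtual Set projects must remain Blueprint-based, and the class and variable names here are hypothetical (UCLASS/GENERATED_BODY boilerplate is omitted for brevity):

#include "Engine/LevelScriptActor.h"
#include "Engine/TextRenderActor.h"
#include "Components/TextRenderComponent.h"

// Hypothetical C++ counterpart of the Level Blueprint holding the export.
class AMyLevelScript : public ALevelScriptActor
{
public:
    // Reference to the TextRenderActor placed in the level.
    ATextRenderActor* MyTextActor = nullptr;

    // The function name, Text_Value, is the name the controller addresses.
    void Text_Value(const FString& Value)
    {
        if (MyTextActor != nullptr)
        {
            // Export values arrive as strings; convert to FText for display.
            MyTextActor->GetTextRender()->SetText(FText::FromString(Value));
        }
    }
};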

Example 2: Registering of an X axis position for a simple Cone object

The below procedure describes how to register the X axis position of a Cone object.

To register the X axis position of a Cone object: 1. Click the Create Export button in the Blueprint Editor. A new function with a predefined signature is added to the list. 2. Change the function name to something meaningful. In this example, Position_X is used. 3. Create logic inside the function. In our example, the X axis position for a particular Cone object is set.
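Continuing the same hypothetical C++ sketch, the Position_X export would parse the incoming string and move the actor:

    // Reference to the Cone actor placed in the level.
    AActor* ConeActor = nullptr;

    void Position_X(const FString& Value)
    {
        if (ConeActor != nullptr)
        {
            FVector Location = ConeActor->GetActorLocation();
            Location.X = FCString::Atof(*Value); // string-to-float conversion
            ConeActor->SetActorLocation(Location);
        }
    }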

The list of available exports will be visible inside the Controller UI and the real value can be changed at runtime.

Helper Functions

Additional helper functions can be used in the Blueprint Editor for various actions such as connecting to exports.

The AvidCamera plugin provides helper functions to manipulate some of the parameters:

DoF Enable

Enable/Disable Depth of Field (DoF).

Value: 0/1

DoF Set Aperture

Set Aperture if not provided by camera tracking.

Value: Aperture as a number (e.g. 5.6)

DoF Use Real Camera Focus

Use focus distance from camera tracking.

Value: 0/1

DoF Use Real Camera Aperture

Use the Aperture value from camera tracking (if provided).

Value: 0/1

Working with Animations

The SceneLoader supports animations done by a MatineeActor or LevelSequence.

To create an animation with a Matinee Actor: 1. Add a Matinee Actor to the level,

or edit a Matinee Actor.

2. Add an empty group.

3. Add a track, for example a movement track.

4. Add actor(s) to the group. a. Select the actor(s). b. Add the selected actor(s) by right-clicking on the group.

5. Add keyframes. a. Move the time slider to the desired time position.

b. Move the actor (movement track). c. Add the keyframe.

n Repeat these steps to add more keyframes.

n To have the correct animation length in a Controller, keep the timeline length the same as the last keyframe of the animation.

To create a Level Sequence animation: 1. Add a Level Sequence to the level.

2. Add a track for a specific actor.

3. Right-click on the track and change Properties > When finished to Keep State.

4. Add keyframes. a. Move the time slider to the desired time position. b. Change the value of the property that you are animating. c. Add the keyframe.

n Repeat these steps to add more keyframes.

n To have the correct animation length in a Controller, keep the timeline length the same as the last keyframe of the animation.

Skeleton Animations

Skeleton animations can be controlled as simple exports.

In the example below, the idea is to have the same objects with the same animation sequences, but not running in parallel motion, as presented in the image below:

The offset for each of the players is controlled by the Animation > Initial Position property:

Using the Blueprint Editor, you can attach an Avid export to the selected property, as illustrated in the example below:

Working with Alpha Matte

The Maestro | Virtual Set solution is able to produce a Fill+Key output. The Key consists of a Matte for augmented reality (AR) objects and of a garbage Matte. It can be used by a chroma keyer to mix the Virtual Studio background, the camera feed foreground, and AR elements on top of that.

Augmented Reality Objects

AR objects should be rendered on top of a talent in the Virtual Studio.

Properties

For translucent objects, Maestro | Virtual Set generates the Alpha Matte value according to their Opacity.

n Translucent objects will be mixed with the Background graphics and not with the talent.

Garbage Objects

Garbage objects are used when overshooting, to composite a Virtual Studio into a shot that is larger than the green screen.

Properties

These objects are not rendered on the Fill output - they exist only in the Key.

These objects do not take depth into account - the whole Garbage geometry will be visible on the Key output.

Editor View

Fill + Key

The Alpha Matte value follows the Opacity of Garbage objects to allow for the creation of smooth edges.

Alpha Matte Composition (Helpers)

AR Objects

To create Alpha Matte composition for AR objects: 1. Select an object from the World Outliner which should be visible on top of a talent.

2. Click Add Component in the Details panel.

36 3. Start typing “alp” in the Search box to find the AlphaMatteSceneComponent and press Enter.

4. A component is added, which marks the object at runtime for the AR use case.

Garbage Objects

To create garbage objects, you need to add a special object, “Garbage Root Actor”, at the top of the hierarchy. Then, all child geometry will be recognized as garbage objects. The “Garbage Root Actor” object has all types of transformations blocked in the editor mode. It can only be modified in the game mode.

To manually create garbage matte objects: 1. Select the “Garbage Root Actor” object from the Modes tab and drag it into the scene.

2. Create geometry of the garbage objects, according to your needs (may be complex). In this example we will use 2 Planes.

3. Create material for the garbage objects and apply it to the planes.

38 4. Select all garbage objects and change Mobility to “Movable”.

5. Select all garbage objects and move them under the GarbageRootActor.
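After these steps, the World Outliner hierarchy might look like this (the plane names are examples only):

GarbageRootActor
    Plane (Movable)
    Plane2 (Movable)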

Alpha Matte Composition (Manual)

Garbage Objects

All child objects of GarbageRootActor will be treated as garbage objects which means that they will not be seen in the Fill viewport, but they will be visible in the Key viewport.

Generating Garbage Matte Objects

Garbage matte objects allow you to create objects that mask the part of a scene outside of the green or blue box, to generate chroma key outside of it.

The blur part of garbage matte objects allows for a smooth transition between the keying areas.

To generate a garbage matte object: 1. Open the Avid Garbage Matte Editor. It allows you to define a size for each part of the garbage matte object. Additionally, it also gives the option to select one of two ways of garbage matte generation: with a wrapped blur part or not wrapped.

40 2. Enter the dimension values in cm, specific to your scene (1-4). Select the parts of the garbage matte object that you want to generate (5). Select the type of the object (Wrapped/ not wrapped) (6). Press the 'Create' button (7).

3. After pressing the 'Create' button, the FBX Scene Import Option window appears. Press the 'Import' button.

41 4. The window showing the progress of import appears for a few seconds.

5. The geometry of the garbage matte object appears in your scene.

In the Content Browser window, you can see the components of the garbage matte object.

In the World Outliner, you can see all the components. As you can see, a GarbageRootActor was created and the garbage object is a child of GarbageRootActor.

Working with Depth of Field (DoF)

n An in-depth description of the Depth of Field feature can be found in the Unreal Documentation at https://docs.unrealengine.com/en-us/Engine/Rendering/PostProcessEffects/DepthOfField.

Depth of Field is broken up into three layers - Near, Far, and the Focal Region - each of which are processed separately and then later composited together to achieve the final effect. Objects in the near and far layers - objects not in the focal region - are always fully blurred. Then, these layers are blended with the non-blurred scene. • Objects within the focal region use the non-blurred scene layer. • Objects in the near or far layers, but outside any transition regions, are fully blended to the blurred layer. • Objects within a transition region are blended linearly between the non-blurred scene layer and their blurred layer based on their position within the transition region.

Circle Depth of Field

Important parameters (which could be exposed as export parameters):
• Enable/Disable
• Aperture F-stop - it defines the Iris
• Focal distance in cm
• Depth Blur km for 50%
• Depth Blur Radius - depth blur radius in pixels at 1920x
• Focal Region in cm
• Near Transition Range in cm
• Far Transition Range in cm

Depth of Field Implementation Using Information from Tracking Camera

Parameter Description

Overwrite depth of field settings When enabled, it sets the current method of DoF depending on the next field - “Depth of field method” and changes the max blur size depending on the field “Depth of field max blur %“.

DoF method Select the DoF calculation method.

DoF max blur % Change the max blur size.

Calculate DoF distance When enabled, it calculates the Focus plane distance from tracking.


Focus plane distance Value taken from Maestro | Tracker. It represents the real focal plane.

Near Transition Value of the transitional blur area before the Focal region.

Far Transition Value of the transitional blur area after the Focal region.

Focal region The value represents the size of the area that will be sharp.

Use real camera focus Switch between two methods 1: normalized focal plane value, 2: real focal plane value.

Use real camera aperture Switch between two methods 1: manual set, 2: real camera aperture value.

45 Adding Focus Plane Values to Lens Configuration File

The focal plane values can be calibrated manually in relation to the focus ring position and added to the lens calibration file.

To do so, position the focus ring on the lens on each distance mark (e.g. 3m distance mark set on the lens):

Next, read the focus sensor value on the ASB-9 (go to STATUS > FOCUS):

or in the TsDispCamera with the -r option:

46 Using these values, create a FOCUS PLANE section at the beginning of the lens.ecf file, just after the comments and the CCD SIZE field:

FOCUS PLANE nr_of_samples

Focus_value_1 Distance_value_1

Focus_value_2 Distance_value_2

n Distance inside the lens.ecf file is entered in cm.

c The focus values order must be the same as in the FOCUS NODAL section. For the Near focus position, set the lens minimum focus distance or 0 if unknown.

For the Infinity focus position, set a distance at least twice as big as the one entered in Near focus.

Example of a FOCUS PLANE section with 9 rows:

CCD SIZE 2/3 INCH
FOCUS PLANE 9
257 2000
1574 1000
2451 500
3731 300
5455 200
7152 150
8915 120
11900 90
12469 50
FOCUS NODAL 20
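Reading the example: a focus sensor value of 3731 corresponds to a focal plane 300 cm from the camera. Sensor values that fall between the calibrated rows are presumably interpolated at runtime.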

To use the calibrated focus distances, the Unreal DoF configuration must be set up as follows:
• Depth of field method: CircleDoF
• Calculate depth of field: On
• Depth of field region: 1
• Use real camera focus: On

DoF Example Scenarios

In this topic, two example scenarios of the Depth of Field are discussed. In the first example, we will use DoF with tracking data. The second example shows how to work with DoF without tracking data.

To work with DoF using Tracking Data: 1. Start SceneLoader (mstart.sh > Restart SceneLoader). 2. Load and Activate any scene, e.g. Avid_EpicStudio.

47 3. Adjust the floor level through GlobalExports > TrackedPawnFloorLevel. 4. Configure DoF for the currently used TrackedPawn. a. Enable the effect through GlobalExports > DofEnable. b. Enable the use of tracking data > DofUseRealCameraFocus. c. Set the aperture according to the actual lens setup (it is not tracked, thus it needs to be set manually but it should not change during production) through GlobalExports > DofAperture. d. Control the smoothness of transition regions (if needed) through GlobalExports > DofNearTransition, DofFarTransition.

To work with DoF without using Tracking Data: 1. Start SceneLoader (mstart.sh > Restart SceneLoader). 2. Load and Activate any scene, e.g. Avid_EpicStudio. 3. Adjust the floor level through GlobalExports > TrackedPawnFloorLevel. 4. Configure DoF for the currently used TrackedPawn. a. Enable the effect through GlobalExports > DofEnable. b. Disable the use of tracking data > DofUseRealCameraFocus. c. Set the aperture according to the actual lens setup (it is not tracked, thus it needs to be set manually but it should not change during production) through GlobalExports > DofAperture. d. Control the focus plane distance from the camera (in cm) through GlobalExports > DofFocusPlaneDistance. e. Control the smoothness of transition regions (if needed) through GlobalExports > DofNearTransition, DofFarTransition.

Diagnostics

Use GlobalExports > ExecuteConsoleCommand to show the VisualizeDOF option which displays a visualization overlay. For more information, see Visualizing Depth of Field in the Unreal Documentation.

Working with Clip Insertions

The solution allows you to use the Avid VideoEngine (FastServePlayout) as a source for clip insertions inside Maestro | Virtual Set.

There are eight (8) predefined channels, easy to use out of the box. The material (Media Texture) from a single channel can be used on multiple objects in a scene and it can be changed at runtime.

Authoring Workflow

This topic describes how to prepare a virtual screen when working with clip insertions, illustrates a more advanced workflow where a clip insertion is used on a geometry and provides an example of how exports can be implemented for this functionality.

Preparing the Virtual Screen

To prepare a virtual screen for Clip insertions: 1. Start a new project using the Avid template. 2. Make sure that the AvidMedia plugin (go to Edit > Plugins) is enabled.

c Editor restart is required after plugin activation.

3. Go to AvidMedia Content in the Content Browser.

n Make sure that the Engine Content and Plugin Content are visible in the Content Browser window.

4. Drag and drop a ClipInsertion actor into a scene to create a scene object in the form of a virtual screen.

5. Check the Details panel for the newly created scene element and the InputNo property in the Clip Insertion section.

n The value points to the predefined clip insertion channel which will be associated with the virtual screen.

Advanced Workflow

The Clip Insertion can be used on any geometry (not just on a plane screen). To achieve that, the material should be created by sampling the predefined Media Texture for the appropriate channel.

To use a Clip Insertion on any geometry: 1. Follow the steps 1 - 3 listed in the To prepare a virtual screen for Clip insertions: procedure. 2. Go to the project folder in the Content Browser.

n In our example, the project is called StudioWithVI.

3. Right-click on an empty space (1) to display additional options and select Material (2) to create a new material.

4. Enter a name for the newly created material (MyClipIn_Mat in our example).

5. Double-click on the newly created material to open the Material Editor. 6. Drag and drop the required properties, such as Base Color, create a TextureSample node and start typing “texture”.

51 7. Bind the VEChannel_1 texture.

n On this level you can create any desired effects on your material. The example above is only a simple material displaying the texture. For more details, check the Unreal Engine Academy.

8. Save the material.

9. Use the material on an arbitrary geometry.

Exports

To change the channel number: 1. Open the Level Blueprint.

2. Create a new Export function and assign a relevant name to it.

3. Drag and drop the screen object reference into the function graph.

4. Select and drag the pin from the actor reference (1), (2). Start typing “setinputno” (3) and select the function (4).

53 5. Connect the graph as illustrated below.

n The conversion node between Value (String) and InputNo (Integer) will be done automatically.

6. Deploy the scene to allow for changing the channel in a Controller using the Exports functionality (using the name ChangeChannelForScreen and a number as the value).
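Conceptually, the finished export behaves like the C++ sketch below. This is an illustration only - the actual call is made through the Blueprint SetInputNo node shown above, and the AClipInsertion type and ClipScreen variable names are assumptions:

    // Reference to the ClipInsertion actor (the virtual screen) in the level.
    AClipInsertion* ClipScreen = nullptr;

    void ChangeChannelForScreen(const FString& Value)
    {
        if (ClipScreen != nullptr)
        {
            // Mirrors the automatic String-to-Integer conversion node.
            ClipScreen->SetInputNo(FCString::Atoi(*Value));
        }
    }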

Runtime

The availability of clip insertions is determined by the hardware configuration and the available license (VideoEngine).

Working with Video Insertions

The solution allows you to use inputs of the DVG board as a source for video insertion in Maestro | Virtual Set.

There are eight (8) predefined channels, easy to use out of the box. The material (Media Texture) from a single channel can be used on multiple objects in a scene and be changed at runtime.

To find out how to configure the Maestro | Virtual Set system for mixing, see “Guidelines to Configure Mixing in Maestro | Virtual Set” on page 100.

Authoring Workflow

This topic describes how to prepare a virtual screen for video insertions, illustrates a more advanced workflow where a video insertion is used on arbitrary geometry, and provides an example of how exports can be implemented for this functionality.

Preparing the Virtual Screen

To prepare a virtual screen for a video insertion:

1. Start a new project using the Avid template.

2. Make sure that the AvidMedia plugin is enabled (go to Edit > Plugins).

Note: An Editor restart is required after plugin activation.

3. Go to AvidMedia Content in the Content Browser window.

Note: Make sure that the Engine Content and Plugin Content are visible in the Content Browser window.

4. Drag and drop the VideoInsertion actor into a scene. A scene object in the form of a virtual screen is created.

5. Check the Details panel for the newly created scene element and the InputNo property in the Video Insertion section.

Note: The value points to the predefined insertion channel that will be associated with the virtual screen.

Advanced Workflow

A video insertion can be used on any geometry (not only a plane screen). To do that, create a material that samples the predefined Media Texture for the appropriate channel.

To use a video insertion on any geometry:

1. Follow steps 1 - 3 of the To prepare a virtual screen for a video insertion procedure.

2. Go to the project folder in the Content Browser.

Note: In our example, the project is called StudioWithVI.

3. Right-click on an empty space (1) to display additional options and select Material (2) to create a new material.

4. Enter a name for the newly created material (MyVidIn_Mat in our example).

5. Double-click the newly created material to open the Material Editor.

6. Drag from the required property pins, such as Base Color, and start typing “texture” to create a TextureSample node.

7. Bind the In1_MT texture.

Note: At this level, you can create any desired effects in your material. The example above is only a simple material displaying the texture. For more details, check the Unreal Engine Academy.

Note: For interlaced inputs, the alpha channel contains polarity information, so it should be masked.

8. Save the material.

9. Use the material on an arbitrary geometry.

Using Exports

To change the channel number:

1. Open the Level Blueprint.

2. Create a new Export function and assign a relevant name to it.

3. Drag and drop the screen object reference into the function graph.

4. Select and drag the pin from the actor reference (1), (2). Start typing “setinputno” (3) and select the function (4).

5. Connect the graph as illustrated below.

6. Deploy the scene to allow changing the channel from a Controller using the Exports functionality (using the name ChangeChannelForScreen and a number as the value).

Runtime

The availability of video insertions is determined by the hardware configuration and the available license (Video_Inputs_MVSE).

Video Delay

To change the video delay for a particular video input, set the global export VideoDelay to the following value:

<stream>,<delay>

For example, the value “0,2” sets a delay of 2 for stream 0. The delay can also be changed with the following ReTalk command:

[1] Application.Dvg.SetVideoInDelay(VidinId=0, Value=6);
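Following the value format above, the ReTalk command shown here corresponds to setting the VideoDelay global export to “0,6” (stream 0, delay 6).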

Settings are stored in the Game.ini file, in a single [/Script/DvgMedia.DvgMediaSettings] section with one entry per input:

[/Script/DvgMedia.DvgMediaSettings]

InputPortSettings=(MediaPort=(DeviceName="DVG",DeviceIndex=0,PortIndex=0),VideoDelay=0)

InputPortSettings=(MediaPort=(DeviceName="DVG",DeviceIndex=0,PortIndex=1),VideoDelay=2)

...

Working with NDI® Insertions

NDI®, NewTek’s Network Device Interface technology, enables compatible systems, devices, and applications to connect and communicate over IP to share video, audio, and data.

The Maestro | Virtual Set system allows you to use NewTek NDI as a source for video insertions.

There are four (4) predefined channels, easy to use straight out of the box. The material (NDI Media Texture) from a single channel can be used on multiple objects in a scene and can be changed at runtime.

NewTek NDI 3.5: the Redist binaries are included in the MaestroVirtualSetAuthoring installer.

Authoring Workflow

This topic describes how to prepare a virtual screen for NDI insertions, illustrates a more advanced workflow where an NDI video insertion is used on arbitrary geometry, and provides an example of how exports can be implemented for this functionality.

Preparing the Virtual Screen

To prepare a virtual screen for NDI video insertions:

1. Start a new project using the Avid template.

2. Enable the NDI Media Player and the AvidMedia plugins (go to Edit > Plugins).

Caution: An Editor restart is required after plugin activation.

3. Go to NdiMedia Content in the Content Browser.

Note: Make sure that the Engine Content and Plugin Content are visible in the Content Browser window.

4. Drag and drop the NdiVideoInsertion actor into a scene. A scene object in the form of a virtual screen is created.

5. Check the Details panel for the newly created scene element and the InputNo property in the Ndi Video Insertion section.

Note: The value points to the predefined NDI insertion channel that will be associated with the virtual screen.

6. To test NDI playback during authoring, set up the NDI source for the given channel. The procedure described below can be applied to all four predefined NDI channels.

Note: The steps listed below are only valid during the authoring stage. Channel binding at runtime can be done using the GlobalExport mechanism, as described in the Runtime section below.

a. Search for “source” inside NdiMedia Content.

b. Double-click on Ndi Media Source.

c. Set up the NDI source for channel 1.

Note: NewTek NDI Tools can be used to produce valid test streams. Install the tools to check test patterns and/or clips playing from VLC. Refer to the NewTek NDI documentation for details.

- Set one of the available NDI streams using Source Name in the NDI section (1), (2).
- Set the appropriate Color Format for the stream (UYVY or RGBA) (3).
- Save the Ndi_MediaSource_1 asset.

d. Go to the Details panel of the NdiVideoInsertion actor in the scene and click the Init button.

Note: The Init button should be used after any change to a related Ndi_MediaSource_* asset and after playing the level in the editor.

e. The virtual screen should start playing the NDI stream after a short time.

Advanced Workflow

An NDI video insertion can be used on any geometry (not only a plane screen). To do that, create a material that samples the predefined NDI Media Texture for the appropriate channel.

To use an NDI video insertion on any geometry:

1. Follow steps 1 - 3 of the To prepare a virtual screen for NDI video insertions procedure.

2. Go to the project folder in the Content Browser.

Note: In our example, the project is called StudioWithNdi.

3. Right-click on an empty space (1) to display additional options and select Material (2) to create a new material.

4. Enter a name for the newly created material (MyNdi1_Mat in our example).

5. Double-click on the newly created material to open the Material Editor.

6. Drag from the required property pins, such as Base Color, and start typing “texture” to create a TextureSample node.

7. Bind the NdiIn1_MT texture.

Note: At this level, you can create any desired effects in your material. The example above is only a simple material displaying the texture. For more details, check the Unreal Engine Academy.

8. Save the material.

9. Use the material on an arbitrary geometry.

10. Test the material in the editor.

Note: The steps listed below are only valid during the authoring stage and can be applied to all four predefined NDI channels. Channel binding at runtime can be done using the GlobalExport mechanism, as described in the Runtime section below.

a. Search for “player” (2) inside NdiMedia Content (1).

b. Double-click on NDI_MediaPlayer_1 (3) to open the editor.

c. Right-click on NDI_MediaPlayer_1 to open additional options and select Edit to set up the media source.

d. Set up the NDI source.
- Set one of the available NDI streams using Source Name in the NDI section (1), (2).
- Set the appropriate Color Format for the stream (UYVY or RGBA) (3).
- Save the Ndi_MediaSource_1 asset.

e. Go back to the Ndi_MediaPlayer_1 editor and double-click Ndi_MediaSource_1 in the list. The stream starts to play.

f. Go to the Level editor. The geometry should have an NDI stream material mapped on it.

Exports

To change the channel number:

1. Open the Level Blueprint.

2. Create a new Export function and assign a relevant name to it.

3. Drag and drop the screen object reference into the function graph.

4. Select and drag the pin from the actor reference (1), (2). Start typing “setinputno” (3) and select the function (4).

5. Connect the graph as illustrated below.

Note: The conversion node between Value (String) and InputNo (Integer) is added automatically.

6. Deploy the scene to allow changing the channel from a Controller using the Exports functionality (using the name ChangeChannelForScreen and a number as the value).

Runtime

The availability of NDI insertions is determined by the license (Video_Inputs_NDI_MVSE).

Use the GlobalExport mechanism to control NDI stream binding at runtime.

NdiMediaOpenSource

Opens the given Source Name for the appropriate NDI channel available in Maestro | Virtual Set.

Value syntax:
<value> ::= <channel>,<source>
<channel> ::= Integer number from 1 to 4 inclusive, pointing to the channel number
<source> ::= <host> (<stream>)
<host> ::= String hostname of the machine
<stream> ::= String name of the NDI stream
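A hypothetical example value (the host and stream names are illustrative): 1,RENDER-PC (Test Pattern). This binds channel 1 to the NDI stream “Test Pattern” published by the machine RENDER-PC.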

NdiMediaSetFormat

Sets the color format for the appropriate NDI channel available in Maestro | Virtual Set.

Value syntax:
<value> ::= <channel>,<format>
<channel> ::= Integer number from 1 to 4 inclusive, pointing to the channel number
<format> ::= BGRA|UYVY
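For example, based on the syntax above, the value 1,UYVY sets channel 1 to the UYVY color format.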

NdiMediaPlay

Starts playing the given NDI channel.

Value syntax:
<value> ::= <channel>
<channel> ::= Integer number from 1 to 4 inclusive, pointing to the channel number

NdiMediaStop

Stops playing the given NDI channel.

Value syntax:
<value> ::= <channel>
<channel> ::= Integer number from 1 to 4 inclusive, pointing to the channel number
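For example, sending NdiMediaPlay with the value 2 starts playing channel 2; sending NdiMediaStop with the same value stops it.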

Working with Unreal Planar Reflections

Unreal Engine 4 supports real-time Planar Reflections, which can give more accurate-looking reflections than Screen Space Reflections (SSR) but come at a higher rendering cost. The higher cost is due to the way Planar Reflections work: the level is actually rendered again from the direction of the reflection. SSR is much more efficient to render than Planar Reflections, but is also much less reliable.

Enabling Planar Reflections

To enable and use Planar Reflections in your project, follow the steps below.

To enable Planar Reflections:

1. From the Main Toolbar, go to Edit > Project Settings and, in the Project Settings menu, go to the Rendering > Lighting section.

2. Select the check box next to Support global clip plane for Planar Reflections and restart the UE4 Editor when prompted.

3. If you have any objects with a reflecting material assigned, select the object and add a Planar Reflection component.

Troubleshooting Planar Reflections

Jagged Edges

If you are facing problems with jagged edges, as illustrated below:

Adjust the following parameters in the appropriate Planar Reflection component.

• Screen percentage - improves the quality of the reflection.
• Extra FOV - helps to eliminate the jagged edges.

Note: Increasing the values of these parameters also increases the rendering thread and GPU cost.

Working with AvidTalent

Note: This feature is only available in the Mixing mode.

The AvidTalent object modifies the distance settings for the PlaneFront and PlaneBack objects that are children of the TrackedPawn (Camera) object. This makes it possible to obtain correct reflections and shadows of objects displayed on the PlaneFront and PlaneBack objects.

To deactivate AvidTalent, use the “hidden in game” property.

In the illustration below, this distance is referred to as 'distance to plane'.

You can access the AvidTalent object from the 'Modes' panel in the 'Place Tool' tab.

After placing it in the scene, it is visible in the 'World Outliner'.

By default, after the object is placed in the scene, it is active in Game mode and is not used during editing.

When launched in 'Game' mode, it modifies the 'PlaneFront' (2) and 'PlaneBack' (3) distance if the AvidTalent position is in front of the camera.

To deactivate this object, export it and set the value of the 'Actor Hidden In Game' field (1).

By changing the position of the AvidTalent object in the scene, through animations or exports, you can control the correct positioning of the reflections and shadows.

Export Examples

Example 1: Changing AvidTalent Location

An export argument containing the position is expected; the string should have the format "X=value, Y=value, Z=value".
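A hypothetical example value: "X=150.0, Y=-200.0, Z=0.0".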

Note: The Z value is not important; it is taken from the studio zero point position (the TrackedPawn Z value).

Example 2: Activating/Deactivating AvidTalent in the Scene

This export expects a 0/1 value as the argument.

Global Exports

For debug purposes, it is possible to visualize the Avid Talent position by setting the “ShowTalentMarker” global export to true.

A full list of available Global Exports is provided in the Controlling with Exports topic of the Avid Maestro Virtual Set Setup Guide.

Working with Animated Camera Movement

Note: This feature is only available in the Mixing mode.

This chapter describes how to switch between cameras using Unreal animated camera movement.

Blueprint

Switching between cameras can be done using the Set View Target With Blend Blueprint node.

More information about the Set View Target With Blend node can be found in the official Unreal Engine Documentation.

The following Target should be used. More information about Get Player Controller is provided in the Unreal Engine Documentation.

Any Camera Actor can be used as the New View Target.
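For reference, a minimal sketch of the same call in Unreal C++ (the guide's examples use Blueprint nodes; NewViewTarget and the 1.0-second blend time are illustrative assumptions):

// Inside any actor: get Player Controller 0 and blend to the new view target.
APlayerController* PC = UGameplayStatics::GetPlayerController(GetWorld(), 0);
if (PC != nullptr && NewViewTarget != nullptr)
{
    // A Blend Time of 0.0 would produce an instant camera cut.
    PC->SetViewTargetWithBlend(NewViewTarget, 1.0f);
}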

Examples

Example wrapper function (defined in Level Blueprint):

Note: Blend Time is given in seconds, where 0.0 means an instant camera cut.

Example Export function - to switch to CineCameraActor3 from the scene:

Example Export function - to switch back to TrackedPawn from the scene:

Sequencer

This topic describes how to create the animated camera movement using the Unreal Sequencer Editor.

To create a camera movement sequence:

1. Add a CameraActor (without DoF) or a CineCameraActor (when working with DoF) to the scene:

2. Create and open a Level Sequence in the Sequencer Editor.

3. Add a Camera Cut Track:

4. Bind a camera:

5. As a result, the track points to the actual rendering camera:

Note: By default, the camera cut is done at the beginning of the sequence, just after the start of the animation is triggered. As a result, the camera switch is instant.

You can use this track to switch between different cameras. More details can be found in the official Unreal Engine Documentation.

Blending between Tracked Camera and CameraActor

It is possible to blend between the Tracked Camera and a CameraActor instead of using an instant cut. This is achieved by shifting the camera on the Camera Cuts track. The blend time is calculated as the offset between position 0.0 and the beginning of the Camera segment, e.g.:

or right-click on the Camera segment:

In the scenario above, there is a 1-second animation with linear interpolation between the Tracked Camera and CineCameraIn - behavior similar to calling Set View Target With Blend with a non-zero Blend Time.

4 Building an Unreal Engine Scene

In this section: • Building Scenes with the “Avid Base Template”

Building Scenes with the “Avid Base Template”

This chapter describes how to create a new Unreal Engine scene using the Avid Base Template.

To create an Unreal Engine scene with the Avid Base Template:

1. Run the Unreal Editor from the desktop shortcut.

2. Select “New Project” and choose the Avid base template. Enter a name for your project (in our example, we will use the name Sample 2019_8).

3. Move the StarterContent to your newly created directory (Sample 2019_8 in our case). Then, delete the StarterContent folder in the project root.

4. Add a Cube and a Sphere to the scene.

5. Go to the StarterContent and assign sample materials to the floor, the cube and the sphere.

6. Change the sphere's and cube's mobility from Static to Movable. This is required to enable transformation changes for these objects from animations or exports.

7. To enable alpha output for objects in the scene, you need to add an Alpha Matte Scene Component to the object instance.

8. Now you can add animations for your cube and sphere objects. For more information on animations, see Working with Animations.

a. Create an animation in a Level Sequence named animation_seq for the Cube:

9. Add planar reflections to the floor as described in the Working with Unreal Planar Reflections topic.

10. To create exports, open the Level Blueprint.

11. Click Export to create a new export. Name it “Cube_X”. Drag and drop the “cube” property into the Node Editor.

12. Drag and drop the target from the Cube Node and search for “set actor location”:

An export for changing the X position of the Cube should look like the image below:

Create a Sphere_Scale export:

13. To adjust the floor level:

Note: If the floor level of the virtual scene is not at zero height, the camera needs to be moved according to the height of the floor (put exactly on the floor level).

a. Use the Global Export (TrackedPawnPos) and send the following value:
<value> ::= X=0.0, Y=0.0, Z=<floor_level>
<floor_level> ::= Float number with the floor level value in [cm]
For example, if the floor is at 25 cm, send X=0.0, Y=0.0, Z=25.0.
- or -
b. Move the camera over the point where you want to set the camera height to the floor level and use the GlobalExport (TrackedPawnFloorLevel). Sending any value in this export triggers a special function that moves the camera height to the floor level.

14. Now you can rebuild the lighting in the map.

Note: After rebuilding the light, you may see the following error: “Static mesh actor has NULL StaticMesh property.” Ignore this message.

15. After rebuilding the lighting, you are ready to deploy the scene. Click Deploy, save the entire content, confirm your selection in the summary window, and wait until the scene is built.

16. When the deploy process is complete, you should see the "/Sample2019_8/Sample2019_8" directory in your Control Data.

17. Now the scene is ready to be loaded from the controllers with the SceneLoader. Deploy prepares and packs the scene for running on Linux. The result of Deploy is the correct directory structure with the packed scene files (Sample2019_8.pak, Sample2019_8.lst, SceneData.xml), created in G:\Projects\Sample2019_8\. The resulting layout is sketched below.
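Based on the file names above, and assuming the files sit directly in that folder, the deployed layout looks roughly like this:

G:\Projects\Sample2019_8\
    Sample2019_8.pak
    Sample2019_8.lst
    SceneData.xml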

Note: The first packaging of a new project for Linux takes approx. 1 hour on a 4-core i7. Subsequent packaging takes significantly less time.

Caution: If a catalog and scene with the same names exist, the process will overwrite them without prompting for confirmation!

5 Troubleshooting

In this section: • How to Improve the Performance of Maestro | Virtual Set • Frequently Asked Questions

How to Improve the Performance of Maestro | Virtual Set

Built-in Diagnostic Tools

You can improve the performance of your Maestro | Virtual Set system by introducing a few changes in your working environment. Follow the steps below to increase the efficiency of your system.

1. Run the SceneLoader.

2. Use the GlobalExport ExecuteConsoleCommand to execute a console command. A full list of commands can be found in the Unreal Engine documentation (https://docs.unrealengine.com/en-us/Engine/Performance/StatCommands). The most important commands are:
- stat FPS
- stat UnitGraph
- stat GPU
- stat Engine
- stat Game
- SynthBenchmark

Note: The results are visible in the log file /data/SceneLoader/SceneLoader.log.

- HighResShot 1

Note: The resulting PNG image will be available in the folder /data/SceneLoader/SceneLoader/Saved/Screenshots/LinuxNoEditor/.

3. Information on how to interpret this data and how to improve the overall performance of the Unreal Engine can be found in the Unreal Engine documentation (https://docs.unrealengine.com/en-us/Engine/Performance).

Undersampling

To save rendering time (GPU), it is possible to render the CGI at a lower resolution than the targeted video format. To do this with the best possible image quality, we recommend using the embedded Unreal Engine functionality.

1. Connect to the SceneLoader ReTalk3 using PuTTY.

2. Execute the console commands that set r.ScreenPercentage and r.TemporalAA.Upsampling (a sketch is given after the notes below).

Note: The value for r.ScreenPercentage represents the percentage of the resolution that will be used during rendering, calculated according to the desired output video format. You can change it according to the actual rendering time results.

Setting r.TemporalAA.Upsampling improves the quality of the output when r.ScreenPercentage < 100.
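The exact values depend on your rendering budget; a plausible pair of console commands, assuming a 70% render scale, would be:

r.ScreenPercentage 70
r.TemporalAA.Upsampling 1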

Unstable Output in Specific Conditions on a Scene

If the FPS is unstable (close to real time), you can change the type of synchronization.

Output Color Space

By default, the SceneLoader is rendering in the Rec709 (LDR) standard.

It can be overridden in the configuration file:

More details are available in the official Unreal Engine Documentation.

Restricted Folder Names in Content

The following folder names cannot be used inside the Content folder:
• Win32
• Win64
• Mac
• XboxOne
• PS4
• IOS
• Android
• HTML5
• CarefullyRedist
• AllDesktop
• TVOS
• Switch
• Quail
• Lumin
• Windows
• Microsoft
• Apple
• EpicInternal
• Sony
• NotForLicensees
• NoRedist

Please use different folder names to deploy your scenes.

Tracking Data Diagnostic Tool

1. Using PuTTY, connect to the SceneLoader ReTalk3.

2. Execute the following command (or use the GlobalExport ShowTrackingDataOverlay):

[1] Tracking.ShowOverlay(Value="On");

3. This activates the Tracking Data graph overlay.

(A) Correct window frame from Tracking Data

4. Adjust the SYNCHRO SOURCE DELAY parameter in the TrackingSet.cfg file so that the data falls within the correct window frame (A).

5. Disable the overlay using a ReTalk3 command:

[1] Tracking.ShowOverlay(Value="Off");

If you encounter a problem with TrackingSet synchronization where similar statistics are produced:

Adjust the SYNCHRO SOURCE parameter in the TrackingSet.cfg file to synchronize to the appropriate source.

Frequently Asked Questions

Garbage Matte casts shadows. What can I do to disable them?

To disable shadow casting from the garbage matte, you need to edit two materials: MaterialSolidGreen and MaterialBlurGreen.

1. Select FbxScene_GarbageMatte and double-click Element 0 - MaterialSolidGreen to open the material editor.

2. Change Shading from Unlit to Default Lit.

3. Do the same for the MaterialBlurGreen material (steps 1 and 2).

4. Select all elements from FbxScene_GarbageMatte and disable the Cast Shadows option.

5. Rebuild the lights and deploy the scene.

Garbage Matte is reflected on shiny surfaces. What can I do to eliminate this issue?

Select all garbage objects and disable the “Visible in Reflection Capture” option.

My project cannot be deployed! “Unknown Reason”

Save everything in the Unreal Editor. Close the Unreal Editor. Go to your project directory and delete the following folders: Binaries, Build, Intermediate, Saved. Then start the Unreal Editor (using the Avid shortcut), and load and deploy your project.

Note: All shaders will be recompiled. This may take some time!

Is there a way to remove the Avid logo (from the dvgtest output) when the SceneLoader is stopped?

You can use the xsetroot command with the -solid switch to set whatever color you want, or with the -bitmap switch to use a bitmap file. This can be done in start.sh, as shown below.
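For example, to fill the background with plain black (any X11 color name works), start.sh could call:

xsetroot -solid black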

Is there a way to show API3 communication in the SceneLoader logs?

You need to create an Engine.ini file in /data/SceneLoader/SceneLoader/Saved/Config/LinuxNoEditor/ and add the following information there:

[Core.Log]

AvidControl=Log

Then, save the file and restart the SceneLoader.

Why does my FlipFlop function in an export not work?

This function returns output A every time it is called in a function context. Please do not use it.

A Guidelines for Exporting from 3Ds Max® to Unreal

Introduction

The scenes imported from 3Ds Max to the Unreal Editor are in the FBX format.

The Import Into Level command allows users to import full FBX scenes into their levels instead of importing assets individually. Users are given full control over exactly which assets will be imported, as well as per-asset control over import settings. The workflow also allows selective reimporting of individual assets that receive further edits outside of Unreal.

At this time, FBX full scene import supports the following asset types:
• Static Meshes
• Skeletal Meshes
• Animations
• Materials (basic support; may not match the original material in your content creation application)
• Textures
• Rigid Mesh
• Morph Targets
• Cameras (no animation)
• Lights

For more information, see “Import Into Level” (https://docs.unrealengine.com/en-us/Engine/Content/FBX/FullScene) in the Unreal Editor Manual.

For more information, see “FBX Import Options Reference” (https://docs.unrealengine.com/en-US/Engine/Content/FBX/ImportOptions) in the Unreal Editor Manual.

For more information, see “FBX Static Mesh Pipeline” (https://docs.unrealengine.com/en-US/Engine/Content/FBX/StaticMeshes#ExportMesh) in the Unreal Editor Manual.

Animations

Keyframe animations created in 3Ds Max are converted into skeleton animations in the Unreal Editor. The Blueprint below shows how to prepare a simple trigger for an animation using an export. In this simple case, sending any string using an export starts playing the animation.

In a more complex case, depending on the incoming export, various actions may be taken, such as stop, pause, or play. A sketch of such a handler is shown below.
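For illustration only (the guide implements this trigger as a Blueprint graph), here is a sketch of such a handler in Unreal C++; SequencePlayer is an assumed ULevelSequencePlayer reference set up elsewhere:

// Dispatch the incoming export string to Level Sequence actions.
void AMyLevelScript::AnimationControl(const FString& Value)
{
    if (SequencePlayer == nullptr)
    {
        return;
    }
    if (Value == TEXT("play"))
    {
        SequencePlayer->Play();
    }
    else if (Value == TEXT("pause"))
    {
        SequencePlayer->Pause();
    }
    else if (Value == TEXT("stop"))
    {
        SequencePlayer->Stop();
    }
}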

Materials

Preparing the stage requires knowledge of the restrictions on converting the models used in 3Ds Max to the models used in Unreal.

Not all materials that can be used in 3Ds Max can be converted to Unreal. The safest way to define a material in 3Ds Max is to use a standard material.

In this case, the bitmap textures were pushed to most of the fields.

After conversion, the following set of connections is achieved.

Depending on the selected “Blend Mode”, the material is represented in various ways after conversion.

Mode: Opaque

Mode: Masked

Mode: Translucent

Mode: Additive

Mode: Modulate

Mode: Alpha Composite

Mode: From Texture Asset

Baked Texture Method

A common method is rendering to texture (known as “texture baking”). Its purpose is to make the created scene look realistic by baking the lighting existing in the scene into the textures. However, this method has some limitations. It can only be used if the objects do not move and the lights do not change. It is also not applicable when objects are to be partially mirrored.

The process of baking textures in 3Ds Max is well described in the 3ds Max Help. As a result, we receive the material as a standard model and, depending on the declared assignment method, an assignment to the appropriate channel.

After the import to the Unreal Editor, the result is an illuminated scene. Because the lighting is already baked into the texture, you do not need to import the lights separately.

Rendered Scene in 3Ds Max:

After importing to the Unreal Editor:

B Guidelines to Configure Mixing in Maestro | Virtual Set

Introduction

This solution allows you to use the inputs of a DVG board as a source for the camera feed in Maestro | Virtual Set. It is designed to work with an external keyer device, such as the Ultimatte12.

How Does It Work?

The whole composition consists of a mixing plane, which is positioned somewhere in the scene and behaves according to the tracking camera, and a full-screen mixing stage, which is part of the PostProcessing pipeline. The mixing plane delivers the appropriate information to the engine to allow proper reflections and dynamic shadows to be generated in the virtual reality. The full-screen mixing is responsible for adding the camera feed on top of the rendered scene.

The intersection with the floor is the spot where the Mixing Layer and the floor cross; proper reflections and shadows can therefore be expected near this area.

With a proper DistanceToScreen parameter, it should be possible to place a talent in the scene with their feet directly touching the floor. This can be changed at runtime, as the talent position can be adjusted during production. An alternative approach is to use a dedicated AvidTalent actor, which binds the mixing layer to a given location in scene coordinates.

Because the Mixing Layer is affected by all PostProcessing effects, a Mixing FullScreen post-processing plane is added on top of it, keeping the best possible quality of the camera feed signal.

Mixing FullScreen Options

The full-screen mixing is placed at the appropriate point in the processing pipeline, applied after the ToneMapping stage. This keeps the camera feed as unmodified as possible, but it is rendered on top of the PostProcessing effects (e.g. Lens Flares, Bloom, Tone Mapping).

Pros:
• Mixing Plane not affected by Tone Mapping
• Mixing Plane not affected by AA
• Mixing Plane not affected by Undistortion/Distortion algorithms

Cons:
• Mixing Plane not affected by Tone Mapping, Flares, Bloom, Motion Blur (which could be seen as a disadvantage).
• Issues with the Trackless/Hologram mode. A smooth transition is possible after manual color correction:
- Cuts are possible;
- Very fast blending and blend-out are recommended to keep the transition period as short as possible;
- Holograms will always be rendered Before Effects;
- Holograms can optionally be rendered before DepthOfField to support even more effects.

AR Objects

The final composition depends on the MixingDistanceToScreen value, which gives the proper depth.

For opaque elements, the whole composition is based on the scene depth. In that case, no special attention is required for any AR object.

For mixing with translucent AR objects, their geometry needs to be rendered into the CustomDepth buffer.

In most cases, it is enough to use an AlphaMatteSceneComponent marker. It should be put below any mesh component of the desired AR actor, e.g.:

To manually mark a translucent object as an AR object to be rendered on top of the camera (talent):

1. In the World Outliner, select the object that should be visible on top of the talent.

2. Enable the Render CustomDepth Pass option in the Rendering section of the object's Details.

Translucent Materials Issue

Custom Depth does not work on translucent materials. In that case, you need a second copy of the mesh that uses a simple opaque material, with the Render CustomDepth Pass option enabled and the Render in Main Pass option disabled. You can find useful examples here.

For a simple object, you can use an AlphaMatteSceneComponent helper to resolve this problem. Working with CustomDepth rendering directly allows you to solve very complicated scenarios.

To force Custom Depth on translucent materials used in an object:

1. Open the material editor:

2. Enable the Allow Custom Depth Writes option in the Translucency section of the material's Details.

Note: Because of Unreal optimizations, materials with opacity < 30% are not rendered into the Custom Depth buffer at all. The additional-mesh option is the only solution in that case.

Garbage

Mixing layers take all Garbage objects into account and use them to clean the mixing plane of all unwanted real studio elements.

For more information on Garbage objects, see “Garbage Objects” on page 37.

Example Configuration

This topic presents an example of how to configure mixing in the Maestro | Virtual Set solution.

To configure mixing:

1. Prepare the Ultimatte12.

a. Connect the Studio Camera signal to the Ultimatte FG input (A).

b. Connect the Ultimatte Fill output (B) to Unreal input 8.

c. Connect the Ultimatte MATTE output (C) to Unreal input 7.

d. Configure the Ultimatte to key the green color.

2. Prepare an HdvgCpProfile configuration with 8 color video insertions and alpha rendering disabled.

3. Make sure a proper license for video insertions in Unreal is generated (Video_Inputs_MVSE).

4. Configure a proper use case for Unreal (mstart.sh > Modify Configuration) using an appropriate HdvgCpProfile configuration file (A).

Note: The HdvgCpProfile file should have alpha output disabled.

5. Start the SceneLoader (mstart.sh > Restart SceneLoader).

6. From this moment, the mixing should already be visible on the screen, as the MixingFullScreen option is activated by default.

7. Load and activate any scene.

8. Adjust the floor level through GlobalExports > TrackedPawnFloorLevel.

9. Configure mixing for the currently used TrackedPawn.

a. Show the mixing layers through GlobalExports > MixingShowLayer, MixingShowShadows, MixingShowFullScreen.

Note: By default, all of these options are activated after a restart.

b. Adjust the layer's distance to the camera: GlobalExport > MixingDistanceToScreen.

Note: By default, the distance is set to 1000 cm, so it needs to be adjusted for smaller scenes.

c. If required, adjust the Key signal level through GlobalExport > MixingKeyBlackLevel, MixingKeyMiddleLevel, MixingKeyWhiteLevel, MixingKeyOldMiddleLevel.

Note: The default settings are as follows: MixingKeyBlackLevel (0.0), MixingKeyMiddleLevel (0.5), MixingKeyWhiteLevel (1.0), MixingKeyOldMiddleLevel (0.5). These values should work well with the Ultimatte12.

d. Use the appropriate Global Exports for Color Correction.

10. From this moment, the mixing should already be visible on the screen.

11. Adjust the delays on the tracking channel for Unreal in the related TrackingSet.cfg file to match the graphics with the camera feed.
