Deadline User Manual Release 7.0.3.0

Thinkbox Software

June 01, 2015

CONTENTS

1 Introduction
  1.1 Overview
  1.2 Feature Set
  1.3 Supported Software
  1.4 Render Farm Considerations

2 Installation
  2.1 System Requirements
  2.2 Licensing
  2.3 Database and Repository Installation
  2.4 Client Installation
  2.5 Submitter Installation
  2.6 Upgrading or Downgrading Deadline
  2.7 Relocating the Database or Repository

3 Getting Started
  3.1 Submitting Jobs
  3.2 Monitoring Jobs
  3.3 Controlling Jobs
  3.4 Archiving Jobs
  3.5 Monitor and User Settings
  3.6 Local Slave Controls

4 Client Applications
  4.1 Launcher
  4.2 Monitor
  4.3 Slave
  4.4 Pulse
  4.5 Balancer
  4.6 Command
  4.7 Mobile

5 Administrative Features
  5.1 Repository Configuration
  5.2 User Management
  5.3 Slave Configuration
  5.4 Pulse Configuration
  5.5 Balancer Configuration
  5.6 Job Scheduling
  5.7 Pools and Groups
  5.8 Limits and Machine Limits
  5.9 Job Failure Detection
  5.10 Notifications
  5.11 Remote Control
  5.12 Network Performance
  5.13 Cross Platform Rendering

6 Advanced Features
  6.1 Manual Job Submission
  6.2 Power Management
  6.3 Slave Scheduling
  6.4 Farm Statistics
  6.5 Client Configuration
  6.6 Auto Configuration
  6.7 Render Environment
  6.8 Multiple Slaves On One Machine
  6.9 Cloud Controls
  6.10 Web Service
  6.11 Job Transferring

7 Scripting
  7.1 Scripting Overview
  7.2 Application Plugins
  7.3 Event Plugins
  7.4 Cloud Plugins
  7.5 Balancer Plugins
  7.6 Monitor Scripts
  7.7 Job Scripts
  7.8 Web Service Scripts
  7.9 Standalone Python API

8 REST API
  8.1 REST Overview
  8.2 Jobs
  8.3 Job Reports
  8.4 Tasks
  8.5 Task Reports
  8.6 Slaves
  8.7 Pulse
  8.8 Balancer
  8.9 Limits
  8.10 Users
  8.11 Repository
  8.12 Pools
  8.13 Groups

9 Application Plugins
  9.1 3ds Command
  9.2 3ds Max
  9.3 After Effects
  9.4 Anime Studio
  9.5 Arion Standalone
  9.6 Arnold Standalone
  9.7 Blender
  9.8 Cinema 4D
  9.9 Cinema 4D Team Render
  9.10 Clarisse iFX
  9.11 Combustion
  9.12 Command Line
  9.13 Command Script
  9.14 Composite
  9.15 Corona Standalone
  9.16 DJV
  9.17 Draft
  9.18 Draft Tile Assembler
  9.19 FFmpeg
  9.20 Fusion
  9.21 Generation
  9.22 Hiero
  9.23 Houdini
  9.24 Lightwave
  9.25 LuxRender
  9.26 Mantra Standalone
  9.27 Maxwell
  9.28 Maya
  9.29 Mental Ray Standalone
  9.30 Messiah
  9.31 MetaFuze
  9.32 MetaRender
  9.33 modo
  9.34 Naiad
  9.35 Nuke
  9.36 Octane Standalone
  9.37 PRMan (Renderman Pro Server)
  9.38 Puppet
  9.39 Python
  9.40 Quicktime Generation
  9.41 Realflow
  9.42 REDLine
  9.43 Renderman (RIB)
  9.44 Rendition
  9.45 Rhino
  9.46 RVIO
  9.47 Salt
  9.48 Shake
  9.49 SketchUp
  9.50 Softimage
  9.51 Terragen
  9.52 Tile Assembler
  9.53 VRay Distributed Rendering
  9.54 VRay Standalone
  9.55 VRay Ply2Vrmesh
  9.56 VRay Vrimg2Exr
  9.57 Vue
  9.58 xNormal

10 Event Plugins
  10.1 Draft
  10.2 ftrack
  10.3 Puppet
  10.4 Salt
  10.5 Shotgun

11 Cloud Plugins
  11.1 Amazon EC2
  11.2 Google Cloud
  11.3 Microsoft Azure
  11.4 OpenStack
  11.5 vCenter

12 Release Notes
  12.1 Deadline 7.0.0.54 Release Notes
  12.2 Deadline 7.0.1.3 Release Notes
  12.3 Deadline 7.0.2.3 Release Notes
  12.4 Deadline 7.0.3.0 Release Notes




CHAPTER ONE

INTRODUCTION

1.1 Overview

Deadline is a hassle-free administration and rendering toolkit for Windows, Linux, and Mac OS X based render farms. It offers a world of flexibility and a wide range of management options for render farms of all sizes, and supports over 50 different rendering packages out of the box.

Deadline 7 is the latest version of Thinkbox Software's scalable high-volume compute management solution. It features built-in VMX (Virtual Machine Extension) capabilities, which allow artists, architects, and engineers to harness resources in both public and private clouds. In addition to enhanced cloud support, Deadline 7 expands support for the Jigsaw multi-region rendering feature, which can now be accessed in 3ds Max, Maya, modo, and Rhino.

Deadline 7 also introduces Draft 1.2, an update to Thinkbox's lightweight compositing and video processing plug-in designed to automate typical post-render tasks such as image format conversion, the creation of animated videos and QuickTimes, contact sheets, and watermark elements on exported images. Finally, Deadline 7 introduces a wealth of new features, enhancements, and bug fixes.

Note that a new 7.0 license is required to run this version. If you have a license for Deadline 6.2 or earlier, you will need an updated license. In addition, the version of Draft that ships with Deadline 7 requires a new 1.2 license. If you have a license for Draft 1.1 or earlier, you will need an updated license.

1.1.1 Components

The Deadline Render Farm Management System is made up of three components:
• A single Deadline Database
• A single Deadline Repository
• One or more Deadline Clients


The Database and Repository together act as a global system where all of Deadline’s data is stored. The Clients (workstations and render nodes) then connect to this system to submit, render, and monitor jobs. It is important to note that while the Database and Repository work together, they are still separate components, and therefore can be installed on separate machines if desired.

1.1.2 Database

The Database is the global database component of the Deadline Render Farm Management System. It stores the jobs, settings, and slave configurations. The Clients access the Database via a direct socket connection over the network. It only needs to be installed on one machine (preferably a server), and does not require a license.

1.1.3 Repository

The Repository is the global file system component of the Deadline Render Farm Management System. It stores the plugins, scripts, logs, and any auxiliary files (like scene files) that are submitted with the jobs. The Clients access the Repository via a shared network path. It only needs to be installed on one machine (preferably a server), and does not require a license.

1.1.4 Client

The Client should be installed on your render nodes, workstations, and any other machines that you want to take part in submitting, rendering, or monitoring jobs. The Client consists of the following applications:


• Launcher: Acts as a launch point for the Deadline applications on workstations, and facilitates remote communication on render nodes.
• Monitor: An all-in-one application that artists can use to monitor their jobs and administrators can use to monitor the farm.
• Slave: Controls the rendering applications on the render nodes.
• Command: A command line tool that can submit jobs to the farm and query for information about the farm (a submission sketch follows below).
• Pulse: An optional mini server application that performs maintenance operations on the farm, and manages more advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering, and the Web Service. If you choose to run Pulse, it only needs to be running on one machine.
• Balancer: An optional Cloud-controller application that can create and terminate Cloud instances based on things like available jobs and budget settings.
Note that the Slave and Balancer applications are the only Client applications that require a license.
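The Command application is what pipeline scripts typically call to submit work. The sketch below shows one way a script might drive it, assuming deadlinecommand is on the PATH. The plugin name, the job info and plugin info keys, and the command being run are illustrative assumptions; see the Manual Job Submission documentation for the keys your plugin actually expects.

    # A minimal sketch of scripted job submission through the Command client
    # application. All key names and paths below are illustrative assumptions.
    import os
    import subprocess
    import tempfile

    def write_info_file(info, suffix):
        """Write a key=value info file and return its path."""
        fd, path = tempfile.mkstemp(suffix=suffix, text=True)
        with os.fdopen(fd, "w") as f:
            for key, value in info.items():
                f.write("%s=%s\n" % (key, value))
        return path

    job_info = {
        "Plugin": "CommandLine",          # assumed plugin name
        "Name": "Example Command Line Job",
        "Frames": "1-100",
        "Priority": "50",
        "Pool": "none",
    }
    plugin_info = {
        "Executable": "/usr/bin/echo",    # hypothetical command to run per task
        "Arguments": "rendering task",
    }

    job_file = write_info_file(job_info, ".job")
    plugin_file = write_info_file(plugin_info, ".plugin")

    # deadlinecommand takes a job info file followed by a plugin info file.
    output = subprocess.check_output(["deadlinecommand", job_file, plugin_file])
    print(output.decode())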

1.1.5 Jobs

A Deadline job typically represents one of the following:
• The rendering of an image sequence from a 3D scene.
• The rendering of a frame sequence from a composition. It could represent a single write node, or multiple write nodes with the same frame range.
• The generation of a Quicktime movie from an existing image sequence.
• A simulation.
These are just some common cases. Since a job simply represents some form of processing, a plug-in can be created for Deadline to do almost anything you can think of.

Job Breakdown

A job can be broken down into one or more tasks, where each task is an individual unit that can be rendered by the Slave application. Each task can then consist of a single frame or a sequence of frames. Here are some examples:
• When rendering an animation with 3ds Max where each frame can take hours to render, each frame can be rendered as a separate task.
• When rendering a compositing job with After Effects where each frame can take seconds to render, each task could consist of 20 frames.
• When rendering a Quicktime job to create a movie from an existing sequence of images, the job would consist of a single task, and that task would consist of the entire image sequence.
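The breakdown above is just frame arithmetic. The short snippet below illustrates it for the examples given; it is not Deadline's internal code, only a conceptual sketch of how a frame range splits into tasks of a given size.

    # Conceptual illustration of the task breakdown described above.
    def build_tasks(start_frame, end_frame, frames_per_task):
        tasks = []
        frame = start_frame
        while frame <= end_frame:
            chunk_end = min(frame + frames_per_task - 1, end_frame)
            tasks.append((frame, chunk_end))
            frame = chunk_end + 1
        return tasks

    # A 100 frame After Effects comp at 20 frames per task yields 5 tasks,
    # while a 3ds Max animation at 1 frame per task yields 100 tasks.
    print(build_tasks(1, 100, 20))      # [(1, 20), (21, 40), (41, 60), (61, 80), (81, 100)]
    print(len(build_tasks(1, 100, 1)))  # 100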


Job Scheduling

Use numeric job priorities, machine groups and pools, and job-specific machine lists to explicitly control distribution of rendering resources among multiple departments. Limits allow you to handle both limited license plug-ins and render packages, while job dependencies and scheduling allow you to control when your jobs will begin rendering.


The Slave applications are fully responsible for figuring out which job they should render next, and they do this by connecting directly to the Database. In other words, there is no central server application that controls which jobs the Slaves are working on. The benefit to this is that as long as your Database and Repository are online, Deadline will be fully operational.
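As a purely conceptual illustration of how pools, priorities, and submission order can combine when a Slave decides what to work on next, consider the sketch below. This is not Deadline's actual dequeuing logic, and the job records are made up; it only demonstrates the idea described above.

    # Conceptual sketch only: not Deadline's scheduling code.
    from collections import namedtuple

    Job = namedtuple("Job", "name pool priority submitted")

    def pick_next_job(jobs, slave_pools):
        # Consider only jobs in a pool this Slave belongs to, then prefer higher
        # priority, and break ties by submission time (oldest first).
        candidates = [j for j in jobs if j.pool in slave_pools]
        if not candidates:
            return None
        return min(candidates, key=lambda j: (-j.priority, j.submitted))

    jobs = [
        Job("comp_shot010", "comp", 50, submitted=1),
        Job("fx_sim_020",   "fx",   80, submitted=2),
        Job("comp_shot011", "comp", 70, submitted=3),
    ]
    print(pick_next_job(jobs, slave_pools={"comp"}).name)  # comp_shot011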

1.2 Feature Set

1.2.1 Rock-steady Operation

Deadline’s unique architecture removes the need for a centralized manager application by using a highly-scalable database and basic file sharing to manage the farm. As long as your Database and File Server are running, Deadline is running.

1.2.2 Intuitive User Interface

Built with your creativity in mind, Deadline’s User Interface has evolved in response to extensive feedback from artists. The flexible and intuitive interface provides a unified experience to artists and administrators across all platforms. For job submission, Deadline offers integrated submission scripts for 3ds Max, After Effects, Blender, Cinema 4D, Clarisse iFX, Composite, Fusion, Generation, Hiero, Houdini, Lightwave, Maya, Messiah, modo, Nuke, RealFlow, Rhino, SketchUp 3D, Softimage, and Vue, providing a comfortable native environment for cross-application tasks.


1.2.3 Supported Software

Deadline supports over 50 different rendering packages out of the box. See the Supported Software page in the Deadline documentation for more information.

1.2.4 Customizable and Scriptable

With its Python based plug-in API, studios can customize the out of the box plug-ins and scripts to suit their individual pipelines, or create custom plug-ins to support in-house applications. Event plug-ins can be created to trigger events like updating pipeline tools when jobs are submitted or finish rendering, and Cloud plug-ins can be created to control VMs in public and private Cloud providers. Finally, job scripts can be created to set up custom dependencies, as well as perform operations when a job starts, when a job finishes, and before and after each task is rendered.
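A rough sketch of an event plugin is shown below, based on the skeleton described in the Event Plugins chapter. The class, callback, and property names are recalled from the Deadline scripting documentation and should be verified against the sample plugins shipped in the Repository before use; the pipeline notification itself is hypothetical.

    # Rough event plugin sketch; verify names against the shipped examples.
    from Deadline.Events import DeadlineEventListener

    def GetDeadlineEventListener():
        return NotifyPipelineListener()

    def CleanupDeadlineEventListener(listener):
        listener.Cleanup()

    class NotifyPipelineListener(DeadlineEventListener):
        def __init__(self):
            # Hook the callbacks this plugin cares about.
            self.OnJobSubmittedCallback += self.OnJobSubmitted
            self.OnJobFinishedCallback += self.OnJobFinished

        def Cleanup(self):
            del self.OnJobSubmittedCallback
            del self.OnJobFinishedCallback

        def OnJobSubmitted(self, job):
            # A real plugin might update an in-house pipeline database here.
            self.LogInfo("Job submitted: %s" % job.JobName)

        def OnJobFinished(self, job):
            self.LogInfo("Job finished: %s" % job.JobName)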

1.2.5 Flexible Job Scheduling

Use numeric job priorities, machine groups and pools, and job-specific machine lists to explicitly control distribution of rendering resources among multiple departments. Limits allow you to handle both limited license plug-ins and render packages, while job, asset, and script based dependencies allow you to control when your jobs will begin rendering. Stick with the default First-in, First-out scheduling logic, or switch to a Balanced or Weighted system. Launch and configure an arbitrary number of Slaves on a single machine. Each Slave instance can be given a unique name, and can be assigned its own list of pools and groups, which allows Slaves to work on separate jobs. A single high performance machine can process multiple 3D, compositing, and simulation jobs simultaneously. Slave instances running on the same machine will share a single Deadline license.

1.2.6 Notifications

Deadline can be configured to notify users of job completion or failure through an automatic e-mail notification or a popup message on the users’ machine. Administrators can also configure Deadline to notify them with information about Power Management, stalled Slaves, licensing issues, and other issues that may arise on the farm.

1.2.7 Statistics Gathering

Deadline automatically stores job and render farm statistics in the Database. Statistics can be viewed from the Monitor, or retrieved from the Database by custom pipeline tools.

1.2.8 Shotgun and ftrack Integration

Deadline integrates with Shotgun to enable a seamless render and review data flow. When a render job is submitted, a version is automatically created in Shotgun with key metadata. When the render is complete, Shotgun is updated with a thumbnail image, paths to frames, render stats, and playback links. Deadline can also automatically upload a movie and/or a filmstrip when the render is complete. Shotgun then dispatches targeted notifications with links back to the work. Studios can view versions in various contexts, create reports, and organize work into playlists for review sessions where they can quickly take notes with the Shotgun Note App. The Deadline/FTrack integration enables a seamless render and review data flow. When Deadline starts a render, an Asset Version is automatically created within FTrack using key metadata. When the render is complete, Deadline automatically updates the created Version appropriately – a thumbnail image is uploaded, components are created from the Job’s output paths (taking advantage of FTrack’s location plugins), and the Version is flagged for Review. In

doing so, Deadline provides a seamless transition from Job Submission to Review process, without artists needing to monitor their renders.

1.2.9 Draft

Draft is a tool that provides simple compositing functionality. It is implemented as a Python library, which exposes functionality for use in Python scripts. Draft is designed to be tightly integrated with Deadline, but it can also be used as a standalone tool. Using the Draft plugin for Deadline, artists can automatically perform simple compositing operations on rendered frames after a render job finishes. They can also convert them to a different image format, or generate Quicktimes for dailies. Active Deadline subscribers are entitled to Draft licenses at no additional cost, and can request a Draft license by emailing [email protected].
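The sketch below gives a feel for the kind of Draft script used to build a dailies movie from rendered frames. The Draft calls shown (Image.ReadFromFile, VideoEncoder, EncodeNextFrame, FinalizeEncoding) and the frame paths are assumptions based on the Draft documentation; check the Draft API reference for the exact signatures in your Draft version.

    # Rough sketch of a Draft dailies script; API names are assumptions.
    import Draft

    frame_paths = ["/mnt/renders/shot010/beauty.%04d.exr" % f for f in range(1, 101)]

    first_frame = Draft.Image.ReadFromFile(frame_paths[0])
    encoder = Draft.VideoEncoder("/mnt/renders/shot010/shot010_dailies.mov", 24,
                                 first_frame.width, first_frame.height)

    for path in frame_paths:
        encoder.EncodeNextFrame(Draft.Image.ReadFromFile(path))

    encoder.FinalizeEncoding()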

1.2.10 QuickTime Support

Install QuickTime on your slaves to create QuickTime movies from your rendered frames.

1.2.11 Jigsaw and Tile Rendering

Jigsaw is available for 3ds Max, Maya, modo, and Rhino, and can be used to split up large frames into arbitrarily sized tiles and distribute them over your render farm. When the tiles are finished rendering, they are automatically assembled into the final image using Draft. Specific tiles can be re-rendered and automatically composited on top of the original image. Regular tile rendering, which supports fixed tile sizes only, is still supported as well, and is available for 3ds Max, Maya, modo, Rendition, Rhino, and Softimage.
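The region math behind tile rendering is simple: the frame is divided into rectangular regions that can be rendered as separate tasks and assembled afterwards. Deadline and Draft handle the actual rendering and assembly; the snippet below only illustrates the idea with a fixed grid.

    # Conceptual illustration of splitting a frame into tile regions.
    def split_into_tiles(width, height, tiles_x, tiles_y):
        regions = []
        for ty in range(tiles_y):
            for tx in range(tiles_x):
                left = tx * width // tiles_x
                right = (tx + 1) * width // tiles_x - 1
                top = ty * height // tiles_y
                bottom = (ty + 1) * height // tiles_y - 1
                regions.append((left, top, right, bottom))
        return regions

    # A 4096 x 2160 frame split into a 4 x 4 grid gives 16 regions to distribute.
    print(len(split_into_tiles(4096, 2160, 4, 4)))  # 16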

1.2.12 Easy Installation and Upgrade Deployment

Deadline has gone through rigorous analysis to make the installation and configuration process smooth and efficient. A detailed document provides easy, step-by-step instructions explaining the various components that will be installed. In addition, Deadline has the ability to auto-upgrade the whole render farm from a centralized deployment - an incredible time-saver for large render farms. Auto Configuration allows studios to efficiently increase the size of their farm by removing the need to configure each new Slave individually. The Repository Path, License Server, and additional settings can be configured in a single location, and broadcast to the slaves when they start up.

1.2.13 Slave Scheduling Improvements and Idle Detection

Start and stop the slave based on the time of day to allow workstations to join the render farm overnight. Alternatively, start the slave if the machine has been idle for a certain amount of time, and stop it when the machine is in use again. Other criteria like CPU usage, memory usage, and running processes can also be checked before starting the slave. A warning message can also be displayed before the slave starts, allowing an artist to delay the start if they are still using the machine.


1.2.14 Local Slave Controls

Artists can monitor and control the slave application running on their workstation, which is useful if the slave is running as a service. Override the Idle Detection settings for your slave, or change the slave’s Job Dequeuing Mode to control if the slave should render all jobs, jobs submitted from the artist’s machine, or jobs submitted by specific users.

1.2.15 Remote Control and Farm Administration

Stream the log from a Slave in real time, or start, stop, and restart Slave instances (as well as the remote machines on which they are running) from within the Monitor. In addition, execute arbitrary command lines (applications, command line operations, or batch files) on a single remote machine or a group of remote machines to roll out software or install updates. Deadline also integrates seamlessly with VNC, Remote Desktop Connection, Apple Remote Desktop, and Radmin using custom scripts. These scripts can be modified, or new scripts can be created to support other remote access software.

1.2.16 Access Control and Auditing

While full access is granted for all users to modify their own jobs, the User Group Management System prevents users from inadvertently disrupting other jobs, and allows Administrators to configure the types of actions available to each user group. An optional password protected Super User mode allows for global network administration. Any command that affects a job or Slave is logged along with the originating user name and machine. This allows everyone, including project managers and supervisors, to track changes and troubleshoot issues with confidence. It also encourages responsibility and cooperation on the part of all users.

1.2.17 Reduced Energy Footprint

Save on energy consumption, power and cooling costs with Power Management, a feature that shuts down idle machines and starts them back up when needed. This feature is available for render farms with machines that support Wake On Lan.
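Wake On Lan itself is a simple mechanism: a "magic packet" of six 0xFF bytes followed by the target MAC address repeated 16 times, broadcast over UDP (port 7 by default, matching the port table later in this chapter). Power Management sends these packets for you; the sketch below only shows the mechanism, with placeholder MAC and broadcast addresses.

    # Minimal Wake On Lan magic packet sender (addresses are placeholders).
    import socket

    def send_magic_packet(mac, broadcast="255.255.255.255", port=7):
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        packet = b"\xff" * 6 + mac_bytes * 16
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))
        sock.close()

    send_magic_packet("00:11:22:33:44:55")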

1.3 Supported Software

Deadline offers extensive out of the box support for third party applications, as well as an Application Plugin API for custom plugin development. The following applications (and associated renderers) are supported out of the box.


1.3.1 3ds Max

Highlighted Features • Supports Versions 2010 to 2015 • 3ds Max and 3ds Max Design • Integrated Submission • RPManager Submission • Keeps Scene In Memory • Tile Rendering • Jigsaw Support • Interactive VRay Distributed Rendering • Offload VRay Distributed Rendering • Offload Mental Ray Distributed Rendering • Render To Texture Support • Maxscript Jobs • Scene States • Custom Sanity Check • Local Rendering • Sticky/Default Setting Configuration • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output File Path • Path Mapping Of Pre/Post Script Paths • Path Mapping Of Path Config File Path
Supported Renderers • Brazil r/s • Corona • finalRender • finalToon • Krakatoa • Maxwell • Mental Ray • RenderPipe • VRay

Documentation: 3ds Max Documentation, 3ds Command Documentation

1.3.2 After Effects

Highlighted Features • Supports Versions CS3 to CS6 and CC to CC2014 • Integrated Submission • Local Rendering • Multi-Machine Rendering • Submit Layers As Separate Jobs • Custom Sanity Check • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output Path • Path Mapping Of Scene File Contents (.aepx format only)

Documentation: After Effects Documentation


1.3.3 Anime Studio

Highlighted Features • Supports Versions 8 to 10 • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output Path

Documentation: Anime Studio Documentation

1.3.4 Arion Standalone

Highlighted Features • Supports Version 2 and Later • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path • Path Mapping Of Output Path

Documentation: Arion Standalone Documentation

1.3.5 Arnold Standalone

Highlighted Features • Supports the Pre-Release Beta and Version 1 • Local Rendering • Shotgun Support • ftrack Support • Path Mapping Of Input File Paths • Path Mapping Of Output Path • Path Mapping Of Plugin Folder Paths

Documentation: Arnold Standalone Documentation

1.3.6 Blender

Highlighted Features • Supports Version 2.5 and Later • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output Path
Supported Renderers • All


Documentation: Blender Documentation

1.3.7 Cinema 4D

Highlighted Features • Supports Versions 12 to 16 • Integrated Submission • Local Rendering • Automatic Scene Exporting • Team Render Support • Custom Sanity Check • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output Path
Supported Renderers • All

Documentation: Cinema 4D Documentation, Cinema 4D Team Render Documentation

1.3.8 Clarisse iFX

Highlighted Features • Integrated Submission • Automatic Render Archiving • Path Mapping Of Scene File Path • Path Mapping Of Config File Path • Path Mapping Of Module Paths • Path Mapping Of Search Paths

Documentation: Clarisse iFX Documentation

1.3.9 Combustion

Highlighted Features • Supports Versions 4 and 2008 • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Scene File Contents

Documentation: Combustion Documentation


1.3.10 Command Line

Highlighted Features • Run Arbitrary Command Line Jobs • Run The Same Command For Different Frames • Run Different Commands For Different Tasks • Path Mapping Of Executable File Path • Path Mapping Of Arguments

Documentation: Command Line Documentation, Command Script Documentation

1.3.11 Composite

Highlighted Features • Supports Versions 2010 to 2015 • Integrated Submission • Shotgun Support • ftrack Support

Documentation: Composite Documentation

1.3.12 Corona Standalone

Highlighted Features • Override number of passes and render time during submission • Specify multiple configuration files to use when rendering • Path Mapping Of Scene File Path • Path Mapping Of Output File Path • Path Mapping Of Config File Paths

Documentation: Corona Standalone Documentation

1.3.13 DJV

Highlighted Features • Image/Movie Type Conversion • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Input File Path • Path Mapping Of Output File Path • Path Mapping Of Slate Input Path

Documentation: DJV Documentation


1.3.14 Draft

Highlighted Features • Deep Integration With Deadline • Create Movies From Rendered Images • Perform Other Image Processing • Shotgun Support • ftrack Support • Path Mapping Of Template File Path • Path Mapping Of Template Arguments

Documentation: Draft Documentation

1.3.15 FFmpeg

Highlighted Features • Up To 10 Input Files or Sequences • Path Mapping Of Input File Paths • Path Mapping Of Output File Path • Path Mapping Of Video Preset File Path • Path Mapping Of Audio Preset File Path • Path Mapping Of Subtitle Preset File Path

Documentation: FFmpeg Documentation

1.3.16 Fusion

Highlighted Features • Supports Versions 5 to 7 • Integrated Submission • Keeps Scene In Memory • Custom Sanity Check • Shotgun Support • ftrack Support • Draft Support

Documentation: Fusion Documentation

1.3.17 Generation

Highlighted Features • Integrated Submission • Submit Comp Jobs To Fusion

Documentation: Generation Documentation


1.3.18 Hiero

Highlighted Features • Integrated Submission • Submit Transcoding Jobs To Nuke

Documentation: Hiero Documentation

1.3.19 Houdini

Highlighted Features • Supports Versions 9 to 13 • Integrated Submission • Submit ROPs as Separate Jobs • IFD Export Jobs • Custom Sanity Check • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output File Path • Path Mapping Of IFD File Path
Supported Renderers • All

Documentation: Houdini Documentation

1.3.20 Lightwave

Highlighted Features • Supports Versions 8 to 11 and 2015 • FPrime Rendering • Integrated Submission • Keeps Scene In Memory • Custom Sanity Check • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Config Folder Path • Path Mapping Of Content Folder Path • Path Mapping Of Content File Contents
Supported Renderers • All

Documentation: Lightwave Documentation


1.3.21 LuxRender

Highlighted Features • Path Mapping Of Scene File Path

Documentation: LuxRender Documentation

1.3.22 Mantra Standalone

Highlighted Features • Supports Versions 7 to 13 • Shotgun Support • ftrack Support • Path Mapping Of IFD File Path • Path Mapping Of Output File Path • Path Mapping Of IFD File Contents

Documentation: Mantra Standalone Documentation

1.3.23 Maxwell

Highlighted Features • Supports Versions 2 and 3 • Cooperative Rendering • Automatic MXI Merging • Local Rendering • Resume Rendering from MXI Files • Override Time and Sampling Level Values • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path • Path Mapping Of MXI File Path • Path Mapping Of Output File Path

Documentation: Maxwell Documentation


1.3.24 Maya

Highlighted Features • Supports Versions 2010 to 2015 • Integrated Submission • Keeps Scene In Memory • Tile Rendering • Jigsaw Support • VRay Distributed Rendering • Local Rendering • Submit Layers As Separate Jobs • Submit Cameras As Separate Jobs • Mental Ray Export Jobs • VRay Export Jobs • Renderman Export Jobs • Arnold Export Jobs • Melscript/Python Script Jobs • Custom Sanity Check • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output Folder Path • Path Mapping Of Project Folder Path • Path Mapping Of Scene File Contents (.ma format only)
Supported Renderers • 3Delight • Arnold • Caustic Visualizer • finalRender • Gelato • Krakatoa • Maxwell • mayaSoftware • mayaHardware • mayaVector • Mental Ray • Octane • Redshift • Renderman • Turtle • VRay

Documentation: Maya Documentation

1.3.25 Mental Ray Standalone

Highlighted Features • Local Rendering • Shotgun Support • ftrack Support • Path Mapping Of Input File Path • Path Mapping Of Output File Path

Documentation: Mental Ray Standalone Documentation


1.3.26 Messiah

Highlighted Features • Integrated Submission • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path • Path Mapping Of Output Folder Path • Path Mapping Of Content Folder Path

Documentation: Messiah Documentation

1.3.27 MetaFuze

Highlighted Features • Batch Folder Submission • Path Mapping Of Scene File Path

Documentation: MetaFuze Documentation

1.3.28 MetaRender

Highlighted Features • Path Mapping Of Input File Path • Path Mapping Of Output File Path

Documentation: MetaRender Documentation

1.3.29 modo

Highlighted Features • Supports Versions 3xx to 8xx • Integrated Submission • Keeps Scene In Memory • Modo Distributed Rendering • Tile Rendering • Jigsaw Support • Pass Groups Support • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output File Path
Supported Renderers • All

Documentation: modo Documentation


1.3.30 Naiad

Highlighted Features • Simulation Jobs • EMP to PRT Conversion Jobs • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path • Path Mapping Of EMP File Path

Documentation: Naiad Documentation

1.3.31 Nuke

Highlighted Features • Supports Versions 6 to 9 • Integrated Submission • Keeps Scene In Memory • Submit Write Nodes As Separate Jobs • Specify Views to Render • Render Using Proxy Mode • Custom Sanity Check • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Scene File Contents

Documentation: Nuke Documentation

1.3.32 Octane Standalone

Highlighted Features • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path • Path Mapping Of Output File Path

Documentation: Octane Standalone Documentation


1.3.33 PRMan (Renderman Pro Server)

Highlighted Features • Shotgun Support • ftrack Support • Path Mapping Of Input File Path • Path Mapping Of Working Directory Path

Documentation: PRMan Documentation

1.3.34 Python

Highlighted Features • Supports Versions 2.3 to 2.7 and 3.0 to 3.2 • Submit Python Scripts as Jobs • Path Mapping Of Script File Path • Path Mapping Of Script Arguments

Documentation: Python Documentation

1.3.35 Quicktime

Highlighted Features • Generate Quicktime Movies from Images • Shotgun Support • ftrack Support • Path Mapping Of Input File Path • Path Mapping Of Output File Path • Path Mapping Of Audio File Path

Documentation: Quicktime Documentation

1.3.36 RealFlow

Highlighted Features • Supports Versions 4 to 5, and 2012 to 2014 • Integrated Submission • Submit IDOCs as Separate Jobs • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path

Documentation: RealFlow Documentation


1.3.37 REDLine

Highlighted Features • Path Mapping Of Scene File Path • Path Mapping Of Output Folder Path • Path Mapping Of RSX File Path

Documentation: REDLine Documentation

1.3.38 Renderman (RIB)

Note that while this plugin supports PRMan, it is recommended that you use PRMan's dedicated plugin instead if you are using that renderer.
Highlighted Features • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Input File Path
Supported Renderers • 3Delight • AIR • Aqsis • BMRT • Entropy • PRMan • Pixie • RenderDotC • RenderPipe

Documentation: Renderman Documentation

1.3.39 Rendition

Highlighted Features • Tile Rendering • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path • Path Mapping Of Output File Path

Documentation: Rendition Documentation


1.3.40 Rhino

Highlighted Features • Supports Versions 4 and 5 • Integrated Submission • Render Bongo • Shotgun Support • ftrack Support • Draft Support • Tile Rendering • Jigsaw Support • Path Mapping Of Scene File Path • Path Mapping Of Output File Path
Supported Renderers • Brazil r/s • Flamingo Raytrace • Flamingo Photometric • Maxwell • Penguin • Rhino • TreeFrog • VRay

Documentation: Rhino Documentation

1.3.41 RVIO

Highlighted Features • Shotgun Support • ftrack Support • Path Mapping Of Input File Paths • Path Mapping Of Audio File Paths • Path Mapping Of Output File Path

Documentation: RVIO Documentation

1.3.42 Shake

Highlighted Features • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path

Documentation: Shake Documentation

1.3.43 SketchUp

Highlighted Features • Supports Versions 7 to 8 and 2013 to 2015 • Integrated Submission • Export 3D Models • Export 2D Images • Export 2D Image Sequences • Path Mapping Of Scene File Path • Path Mapping Of Export Directory Path
Supported Renderers • All


Documentation: SketchUp Documentation

1.3.44 Softimage

Highlighted Features • Supports Versions 2010 to 2015 • Integrated Submission • Keeps Scene In Memory • Tile Rendering • Local Rendering • Submit Passes As Separate Jobs • Fx Render Tree Jobs • Shotgun Support • ftrack Support • Draft Support • Path Mapping Of Scene File Path • Path Mapping Of Output File Path • Path Mapping Of Workgroup Folder Path
Supported Renderers • All

Documentation: Softimage Documentation

1.3.45 Terragen

Highlighted Features • Supports Versions 2 to 3 • Local Rendering • Path Mapping Of Scene File Path • Path Mapping Of Output File Path

Documentation: Terragen Documentation

1.3.46 VRay Distributed Rendering

Highlighted Features • Submit Spawner Jobs to Reserve Machines • Interactive Distributed Rendering
Supported Applications • 3ds Max (fully integrated) • Maya (fully integrated) • Rhino (spawner launching only) • SketchUp (spawner launching only) • Softimage (fully integrated) • VRay Standalone (spawner launching only)

Documentation: VRay Distributed Rendering Documentation


1.3.47 VRay Standalone

Highlighted Features • VRIMG to EXR Conversion • PLY to VRMESH Conversion • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path • Path Mapping Of Output File Path • Path Mapping Of Scene File Contents

Documentation: VRay Standalone Documentation, Ply2Vrmesh Documentation, Vrimg2Exr Documentation

1.3.48 Vue

Highlighted Features • Supports Versions 7 to 11 and 2014 • Integrated Submission • Shotgun Support • ftrack Support • Path Mapping Of Scene File Path

Documentation: Vue Documentation

1.3.49 xNormal

Highlighted Features • Path Mapping Of Scene File Path

Documentation: xNormal Documentation

1.4 Render Farm Considerations

This is a list of things that should be taken into consideration before installing Deadline.

1.4.1 Rendering Software and Licensing

It is recommended that the applications you plan to use for rendering (e.g. 3ds Max, Maya) be installed on all of your render nodes. It is preferable that you install an application to the same location on each machine, because this makes configuring the Deadline plugins easier. Note that some applications support being installed and run from a network location, which can make setup and configuration easier. Refer to your rendering application's documentation to see if this is supported. In addition, it is recommended that all licensing that your rendering applications require be set up before attempting to render on your network. Deadline doesn't handle the licensing of 3rd party rendering applications, so you should refer to your application's documentation or contact its support team if you run into issues with licensing.


1.4.2 Store Assets On The Network

It is recommended that all assets (e.g. scenes, footage, textures) used by your render jobs be placed on a network share (preferably a server), which can be accessed via a shared path or a mapped network drive. This is important for two reasons:
• It ensures that all the slaves in your render farm have access to your asset files.
• It ensures that the slaves use the same version of the asset files that are used by your job.
Note that you can optionally submit the scene file with the job. This results in the scene file being sent to the Repository or an alternate location, and then copied locally to the Slave that renders it. If the scene file contains relative asset paths, it is recommended to not submit the scene file with the job, as these relative paths will likely be broken when the Slave renders the scene from its local location.
When rendering in a mixed OS environment, you can configure Deadline to swap paths based on the operating system it is running on. The way this works is often specific to the rendering application that you are using, so please refer to the Cross-Platform Rendering Considerations section for the plug-in that you are using for more information. You can access plug-in specific documentation in the Plug-ins documentation.
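The idea behind path swapping is a simple prefix substitution per operating system. Deadline performs this mapping itself based on the path mapping settings in the Repository Options; the mapping table and paths below are purely an example of the concept, not Deadline's configuration format.

    # Conceptual sketch of cross-platform path swapping (example paths only).
    PATH_MAP = {
        "windows": "\\\\server\\projects\\",
        "linux": "/mnt/projects/",
        "mac": "/Volumes/projects/",
    }

    def map_path(path, target_os):
        for prefix in PATH_MAP.values():
            if path.startswith(prefix):
                suffix = path[len(prefix):]
                if target_os == "windows":
                    return PATH_MAP["windows"] + suffix.replace("/", "\\")
                return PATH_MAP[target_os] + suffix.replace("\\", "/")
        return path  # no known prefix, leave the path untouched

    print(map_path("\\\\server\\projects\\shot010\\scene.ma", "linux"))
    # /mnt/projects/shot010/scene.ma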

1.4.3 Save Output Files To The Network

All output should be saved to a network share as well (preferably on a server). This is important because it ensures that all the slaves in your render farm have access to the output path. When rendering in a mixed OS environment, you can configure Deadline to swap output paths based on the operating system it is running on. The way this works is often specific to the rendering application that you are using, so please refer to the Cross-Platform Rendering Considerations section for the plug-in that you are using for more information. You can access plug-in specific documentation in the Plug-ins documentation.

1.4.4 Remote Administration

Deadline has a Remote Administration feature that can be enabled in the Client Setup section of the Repository Options, which can be accessed from the Monitor by selecting Tools -> Configure Repository Options while in Super User Mode. This feature allows you to control all the render nodes remotely from a single machine, including starting and stopping the Slave application, and running arbitrary command line applications on each machine. However, this feature can be a potential security risk if you are not behind a firewall. If this is the case, we recommend that you keep this feature disabled.

1.4.5 Automatic Updates

Deadline has an Automatic Updates feature that can be enabled in the Client Setup section of the Repository Options, which can be accessed from the Monitor by selecting Tools -> Configure Repository Options while in Super User Mode. Enabling this feature makes minor Deadline upgrades easy, with little to no downtime. Refer to the Upgrading Documentation for more information.

1.4.6 Setup An SMTP Server for Emails

Deadline can use email to notify users when their jobs have succeeded or failed. Email can also be used to notify system administrators of all sorts of events, like when slaves stall or when jobs fail. It is recommended that an SMTP server be set up so that you can make use of these features. You can configure the email notification settings in the Repository Options, which can be accessed from the Monitor by selecting Tools -> Configure Repository Options while in Super User Mode.
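Before pointing the Repository Options email settings at a mail server, it can be worth confirming the server accepts mail from a farm machine. The short snippet below is one way to do that; the host, port, and addresses are placeholders for your own mail server and accounts.

    # Quick SMTP reachability test (placeholder host and addresses).
    import smtplib
    from email.mime.text import MIMEText

    msg = MIMEText("Deadline SMTP notification test.")
    msg["Subject"] = "Deadline SMTP test"
    msg["From"] = "[email protected]"
    msg["To"] = "[email protected]"

    with smtplib.SMTP("smtp.example.com", 25, timeout=10) as server:
        server.sendmail(msg["From"], [msg["To"]], msg.as_string())
        print("SMTP server accepted the test message.")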


1.4.7 Auto Login on Windows Render Nodes

If you're not running the Slave as a service, it can be set to start automatically when the render node it is on starts up, but this requires that the render node log in automatically. On Windows, this can be done by modifying the registry on each render node. These are the steps to set up your render node registry for automatic login:
1. Download the Registry Entry File For Auto Login from the Miscellaneous Deadline Downloads Page.
2. Edit the file to use the desired username and password.
3. Log in to the render node as the specified user, then double-click the file to run it.
4. The next time you restart the machine, it should log in automatically as the specified user.
By default, the Slaves are set to start automatically when the machine logs in. This setting, as well as others, can be modified from the Launcher on each machine.

1.4.8 App Nap on Mac OS X Render Nodes and Workstations

App Nap is a collection of new features in OS X Mavericks that helps conserve CPU energy by "slowing down" or stopping applications that cannot be seen, for example if they are behind another window or the screen has been put to sleep. However, this can have an adverse effect on Deadline and/or the applications it is rendering with. Because of this, we recommend disabling App Nap and screen power saving modes (if applicable) on render nodes across the entire operating system by following these steps:
1. Open a terminal (the Terminal can be found in /Applications/Utilities).
2. Run the following command (sudo rights required), then restart the machine:
defaults write NSGlobalDomain NSAppSleepDisabled -bool YES
If you wish to re-enable App Nap, follow the steps above, but run the following command for step (2) instead:
defaults delete NSGlobalDomain NSAppSleepDisabled
You can check the status of the setting (if it already exists on a machine) with the following command, where "1" means App Nap is disabled and "0" means it is enabled:
defaults read NSGlobalDomain NSAppSleepDisabled
If workstations are being used as render nodes, it is recommended to disable App Nap on them as well. However, if workstations are simply being used to submit and monitor render jobs, then this shouldn't be necessary. On Macs which have built-in or connected external displays, once a screen saver has begun or the display has been put to sleep by power management, Deadline as well as other rendering applications will be throttled down to conserve energy, regardless of the per-app App Nap setting.
Finally, the machine that is running Pulse/Balancer should also have App Nap disabled, or at the very least, disabled for the Pulse/Balancer applications. To disable App Nap for the Pulse/Balancer application only, right-click (or Command-click) on the DeadlinePulse/DeadlineBalancer application in Finder, and select Get Info. Then in the General section, check the "Prevent App Nap" box. If Pulse/Balancer is currently running, you will have to restart it for the changes to take effect.


1.4.9 Disable WER on Windows

When applications crash on Windows, the system holds the application open in memory and displays a series of dialog boxes asking if you want to submit the error report to Microsoft. While that's handy for all sorts of reasons, if there's no one there to click the dialog (for example, on a headless render node), Deadline will assume the application is still running and wait indefinitely by default.
The registry fix below stops those dialogs from popping up on unattended render nodes, so that when an application crashes, it actually exits. This change is system-wide, but can be configured per-user if you like by changing the registry hive used (HKEY_CURRENT_USER versus HKEY_LOCAL_MACHINE). Ensure you restart the machine after changing the registry setting, and it is always recommended to take a backup before editing a machine's registry.
Copy the code below into a file named "DisableCrashReporting.reg" and double-click this file as a user with administrator privileges. Alternatively, you can manually add/edit the registry entry via "regedit.exe", or inject the registry entry silently via the command line: "regedit.exe /s DisableCrashReporting.reg".

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Windows Error Reporting]
"Disabled"=dword:00000001

For more information about the possible settings, see the MSDN article on WER Settings. It's also possible to default to sending the reports automatically if you like, or to store the crash dumps in a safe place if you're a developer.


1.4.10 Firewall, Anti-Virus & Security Considerations

Here is a checklist of items which should be considered by those responsible for deploying the Deadline repository and client software. Ensure you consider additional configuration requirements for any software/hardware firewall clients, network switches, anti-virus software clients, and Operating System specific security controls such as Windows UAC or SELinux (Security-Enhanced Linux), which may attempt to block Deadline communication.
It is recommended during initial setup & configuration to disable all firewalls, anti-virus software, etc., and test the basic operation and functionality of Deadline. Once this has been verified as correct, slowly re-enable all necessary other software, re-testing and confirming that Deadline execution is still correct.
Windows UAC
Ensure Windows UAC is correctly configured to allow Deadline communication and the correct execution of the Deadline applications.
Anti-Virus Software
Ensure Anti-Virus software does NOT block Deadline and allows Deadline executables to run normally on ALL machines.
Deadline Executables
Allow Deadline executables to pass through any applicable Client Firewall. Ensure you consider all applicable policy scopes (Windows - domain, private, public) and both inbound & outbound rules:
• [INSTALL_PATH] Windows executable / Mac OSX executable / Linux executable
• [INSTALL_PATH] deadlinecommand.exe / DeadlineCommand.app / deadlinecommand
• [INSTALL_PATH] deadlinecommandbg.exe / DeadlineCommandBG.app / deadlinecommandbg
• [INSTALL_PATH] deadlinelauncher.exe / DeadlineLauncher.app / deadlinelauncher
• [INSTALL_PATH] deadlinelauncherservice.exe (Windows Only)
• [INSTALL_PATH] deadlinemonitor.exe / DeadlineMonitor.app / deadlinemonitor
• [INSTALL_PATH] deadlineslave.exe / DeadlineSlave.app / deadlineslave
• [INSTALL_PATH] deadlinepulse.exe / DeadlinePulse.app / deadlinepulse
• [INSTALL_PATH] deadlinebalancer.exe / DeadlineBalancer.app / deadlinebalancer
Deadline's default local client software [INSTALL_PATH] for each OS is as follows (where # is the Deadline version):
• Windows: "C:\Program Files\Thinkbox\Deadline#\bin"
• Mac OSX: "/Applications/Thinkbox/Deadline#/bin"
• Linux: "/opt/Thinkbox/Deadline#"
Application Executables
Make sure you allow your application executables to pass through any applicable Client Firewall. Ensure you consider all applicable policy scopes (Windows - domain, private, public) and both inbound & outbound rules. See the 3dsMax Firewall Exceptions documentation for a specific example.
MongoDB Server & Deadline Clients
Ensure you allow the MongoDB service daemon to pass through any firewall and network switch. Ensure you consider all applicable policy scopes (Windows - domain, private, public) and both inbound & outbound rules:
• [INSTALL_PATH] Windows executable / Mac OSX executable / Linux executable


• [INSTALL_PATH] mongod.exe / mongod / mongod
Deadline's default local database software [INSTALL_PATH] for each OS is as follows (where # is the Deadline version):
• Windows: "c:\DeadlineDatabase#\mongo\application\bin"
• Mac OSX: "/Applications/Thinkbox/DeadlineDatabase#/mongo/application/bin"
• Linux: "/opt/Thinkbox/DeadlineDatabase#/mongo/application/bin"
Mono (Mac OSX / Linux Only)
Ensure the Mono executable is allowed to pass through any firewall / anti-virus software.
Port Configuration
Ensure the machine(s) running MongoDB, the Deadline repository, and Deadline Pulse/Balancer/Monitor/Slave ALL have the ability to communicate with each other on your local and/or extended network over the following (default) TCP or UDP ports. A simple way to verify these ports are reachable from a render node is sketched at the end of this section.
• UDP 17061 - Pulse auto-configuration: Default UDP port - Pulse listens for broadcasts on this UDP port
• TCP 17061 - Pulse auto-configuration: Default TCP port - Pulse sends auto-config data over TCP
• TCP 17062 - Pulse: Default TCP port - "Configure Repository Options" - "Pulse Settings" - "General"
• TCP 27017 - MongoDB
• TCP 28017 - MongoDB Web API: Access the http web site (optional) for database information
• TCP 8080 - Pulse WebService: Default TCP port - "Configure Repository Options" - "Pulse Settings" - "WebService"
• UDP 7 - WoL (Wake-On-Lan): Default UDP port - "Configure Repository Options" - "Wake On Lan Settings"
• UDP 123 - NTP
• TCP 25 - SMTP: For the mail server to receive e-mail notifications from Slaves and Pulse
• TCP 587 - SMTP (submission)
• TCP 465 - SMTP SSL: For sending notifications using SSL
License Server
If necessary, ensure that the Thinkbox Flexlm license file has been configured to run over an exact TCP port and this port has also been allowed access through any required firewall or network switch. Please refer to the FLEXnet Licensing Documentation.
External Pulse Access & Deadline Mobile
If external network access is required, please see the Pulse Web Service and Deadline Mobile documentation.
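As mentioned above, a quick reachability check for a few of the default ports can help confirm the firewall configuration, run from a render node or workstation. The host names below are placeholders for your own Database and Pulse machines.

    # Simple TCP reachability check for a few of the default ports listed above.
    import socket

    CHECKS = [
        ("mongodb.example.com", 27017),  # MongoDB
        ("pulse.example.com", 17061),    # Pulse auto-configuration (TCP)
        ("pulse.example.com", 8080),     # Pulse Web Service
    ]

    for host, port in CHECKS:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(3)
        try:
            sock.connect((host, port))
            print("OK   %s:%d" % (host, port))
        except OSError as err:
            print("FAIL %s:%d (%s)" % (host, port, err))
        finally:
            sock.close()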

CHAPTER TWO

INSTALLATION

2.1 System Requirements

This section covers the system requirements for all the Deadline components. It is also recommended to read through the Render Farm Considerations documentation before proceeding with the installation. For a more complete description of the Deadline components listed below, see the Deadline Overview documentation.

2.1.1 Database

Deadline uses MongoDB for the Database, and requires MongoDB 2.6.1 or later. The Repository installer can install the MongoDB database for you, or you can use an existing MongoDB installation providing that it is running MongoDB 2.6.1 or later.
The following operating systems are supported for the Database:
• Windows Server 2003 and later (64-bit)
• Linux (64-bit)
• Mac OS X 10.7 and later (64-bit)
These are the minimum recommended hardware requirements for a production Database:
• 64-bit Architecture
• 8 GB RAM
• 4 Cores
• RAID or SSD disks
Note that MongoDB performs best if all the data fits into RAM, and it has fast disk write speeds. In addition, larger farms may have to scale up on RAM and Cores as necessary, or even look at Sharding their database. Finally, while you can install MongoDB to a 32-bit system for testing, it has limitations and is not recommended for production. For example, the database size will be limited to 2 gigabytes, and Journaling will be disabled. Without Journaling, it will not be possible to repair the database if a crash corrupts the data. See the MongoDB FAQ for more information.

Windows

If you choose a non-Server Windows Operating System (Vista, 7, or 8) to host the database, you should be aware that these operating systems have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of more than 10 machines, it is very likely that you'll hit this limitation every now and then (and the odds continue to increase as the number of machines increases). This is a limitation of the operating systems, and isn't something that we can work around, so we recommend using a Server edition of Windows, or a different operating system like Linux.

Linux

If you choose a Linux system to host the database, you will need to make sure the system resource limits are configured properly to avoid connection issues. More details can be found in the Database and Repository Installation Guide. Other Linux recommendations include:
• Do not run MongoDB on systems with Non-Uniform Memory Access (NUMA). It can cause a number of operational problems, including slow performance or high system process usage.
• Install on a system with a minimum Linux kernel version of 2.6.36.
• Install on a system with Ext4 or XFS file systems.
• Turn off atime or relatime for the storage volume containing the database files, as it can impact performance.
• Do not use hugepages virtual memory pages, as MongoDB performs better with normal virtual memory pages.

Mac OS X

If you choose a Mac OS X system to host the database, you will need to make sure the system resource limits are configured properly to avoid connection issues. More details can be found in the Database and Repository Installation Guide.

2.1.2 Repository

The Repository is just a collection of files and folders, so it can be installed to any type of share on any type of operating system. Common Repository choices include:
• Windows Server
• Linux
• FreeBSD
While the Repository can be installed on any operating system, the Repository installer is only supported on the following operating systems. To install on a different operating system, first create the network share on that system, and then run the Repository installer on one of the systems below and choose the network share as the installation location.
• Windows (32 and 64-bit)
  – Windows XP and later (32 and 64-bit)
  – Windows Server 2003 and later (32 and 64-bit)
• Linux (64-bit only)
  – Ubuntu 12.04 and later
  – Debian 7 and later
  – Fedora 16 and later
  – CentOS 6 and later
  – RHEL 6 and later
• Mac OS X (64-bit only)
  – 10.7 (OS X Lion) and later
If you choose a non-Server Windows Operating System (XP, Vista, 7, or 8), be aware that these operating systems usually will not allow more than 10 incoming connections without purchasing additional user access licenses from Microsoft. This means that if more than 10 machines (render nodes or workstations) connect to the Repository, connections will be dropped, which could result in unexpected behavior. This is a limitation of the operating systems, and isn't something that we can work around, so we recommend using a Server edition of Windows, or a different operating system like Linux or FreeBSD.
For hardware requirements, it mainly depends on whether you are planning to submit scene files and other auxiliary files with your jobs. If you are, keep in mind that the Repository machine will need to serve out these files to the Client machines, so you will want to treat it like another asset server when it comes to picking hardware. That being said, if you already have an asset server, you could probably just install the Repository on it. If you are not submitting your scene files with your jobs (because they are already stored in a network location), then you should be fine with a less powerful machine.

2.1.3 Client

The Client can be installed on Windows, Linux, or Mac OS X. The requirements for today's rendering applications go far beyond the requirements of the Client, so if a machine is powerful enough to be used for rendering, it is more than capable of running the Client applications. If you choose to run Pulse or Balancer, and you wish to run it on the same machine as the Database and/or Repository, you will have to install the Client on that machine as well.
The following operating systems are supported for the Client:
• Windows (32 and 64-bit)
  – Windows XP and later (32 and 64-bit)
  – Windows Server 2003 and later (32 and 64-bit)
• Linux (64-bit only)
  – Ubuntu 12.04 and later
  – Debian 7 and later
  – Fedora 16 and later
  – CentOS 6 and later
  – RHEL 6 and later
• Mac OS X (64-bit only)
  – 10.7 (OS X Lion) and later
Note that if you are choosing a machine to run Pulse, you should be aware that non-Server editions of Windows have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of more than 10 render nodes, it is very likely that you'll hit this limitation every now and then (and the odds continue to increase as the number of machines increases). This is a limitation of the operating systems, and isn't something that we can work around, so we recommend using a Server edition of Windows, or a different operating system like Linux.

2.1.4 License Server

Deadline requires FlexNet License Server version 11.12 or later, and the license server can be run on the following operating systems:
• Windows (32 and 64-bit)
  – Windows XP and later (32 and 64-bit)
  – Windows Server 2003 and later (32 and 64-bit)
• Linux (64-bit only)
  – Ubuntu 12.04 and later
  – Debian 7 and later
  – Fedora 16 and later
  – CentOS 6 and later
  – RHEL 6 and later
• Mac OS X (64-bit only)
  – 10.7 (OS X Lion) and later
See the License Server Documentation for more information on the License Server requirements. Note that if you choose a non-Server Windows Operating System (XP, Vista, 7, or 8), you should be aware that these operating systems have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of more than 10 machines, it is very likely that you'll hit this limitation every now and then (and the odds continue to increase as the number of machines increases). This is a limitation of the operating systems, and isn't something that we can work around, so we recommend using a Server edition of Windows, or a different operating system like Linux.

2.2 Licensing

See the License Server Documentation for more information on installing and configuring the License Server.

2.3 Database and Repository Installation

2.3.1 Overview

Before proceeding with this installation, it is highly recommended to read through the Render Farm Considerations documentation.

The Database is the global database component of the Deadline Render Farm Management System. It stores the jobs, settings, and slave configurations. The Clients access the Database via a direct socket connection over the network. It only needs to be installed on one machine (preferably a server), and does not require a license. Deadline uses MongoDB for the Database.

The Repository is the global file system component of the Deadline Render Farm Management System. It stores the plugins, scripts, logs, and any auxiliary files (like scene files) that are submitted with the jobs. The Clients access the Repository via a shared network path. It only needs to be installed on one machine (preferably a server), and does not require a license.

The Database and Repository together act as a global system where all of Deadline's data is stored. The Clients then connect to this system to submit, render, and monitor jobs. It is important to note that while the Database and Repository work together, they are still separate components, and therefore can be installed on separate machines if desired. The Repository installer can install the MongoDB database for you, but you can also choose to connect to an existing MongoDB installation.


2.3.2 Installation

While the Repository can be installed on any operating system, the Repository installer is only available for Windows, Linux, and Mac OS X. However, the machine that you run the Repository installer on doesn’t have to be the same machine you’re installing the Repository to. For example, if you have an existing share on a FreeBSD server or a NAS system, you can run the Repository installer on Windows, Linux, or Mac OS X and choose that share as the install location. To install the Repository, simply run the appropriate installer for your operating system and follow the steps. This procedure is identical for all operating systems. The Repository installer also supports silent installations.


When choosing the Installation Directory, you can choose either a local path on the current machine, or the path to an existing network share. Note that if you choose a local path, you must ensure that path is shared on the network so that the Clients can access it. Do not install over an existing installation unless it’s the same major version, or there could be unexpected results. If you’re installing over an existing Repository installation, all previous binaries, plug-ins, and scripts will be backed up prior to being overwritten. After the installation is complete, you can find these backed up files in the Backup folder in the Repository installation root. Note that installing over an existing repository is only supported for repairing a damaged repository, or for performing a minor upgrade. Major upgrades require a fresh repository installation. See the Upgrading or Downgrading Deadline Documentation for more information.


After choosing the installation directory, you will be asked to install the MongoDB Database, or connect to an existing one. If you choose to install the MongoDB Database, you will be asked to choose an installation location and a port number. It is highly recommended that you choose a local directory to install the Database. Note that Deadline 7 requires a newer version of the MongoDB database application than the one shipped with Deadline 6. However, this newer version is backward compatible with Deadline 6. So if you are installing the MongoDB database application to a machine that already has a Deadline 6 database installed, you can just install it over top of the existing Deadline 6 database installation.

Next, you need to specify the Database Settings so that the installer can set up the Database. These settings will also be used by the Clients to connect to the database. The following are required:

• Database Server: The host name or the IP address of the machine that the MongoDB database is running on. If desired, you can specify multiple entries and separate them with semicolons. There are a couple of reasons to specify multiple entries:
  – You have machines on different subnets that need to access the database differently (i.e. machines in the cloud might use a different host name than machines on the local network).
  – Some machines need to resolve the database machine by its host name, and others need to use its IP address.
  Note that if there are IP addresses listed that cannot be resolved, the Deadline Command application can run slower on Linux and OSX Clients because it won't exit until the connection attempts for those IP addresses time out.
• Database Port: The port that the MongoDB database is listening on.
• Database Name: The name of the Database. If you are setting up a new Database, you can leave this as the default. If you are connecting to an existing Database, make sure to enter the same name you used when you initially set up the Database.
• Replica Set: If you set up your MongoDB database manually and it is part of a Replica Set, specify the Replica Set Name here. If you don't have a replica set, just leave this blank.

When you press Next, the installer will try to connect to the database using these settings to configure it. This can take a minute or two. If an error occurs, you will be prompted with the error message. If the setup succeeds, you can then proceed with the installation of the Repository.
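As a concrete illustration, a Database Server entry that lets some machines resolve the database by host name and others by IP address might look like the following. Both values here are hypothetical placeholders, and the Database Port would be whatever you chose during the MongoDB install (such as the 27070 used in the default config file shown later):

mongo-server;192.168.2.50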


Command Line or Silent Installation

The Repository installer can be run in command line mode or unattended mode on each operating system. Note though that on Mac OS X, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is inside the Mac Repository Installer package. To run in command line mode, pass the "--mode text" command line option to the installer. For example, on Linux:

./DeadlineRepository-X.X.X.X-linux-x64-installer.run --mode text

To run in silent mode, pass the "--mode unattended" command line option to the installer. For example, on Windows:

DeadlineRepository-X.X.X.X-windows-installer.exe --mode unattended

To get a list of all available command line options, pass the "--help" command line option to the installer. For example, on Mac OS X:

/DeadlineRepository-X.X.X.X-osx-installer.app/Contents/MacOS/installbuilder.sh --help

Note that there are a few Repository installer options that are only available from the command line, which you can view when running the "--help" command. These options include:

• --dbauth: If enabled, Deadline will use the given user and password to connect to MongoDB (if authentication is enabled on your database).
• --dbuser: The user name to connect to MongoDB if authentication is enabled.
• --dbpassword: The password to connect to MongoDB if authentication is enabled.
• --dbsplit: If enabled, the database collections will be split into separate databases to improve performance (this is enabled by default).
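Putting these together, a hedged sketch of an unattended Repository install on Linux that connects to a MongoDB instance with authentication enabled might look like the following. The user name and password are placeholders, and the exact value syntax for the boolean options should be confirmed against the output of --help for your installer version:

./DeadlineRepository-X.X.X.X-linux-x64-installer.run --mode unattended --dbauth true --dbuser deadlineAdmin --dbpassword examplePassword --dbsplit true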

Database Config File

A file called config.conf is installed to the data directory in the database installation folder. This file is used to configure the MongoDB database, and can be modified to add or change functionality. This is what you will typically see by default:

#MongoDB config file

#where to log
systemLog:
    destination: file
    path: C:/DeadlineDatabase7/data/logs/log.txt
    quiet: true
    #verbosity:

#port for mongoDB to listen on
#uncomment below ipv6 and REST option to enable them.
net:
    port: 27070
    #ipv6: true
    #http:
    #    RESTInterfaceEnabled: true

#where to store the data
storage:
    dbPath: C:/DeadlineDatabase7/data

#enable sharding
#sharding:
#    clusterRole
#    configDB

#setup replica set with give replica set name
#replication:
#    replSetName

#enable authentication
#security:
#    authorization: enabled

After making changes to this file, simply restart the mongod process for the changes to take effect. See the MongoDB Configuration File Options for more information on the available options.
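For example, one way to enable authentication would be to uncomment the security section so it reads as follows (a sketch only; the YAML indentation must be preserved), and then restart the mongod process as described above:

security:
    authorization: enabled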

Manual Database Installation

The Repository installer installs MongoDB with the bare minimum settings required for Deadline to operate. Manually installing the Database might be preferable for some because it gives you greater control over things like authentication, and allows you to create sharded clusters or replica sets for backup. If you wish to install MongoDB manually, you can download MongoDB from the MongoDB Downloads Page. Once MongoDB is running, you can then run the Repository installer, and choose to connect to an existing MongoDB Database. Here are some helpful links for manually installing the MongoDB database:

• Installing MongoDB
• Enabling Authentication
• Replication
• Sharding

MongoDB also has a management system called MMS. It's a cloud service that makes it easy to provision, monitor, backup, and scale your MongoDB database. Here are some helpful links for setting up and using MMS:

• Getting Started
• Add MongoDB Servers to MMS
• Install the Automation Agent

The Automation Agent mentioned above makes it possible to set up your MongoDB database from a web interface, and easily configure which MongoDB servers are replica sets or shards. It also allows you to easily upgrade the version of your MongoDB database. Here are some additional links for how you can use the Automation Agent:

• Deploy a Replica Set
• Deploy a Sharded Cluster
• Deploy a Standalone MongoDB Instance
• Change the MongoDB Version

Note though that as of this writing, the Automation Agent is only available for Linux and Mac OS X.


Database Resource Limits

Linux and Mac OS X systems impose a limit on the number of resources a process can use, and these limits can affect the number of open connections to the database. It is important to be aware of these limits, and make sure they are set appropriately to avoid unexpected behaviour. Note that MongoDB will allocate 80% of the system limit for connections, so if the system limit is 1024, the maximum number of connections will be 819.

If you choose a Linux system to host the database, make sure the system limits are configured properly to avoid connection issues. See MongoDB's Linux ulimit Settings documentation for more information, as well as the recommended system limits to use.

If you choose a Mac OS X system to host the database, and you use the Repository installer to install the database, the resource limits will be set to 1024. These limits can be adjusted later by manually editing the HardResourceLimits and SoftResourceLimits values in /Library/LaunchDaemons/org.mongodb.mongod.plist after the Repository installer has finished.
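As a quick sanity check on Linux, you can inspect and raise the limits for the shell that launches mongod before starting it. The commands below are a sketch; the 64000 value is only an example in line with MongoDB's recommendations, and raising limits system-wide typically also requires editing /etc/security/limits.conf or the service's init script:

ulimit -n            # show the current open files limit
ulimit -u            # show the current max user processes limit
ulimit -n 64000      # raise the open files limit for this shell session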

2.3.3 Open Firewall Ports

To ensure that the Deadline applications can communicate with MongoDB, you will need to update the firewall on the machine that MongoDB is running on. You can either disable the firewall completely (assuming it operates in an internal network), or you can open the port that you chose for the database to use during install. More information on opening ports can be found below.

Windows

Open Windows Firewall with Advanced Security. Click on Inbound Rules in the left panel to view all inbound rules, and then right-click on Inbound Rules and select New Rule to start the Inbound Rule Wizard. Select Port for the Rule Type, and then click Next.


On the Protocol and Ports page, choose TCP, specify the port that you chose for the database during the install, and then press Next. Then on the Action page, choose Allow The Connection and press Next.


On the Profile page, choose the networks that this rule applies to, and then press Next. Then on the Name page, specify a name for the rule (for example, MongoDB Connection), and then press Finish.


Linux

On RedHat and CentOS, the following commands should allow incoming connections to the Mongo database if iptables are being used. Just make sure to specify the port that you chose for the database during the install.

sudo iptables -I INPUT 1 -p tcp --dport 27070 -j ACCEPT
sudo ip6tables -I INPUT 1 -p tcp --dport 27070 -j ACCEPT
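Note that rules inserted this way are typically not persistent across reboots. On systems that manage the firewall through the iptables service (an assumption about your setup, not a requirement), one common way to save them is:

sudo service iptables save
sudo service ip6tables save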

Ubuntu has no firewall installed by default, and we have not yet tested Fedora Core’s FirewallD.

Mac OS X

Mac OS X has its firewall disabled by default, but if enabled, it is possible to open ports for specific applications. Open up System Preferences, choose the Security & Privacy option, and click on the Firewall tab.


Press the Firewall Options button to open the firewall options. Press the [+] button and choose the path to the mongod application, which can be found in the database installation folder in mongo/application/bin (for example, /Applications/Thinkbox/DeadlineDatabase7/mongo/application/bin/mongod). Then click OK to save your settings.


2.3.4 Sharing The Repository Folder

In general, the Repository must have open read and write permissions for Deadline to operate properly. This section explains how to share your Repository folder and configure its permissions to ensure the Clients have full access. Without full read/write access, the Client applications will not be able to function properly. Note that this guide is for applying full read/write permissions to the entire Repository folder structure. For the more advanced user, it is possible to enforce tighter restrictions on the Repository folders. Just make sure the Clients have full read/write access to the following folders in the Repository. The rest must have at least read access.

• jobs: This is where job auxiliary files are copied to during submission.
• jobsArchived: This is where archived jobs are exported to.
• reports: This is where the physical log files for job and slave reports are saved to.

Windows

First, you need to configure the Repository folder permissions. Note that the images shown here are from Windows XP, but the procedure is basically the same for any version of Windows.

• On the machine where the Repository is installed, navigate to the folder where it is installed using Windows Explorer.


• Right-click on the Repository folder and select Properties from the menu.
• Select the Security tab.

• If there is already an Everyone item under Group or user names, you can skip the next two steps.
• Click on the Add button.
• In the resulting dialog, type Everyone and click OK.


• Select Everyone under Group or user names.
• Ensure that Modify, Read & Execute, List Folder Contents, Read, and Write are all checked under the Allow column.
• Click on the OK button to save the settings.


Second, you need to share the Repository folder. Note that the images shown here are from Windows XP, but the procedure is basically the same for any version of Windows.

• On the machine where the Repository is installed, navigate to the folder where it is installed using Windows Explorer.
• Right-click on the Repository folder and select Properties from the menu. If you're unable to see the Sharing tab, you may need to disable Simple File Sharing in the Explorer Folder Options.


• Select the Sharing tab.


• Select the option to Share This Folder, then specify the share name.
• Click the Permissions button.
• Give Full Control to the Everyone user.
• Press OK on the Permissions dialog and then the Properties dialog.


Linux

Since the Clients expect full read and write access to the repository, it's recommended to use a single user account to mount shares across all machines. It is possible to add particular users to a 'deadline' group, but you will need to experiment with that on your own. So for both of the sharing mechanisms we explain below, you'll need to create a user and a group named 'deadline'. They don't need a login or credentials, we just need to be able to set files to be owned by them and for their account to show up in /etc/passwd. So, to do this use the useradd command.

sudo useradd -d /dev/null -c "Deadline Repository User" -M deadline

This should create a user named “deadline” with no home folder, and a fancy comment. The account login should also be disabled, meaning your standard users can’t ssh or ftp into your file server using this account. Set a password using sudo passwd deadline if you need your users to login as deadline using ftp or ssh.


Now add a group using:

sudo groupadd deadline

And finally, have the Repository owned by this new user and group:

sudo chown -R deadline:deadline /path/to/repository
sudo chmod -R 777 /path/to/repository

Now you're ready to set up your network sharing protocol. There are many ways this can be done, and this just covers a few of them.

Samba Share

This is an example entry in the /etc/samba/smb.conf file:

[DeadlineRepository]
path = /path/to/repository
writeable = Yes
guest ok = Yes
create mask = 0777
force create mode = 0777
force directory mode = 0777
unix extensions = No

NFS Share

The simplest thing that could possibly work. Note that this is not the most secure thing that could possibly work. For Linux and BSD, open up /etc/exports as an administrator, and make one new export:

/path/to/repository 192.168.2.0/24(rw,all_squash,insecure)

Breakdown of this command is as follows:

• /path/to/repository: The Repository folder to share. Change the path as necessary.
• 192.168.2.0/24: The IP range to allow. The zero is important for these ranges. You can also go by hostname if you have reverse DNS, or * to allow from anyone's computer.
• rw: Allow read/write for the repository, which is required for the Clients to operate properly.
• all_squash: Make every single person who connects to the Repository share map to the nobody:nogroup user and group. This relieves a lot of permissions pain for new users at the cost of zero security. Files and folders within your repository will be fully readable and writeable by whomever is able to connect to your NFS server. The Clients require this, but it can also be achieved by creating a group and adding individual users into that group. Many studios will only need all_squash as Deadline will keep track of who submits what jobs.
• insecure: Required for Mac OS X to mount nfs shares. It simply means that NFS doesn't need to receive requests on a port in the secure port range (a port number less than 1024).

Once that's done, you may need to install an NFS server. To do so, open a terminal or your favourite package manager to install one. For Ubuntu Server, type the following:

sudo apt-get install nfs-kernel-server

Then start up the server (for those living in an init.d world):


sudo /etc/init.d/nfs-kernel-server start

Any time you change the exports file, you’ll need to issue the same command, but replace ‘start’ with ‘reload’. There is an excellent tutorial here as well: https://help.ubuntu.com/community/SettingUpNFSHowTo
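Once the export is in place, each render node needs to mount the share. A minimal sketch (the server name, repository path, and mount point are placeholders) looks like this, with an optional /etc/fstab entry so the mount survives reboots:

sudo mkdir -p /mnt/DeadlineRepository
sudo mount -t nfs fileserver:/path/to/repository /mnt/DeadlineRepository

# example /etc/fstab entry using the same placeholder names
fileserver:/path/to/repository  /mnt/DeadlineRepository  nfs  defaults  0  0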

Mac OS X

First, you need to configure the Repository folder permissions. Note that the images shown here are from Leopard (10.5), but the procedure is basically the same for any version of Mac OS X.

• On the machine where the Repository is installed, navigate to the folder where it is installed using Finder.
• Right-click on the Repository folder and select Get Info from the menu.
• Expand the Sharing & Permissions section, and unlock the settings if necessary.
• Give everyone Read & Write privileges.
• While probably not necessary, also give admin Read & Write privileges.


If you prefer to set the permissions from the Terminal, run the following commands:

$ chown -R nobody:nogroup /path/to/repository
$ chmod -R 777 /path/to/repository

Now you can share the folder. There are many ways this can be done, and this just covers a few of them.

Using System Preferences

Note that the images shown here are from Leopard (10.5), but the procedure is basically the same for any version of Mac OS X.

• Open System Preferences, and select the Sharing option.


• Make sure File Sharing is enabled, and then add the Repository folder to the list of shared folders.
• Under Users, give everyone Read & Write privileges.
• If sharing with Windows machines, press the Options button and make sure the "Share files and folders using SMB (Windows)" option is enabled.

Samba Share

Interestingly, Mac OS X uses samba as well. Apple just does a good job of hiding it. To create a samba share in Mac OS X, paste this at the bottom of /etc/smb.conf:

[DeadlineRepository]
path = /path/to/repository
writeable = Yes
guest ok = Yes
create mask = 0777
force create mode = 0777
force directory mode = 0777
unix extensions = No

2.3.5 Uninstallation

The Repository installer creates an uninstaller in the folder that you installed the Repository to. To uninstall the Repository, simply run the uninstaller and confirm that you want to proceed with the uninstallation.


Note that if you installed the Database with the Repository installer, it will be uninstalled as well. If you chose to connect to a Database that you manually installed, the Database will be unaffected.

Command Line or Silent Uninstallation

The Repository uninstaller can be run in command line mode or unattended mode on each operating system. To run in command line mode, pass the "--mode text" command line option to the installer. For example, on Linux:

./uninstall --mode text

To run in silent mode, pass the "--mode unattended" command line option to the installer. For example, on Windows:

uninstall.exe --mode unattended

To get a list of all available command line options, pass the "--help" command line option to the installer. For example, on Mac OS X:

./uninstall --help

2.4 Client Installation

2.4.1 Overview

Before proceeding with this installation, it is highly recommended to read through the Render Farm Considerations documentation. This guide will walk you through the installation of the Client. At this point, you should already have the Database and Repository installed. If you do not, please see the Database and Repository Installation documentation for installation instructions. The Client consists of the following applications:

• Launcher: Acts as a launch point for the Deadline applications on workstations, and facilitates remote communication on render nodes.
• Monitor: An all-in-one application that artists can use to monitor their jobs and administrators can use to monitor the farm.
• Slave: Controls the rendering applications on the render nodes.
• Command: A command line tool that can submit jobs to the farm and query for information about the farm.


• Pulse: An optional mini server application that performs maintenance operations on the farm, and manages more advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering, and the Web Service. If you choose to run Pulse, it only needs to be running on one machine.
• Balancer: An optional Cloud-controller application that can create and terminate Cloud instances based on things like available jobs and budget settings. If you choose to run Balancer, it only needs to be running on one machine.

Note that the Slaves and the Balancer applications are the only Client applications that require a license.

2.4.2 Installing The Clients

The Client should be installed on your render nodes, workstations, and any other machines you wish to participate in submitting, rendering, or monitoring jobs. The Slaves and the Balancer applications are the only Client applications that require a license. Before you can configure the license for the Client, the license server must be running. See the Licensing documentation for more information. If you choose to run Pulse, you need to install the Client on the chosen machine. Note that if you wish to run it on the same machine as the Database and/or Repository, you still have to install the Client on that machine. There are Client installers for Windows, Linux, and Mac OS X. To install the Client, simply run the appropriate installer for your operating system and follow the steps.

Windows

Start the installation process by double-clicking on the Windows Client Installer. The Windows Client installer also supports silent installations with additional options.


Choose an installation location and press Next to continue.


Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:

• Repository Directory: This is the shared path to the Repository.
• License Server: The license server entry should be in the format @SERVER, where SERVER is the host name or IP address of the machine that the license server is running on. If you configured your license server to use a specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave this blank for now.

The following Launcher settings are available:

• Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher starts.
• Install Launcher As A Service: Enable this if you wish to install the Launcher as a service. The service must run under an account that has network access. See the Windows Service documentation below for more information.

After configuring the Client and Launcher settings, press Next to continue with the installation.

Linux

Start the installation process by double-clicking on the Linux Client Installer. The Linux Client installer also supports silent installations with additional options.


Choose an installation location and press Next to continue.


Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:

• Repository Directory: This is the shared path to the Repository.
• License Server: The license server entry should be in the format @SERVER, where SERVER is the host name or IP address of the machine that the license server is running on. If you configured your license server to use a specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave this blank for now.

The following Launcher settings are available:

• Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher launches.
• Install Launcher As A Daemon: Enable this if you wish to install the Launcher as a daemon. You can also choose to run the daemon as a specific user. If you leave the user blank, it will run as root instead. See the Linux Daemon documentation below for more information.

After configuring the Client and Launcher settings, press Next to continue with the installation.

Mac OSX

Start the installation process by double-clicking on the Mac Client Installer. The Mac Client installer also supports silent installations with additional options.


Choose an installation location and press Next to continue.


Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:

• Repository Directory: This is the shared path to the Repository. Deadline isn't able to understand paths starting with "afp://" or "smb://", so point the installer to the Repository path mounted under "/Volumes".
• License Server: The license server entry should be in the format @SERVER, where SERVER is the host name or IP address of the machine that the license server is running on. If you configured your license server to use a specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave this blank for now.

The following Launcher settings are available:

• Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher launches.

After configuring the Client and Launcher settings, press Next to continue with the installation.

2.4.3 Command Line or Silent Installation

The Client installer can be run in command line mode or unattended mode on each operating system. Note though that on OSX, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is inside the Mac Client Installer package. To run in command line mode, pass the "--mode text" command line option to the installer. For example, on Linux:


./DeadlineClient-X.X.X.X-linux-x64-installer.run --mode text

To run in silent mode, pass the "--mode unattended" command line option to the installer. For example, on Windows:

DeadlineClient-X.X.X.X-windows-installer.exe --mode unattended

To get a list of all available command line options, pass the "--help" command line option to the installer. For example, on OSX:

/DeadlineClient-X.X.X.X-osx-installer.app/Contents/MacOS/installbuilder.sh --help

Note that there are quite a few Client installer options that are only available from the command line, which you can view when running the "--help" command. These options include:

• --configport: The port that the Client uses for Auto Configuration.
• --launcherport: The Launcher uses this port for Remote Administration, and it should be the same on all Client machines.
• --launcherstartup: If enabled, the Launcher will automatically launch when the system logs in (non-service mode on Windows only).
• --restartstalled: If enabled, the Launcher will try to restart the Slave application on this machine if it stalls.
• --autoupdateoverride: Overrides the Auto Update setting for this client installation (leave blank to use the value specified in the Repository Options).
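For example, a hedged sketch of a silent render node install on Linux that combines a few of these options might look like the following; the boolean value syntax, and any additional options such as the repository path, should be confirmed against the output of --help for your installer version:

./DeadlineClient-X.X.X.X-linux-x64-installer.run --mode unattended --launcherstartup true --restartstalled true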

2.4.4 Installing as a Service or Daemon

On Windows and Linux, you can choose to install the Launcher as a service or daemon during installation. There are a few things to keep in mind when running Deadline in this mode.

Windows Service

When running as a service on Windows, the Launcher will run without displaying its system tray icon. If the Slave or Pulse application is started through the Launcher while it is in this mode, they will also run without a user interface. Finally, the Launcher can still perform an auto-upgrade, but only when launching the Slave and Pulse applications (launching the Monitor, for example, will not invoke an upgrade).

Note that when running the Launcher as a service, the Slave or Pulse application will also run in a service context. Since services run in a different environment, and potentially under a different user profile than the one currently logged in, certain considerations need to be made. First, the default user for a service has no access to network resources, so while the Launcher service will run without any issues, neither the Slave nor Pulse applications will be able to access the Repository. To avoid network access issues, you must configure the service to run as a user with network privileges. Typical desktop users have this permission, but check with your system administrator to find which account is best for this application.

Another issue presented by the service context is that there is no access to the default set of mapped drives. Applications will either need to map drives for themselves, or make use of UNC paths. While Deadline supports Automatic Drive Mapping, the SMB protocol does not allow sharing a resource between two users on the same machine. This means that mapping of drives or accessing a resource with different credentials may fail when running as a service on a machine which already requires access to the Repository.
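In practice, this usually means referencing the Repository (and other shared resources) by UNC path rather than by a mapped drive letter when the Launcher runs as a service. The share name below is a hypothetical example:

R:\DeadlineRepository               (mapped drive letter - may not exist in the service session)
\\fileserver\DeadlineRepository     (UNC path - resolvable without a drive mapping)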


There is also an issue with hardware-based renderers. Starting with Windows Vista, services now run in a virtualized environment which prevents them from accessing hardware resources. Because the renderer will run in the context of a service, hardware-based renderers will typically fail to work.

Linux Daemon

When installing the daemon, the Client installer creates the appropriate deadlinelauncherservice script in /etc/init.d. When running as a daemon on Linux, the Launcher will run without displaying its system tray icon. If the Slave or Pulse application is started through the Launcher while it is in this mode, they will also run without a user interface. This is useful when running Deadline on a Linux machine that doesn’t have a Desktop environment.
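Since the installer places the deadlinelauncherservice script in /etc/init.d, the daemon can be controlled like any other init.d service. A sketch of the typical commands, assuming a SysV-style init as the script's location implies:

sudo /etc/init.d/deadlinelauncherservice start
sudo /etc/init.d/deadlinelauncherservice stop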

2.4.5 Client License Configuration

Before you can configure the license for the Client, the license server must be running. See the Licensing documenta- tion for more information. If you didn’t configure the license for the Client during installation (see above), there are a couple of ways to set the license for the Client. The quickest way is to use the right-click menu in the Launcher or the File menu in the Slave application to change the license server.

The other option is to set up Auto Configuration so that the Client automatically pulls the license server information.

2.4.6 Uninstallation

The Client installer creates an uninstaller in the folder that you installed the Client to. To uninstall the Client, simply run the uninstaller and confirm that you want to proceed with the uninstallation.


Command Line or Silent Uninstallation

The Client uninstaller can be run in command line mode or unattended mode on each operating system. To run in command line mode, pass the "--mode text" command line option to the installer. For example, on Linux:

./uninstall --mode text

To run in silent mode, pass the "--mode unattended" command line option to the installer. For example, on Windows:

uninstall.exe --mode unattended

To get a list of all available command line options, pass the "--help" command line option to the installer. For example, on Mac OS X:

./uninstall --help

2.5 Submitter Installation

2.5.1 Overview

This guide will walk you through the installation of the integrated submitters, which can be used to submit jobs from within your application (3ds Max, Maya, Nuke, etc). These should be installed on any machines you wish to submit jobs from. Note that jobs can also be submitted from the Submit menu in the Monitor. See the Submitting Jobs documentation for more information. At this point, you should already have the Database and Repository installed, and the Client software installed. If you do not, please see the Database and Repository Installation and Client Installation documentation for installation instructions. You also need to have the software that you will be submitting from installed as well (3ds Max, Maya, Nuke, etc).

2.5.2 Installing The Submitters

The submitter installers can be found in the submission folder in the Deadline Repository. Open the folder for the application you want to install the submitter for (3dsmax, Maya, Nuke, etc), and then open the Installers folder. There will be an installer for each operating system that the current application runs on. Simply run the appropriate installer and follow the steps below. Note that these steps are similar for each application and each operating system.


Choose the Repository location that you would like to install the submitters from.


Select the components you wish to install (the installer will try to auto select the versions it detects), and then verify the install location for each one. After configuring these, press Next to continue with the installation.


2.5.3 Silent Installation

The Submitter installers can be run in command line mode or unattended mode on each operating system. Note though that on OSX, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is inside the Mac Submitter Installer package. To run in command line mode, pass the "--mode text" command line option to the installer. For example, on Linux:

./Nuke-submitter-linux-installer.run --mode text

To run in silent mode, pass the "--mode unattended" command line option to the installer. For example, on Windows:

Maya-submitter-windows-installer.exe --mode unattended

To get a list of all available command line options, pass the "--help" command line option to the installer. For example, on OSX:

/Maya-submitter-osx-installer.app/Contents/MacOS/installbuilder.sh --help

Note that there are quite a few Submitter installer options that are only available from the command line, which you can view when running the "--help" command. These options include:

• --enable-components: Select the components which you would like to enable (programs installed in default locations will be auto selected).
• --disable-components: Select the components which you would like to disable (programs installed in default locations will be auto selected).
• --destDir###: The destination directories for the components (will be defaulted to if installed in default locations).

An example batch script that puts these all together:

@echo off
.\Maya-submitter-windows-installer.exe --mode unattended --disable-components Maya2014
.\3dsMax-submitter-windows-installer.exe --mode unattended --enable-components 3dsMax2011,3dsMax2015 --disable-components 3dsMax2012,3dsMax2013,3dsMax2014 --destDir2011 "C:\3dsMax2011_64"
.\Nuke-submitter-windows-installer.exe --mode unattended

This script installs the submitters for Maya (ignoring Maya 2014), 3ds Max (2011 and 2015 only, with 2011 in an unusual directory), and Nuke (default settings).

2.6 Upgrading or Downgrading Deadline

2.6.1 Overview

This will guide you through the process of upgrading or downgrading an existing Deadline installation.

2.6.2 Major Upgrades or Downgrades

If upgrading to a new major version (for example, Deadline 6 to 7), or downgrading from a new major version (for example, Deadline 7 to 6), you will need to install a new Repository and Database, and you will need to reinstall the Client software. This is necessary because there are often breaking changes between major releases. Do not install over an existing installation unless it’s the same major version, or there could be unexpected results.


Note that Deadline 7 requires a newer version of the MongoDB database application. However, this newer version is backward compatible with Deadline 6. So if you are installing the MongoDB database application to a machine that already has a Deadline 6 database installed, you can just install it over top of the existing Deadline 6 database installation. You should also reinstall your integrated submission scripts on your workstations, since it's possible these were changed between major releases. See the Application Plug-ins documentation for more information on how to set up the integrated submission scripts (where applicable). The license server should also be upgraded to ensure it will work with newer releases in case there are incompatibilities with the previous version of the license server. Please refer to the following documentation for more information:

• Database and Repository Installation Guide
• Client Installation Guide
• Licensing Guide

2.6.3 Minor Upgrades or Downgrades

Note that for Mac OS X users upgrading from 6.0 to 6.1, you must run the Deadline 6.1 Client installer on your Mac OS X machines. Unfortunately, there was a bug in the 6.0 Client installer that prevents the auto-upgrade to 6.1 from working properly. This bug has been fixed in the 6.1 Client installer, so you will be able to auto-upgrade from 6.1 to 6.2 in the future. Windows and Linux users can still auto-upgrade from 6.0 to 6.1 as documented here.

If upgrading or downgrading to a minor version that is part of the same major release cycle (for example, Deadline 7.0 to 7.0.1, or Deadline 6.2 to 6.1), you can simply install over the existing installation. If you have Automatic Upgrades / Downgrades enabled, you can have the Clients automatically upgrade or downgrade themselves after upgrading or downgrading the Database and Repository. Automatic Upgrades / Downgrades can be enabled in the Client Setup section of the Repository Configuration. You can also enable Remote Administration in the Client Setup section of the Repository Configuration. This will make it easier to upgrade or downgrade your render nodes remotely.

Note that this upgrade/downgrade method is only supported when upgrading or downgrading an existing Repository installation. For example, it is NOT recommended to install the Deadline 6.1 Repository to a new location and then have your 6.0 Clients upgrade by pointing them to the new Repository path. Instead, you should first move your Repository installation and then do the upgrade once your 6.0 Clients are connected to the new Repository.

Upgrading or Downgrading the Database and Repository

Launch the new Repository installer, and choose the existing Repository folder for the Installation Directory. Then choose the option to connect to an existing MongoDB database, and use the same Database Settings you used when installing the previous version (they should be pre-populated for you). During the installation, all binaries, plug-ins, and scripts from the previous version will be backed up. You can find them in the backup folder in the Repository after the installation is complete. Note that any scripts or plugins in the ‘custom’ folder will not be affected when upgrading the Repository. After upgrading or downgrading the Database and Repository, you can then upgrade or downgrade the Clients.


Upgrading or Downgrading Pulse

Before upgrading or downgrading all of your client machines, you should first upgrade or downgrade Pulse. If you don’t have Automatic Upgrades / Downgrades enabled, you will have to upgrade or downgrade Pulse manually, which involves running the Client Installer on the Pulse machine. See the Client Installation Guide for more information. If you have Automatic Upgrades / Downgrades enabled, all you have to do is restart the Pulse application through the Launcher. The Client will notice that the Repository has been upgraded or downgraded, and will automatically upgrade or downgrade itself. To restart Pulse remotely, Remote Administration must be enabled. Select Pulse in the Pulse List in the Monitor while in Super User mode, then right click and select Remote Control -> Restart Pulse. See the Remote Control documentation for more information.

Upgrading or Downgrading the Clients

If you don't have Automatic Upgrades / Downgrades enabled, you will have to upgrade or downgrade the Clients manually, which involves running the Client Installers on the machines. See the Client Installation Guide for more information. If you have Automatic Upgrades / Downgrades enabled, all you have to do is restart the Slave application on each render node through the Launcher. The Client will notice that the Repository has been upgraded or downgraded, and will automatically upgrade or downgrade itself. In addition, the next time artists launch the Monitor on their workstations through the Launcher, their installation will also be upgraded or downgraded.

To restart the Slaves remotely, Remote Administration must be enabled. Select the Slaves you want to upgrade or downgrade in the Monitor while in Super User mode, then right click and select Remote Control -> Restart Slaves. If the slaves are currently rendering and you don't want to disrupt them, you can choose the option to Restart Slaves After Current Task instead. This option will allow the Slaves to upgrade or downgrade after they finish rendering their current task to prevent the loss of any render time. See the Remote Control documentation for more information.

After restarting the Slaves, several Slaves may appear offline or a message may pop up saying that certain Slaves did not respond. This may occur because all the Slaves are trying to upgrade or downgrade at once. Wait a little bit and eventually all the Slaves should come back online.

2.7 Relocating the Database or Repository

2.7.1 Overview

There may come a time where you have to move the Database or Repository (or both) to another location or another machine. This guide will walk you through the steps required.

2.7.2 Migrating the Database

These are the steps to move your Database to a new location:

1. Shut down all the Slave applications running on your render nodes. You don't want them making changes during the move.
2. Stop the mongod process on the Database machine.
3. Copy the Database folder from the original location to the new one.
4. Update the config.conf file in the data folder to point to the new system log folder and storage folder locations.
5. Start the mongod process on the Database machine.
6. Modify the dbConnect.xml file in the settings folder in the Repository to set the new database host name or IP address (if you moved it to another machine).
7. Start up the Slaves and ensure that they can connect to the new Database.

Here is an example of how you would update the config.conf file if the new database location was C:\NEW_DATABASE_FOLDER:

systemLog:
    destination: file
    path: C:/NEW_DATABASE_FOLDER/data/logs/log.txt
    quiet: true

storage:
    dbPath: C:/NEW_DATABASE_FOLDER/data

Because the Clients use the dbConnect.xml file in the Repository to determine the database connection settings, you don’t have to reconfigure the Clients to find the new database.

2.7.3 Migrating the Repository

These are the steps to move your Repository to a new location:

1. Ensure that the share for the new location already exists. Also ensure that the proper permissions have been set.
2. Shut down all the Slave applications running on your render nodes. You don't want them making changes during the move.
3. Copy the Repository folder from the original location to the new location (see the copy example below).
4. Redirect all your Client machines to point to the new Repository location.
5. Start up the Slaves and ensure that they can connect to the new Repository location.
6. Delete the original Repository (optional).

As an alternative to step (4), you can configure your share name (if the new Repository is on the same machine) or your DNS settings (if the new Repository is on a different machine) so that the new Repository location has the same path as the original. This saves you the hassle of having to reconfigure all of your Client machines.
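For step 3, any copy tool that preserves the folder structure will do. As a sketch (server names and paths are placeholders), on Linux you might use rsync, and on Windows robocopy:

rsync -av /mnt/old_repository/ /mnt/new_repository/

robocopy \\oldserver\DeadlineRepository \\newserver\DeadlineRepository /E /COPYALL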

CHAPTER THREE

GETTING STARTED

3.1 Submitting Jobs

3.1.1 Overview

The easiest and most common way to submit render jobs to Deadline is via our many submission scripts, which are written for each rendering application it supports. After you have submitted your job, you can monitor its progress using the Monitor. See the Monitoring Jobs documentation for more information. If you would like more control over the submission process, or would like to submit arbitrary command line jobs to Deadline, see the Manual Job Submission documentation for more information.

3.1.2 Integrated Submission Scripts

Where possible, we have created integrated submission scripts that allow you to submit jobs directly from the application you're working with. These scripts are convenient because you don't have to launch a separate application to submit the job. In addition, these scripts often provide more submission options because they have direct access to the scene or project file you are submitting. See the Plug-ins documentation for more information on how to set up the integrated submission scripts (where applicable) and submit jobs for specific applications.




3.1.3 Monitor Submission Scripts

In cases where an application doesn't have an integrated submission script, you can submit the jobs from the Submit menu in the Monitor. Note that applications that have integrated submission scripts also have Monitor scripts here, but in most cases there are fewer options to choose from. This is because the integrated submission scripts use the application's native scripting language to pull additional information from the file being submitted. See the Plug-ins documentation for more information on how to submit jobs for specific applications. You can also create your own submission scripts for the Monitor. Check out the Monitor Scripting documentation for more details.



3.1.4 Common Job Submission Options

There are many job options that can be specified on submission. A lot of these options are general job properties that aren't specific to the application you're rendering with. Some of these options are described below. There are also many other options that are specific to the application that you're rendering with. These are covered in each application's plug-in guide, which can be found in the Plug-ins documentation.

Job Name


The name of your job. This is optional, and if left blank, it will default to "Untitled".

Comment

A simple description of your job. This is optional and can be left blank.

Department

The department you belong to. This is optional and can be left blank.

Pool and Group

The pool and group that the job belongs to. See the Job Scheduling documentation for more information on how these options affect job scheduling.

Priority

A job can have a numeric priority ranging from 0 to 100, where 0 is the lowest priority and 100 is the highest priority. See the Job Scheduling documentation for more information on how this option affects job scheduling.

Task Timeout and Auto Task Timeout

The number of minutes a slave has to render a task for this job before an error is reported and the task is requeued. Specify 0 for no limit. If the Auto Task Timeout is properly configured in the Repository Options, then enabling the Auto Task Timeout option will allow a task timeout to be automatically calculated based on the render times of previous frames for the job.

Concurrent Tasks and Limiting Tasks To A Slave's Task Limit

The number of tasks that can render concurrently on a single slave. This is useful if the rendering application only uses one thread to render and your slaves have multiple CPUs. Caution should be used when using this feature though if your renders require a large amount of RAM. If you limit the tasks to a slave's task limit, then by default, the slave won't dequeue more tasks than it has CPUs. This task limit can be overridden for individual slaves by an administrator. See the Slave Settings documentation for more information.

Machine Limit and Machine Whitelists/Blacklists

Use the Machine Limit to specify the maximum number of slaves that can render your job at one time. Specify 0 for no limit. You can also force the job to render on specific slaves by using a whitelist, or you can avoid specific slaves by using a blacklist. See the Limit Documentation for more information.

Limits

The limits that your job must adhere to. See the Limit Documentation for more information.

Dependencies

Specify existing jobs that this job will be dependent on. This job will not start until the specified dependencies finish rendering.

On Job Complete

If desired, you can automatically archive or delete the job when it completes.

Submit Job As Suspended

If enabled, the job will submit in the suspended state. This is useful if you don't want the job to start rendering right away. Just resume it from the Monitor when you want it to render.

Frame List

The list of frames to render. See the Frame List Formatting Options below for valid frame lists.

Frames Per Task


Also known as Chunk Size. This is the number of frames that will be rendered at a time for each job task. Increasing the Frames Per Task can help alleviate some of the inherent overhead that comes with network rendering, but if your frames take longer than a couple of minutes to render, it is recommended that you leave the Frames Per Task at 1.

Submit Scene/Project File With Job

If this option is enabled, the scene or project file you want to render will be submitted with the job, and then copied locally to the slave machine during rendering. The benefit to this is that you have a copy of the file in the state that it was in when it was submitted. However, if your scene or project file uses relative asset paths, enabling this option can cause the render to fail when the asset paths can't be resolved. If this option is disabled, the file needs to be in a shared location so that the slave machines can find it when they go to render it directly. Leaving this option disabled is required if the file has references (footage, textures, caches, etc) that exist in a relative location. Note though that if you modify the original file, it will affect the render job.

3.1.5 Draft and Integration Submission Options

The majority of the submission scripts that ship with Deadline have Integration options to connect to Shotgun and ftrack, and/or use Draft to perform post-rendering compositing operations. The Integration and Draft job options are essentially the same in every submission script, and more information can be found in their respective documentation:

• Draft Documentation
• Shotgun Documentation
• ftrack Documentation

3.1.6 Jigsaw

Jigsaw is a flexible multi-region rendering system for Deadline, and is available for 3ds Max, Maya, modo, and Rhino. It can be used to render regions of various sizes for a single frame, and in 3ds Max and Maya, it can be used to track and render specific objects over an animation. Draft can then be used to automatically assemble the regions into the final frame or frames. It can also be used to automatically composite re-rendered regions onto the original frame. Jigsaw is built into the 3ds Max, Maya, modo, and Rhino submitters, and with the exception of 3ds Max, the Jigsaw viewport will be displayed in a separate window.


The viewport can be used to create and manipulate regions, which will then be submitted to Deadline to render. The available options are listed below.
General Options
These options are always available:
• Add Region: Adds a new region.
• Delete All: Deletes all the current regions.
• Create From Grid: Creates a grid of regions to cover the full viewport. The X value controls the number of columns and the Y value controls the number of rows.
• Fill Regions: Automatically creates new regions to fill the parts of the viewport that are not currently covered by a region.
• Clean Regions: Deletes any regions that are fully contained within another region.
• Undo: Undo the last change made to the regions.
• Redo: Redo the last change that was previously undone.
Selected Regions Options
These options are only available when one or more regions are selected:
• Delete: Deletes the selected regions.
• Split: Splits the selected regions into sub-regions based on the Tiles In X and Tiles In Y settings.
These options are only available when a single region is selected:
• Clone: Creates a duplicate region parallel to the selected region in the specified direction.


• Lock Position: If enabled, the region will be locked to its current position.
• Enable Region: If disabled, the region will be ignored when submitting the job.
• X Position: The horizontal position of the selected region, taken from the left.
• Y Position: The vertical position of the selected region, taken from the top.
• Width: The width of the selected region.
• Height: The height of the selected region.
These options are only available when multiple regions are selected:
• Merge: Combines the selected regions into a single region that covers the full area of the selected regions.
Maya Options
These options are currently only available for Maya:
• Reset Background: Gets the current viewport image from Maya.
• Fit Selection: Creates regions surrounding the selected items in the Maya scene.
• Mode: The type of regions to be used when fitting the selected items. The options are Tight (fitting the minimum 2D bounding box of the points) and Loose (fitting the minimum 2D bounding box of the bounding box of the object).
• Padding: The amount of padding to add when fitting the selection (this is a percentage value that is added in each direction).
• Save Regions: Saves the region information directly into the Maya scene.
• Load Regions: Loads the saved region information from the Maya scene.

3.1.7 Frame List Formatting Options

During job submission, you usually have the option to specify the frame list you want to render, which often involves manually typing the frame list into a text box. In this case, you can make use of the following frame list formatting options.
Specifying Individual Frames or a Sequence
You can specify a single frame just by typing in the frame number:

5

You can specify individual frames by separating each frame with a comma or a space:

5,10,15,20
5 10 15 20

You can specify a frame range by separating the start and end frame with a dash:

1-100

Specifying a Sequence with a Step Frame
You can specify a step frame for a sequence using x, step, by, or every:


1-100x5
1-100step5
1-100by5
1-100every5

Each of these examples will render every 5th frame between 1 and 100 (1, 6, 11, 16, etc).
Advanced Frame Lists
Individual frames for the same job are never repeated when creating tasks for a job, which allows you to get creative with your frame lists without worrying about rendering the same frame more than once. To render frames 5, 18, and then from 28 to 100, you can specify one of the following:

5,18,28-100
5 18 28-100

To render every 5th frame between 1 to 100, then fill in the rest, you can specify one of the following:

1-100x5,1-100
1-100x5 1-100

To render every 10th frame between 1 to 100, then every 5th frame, then every 2nd frame, then fill in the rest, you can specify one of the following:

1-100x10,1-100x5,1-100x2,1-100
1-100x10 1-100x5 1-100x2 1-100
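For reference, the following Python sketch shows one way the formatting rules above could be interpreted. It is an illustration only, not Deadline's actual parser: it accepts single frames, comma or space separated lists, dash ranges, the x/step/by/every step keywords, and drops duplicate frames while preserving order.

import re

# Illustrative frame list parser for the syntax described above (not Deadline code).
STEP_PATTERN = re.compile(r'^(-?\d+)-(-?\d+)(?:(?:x|step|by|every)(\d+))?$')

def parse_frame_list(frame_list):
    frames = []
    seen = set()
    for token in re.split(r'[,\s]+', frame_list.strip()):
        if not token:
            continue
        match = STEP_PATTERN.match(token)
        if match:
            start, end = int(match.group(1)), int(match.group(2))
            step = int(match.group(3) or 1)
            new_frames = range(start, end + 1, step)
        else:
            new_frames = [int(token)]
        for frame in new_frames:
            if frame not in seen:   # frames are never repeated
                seen.add(frame)
                frames.append(frame)
    return frames

# "1-100x5,1-100" renders every 5th frame first, then fills in the rest.
print(parse_frame_list("1-100x5,1-100")[:8])   # [1, 6, 11, 16, 21, 26, 31, 36]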

3.2 Monitoring Jobs

3.2.1 Overview

The Monitor application lets you monitor and control your jobs after they have been submitted to the farm. This documentation only covers some of the basics regarding the Monitor application. For more in-depth information, see the Monitor documentation.


If you’re launching the Monitor for the first time on your machine, you will be prompted with a Login dialog. Simply choose your user name or create a new one before continuing. Once the Monitor is running, you’ll see your user name in the bottom right corner. If this is the wrong user, you can log in as another user by selecting File -> Change User. Note that if your administrator set up Deadline to lock the user to the system’s login account, you will have to log off of your system and log back in as the correct user.

3.2.2 Finding Your Jobs

Information in the Monitor is broken up into different panels. When monitoring your jobs, you typically want to use the following panels:
• Job Panel: This panel shows all the jobs in the farm.
• Task Panel: When a job is selected, this will show all the tasks for the job.
• Job Reports Panel: When a job is selected, this will show all reports (logs and errors) for the job.
These panels, and others, can be created from the View menu, or from the main toolbar. They can be re-sized, docked, or floated as desired. This allows for a highly customized viewing experience which is adaptable to the needs of different users. See the Panel Features documentation for instructions on how to create new panels in the Monitor.


The easiest way to find your jobs is to enable Ego-Centric Sorting in the job panel’s drop down menu, which can be found in the upper-right corner of the panel. This keeps all of your jobs at the top of the job list, regardless of which column the job list is sorted on. Then sort on the Submit Date/Time column to show your jobs in the order they were submitted.


3.2.3 Filtering the Job List

Another way to find the jobs you are interested in is to use the filtering options in the job panel. The Quick Filter option in the job panel’s drop down menu will open a side panel that allows you to filter out jobs based on status, user, pool, group, and plugin.

For more advanced filtering, use the Edit Filter option in the drop down menu to filter on any column in the job list. If you would like to save a filter for later use, use the Pinned Filters option in the drop down menu to pin your filter. You will then be able to select it later from the Pinned Filters sub menu.



Finally, you can use the search box above the job list to filter your results even further.

3.2.4 Job Batches

Jobs that share the same Batch Name property will be grouped together in the job list. The batch a job belongs to can be configured in the Job Properties.


If you prefer to not have the jobs grouped together in the job list, you can disable the Group Jobs By Batch Name option in the Monitor and User Settings.

3.2.5 Controlling Your Jobs

If you need to pause your job, you can right-click on the job in the job list and select Suspend Job. When you are ready to let the job continue, simply right-click on the job again and select Resume Job. See the Job States documentation for more information.

To modify the properties of your job, you can double-click on the job, or right-click on it and select Modify Properties. Here you can change scheduling options such as priority and pool, as well as other general properties like the job name. If you wish to limit which render nodes your job runs on, as well as the number of nodes that can render it concurrently, you can do so on the Machine Limit page. Depending on the application you're rendering with, you may see an extra page at the bottom of the properties list (with the name of the plug-in) that allows you to modify properties which are specific to that application. More information on job properties can be found in the Job Properties documentation.

3.2.6 Why Is My Job Not Rendering?

If a slave isn’t rendering a job that you think it should be, you can use the Job Candidate Filter option in the panel’s drop down menu to try and figure out why. See the Job Candidate Filter section in the Slave Configuration documentation for more information. The job could also be producing errors when rendering. See the following section below about handling job errors.

3.2.7 Handling Job Errors

If your job starts producing errors, you'll notice that your job will change from green to brown, then eventually to red (depending on the number of errors). These error reports can be viewed in the Job Reports panel, which can be opened from the View menu, or from the job's right-click menu. Here you will find all the reports generated for a job, including the error reports, which will be red. You can filter and sort the reports to help find what you are looking for. Often, the error reports will clearly show what the cause of the error is, allowing you to take the appropriate steps to resolve the problem. If you're ever unsure of what an error means, feel free to email the error report to Deadline Support and we'll try to help. See the Job Reports and History documentation for more information.


3.2.8 Completed Jobs

When your job is complete, you can view the output images by right-clicking on the individual tasks in the task list and selecting the output filename. This will open the image in the application that is set to open that type of file by default. Note that this option isn’t always available for some applications. In most cases though, you can view the output image folder by right-clicking on the job and selecting Explore Output. See the Job Output documentation for more information. You can also view the logs for the job in the Job Reports panel, which can be opened from the View menu, or from the job’s right-click menu. Finally, once you are happy with the results and no longer need the job, you can delete it by right-clicking on the job and selecting Delete Job.


3.2.9 Re-rendering Jobs

If you have a completed job that you need to re-render, you can do so by right-clicking on the job and selecting Requeue Job. If you only need to re-render a few bad frames, you can just requeue their corresponding tasks by right-clicking on one or more tasks in the task list and selecting Requeue Tasks. In some cases, the Monitor can try to detect bad frames for you. You can use this feature by right-clicking on the job and selecting Scan For Missing Output. The scan will check for missing frames or frames that don't meet a size threshold. You will then have the option to requeue all the corresponding tasks automatically. Note that the Scan For Missing Output option isn't available for all jobs. See the Job Output documentation for more information.

3.3 Controlling Jobs

3.3.1 Overview

The Jobs panel allows jobs to be controlled and modified using the right-click menu. In addition, the Task panel allows specific tasks to be controlled using the right-click menu. Note that the availability of these options can vary depending on the context in which they are used, as well as the User Group Permissions that are defined for the current user.


If the Job or Task panels are not visible, see the Panel Features documentation for instructions on how to create new panels in the Monitor.

3.3.2 Job States

The state of jobs can be changed using the Job panel’s right-click menu. In addition, the states of specific tasks can be changed using the Task panel’s right-click menu. Note that it is possible to modify the states of multiple jobs or tasks at the same time, providing the selected jobs or tasks are all in the same state.


When suspending a job, a confirmation message will appear that gives you the option to suspend the tasks for the job that are currently rendering. If you disable this option, any tasks that are currently rendering will be allowed to complete.

These are the states that a job can be in. They are color coded to make it clear which state the job is in.
• Queued (white): No tasks for the job are currently being rendered.
• Rendering (green): At least one task for the job is being rendered.
• Completed (blue): All tasks for the job have finished rendering.
• Suspended (gray): The job will not be rendered until it is resumed.
• Pending (orange): The job is waiting on dependencies to finish, or is scheduled to start at a later time.
• Failed (red): The job has failed due to errors. It must be resumed before it can be rendered again.
You may notice Queued or Rendering jobs turn slightly red or brown as they sit in the farm. This is an indication that the job is reporting errors. See the Job Reports section further down for more information.
The Job panel's right-click menu also gives the option to delete or archive jobs. Both options will remove the jobs from the farm, but archived jobs can be imported again for later use. You can import archived jobs from the File menu in the Monitor. See the Job Archiving documentation for more information.

3.3.3 Resubmitting Jobs

If you want to render a specific job again, but you don't want to lose the statistics for the original job, you can resubmit it from the Job panel's right-click menu. This will bring up a window allowing you to adjust the frame list and frames per task if you want to. All other job properties will remain identical.


Note that you can resubmit it as a normal job or a maintenance job. Maintenance jobs are special jobs where each task for the job will render the same frame(s) on a different machine in your farm. This is useful for performing benchmark tests on your machines. When a maintenance job is submitted, a task will automatically be created for each slave, and once a slave has finished a task, it will no longer pick up the job. It’s even possible to resubmit specific tasks as a new job, which can be done from the Task panel’s right-click menu. Note though that a Maintenance job can only be resubmitted from the Job panel.

3.3.4 Job Properties

To modify job properties, select the Modify Job Properties option from the Job panel's right-click menu. Double-clicking on a job will also bring up the Job Properties window. There are many pages of properties you can modify, which are covered below. Note that it is possible to modify the properties of multiple jobs at the same time.

General

These are the most common job properties, and most of these were specified when the job was originally submitted.

The properties are as follows:
• Job ID: The internal ID of the job.
• Job Name: The name of the job.
• Comment: The comment for the job.
• Department: The department the job was submitted from.


• Job Batch: The batch the job belongs to.
• User: The user who submitted the job.
• Pool: The pool that the job belongs to.
• Secondary Pool: If enabled, the job can fall back to the secondary pool if there are machines available in that pool.
• Group: The group that the job belongs to.
• Priority: The priority of the job (0 = lowest, 100 = highest).
• Concurrent Tasks: The number of tasks a slave can dequeue at a time (1-16). Note that not all plug-ins support this feature, such as Digital Fusion.
• Limit Tasks To Slave's Task Limit: If checked, a slave will not dequeue more tasks than it is allowed to based on its settings.
• On Job Complete: When a job completes, you can auto-archive or auto-delete it. You can also choose to do nothing when the job completes.
• Re-synchronize Auxiliary Files Between Tasks: If checked, all job files will be synchronized by the Slave between tasks for this job. This can add significant network overhead, and should only be used if you are manually editing any of the files that were submitted with the job.
• Reload Plugin Between Tasks: If checked, the slave reloads all the plug-in files between tasks for the same job.
• Enforce Sequential Rendering: Sequential rendering forces a slave to render the tasks of a job in order. If an earlier task is ever requeued, the slave won't go back to that task until it has finished the remaining tasks in order.
• Job Is Interruptible: If enabled, tasks for this job can be interrupted during rendering by a job with a higher priority.
• Suppress Event Plugins: If enabled, this job will not trigger any event plugins while in the queue.

Timeouts

These properties affect how a job will time out. It is important to note that the Auto Task Timeout feature is based on the Auto Job Timeout Settings in the Repository Options. The timeout is based on the render times of the tasks that have already finished for this job, so this option should only be used if the frames for the job have consistent render times.


The properties are as follows:
• Minimum Task Render Time: The minimum amount of time a Slave has to render a task. If a task finishes faster, an error will be reported.
• Maximum Task Render Time: The maximum amount of time a Slave has to render a task. If a Maximum Start Job Time is set, the Maximum Task Render Time will not be applied to the Starting phase of a job.
• Maximum Start Job Time: The maximum amount of time a Slave has to start a job.
• On Task Timeout: You have the option to have the job report an error or notify you when a timeout is reached.
• Enable Timeouts For Pre/Post Job Scripts: If checked, then the timeouts for this job will also affect its pre/post job scripts, if any are defined.
• Enable Auto Task Timeout: Whether the job should automatically time out based on parameters specified in the Repository Options.
• Use Frame Timeouts: If enabled, timeouts will be calculated based on frames instead of by tasks. The timeouts entered for tasks will be used for each frame in that task.

Notifications

These properties allow you to notify user(s) when jobs complete. There are two list controls beside each other on this panel. The left list contains all the current users on your farm. The right list contains the names of the users who will receive notifications. You can move users from one list to the other using the arrow controls between the two lists.


The properties are as follows:
• Notification Email Addresses: A comma delimited list of the notification users' email addresses.
• Job Completion Notes: Notes to attach to the email sent when the job has completed.
• Override Notification Method: If checked, you can select whether or not an email is sent.

Machine Limit

A Machine Limit can be used to limit the number of slaves that can render one particular job. This is useful if you want to render several jobs simultaneously. The list you create can be a whitelist or a blacklist. A whitelist contains the slaves that are approved to render this job (only these approved machines will render it), while a blacklist contains slaves that will not render this job. To move a machine from one list to another, you can use the arrow buttons between the two lists, drag and drop the machine names you want, or simply double-click the machine name. You are also able to load and save your machine list from a file so you can use the same list across multiple jobs. The file stores one machine name per line.


You can modify the following options for the machine limit:
• Slaves that can render this job simultaneously: The number of slaves that can render this job at the same time.
• Return Limit Stub When Task Progress % Reaches: If enabled, you can have a slave release its limit stub when the current task it is rendering reaches the specified progress. Note that not all plug-ins report task progress, in which case the machine limit stub will not be released until the task finishes rendering.
• Whitelisted/Blacklisted Slaves: If slaves are on a blacklist, they will never try to render this job. If slaves are on a whitelist, only those slaves will try to render this job. Note that an empty blacklist and an empty whitelist are functionally equivalent, and have no impact on which machines the job renders on.
• Load Machine List: Opens a file dialog to load a list of slaves to be used in the white/blacklist. One machine name per line in the file (.txt).
• Save Machine List: Opens a file dialog to save the current white/blacklist. Each machine name will be written to a single line.
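Since the machine list file format is simply one machine name per line, it can also be generated or read outside the Monitor. A minimal Python sketch (the file name and slave names below are just examples):

# Write and read a machine list file: one slave name per line.
slaves = ["render-01", "render-02", "render-03"]   # example slave names

with open("machine_whitelist.txt", "w") as f:      # hypothetical file name
    f.write("\n".join(slaves) + "\n")

with open("machine_whitelist.txt") as f:
    loaded = [line.strip() for line in f if line.strip()]
print(loaded)   # ['render-01', 'render-02', 'render-03']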

Limits

Here you can add or remove the limits that will affect your job. Limits are used to ensure floating licenses are used correctly on your farm. To add a limit to your job, select the limit(s) you require from the limit list and press the right arrow between the Limit List and the Required Limits. You are also able to drag and drop your selected limits into or out of the required limits, or just double-click a limit to move it from one list to another.


Dependencies

Dependencies can be used to control when a job should start rendering. See the Job Dependency Options below for more information.


Failure Detection

Here you can set how your job handles errors and determine when to fail a job.


The properties are as follows:
• Override Job Error Limit: If checked, the job error limit will be set to the user-specified value.
• Override Task Error Limit: If checked, the task error limit will be changed to the user-specified value.
• Send Warning Notification For Job Errors: Whether or not to send a notification to the users specified in the Notification panel when a job error occurs.
• Ignore Bad Slave Error Limit: If checked, a bad slave error will not count towards job errors.
• Clear Bad Slave List: Determines whether or not the bad slave list should currently be cleared.

Cleanup

Here you can override if and how your job is automatically cleaned up when it completes.


The properties are as follows:
• Override Automatic Job Cleanup: If enabled, these cleanup settings will be used instead of the ones in the Repository Options.
• Cleanup Job After This Many Days: If enabled, this is the number of days to wait after this job has completed before cleaning it up.
• Cleanup Mode: Whether the cleanup should archive the job or delete it.

Scheduling

If you do not want your job to start right away, you can schedule it to start at a later time and date. If you have a job that needs to be run repeatedly, you can do that too by setting a date, time, and frequency at which to run your job.


Scheduling properties are as follows:
• Scheduling Mode: Determines how the job will be scheduled. Possible values are Disabled (won't be scheduled), One Time, or Repeated.
• Start Date and Time: The date and time at which to begin the job.
• Day Interval: The number of days between queues when the job mode is Repeated.
It should be noted that if the job is not put into the Pending state, the job will not wait for the scheduled time to begin rendering. This can be done by right-clicking the job and choosing ‘Mark as Pending’.

Scripts

You can attach custom Python scripts to your job which can be run before and after your job has rendered. You may also attach scripts to your job’s tasks which can be run before and after your job’s tasks render. For more information on creating custom job scripts, see the Job Scripting section of the documentation.


You may attach the following scripts, which will be executed at different times:
• Pre Job Script: Executed before a job is run.
• Post Job Script: Executed after a job has completed.
• Pre Task Script: Executed before a task is rendered.
• Post Task Script: Executed after a task has completed.
For more details on these script properties, see the Job Scripting section of the documentation.
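For reference, a minimal post-task script might look like the sketch below. This is a hypothetical example: the module-level __main__ entry point and the plugin object's methods used here are assumptions, so consult the Job Scripting section for the exact API in your version of Deadline.

# Hypothetical post-task script (entry point and methods are assumptions).
def __main__(*args):
    deadlinePlugin = args[0]                      # assumed: plugin object passed in
    job = deadlinePlugin.GetJob()                 # assumed accessor for the current job
    deadlinePlugin.LogInfo("Post-task script ran for job: %s" % job.JobName)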

Environment

When running a job, you are able to attach environment variables through the Environment tab. The environment variables are specified as key-value pairs and are set on the slave machine running the job. You are able to specify whether your job-specific environment variables will only be set while your job is rendering. All job-specific environment variables will be removed when the job has finished running. You are also able to set a custom plugin directory on this panel. This acts as an alternative directory to load your job's plugin from. It is useful while creating and testing custom job plugins, or when you need one or more jobs to specifically use a custom job plugin which is not stored in the Deadline Repository.


The Environment properties are as follows:
• Custom Plugin Directory: An alternative directory to load your job's plugin from.
• Environment Variables: A list of environment variables to set while running a job. Stored as a list of key-value pairs.
• Only Use Job Environment Variables When Rendering: Environment variables for your job will only be set when the job is in the rendering state. They will be removed when the job has finished rendering.
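Conceptually, the job's environment variables are merged into the slave's environment for the duration of the render. The generic Python sketch below only illustrates that idea; it is not how the Slave itself applies the variables, and the variable names and values are made up.

import os
import subprocess

# Example job environment variables (key-value pairs), purely illustrative.
job_environment = {"MY_PIPELINE_ROOT": "/mnt/pipeline", "MY_SHOW": "demo_show"}

# Merge them into the environment only for the rendering process, mirroring
# the "only set while rendering" behaviour described above.
render_env = dict(os.environ)
render_env.update(job_environment)

# printenv is available on Linux and Mac OS X; prints: demo_show
subprocess.call(["printenv", "MY_SHOW"], env=render_env)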

Extra Info

When a job is submitted, it can have extra information embedded in it. For example, if a studio has an in-house pipeline tool, they may want to embed information in the job that will be used to update the pipeline tool when the job finishes rendering.


The Extra Info 0-9 properties can be renamed from the Jobs section of the Repository Options, and have corresponding columns in the Job list that can be sorted on. The additional key/value pairs in the list at the bottom do not have corresponding columns, and can be used to contain internal data that doesn’t need to be displayed in the job list.

Submission Params

Here you can view and export the job info and plugin info parameters that were specified when the job was submitted. The exported files can be passed to the Command application to manually re-submit the job. See the Manual Job Submission documentation for more information.


Plugin Specific Properties

The Plug-in specific properties vary between the different plug-ins, and some plug-ins may not have a Plug-in specific properties tab at all. Note that when modifying properties for multiple jobs at the same time, the Plug-in specific tab will only be available if all selected jobs use the same plug-in.


To get a description of specific plug-in properties, just hover your mouse cursor over them in the properties dialog and a tooltip will pop up with a description.

3.3.5 Job Dependency Options

Dependencies can be used to control when a job should start rendering. There are three types of dependencies available, and one or more can be specified for a job:
• Jobs: Job dependencies can be used to start a job when other jobs that it depends on are finished.
• Assets: Asset dependencies can be used to start a job when specific files exist on disk.
• Scripts: Script dependencies can be used to start a job based on whether a Python script returns True or False.
There are a few ways to set up dependencies in the Monitor, which are described below.

Job Properties

In the Job tab on the Dependencies page, you have the ability to set which jobs your job is dependent on. By default, the job will only resume when each of its dependencies have completed, but you can also have your job resume when the dependencies have failed, or have been deleted from the queue. Note that you can only set which jobs this job is dependent on, not which jobs are dependent on this job. You can also make the job frame dependent, which means that a frame from the job won’t begin rendering until the same frame from the other job(s) is complete. This is useful if you have a job that is dependent on the frames of another job, and you want the two jobs to render concurrently.


In the Asset tab, you can make this job dependent on asset files (textures, particle caches, etc). This job won’t be able to render on a slave unless it can access all the files listed here.


In the Script tab, you can make this job dependent on the results of the specified scripts.
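As an illustration, a script dependency is just a Python script whose return value tells Deadline whether the job can start. The sketch below is hypothetical: the entry-point name and arguments are assumptions, and the file path is made up, so check the Scripting section of the documentation for the exact signature before using it.

# Hypothetical script dependency: release the job only once a sentinel file exists.
import os

SENTINEL = "/mnt/projects/demo_show/caches/sim_done.txt"   # example path

def __main__(jobID, taskIDs=None):
    # Returning True releases the job (or the given tasks); False keeps it pending.
    return os.path.exists(SENTINEL)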


The following properties apply to all dependency types:
• Resume On Completed Dependencies: This job will resume when its dependencies complete.
• Resume On Failed Dependencies: This job will resume when its dependencies fail.
• Resume On Deleted Dependencies: This job will resume when its dependencies are deleted from the queue.
• Resume When Each Dependency is % Complete: This job will resume when each of the jobs this job is dependent on reaches a certain percentage of completion.
• Use Frame Dependencies: Specifies that this job is dependent on specific frames from its dependencies, and will release tasks for this job as appropriate.
• Frame Offset Start/End: Use these to offset the frames that this job is dependent on. They can also be used to make frames for this job dependent on multiple frames from other jobs.
You can also specify notes and set overrides for individual dependencies by clicking on them in the dependency list. Click the Overrides button to view the overrides panel.


Drag and Drop

In the Jobs panel, you can drag one or more jobs and drop them on another job. You will then be presented with some choices on how to set the dependencies.


Note that drag & drop dependencies will not work if you are holding down a modifier key (SHIFT, CTRL, etc). This is to help avoid accidental drag & drops when selecting multiple jobs in the list. If you would like to disable drag & drop dependencies, you can do so from the Monitor Options, which can be accessed from the main toolbar. Note that if you change this setting, you will have to restart the Monitor for the changes to take effect.

Dependency View

The Job Dependency View is used to visualize and modify your jobs and their dependencies. You can open the Job Dependency View panel from the View menu in the Monitor.


The view will show your currently selected job and all nodes that are linked to it by dependencies. The job node colors indicate the state of the job, while the asset nodes are yellow and the script nodes are purple. Jobs are dependent on everything that has a connection to the Square Socket on their left side. Connections can be made by dragging from the sockets on the nodes (square/circle) to the socket/main body of the other node. Connections can be broken by either dragging the connection off of the node or by selecting the connection and pressing the delete key. Note that changes made in the dependency view do not take effect until saved. If you have made changes and go to close the dependency view, you will be notified that you have unsaved changes.
Additional job nodes can be added to the view by dragging them in from the job list (after locking the view first), or through the right-click menu. Asset and script nodes can also be added by dragging the file in from your explorer/finder window, or through the right-click menu as well.
Dependencies can be tested by pressing the Test Dependency button in the toolbar. The results are represented by the following colors:
• Green: The dependency test has passed.
• Red: The dependency test has failed.
• Yellow: The job is frame dependent, and the dependency test for some of the frames has passed.


All the available dependency view options can be found across the toolbar at the top of the view, and/or from the view’s right click menu.

Options in the toolbar and right-click menu:
• Lock View: When enabled, the view will no longer show the currently selected job and will display the last job selected before locking. This is necessary before additional jobs can be dragged from the job list into the dependency view.
• Reload View: This redraws the dependency view for the selected job. If changes have been made, you will be prompted to save your changes.


• Save View: Saves the changes made to the dependency view for the selected job.
• Selection Style: If off, all nodes and connections touched by the selection area will be selected. If on, only nodes and connections that are fully contained by the selection area will be selected.
• Minimap: Controls if the minimap is visible and, if so, in which corner.
• Elide Titles: Controls whether or not the titles of nodes should be elided and, if so, in which direction.
• Zoom All: Zooms the view to the point where the entire view (area that has been used) is visible.
• Zoom Extents: Zooms the view to the point where all nodes currently in the view are visible.
Options in the toolbar only:
• Modify Job Details: This allows you to set which properties are visible in the nodes.
• Test Dependencies: This allows you to test your dependencies.
• Zoom Level: The current zoom level.
Options in the right-click menu only:
• Job Menu: If one or more jobs are selected, you can use the same job menu that is available in the job list.
• Add Job: Choose a job to add to the dependency view.
• Add Asset: Choose an asset file to add to the dependency view.
• Add Script: Choose a script file to add to the dependency view.
• Expand/Collapse: Expand or collapse the details in all nodes.

3.3.6 Job Frame Range

To modify the frame range, select the Modify Frame Range option from the Job panel’s right-click menu. Note that modifying these settings will stop and requeue all tasks that are currently rendering.

3.3.7 Job Reports and History

All reports for a job can be viewed in the Job Reports panel. This panel can be opened from the View menu or from the main toolbar in the Monitor. It can also be opened from the Job and Task panel’s right-click menu.


The following reports can be viewed from the Job Reports panel:
• Render Logs: These are the reports from tasks that rendered successfully.
• Render Errors: These are the reports from tasks that failed to render.
• Event Logs: These are the reports from Events that were handled successfully.
• Event Errors: These are the reports from Events that raised errors.
• Requeues: These are reports explaining why tasks were requeued.
You can use the Job Reports panel's right-click menu to save reports as files to send to Deadline Support. You can also delete reports from this menu. Finally, if a particular Slave is reporting lots of errors, you can blacklist it from this menu (or remove it from the job's whitelist).
In addition to viewing job reports, you can also view the job's history. The History window can be brought up from the Job panel's right-click menu by selecting the Job History option.


3.3.8 Job Output

Many jobs have the option to explore and view the job's output directly from the Job or Task panel's right-click menu. If the options to explore and view the output are available for the job, there will also be the option to copy the output path to the clipboard. This is helpful if you need to paste the path into another application. Note that the availability of these options is based on how much information about the job's output could be determined at the time the job was submitted. In some cases, the submitter can't determine where all or some of the job's output will be saved to, so these options won't be available.


When viewing the output for a job, the Monitor will typically open the image file in the default application on the machine. You can configure the Monitor to use specific image viewer applications in the Monitor Options, which can be accessed from the main toolbar.


Finally, some jobs will support the ability to scan completed tasks for a job to see if any output is missing or below an expected file size. The Scan For Missing Output window can be opened by right-clicking on a job and selecting Job Output -> Scan For Missing Output. If any missing output is detected, or the output file is smaller than the Minimum File Size specified, you are given the option to requeue those tasks (simply place a check mark beside the tasks to requeue).


3.3.9 Job Auxiliary Files

Many jobs have additional files submitted with them, such as the scene file being rendered. These files are copied to the server and are then copied to the Slaves when they render the jobs. If a job has auxiliary files submitted with it, you can explore these files from the Job panel’s right-click menu. There will also be the option to copy the auxiliary path to the clipboard, which is helpful if you need to paste the path into another application.


3.4 Archiving Jobs

3.4.1 Overview

Deadline allows you to archive jobs, which is useful if you want to keep a backup of every job you’ve rendered, or if you want to remove a job from one farm and place it in another. It can also be used to give a problematic job to Deadline Support for testing purposes. Jobs can be archived automatically or manually. When a job is archived, its job and task information are exported as JSON to separate text files. These files are placed in a zip file with any auxiliary files that were submitted with the job, and any reports the job currently has. The name of the zip file will contain the job’s user, plugin, name, and ID (to guarantee uniqueness). It will have the following format:

USER__PLUGIN__JOBNAME__JOBID.zip

Typically, this zip file is placed in the jobsArchived folder in the Repository. However, when manually archiving a job, you have the option to choose an alternative archive location.
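For example, the archive file name for a job could be reconstructed from its properties like this (a sketch only, with made-up values):

# Illustrative only: building the archive name using the
# USER__PLUGIN__JOBNAME__JOBID.zip format described above.
user, plugin, job_name, job_id = "jsmith", "MayaBatch", "shot010_lighting", "54f7a1b2c3d4e5f6a7b8c9d0"
archive_name = "__".join([user, plugin, job_name, job_id]) + ".zip"
print(archive_name)   # jsmith__MayaBatch__shot010_lighting__54f7a1b2c3d4e5f6a7b8c9d0.zip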

3.4.2 Manual Job Archiving

Users can manually archive a job by right-clicking on it in the job list in the Monitor and selecting Archive Job. This will bring up the following window:


By default, it will save the archive to the jobsArchived folder in the Repository. However, you can choose a different folder to archive the job to. You can also choose whether or not to delete the job from the database after archiving it. One case where you might not want to delete it is if you are archiving a job to send to Deadline Support for testing purposes. If the Job panel is not visible, see the Panel Features documentation for instructions on how to create new panels in the Monitor.

3.4.3 Automatic Job Archiving

When submitting a job, users can set the On Job Complete setting to Archive. When the job finishes, it will automatically be archived to the jobsArchived folder in the Repository.


Administrators can also configure Deadline to automatically archive all jobs after they have finished rendering and place them in the jobsArchived folder in the Repository. This can be done in the Job Settings section of the Repository Options.


3.4.4 Importing Archived Jobs

To import an archived job, simply select File -> Import Archived Jobs in the Monitor and choose one or more zip files containing archived jobs.


3.5 Monitor and User Settings

3.5.1 Overview

You can customize your Monitor options, User settings, and Styles in the Monitor Options. On Windows and Linux, select Tools -> Options, and on Mac OS X, select DeadlineMonitor -> Preferences. You can also open these settings from the main toolbar in the Monitor.

3.5.2 Monitor Options

The Monitor options allow you to customize a few aspects of the Monitor.


Image Viewer
• Custom Image Viewer Executables: Specify up to three custom image viewer applications that the Job and Task panels use to view output images. See the Controlling Jobs documentation for more information on viewing job output.
• Preferred Image Viewer: Choose the default image viewer to use when viewing output files. If set to DefaultViewer, the system's default application for the output file type will be used.
Job List


• Enable Drag & Drop Dependencies: If enabled, you can drag jobs and drop them on other jobs to set dependencies. Note that you must restart the Monitor for this setting to take effect. See the Controlling Jobs documentation for more information on setting dependencies this way.
• Show Task States In Job Progress Bar: If enabled, the job progress bars will show the states of all the tasks for the job.
• Group Jobs By Batch Name: If enabled, jobs that have the same Batch Name will be grouped together in the job list. Note that you must restart the Monitor for this setting to take effect.
• Change Color Of Jobs That Accumulate Errors: If enabled, jobs will change color from the Rendering color to the Failed color as they accumulate errors. See the Styles section further down for more on the colors.
Task List
• Task Double-click Behavior: Customize the double-click behavior of rendering, completed, and failed tasks in the task list. Double-clicking on tasks in other states will bring up the task reports panel. These are the available options:
  – View Reports: This will bring up the task reports panel for the selected task.
  – Connect To Slave Log: This will connect to the Slave that is rendering or has rendered the selected task.
  – View Image: This will open the output image for the selected task in the default viewer.
• Change Color Of Tasks That Accumulate Errors: If enabled, tasks will change color from the Rendering color to the Failed color as they accumulate errors. See the Styles section further down for more on the colors.
Miscellaneous
• Start In Super User Mode: If enabled, the Monitor will start with Super User mode enabled. If Super User mode is password protected, you will be prompted for the password when you start the Monitor.
• Stream Job Logs from Pulse: If enabled, the Monitor will stream the job logs from Pulse instead of reading them directly from the Repository. While streaming the logs this way is typically slower, it can be useful if the connection to the Repository server is slow.
• Show House Cleaning Updates In Status Bar: If enabled, the Monitor status bar will show when the last House Cleaning was performed.
• Show Repository Repair Updates In Status Bar: If enabled, the Monitor status bar will show when the last Repository Repair was performed.
• Show Pending Job Scan Updates In Status Bar: If enabled, the Monitor status bar will show when the last Pending Job Scan was performed.
• Enable Slave Pinging: If enabled, the Slave List will show if slave machines can be pinged or not.

3.5.3 User Settings

You can configure your user settings here.


Notification Settings
If you would like to receive email notifications for your job, you can specify your email address in the Notification Settings and enable the option to receive them. Note that this requires your administrator to configure the email settings in the Repository Options. If you would like to receive popup message notifications for your job, you can specify your machine name in the Notification Settings and enable the option to receive them. Note that this requires the Launcher to be running on the machine that you specify here.
Render Job As User Settings
If the Render Job As User option is enabled in the job settings in the Repository Options, these options will be used to launch the rendering process as the specified user. For Linux and OSX, only the User Name is required. For Windows, the Domain and Password must be provided for authentication. See the Render Jobs As Job's User documentation for more information.
Web Service Authentication Settings
You can also specify a Web Service password, which is typically used for the Mobile application. A password is required to authenticate with the Pulse Web Service if authentication has been enabled and empty passwords are not allowed.


Region
A user's region is used for cross platform rendering. All the paths a user sees in the Monitor will be replaced based on the path mappings for their region (for example, when viewing the output of a completed job). See Region Settings and Regions for more information.

3.5.4 Styles

The Styles panel can be used to customize the color palette and the fonts that the Deadline Applications use. Custom styles can be saved and imported as well.

By default, the current style will be Default Style, which is the style shipped with Deadline and cannot be modified in any way. Previously saved styles will be available in the Saved Styles list. Custom styles can be created and deleted by clicking the Create New Style and Delete Style buttons, respectively.
Once a custom style has been selected, the style's color palette can be modified:
• The General Palette color is used to generate the colors for the various controls and text in the Deadline applications. Note that dark palettes will result in light text, and light palettes will result in dark text.
• The Selection color is used to highlight selected items or text.


• The remaining colors are used to color the text for jobs, tasks, slaves, etc, based on their current state. It is recommended to choose colors that contrast well with the General Palette and Selection colors to ensure the text is readable.
The style's font can be modified as well:
• Primary Font: This is the font used for almost all the text in the Deadline applications.
• Console Font: This is the font used in console and log windows. By default, a monospace font is used for these windows.
Any style changes made are not saved until the Monitor Options dialog is accepted by clicking OK. Once the dialog has been accepted, the Monitor must be restarted in order to apply the style changes. In order to facilitate testing out new styles, there is a Preview Style button which opens a dialog that displays an approximation of the current style settings.

Note that the Deadline applications will always load with the style that was last selected in the Styles panel in the Monitor Options.


Styles may also be saved and loaded using the View menu in the Monitor. Note that when saving styles, all of the custom styles are saved, and when loading saved styles from disk the loaded styles will be appended to the list of styles currently present, overwriting any styles with a shared name.

3.6 Local Slave Controls

3.6.1 Overview

The Local Slave Controls allow you to control the slave on your machine, as well as configure Idle Detection and the Job Dequeuing Mode. You can access the Local Slave Controls from the Launcher's menu, or from the Tools menu in the Monitor.


Note that it is possible for Administrators to disable the Local Slave Controls. If that’s the case, you will see this message when trying to open them.


3.6.2 Slave Controls

This section allows you to view the state of the slave running on your machine. Also, if the slave is rendering, you can see which job it is currently rendering in the list. Finally, you can control the slave on your machine by right-clicking on it in the list.

More information about the available controls can be found in the Remote Control documentation.

3.6.3 Override Idle Detection

This section overrides the global Slave Scheduling settings for your machine (if there are any). It can be used to start the slave when your machine becomes idle (based on keyboard and mouse activity), and stop the slave when the machine is in use again. Note that Idle Detection is managed by the Launcher, so it must be running for this feature to work. Note that on Linux, the Launcher uses X11 to determine if there has been any mouse or keyboard activity. If X11 is not available, Idle Detection will not work.


The available Idle Detection settings are as follows:
• Start Slave When Machine Is Idle For: This option enables Idle Detection, and you can specify the number of minutes without keyboard, mouse, or tablet activity before the slave should start.
• Only Start Slave If CPU Usage Less Than: If enabled, the slave will only start if the machine's CPU usage is less than the given value.
• Only Start Slave If Free Memory More Than: If enabled, the slave will only start if the machine has this much free memory available.
• Only Start Slave If These Processes Are Not Running: If enabled, the slave will not start if any of the listed processes are running.
• Only Start Slave If Launcher Is Not Running As These Users: If enabled, the slave will not start if the Launcher process is running as any of the listed users.
• Stop Slave When Machine Is No Longer Idle: If enabled, the slave will automatically stop when there is keyboard, mouse, or tablet activity again.
• Allow Slave To Finish Its Current Task When Stopping: If enabled, the slave will finish its current task before stopping when the machine is no longer idle. If disabled, the slave will requeue its current task before stopping so that another slave can render it.

3.6.4 Job Dequeuing Mode

This section can be used to control how your slave dequeues jobs.


The available dequeuing modes are:
• All Jobs: This is the default behavior. The slave will dequeue any job that it can work on.
• Only Jobs Submitted From This Slave's Machine: This option will only allow the slave to dequeue jobs submitted from the same machine. This is a useful way of ensuring that your slave will only render your jobs.
• Only Jobs Submitted From These Users: This option will only allow the slave to dequeue jobs submitted by the specified users. This is another way of ensuring that your slave will only render your jobs. However, it can also be used to make your slave render jobs from other specific users, which is useful if you're waiting on the results of those jobs.


CHAPTER FOUR

CLIENT APPLICATIONS

4.1 Launcher

4.1.1 Overview

The Launcher’s main use is to provide a means of remote communication between the Monitor and the Slave or Pulse applications, and therefore should always be left running on your render nodes and workstations. It can also detect if the Slave running on the machine has stalled, and restart it if it does.

Unless the Launcher is running as a service or daemon, you should see the icon in your system tray or notification area. You can right-click on the icon to access the Launcher menu, or double-click it to launch the Monitor.

4.1.2 Running The Launcher

To start the Launcher:
• On Windows, you can start the Launcher from the Start Menu under Thinkbox\Deadline.
• On Linux, you can start the Launcher from a terminal window by running the deadlinelauncher script in the bin folder.
• On Mac OS X, you can start the Launcher from Finder by running the DeadlineLauncher application in Applications/Thinkbox/Deadline.
The Launcher can also be started from a command prompt or terminal window. For more information, see the Launcher Command Line documentation.

4.1.3 Administration Features

Running the Launcher can help make some administrative tasks easier, which is why it’s recommended to keep it running at all times on your render nodes and workstations.

Automatic Updates

If you have enabled Automatic Upgrades under the Client Setup section of the Repository Options, whenever you launch the Monitor, Slave, or Pulse using the Launcher, it will check the Repository for updates and upgrade itself automatically if necessary before starting the selected application. Note that the upgrade will only trigger when launching applications through the Launcher. Also, if the Launcher is running as a service on Windows, launching the Monitor will not trigger an update.


Remote Administration

If you have enabled Remote Administration under the Client Setup section of the Repository Options, you will be able to control the Slave or Pulse applications remotely, and remotely execute arbitrary commands. Note that it may be a potential security risk to leave it running if you are connected to the internet and are not behind a firewall. In this case, you should leave Remote Administration disabled.

4.1.4 Launcher Menu Options

Right-click on the Launcher system tray icon to bring up the Launcher menu. The available options are listed below. Note that if the Launcher is running as a service or daemon, this menu is unavailable because the system tray icon will be hidden.

Launch Monitor
    Launches the Monitor application. If the Repository has been upgraded recently, and Automatic Updates is enabled, this will automatically upgrade the client machine.
Launch Slave(s)
    Launches the Slave application. If this machine has been configured to run more than one Slave instance, this will launch all of them. If the Repository has been upgraded recently, and Automatic Updates is enabled, this will automatically upgrade the client machine.
Launch Slave By Name
    Launch a specific Slave instance, or add/remove Slave instances from this machine (if enabled for the current user). Note that new Slave instances must have names that only contain alphanumeric characters, underscores, or hyphens. See the documentation on running Multiple Slaves On One Machine for more information.
Local Slave Controls
    Opens the Local Slave Controls window, which allows you to control and configure the Slave that runs on your machine.
Launch Slave at Startup


    If enabled, the Slave will launch when the Launcher starts up.
Restart Slave If It Stalls
    If enabled, the Launcher will try to restart the Slave on the machine if it stalls.
Scripts
    Allows you to run general scripts that you can create. Note that these are the same scripts that you can access from the Scripts menu in the Monitor. Check out the Monitor Scripts documentation for more information.
Submit
    Allows you to submit jobs for different rendering plug-ins. Note that these are the same submission scripts that you can access from the Submit menu in the Monitor. More information regarding the Monitor submission scripts for each plug-in can be found in the Plug-Ins section of the documentation. You can also add your own submission scripts to the submission menu. Check out the Monitor Scripts documentation for more information.
Change Repository
    Change the Repository that the client connects to.
Change User
    Change the current user on the client.
Change License Server
    Change the license server that the Slave connects to.
Explore Log Folder
    Opens the Deadline log folder on the machine.

4.1.5 Command Line Options

To run the Launcher from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X) and run the 'deadlinelauncher' application. To view all available command line arguments, you can run the following:

deadlinelauncher -help

Available Options

To start the Monitor with the Launcher, use the -monitor option. If another Launcher is already running, this will tell the existing Launcher to start the Monitor. If an upgrade is available, this will trigger an automatic upgrade:

deadlinelauncher -monitor

To start the Slave with the Launcher, use the -slave option. If another Launcher is already running, this will tell the existing Launcher to start the Slave. If an upgrade is available, this will trigger an automatic upgrade:

deadlinelauncher -slave

To start Pulse with the Launcher, use the -pulse option. If another Launcher is already running, this will tell the existing Launcher to start Pulse. If an upgrade is available, this will trigger an automatic upgrade:

deadlinelauncher -pulse

To start the Balancer with the Launcher, use the -balancer option. If another Launcher is already running, this will tell the existing Launcher to start the Balancer. If an upgrade is available, this will trigger an automatic upgrade:

deadlinelauncher -balancer

To trigger an automatic upgrade if one is available, use the -upgrade flag:

deadlinelauncher -upgrade

To run the Launcher without a user interface, use the -nogui option. Note that if the Launcher is running in this mode, any Slave or Pulse instances you launch through it will also run without a user interface:

deadlinelauncher -nogui
deadlinelauncher -nogui -slave

To shut down the Launcher if it's already running, use the -shutdown option:

deadlinelauncher -shutdown

To shut down the Slaves, Pulse, and Balancer on the machine before shutting down the Launcher, use the -shutdownall option:

deadlinelauncher -shutdownall

4.1.6 Launcher As A Service

When installing the Deadline Client on Windows, you can choose to install the Launcher as a service. If you want to configure the Launcher to run as a service after the Client has been installed, it is possible to set up the service manually, which is explained below. However, it’s probably easier to simply run the Client installer again and enable the service option during installation. There are also some considerations that need to be made when installing the Launcher as a service. See the Windows Service documentation for more information.

Manually Installing the Launcher Service

You can use Deadline Command along with the following commands to install or uninstall the Launcher service:

InstallLauncherService
Installs the Deadline Launcher Service, and optionally starts it.
[true/false] - Whether or not to start the Launcher Service after it has been installed (optional)

InstallLauncherServiceLogOn
Installs the Deadline Launcher Service with the given account, and optionally starts it.
[User Name] - The account user name
[Password] - The account password
[true/false] - Whether or not to start the Launcher Service after it has been installed (optional)

UninstallLauncherService
Stops and uninstalls the Deadline Launcher Service.

StartLauncherService
Starts the Deadline Launcher Service if it is not already running.

StopLauncherService
Stops the Deadline Launcher Service if it is running.

Here is an example command line to install the service:

deadlinecommand.exe -InstallLauncherServiceLogOn "USER" "PASSWORD"
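For example, to install the service without specifying a log-on account and start it right away, the following should work (using the optional true/false flag documented above):

deadlinecommand.exe -InstallLauncherService true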

Here is an example command line to uninstall the service:

deadlinecommand.exe -UninstallLauncherService

4.1.7 FAQ

Why should the Launcher application be left running on the client machines?
Its main purpose is to provide a means of remote communication between the Monitor and the Slave applications. If it's not running, the Slave will have to be stopped and started manually. In addition, whenever you launch the Monitor or Slave using the Launcher, it will check the Repository for updates and upgrade itself automatically if necessary before starting the selected application. If the Launcher is not running, updates will not be detected. Finally, the Launcher can detect if the Slave running on the machine has stalled, and restart it.

Can I run the Launcher without a user interface?
Yes, you can do this by passing the -nogui command line argument to the Launcher application:

deadlinelauncher -nogui

4.2 Monitor

4.2.1 Overview

The Monitor application offers detailed information and control options for each job and Slave in your farm. It provides normal users a means of monitoring and controlling their jobs, and it gives administrators options for configuring and controlling the entire render farm.


If you’re launching the Monitor for the first time on your machine, you will be prompted with a Login dialog. Simply choose your user name or create a new one before continuing. Once the Monitor is running, you’ll see your user name in the bottom right corner. If this is the wrong user, you can log in as another user by selecting File -> Change User. Note that if your administrator set up Deadline to lock the user to the system’s login account, you will have to log off of your system and log back in as the correct user.

4.2.2 Running the Monitor

To start the Monitor:

• On Windows, you can start the Monitor from the Start Menu under Thinkbox\Deadline, or from the Launcher's right-click menu.
• On Linux, you can start the Monitor from a terminal window by running the deadlinemonitor script in the bin folder, or from the Launcher's right-click menu.
• On Mac OS X, you can start the Monitor from Finder by running the DeadlineMonitor application in Applications/Thinkbox/Deadline, or from the Launcher's right-click menu.


The Monitor can also be started from a command prompt or terminal window. For more information, see the Monitor Command Line documentation.

4.2.3 Panel Features

Information in the Monitor is broken up into different panels, which are described further down. These panels have many features in common, which are explained here.

Customization

Monitor panels can be created from the View menu, or from the main toolbar. They can be re-sized, docked, or floated as desired. This allows for a highly customized viewing experience which is adaptable to the needs of different users.

The current layout can be pinned to the Pinned Layouts menu so that it can be restored at a later time. This can be done from the View menu, or from the main toolbar. The current layout can also be saved to a file from the View menu, and then loaded from that file later.


When you pin a layout, you can choose to save the location and size of the Monitor window by checking the "Save Location and Size" box when pinning the layout.

To prevent accidental modifications to the current layout, you can lock the layout from the View menu, by pressing "Alt-'", or from the main toolbar. When locked, panels cannot be moved, but they can still be docked and undocked. To dock a floating panel while the layout is locked, simply double-click on the panel's title. It will be docked to the same location it was originally undocked from.

The columns in Monitor panels are customizable. Columns can be resized by dragging the separator line between column headers, and reordered by dragging a column header to a new position. Right-clicking on the column headers in a panel allows you to toggle the visibility of each column.


In this menu you can modify the visibility and ordering of the columns by clicking the "Customize..." menu item. Moving columns to the left-side list hides them, and the order of the columns in the right-side list corresponds to the order they will appear in the panel (top-to-bottom corresponds to left-to-right). You can move columns between and within the lists by clicking the arrow buttons.


Once you have configured your column layout you can pin it.


You can also set the current list layout as the default layout to load when opening new panels of the same type by clicking "Save Current List Layout As Default". If you want to restore the original default list layout, click "Reset Default List Layout".

Data Filtering

Almost every panel has a search box that you can use to filter the information you’re interested in. You can simply type in the word(s) you are looking for, or use regular expressions for more advanced searching.
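For example, typing maya would show only rows containing that word, while a regular expression such as shot_0[1-5]_.* (a purely hypothetical naming pattern) would match entries for shots 01 through 05.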

In addition, every panel that has a search box also supports a more advanced filtering system. To add a filter to a panel, select the Edit Filter option in the panel's drop down menu, which can be found in the upper-right corner of the panel. A window will appear allowing you to specify the name of the filter being created. You can choose whether records must match all of the filters or any of the filters. If all must match, only records that satisfy every filter will be shown; if any can match, a record will be shown as long as it satisfies at least one filter.



Clicking the add filter button generates a new filter. The filter requires a column to be selected, an operation to perform, and a value to use in the operation. Filters can also be removed by clicking the minus button to the right of each filter. After all filters are entered, press OK to apply the filter to the current panel. A filter can be cloned and opened in a new tab within the panel through the Clone Filter option in the panel drop down menu. The Clear Filter option can be used to clear all filters from the current panel. Finally, you can pin the current filters so that they can be restored at a later time using the Pinned Filters sub menu in the panel drop down menu. Note that the Pin Current Filter option is only available if a filter is currently being applied. If there are no filters, the Pin Current Filter option will be hidden.

Automatic Sorting and Filtering

Almost every panel has an option to do automatic sorting and filtering when data changes in the panel. When this option is disabled, sorting and filters must be re-applied manually to ensure that the data is sorted and filtered properly. Note that automatic sorting and filtering can affect the Monitor's performance if there are lots of jobs (10,000+) or lots of slaves (1,000+) in the farm. To improve Monitor performance in this case, it is recommended to disable automatic sorting and filtering. There is an option in the Monitor Settings in the Repository Configuration to disable it by default.

Saving and Loading Panel Layouts

Every list-based panel (Jobs, Slaves, Tasks, etc) has an option to save and load the list layout, which you can find in the panel’s drop down menu. This allows you to save out a list’s filters, column order and visibility, etc, and load them again later or share them with another user.


Note that when loading a list layout, you must choose a layout that was saved from the same type of list. For example, you cannot save a layout from the Job list and then load it into the Slave list.

Graph Views

Almost every panel supports showing a graphical representation of the data. The graph can be shown by selecting the Graph View option in the panel’s drop down menu, which can be found in the upper-left corner of the panel. Note that some panels can have many types of graphs.

The graph view can be saved as an image file by right-clicking anywhere in the view and selecting Save Graph As Image.


If the graph is a pie chart, you can also filter the data from the graph view by holding down the SHIFT key and clicking on one of the pie slices. The data will be filtered to only show records that are represented by the pie slice that was clicked on.

Scripts

Almost every panel has the option to run custom scripts from the panel’s right-click menu. Many scripts are already shipped with Deadline, and additional custom scripts can be written. See the Monitor Scripts documentation for more information.


4.2.4 Information Panels

As mentioned earlier, information in the Monitor is broken up into different panels. These panels can be created from the View menu, or from the main toolbar. They can be re-sized, docked, or floated as desired. This allows for a highly customized viewing experience which is adaptable to the needs of different users.

Jobs

The Jobs panel contains a list that shows all jobs in the farm. It also displays useful information about each job such as its name, user, status, error count, plugin, etc. As jobs change states, their colors will change. Active jobs appear green, and remain green as they continue to render without errors. If a job starts to accumulate errors, it will turn brown and then eventually red. This allows you to see at a glance which jobs are having problems. For more information on job monitoring, see the Monitoring Jobs documentation.

The Jobs panel supports standard filtering, but it also has a Quick Filter option in the panel's drop down menu to make it easier to filter out unwanted jobs. By toggling the options within the Status, User, Pool, Group, and Plugin sections, you can quickly drill down to the jobs you are interested in. There is also an Ego-Centric Sorting option in the panel's drop down menu which can be used to keep all of your jobs at the top of the job list.


The Jobs panel also allows jobs to be controlled and modified using the right-click menu. You can also bring up the Job Properties window by double clicking on a job. See the Controlling Jobs documentation for more information.

Tasks

The Task panel shows all the tasks for the job that is currently selected. It displays useful information about each task such as its frame list, status, and if applicable, the Slave that is rendering it.


The Task panel also allows you to control tasks from the right-click menu. See the Controlling Jobs documentation for more information. In addition, the double-click behavior in the Task panel can be set in the Monitor and User Settings, which can be accessed from the main toolbar.

Job Details

The Job Details panel shows all available information about the job that is currently selected. The information is split up into different sections that can be expanded or collapsed as desired.


Job Dependency View

This panel allows you to view and modify a job’s dependency tree in a node-based view. You can lock the view to the currently selected job, which allows you to drag & drop other jobs into the view to hook up new dependencies. In addition, you can drag & drop Python scripts or asset files directly into the view and hook them up as dependencies. See the Controlling Jobs documentation for more information.


Job Report

All reports for a job can be viewed in the Job Reports panel. This includes error reports, logs, and task requeue reports. This panel can also be opened by right-clicking on a job in the Job List and selecting View Job Reports. More information can be found in the Controlling Jobs documentation.


Slaves

The Slave panel shows all the Slaves that are in your farm. It shows system information about each Slave, as well as information about the job the slave is currently rendering.

If you see a slave that is colored orange in the list, this means that the slave is unable to get a license or that the license is about to expire. When the slave cannot get a license, it could be because there is a network issue, the license has expired, or the license limit has been reached. If a slave isn't rendering a job that you think it should be, you can use the Job Candidate Filter option in the panel's drop down menu to try and figure out why. See the Job Candidate Filter section in the Slave Configuration documentation for more information.

The Slave panel's right-click menu allows you to modify Slave settings and control the Slaves remotely. See the Slave Configuration documentation for more information.

Slave Reports

All log and error reports for a Slave can be viewed in the Slave Reports panel. This panel can also be opened by right-clicking on a slave in the Slave List and selecting View Slave Reports.

Pulses

The Pulse panel shows which machine Pulse is running on, as well as previous machines that Pulse has run on. It also shows system information about each machine.


Balancers

The Balancer panel shows which machines the Balancer is running on. It also shows system information about each machine.

The Balancer panel’s right-click menu allows you to modify Balancer settings and control the Balancer remotely. See the Balancer Configuration documentation for more information.

Limits

The Limit panel shows all the Limits that are in your farm. You can access many options for the Limits by right-clicking on them. See the Limits and Machine Limits documentation for more information.

Console

The Console panel shows all lines of text that are written to the Monitor's log.


Remote Commands

The Remote Command panel shows all pending and completed remote commands that were sent from the Monitor. When sending a remote command, if this panel is not already displayed, it will be displayed automatically (assuming you have permissions to see the Remote Command panel). See the Remote Control documentation for more information.

Cloud

The Cloud panel shows all the instances from the cloud providers that the Monitor is connected to. This panel allows you to control and close your existing instances. See the Cloud Controls documentation for more information.


4.2.5 Monitor Menu Options

The available options are listed below. They are available in the Monitor’s main menu, and some are also available in the main toolbar. Note that the availability of these options can vary depending on the context in which they are used, as well as the User Group Permissions that are defined for the current user.

File Menu

• Change Repository: Connect to a different repository, or reconnect to the current repository if the Monitor becomes disconnected. There is also a toolbar button for this option.
• Change User: Change the current user. You have the choice to select a different user or create a new one. There is also a toolbar button for this option.
• Import Archived Jobs: Opens a file dialog which allows you to select a zip file containing an archived job which you would like to add back to the Monitor. See the Archiving Jobs documentation for more information.

View Menu

• Manual Refresh: Forces an immediate refresh of all the data in the Monitor. Manual refreshing is disabled by default, and can only be enabled in the Monitor Settings in the Repository Configuration.
• New Panel: Spawn a new information panel. See the Information Panels section above for more information. There is also a toolbar button for this option.
• Lock Panels: Prevents the panels from being moved. Panels can still be floated, docked, and closed. To dock a floating panel, double-click on the panel's title. There is also a toolbar button for this option.
• Pinned Layouts: You can save different Monitor layouts for quick use. By selecting Pin Current Layout, your current layout will be added to your pinned layouts. Selecting a pinned layout will restore the Monitor's panels to the pinned layout's state. There is also a toolbar button for this option.
• Save Layout: Saves the current layout to file.
• Open Layout: Load a previously saved layout from file.
• Reset Layout: Reset the current layout to the Monitor's default layout.
• Save Styles: Saves the current Monitor styles to file. For more on saving Monitor styles, see the Styles documentation.
• Load Styles: Loads previously saved Monitor styles from file. For more on loading Monitor styles, see the Styles documentation.

Submit and Script Menus

Submission scripts can be found under the Submit menu, and general scripts can be found under the Scripts menu. Many scripts are already shipped with Deadline, and additional custom scripts can be written. Check out the Monitor Scripts documentation for more information.

Tools Menu

• Super User: Enter Super User Mode, which allows you to access the administrative Monitor options. Super User Mode can be password protected simply by specifying a password in the Access Control section of the Repository Configuration.
• View Repository History: View all repository history entries generated on the farm.
• View Power Management History: View all power management history entries on the farm. See the Power Management documentation for more information.
• View Farm Reports: View various repository statistical information. See the Farm Statistics documentation for more information.
• Manage Pools: Add or remove Pools, and configure which Pools are assigned to the Slaves. See the Pools and Groups documentation for more information.
• Manage Groups: Add or remove Groups, and configure which Groups are assigned to the Slaves. See the Pools and Groups documentation for more information.
• Manage Users: Add or remove users, and set user information. See the User Management documentation for more information.
• Manage User Groups: Add or remove a user group, and set user group permissions to control which features are accessible. See the User Management documentation for more information.
• Configure Repository Settings: Configure a wide range of global settings. See the Repository Configuration documentation for more information.
• Configure Slave Scheduling: Configure the slave scheduling options. See the Slave Scheduling documentation for more information.
• Configure Power Management Options: Configure the Power Management settings. See the Power Management documentation for more information.
• Configure Cloud Providers: Set up and enable cloud service providers. See the Cloud Controls documentation for more information.
• Configure Plugins: Configure the available render plugins, such as 3ds Max, After Effects, Maya, and Nuke. See the plugin documentation for more information on the configurable settings for each plugin.
• Configure Event Plugins: Configure the available event plugins such as Draft and Shotgun. See the event plugin documentation for more information on the configurable settings for each plugin.
• Connect to Pulse Log: Use this to remotely connect to the Pulse log. See the Remote Control documentation for more information.
• Perform Pending Jobs Scan: Performs a scan of pending jobs and determines if any should be released. This operation is normally performed automatically, but you can force an immediate scan with this option if desired.
• Perform House Cleaning: Clean up files for deleted jobs, check for stalled slaves, etc. This operation is normally performed automatically, but you can force an immediate clean up with this option if desired.
• Undelete Jobs: Use this to recover any deleted jobs that haven't been purged from the database yet.
• Explore Repository Root: View the root directory of the current Repository.
• Import Settings: Import settings from another Repository.
• Synchronize Scripts and Plugin Icons: Rebuilds the script-specific menus, and updates your local plugin icon cache with the icons that are currently in the Repository. Note that if any new icons are copied over, you will have to restart the Monitor before the jobs in the list show the new icons.
• Local Slave Controls: Opens the Local Slave Controls window, which allows you to control and configure the Slave that runs on your machine.


• Options: Modify the Monitor and User Settings. There is also a toolbar button for this option.

4.2.6 Command Line Options

To run the Monitor from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X) and run the ‘deadlinemonitor’ application. To view all available command line arguments, you can run the following:

deadlinemonitor -help

Available Options

To start a new Monitor if there is already another Monitor running, use the -new option:

deadlinemonitor -new

To start the Monitor connected to a different repository, use the -repository option. You can combine this with the -new option to have different Monitors connected to different repositories:

deadlinemonitor -repository "\\repository\path"
deadlinemonitor -new -repository "\\repository\path"

To start the Monitor without the splash screen, use the -nosplash option:

deadlinemonitor -nosplash

To shut down the Monitor if it's already running, use the -shutdown option:

deadlinemonitor -shutdown

You can also set all of the Monitor Options using command line options. For example:

deadlinemonitor -draganddropdep True -groupjobbatches False

4.2.7 FAQ

I'm unable to move panels in the Monitor, or dock floating panels.
You need to unlock the Monitor layout. This can be done from the View menu or from the toolbar.

Can I dock a floating panel when the Monitor layout is locked?
Yes, you can dock the floating panel by double-clicking on its title bar. It will be docked to its previous location, or to the bottom of the Monitor if it wasn't docked previously.

What does it mean when a Slave is orange in the Slave list?
This means that the Slave is currently unable to get a license.


4.3 Slave

4.3.1 Overview

The Slave is the application that controls the rendering applications, and it should be running on any machine you want to include in the rendering process.


4.3.2 Running the Slave

To start the Slave:

• On Windows, you can start the Slave from the Start Menu under Thinkbox\Deadline, or from the Launcher's right-click menu.
• On Linux, you can start the Slave from a terminal window by running the deadlineslave script in the bin folder, or from the Launcher's right-click menu.
• On Mac OS X, you can start the Slave from Finder by running the DeadlineSlave application in Applications/Thinkbox/Deadline, or from the Launcher's right-click menu.

You can also configure the Slave to launch automatically when the Launcher starts up. To enable this, just enable the Launch Slave At Startup option in the Launcher menu. The Slave can also be started from a command prompt or terminal window. For more information, see the Slave Command Line documentation.

4.3.3 Licensing

The Slave requires a license to run, and more information on setting up licensing can be found in the Licensing Guide. The Slave only requires a license while rendering. If a Slave cannot get a license, it will continue to run, but it won't be able to pick up jobs for rendering. In addition, when a Slave becomes idle, it will return its license. The Slave's licensing information can be found under the Slave Information tab (see next section). If you have more than one Slave running on a machine, they will all share the same license.

4.3.4 Job and Slave Information Tabs

The Job Information tab shows information about the job currently being rendered. By default, the tab will show information about all render threads combined, but the drop down control gives the option to show information about a specific render thread. The Slave Information tab shows information about the Slave and the machine that it’s running on, including license information and resource usage (CPU and memory).



4.3.5 Viewing the Slave Log

To view the Slave’s current log, simply press the Open Slave Log button at the bottom of the Slave window. This will open the Slave’s log in a new window to avoid impacting the performance of the main Slave application.

If the Slave is running in the background or without an interface, you can connect to the Slave’s log from the command line. In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X) and run the following, where “SLAVENAME” is the name of the Slave you want to connect to: deadlinecommand -ConnectToSlaveLog "SLAVENAME"

4.3.6 Slave Menu Options

The available options are listed below. They are available in the Slave’s window, or from the Slave system tray icon’s right-click menu. Note that if the Slave is running in the background or without an interface, these options will be unavailable.


File Menu

• Change License Server: Change the license server that the Slave connects to.

Options Menu

• Hide When Minimized: The Slave is hidden when minimized, but can be restored using the Slave icon in the system tray.
• Minimize On Startup: Starts the Slave in the minimized state.

Control Menu

• Search For Jobs: If the Slave is sitting idle, this option can be used to force the Slave to search for a job immediately.
• Cancel Current Task: If the Slave is currently rendering a task, this forces the Slave to cancel it.
• Continue Running After Current Task Completion: Check to keep the Slave application running after it finishes its current task.
• Stop/Restart Slave After Current Task Completion: Check to stop or restart the Slave application after it finishes its current task.
• Shutdown/Restart Machine After Current Task Completion: Check to shut down or restart the machine after the Deadline Slave finishes its current task.

4.3.7 Command Line Options

To run the Slave from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X) and run the 'deadlineslave' application. To view all available command line arguments, you can run the following:

deadlineslave -help

Available Options

To start a new instance of the Slave, use the -name option. If you already have multiple instances of the Slave configured, use the -name option to start a specific instance: deadlineslave -name "second-slave"

To start the Slave without a user interface, use the -nogui option:


deadlineslave -nogui

To start the Slave without the splash screen, use the -nosplash option:

deadlineslave -nosplash

To shut down the Slave if it's already running, use the -shutdown option. This can be combined with the -name option if you have more than one Slave instance running and you want to shut down a specific instance:

deadlineslave -shutdown
deadlineslave -shutdown -name "second-slave"

To control what a running Slave should do after it finishes rendering its current task, use the -aftertask option. The available options are Continue, StopSlave, RestartSlave, ShutdownMachine, or RestartMachine. This can be combined with the -name option if you have more than one Slave instance running and you want to control a specific instance:

deadlineslave -aftertask RestartSlave
deadlineslave -aftertask RestartMachine -name "second-slave"

4.3.8 FAQ

Can I run the Slave on an artist's workstation?
Yes. On Windows and Linux, you can set the Affinity in the Slave Settings to help reduce the impact that the renders have on the artist's workstation.

Can I run the Slave as a service or daemon?
Yes. If you're running the Launcher as a service or daemon, then it will run the Slave in the background as well. See the Client Installation documentation for more information.

The Slave keeps reporting errors for the same job instead of moving on to a different job. What can I do?
You can enable Bad Slave Detection in the Repository Configuration to have a slave mark itself as bad for a job when it reports consecutive errors on it.

What does it mean when a Slave is stalled, and is this a bad thing?
Slaves become stalled when they don't update their status for a long period of time, which is often an indication that the slave has crashed. A stalled slave isn't necessarily a bad thing, because it's possible the slave just wasn't shut down properly (it was killed from the Task Manager, for example). In either case, it's a good idea to check the slave machine and restart the slave application if necessary.

On Linux, the Slave is reporting that the operating system is simply 'Linux', instead of showing the actual Linux distribution.
In order for the Slave to report the Linux distribution properly, you need to have lsb installed, and lsb_release needs to be in the path. You can use any package management application to install lsb.
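For example, you can verify that lsb_release is installed and on the path by running it from a terminal (the output varies by distribution):

lsb_release -a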


4.4 Pulse

4.4.1 Overview

Pulse is an optional mini server application that performs maintenance operations on the farm, and manages more advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering, and the Web Service. If you choose to run Pulse, it only needs to be running on one machine. Note that Pulse does not play a role in job scheduling, so if you are running Pulse and it goes down, Deadline will still be fully operational (minus the advanced features).


If you are choosing a machine to run Pulse, you should be aware that non-Server editions of Windows have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of more than 10 render nodes, it is very likely that you'll hit this limitation every now and then (and the odds continue to increase as the number of machines increases). This is a limitation of the operating system, and isn't something that we can work around, so we recommend using a Server edition of Windows, or a different operating system like Linux.


4.4.2 Running Pulse

To start Pulse:

• On Windows, you can start Pulse from the Start Menu under Thinkbox\Deadline.
• On Linux, you can start Pulse from a terminal window by running the deadlinepulse script in the bin folder.
• On Mac OS X, you can start Pulse from Finder by running the DeadlinePulse application in Applications/Thinkbox/Deadline.

You can configure Pulse to launch automatically when the Launcher starts up (similar to how the Slave does this). This can be done by adding LaunchPulseAtStartup=True to the system's deadline.ini file. See the Client Configuration documentation for more information. Pulse can also be started from a command prompt or terminal window. For more information, see the Pulse Command Line documentation.
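As a rough illustration only (the exact file location and section header may differ per platform and installation, so treat this as an assumption and see the Client Configuration documentation for specifics), the deadline.ini entry might look like this:

[Deadline]
LaunchPulseAtStartup=True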

4.4.3 Viewing the Pulse Log

To view Pulse’s current log, simply press the Open Pulse Log button at the bottom of the Pulse window. This will open the Pulse log in a new window to avoid impacting the performance of the main Pulse application.

If Pulse is running in the background or without an interface, you can connect to the Pulse log from the command line. In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X) and run the following, where "PULSENAME" is the name of the Pulse you want to connect to:

deadlinecommand -ConnectToPulseLog "PULSENAME"

4.4.4 Configuring Pulse

Pulse needs to be configured so that the Slave applications know how to connect to Pulse. This is necessary for the Slave Throttling feature to function properly. There are a couple different ways to configure Pulse, which are described below.

Auto Configuration

If you launch Pulse, and a Primary Pulse hasn’t been set yet, it will automatically configure itself to be the Primary, and configure itself to be connected to by its host name. These settings can be changed from the Pulse Panel in the Monitor at any time. See the Pulse Configuration documentation for more information. If Pulse has already been configured, but you want to quickly switch to another machine to run Pulse on, simply launch Pulse on the desired machine. Then when it appears in the Pulse list in the Monitor, right-click on it and select Auto Configure Pulse. Generally, this feature is only available in Super User mode.

Manual Configuration

The connection settings, as well as additional settings, can be configured for Pulse from the Monitor. Advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering, and the Web Service can also be configured in the Monitor. See the Pulse Configuration documentation for more information.

4.4.5 Pulse Menu Options

The available options are listed below. They are available in Pulse's window, or from the Pulse system tray icon's right-click menu. Note that if Pulse is running in the background or without an interface, these options will be unavailable.


Options Menu

• Hide When Minimized: Pulse is hidden when minimized, but can be restored using the Pulse icon in the system tray.
• Minimize On Startup: Starts Pulse in the minimized state.

Control Menu

• Perform Pending Job Scan: If Pulse is between repository pending job scans, this option can be used to force Pulse to perform a pending job scan immediately. A pending job scan releases pending jobs by checking their dependencies or scheduling options.
• Perform Repository Clean-up: If Pulse is between repository clean-ups, this option can be used to force Pulse to perform a repository clean-up immediately. A repository clean-up includes deleting jobs that are marked for automatic deletion.
• Perform Repository Repair: If Pulse is between repository repairs, this option can be used to force Pulse to perform a repository repair immediately. A repository repair includes checking for stalled slaves and orphaned limit stubs.
• Perform Power Management Check: If Pulse is between power management checks, this option can be used to force Pulse to perform a power management check immediately.

4.4.6 Command Line Options

To run Pulse from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X) and run the 'deadlinepulse' application. To view all available command line arguments, you can run the following:

deadlinepulse -help

Available Options

To start Pulse without a user interface, use the -nogui option:

deadlinepulse -nogui

To start Pulse without the splash screen, use the -nosplash option:

deadlinepulse -nosplash

To shut down Pulse if it's already running, use the -shutdown option:

deadlinepulse -shutdown


4.4.7 FAQ

Can I run Pulse on any machine in my farm?
You can run Pulse on any machine in your farm, including the Repository or Database machine. However, for larger farms, we recommend running Pulse on a dedicated machine. When choosing a machine to run Pulse on, you should be aware that non-Server editions of Windows have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of more than 100 machines, it is very likely that you'll hit this limitation every now and then (and the odds continue to increase as the number of machines increases). Therefore, if you are running Pulse on a farm with 100 machines or more, we recommend using a Server edition of Windows, or a different operating system like Linux.

Can I run Pulse as a service or daemon?
Yes. If you're running the Launcher as a service or daemon, then it will run Pulse in the background as well. See the Client Installation documentation for more information.

If Pulse is shut down or terminated, is the Power Management feature still functional?
In this case, the only aspect of Power Management that is still functional is the Temperature Checking. Redundancy for temperature checking has been built into the Slave application, so if Pulse isn't running, you're still protected if the temperature in your farm room begins to rise.

Which temperature sensors work with Power Management?
We have tested with many different temperature sensors. Basically, as long as a temperature sensor uses SNMP, and you know its OID (which is configurable in the Power Management settings), it should work.

4.5 Balancer

4.5.1 Overview

The Balancer is a cloud controller application capable of virtual/physical, private/public, remote/local simultaneous machine orchestration. It can create, start, stop, and terminate cloud instances based on the current queue load, taking into account jobs and tasks. Further customization to take into account other job/task factors can be achieved by using the Deadline plugin API to create a custom Balancer algorithm.


The Balancer works in cycles, and each cycle consists of a number of stages.

• First, the Balancer will do a House Keeping step in which it cleans up any disks or instances that should have been terminated but were not.
• Second, the Balancer will execute the Balancer Algorithm. These are the steps of the default algorithm (note that these steps can be customized with your own Balancer Algorithm plugin):

– Create State Structure: This sets up the data structures used in the rest of the algorithm.
– Compute Demand: Examines the groups for jobs that are queued and assigns a weighting to the group based on the amount of tasks that need to be done and the group priority.
– Determine Resources: Here we determine how much space we have available with our provider and how many limits we have.
– Compute Targets: Based on the Demand and the available Resources we set a target number of instances for each group.
– Populate Targets: This sets up a full target data structure for use in Deadline.


• Third, the Balancer will equalize the targets by starting or terminating instances.
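To make the Compute Demand and Compute Targets stages more concrete, here is a small, purely illustrative Python sketch. It does not use the real Deadline Balancer plugin API; the group names, priorities, and the max_instances cap are hypothetical values chosen only to show the idea of weighting demand and dividing the available capacity proportionally.

# Illustrative only: compute per-group instance targets from queued-task
# demand and a provider capacity cap, mirroring the Compute Demand and
# Compute Targets stages described above. This is NOT the Balancer plugin API.

def compute_targets(groups, max_instances):
    # Compute Demand: weight each group by its queued tasks scaled by priority.
    weights = {name: info["queued_tasks"] * info["priority"]
               for name, info in groups.items()}
    total = sum(weights.values())
    if total == 0:
        return {name: 0 for name in groups}

    # Compute Targets: split the available capacity in proportion to demand,
    # never asking for more instances than a group has queued tasks.
    targets = {}
    for name, info in groups.items():
        share = int(round(max_instances * weights[name] / total))
        targets[name] = min(share, info["queued_tasks"])
    return targets


if __name__ == "__main__":
    # Hypothetical groups, priorities, and provider capacity.
    groups = {
        "maya": {"queued_tasks": 120, "priority": 50},
        "nuke": {"queued_tasks": 30, "priority": 80},
    }
    print(compute_targets(groups, max_instances=20))  # e.g. {'maya': 14, 'nuke': 6}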

4.5.2 Running the Balancer

To start the Balancer:

• On Windows, you can start the Balancer from the Start Menu under Thinkbox\Deadline.
• On Linux, you can start the Balancer from a terminal window by running the deadlinebalancer script in the bin folder.
• On Mac OS X, you can start the Balancer from Finder by running the DeadlineBalancer application in Applications/Thinkbox/Deadline.

You can configure the Balancer to launch automatically when the Launcher starts up (similar to how the Slave does this). This can be done by adding LaunchBalancerAtStartup=True to the system's deadline.ini file. See the Client Configuration documentation for more information. The Balancer can also be started from a command prompt or terminal window. For more information, see the Balancer Command Line documentation.

4.5.3 Viewing the Balancer Log

To view the Balancer’s current log, simply press the Open Balancer Log button at the bottom of the Balancer window. This will open the Balancer log in a new window to avoid impacting the performance of the main Balancer application.


If the Balancer is running in the background or without an interface, you can connect to the Balancer log from the command line. In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X) and run the following, where “BALANCERNAME” is the name of the Balancer you want to connect to: deadlinecommand -ConnectToBalancerLog "BALANCERNAME"

4.5.4 Configuring the Balancer

The Balancer needs to be configured before it can do anything. See the Balancer Configuration documentation for more information.


4.5.5 Balancer Menu Options

The available options are listed below. They are available in the Balancer's window, or from the Balancer system tray icon's right-click menu. Note that if the Balancer is running in the background or without an interface, these options will be unavailable.

Options Menu

• Hide When Minimized: The Balancer is hidden when minimized, but can be restored using the Balancer icon in the system tray.
• Minimize On Startup: Starts the Balancer in the minimized state.

Control Menu

• Perform Balancing: If the Balancer is between balancing cycles, this option forces the Balancer to perform a balancing cycle immediately. A balancing cycle looks at tasks, groups, limits, and cloud regions to determine if it should create or terminate cloud instances.

4.5.6 Command Line Options

To run the Balancer from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X) and run the 'deadlinebalancer' application. To view all available command line arguments, you can run the following:

deadlinebalancer -help

Available Options

To start the Balancer without a user interface, use the -nogui option:

deadlinebalancer -nogui

To start the Balancer without the splash screen, use the -nosplash option:

deadlinebalancer -nosplash

To shut down the Balancer if it's already running, use the -shutdown option:

deadlinebalancer -shutdown


4.6 Command

4.6.1 Overview

The deadlinecommand application is a command line tool for the Deadline render farm management system. It can be used to control, query, and submit jobs to the farm. There is also a deadlinecommandbg application which is identical to deadlinecommand, except that it is executed in the background. When using deadlinecommandbg, the output and exit code are written to the Deadline temp folder as dsubmitoutput.txt and dsubmitexitcode.txt respectively. If you want to control where these files get written to, you can use the '-outputFiles' option, followed by the paths to the output and exit code file names. For example:

deadlinecommandbg -outputFiles c:\output.txt c:\exitcode.txt -pools

You can find the deadlinecommand and deadlinecommandbg applications in the Deadline bin folder (Windows or Linux) or the Resources folder (Mac OS X).

4.6.2 Command Line Options

The supported command line options and their usage instructions can be printed out by running 'deadlinecommand' from a command prompt or terminal with the '-help' argument:

deadlinecommand -help

To get usage information for a specific command, specify the command name after the -help argument: deadlinecommand -help SubmitCommandLineJob

4.6.3 Usage Examples

Submitting a Job

To submit a 3dsmax scene (e.g. C:\MyScene.max), you must first create a job submission info file (e.g. C:\job_info.job) and a 3dsmax plugin info file (e.g. C:\max_info.job). See the Manual Job Submission documentation for more information. Once the files are created, you can submit the job using this command:

deadlinecommand "C:\job_info.job" "C:\max_info.job" "C:\MyScene.max"
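As a rough, illustrative sketch of what those two files might contain (the key names and values below are examples only; the Manual Job Submission documentation lists the keys each plugin actually supports):

C:\job_info.job:

Plugin=3dsmax
Name=Example 3dsmax Job
Frames=0-100
Pool=3dsmax
Priority=50

C:\max_info.job:

Version=2015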

Querying For Jobs Using Filters

To query for all jobs that belong to jsmith or cdavis: deadlinecommand -getjobsfilter username=jsmith username=cdavis

To query for all of jsmith’s jobs with completed status:


deadlinecommand -getjobsfilterand username=jsmith status=completed

Checking Which Slaves Are Assigned To A Specific Pool

To check which slaves are assigned to the 3dsmax pool: deadlinecommand -getslavenamesinpool 3dsmax Assigned

To check which slaves are excluded from the xsi pool: deadlinecommand -getslavenamesinpool Xsi Excluded

Querying For Task Information

To query for task information for the job with the ID of “546cc87357dbb04344a5c6b5”: deadlinecommand -getjobtasks 546cc87357dbb04344a5c6b5

Retrieving and Changing Job Status

To retrieve the status of the job with the ID of “546cc87357dbb04344a5c6b5”: deadlinecommand -getjob 546cc87357dbb04344a5c6b5

To retrieve all of the job's details: deadlinecommand -getjobdetails 546cc87357dbb04344a5c6b5

To suspend the job with the ID of "546cc87357dbb04344a5c6b5":

deadlinecommand -suspendjob 546cc87357dbb04344a5c6b5
deadlinecommand -suspendjobnonrenderingtasks 546cc87357dbb04344a5c6b5

To resume the job: deadlinecommand -resumejob 546cc87357dbb04344a5c6b5

To requeue the job: deadlinecommand -requeuejob 546cc87357dbb04344a5c6b5

To delete the job: deadlinecommand -deletejob 546cc87357dbb04344a5c6b5

To archive the job:


deadlinecommand -archivejob 546cc87357dbb04344a5c6b5

Sending An Email

To send the message to [email protected] (cc [email protected]): deadlinecommand -sendemail -to [email protected] -cc [email protected] -subject "the subject" -message "C:\MyMessage.html"

To send the same message with the attachment “C:\MyAttachment.txt”: deadlinecommand -sendemail -to [email protected] -cc [email protected] -subject "the subject" -message "C:\MyMessage.html" -attach "C:\MyAttachment.txt"

Note that the -to, -subject, and -message options are required. The other two options are optional.

4.6.4 FAQ

What's the difference between the deadlinecommand and deadlinecommandbg applications?
The deadlinecommandbg application is identical to deadlinecommand, except that it is executed in the background. When using deadlinecommandbg, the exit code and output are written to the Deadline temp directory as dsubmitexitcode.txt and dsubmitoutput.txt respectively.

4.7 Mobile

4.7.1 Overview

The Mobile application allows you to monitor your jobs from anywhere. The application connects to a Pulse server over the Web Service to download information about the state of your jobs, so Pulse must be running before you can use the Mobile application. See the Pulse Web Service documentation for more information. The minimum requirements for the Mobile application are as follows:

• Android: Deadline 5.0 and Android 2.1
• iPhone or iPad: Deadline 4.1 and iPhone OS 3.0 - 7.10
• Windows Phone: Deadline 5.0 and Windows Phone 7.0

4.7.2 Mobile Setup

When you launch the Mobile application for the first time, you will need to configure it so that it can connect to your Pulse server. Just press the Settings button in the top left corner. The important settings are the Deadline User settings and the Pulse Server settings. For Mobile to connect to Pulse, you must provide the following information:

• Deadline User Settings -> User Name: This is the Deadline user that you normally submit render jobs from.


• Deadline User Settings -> Password: If the Pulse Web Service has been configured to require authentication, and empty passwords are not allowed, you must enter your user password here. This is the password that you specify in your User Settings in the Monitor. See the User Settings documentation for more information.
• Pulse Server Settings -> Server Name: This is the host name or IP address of your Pulse server.
• Pulse Server Settings -> Server Port: The default is 8080, and should only be changed if the Pulse Web Service has been configured to listen on a different port.

After you have configured your Server and User settings, press the Job List button to return and press the Refresh button to connect to Pulse and load the job list. If you get an error when Mobile attempts to contact Pulse, see the Troubleshooting section for known errors and solutions.

4.7.3 Job List

The job list is the main screen, and by default it shows all the jobs in the repository. See the Settings section below for information on how to sort and filter this list. You can also use the search field to search for specific jobs.

To refresh the job list, just press the Refresh button. If you want to see more information about a specific job, press the button to the right of the job name to bring up the job details panel.

4.7.4 Job Details

The job details panel shows additional information for a specific job. In this view, you can see most of the information you could normally see in the Monitor.


To refresh the job details, just press the Refresh Job button. To return to the job list, press the Job List button in the upper left corner.

4.7.5 Settings

The settings panel can be accessed from the job list by pressing the Settings button. You can access the online help by pressing the Help button in the top right corner (Android) or by scrolling down to find the Online Help link (iPhone). To return to the job list, press the Job List button in the upper left corner.

Auto Refresh Settings
• Job List: If enabled, the job list will automatically refresh itself at the increment defined in Job List Interval.
• Job Details: If enabled, the job's details will automatically refresh themselves at the increment defined in Job Details Interval.


Job List Filter Settings
• Configure filters to only show the jobs that you're interested in.

Job List Sort Settings
• Ego-centric Sort: If enabled, all of your jobs will appear at the top of the job list, followed by the remaining jobs.
• Primary Sort: Set the primary sort field and order for the job list.
• Secondary Sort: Set the secondary sort field and order for the job list.

Deadline User Settings
• User Name: Your Deadline user name. This is the user that you normally submit render jobs under.
• Password: If the Pulse Web Service has been configured to require authentication, and empty passwords are not allowed, you must enter your user password here. This is the password that you specify in your User Settings in the Monitor. See the User Settings documentation for more information.

Pulse Server Settings
• Server Name: This is the host name or IP address of your Pulse server.
• Server Port: The default is 8080, and should only be changed if the Pulse Web Service has been configured to listen on a different port.

Proxy Server Settings
• Server URL: If you are using a proxy web server, you may need to set a more specific URL to connect to the Pulse server.
• Http Authorization: If your proxy web server requires HTTP authorization, you should enable this option and specify the user name and password.
• SSL: If you are using a proxy web server that requires SSL, you should enable this option. Note that this will change the server port in the Pulse Server Settings to 443 by default.

Download Information
• This is a running tally of the data that you've downloaded from the Pulse server.

4.7.6 Pulse Proxy Server

Depending on the security restrictions of your studio, you may wish to set up a proxy server that acts as a middleman between Mobile and Pulse. You can run the proxy server on a different machine, and configure it to require authentication, use SSL, etc. We have example scripts that you can start with by downloading Pulse Proxy Script For Deadline Mobile from the Miscellaneous Deadline Downloads Page.

• Place these scripts into a cgi script executable folder. For Apache, the default is the cgi-bin directory, but different folders can be configured as script folders.
• Once the scripts are in the folder, running them should yield a 403: Not authorized error until the script has been configured. The proxy scripts have been written to assume that the root web directory will be where the scripts will be run. Because of this, if they are placed into the cgi-bin folder you must prepend '\cgi-bin\' to the URI regular expression test in the scripts. Note that all slashes and regular expression special characters must be escaped (hence the double slash).
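As a purely hypothetical illustration of that change (the actual patterns and scripting language in the example scripts may differ), a URI test along the lines of re.match("^\\/Mobile_GetJobs", uri) would become re.match("^\\/cgi\\-bin\\/Mobile_GetJobs", uri), with the slashes escaped as described above.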


Common pitfalls with this are forgetting to mark the scripts as executable on Unix-based systems (use "chmod og+x Mobile_GetJob*" to mark them executable), and forgetting to set the owner and group to the same user and group that the web server runs as (use "chown www:www Mobile_GetJob*" on most systems). Note that we provide these scripts as is, and we don't officially support them. However, if you are having difficulties, contact Deadline Support and we'll do what we can to help.

4.7.7 Troubleshooting

These are some known Mobile errors and solutions.

You must provide a password for authentication
This error occurs when a password has not been set for the current user while authentication is enabled and empty passwords are not accepted. To resolve this issue, you must fill in the Web Service Password field for the user in the User Settings in the Monitor. Before you can connect, you may need to wait for Pulse to update its network settings or manually restart Pulse.

The provided user name and password are invalid
This error occurs when the password provided is incorrect for the given user. If you believe the password is correct, you may need to wait for Pulse to update its network settings or manually restart Pulse.

The provided user name is invalid
This error occurs when the provided user is not in Pulse's cached list. If the user name is valid, you may need to wait for Pulse to update its network settings or manually restart Pulse.

There was an error connecting to Pulse
This error occurs when there are two errors connecting to Pulse in a row. The likely cause is that Pulse is not running on the specified server. Verify that Pulse is running on the specified server and that you have entered the server's name or IP address correctly. If you have a name specified for the server and are not on the local area network of that machine, you may need to enter the server's IP address instead of its name.

Network Error: The connection with the server failed. Please check your server settings in the Settings Section
Double check your settings in Mobile to make sure they match the required information. If all the Mobile settings are entered correctly and you still cannot connect, look in your general mobile device settings and make sure you are connected to the right network. Depending on how things are set up, your device may try to connect to the strongest network in the area; if the network it switches to doesn't have the correct settings to connect to your server, the connection will fail. If you are still unable to connect, try rebooting the device (fully power off your device and power it back on). This error also occurs when the server you are trying to connect to has lost access to the internet, so double check that the server is connected to the internet.

4.7.8 FAQ

How do I get the Mobile application?
The Mobile application can be downloaded from the Android Market and the iPhone App Store.

How much does Mobile cost?
Nothing, it's free!

CHAPTER FIVE

ADMINISTRATIVE FEATURES

5.1 Repository Configuration

5.1.1 Overview

There are a wide variety of Repository options that can be configured. These options can be modified at any time from the Deadline Monitor while in Super User Mode by selecting Tools -> Configure Repository Options. If you want to restore all the Repository Options to their defaults, simply click the Reset Settings button.

Note that long-running applications like the Launcher, Slave, and Pulse only update these settings every 10 minutes, so after making changes, it can take up to 10 minutes for all machines to recognize them. You can restart these

applications to have them recognize the changes immediately.

5.1.2 Client Setup

These settings affect the Deadline Client installed on each machine.
• Remote Administration: Enabling Remote Administration allows the Deadline Clients to be controlled remotely from the Monitor running on another machine. Note that this can be a security risk if you are not behind a firewall.
• Automatic Upgrades: Enabling Automatic Upgrades allows the Deadline Clients to detect if the Repository has been upgraded, and upgrade themselves if necessary. Note that the upgrade check is only performed when launching applications via the Launcher.

5.1.3 Monitor Settings

These settings affect the Deadline Monitor application on each machine.

Monitor Layouts

Existing Monitor layouts can be added here. These layouts can be assigned to User Groups as a user’s default layout. If the Pinned option is enabled, they can also be chosen from the Pinned Layouts menu in the Monitor. The order of

the layouts here will be the same in the Pinned Layouts menu.

To add a new layout, simply press the Add button, and then choose an existing Monitor layout file, or use the current Monitor’s layout. Note that Monitor layout files can be saved from the Monitor by selecting View -> Save Layout.

Update Settings

Enable Manual Refreshing
If your Auto Refreshing Intervals are set to longer intervals, manual refreshing in the Monitor can be enabled to allow users to get the most up to date data immediately. To prevent users from abusing manual refreshing, a minimum interval between manual refreshes can be configured.

Sorting and Filtering


For farms that have a large number of jobs (10,000+) or slaves (1000+), disabling Automatic Sorting and Filtering in the lists in the Monitor can improve the Monitor’s overall performance. This option in the Repository Options can be used to disable Automatic Sorting and Filtering by default, and users can enable it later in their Monitors if desired.

5.1.4 Slave Settings

These settings affect the Deadline Slave application on each machine.

Slave Settings

General
• Limit the number of characters per line for standard output handling: Lines of standard output that are longer than the specified limit will be ignored by the Slave's stdout handling.
• Delete Offline/Stalled Slaves from the Repository after this many days: Slaves that are Offline or Stalled will be removed from the Repository after this many days.
• Gather System Resources (CPU and RAM) When Rendering Tasks On Linux/Mac: If enabled, the Slave will collect CPU and RAM usage for a task while it is rendering. We have seen cases where this can cause the Slave to crash on Linux or Mac, so you should only disable this feature if you run into this problem.
• Use fully qualified domain name (FQDN) for Machine Name instead of host name: If enabled, the Slave will try to use the machine's fully qualified domain name (FQDN) when setting its Machine Name instead of using the machine's host name. The FQDN will then be used for Remote Control, which can be useful if the remote machine name isn't recognized in the local network. If the Slave can't resolve the FQDN, it will just use the host name instead (illustrated in the sketch below).
• Use Slave's IP Address for Remote Control: If enabled, the Slave's IP address will be used for remote control instead of trying to resolve the Slave's host name.
Wait Times
• Number of Minutes Before An Unresponsive Slave is Marked as Stalled: If a slave has not provided a status update in this amount of time, it will be marked as stalled.
• Number of Seconds To Wait For a Response When Connecting to Pulse: The number of seconds a slave that is connected to Pulse will wait for Pulse to respond when querying for a job.
• Number of Seconds Between Thermal Shutdown Checks if Pulse is Offline: The number of seconds between thermal shutdown checks. The Slave only does this check if Pulse is not running.
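For illustration, the difference between a host name and an FQDN can be seen with Python's standard socket module. This is only a conceptual sketch of the fallback behaviour described above, and the machine names shown are hypothetical:

    import socket

    host_name = socket.gethostname()  # e.g. "render-01"
    fqdn = socket.getfqdn()           # e.g. "render-01.studio.local"

    # Conceptually, the Slave prefers the FQDN when this option is enabled,
    # and falls back to the plain host name if the FQDN cannot be resolved.
    machine_name = fqdn if "." in fqdn else host_name
    print(machine_name)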

Extra Properties

Extra arbitrary properties can be set for slaves, and these properties can be given user friendly names so that they can easily be identified and used to filter and sort slaves in the Monitor.


5.1.5 Performance Settings

These settings are used to influence the performance of Deadline by modifying update intervals.
Auto Adjust
The auto adjust option will try to choose the best interval settings based on the number of slaves in your farm. These should act as a good base that you can modify later as necessary.
Monitor Refresh Intervals
• Number of Seconds Between Job Updates: This controls how often the Monitor reads in new job updates.
• Number of Seconds Between Slave Updates: This controls how often the Monitor reads in new slave updates.
• Number of Seconds Between Pulse Updates: This controls how often the Monitor reads in new pulse updates.
• Number of Seconds Between Limit Updates: This controls how often the Monitor reads in new limit updates.
• Number of Seconds Between Settings Updates: This controls how often settings such as groups, pools and users are updated.
• Number of Seconds Between Cloud Updates: This controls how often the Monitor updates the Cloud Panel.
• Number of Seconds Between Balancer Updates: This controls how often the Monitor reads in new Balancer updates.


Slave Intervals
• Number of Seconds Between Slave Information Updates: This controls how often the Slave updates the information that's shown in the Slave list in the Monitor.
• Number of Seconds Between Queries For New Tasks While the Slave is Rendering: The number of seconds a Slave will wait after it finishes a task before moving on to another. This delay is not applied when the Slave is idle.
• Multiplier to determine seconds between queries while the Slave is Idle: The multiplier applied to the number of slaves to determine how long a slave will wait between polls to the Repository for tasks when it is idle (see the sketch below).
• Maximum number of seconds between Job queries while the Slave is Idle: The maximum number of seconds a slave will wait between polls to the Repository for tasks when it is idle.
• Minimum number of seconds between Job queries when the Slave is Idle: The minimum number of seconds a slave will wait between polls to the Repository for tasks when it is idle.
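As a rough illustration of how the last three settings interact, the idle polling interval scales with the size of the farm and is clamped to the configured minimum and maximum. This is a simplified reading, not Deadline's exact internal formula, and the values are hypothetical:

    def idle_poll_interval(slave_count, multiplier, min_seconds, max_seconds):
        # Simplified sketch: scale by farm size, then clamp to the configured range.
        return max(min_seconds, min(max_seconds, multiplier * slave_count))

    # Hypothetical values: 500 slaves with a 0.1 multiplier, clamped to 10-60 seconds.
    print(idle_poll_interval(500, 0.1, 10, 60))  # 50.0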


5.1.6 Pulse Settings

These settings control how the Slaves connect to Pulse for Throttling, and are also used by the Slave to determine if Pulse is running.

General

• Maximum Incoming Connections: The maximum number of Slaves that can connect to Pulse at any given time.
• Connection Timeout (in milliseconds): The number of milliseconds messages to and from Pulse have to complete before they timeout.
• Maximum Connection Attempts: The maximum number of times a Slave will attempt to connect to Pulse before giving up.
• Stalled Pulse Threshold (in minutes): Deadline determines if a Pulse has stalled by checking the last time that the Pulse has provided a status update. If a Pulse has not updated its state in the specified amount of time, it will be marked as Stalled.
• Use Pulse's IP Address for Remote Control: If enabled, the Pulse's IP address will be used for remote control instead of trying to resolve the Pulse's host name.


Power Management

• Power Management Check Interval: How often Pulse performs Power Management operations.

Throttling

Throttling can be used to limit the number of slave applications that are copying over the job files at the same time. This can help network performance if large scene files are being submitted with the jobs. Note that a Slave only copies over the job files when it starts up a new job; when it goes to render subsequent tasks for the same job, it will not be affected by the throttling feature.
• Enable Throttling: Allow throttling to occur.
• Maximum Number of Slaves That Can Copy Job Files at The Same Time: The maximum number of Slaves that can copy a scene file at the same time.
• The Interval a Slave Waits Between Updates To See If It Can Start Copying Job Files: The amount of time (in seconds) a Slave will wait between sending throttle checks and updates to Pulse.
• Throttle Update Timeout Multiplier (based on the Slave Interval): The interval a slave waits between updates is multiplied by this value to determine the timeout value, as in the short example below.
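For example, with hypothetical values, the throttle timeout is simply the slave update interval multiplied by the timeout multiplier:

    slave_interval_seconds = 5   # The Interval a Slave Waits Between Updates (hypothetical value)
    timeout_multiplier = 10      # Throttle Update Timeout Multiplier (hypothetical value)

    timeout_seconds = slave_interval_seconds * timeout_multiplier
    print(timeout_seconds)  # 50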


Web Service

Enable the Web Service
Pulse's Web Service allows you to execute commands and scripts from a browser, and must be enabled to use the Mobile applications and the Pulse RESTful API (see the REST Overview documentation). Note that if you enable or disable the Web Service feature while Pulse is running, it must be restarted for the changes to take effect.
• Enable the Web Service: Makes the Pulse Web Service available.
• Listening Port: The port on which the Web Service will listen.
• Connection Limit: The maximum number of concurrent connections allowed for the Pulse Web Service.
• Connection Timeout (in seconds): The amount of time between sending and receiving messages to and from the Web Service before a timeout occurs.
Security
If the Web Service requires authentication, users would use their Deadline user name along with the password stored in their User Settings. If empty passwords are allowed, they can leave their password setting blank.
• Require Authentication: If enabled, the Pulse Web Service will require a username and password. These are stored in the user settings.
• Allow Empty Passwords: If enabled, the Web Service will accept empty passwords.


• Allow Execution of Non-Script Commands: If enabled, users are allowed access to Deadline Command commands.
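As a rough sketch, a client such as Mobile talks to the Web Service over HTTP on the listening port. The example below is illustrative only: the URL layout, the GetFarmStatus script name, the user name and password, and the use of HTTP Basic authentication are all assumptions here rather than the documented interface; see the Pulse Web Service and REST Overview documentation for the actual endpoints.

    import urllib.request

    pulse_host = "pulse-server"      # hypothetical Pulse host name
    port = 8080                      # the configured Listening Port
    script = "GetFarmStatus"         # hypothetical script exposed by the Web Service

    url = "http://{0}:{1}/{2}".format(pulse_host, port, script)

    # If Require Authentication is enabled, supply the Deadline user name and the
    # password stored in that user's User Settings (Basic auth is assumed here).
    password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    password_mgr.add_password(None, url, "artist01", "secret")
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(password_mgr))

    print(opener.open(url, timeout=30).read().decode("utf-8"))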

5.1.7 Balancer Settings

These settings control general settings for the Balancer.
• Balancer Update Interval: How often the Balancer performs a balancing cycle.
• Current Algorithm Logic: The Balancer Plugin to use for determining balancing targets.
• Use Balancer's IP Address for Remote Control: If enabled, the Balancer's IP address will be used for remote control instead of trying to resolve the Balancer's host name.
• Stalled Balancer Threshold (in minutes): Deadline determines if a Balancer has stalled by checking the last time that the Balancer has provided a status update. If a Balancer has not updated its state in the specified amount of time, it will be marked as Stalled.
• Error Tolerance: How many times we try to connect to the primary Balancer before it fails and we make another Balancer the new primary.
• Enable Group Switching: If there are group mappings that have the same image and hardware types, instances will move between groups as needed. If this is not enabled, instances will shut down and start up as normal.
Any Balancer Plugin specific settings will be shown below.


5.1.8 Region Settings

This is where you can set up Regions in Deadline. Regions are logical groupings for slaves and users. Cross Platform Rendering and Balancer Settings can be unique to each region. For example, a slave that's in the 'thinkbox_west' Region will use the path mapping settings for that Region. The list on the right shows the Cloud Regions and the list on the left shows the general Regions. Regions must have a unique name; 'all' and 'none' are reserved names that cannot be used. See Regions for more information.


5.1.9 Email Notification

This section handles all email related settings within the repository.

Primary and Secondary Server

Set up a primary SMTP server to send email notifications. You can set up an optional secondary SMTP server for Deadline to use if the primary server is unavailable.
• SMTP Server: The SMTP server used by Deadline to send emails.
• Sender Account: The email account that Deadline will use to send emails from.
• Port: The SMTP port to use.
• Use SSL: If enabled, Deadline will use SSL when connecting to the SMTP server.
• SMTP Server Requires Authentication: Enable if the SMTP server requires a user name and password to authenticate.
• Testing: Send a test email to the specified email address.
• Automatically Generate Email Addresses for New Users: Generates email addresses for new users in the form username@postfix.


Note that if you have SSL enabled, you may need to configure your Linux and OSX machines for SSL to work. The process for doing this is explained in Mono’s Security Documentation.
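To sanity-check these settings outside of Deadline, a short Python script can send a message through the same SMTP server. This is just a test sketch; the host names, accounts, port, and password below are placeholders for your own values:

    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["Subject"] = "Deadline SMTP test"
    msg["From"] = "deadline@studio.example"   # corresponds to the Sender Account
    msg["To"] = "admin@studio.example"
    msg.set_content("If you can read this, the SMTP settings are working.")

    # SMTP_SSL corresponds to the Use SSL option; 465 is a common SSL port.
    with smtplib.SMTP_SSL("smtp.studio.example", 465) as server:
        server.login("deadline@studio.example", "password")  # Requires Authentication
        server.send_message(msg)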

Notifications

• Job Completed: When a job completes, an email will be sent to these email addresses.
• Job Timed Out: When a job times out, an email will be sent to these email addresses.
• Job Error Warning: When a job accumulates a certain number of errors, a warning email will be sent to these email addresses. You can configure the warning limit in the Failure Detection settings.
• Job Failed: When a job fails, an email will be sent to these email addresses.
• Job Corrupted: When a corrupted job is detected, an email will be sent to these email addresses.
• Slave License Errors: When a slave is unable to get a license, an email will be sent to these email addresses.
• Slave Status Errors: When a slave is unable to update its state in the Repository, an email will be sent to these email addresses.
• Slave Error Warning: When a slave accumulates a certain number of errors in one session, a warning email will be sent to these email addresses. You can configure the warning limit in the Failure Detection settings.
• Stalled Slave: When a stalled slave is detected, an email will be sent to these email addresses.
• System Administrator: When users use the option in the Error Report Viewer to report error messages to their system administrator, those emails will be sent to these email addresses.
• Low Database Connections: Low database connection notification emails will be sent to these email addresses.
• Database Connection Thresholds: When the number of available database connections is below the set threshold, a warning email will be sent.

Power Management Notifications

• Idle Shutdown: Notifications for Idle Shutdown operations will be sent to these email addresses.
• Machine Startup: Notifications for Machine Startup operations will be sent to these email addresses.
• Thermal Shutdown: Notifications for Thermal Shutdown operations will be sent to these email addresses.
• Machine Restart: Notifications for Machine Restart operations will be sent to these email addresses.


5.1.10 House Cleaning

Pending Job Scan

• Pending Job Scan Interval: The maximum amount of time between Pending Job Scans in seconds.
• Run Pending Job Scan in a Separate Process: If enabled, the pending job scan will be run in a separate process. This can be useful when using dependency scripts to ensure that a crash caused by the script doesn't cause the main application (Pulse, Slave, or Monitor) to crash.
– Write Pending Job Scan Output to Separate Log File: If enabled, all output from the pending job scan will be placed into a separate log file.
– Pending Job Scan Process Timeout: If running the pending job scan in a separate process, this is the maximum amount of time the process can take before it is aborted.
• Asynchronous Job Events: If enabled, many job events will be processed asynchronously by the Pending Job Scan operation, which can help improve the performance of the Monitor when performing operations on batches of jobs. If this is enabled, the OnJobSubmitted event will still be processed synchronously to ensure that any updates to the job are committed before the job can be picked up by Slaves.
– Maximum Job Events Per Session: The maximum number of pending job events that can be processed per scan.


House Cleaning

• House Cleaning Interval: The maximum amount of time between House Cleaning operations in seconds.
• Run House Cleaning in a Separate Process: If enabled, the house cleaning operation will be run in a separate process.
– Write House Cleaning Output to Separate Log File: If enabled, all output from the house cleaning will be placed into a separate log file.
– House Cleaning Process Timeout: If running the house cleaning in a separate process, this is the maximum amount of time the process can take before it is aborted.
• House Cleaning Maximum Per Session
– Maximum Deleted Jobs: The maximum number of deleted jobs that can be purged per session.
– Maximum Archived Jobs: The maximum number of jobs that can be archived per session.
– Maximum Auxiliary Folders: The maximum number of job auxiliary folders that can be deleted per session.
– Maximum Job Reports: The maximum number of job report files that can be deleted per session.

Repository Repair

• Repository Repair Interval: The maximum amount of time between Repository Repair operations in seconds.
• Run Repository Repair in a Separate Process: If enabled, the repository repair operation will be run in a separate process.
– Write Repository Repair Output to Separate Log File: If enabled, all output from the repository repair will be placed into a separate log file.
– Repository Repair Process Timeout: If running the repository repair in a separate process, this is the maximum amount of time the process can take before it is aborted.
• Automatic Primary Election: If enabled, the Repository Repair operation will elect another running Pulse/Balancer instance as the Primary if the current Primary instance is no longer running.


5.1.11 Auto Configuration

This allows you to configure your Slaves from a single location. When a Slave starts up, it will automatically pull this configuration from Pulse and apply it before fully initializing. See the Auto Configuration documentation for more information.


5.1.12 User Security

• Super User Password: The password needed to access Super User Mode in the Monitor. Leave blank for no password.
• Enhanced User Security: When using the System User for the Deadline User, the only way to switch Deadline users is to log off the system and log back in as someone else. This helps improve Deadline's user security, as it prevents users from impersonating others to modify their jobs.
• Use The System User For The Deadline User: Enables Enhanced User Security.
• Rendering Jobs As User: By default, the rendering process will run under the same user account that the Slave is running as. If Render Jobs As User is enabled, the rendering process will run under the user account associated with the user that submitted the job.

Each Deadline user must have their Render Jobs As User settings configured properly for this to work. On Windows, the user's Run As Name, Domain, and Password settings are used to start the rendering process as that user. On Linux and Mac OS X, only the user's Run As Name setting will be used with 'sudo' to start the rendering process as that user. Note that on Linux and Mac OS X, the Slave must be running as root for this to work properly.
• Render Jobs As User: Enables Rendering Jobs As User.


5.1.13 Job Settings

Job Scheduling

Scheduling Order
• Job Scheduling Order: The order of priority that Deadline uses to schedule jobs. See the Job Scheduling documentation for more details.
• Priority Weight: Weight given to job priority when using a Weighted scheduling order.
• Submission Time Weight: Weight given to job submission time when using a Weighted scheduling order.
• Error Weight: Weight given to the number of errors a job has when using a Weighted scheduling order.
• Rendering Task Weight: Weight given to the number of rendering tasks a job has when using a Weighted scheduling order.
• Rendering Task Buffer: A buffer that is used by slaves to give their job extra priority on the farm.
A simplified sketch of how these weights might combine is shown after the Submission Limitations list below.
Submission Limitations
• Task Limit For Jobs: The maximum number of tasks a job can have. Note that this does not impose a frame limit, so you can always increase the number of frames per task to stay below this limit.
• Maximum Job Priority: The maximum priority value a job can have.
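The sketch below only illustrates how a Weighted scheduling order could combine these weights into a single score per job; it is not the exact formula Deadline uses internally, and the field names and weight values are hypothetical.

    import time

    def weighted_job_score(job, weights):
        # Hypothetical combination: each factor is scaled by its configured weight.
        # An error weight would typically be chosen so that errors lower the score.
        age_hours = (time.time() - job["submitted"]) / 3600.0
        return (job["priority"]        * weights["priority"]
              + age_hours              * weights["submission_time"]
              + job["error_count"]     * weights["error"]
              + job["rendering_tasks"] * weights["rendering_task"])

    weights = {"priority": 100, "submission_time": 1, "error": -10, "rendering_task": -5}
    job = {"priority": 50, "submitted": time.time() - 7200, "error_count": 3, "rendering_tasks": 4}
    print(weighted_job_score(job, weights))  # a higher score would be scheduled sooner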


Automatic Job Timeout
Configure Deadline to automatically determine a timeout for a job based on the render times of tasks that have already completed. If a task goes longer than that timeout, a timeout error will occur and the task will be requeued. A short sketch of the calculation follows the list below.
• Minimum number of completed tasks required before calculating a timeout: The minimum number of tasks that must be completed before Auto Job Timeout checking occurs.
• Minimum percent of completed tasks required before calculating a timeout: The minimum percent of tasks that must be completed before Auto Job Timeout checking occurs.
• Enforce an automatic job timeout for all jobs: If enabled, the Auto Job Timeout will be enabled for all jobs, overriding the per-job setting.
• Timeout Multiplier: To calculate the Auto Job Timeout, the longest render time of the completed tasks is multiplied by this value to determine the timeout time.
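Putting those rules together, a minimal sketch of the calculation (with hypothetical threshold values and task times in seconds) looks like this:

    def auto_job_timeout(completed_task_times, total_tasks,
                         min_completed=5, min_percent=10.0, multiplier=3.0):
        # No timeout is calculated until enough tasks have completed.
        completed = len(completed_task_times)
        if completed < min_completed:
            return None
        if (completed * 100.0 / total_tasks) < min_percent:
            return None
        # Longest completed render time multiplied by the Timeout Multiplier.
        return max(completed_task_times) * multiplier

    print(auto_job_timeout([120, 95, 160, 140, 110], total_tasks=40))  # 480.0 seconds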

Failure Detection

Job Failure Detection
Send warnings and fail jobs or tasks if they generate too many errors.
• Send a warning to the job's user after it has generated this many errors: A warning will be sent to the job's notification list once its error count has reached this value. By default, the submitting user is automatically added to this list.
• Mark a job as failed after it has generated this many errors: The number of errors a job must throw before it is marked as failed.
• Mark a task as failed after it has generated this many errors: The number of errors a task must throw before it is marked as failed.
• Automatically delete corrupted jobs from the Repository: If enabled, a job that is found to be corrupted will be automatically removed from the Repository.
• Maximum Number of Job Error Reports Allowed: This is the maximum number of error reports each job can generate. Once a job generates this many errors, it will fail and cannot be resumed until some of its error reports are deleted or this value is increased.
Slave Failure Detection
Send warnings and prevent Slaves from reattempting jobs that keep generating errors.
• Send a warning after a Slave has generated this many errors for a job in a row: The maximum number of errors that can occur before email warnings are sent to the users specified in the Email Notification section.
• Mark a Slave as bad after it has generated this many errors for a job in a row: If a Slave hits this many errors, it will be marked as bad for its current job.
• Frequency at which a slave will attempt a job that it has been marked bad for: The percentage of time a Slave will attempt a task it has been marked bad for if no good jobs are available.


Cleanup

Automatic Job Cleanup
• Cleanup Jobs After This Many Days: If enabled, this is the number of days to wait before cleaning up unarchived jobs.
• Cleanup Mode: Whether the cleanup should archive the jobs found or delete them.
• You can also set the number of hours since the job was last modified before cleaning it up.
Deleted Job Purging
• Set the number of hours after a job has been deleted before it is purged from the database.

Auxiliary Files

Many jobs have an option to submit the scene file and other auxiliary files with the job. This can be useful because it stores a copy of the scene file with the job that can be referred to later. However, if these files are large and the Repository server isn't designed to handle this load, it can seriously impact the Repository machine's performance. This problem can be avoided by storing these files in a location on a different server that is designed to handle the load.
• Store job auxiliary files in a different location: If enabled, job auxiliary files submitted to Deadline will be stored at the specified location instead of the Repository.


Extra Properties

Extra arbitrary properties can be submitted with a job, and these properties can be given user friendly names so that they can easily be identified and used to filter and sort jobs in the Monitor.


5.1.14 Application Logging

Application Log Cleanup
• Delete Monitor logs after this many days: The number of days before a Monitor log will be deleted.
• Delete Slave logs after this many days: The number of days before a Slave log will be deleted.
• Delete Pulse logs after this many days: The number of days before a Pulse log will be deleted.
• Delete Balancer logs after this many days: The number of days before a Balancer log will be deleted.
• Delete Launcher logs after this many days: The number of days before a Launcher log will be deleted.
History Entries
• Maximum Number of Repository History Entries: The maximum number of repository history entries that are stored before old entries are overwritten.
• Maximum Number of Job History Entries: The maximum number of job history entries that are stored before old entries are overwritten.
• Maximum Number of Slave History Entries: The maximum number of slave history entries that are stored before old entries are overwritten.
• Maximum Number of Pulse History Entries: The maximum number of pulse history entries that are stored before old entries are overwritten.
• Maximum Number of Balancer History Entries: The maximum number of balancer history entries that are stored before old entries are overwritten.
Logging Verbosity
• Slave Verbose Logging: If enabled, more information will be written to the Slave log while it is running.
• Pulse Verbose Logging: If enabled, more information will be written to the Pulse log while it is running.
• Balancer Verbose Logging: If enabled, more information will be written to the Balancer log while it is running.

5.1.15 Statistics Gathering

Configure Deadline to keep track of job and farm statistics. Note that Pulse must be running to gather Slave and Repository statistics. Job statistics will be gathered regardless of whether Pulse is running.
• Enable Statistics Gathering: If enabled, Deadline will gather statistical information.
• Slave Statistics Gathering Interval (in minutes): The amount of time between polling Slaves for statistical information.
• Repository Statistics Gathering Interval (in minutes): The amount of time between polling the Repository for statistical information.
• Delete Job Statistics After This Many Days: The number of days from generation that job statistics will be kept before they are deleted.
• Delete Slave Statistics After This Many Days: The number of days from generation that Slave statistics will be kept before they are deleted.
• Delete Repository Statistics After This Many Days: The number of days from generation that Repository statistics will be kept before they are deleted.

5.1.16 Mapped Paths

Paths to be mapped before rendering (based on Operating System). You may add, remove, or edit paths, as well as modify the order in which they will be mapped. See the Cross Platform Rendering section for more details.
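Conceptually, each mapped path entry pairs up equivalent path prefixes per operating system, and paths are rewritten for the platform that is rendering. The sketch below only illustrates that idea; the paths, the mapping table, and the substitution logic are made up for the example, and the real rules are the ones configured here and applied by Deadline as described in the Cross Platform Rendering section.

    # Hypothetical mapping table: one entry per mapped path, keyed by OS.
    MAPPED_PATHS = [
        {"Windows": "Z:\\projects\\", "Linux": "/mnt/projects/", "Mac": "/Volumes/projects/"},
    ]

    def map_path(path, target_os):
        for entry in MAPPED_PATHS:
            for prefix in entry.values():
                if path.startswith(prefix):
                    mapped = entry[target_os] + path[len(prefix):]
                    # Normalize separators for the target platform.
                    sep = "\\" if target_os == "Windows" else "/"
                    other = "/" if sep == "\\" else "\\"
                    return mapped.replace(other, sep)
        return path

    print(map_path("Z:\\projects\\show\\shot010\\scene.ma", "Linux"))
    # /mnt/projects/show/shot010/scene.ma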


5.1.17 Mapped Drives

Drives to be mapped before rendering (Windows only).
• Drive: The drive to be mapped.
• Remote Path: The remote path for the drive.
• Only Map If Unmapped: Enable to only map the drive if it is unmapped. Disabled by default.
• Requires Authentication: (Optional) Enable if the drive requires authentication. If unchecked, the existing logged in user account credentials will be used.
• Username: Username. Must not be blank.
• Password: Password. Must not be blank.
Note that drives can be mapped when running as a service. Beware that if a user is logged in and has mapped drives set up for them, the Deadline Slave service won't see them because it runs in a different environment. However, if the drives are mapped in the service's environment (which is what the Slave is doing), then they will work fine. The following setting can help avoid this situation.
• Only map drives when the Slave is running as a service: If checked, the Slave will only map the drives if it's running as a service. If unchecked, it will also do it when the Slave is running as a normal application.


5.1.18 Script Menus

There are many scripts that ship with Deadline, and it’s more than likely that you don’t need to use them all, especially the submission scripts. Here, you can configure the contents of the individual script menus to only display what you use. You can also set icons and keyboard shortcuts for your script menu items. If a script menu item has the same shortcut as an existing menu item, the script menu item’s shortcut will take precedence. Note though that these settings will affect all Monitors that connect to this Repository.


5.1.19 Python Settings

A list of additional paths to be added to the Python search paths. Specify one path per line, and use the Add Path button to browse for paths.
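Conceptually, adding a path here is equivalent to extending the interpreter's module search path so that Deadline's Python scripts can import shared studio modules. The path and module name below are hypothetical:

    import sys

    # Roughly equivalent in effect to listing this path in the Python Settings.
    sys.path.append("/studio/tools/python")

    # A Deadline script could then import a shared module, for example:
    # import studio_naming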


5.1.20 Wake On Lan Settings

Deadline’s Power Management uses Wake On Lan to wake up machines, and you can configure which port(s) the WOL packet is sent over. If no ports are listed here, Deadline will use port 9 by default.
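For reference, a standard Wake On Lan magic packet is 6 bytes of 0xFF followed by the target MAC address repeated 16 times, broadcast over UDP (port 9 unless configured otherwise). The sketch below is a generic illustration of that packet format, not Deadline's own implementation, and the MAC address is hypothetical:

    import socket

    def send_wake_on_lan(mac_address, port=9):
        # Build the magic packet and broadcast it over UDP.
        mac_bytes = bytes.fromhex(mac_address.replace(":", "").replace("-", ""))
        packet = b"\xff" * 6 + mac_bytes * 16
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            sock.sendto(packet, ("<broadcast>", port))

    send_wake_on_lan("00:11:22:33:44:55")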


5.2 User Management

5.2.1 Overview

Deadline has its own user system, which is primarily used to tie users to Jobs. By default, users cannot control or modify the settings of another User’s Jobs. Each user can configure their own user settings from the Monitor by selecting Tools -> Options. See the Monitor and User Settings documentation for more information on the available user settings.

5.2.2 Managing Users

Administrators can manage all users from the Monitor. This is done by selecting Tools -> Manage Users in Super User mode, or as a user with appropriate User Group privileges. From here, you can add or remove individual users, and edit their user settings. See the Monitor and User Settings documentation for more information on the available user settings.


5.2.3 User Security

User Security settings can be configured in the Repository Configuration.


By default, Deadline does not enforce Enhanced User Security. This means that a user can switch to a different User and edit someone else's Jobs. For some pipelines, this "honor system" will work fine, but for those looking for tighter security, you should enable Enhanced User Security, so that the system user is used as the Deadline User. When this option is enabled, users will not be able to switch to another Deadline User unless they log off their system and log back in as someone else. It is also recommended that you add a Super User password if you are looking for enhanced security, as a Super User without a password would allow Users to circumvent User Job-editing restrictions, as well as any restrictions imposed on them by their User Groups (see below).

5.2.4 User Groups

User Groups allow Administrators to restrict what functionality is available to certain users, as well as make certain features accessible to others without requiring the use of the Super User mode. Deadline automatically creates an ‘Everyone’ User Group, which always contains all Users, and cannot be removed or disabled. This User Group is also populated with the default Permission Settings recommended for normal users.

Managing User Groups

The User Group Management section can be accessed as a Super User through the Tools -> Manage User Groups menu in the Monitor.


The left side of this dialog contains the list of User Groups that have already been created in the Repository. There are also controls allowing you to manipulate this list in many ways:
• Add: Will create a new User Group using the default options and feature access levels (equivalent to the default 'Everyone' group before modification).
• Remove: Will delete the selected User Group from the Repository. Note that the 'Everyone' group can never be removed, in order to guarantee that all Users will at least be part of this group.
• Clone: Will create a new User Group using the Options and Feature Access Levels of the currently selected group as defaults.
This list is visible regardless of which tab is selected, allowing you to quickly change which Group you're modifying, and ensuring you're always aware of which one is currently selected.

General Options

This tab contains basic higher-level settings for User Groups. Note that most of the features on this tab, described below, will be disabled when modifying the ‘Everyone’ group, since it is a special Group that must always be active and enabled for all Users. • Group Options


– Group Enabled: This indicates whether or not this User Group is currently active. Disabling a User Group instead of removing it altogether can be useful if you just want to temporarily disable access for a group of users without having to re-create it later. This is always true for the 'Everyone' Group.
– Group Expires: This setting will cause a Group to only be valid up to the specified Date and Time. This can be useful if you are hiring temporary staff and know in advance that you will need to revoke their access on a certain Date. This cannot be set for the 'Everyone' Group.
• Job Access Level
– Can Submit Jobs: This setting determines whether or not Users belonging to the Group can submit jobs.
– Can View Other Users' Jobs: This setting determines whether or not Users belonging to the Group can see other users' jobs.
– Can Modify Other Users' Jobs: This setting indicates whether or not Users in this Group should be allowed to modify other users' jobs (change properties, job state, etc).
• Default Monitor Layout: Here you can select a Monitor layout that was added to the Repository Configuration. This layout will act as the default for users belonging to this user group. The Priority setting is used as a tie breaker if a user is part of more than one group with a default layout. When a user selects View -> Reset Layout, it will reset to their user group's default layout instead of the normal default. Finally, if the Reset Layout On Startup setting is enabled, the Monitor will always start up with that layout when it is launched.
• Time-Restricted Access: This section allows you to set windows of time during which this Group is considered Active. This is useful if you want to set up permissions to change based on the time of day, or if you just want to lock out certain Users after hours. This cannot be enabled for the 'Everyone' Group.
• Group Members: This is where you control which Users are considered members of the currently selected Group. Users can be part of multiple Groups. All Users are always part of the 'Everyone' Group, and this cannot be changed.

Controlling Feature Access

The other tabs in the Group Management dialog are dedicated to enabling or restricting access to certain Features on a per-group basis. Each tab displays a different type of Feature, representing different aspects of the end-user experience:
• Menu Items: This tab contains all the Menu Item features, including the main menu bar, right-click menus, and toolbar items.
• Job Properties: This tab contains all of a Job's modifiable properties, and determines which ones a User will be allowed to change. Note that this only applies to Jobs a User is allowed to modify in the first place, if he is not allowed to modify other Users' Jobs (see the section above).
• Scripts: This contains all the different types of Scripts a User could run from the Monitor. This section is a little different than the others, because the actual Features are dynamically generated based on which Scripts are currently in the Repository. Note that all scripts will default to a value of 'Inherited', so make sure to revisit this screen when adding new Scripts to your Repository.
• UI Features: This tab contains all the different types of Panels that a User can spawn in the Monitor, and controls whether or not a particular User Group is allowed to spawn them.
These Features are also grouped further within each tab into logical categories, to try and make maintenance easier.


There are three possible Access Levels that you can specify for each Feature:
• Enabled: The members of this Group will have access to this particular Feature.
• Disabled: This Group is not granted access to this Feature. Note, however, that Users in this Group might be granted access to this Feature by a different Group.
• Inherited: Whether or not this Feature is 'Enabled' or 'Disabled' is deferred to the Feature's Parent Category. Its current inherited value is reflected in the coloured square next to the dropdown; Red indicates it is currently Disabled, while Green indicates it is currently Enabled. Top-level Parents in a category cannot be set to 'Inherited'.
If Users are part of multiple Groups, they will always use the least-restrictive Group for a particular Feature. In other words, a given User will have access to a Feature as long as he is part of at least one currently active Group that has access to that Feature, regardless of whether or not his other Groups typically allow it.

5.3 Slave Configuration

5.3.1 Overview

The Slaves panel allows Slaves to be controlled and modified using the right-click menu. Note that the availability of these options can vary depending on the context in which they are used, as well as the User Group Permissions that

are defined for the current user. If the Slaves panel is not visible, see the Panel Features documentation for instructions on how to create new panels in the Monitor.

5.3.2 Slave States

These are the states that a Slave can be in. They are color coded to make it clear which state the Slave is in.
• Offline (gray): The Slave application is closed.
• Idle (white): The Slave application is running, but it is not currently rendering.
• Rendering (green): The Slave application is running, and is rendering a job.
• Stalled (red): A Slave becomes stalled if it hasn't updated its state for a certain amount of time. This could be because the machine crashed, or the Slave simply didn't shut down cleanly.
• Disabled (yellow): The Slave has been disabled by an administrator. This prevents the Slave application from launching on the machine.
• License Warning: The Slave received a license error when last attempting to render. View the Job Reports to find the exact error message.
• License Problems (orange): The Slave cannot acquire a license, or its temporary license is about to expire.
If you see an orange Slave in the Slave list, it means that the Slave is having licensing problems, or that the license it is using will expire in less than 10 days. You can check the License column in the Slave list to see what the problem is.
If you see a red Slave, it means the Slave has been marked as stalled. This happens if the Slave hasn't updated its state for a certain amount of time. You can configure this amount of time in the Wait Times section of the Slave Settings in the Repository Configuration. When a Slave is marked as stalled, it usually means that the machine crashed, or that the Slave simply didn't shut down cleanly. In the latter case, you can simply mark the Slave as offline from the right-click menu.
The Slave panel's right-click menu also gives the option to delete or disable Slaves. When disabled, the Slave application will not be allowed to launch on the machine. This is useful if you are doing maintenance on a machine and you don't want the Slave accidentally starting up on it.

5.3.3 Job Candidate Filter

If a slave isn't rendering a job that you think it should be, you can use the Job Candidate Filter option in the Slave Panel's drop down menu to try and figure out why. When the option is enabled, simply click on a job in the Job Panel and the Slave Panel will be filtered to only show the slaves that can render the selected job based on the job's settings. The filtering takes the following into account:
• The job's pool and group (see the Pools and Groups documentation for more information).
• The job's whitelist/blacklist, and the whitelist/blacklist in the job's assigned limits (see the Limits and Machine Limits documentation for more information).
• If the slave has been marked bad for the job (see the Job Failure Detection documentation for more information).

5.3.4 Slave Settings

Most of the Slave settings can be configured from the Monitor while in Super User Mode (or with the proper user privileges) by right-clicking on one or more Slaves and selecting 'Modify Slave Properties'. To configure Pools and Groups, you can use the Tools menu, or you can use the Slave panel's right-click menu. See the Pools and Groups documentation for more information.


Note that the only settings here that have an actual impact on rendering are the Concurrent Tasks and CPU Affinity settings. Furthermore, the CPU Affinity feature is only supported on Windows and Linux operating systems, since OSX does not support process affinity.

General

These are some general Slave settings:
• Slave Description: A description of the selected Slave. This can be used to provide some pertinent information about the slave, such as certain system information.
• Slave Comment: A short comment regarding the Slave. This can be used to inform other users why certain changes were made to that Slave's settings, or of any known potential issues with that particular Slave.
• Normalized Render Time Multiplier: This value is used to calculate the normalized render time of Tasks. For example, a Slave that normally takes twice as long to render a Task should be assigned a multiplier of 2.
• Normalized Task Timeout Multiplier: This value is used to calculate the normalized render time of Task Timeouts. Typically, this should be the same value as above.
• Concurrent Task Limit Override: The concurrent Task Limit for the Slave. If 0, the Slave's CPU count is used as the limit.
• MAC Address Override: This is used to override the MAC Address associated with this Slave. This is useful in the event that the slave defaults to a different MAC Address than the one needed for Wake On Lan.
• Host Name/IP Address Override: Overrides the Host name/IP address for remote commands.
• Region: The Slave's region. Used for cross platform rendering. Default is 'None'. See Regions for more information.
• Exclude Jobs in the 'none' Pool: Enable this option to prevent the Slave from picking up Jobs that are assigned to the 'none' Pool.
• Exclude Jobs in the 'none' Group: Enable this option to prevent the Slave from picking up Jobs that are assigned to the 'none' Group.

Idle Detection

These settings can be used to override the global Slave Scheduling settings for the slave (if there are any). It can be used to start the slave when its machine becomes idle (based on keyboard and mouse activity), and stop the slave when its machine is in use again. Note that Idle Detection is managed by the Launcher, so it must be running for this feature to work.
• Start Slave When Machine Idle For: If enabled, the Slave will be started on the machine if it is idle. A machine is considered idle if there hasn't been any keyboard, mouse or tablet activity for the specified amount of time.
• Only Start Slave If CPU Usage Less Than: If enabled, the slave will only be launched if the machine's CPU usage is less than the specified value.
• Only Start Slave If Free Memory More Than: If enabled, the slave will only be launched if the machine has more free memory than the specified value (in Megabytes).
• Only Start Slave If These Processes Are Not Running: If enabled, the slave will only be launched if the specified processes are not running on the machine.
• Only Start If Launcher Is Not Running As These Users: If enabled, the slave will only be launched if the launcher is not running as one of the specified users.
• Stop Slave When Machine Is No Longer Idle: If enabled, the Slave will be stopped when the machine is no longer idle. A machine is considered idle if there hasn't been any keyboard, mouse or tablet activity for the specified amount of time.
• Allow Slave To Finish Its Current Task When Stopping: If enabled, the Slave application will not be closed until it finishes its current Task.

Job Dequeuing

These settings are used to determine when a Slave can dequeue Jobs.
• All Jobs: In this mode, the Slave will dequeue any job.
• Only Jobs Submitted From This Slave's Machine: In this mode, the Slave will only dequeue jobs submitted from the machine it's running on.
• Only Jobs Submitted From These Users: In this mode, the Slave will only dequeue jobs submitted by the specified users.


CPU Affinity

These settings affect the number of CPUs the Slave renders with (Windows and Linux only):
• Override CPU Affinity: Enable this option to override which CPUs the Slave and its child processes are limited to.
• Specify Number of CPUs to use: Choose this option if you just want to limit the number of CPUs used, and you aren't concerned with which specific CPUs are used.
• Select Individual CPUs: Choose this option if you want to explicitly pick which CPUs are used. This is useful if you are running multiple Slaves on the same machine and you want to give each of them their own set of CPUs.


Extra Info

Like jobs, extra arbitrary properties can also be set for slaves.


The Extra Info 0-9 properties can be renamed from the Slaves section of the Repository Configuration, and have corresponding columns in the Slave list that can be sorted on.

5.3.5 Slave Reports and History

All error reports for a Slave can be viewed in the Slave Reports panel. This panel can be opened from the View menu or from the main toolbar in the Monitor. It can also be opened from the Slave panel’s right-click menu.


You can use the Slave Report panel's right-click menu to save reports as files to send to Deadline Support. You can also delete reports from this menu. In addition to viewing Slave reports, you can also view the Slave's history. The History window can be brought up from the Slave panel's right-click menu by selecting the View Slave History option.


5.3.6 Remote Control

You can view the live log for Slaves or control them remotely from the right-click menu. See the Remote Control documentation for more information.

5.4 Pulse Configuration

5.4.1 Overview

Pulse has two sets of options that can be configured. There are the global Pulse settings in the Repository Options, which are applied to every running instance of Pulse, and there are the per-Pulse settings that can be configured from the right-click menu in the Pulse panel. Note that the availability of these options can vary depending on the context in which they are used, as well as the User Group Permissions that are defined for the current user.


If the Pulse panel is not visible, see the Panel Features documentation for instructions on how to create new panels in the Monitor.

5.4.2 Pulse States

These are the states that a Pulse can be in. They are color coded to make it clear which state the Pulse is in.
• Offline (gray): The Pulse application is closed.
• Running (white): The Pulse application is running.
• Stalled (red): Pulse becomes stalled if it hasn't updated its state for a certain amount of time. This could be because the machine crashed, or because Pulse simply didn't shut down cleanly.
If you see a red Pulse, it means the Pulse has been marked as stalled. This happens if the Pulse hasn't updated its state for a certain amount of time. You can configure the Stalled Pulse Threshold in the General Pulse settings in the Repository Options. When a Pulse is marked as stalled, it usually means that the machine crashed, or that Pulse simply didn't shut down cleanly. In the latter case, you can simply mark Pulse as offline from the right-click menu.
The Pulse panel's right-click menu also gives the option to delete Pulses.

5.4.3 Pulse Settings

As mentioned above, there are the global Pulse settings in the Repository Options, which are applied to every running instance of Pulse. However, there are also settings that can be specified for individual Pulse instances, which can be modified by right-clicking on a Pulse in the Pulse panel and selecting ‘Modify Pulse Properties’.

You can also auto-configure a Pulse instance by right-clicking on it in the Monitor and selecting ‘Auto Configure Pulse’. This will automatically make this Pulse the Primary Pulse, and set its connection settings.

General

These are some general Pulse settings:
• Host Name or IP Address: This is how the Slaves connect to Pulse. You can specify the host name or IP address of the machine that Pulse is running on.
• Override Port: If enabled, this port will be used by Pulse instead of a random port.
• This Pulse Is The Primary: If enabled, this is the Primary Pulse that the Slaves will connect to. If there is no Primary, the Slaves will not be able to connect to Pulse.
• Region: The region for Pulse. Used for path mapping when executing commands with the Pulse Web Service.

5.4.4 Pulse History

You can view a Pulse’s history by right-clicking on it in the Pulse panel and selecting the View Pulse History option.


5.4.5 Remote Control

You can view the live log for Pulse or control it remotely from the right-click menu. See the Remote Control documentation for more information.

5.4.6 Pulse Redundancy

You can run multiple instances of Pulse on separate machines as backups in case your Primary Pulse instance goes down. If the Primary Pulse goes offline or becomes stalled, Deadline's Repository Repair operation can elect another running instance of Pulse as the Primary, and the Slaves will automatically connect to the new Primary instance. To enable Pulse Redundancy, you must enable the Automatic Primary Election option in the Repository Repair settings in the Repository Options. Note that when multiple Pulse instances are running, only the Primary Pulse is used by the Slaves for Throttling. In addition, only the Primary Pulse is used to perform Housecleaning, Power Management, and Statistics Gathering. However, you can connect to any Pulse instance to use the Web Service.

5.4.7 Advanced Features

Many advanced features are built into Pulse. These features are described below.

Auto Configuration

Auto Configuration allows you to set the repository path in a single location. When a Slave starts up, it will automatically pull the repository path and other settings from Pulse before fully initializing. See the Auto Configuration documentation for more information.

Slave Throttling

Pulse supports a throttling feature, which is helpful if you’re submitting large files with your jobs. It is used to limit the number of Slaves that copy over the job and plugin files at the same time. See the Network Performance documentation for more information.

Power Management

Power Management is a system for controlling how machines start up and shut down automatically based on sets of conditions on the render farm, including job load and temperature. Power Management is built into Pulse, so Pulse must be running to use this feature. The only exception to this rule is Temperature checking. See the Power Management documentation for more information.

Statistics Gathering

While Pulse isn’t required to gather job statistics, it is required to gather the Slave and Repository statistics. See the Farm Statistics documentation for more information.

Web Service

Pulse has a web service feature built in, which you can use to get information directly from Pulse over an Internet connection. It is used by the Mobile application, and can also be used to display information in a web page. See the Pulse Web Service documentation for more information.

5.5 Balancer Configuration

5.5.1 Overview

Balancer has three sets of options that can be configured:

• Global Balancer settings in the Repository Options.
• Cloud Provider Balancer settings in the Cloud Provider Configuration dialog.
• Per-Balancer settings that can be configured from the right-click menu in the Balancer panel.

Note that the availability of these options can vary depending on the context in which they are used, as well as the User Group Permissions that are defined for the current user. If the Balancer panel is not visible, see the Panel Features documentation for instructions on how to create new panels in the Monitor.

5.5.2 Global Balancer Settings

As mentioned above, there are the global Balancer settings in the Repository Options, which are applied to every running instance of Balancer.

5.5.3 Cloud Provider Configuration

Before the Balancer can do anything, you’ll need to set up a Cloud Region. Balancer settings for each Cloud Provider can be configured in the Cloud Provider Configuration dialog. Deadline supports a number of cloud providers by default, and custom cloud plugins can be written to support other providers. See the Cloud Plugins documentation for a list of all the supported providers.


When adding a new Cloud Region you’ll have to enter all of your credentials and settings for that particular provider. You can look at the documentation for each plugin for further details about all the settings and credentials. Enabling the region will show instances in the Cloud Panel. Your credentials need to be verified before you’re able to enable the region to work with the Balancer.

Basic Configuration

The basic configuration options are:

• Enabled: Enabling the region makes it usable by the Balancer.
• Region Preference: Weighting towards the region.
• Region Budget: Total budget for the region. Governs how many instances will be started for this region.

Asset Checking

Asset Checking can be used to sync assets between the repository and the slaves. The Asset Checking options are:

• Enable Asset Checking: Enables the asset crawler for jobs with assets.
• Asset Crawler Hostname: Hostname for the Asset Crawler.
• Asset Crawler Port: Port number for the Asset Crawler.
• Asset Crawler OS: Operating system of the Asset Crawler.

The asset crawler script itself can be found in the vmx folder in the Repository, and is called AssetCrawler_Server.py.


Balancer Plugins

The Balancer uses an algorithm that’s defined in a Balancer Plugin, which can be set in the Balancer Settings section of the Repository Configuration. We’ve included a default algorithm that should be fine for most use cases, but you can write your own for your specific needs.

Group Mappings

Group Mappings are the heart of the Balancer. They tell the Balancer what kinds of instances to start for each group.

A Group Mapping is mainly composed of a group, an image, a hardware type, and a budget. The image and hardware type come from the provider. The Cost is how much of the region’s budget will be consumed by each instance.


You can also add Pools to a mapping so that instances will be started in those pools.

5.5.4 Balancer States

These are the states that a Balancer can be in. They are color coded to make it clear which state the Balancer is in.

• Offline (gray): The Balancer application is closed.
• Running (white): The Balancer application is running.
• Stalled (red): The Balancer becomes stalled if it hasn’t updated its state for a certain amount of time. This could be because the machine crashed, or that the Balancer simply didn’t shut down cleanly.

You can configure the Stalled Balancer Threshold in the General Balancer settings in the Repository Options. If the Balancer simply didn’t shut down cleanly, you can mark it as offline from the right-click menu. The Balancer panel’s right-click menu also gives the option to delete Balancers.

5.5.5 Balancer Settings

There are settings that can be specified for individual Balancer instances, which can be modified by right-clicking on a Balancer in the Balancer panel and selecting ‘Modify Balancer Properties’.


You can also auto-configure a Balancer instance by right-clicking on it in the Monitor and selecting ‘Auto Configure Balancer’. This will automatically make this Balancer the Primary Balancer.

General

These are some general Balancer settings:

• This Balancer Is The Primary: If enabled, this is the Primary Balancer.
• Region: The region for the Balancer.

5.5.6 Balancer History

You can view a Balancer’s history by right-clicking on it in the Balancer panel and selecting the View Balancer History option.


5.5.7 Remote Control

You can view the live log for Balancer or control it remotely from the right-click menu. See the Remote Control documentation for more information.

5.5.8 Balancer Redundancy

You can run multiple instances of Balancer on separate machines as backups in case your Primary Balancer instance goes down. If the Primary Balancer goes offline or becomes stalled, Deadline’s Repository Repair operation can elect another running instance of Balancer as the Primary. To enable Balancer Redundancy, you must enable the Automatic Primary Balancer Election option in the Repository Repair settings in the Repository Options.


Note that when multiple Balancer instances are running, only the Primary Balancer starts and stops virtual instances.

5.6 Job Scheduling

5.6.1 How a Job is Selected by a Slave

By default, a job is selected by a Slave based on the following properties, in this order:

1. The Pools and Groups that the Job has been submitted to.
   • A Slave will only select a Job if it has been assigned to the Pool and Group to which the Job belongs.
   • Pools are priority-based, so a Slave will favour Jobs in Pools that are higher on its priority list. This ordering can be configured on a per-Slave basis through the Manage Pools utility.
   • Groups are not priority-based, and are typically just used to ensure that Jobs render on machines with appropriate hardware and software configurations.
2. The Job’s Priority:
   • By default, a Job has a numeric Priority ranging from 0 to 100, where 0 is the lowest priority and 100 is the highest. You can adjust the maximum Job priority in the Job Settings section of the Repository Configuration.
   • Everything else being equal, the highest Priority Job will always be chosen first when a Slave is selecting its next Job.
3. The Date and Time at which the Job was submitted:
   • This is set automatically and is the timestamp of when the Job was submitted to Deadline.
   • Everything else being equal, an older Job will take priority over a newer Job when a Slave is looking for a new one.
4. The Job’s Limits and Machine Limits:
   • With Limits, if a Job has the highest priority, but requires a Limit that is maxed out, a Slave will try to select a different Job.
   • A Machine Limit is a special type of Limit that restricts the number of machines that can render that particular Job at the same time.
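To make the ordering concrete, here is a small illustrative Python sketch of how a Slave could rank the eligible jobs under the default Pool -> Priority -> Submit Date scheme. This is a conceptual sketch only, not Deadline’s actual implementation; the job dictionaries and their field names are hypothetical, and Limits and Machine Limits (which would further filter the candidates) are not modelled here.

def rank_jobs(jobs, slave_pool_order):
    # jobs: list of dicts with 'pool', 'priority', and 'submit_time' keys (hypothetical fields).
    # slave_pool_order: the Slave's assigned Pools, highest priority first.
    eligible = [j for j in jobs if j['pool'] in slave_pool_order]

    def sort_key(job):
        pool_rank = slave_pool_order.index(job['pool'])  # lower index = higher Pool priority
        # Higher numeric Priority wins, so negate it; oldest submit_time breaks ties.
        return (pool_rank, -job['priority'], job['submit_time'])

    return sorted(eligible, key=sort_key)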

5.6.2 Changing the Scheduling Order

It is possible to change the order in which Jobs are scheduled in the Job Settings section of the Repository Configuration.


The following options are available:

• First-in First-Out: Job order will be based solely on submission date, and jobs will be rendered in the order they are submitted.
• Pool, First-in First-Out: Job order will be based on the job’s pool first, with submission date being the tie-breaker.
• Pool, Priority, First-in First-Out: This is the default scheduling order. Job order will be based on the job’s pool, then priority, with submission date being the tie-breaker.
• Priority, First-in First-Out: Job order will be based on the job’s priority first, with submission date being the tie-breaker.
• Priority, Pool, First-in First-Out: Job order will be based on the job’s priority, then pool, with submission date being the tie-breaker.
• Balanced: Job order will be balanced so that each job has the same number of slaves rendering it at a time.
• Pool, Balanced: Job order will be based on the job’s pool first, with a balance being applied to jobs that are in the same pool.
• Pool, Priority, Balanced: Job order will be based on the job’s pool, then priority, with a balance being applied to jobs that have the same pool and priority.
• Priority, Balanced: Job order will be based on the job’s priority first, with a balance being applied to jobs that have the same priority.


• Priority, Pool, Balanced: Job order will be based on the job’s priority, then pool, with a balance being applied to jobs that have the same pool and priority.
• Weighted, First-in First-out: A weighted system that takes priority, submission time, number of rendering tasks, and number of job errors into account, but does not take pools into account. If two or more jobs have the same calculated weight, the submission date will act as the tie-breaker.
• Pool, Weighted, First-in First-out: A weighted system that still respects pool priority. If two or more jobs have the same calculated weight, the submission date will act as the tie-breaker.
• Weighted, Balanced: A weighted system that takes priority, submission time, number of rendering tasks, and number of job errors into account, but does not take pools into account. A balance will be applied to jobs that have the same calculated weight.
• Pool, Weighted, Balanced: A weighted system that still respects pool priority. A balance will be applied to jobs that have the same calculated weight.

Note that the Secondary Pool feature was designed for job scheduling orders that have Pool listed first, and might not work as expected otherwise. For example, if Priority is listed first, a job with lower priority that’s found during the initial Primary Pool scan will be preferred over a job with higher priority that’s found during the Secondary Pool scan. This is because the Secondary Pool scan is only performed if no jobs are found during the initial Primary Pool scan. See the Pools and Groups documentation for more information.

Weighted Scheduling

For the weighted options, you can control how much weight is applied to the job priority, submission time, number of rendering tasks, and number of errors. You can also give weight to the job that the slave is currently working on using the Rendering Task Buffer. The buffer is subtracted from the rendering task count for the current job, which pushes it higher in the queue. Deadline then sorts by this weight so that jobs with the largest weight value have the higher priority. Note that the weight values can be negative. For example, if you set a negative weight value for the number of job errors, a job with more errors will end up having a lower overall weight, so that precedence is given to other jobs in the queue.

Here is how the weight is calculated:

weight = (job.Priority * PW) + (job.Errors * EW) + ((NOW - job.SubmissionTimeSeconds) * SW) + ((job.RenderingTasks - RB) * RW)

Where:

• PW = priority weight
• EW = error weight
• SW = submission time weight
• RW = rendering task weight
• RB = rendering task buffer
• NOW = the current repository time

Note that because the job submission time is measured in seconds, it will have the greatest impact on the overall weight. Reducing the SW value can help reduce the submission time’s impact on the weight value.
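As a rough illustration of the formula above, the following Python sketch computes the weight for a single job. The weight values used as defaults here are placeholders for illustration only, not recommended settings, and time.time() simply stands in for the current repository time.

import time

def job_weight(priority, errors, submission_time_seconds, rendering_tasks,
               PW=1.0, EW=-1.0, SW=0.001, RW=1.0, RB=0):
    # NOW is the current repository time; time.time() stands in for it in this sketch.
    NOW = time.time()
    return (priority * PW) + (errors * EW) \
        + ((NOW - submission_time_seconds) * SW) \
        + ((rendering_tasks - RB) * RW)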


5.7 Pools and Groups

5.7.1 What are Pools and Groups?

Groups can be used to organize your farm based on machine configurations (e.g., specs, installed software, etc). For example, if you have several 64-bit machines with 3ds Max installed, you could assign them to groups like 3dsmax, or 3dsmax_64bit, or simply 3D. Groups have no impact on the order in which Jobs are rendered; they just help to ensure that Jobs render on machines with an appropriate hardware/software setup. If you don’t care about grouping your machines, you can simply use the default ‘none’ Group.

Pools are similar to Groups, except that they do affect the order in which Jobs are rendered. Because of this, it is encouraged to use Pools for prioritizing different shows, shots, types of Jobs, etc. If you don’t want to set up Pools on your farm, you can simply use the default ‘none’ Pool. Note that the ‘none’ Pool always has the lowest priority of all the Pools.

Jobs can be added to an optional Secondary Pool. When searching for a Job, a Slave does a first pass using the Primary Pool of the available Jobs. If the Slave doesn’t find any Jobs using the Primary Pool, it then makes a second pass using the Secondary Pool. This system can allow a Job to spread to a Secondary Pool as necessary, and it can also ease the configuration of Pools in the farm if there are lots of Pools and Slaves. An example of this is shown below.

Note that the Secondary Pool feature was designed for Job Scheduling Orders that have Pool listed first, and might not work as expected otherwise. For example, if Priority is listed first, a job with lower priority that’s found during the initial Primary Pool scan will be preferred over a job with higher priority that’s found during the Secondary Pool scan. This is because the Secondary Pool scan is only performed if no jobs are found during the initial Primary Pool scan.

5.7.2 Managing Pools and Groups

Pools and Groups can be managed from the Monitor while in Super User mode (or as a User with the proper User Group privileges). Just select ‘Manage Pools’ (or ‘Manage Groups’) from the ‘Tools’ menu, or from the Slave panel’s right-click menu. The dialogs are very similar to each other, but the nuances between the two are described below in detail. Note that if you used the Slave panel’s right-click menu to open these dialogs, they will be pre-filtered to just show the slaves that you right-clicked on. They will also show the same columns that are currently being shown in the slave list.

Group Management Dialog

From here, you can manage individual Groups, and assign them to various Slaves. It is a bit simpler than the Pool Management Dialog, which will be covered below in more detail, since it does not have to worry about the order of Groups for a given Slave.


The functions you can perform here are as follows:

• Groups: This section displays existing Groups and allows you to manipulate them, or create new ones. Your selection here will determine which Groups will be affected by the Group Operations.
  – New: This will create a new Group in the Repository, and allow you to assign the Group to different Slaves. You will be prompted for a name for the new Group. Group names cannot be changed once the Group has been created. Adding a Group with the name of a previously Deleted Group will effectively ‘re-instate’ that Group if it hasn’t been Purged yet (see below).
  – Delete: This will Delete all of the selected Groups from the Repository, and enable the option to Purge them (described below).
  – Purge Obsolete Groups on Close: This will purge any obsolete (deleted) Groups from existing Jobs and remove them from any Slaves that are currently assigned to them. They will be replaced with the Group selected in the drop down. Note that if you choose not to Purge the obsolete Groups right now, you can always return to this dialog and do it later.
• Slaves: This section displays a list of all known Slaves that have connected to your Repository. Your selection here will determine which Slaves will be affected by the Group Operations.
  – Only Show Slaves Assigned to a Selected Group: This option will filter the displayed Slaves to only include the ones that are currently assigned to at least one of the selected Groups.
• Group Operations: These operations are used to manipulate which Groups are assigned to which Slaves. They typically require a selection of one or more Groups and one or more Slaves to be active.
  – Add: This will add all of the selected Groups to all of the selected Slaves, if they aren’t already assigned.
  – Remove: This will remove all of the selected Groups from all of the selected Slaves, if applicable.
  – Copy: This will copy the Groups from the selected Slave to the clipboard.
  – Paste: This will paste the Groups that were copied using the Copy button to the selected Slaves.


  – Clear: This will clear all the Groups from all of the selected Slaves. This option does not require a Group to be selected.

Pool Management Dialog

The Pool Management dialog functions similarly to the Group Management dialog described above, but with a few added options to deal with managing Pool Ordering for individual Slaves.

The functions you can perform here are as follows. Note that a lot of these overlap with the Group Management functionality described in the previous section.

• Pools: This section displays existing Pools and allows you to manipulate them, or create new ones. Your selection here will determine which Pools will be affected by the Pool Operations described below.
  – New: This will create a new Pool in the Repository, and allow you to assign it to Slaves. You will be prompted for a name for the new Pool; note that Pool names cannot be changed once the Pool has been created. Adding a Pool with the name of a previously Deleted Pool will effectively ‘re-instate’ that Pool if it hasn’t been Purged yet (see below).
  – Delete: This will Delete all of the selected Pools from the Repository, and enable the option to Purge them (described below).
  – Purge Obsolete Pools on Close: This will purge any obsolete (deleted) Pools from existing Jobs and remove them from any Slaves that may have them in their list. They will be replaced with the Pool selected in the drop down. Note that if you choose not to Purge the obsolete Pools right now, you can always return to this dialog and do it later.
  – Priority Distribution: This graph visualizes how many Slaves have one of the selected Pools as #1 priority, #2 priority, etc. It also displays how many Slaves are not currently assigned to the selected Pools.
• Slaves: This section displays a list of all known Slaves that have connected to your Repository. Your selection here will determine which Slaves will be affected by the Pool Operations described below.


  – Only Show Slaves Assigned to a Selected Pool: This option will filter the displayed Slaves to only include the ones that are currently assigned to at least one of the selected Pools.
• Pool Operations: These operations are used to manipulate which Pools are assigned to which Slaves. They typically require a selection of one or more Pools and one or more Slaves to be active.
  – Add: This will add all of the selected Pools to all of the selected Slaves, if they aren’t already assigned.
  – Remove: This will remove all of the selected Pools from all of the selected Slaves, if applicable.
  – Promote: This will bump up the selected Pools by one position in all of the selected Slaves’ Pool lists. Any selected Slaves that are not assigned to the selected Pool(s) are unaffected.
  – Demote: This will bump down the selected Pools by one position in all of the selected Slaves’ Pool lists. Any selected Slaves that are not assigned to the selected Pool(s) are unaffected. Note that a Pool cannot be demoted to be lower than the ‘none’ Pool – the ‘none’ Pool is always assigned the lowest priority by Slaves.
  – Copy: This will copy the Pools from the selected Slave to the clipboard.
  – Paste: This will paste the Pools that were copied using the Copy button to the selected Slaves.
  – Clear: This will clear all the Pools from all of the selected Slaves. This option does not require a Pool to be selected.

Preventing Slaves from Rendering Jobs in the ‘none’ Pool or Group

In some cases, it may be useful to prevent one or more Slaves from rendering Jobs that are assigned to the ‘none’ Pool or Group. For example, you may have a single machine that you want to only render Quicktime Jobs. Normally, you could add this machine to a ‘quicktime’ Group, but if there are no Quicktime Jobs, the Slave could move on to Jobs that are in the ‘none’ Group. If you want this machine to only be available for Quicktime Jobs, you can configure it to exclude Jobs in the ‘none’ Group. The option to exclude Jobs in the ‘none’ Pool or Group can be found in the Slave Settings in the Monitor.

5.7.3 Pools and Job Scheduling

How pools affect the Job selection process is best explained through an example. Note that this example relies on a Scheduling Order where Pools are the primary determining factor of scheduling (such as the default Pool -> Priority -> Submit Date scheme).

Say we need to render Jobs for two different shows, and we’ve already created corresponding pools for each show in Deadline:

• show_a
• show_b

Now say we have 10 machines in our render farm, and we want to give each show top priority on half of it. To do this, we’d just assign the pools to our Slaves like this:

• Slaves 1-5: 1. show_a
• Slaves 6-10: 1. show_b


With this setup, if Jobs from both shows are in the queue, then Slaves 1-5 will pick up the Jobs from show_a, while Slaves 6-10 will work on Jobs from show_b. This effectively splits our farm in half, as we desired, but with this configuration Slaves 1-5 would sit idle once show_a finishes production, even if there are plenty of show_b Jobs in the queue. The reverse would also be true if show_b production slows down while show_a is still ramping up.

To accomplish this second goal of maximizing our resources, we’ll assign the Pools to our Slaves as follows:

• Slaves 1-5: 1. show_a 2. show_b
• Slaves 6-10: 1. show_b 2. show_a

Now, Slaves 1-5 will still give top priority to show_a Jobs, and Slaves 6-10 will similarly give top priority to show_b Jobs. However, if there are no show_a Jobs currently in the queue, Slaves 1-5 will start working on show_b Jobs until another show_a Job comes along. Similarly, Slaves 6-10 would start working on show_a if no show_b Jobs were available.

This concept is also extensible to any number of shows/pools; you just have to make sure to have an even Priority Distribution across your farm (the Priority Distribution graph should help with that). Here’s an example of what the Priority Distribution for a 3-show farm with 15 Slaves could look like:

• Slaves 1-5: 1. show_a 2. show_b 3. show_c
• Slaves 6-10: 1. show_b 2. show_c 3. show_a
• Slaves 11-15: 1. show_c 2. show_a 3. show_b

5.7.4 Secondary Pools and Job Scheduling

How secondary pools affect the Job selection process is best explained through an example. Note that this example relies on a Scheduling Order where Pools are the primary determining factor of scheduling (such as the default Pool -> Priority -> First-in First-out option). The Secondary Pool feature was designed for job scheduling orders that have Pool listed first, and might not work as expected otherwise.

Let’s say you have 5 pools and 10 slaves. You want each pool to have top priority on 2 machines, but then be able to spread to the rest of them if they are idle. Without using the secondary pool system, you might have something like this:

• Slaves 0-1: pool_1, pool_2, pool_3, pool_4, pool_5


• Slaves 2-3: pool_2, pool_3, pool_4, pool_5, pool_1
• Slaves 4-5: pool_3, pool_4, pool_5, pool_1, pool_2
• Slaves 6-7: pool_4, pool_5, pool_1, pool_2, pool_3
• Slaves 8-9: pool_5, pool_1, pool_2, pool_3, pool_4

This can be tricky to maintain if you have to reorganize pools or new slaves are added to the farm. The new secondary pool system can make this easier:

• Slaves 0-1: pool_1, pool_all
• Slaves 2-3: pool_2, pool_all
• Slaves 4-5: pool_3, pool_all
• Slaves 6-7: pool_4, pool_all
• Slaves 8-9: pool_5, pool_all

In this case, all jobs could have pool_all as their secondary pool, and will spread to the rest of the farm if machines become available.

5.8 Limits and Machine Limits

5.8.1 Overview

In order to support rendering applications that use floating licensing to limit the number of clients rendering at any one time, Deadline supports the ability to create Limits to manage this restriction. When creating a Limit, be sure to set the limit to the number of network licenses you have for the product. For example, if you have 20 nodes in your render farm and only 10 licenses of Nuke, you can create a Nuke Limit with a limit of 10. During rendering Deadline will ensure that no more than 10 machines are rendering Jobs associated with this Nuke Limit at any given time. Because of this, you never have to worry about licensing issues. Machine Limits function similarly, but are on a per-Job basis. Instead of limiting how many Slaves can render a group of Jobs, they limit the number of Slaves that can render one particular Job. This is useful if you want to prevent a job from potentially taking up the entire farm.

5.8.2 Job Machine Limits

Machine Limits are a per-Job option, and can be managed through the Job’s Properties window, which you can get to by right-clicking on the Job and selecting ‘Modify Job Properties’. More information on the available Machine Limit settings can be found in the Controlling Jobs documentation.


5.8.3 Limits

Limits can be managed from the Limit list in the Monitor while in Super User mode (or as a user with appropriate User Group privileges). This list shows all the Limits that are in your Repository. It also displays useful information about each Limit, such as its name, its limit, and the number of Limit stubs that are currently in use. You can access many options for the Limits (listed below) by right-clicking on them.


If the Limits panel is not visible, see the Panel Features documentation for instructions on how to create new panels in the Monitor.

New Limit

Use this option to add a new Limit to your Repository.


You can modify the following settings for the new Limit:

Name
    The name of the new Limit. Note that this setting cannot be changed once the Limit has been created.

Usage Level
    The level at which a Limit Stub will be checked out. ‘Slave’ is the default, and will require each Slave to acquire a Stub; if ‘Machine’ is selected, only a single Stub will be required for all Slaves on the same machine. Conversely, if ‘Task’ is selected, Slaves will try to acquire one Stub per concurrent Render Thread. Note that this setting cannot be changed after Limit creation.

Limit
    The maximum number of simultaneous uses that this Limit can support at any given time. What counts as a ‘use’ is based on the Usage Level (it will be either on a Machine, Slave, or Task level).

Release at Task Progress


    If enabled, Slaves will release their Limit stub when the current Task reaches the specified percentage. Note that not all Plugins report Task progress.

Whitelisted/Blacklisted Slaves
    If Slaves (or Machines, depending on the Level selected above) are on a Blacklist, they will never try to render Jobs associated with this Limit. If Slaves/Machines are on a Whitelist, then they are the only ones that will try to render Jobs associated with this Limit. Note that an empty blacklist and an empty whitelist are functionally equivalent, and have no impact on which machines the job renders on.

Slaves Excluded From Limit
    These Slaves (or Machines, depending on the Level selected above) will ignore this Limit and won’t contribute to the Limit’s stub count. This is useful if you are juggling a mix of floating and node-locked licenses, in which case your machines with node-locked licenses should be placed on this list.

Clone Limit

This option allows you to create a new Limit while using an existing Limit as a template. It will bring up a dialog very similar to the one pictured in Create Limit, with all the same options. This option is handy if you want to create a Limit that is very similar to an existing one, but with a small variation.

Modify Limit Properties

This option allows you to edit the settings for an existing Limit. All of the settings described in the New Limit section above can be changed except for the Limit’s Name and Usage Level, which cannot be changed once the Limit has been created.

Reset Limit Usage Count

Sometimes a Limit stub will get orphaned, meaning that it is counting against the Limit’s usage count, but no machines are actually using it. Deadline will eventually clean up these orphaned Limit stubs on its own. This option provides the means to delete all existing stubs immediately (whether they are orphaned or not), which will require Slaves to acquire them again.

Delete Limit

Removes an existing Limit from your Repository. Any Jobs associated with deleted Limits will still be able to render, but they will print out Warnings indicating that the Limit no longer exists.

5.8.4 Limits and Job Scheduling

Although Limits and Job Machine Limits aren’t priority-based like Pools, they do have an impact on job scheduling. Here are some examples.

Limits

• If a job is assigned to a Limit, and that Limit is currently maxed out, the job will not be picked up by any additional slaves.
• If a job is assigned to a Limit, and that Limit has a whitelist, the job will only render on the slaves in that whitelist.


• If a job is assigned to two Limits, and one of those Limits is currently maxed out, the job will not be picked up by any additional slaves. This is because a slave must be able to acquire all Limits that the job requires.
• If a job is assigned to two Limits, and one of those Limits has slave_1 on its blacklist, slave_1 will never pick up the job. This is because a slave must be able to acquire all Limits that the job requires.

Job Machine Limits

• If a job has a Machine Limit greater than 0, and that Limit is currently maxed out, the job will not be picked up by any additional slaves.
• If a job has a whitelist, the job will only render on the slaves in that whitelist.

5.9 Job Failure Detection

5.9.1 Overview

Job Failure Detection can be used to prevent problematic Jobs from wasting precious render time on the farm. There are two types of Failure Detection, which are both explained below. By default, Jobs will fail after they have accumulated 100 errors, but this can be changed in the Job Settings section of the Repository Configuration.


5.9.2 Job Failure Detection

A Job will enter the Failed state when it has accumulated the maximum permitted number of errors. Once in the Failed state, the Job will no longer be picked up by Slaves for rendering without manual intervention. Because of this, Job Failure Detection can help ensure that problematic Jobs are flagged appropriately and won’t waste precious rendering time. In the Repository Options, you can set up failure thresholds for Jobs and for individual Tasks.

If you’ve resolved the problems that were preventing the Job from rendering properly, you can right-click on it in the Monitor and select ‘Resume Failed Job’. You will then be prompted with the option to ignore or override Failure Detection for this Job going forward. Note that an Error Limit of 0 indicates that there is no limit, and the Job will never be marked as Failed by Failure Detection.

If you choose not to ignore Failure Detection, make sure to clear the Job’s errors, or a single new error will result in the Job failing again, because its error count is still over the maximum. To clear a Job’s errors, simply delete all of the Job’s Error Reports using the Job Reports Panel.


5.9.3 Slave Failure Detection

Slave Failure Detection works a little differently than Job Failure Detection. Basically, if a particular Slave reports consecutive errors for a given Job, it will add itself to the Job’s list of Bad Slaves. When a Slave has been marked as bad for a particular Job, it will not try to render that Job again until it has no other Jobs available. This helps ensure that Slaves aren’t wasting render time on Jobs that they likely aren’t able to render. If the issue preventing a Slave from rendering a particular Job properly has been resolved, you can remove it from a Job’s ‘bad’ list by navigating to the ‘Failure Detection’ section of a Job’s Properties dialog. There is also an option in this section to have your Job completely ignore Slave Failure Detection, if you wish.

5.10 Notifications

5.10.1 Overview

Deadline can be configured to notify Users when their Jobs finish, or if they have failed. In addition, Deadline can be configured to send notifications to administrators when certain events occur on the farm (e.g., when a Slave has stalled, or if a Slave is being shutdown by Power Management).

5.10.2 Email Notifications

Before Deadline can send email notifications, you need to configure the Email Notification settings in the Repository Configuration.


5.10.3 Popup Message Notifications

The popup message notification system can be used to send job notifications to users by popping up a message window on their workstations. In order to receive popup message notifications, the user needs to have the Launcher running on their workstation, and have their workstation machine name specified in their User Settings (see below).

5.10.4 Job Notifications

Users can edit their User Settings to control whether or not they receive notifications for their own Jobs.


In order to receive email notifications, the user needs to set their Email Address setting and enable the Email Notification option. Note that email notifications will only be sent if the SMTP settings in the Repository Options are set properly, as mentioned in the previous section. In order to receive popup message notifications, the user needs to have the Launcher running on their workstation, and have their workstation machine name specified in their User Settings.

5.11 Remote Control

5.11.1 Overview

Remote control features are built into the Monitor to make farm administration easier. These features allow you to connect to and control the Slave application on your render nodes, and also run remote commands on them. They also allow you to control Pulse and the Balancer (if you’re running them on your farm). If the Slaves, Pulse, or Balancer panel are not visible, see the Panel Features documentation for instructions on how to create new panels in the Monitor.


5.11.2 Connecting to the Slave Log

You can remotely connect to a Slave by double-clicking on it in the Slave panel, or by right-clicking on it and selecting Connect To Slave Log. This will bring up the Slave Log window, allowing you to see what the Slave is currently doing.

There are a few places in the Monitor where you can find the option to connect to the Slave log:

• The Slave panel right-click menu.
• The Task panel right-click menu. Note that it will only appear for rendering or completed tasks.
• The Job Report panel right-click menu.
• The Slave Report panel right-click menu.


5.11.3 Connecting to the Pulse Log

You can remotely connect to a Pulse by double-clicking on it in the Pulse panel, or by right-clicking on it and selecting Connect To Pulse Log. This will bring up the Pulse Log window, allowing you to see what the Pulse is currently doing.

5.11.4 Connecting to the Balancer Log

You can remotely connect to a Balancer by double-clicking on it in the Balancer panel, or by right-clicking on it and selecting Connect To Balancer Log. This will bring up the Balancer Log window, allowing you to see what the Balancer is currently doing.


5.11.5 Remote Controlling Slaves, Pulses, and Balancers

The Remote Control menu can be found in the Slave, Pulse, and Balancer panels’ right-click menus. Note that the availability of these options can vary depending on the context in which they are used, as well as the User Group Permissions that are defined for the current user. Remote Administration must also be enabled on the farm, which can be done in the Client Setup.

These are the options that are available in the Slave, Pulse, and Balancer Remote Control menus:

• Start Machine: Starts the machine using Wake On Lan.
• Shutdown Machine: Turns off the machine.
• Restart Machine: Restarts the machine.
• Suspend Machine: Sets the machine’s state to suspended (Windows only).
• Execute Command: Executes an arbitrary command on the machine.


When executing an arbitrary command, if you want to execute a DOS command on a Windows machine, the command must be preceded with “cmd /C”. This opens the DOS prompt, executes the command, and closes the prompt. For example:

cmd /C echo "foo" > c:\test.txt

These remote commands do not allow for user input for any command requiring a prompt. An example where this might cause confusion is with Microsoft’s xcopy command. Here, if the destination is uncertain to be a file or folder, xcopy will immediately exit as though successful instead of asking what should be done. If a command returns a non-zero exit code, the command will be interpreted as having failed.

Slave Remote Control Options

These options are only available in the Slave Remote Control menu:

• Search For Jobs: Forces the Slave to search the Repository for a job to do.
• Cancel Current Tasks: Forces the Slave to cancel its current tasks.
• Start Slave: Starts the Slave instance.
• Stop Slave: Stops the Slave instance.
• Restart Slave: Restarts the Slave instance.
• Continue Running After Current Task Completion: The Slave will continue to run after it finishes its current task.
• Stop Slave After Current Task Completion: The Slave will stop after the current task is completed.
• Restart Slave After Current Task Completion: The Slave will restart after the current task is completed.
• Shutdown Machine After Current Task Completion: The machine running the Slave will shut down after the current task is completed.
• Restart Machine After Current Task Completion: The machine running the Slave will restart after the current task is completed.
• Start All Slave Instances: Starts all the slave instances on the selected machines.
• Start New Slave Instance: Starts a new slave instance with the specified name on the selected machine.


5.11.6 Pulse Remote Control Options

These options are only available in the Pulse Remote Control menu:

• Start Pulse: Starts the Pulse instance.
• Stop Pulse: Stops the Pulse instance.
• Restart Pulse: Restarts the Pulse instance.

5.11.7 Balancer Remote Control Options

These options are only available in the Balancer Remote Control menu:

• Start Balancer: Starts the Balancer instance.
• Stop Balancer: Stops the Balancer instance.
• Restart Balancer: Restarts the Balancer instance.


5.11.8 Remote Commands Panel

The Remote Command panel shows all pending and completed remote commands that were sent from the Monitor. When sending a remote command, if this panel is not already displayed, it will be displayed automatically (assuming you have permissions to see the Remote Command panel).

You can view the results of a remote command by clicking on the command in the list. The full results will be shown in the log window below. All successful commands will start with “Connection Accepted.”

5.12 Network Performance

5.12.1 Overview

This guide is intended to help you find and fix potential bottlenecks in your Deadline render farm. If you are noticing sluggish performance when you are using Deadline, there are a few things you can do to try and improve it.

5.12.2 Adjust Monitor and Slave Settings

There are a few Monitor and Slave settings in the Repository Options that you can tweak to help improve performance, and reduce load on both the network and the database. You can also use the Auto Adjust option to figure out the best default values to use based on the number of Slaves in your farm. See the Repository Options documentation for more information.


5.12.3 Enable Throttling

Pulse supports a Throttling feature, which is helpful if you’re submitting large files with your jobs. This can be used to limit the number of Slaves that are copying over the Job files at the same time. The Throttling settings can be found in the Pulse Settings section of the Repository Options.


For example, if you have 100 Slaves, and you’re submitting 500MB scene files with your jobs, you may notice a performance hit if all 100 Slaves try to copy over the Job and Plugin files at the same time. You could set the Slave Throttle Limit to 10, so that only 10 of those Slaves will ever be copying those files at the same time. When a Slave goes on to render subsequent tasks for the same Job, it will not be affected by the throttling feature, since it already has the required files. Note that for this feature to work, you must be running Pulse.

5.12.4 Manage Job Auxiliary Files

If you are submitting your scene files with your Jobs, this can affect overall performance if the scene files are quite large. This is because whenever a Slave starts a new Job, it copies those Job files locally before rendering, including the Scene file if submitted with the Job. As mentioned in the previous section, if you have hundreds of Slaves starting a Job with a large scene file, and your Repository hardware isn’t built to handle a large load, your performance will suffer. If enabling Throttling isn’t helping, another option (which can also be used in conjunction if you wish) is to configure Deadline to store these scene files in an alternate location (like a separate, dedicated file server). This can be done by configuring the Job Auxiliary Files settings in the Repository Options.


From here, you can choose a server that’s better equipped to handle the load, which will help improve the performance and stability of your Repository machine, especially if it is also hosting your Database backend. In a mixed farm environment, you need to ensure that the paths for each operating system resolve to the same location. Otherwise, a scene file submitted with the Job on one operating system will not be visible to a Slave running on another.

5.13 Cross Platform Rendering

5.13.1 Overview

Many of the applications that Deadline supports are available for multiple operating systems, and if you have a mixed farm, you will probably run into one or more of these scenarios:

• You want to submit Jobs from one operating system and render on a different one.
• You want one or more Jobs to render on machines with different operating systems concurrently.

Both of these can be achieved, thanks to Deadline’s Path Mapping feature. While there may be other considerations to take into account, depending on the application you’re rendering with, the Path Mapping feature will do most of the work for you.


5.13.2 Mapped Path Setup

When using a mixed render farm, it is all but guaranteed that asset paths will be different on each operating system. In many cases, Deadline is aware of the paths being passed to the rendering application, so you can configure Path Mappings to swap out these paths when appropriate based on the operating system.

To add a new Path Mapping, just click the ‘Add’ button. Then, you specify the path that needs to be swapped out, along with the paths that will be swapped in based on the operating system. You can also specify a region so you can have different mappings for the same path across different regions. For best results, make sure that all paths end with their appropriate path separator (‘/’ or ‘\’). This helps avoid mangled paths that are a result of one path with a trailing separator, and one without.


Note that these swaps only work one way, so if you are swapping from PC to Linux and vice-versa, you will need two separate entries. For example, let’s say the PC machines use the path ‘\\server\share\’ for assets, while the Linux machines use the path ‘/mnt/share/’. Here is what your two entries should look like:

• Entry 1 (replaces the Linux path with the PC path on PCs):

Replace Path: /mnt/share/
Windows Path: \\server\share\
Linux Path:
Mac Path:

• Entry 2 (replaces the PC path with the Linux path on Linux):

Replace Path: \\server\share\
Windows Path:
Linux Path: /mnt/share/
Mac Path:

If you have Mac machines as well, you will need three entries. For example, if the Macs use ‘/Volumes/share/’ to access the assets from the previous example, here is what your three entries should look like:

• Entry 1 (replaces the Linux path with the PC path on PCs and the Mac path on Macs):

Replace Path: /mnt/share/
Windows Path: \\server\share\
Linux Path:
Mac Path: /Volumes/share/

• Entry 2 (replaces the PC path with the Linux path on Linux and the Mac path on Macs):

Replace Path: \\server\share\
Windows Path:
Linux Path: /mnt/share/
Mac Path: /Volumes/share/

• Entry 3 (replaces the Mac path with the PC path on PCs and the Linux path on Linux):

Replace Path: /Volumes/share/
Windows Path: \\server\share\
Linux Path: /mnt/share/
Mac Path:

By default, Deadline just uses regular string replacement to swap out the paths. In this case, Deadline takes care of the path separators (‘/’ and ‘\’) automatically. If you want more flexibility, you can configure your path mappings to use regular expressions, but note that you will need to handle the path separators manually using ‘[/\\]’ in your regular expressions.
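The following standalone Python snippet is illustrative only (it is not Deadline’s internal code); it simply shows why ‘[/\\]’ is useful, so that a regular-expression mapping matches the path regardless of which separator style appears in the scene.

import re

# Matches '\\server\share\' whether it is written with backslashes or forward slashes.
pattern = re.compile(r'[/\\][/\\]server[/\\]share[/\\]', re.IGNORECASE)
print(pattern.sub('/mnt/share/', r'\\server\share\assets\scene.ma'))
# Prints: /mnt/share/assets\scene.ma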

5.13.3 Application-Specific Considerations

For some applications, like Maya and Nuke, configuring Path Mappings is enough to allow for cross-platform rendering. For other applications, like After Effects and Cinema 4D, more setup is required. More information on how to render with these applications on mixed farms can be found in their Cross-Platform Rendering Considerations sections in the Plugins documentation.

5.13.4 Regions

Regions can be used to set up different mappings for the same path across your farm. For example, let’s say we have a local farm and a remote farm, and we want to map the path ‘/mnt/share/’ in our remote farm but not in our local farm. All we have to do is set the region of our mapping to the same region our remote slaves are in. Slaves in the region will replace ‘/mnt/share/’, but all the other slaves will use ‘/mnt/share/’ normally. We could also set up an alternate path for the slaves in our local farm. A mapping in the ‘All’ region will apply to every region. It should be noted that a region’s mapping is applied before the ‘All’ region.

CHAPTER SIX

ADVANCED FEATURES

6.1 Manual Job Submission

6.1.1 Overview

Manual job submission is useful if you want more control over the submission process. For example, if you’re writing a custom submission script, or you are integrating the submission process into an internal pipeline tool, you will probably want full control over which job settings are being set. If you are just looking to submit jobs from one of the many scripts that are shipped with Deadline, you should refer to the Submitting Jobs documentation.

6.1.2 Arbitrary Command Line Jobs

To manually submit arbitrary command line jobs, you can use the -SubmitCommandLineJob option with the Command application. The key parameters that you need to specify are:

• -executable: The executable we wish to use.
• -arguments: The arguments we wish to pass to the executable. In the arguments string, there are a few keywords that will be replaced with the appropriate text when rendering the job:
  – <STARTFRAME> will be replaced with the current start frame for each task.
  – <ENDFRAME> will be replaced by the current end frame for each task.
  – <STARTFRAME%#> will be replaced with the current start frame for each task, and will be padded with 0’s to match the length specified for #. For example, <STARTFRAME%4> will ensure a start frame padding of 4.
  – <ENDFRAME%#> will be replaced by the current end frame for each task, and will be padded with 0’s to match the length specified for #. For example, <ENDFRAME%6> will ensure an end frame padding of 6.
  – <QUOTE> will be replaced with an actual quote character (”).
• -frames: The frames we wish to render.

The following parameters can also be included, but are optional:

• -startupdirectory: The directory that the command line will start up in.
• -chunksize: The number of frames per task (defaults to 1).
• -pool: The pool we wish to submit to (defaults to none).
• -group: The group we wish to submit to (defaults to none).


• -priority: The job’s priority (defaults to 50).
• -name: The job’s name (defaults to “Untitled”).
• -department: The job’s department (defaults to “”).
• -initialstatus: Specify “Active” or “Suspended” (defaults to “Active”).
• -prop: Specify additional job properties in the form KEY=VALUE, where KEY is any of the property names that can be specified in the Job Info File.

For example, say we want to submit a job that uses 3dsmaxcmd.exe to render frames in the scene file “\\shared\path\scene.max”. We want to render frames 1 to 10, and we want an image resolution of 480x320. The command line to do this from a command prompt would look like:

3dsmaxcmd.exe -start:1 -end:10 -width:480 -height:320 "\\shared\path\scene.max"

For the job, we want a task chunk size of 2, we want to submit to the 3dsmax group, we want a priority of 50, and we want a machine limit of 5. Finally, we want to call the job “3dsmax command line job”. The command line to submit this job would look like this:

deadlinecommand.exe -SubmitCommandLineJob -executable "c:\Program Files\Autodesk\3dsmax8\3dsmaxcmd.exe" -arguments "-start:<STARTFRAME> -end:<ENDFRAME> -width:480 -height:320 \\shared\path\scene.max" -frames 1-10 -chunksize 2 -group 3dsmax -priority 50 -name "3dsmax command line job" -prop MachineLimit=5
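If you are calling this from an internal pipeline tool, a thin wrapper like the Python sketch below can build the same command line. The helper itself is hypothetical and not part of Deadline; the option names it forwards are the ones documented above, and it assumes deadlinecommand is on the PATH.

import subprocess

def submit_command_line_job(executable, arguments, frames, **options):
    # Builds a 'deadlinecommand -SubmitCommandLineJob' invocation from keyword options
    # such as chunksize=2, group="3dsmax", priority=50, name="my job".
    cmd = ["deadlinecommand", "-SubmitCommandLineJob",
           "-executable", executable,
           "-arguments", arguments,
           "-frames", frames]
    for key, value in options.items():
        cmd += ["-" + key, str(value)]
    # deadlinecommand prints the submission results, which are returned to the caller.
    return subprocess.check_output(cmd, universal_newlines=True)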

6.1.3 Maintenance Jobs

Maintenance jobs are special jobs where each task for the job will render on a different machine in your farm. This is useful for performing benchmark tests, installing new software, synchronizing files to each machine, etc. When a maintenance job is submitted, a task will automatically be created for each slave, and once a slave has finished a task, it will no longer pick up the job.

One way to submit a Maintenance job is to manually submit a job to Deadline by creating the necessary job submission files as documented below. In the job info file, you must set MaintenanceJob to True:

MaintenanceJob=True

By default, a Maintenance job will render frame 0 on every machine. To render a different frame, or a sequence of frames, you can specify the MaintenanceJobStartFrame and MaintenanceJobEndFrame options in the job info file:

MaintenanceJob=True
MaintenanceJobStartFrame=1
MaintenanceJobEndFrame=5


Another way to submit a Maintenance job is to right-click on an existing job in the Monitor and choose the Resubmit Job option. See the Resubmitting Jobs section of the Controlling Jobs documentation for more information.

6.1.4 Creating Job Submission Files

This is the method that our submission scripts use to submit jobs. This method is far more flexible, but requires more work to set up the job. It also uses the Command application to submit the job. Before the job can be submitted though, a Job Info File and a Plug-in Info File must be created. These are the first two files that should always be submitted with the job.

You can also submit additional auxiliary files with the job, such as the scene file you want to render, or any other files the job will need. Any number of auxiliary files can be specified after the job info and plugin info files. These auxiliary files are copied to the Repository, and are then copied locally to each machine during rendering. Because these files will be copied to the same folder, it is necessary that every file name be unique.

Once these files are ready to go, you can submit the job using the command line:

deadlinecommand.exe [Job Info File] [Plug-in Info File] [Auxiliary File 1] [Auxiliary File 2]
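For illustration, here is a minimal sketch of the two submission files and the corresponding command. The job info keys are taken from the list below; the plugin info keys vary from plugin to plugin, so the file names, the MyPlugin plugin name, and the SceneFile key shown here are placeholders only, not keys for any particular plugin.

job_info.job:

Plugin=MyPlugin
Name=Example Job
Frames=1-10
ChunkSize=1
Pool=none
Priority=50

plugin_info.job (plugin-specific, hypothetical key shown):

SceneFile=\\server\share\scene.ext

Submission command:

deadlinecommand.exe job_info.job plugin_info.job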

Job Info File

The Job Info File is a plain text file that uses Key/Value pairs (key=value) to define all the generic job options used to render the job. A couple of options are required, but most are optional. All jobs can use these options, regardless of the plug-in that they use. Some examples have been provided further down.

Required Options

These options must be specified in the job info file, or the job will fail to submit. The rest of the options are optional.

• Plugin=<plugin name> : Specifies the plugin to use. Must match an existing plugin in the repository.

General Options

• Frames=<1,2,3-10,20> : Specifies the frame range of the render job. See the Frame List Formatting Options in the Job Submission documentation for more information (default = 0).
• Name=<job name> : Specifies the name of the job (default = Untitled).
• UserName=<user name> : Specifies the job’s user (default = current user).
• MachineName=<machine name> : Specifies the machine the job was submitted from (default = current machine).
• Department=<department> : Specifies the department that the job belongs to. This is simply a way to group jobs together, and does not affect rendering in any way (default = blank).
• Comment=<comment> : Specifies a comment for the job (default = blank).
• Group=<group name> : Specifies the group that the job is being submitted to (default = none).
• Pool=<pool name> : Specifies the pool that the job is being submitted to (default = none).
• SecondaryPool=<pool name> : Specifies the secondary pool that the job can spread to if machines are available. If not specified, the job will not use a secondary pool.
• Priority=<0 or greater> : Specifies the priority of a job with 0 being the lowest (default = 50). The maximum priority can be configured in the Job Settings of the Repository Options, and defaults to 100.
• ChunkSize=<1 or greater> : Specifies how many frames to render per task (default = 1).


• ForceReloadPlugin= : Specifies whether or not to reload the plugin between subsequent frames of a job (default = false). This deals with memory leaks or applications that do not unload all job aspects properly.
• SynchronizeAllAuxiliaryFiles= : If enabled, all job files (as opposed to just the job info and plugin info files) will be synchronized by the Slave between tasks for this job (default = false). Note that this can add significant network overhead, and should only be used if you plan on manually editing any of the files that are being submitted with the job.
• InitialStatus= : Specifies what status the job should be in immediately after submission (default = Active).
• LimitGroups= : Specifies the limit groups that this job is a member of (default = blank).
• MachineLimit=<0 or greater> : Specifies the maximum number of machines this job can be rendered on at the same time (default = 0, which means unlimited).
• MachineLimitProgress=<0.1 or greater> : If set, the slave rendering the job will give up its current machine limit lock when the current task reaches the specified progress. If negative, this feature is disabled (default = -1.0). The usefulness of this feature is directly related to the progress reporting capabilities of the individual plugins.
• Whitelist= : Specifies which slaves are on the job’s whitelist (default = blank). If both a whitelist and a blacklist are specified, only the whitelist is used.
• Blacklist= : Specifies which slaves are on the job’s blacklist (default = blank). If both a whitelist and a blacklist are specified, only the whitelist is used.
• ConcurrentTasks=<1-16> : Specifies the maximum number of tasks that a slave can render at a time (default = 1). This is useful for script plugins that support multithreading.
• LimitTasksToNumberOfCpus= : If ConcurrentTasks is greater than 1, setting this to true will ensure that a slave will not dequeue more tasks than it has processors (default = true).
• Sequential= : Sequential rendering forces a slave to render the tasks of a job in order. If an earlier task is ever requeued, the slave won’t go back to that task until it has finished the remaining tasks in order (default = false).
• Interruptible= : Specifies if tasks for a job can be interrupted by a higher priority job during rendering (default = false).
• SuppressEvents= : If true, the job will not trigger any event plugins while in the queue (default = false).
• NetworkRoot= : Specifies the repository that the job will be submitted to. This is required if you are using more than one repository (default = current default repository for the machine from which submission is occurring).

Cleanup Options
• OnJobComplete= : Specifies what should happen to a job after it completes (default = Nothing).
• DeleteOnComplete= : Specifies whether or not the job should be automatically deleted after it completes (default = false).
• ArchiveOnComplete= : Specifies whether or not the job should be automatically archived after it completes (default = false).
• OverrideAutoJobCleanup= : If true, the job will ignore the global Job Cleanup settings and instead use its own (default = false).


• OverrideJobCleanup= : If OverrideAutoJobCleanup is true, this will determine if the job should be automatically cleaned up or not.
• JobCleanupDays= : If OverrideAutoJobCleanup and OverrideJobCleanup are both true, this is the number of days to keep the job before cleaning it up.
• OverrideJobCleanupType= : If OverrideAutoJobCleanup and OverrideJobCleanup are both true, this is the job cleanup mode.

Environment Options
• EnvironmentKeyValue#= : Specifies environment variables to set when the job renders. This option is numbered, starting with 0 (EnvironmentKeyValue0), to handle multiple environment variables. For each additional variable, just increase the number (EnvironmentKeyValue1, EnvironmentKeyValue2, etc). Note that these variables are only applied to the rendering process, so they do not persist between jobs.
• IncludeEnvironment= : If true, the submission process will automatically grab all the environment variables from the submitter’s current environment and set them in the job’s environment variables (default = false). Note that these variables are only applied to the rendering process, so they do not persist between jobs.
• UseJobEnvironmentOnly= : If true, only the job’s environment variables will be used at render time (default = false). If false, the job’s environment variables will be merged with the slave’s current environment, with the job’s variables overwriting any existing ones with the same name.
• CustomPluginDirectory= : If specified, the job will look for the plugin it needs to render in this location. If it does not exist in this location, it will fall back on the Repository plugin directory. For example, if you are rendering with a plugin called MyPlugin, and it exists in \\server\development\plugins\MyPlugin, you would set CustomPluginDirectory=\\server\development\plugins.

Failure Detection Options
• OverrideJobFailureDetection= : If true, the job will ignore the global Job Failure Detection settings and instead use its own (default = false).
• FailureDetectionJobErrors=<0 or greater> : If OverrideJobFailureDetection is true, this sets the number of errors before the job fails. If set to 0, job failure detection will be disabled.
• OverrideTaskFailureDetection= : If true, the job will ignore the global Task Failure Detection settings and instead use its own (default = false).
• FailureDetectionTaskErrors=<0 or greater> : If OverrideTaskFailureDetection is true, this sets the number of errors before a task for the job fails. If set to 0, task failure detection will be disabled.
• IgnoreBadJobDetection= : If true, slaves will never mark the job as bad for themselves. This means that they will continue to make attempts at jobs that often report errors until the job is complete, or until it fails (default = false).
• SendJobErrorWarning= : If true, the job will send warning notifications when it reaches a certain number of errors (default = false).

Timeout Options
• MinRenderTimeSeconds=<0 or greater> : Specifies the minimum time, in seconds, a slave should render a task for, otherwise an error will be reported (default = 0, which means no minimum). Note that if MinRenderTimeSeconds and MinRenderTimeMinutes are both specified, MinRenderTimeSeconds will be ignored.
• MinRenderTimeMinutes=<0 or greater> : Specifies the minimum time, in minutes, a slave should render a task for, otherwise an error will be reported (default = 0, which means no minimum). Note that if MinRenderTimeSeconds and MinRenderTimeMinutes are both specified, MinRenderTimeSeconds will be ignored.
• TaskTimeoutSeconds=<0 or greater> : Specifies the time, in seconds, a slave has to render a task before it times out (default = 0, which means unlimited). Note that if TaskTimeoutSeconds and TaskTimeoutMinutes are both specified, TaskTimeoutSeconds will be ignored.


• TaskTimeoutMinutes=<0 or greater> : Specifies the time, in minutes, a slave has to render a task before it times out (default = 0, which means unlimited). Note that if TaskTimeoutSeconds and TaskTimeoutMinutes are both specified, TaskTimeoutSeconds will be ignored.
• StartJobTimeoutSeconds=<0 or greater> : Specifies the time, in seconds, a slave has to start a render job before it times out (default = 0, which means unlimited). Note that if StartJobTimeoutSeconds and StartJobTimeoutMinutes are both specified, StartJobTimeoutSeconds will be ignored.
• StartJobTimeoutMinutes=<0 or greater> : Specifies the time, in minutes, a slave has to start a render job before it times out (default = 0, which means unlimited). Note that if StartJobTimeoutSeconds and StartJobTimeoutMinutes are both specified, StartJobTimeoutSeconds will be ignored.
• OnTaskTimeout= : Specifies what should occur if a task times out (default = Error).
• EnableAutoTimeout= : If true, a slave will automatically figure out if it has been rendering too long based on some Repository Configuration settings and the render times of previously completed tasks (default = false).
• EnableTimeoutsForScriptTasks= : If true, then the timeouts for this job will also affect its pre/post job scripts, if any are defined (default = false).

Dependency Options
• JobDependencies= : Specifies what jobs must finish before this job will resume (default = blank). These dependency jobs must be identified using their unique job ID, which is printed after the job is submitted, and can be found in the Monitor in the “Job ID” column.
• JobDependencyPercentage=<-1, or 0 to 100> : If between 0 and 100, this job will resume when all of its job dependencies have completed the specified percentage of tasks. If -1, this feature will be disabled (default = -1).
• IsFrameDependent= : Specifies whether or not the job is frame dependent (default = false).
• FrameDependencyOffsetStart=<-100000 to 100000> : If the job is frame dependent, this is the start frame offset (default = 0).
• FrameDependencyOffsetEnd=<-100000 to 100000> : If the job is frame dependent, this is the end frame offset (default = 0).
• ResumeOnCompleteDependencies= : Specifies whether or not the dependent job should resume when its dependencies are complete (default = true).
• ResumeOnDeletedDependencies= : Specifies whether or not the dependent job should resume when its dependencies have been deleted (default = false).
• ResumeOnFailedDependencies= : Specifies whether or not the dependent job should resume when its dependencies have failed (default = false).
• RequiredAssets= : Specifies what asset files must exist before this job will resume (default = blank). These asset paths must be identified using full paths, and multiple paths can be separated with commas. If using frame dependencies, you can replace padding in a sequence with ‘#’ characters, and a task for the job will only be resumed when the required assets for the task’s frame exist.
• ScriptDependencies= : Specifies what Python script files will be executed to determine if a job can resume (default = blank). These script paths must be identified using full paths, and multiple paths can be separated with commas. See the Scripting section of the documentation for more information on script dependencies.

Scheduling Options
• ScheduledType= : Specifies whether or not you want to schedule the job (default = None).


• ScheduledStartDateTime= : The date/time at which the job will run. The start date/time must match the specified format. Here’s an explanation:
– dd: The day of the month. Single-digit days must have a leading zero.
– MM: The numeric month. Single-digit months must have a leading zero.
– yyyy: The year in four digits, including the century.
– HH: The hour in a 24-hour clock. Single-digit hours must have a leading zero.
– mm: The minute. Single-digit minutes must have a leading zero.
• ScheduledDays= : If scheduling a Daily job, this is the day interval for when the job runs (default = 1).
• JobDelay= : A start time delay. If there is no ScheduledStartDateTime this delay will be applied to the submission date. The delay value is represented by the number of days, hours, minutes, and seconds, all separated by colons.

Output Options
• OutputFilename#=<fileName> : Specifies the output image filenames for each frame (default = blank). This allows the Monitor to display the “View Output Image” context menu option in the task list. There is no minimum or maximum limit to the padding length supported. A padding of 4 (####) is very common in many applications. If the filename is a full path, then the OutputDirectory# option is not needed. This option is numbered, starting with 0 (OutputFilename0), to handle multiple output file names per frame. For each additional file name, just increase the number (OutputFilename1, OutputFilename2, etc).
• OutputDirectory#= : Specifies the output image directory for the job (default = blank). This allows the Monitor to display the “Explore Output” context menu option in the job list. This option is numbered, starting with 0 (OutputDirectory0), to handle multiple output directories per frame. For each additional directory, just increase the number (OutputDirectory1, OutputDirectory2, etc). For example:

OutputDirectory0=\\fileserver\Project\Renders\OutputFolder\
OutputFilename0=o_HDP_010_BG_v01.####.exr
OutputDirectory1=\\fileserver\Project\Renders\OutputFolder\
OutputFilename1=o_HDP_010_SPEC_v01####.dpx
OutputDirectory2=\\fileserver\Project\Renders\OutputFolder\
OutputFilename2=o_HDP_010_RAW_v01_####.png

Notification Options
• NotificationTargets= : A list of users, separated by commas, who should be notified when the job is completed (default = blank).
• ClearNotificationTargets= : If enabled, all of the job’s notification targets will be removed (default = false).
• NotificationEmails= : A list of additional email addresses, separated by commas, to send job notifications to (default = blank).
• OverrideNotificationMethod= : If the job user’s notification method should be ignored (default = false).
• EmailNotification= : If overriding the job user’s notification method, whether to use email notification (default = false).
• PopupNotification= : If overriding the job user’s notification method, whether to use popup notification (default = false).
• NotificationNote= : A note to append to the notification email sent out when the job is complete (default = blank). Separate multiple lines with [EOL], for example:


This is a line[EOL]This is another line[EOL]This is the last line

Script Options
• PreJobScript= : Specifies a full path to a Python script to execute when the job initially starts rendering (default = blank).
• PostJobScript= : Specifies a full path to a Python script to execute when the job completes (default = blank).
• PreTaskScript= : Specifies a full path to a Python script to execute before each task starts rendering (default = blank).
• PostTaskScript= : Specifies a full path to a Python script to execute after each task completes (default = blank).

Tile Job Options
• TileJob= : If this job is a tile job (default = false).
• TileJobFrame= : The frame that the tile job is rendering (default = 0).
• TileJobTilesInX= : The number of tiles in X for a tile job (default = 0). This should be specified with the TileJobTilesInY option below.
• TileJobTilesInY= : The number of tiles in Y for a tile job (default = 0). This should be specified with the TileJobTilesInX option above.
• TileJobTileCount= : The number of tiles for a tile job (default = 0). This is an alternative to specifying the TileJobTilesInX and TileJobTilesInY options above.

Maintenance Job Options
• MaintenanceJob= : If this job is a maintenance job (default = false).
• MaintenanceJobStartFrame= : The first frame for the maintenance job (default = 0).
• MaintenanceJobEndFrame= : The last frame for the maintenance job (default = 0).

Extra Info Options

These are extra arbitrary properties that have corresponding columns in the Monitor that can be sorted on. There are a total of 10 Extra Info properties that can be specified.
• ExtraInfo0=
• ExtraInfo1=
• ExtraInfo2=
• ExtraInfo3=
• ExtraInfo4=
• ExtraInfo5=
• ExtraInfo6=
• ExtraInfo7=
• ExtraInfo8=
• ExtraInfo9=
These are additional arbitrary properties. There is no limit on the number that can be specified, but they do not have corresponding columns in the Monitor.


• ExtraInfoKeyValue0=
• ExtraInfoKeyValue1=
• ExtraInfoKeyValue2=
• ExtraInfoKeyValue3=
• ...

Job Info File Examples

3ds Max Job Info File:

Plugin=3dsmax
ForceReloadPlugin=false
Frames=0-400
Priority=50
Pool=3dsmax
Name=IslandWaveScene_lighted01
Comment=Testing
OutputDirectory0=\\fileserver\Renders\OutputFolder\
OutputFilename0=islandWaveBreak_Std####.png

Lightwave Job Info File:

Plugin=Lightwave
Frames=1-10,21-30
ChunkSize=10
Priority=99
Pool=LightwavePool
Group=NiceShot
Name=Lightwave Test
OutputFilename0=\\fileserver\Renders\OutputFolder\test####.tif
DeleteOnComplete=true
MachineLimit=5
SlaveTimeout=3600

Fusion Job Info File:

Plugin=Fusion
Frames=1-100
Priority=50
Group=Fusion
Name=Fusion Dependency Test
OutputFilename0=\\fileserver\Renders\OutputFolder\dfusion_test####.tif
JobDependencies=546cc87357dbb04344a5c6b5,53d27c9257dbb027b8a4ccd2
InitialStatus=Suspended
LimitGroups=DFRNode
ExtraInfo0=Regression Testing
ExtraInfoKeyValue0=TestID=344
ExtraInfoKeyValue1=DeveloperID=12

Plug-in Info File

The Plug-in Info File is a plain text file that uses Key/Value pairs (key=value) to define the plug-in specific options that are used by the individual plug-ins to render the job. Often, these options are used to build up the command line arguments that are to be passed on to the rendering application.
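As a rough sketch, a plug-in info file might look like the following. The keys shown here are placeholders only; the actual keys are defined by each plugin's script, so refer to the Application Plugins documentation for the plugin you are submitting to:

Version=2015
SceneFile=\\fileserver\Project\Scenes\shot_010.ma
OutputFilePath=\\fileserver\Project\Renders\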


The plug-ins read in the settings from the Plug-in Info File using the script functions GetPluginInfoEntry(...) and GetPluginInfoEntryWithDefault(...), which are discussed in more detail in the Plug-in Scripting documentation (Application Plugins).
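For example, a plugin script might read those values along the following lines. This is a minimal sketch: the RenderArgument callback is typical of simple plugins, and the key names are placeholders; only GetPluginInfoEntry and GetPluginInfoEntryWithDefault come from the plugin scripting API mentioned above.

def RenderArgument( self ):
    # Read a required entry; submission is expected to have included this key.
    sceneFile = self.GetPluginInfoEntry( "SceneFile" )
    # Read an optional entry, falling back to a default when it is missing.
    version = self.GetPluginInfoEntryWithDefault( "Version", "2015" )
    # Build up the command line arguments for the rendering application.
    return "-version " + version + " \"" + sceneFile + "\""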

6.2 Power Management

6.2.1 Overview

Power Management is a system that automatically controls when machines in the farm start up or shut down, based on the current conditions of the farm. It can start machines if they are required to render jobs in the farm, and it can shut down machines that are no longer needed for rendering. It can also poll an external temperature sensor using SNMP and shut down machines if the server room is too hot. Finally, it can reboot problematic machines on a regular basis.

6.2.2 Running Pulse

Power Management is built into Pulse, so Pulse must be running for Power Management to work. The only exception for this is the Thermal Shutdown feature. Redundancy for this feature has been built into the Slave applications, so if Pulse isn’t running, you’re still protected if the temperature of your server room gets too hot. See the Pulse documentation for more information about running and configuring Pulse.


6.2.3 Configuration

Power Management can be configured from the Monitor by selecting ‘Tools’ -> ‘Configure Power Management’. You will need to be in Super User mode for this, if you are not part of a User Group that has access to this feature.


Machine Groups are used by Power Management to organize Slave machines on the farm, and each group has four sections of settings that can be configured independently of each other. To add a new Machine Group, simply click the Add button in the Machine Group section.


Power Management Group Settings: • Group Name: The name with which the Power Management Group will be identified. • Group Mode: Whether this particular Group is enabled or disabled. • Slaves Not In Group: The Slaves that will not be part of this Group. • Slaves In Group: Slaves that will be part of this Group. To edit the Power Management settings within a group, simply click on the group in the Machine Groups list.

Idle Shutdown

Idle Shutdown is a system that forces Slaves to shut down after they have been idle for a certain period of time. This can be used to save on energy costs when the render farm is idle, without having to shut down machines manually. Combining this feature with Wake-On-Lan will ensure that machines in the render farm are only running when they are needed. You can split the idle time period between a Daytime period and an Evening period. This is useful because in most cases, you want most of your machines to stay on during the workday, and then shut down during the evening when there are no renders left. In addition, you can also specify exceptions to these two periods, which means (for example) you could have different idle periods for the weekend.


Idle Shutdown Settings:
• Idle Shutdown Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed as normal, but no action is actually taken.
• Number of Minutes Before Idle Slaves Will Be Shut Down: The number of minutes a Slave must be idle before it is shut down.
• Number of Slaves to Leave Running: The minimum number of Slaves to keep running at all times.
• Slave Shutdown Type: The method that will be used to shut down idle Slaves:
– Shutdown: Power off the machine using the normal shutdown method.
– Suspend: Suspends the machines instead of shutting them down. Only works for Windows Slaves.
– Run Command: Use this method to have the Slave run a command when attempting to shut down a Slave.
• Important Processes: If the Slave has any of these processes running, it will not shut down.


• Overrides: Define overrides for different days and times. Simply specify the day(s) of the week, the time period, the minimum number of Slaves, and the idle shutdown time for each override required. For example, if more machines are required to be running continuously for Friday evening and Saturday afternoon, this can be accomplished with an override. • Override Shutdown Order: Whether or not to define the order in which Slaves are shutdown. If disabled, Slaves will be shut down in alphabetical order. If enabled, use the Set Shutdown Order dialog to define the order in which you would like the Slaves to shut down.

Machine Startup

This is a system that allows powered-down machines to be started automatically when new Jobs are submitted to the render farm. Combining this feature with Idle Shutdown will ensure that machines in the render farm are only running when they are needed.

If Slave machines support it, Wake On Lan (WOL) or IPMI commands can be used to start them up after they shut down. By default, the WOL packet is sent over port 9, but you can change this in the Wake On Lan settings in the Repository Configuration. Make sure there isn’t a firewall or other security software blocking communication over the selected port(s).

WOL Packets are sent to the MAC addresses that Deadline has on file for each of the Slaves. If your Slaves have multiple Ethernet ports, the Slave may have registered the wrong MAC address, which may prevent WOL from working properly. If this is the case, you will have to manually set MAC Address overrides for the Slaves that are having this problem, which can be done through the Slave Settings dialog. Note that if machines in the group begin to be shut down due to temperature, this feature may be automatically disabled for the group to prevent machines from starting up and raising the temperature again.
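If you need to verify that WOL packets can reach a machine at all, a quick way is to send a magic packet manually. The following is a minimal Python sketch; the MAC address and broadcast address are placeholders, and port 9 matches the default mentioned above:

import socket

def send_magic_packet(mac, broadcast="255.255.255.255", port=9):
    # A WOL magic packet is 6 bytes of 0xFF followed by the MAC repeated 16 times.
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(packet, (broadcast, port))
    sock.close()

send_magic_packet("00:11:22:33:44:55")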


Machine Startup Settings:
• Machine Startup Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed, but no action is actually taken.
• Number of Slaves to Wake Up per Interval: The maximum number of machines that will be started in a given Power Management check interval. The interval itself can be configured in the Pulse section of the Repository Options.
• Wake Up Mode: This determines how the machines will be woken up. See the available Wake Up Modes below for more information.
• Override Startup Order: Whether or not to define the order in which Slaves are started up. If disabled, Slaves will be started in alphabetical order. If enabled, use the ‘Set Startup Order’ dialog to define the order.
Wake Up Modes:
• Use Wake On Lan: Wake On Lan packets will be sent to machines to wake them up.


• Run Command: This is primarily for IPMI support. If enabled, Pulse will run a given command to start Slave machines. This command will be run once for each Slave that is being woken up. A few tags can be used within the command:
– {SLAVE_NAME} is replaced with the current Slave’s hostname.
– {SLAVE_MAC} is replaced with the current Slave’s MAC address.
– {SLAVE_IP} is replaced with the current Slave’s IP address.
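For example, a Run Command entry that powers machines on over IPMI with ipmitool might look like the following. This is only a sketch: it assumes ipmitool is installed on the Pulse machine, that each Slave's BMC is reachable at a hostname derived from {SLAVE_NAME}, and that the credentials shown are placeholders:

ipmitool -I lanplus -H {SLAVE_NAME}-ipmi -U admin -P password chassis power on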

Thermal Shutdown

The Thermal Shutdown system polls temperature sensors and responds by shutting down machines if the temperature gets too high. The sensors we have used for testing are NetTherms, and APC Sensors are also known to be compatible. Note that the temperature sensors are polled over SNMP on port 161, so make sure this port is not blocked.
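If you want to confirm that a sensor is reachable before enabling Thermal Shutdown, you can query it manually with a standard SNMP tool. This is only a sketch, assuming the net-snmp command line tools are installed; substitute your sensor's actual host, community, and OID (see the Sensor Settings below):

snmpget -v 1 -c public [Sensor Hostname or IP] [Sensor OID]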

Thermal Shutdown settings:


• Thermal Shutdown Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks will be performed, but no action is actually taken. • Temperature Units: The units used to display and configure the temperatures. Note that this is separate from the units that the actual sensors use. • Thermal Sensors: The host and OID (Object Identifier) of the sensor(s) in the zone. To add a new sensor, simply click the ‘Add’ button. • Temperature Threshold: Thresholds can be added for any temperature. When a sensor reports a temperature higher than a particular threshold, the Slaves in the zone will respond accordingly. Note that higher temperature thresholds take precedence over lower ones. • Shut down Slaves if sensors are offline for this many minutes: If enabled, Slaves will shut down after a period of time in which the temperature sensor could not be reached for temperature information. • Disable Machine Startup if thermal threshold is reached: If enabled, Machine Startup for the current group will be disabled if a thermal threshold is reached. • Re-enable Machine Startup when temperature returns to: If enabled, this will re-enable Machine Startup when the temperature returns to the specified temperature. • Override Shutdown Order: Whether or not to define a custom order in which Slaves will be shutdown. If disabled, Slaves will be shut down in alphabetical order. If enabled, use the ‘Set Shutdown Order’ dialog to define the order.

Sensor Settings:
• Sensor Hostname or IP Address: The host of the temperature sensor.
• Sensor OID: The OID (Object Identifier) of the temperature sensor. The default OID is for the particular type of sensor we use.
• Sensor SNMP Community: If testing the sensor fails while ‘private’ is selected, try using ‘public’ instead.
• Sensor Reports Temperature As: Select the units that your temperature sensor uses to report the temperature.
• Sensor Timeout in Milliseconds: The timeout value for contacting the sensor.


• Test Sensor: Queries the sensor for the current temperature, and displays it. If the temperature displayed seems incorrect, make sure you have selected the correct unit of temperature above.

Machine Restart

If you have problematic machines that you need to reboot periodically, you can configure the Machine Restart feature of Power Management to restart your Slaves for you. Note that if the Slave on the machine is in the middle of rendering a Task, it will finish its current Task before the machine is restarted.

Machine Restart settings: • Machine Restart Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed as normal, but no action is actually taken. • Restart machines after Slave has been running for: The interval, in minutes, at which this group of Slaves will be restarted.


6.3 Slave Scheduling

6.3.1 Overview

You can use the Slave Scheduling feature to configure when Slave applications should be launched and shut down. Slave Scheduling is controlled by the Launcher, so the Launcher must be running on the machines for Slave Scheduling to work. If a slave is scheduled to start on a machine, a notification message will pop up for 30 seconds indicating that the slave is scheduled to start. If someone is still using the machine, they can choose to delay the start of the slave for a certain amount of time.

6.3.2 Configuration

Slave Scheduling can be configured from the Monitor by selecting ‘Tools’ -> ‘Configure Slave Scheduling’. You will need to be in Super User mode for this, if you are not part of a User Group that has access to this feature.

Machine Groups are used by Slave Scheduling to organize Slave machines on the farm, and each group can have different scheduling settings. To add a new Machine Group, simply click the Add button in the Machine Group section.


Slave Scheduling Group Settings: • Group Name: The name with which the Slave Scheduling Group will be identified. • Group Mode: Whether this particular Group is enabled or disabled. • Unassigned Slaves: The Slaves that will not be part of this Group. • Slaves In Group: Slaves that will be part of this Group. To edit the scheduling settings within a group, simply click on the group in the Machine Groups list.

Slave Scheduling

These settings are used to define the schedule for when slaves should start and stop. • Ensure Slave Is Running During Scheduled Hours: If enabled, slaves will be restarted if they are shut down during the scheduled hours. • Day of the Week: Configure which days of the week you want to set a schedule for. • Start Time: The time on the selected day that the Slave application should be launched if it is not already running. • Stop Time: The time on the selected day that the Slave application should be closed if it is running.


Idle Detection

These settings are used to launch the slave if the machine has been idle for a certain amount of time (“idle” means no keyboard or mouse input). There are also additional criteria that can be checked before launching the slave, including the machine’s current memory and CPU usage, the current logged in user, and the processes currently running on the machine. Finally, this system can stop the slave automatically when the machine is no longer idle. Note that on Linux, the Launcher uses X11 to determine if there has been any mouse or keyboard activity. If X11 is not available, Idle Detection will not work.
• Start Slave When Machine Is Idle For ___ Minutes: If enabled, the Slave will be started on the machine if it is idle. A machine is considered idle if there hasn’t been any keyboard or mouse activity for the specified amount of time.
• Stop Slave When Machine Is No Longer Idle: If enabled, the Slave will be stopped when the machine is no longer idle. A machine is considered idle if there hasn’t been any keyboard or mouse activity for the specified amount of time.
Note that Idle Detection can be overridden in the Local Slave Controls so that users can configure if their local slave should launch when the machine becomes idle.

Miscellaneous Options

These settings are applied to both Slave Scheduling and Idle Detection. • Only Start Slave If CPU Usage Less Than ___%: If enabled, the slave will only be launched if the machine’s CPU usage is less than the specified value. • Only Start Slave If Free Memory More Than ___ MB: If enabled, the slave will only be launched if the machine has more free memory than the specified value (in Megabytes). • Only Start Slave If These Processes Are Not Running: If enabled, the slave will only be launched if the specified processes are not running on the machine. • Only Start If Launcher Is Not Running As These Users: If enabled, the slave will only be launched if the launcher is not running as one of the specified users. • Allow Slaves to Finish Their Current Task When Stopping: If enabled, the Slave application will not be closed until it finishes its current Task.

6.4 Farm Statistics

6.4.1 Overview

Deadline can keep track of some basic statistics. It can keep track of all of your completed Jobs so that you can refer to them later. It stores the User that submitted the Job, when the Job was submitted, the error count, as well as some useful rendering metrics like render time, CPU usage, and memory usage. You can use all of this information to figure out if there are any Slaves that aren’t being utilized to their full potential. Statistical information is also gathered for individual slaves, including the slave’s running time, rendering time, and idle time. It also includes information about the number of tasks the slave has completed, the number of errors it has reported, and its average Memory and CPU usage. Note that some statistics can only be gathered if Pulse is running.


6.4.2 Enabling Statistics Gathering

You must first make sure Statistics Gathering has been enabled before Deadline will start logging information, which can be done in the Statistics Gathering section of the Repository Options.

Note that if Pulse is not running, only statistics for completed Jobs, User usage, and Slave Statistics will be recorded. You must run Pulse to keep track of Slave Resource Usage and overall Repository statistics. When running, Pulse will periodically gather information about Slave Resource Usage and the general state of the repository, and record it in the Database.

6.4.3 Viewing Farm Reports

To view Statistics, open the Monitor and select ‘Tools’ -> ‘View Farm Reports’. This must be done in Super User mode, unless you have the proper User Privileges to do so.


From this window, you can specify a range for which you would like to view stats, as well as which type of report(s) to generate. There are five default Reports that will always be available, but custom reports can also be created and saved for later use (see the ‘Custom Reports’ section below for more info).

Completed Job Stats

The Completed Job Stats report consists of a list of completed Jobs with detailed statistics. Pulse does not need to be running to gather these statistics.


Farm Overview

The Farm Overview report displays statistics about the Farm using graphs. The statistics displayed by this report are assembled by Pulse, and will therefore only be gathered if Pulse is running. The State Counts section displays the statistics in terms of counts.


The State Totals gives a visual representation of the statistics in terms of percentages.


Slave Resource Usage

The Slave Resource Usage report displays the statistics for each Slave on the farm with graphs to help display the statistics. The statistics displayed by this report are assembled by Pulse, and will therefore only be gathered if Pulse is running.


Slave Statistics

The Slave Statistics report displays Slave usage statistics for the farm, which are logged by Slaves as they are running. The statistics displayed by this report are generated by each individual slave at regular intervals and do not require Pulse to be running.


User Farm Time Report

The User Farm Time Report displays the farm usage statistics for each User. Pulse does not need to be running to gather these statistics.


Custom Reports

Users can create their own custom Reports to control how the gathered statistics are aggregated and presented. By doing this, users can create their own arsenal of specialized reports that help to drill down and expose potential problems with the farm. In order to create or edit Custom Reports you first need to be in Super User mode, or have the appropriate User Group Permissions to do so. If that is the case, there should be a new set of buttons below the list of Reports, providing control over Custom Reports. By clicking the ‘New’ button, you will be prompted to specify a name for your new report and select the type of statistics which this report will display.

Once you’ve done that, you’ll be brought to the Edit view for your new Report. You’ll note that this is very similar to generating a report under normal circumstances, but with the addition of several buttons that allow further customization of your Report.

Chief among these new buttons is the ‘Edit Data Columns’ button, which will allow you to select which columns are displayed, whether or not to group rows, and if so, how to aggregate all the column data (i.e., the ‘Group Ops’). For those used to working with Relational Databases, you can think of this as building a very simple SQL query.


Once you’ve specified which columns are displayed, and whether/how rows are aggregated, you can also add simple Graphs to your report. Simply click the ‘Add Graph’ button, and specify the type of graph you want along with the columns on which the graph should be based. Graphs are always based on all of the data presented in the list view, and currently cannot be based on a selection or a different data model.

Once you’re done customizing your new report, simply click the ‘OK’ button on the Farm Status Reports window, and your changes will be committed to the Database. Now, every time anyone brings up this dialog, they should be able to generate the report you’ve just created!

6.4.4 Custom Statistics

If you need to keep track of more information, we suggest writing your own tool that uses Deadline Command. Deadline Command can be used to query the repository for all sorts of information, like the current state of all the Jobs and all the Slaves. You can have it print these out in an ini file format and use any ini file parser to extract the information (Python has a module for this). This is also handy if you want to post stats to a web page, or insert entries into a separate database.
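As a rough sketch, a small Python tool along these lines might call Deadline Command and parse its ini-style output with the standard library's configparser module. The deadlinecommand argument shown here is a placeholder; run deadlinecommand -help to see the queries your version actually supports:

import configparser
import subprocess

# Ask Deadline Command for information in ini format. The argument below is
# a placeholder; check "deadlinecommand -help" for the real query names.
output = subprocess.check_output(["deadlinecommand", "-GetSlaveInfo"],
                                 universal_newlines=True)

# Parse the ini-formatted text and pull out whatever fields are of interest,
# for posting to a web page or inserting into a separate database.
parser = configparser.ConfigParser()
parser.read_string(output)
for section in parser.sections():
    print(section, dict(parser.items(section)))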

6.5 Client Configuration

6.5.1 Overview

Clients are configured using the deadline.ini file. Some settings are stored in a system deadline.ini file, and some are stored in a per-user deadline.ini file. Most of these settings are set during the Client Installation, but they can be changed afterwards by editing the deadline.ini file directly. Some of these settings can also be updated using Auto Configuration. This guide will cover the various settings, and how they can be configured.

6.5.2 DEADLINE_PATH Environment Variable

The DEADLINE_PATH environment variable is an environment variable on Windows and Linux which contains the path to Deadline’s bin directory. On OSX, it is instead a file located at /Users/Shared/Thinkbox which contains the path to Deadline’s resources directory. In either case, it is used by Deadline’s plugins and submitters in order to call Deadline Command as needed.
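For example, a custom submission script might locate the Deadline Command executable along these lines. This is a minimal sketch: the file name read on OSX and the executable name deadlinecommand are assumptions based on the description above, and Windows installs would append .exe:

import os
import sys

def get_deadline_command():
    # On Windows and Linux, DEADLINE_PATH points at Deadline's bin directory.
    deadline_bin = os.environ.get("DEADLINE_PATH", "")
    # On OSX, the path is stored in a file under /Users/Shared/Thinkbox instead
    # (the exact file name here is an assumption).
    if not deadline_bin and sys.platform == "darwin":
        with open("/Users/Shared/Thinkbox/DEADLINE_PATH") as f:
            deadline_bin = f.read().strip()
    return os.path.join(deadline_bin, "deadlinecommand")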

6.5.3 Local Slave Instance Files

Deadline supports the ability to run Multiple Slaves On One Machine. The local slave instances are represented by .ini files which are stored in the “slaves” folder in the following locations. Note that the # in the path will change based on the Deadline version number. • Windows: %PROGRAMDATA%\Thinkbox\Deadline#\slaves\ • Linux: /var/lib/Thinkbox/Deadline#/slaves/ • OSX: /Users/Shared/Thinkbox/Deadline#/slaves/ To remove local slave instances, simply delete their corresponding .ini file. Note that this does not remove the slave entries from the repository that the slaves connected to.

6.5.4 Configuration File Format

The deadline.ini file has an ini file format, so there will be a [Deadline] section followed by a number of key=value pairs that represent each setting. For example:


[Deadline]
LicenseServer=@my-server
NetworkRoot=\\\\repository\\path
LauncherListeningPort=17060
AutoConfigurationPort=17061

6.5.5 System Configuration File

The system deadline.ini file can be found in the following locations. Note that the # in the path will change based on the Deadline version number. • Windows: %PROGRAMDATA%\Thinkbox\Deadline#\deadline.ini • Linux: /var/lib/Thinkbox/Deadline#/deadline.ini • OSX: /Users/Shared/Thinkbox/Deadline#/deadline.ini The following settings can be configured in the system deadline.ini file. Note that other settings can show up in this file, but they are used internally by Deadline and are not documented here.

NetworkRoot

The NetworkRoot setting tells the Client which Repository to connect to.

NetworkRoot=\\\\repository\\path

There can also be additional NetworkRoot# settings that store previous Repository paths. These paths will be pre- populated in the drop down list when changing Repositories.

NetworkRoot0=\\\\repository\\path
NetworkRoot1=\\\\another\\repository
NetworkRoot2=\\\\test\\repository

This setting can be changed using the Change Repository option in the Launcher or the Monitor, and it can also be configured using Auto Configuration.

LicenseServer

The LicenseServer setting tells the Client where it can get a license from.

LicenseServer=@my-server

This setting can be changed using the Change License Server option in the Launcher or the Slave, and it can also be configured using Auto Configuration.

LauncherListeningPort

The LauncherListeningPort setting is the port that the Launcher listens on for Remote Control. It must be the same on all Clients.


LauncherListeningPort=17060

This setting can only be changed manually.

AutoConfigurationPort

The AutoConfigurationPort setting is the port that the Clients use when Auto Configuring themselves. It must be the same on all Clients.

AutoConfigurationPort=17061

This setting can only be changed manually.

SlaveDataRoot

The SlaveDataRoot setting tells the Slave where to copy its job files temporarily during rendering. The default location is the “slave” folder in the same folder as the per-user deadline.ini file. If this setting is left blank, the default location will be used.

SlaveDataRoot=C:\\LocalSlaveData

This setting can be configured using Auto Configuration.

MultipleSlavesEnabled

The MultipleSlavesEnabled setting indicates if multiple slaves are allowed to run on this machine or not. The default is True.

MultipleSlavesEnabled=True

This setting can only be changed manually.

RestartStalledSlave

The RestartStalledSlave setting indicates if the Launcher should try to restart the Slave on the machine if it becomes stalled. The default is True.

RestartStalledSlave=True

This setting can be changed from the Launcher menu, and it can also be configured using Auto Configuration.

LaunchPulseAtStartup

The LaunchPulseAtStartup setting controls if the Launcher should automatically launch Pulse after the launcher starts up. The default is False.

LaunchPulseAtStartup=True

This setting can only be changed manually.


LaunchBalancerAtStartup

The LaunchBalancerAtStartup setting controls if the Launcher should automatically launch the Balancer after the launcher starts up. The default is False.

LaunchBalancerAtStartup=True

This setting can only be changed manually.

AutoUpdateOverride

The AutoUpdateOverride setting can be used to override the Automatic Upgrades setting in the Repository Configuration. If left blank, then it will not override the Repository Options, which is also the default behavior if this setting isn’t specified.

AutoUpdateOverride=False

This setting can be configured using Auto Configuration.

6.5.6 Per-User Configuration File

The per-user deadline.ini file can be found in the following locations. Note that the # in the path will change based on the Deadline version number. • Windows: %LOCALAPPDATA%\Thinkbox\Deadline#\deadline.ini • Linux: ~/Thinkbox/Deadline#/deadline.ini • OSX: ~/Library/Application Support/Thinkbox/Deadline#/deadline.ini The following settings can be configured in the per-user deadline.ini file.

User

The User setting is used by the Client to know which user you are when launching the Monitor or when submitting jobs.

User=Ryan

This setting can be changed using the Change User option in the Launcher or the Monitor. To prevent users from changing who they are, see the User Management documentation.

LaunchSlaveAtStartup

The LaunchSlaveAtStartup setting controls if the Launcher should automatically launch the Slave after the launcher starts up. The default is True.

LaunchSlaveAtStartup=False

This setting can be changed from the Launcher menu, and it can also be configured using Auto Configuration.


6.6 Auto Configuration

6.6.1 Overview

Auto Configuration allows you to configure many Client settings from a single location. When the Deadline applications start up, they will automatically pull these settings, save them locally, and apply them before fully initializing. Note that Pulse must be running for the Deadline applications to pull the Repository Path setting. All the other settings are pulled directly from the Database once the applications are able to connect to it, so even if Pulse isn’t running, those settings will still be applied. To configure and run Pulse, see the Pulse documentation.

6.6.2 Rulesets

You can set up Client Configuration Rulesets from the Auto Configuration section of the Repository Configuration. If you want to configure groups of Clients differently from others, you can add multiple Rulesets. This is useful if you have more than one Repository on your network, or if you want to configure your render nodes differently than your workstations. New Rulesets can be added by pressing the Add button. You can give the Ruleset a name, and then choose a Client Filter method to control which Clients will use this Ruleset. There are currently three types of Slave Filters: • Hostname Regex: You can use regular expressions to match a Client’s host name. If your Slaves are using IPv6, this is probably the preferred method to use. Note that this is case-sensitive. For example:


– .*host.* will match hostnames containing the word ‘host’ in lower case.
– host.* will match hostnames starting with ‘host’.
– .*[Hh]ost will match hostnames ending with ‘Host’ or ‘host’.
– .* will match everything.
• IP Regex: You can use regular expressions to match a Client’s IP address. This works with both IPv4 and IPv6 addresses. For example:
– 192.168..* will match IPv4 addresses not transported inside IPv6 starting with “192.168”.
– [:fF]*192.168. should match IPv4 addresses even if they are carried over IPv6 (e.g. ”::ffff:192.168.2.128”).
– .* will match everything.
• IPv4 Match: You can specify specific IP addresses, or a range of IP addresses (by using wildcards or ranges). Note that this only works with IPv4. Do not use this for IPv6 addresses. For example:
– 192.168.0.1-150
– 192.168.0.151-255
– 192.168.*.*
– *.*.*.*
Configurations are generated starting from the top rule working down one by one. When there is a match for the requesting Client, any properties in the rule which are not marked as ‘(Inherited)’ will override a previous setting. By default, Slaves will use their local configuration for any property which is not set by a rule. Based on the example here, all clients starting with the name ‘Render-’ and ending with a whole number will use the same Repository Path and launch the Client at startup, while the ‘Default’ rule above it matches all Clients and sets their license server.


The available options are: • License Server: The license server setting. Use the format ‘@SERVER’, or if you have configured your license file to use a specific port, use ‘PORT@SERVER’. • Launch Slave At Startup: Whether or not the Slave should automatically launch when the Launcher starts up. • Auto Update Override: Whether or not launching the Client should trigger an automatic upgrade if it is available. • Restart Slave If It Stalls: If enabled, the Launcher will try to restart the Slave on the machine if it stalls. • Repository Path: This is the path to the Repository that the Slave will connect to. You can specify a different path for each operating system. • Local Data Path: The local path where the Client temporarily stores plugin and job data from the Repository during rendering. Note that this should be a local path to avoid conflicts. You can specify a different path for each operating system.


6.7 Render Environment

6.7.1 Job Environment Variables

Environment variables can be set for a job, and these variables will be applied to the rendering process’ environment. These variables can be set in the Job Properties in the Monitor, and they can be set during Manual Job Submission.

Manual Job Submission

For manual job submission, these variables can be specified in the job info file like this:

EnvironmentKeyValue0=mykey=myvalue
EnvironmentKeyValue1=anotherkey=anothervalue
EnvironmentKeyValue2=athirdkey=athirdvalue
...

There is also an IncludeEnvironment option that takes either True or False (False is the default). When IncludeEnvironment is set to True, Deadline will automatically grab all the environment variables from the submitter’s environment and set them as the job’s environment variables.

IncludeEnvironment=True

This can be used in conjunction with the EnvironmentKeyValue# options above, but note that the EnvironmentKeyValue# options will take precedence over any current environment variables with the same name. Finally, there is a UseJobEnvironmentOnly option that takes either True or False (False is the default):

UseJobEnvironmentOnly=True

The UseJobEnvironmentOnly setting controls how the job’s environment variables are applied to the rendering environment. If True, ONLY the job’s environment variables will be used. If False, the job’s environment variables will be merged with the Slave’s current environment, with the job’s variables overwriting any existing ones with the same name.

Job Rendering

At render time, the job’s environment variables are applied to the rendering process. As explained above, the job’s environment can either be merged with the Slave’s current environment, or the job’s environment can be used exclusively. Note though that if the job’s plugin defines any environment variables, those will take precedence over any job environment variables with the same name. In a job’s plugin, there are two functions available on the DeadlinePlugin object that can be used to set environment variables:
• SetProcessEnvironmentVariable( key, value ):
– This should be used in Advanced plugins only.
– Any variables set by this function are applied to all processes launched through Deadline’s plugin API.
– Note that calling SetProcessEnvironmentVariable in Simple plugins or within ManagedProcess callbacks will not affect the current process’ environment.
– When using SetProcessEnvironmentVariable in an Advanced plugin, make sure to call it outside of the ManagedProcess callbacks.


• SetEnvironmentVariable( key, value ):
– This is typically used in Simple plugins, or within ManagedProcess callbacks in Advanced plugins.
– Any variables set by this function are only applied to the process they are starting up, and they take precedence over any variables set by SetProcessEnvironmentVariable.
See the Application Plugins documentation for more information.
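As an illustration, a Simple plugin might set a render-time variable along these lines. This is a minimal sketch: the InitializeProcess callback is a common place for this kind of setup but your plugin's structure may differ, and the variable name and path are placeholders; only SetEnvironmentVariable comes from the plugin API described above.

def InitializeProcess( self ):
    # Applied only to the rendering process that this plugin launches for the
    # current task. The variable name and UNC path below are placeholders.
    self.SetEnvironmentVariable( "MY_TEXTURE_PATH", "\\\\fileserver\\textures" )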

6.7.2 Render Jobs As Job’s User

Deadline has some features that allow jobs to be rendered with the job’s user account, rather than the user account that the Slave is running as.
• On Windows, this is done by using the job’s user account credentials to start the rendering process using that account.
• On Linux and Mac OS X, the Slave must be running as root. It will then use “sudo” to start the rendering process using the job’s user account.

Enabling Render Jobs As User

To render jobs as the job’s user, you must enable Render Jobs As User in the User Security section of the Repository Options. Note that this setting affects all jobs, and requires users to ensure that their User Account Settings are configured properly (see below).


User Account Settings

The user account settings used to start the rendering process are stored in the User Settings for each user. For Linux and OSX, only the User Name is required. For Windows, the Domain and Password must also be provided for authentication.

6.8 Multiple Slaves On One Machine

6.8.1 Overview

Deadline has the ability to launch and configure an arbitrary number of Slave instances on a single machine. Each Slave instance can be given a unique name, and can be assigned its own list of Pools and Groups, which allows Slaves to work independently on separate Jobs. A single high-performance machine could potentially process multiple 3D, compositing, and simulation Jobs simultaneously. Note that the configurations for these slave instances are stored locally on the slave machine. This means that these slave instances exist independently from the repository that the slaves connect to. So if you delete a slave from the repository, the local configuration for that slave instance still exists. Conversely, if you delete a local slave instance, the slave will still have an entry in the repository. It is possible to remove both the slave from the repository and the local slave instance from the slave machine, which is covered below.

6.8.2 Licensing

In Deadline 7, all Slave instances running on a single machine will use the same license. For example, if you had 3 slave instances running on one machine, they would only use 1 license.

6.8.3 Adding and Running Slaves

There are three ways to launch new slave instances: • From the Launcher menu by selecting Launch Slave By Name -> New Slave Instance. This is disabled by default, but can be enabled in the User Group Management settings.

• From the right-click menu in the Slave list in the Monitor by selecting Remote Control -> Slave Commands -> Start New Slave Instance. By default, this is only available when in Super User Mode.

• From the command line using the -name option.

deadlineslave -name "instance-01"

Note that the name you enter is the postfix that is appended to the slave’s base name. For example, if the slave’s base name is “Render-02”, and you start a new instance on it called “instance-01”, the full name for that slave instance will be “Render-02-instance-01”. This is done so that if the slave’s machine name is changed, the full slave name will be updated accordingly. Using the same example, if the machine was renamed to “Node-05”, the slave instance will now be called “Node-05-instance-01”. Once the new Slave shows up in the Slave List in the Monitor, you can configure it like any other Slave. You might want to use Slave Settings (see Slave Configuration) to assign the different Slaves to run on separate CPUs. It might also be a good idea to assign them to different Pools and Groups, so that they can work on different types of Jobs to avoid competing for the same resource (e.g., you could have one Slave assigned to CPU intensive Jobs, while the other works on RAM intensive ones). Once the Slave has been created, you can also launch it remotely like you would any other Slave. See the Remote Control documentation for more information.


6.8.4 Removing Slaves

There are three ways to remove existing slave instances: • From the Launcher menu by selecting Launch Slave By Name -> Remove Slave Instances. This is disabled by default, but can be enabled in the User Group Management settings.

• From the right-click menu in the Slave list in the Monitor by selecting Remote Control -> Slave Commands -> Remove Slave Instance. This method gives the additional option to automatically remove the slave instance from the repository as well. By default, this is only available when in Super User Mode.

• Manually delete the .ini files that define the local slave instances on the machine that the slave runs on. See the Client Configuration documentation for more information.

6.8.5 Limiting and Disabling Multiple Slaves

By default, users do not have the ability to launch additional Slaves on their own machines (see User Group Management). However, there are some cases where you might want to completely disable the ability to run multiple slaves on the same machine. The only known situation where this might be necessary is if your render nodes all net-boot off the same installation (meaning they share the same file system). In this case, if multiple Slaves are enabled, each render node will end up trying to run a Slave instance for every other render node net-booting off the same installation.


In this scenario, you can disable the multi-slave feature by opening the system’s deadline.ini file and adding this line:

MultipleSlavesEnabled=False

The system deadline.ini file can be found in the following locations. Note that the # in the path will change based on the Deadline version number.
• Windows: %PROGRAMDATA%\Thinkbox\Deadline#\deadline.ini
• Linux: /var/lib/Thinkbox/Deadline#/deadline.ini
• OSX: /Users/Shared/Thinkbox/Deadline#/deadline.ini

6.9 Cloud Controls

6.9.1 Overview

Deadline has some built-in cloud features that allow it to connect to different cloud providers and control your instances. Currently, Amazon EC2, Microsoft Azure, Google Cloud, OpenStack, and vCenter are supported, but more providers may be added in the future. Note that Deadline only allows you to control existing instances. It does not create instances for you, except in the case where you clone an existing instance. In order to use instances for rendering, you will need to set them up first, which includes installing the Deadline Client, installing your rendering software, and setting up any licensing that is required. Permission for the Cloud Panel can be edited in the User Group Permissions form. See Controlling Feature Access.

6.9.2 Cloud Providers

Cloud providers can be configured from the Monitor by selecting Tools -> Configure Cloud Providers. By default, this option is hidden for normal users, so you may need to enter Super User Mode. This will bring up the Cloud Options window.


Adding Providers

To add a provider, click the Add button under the Cloud Region list. Choose the Cloud plugin you wish to use, and give it a region name. This is useful for providers like Amazon EC2 that have more than one region. Then click OK.

The new Cloud region will now show up in the Cloud Region list.

Configuring Providers

To configure an existing provider, select it in the Cloud Region box, which will bring up its configuration settings. These are the settings that the Monitor will use to connect to your cloud provider(s).


Every provider has an option to enable or disable it, but the other options can vary between providers. To get more information about a particular setting, just hover your mouse over the setting text, or refer to the Cloud Plugins section of the documentation.

6.9.3 Cloud Panel

The Cloud panel in the Monitor shows all the instances from the cloud providers that the Monitor is connected to. By default, this panel is hidden for normal users, so you may need to enter Super User Mode before you can open it.

If the Cloud panel is not visible, see the Panel Features documentation for instructions on how to create new panels in the Monitor.

Controlling Instances

The Cloud panel allows you to control and close your existing instances using the right-click context menu. The following options are available when you right-click on an instance:
• Start Instance: Starts an instance that is currently stopped.
• Stop Instance: Stops an instance that is currently running.
• Destroy Instance: Destroys an existing instance. Once an instance is destroyed, it can not be recovered.
• Clone Instance: Clones an existing instance. This allows you to quickly launch multiple copies of the selected instance.
• Reboot Instance: Reboots an instance that is currently running.
It should be noted that some cloud providers, like Google Compute Engine, don’t provide the ability to Start/Stop instances.

6.9.4 Cloud Plug-ins

Cloud providers are supported via the Cloud Plug-in system. This means that the existing ones can be customized, or you can write your own. See the Cloud Plugins documentation for more information on creating cloud plug-ins. Plugin data is only loaded and updated when the Cloud Panel is being displayed.

6.10 Web Service

6.10.1 Overview

Pulse has a web service feature built in, which you can use to get information directly from Pulse over an Internet connection. You can view this information with the Mobile application, or you can write custom Web Service Scripts to display this information in a manner of your choice, such as in a web page. Before you use the web service, you need to configure the Web Service settings in the Repository Configuration. Note that if you enable or disable the web service feature while Pulse is running, you must restart Pulse for the changes to take effect.


6.10.2 RESTful HTTP API

The RESTful API in Pulse can be used to request information from the database, store new data, alter existing data or remove entries from the database. Note that the original web service functionality in Pulse is still available, but is now deprecated. We won’t be removing it anytime soon, but we won’t be adding new features to it either and will instead continue to focus on the new RESTful API. See the REST Overview documentation for more information.

6.10.3 Additional Web Service Functionality

Note that this additional web service functionality is still supported, but is now deprecated in favor of the new RESTful HTTP API.

Connecting to the Web Service

When Pulse is running, you can connect to the web service using a URL containing the host name or IP address of the machine that is hosting Pulse, as well as the port, which we will assume to be 8080 for now (this can be configured in the Web Service Settings). Note that if port 8080 is being blocked by a firewall, Pulse will not be able to accept web requests. An example URL will look like the following:

http://[myhost]:8080/[command][arguments]

Where:
• myhost is your Pulse server’s IP address or host name.
• command is the command you want to execute. Pulse can support two different types of commands, which are explained below.
• arguments represents the arguments being passed to the command. This can be optional, and depends on the command.
To confirm that you can at least connect to Pulse’s web service, try the following URL.

http://[myhost]:8080/

You should see the following if you connect to Pulse successfully:

This is the Deadline Pulse web service!
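If you prefer to test from a shell, the same connectivity check can be done with a generic command line HTTP client such as curl (curl is not part of Deadline; substitute your own Pulse host name or IP address):

curl http://myhost:8080/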

If Pulse is running on Windows, you may also need to add a namespace reservation for the current user that Pulse is running under, so that it can reserve namespaces for the URL connection. See the Configuring Namespace Reservations section in this MSDN Article for more information. Note that Pulse listens on http://*:8080/, so make sure this is the URL you use when reserving the namespace. For example:

netsh http add urlacl url=http://*:8080/ user=USERNAME

Running Commands

The first set of commands are the same commands that you can use with the Command application. However, these commands are disabled by default. To enable them, you need to enable the Allow Non-Script Commands setting in the Web Service settings. If left disabled, you will see the following results when trying to call one of these commands:

Error - Non-Script commands are disabled.

Note that these commands are executed in a similar fashion as if they were called using the Command application, which means that they don’t use any of the data that is currently cached by Pulse. Because of this, there may be a delay when executing these commands. Here is an example of how you would use the web service to call the -GetSlaveNames command: http://[myhost]:8080/GetSlaveNames

Here is an example of the results that would be displayed:

Jupiter Rnd-vista Slave-29 Monkeypantswin7


Electron.franticfilms.com Test3 Monkeypants Slave-27d Proton.franticfilms.com Atom.franticfilms.com Rnd-suse Opensuse-64 Pathos Neutron.franticfilms.com

Some commands can take arguments. To include arguments, you need to place a ? between the command name and the first argument, and then a & between additional arguments. Here is an example of how you would use the web service to call the -GetSlaveNamesInPool command, and pass it two pools as arguments: http://[myhost]:8080/GetSlaveNamesInPool?show_a&show_b

Here is an example of the results that would be displayed:

Monkeypants Pathos

Calling Python Scripts

The second set of commands are actually Python scripts that you can create in the Repository. These scripts use Pulse’s Python API to get data, and then return the data in a readable manner. So basically, you can create scripts to access any type of data and display it in any way you want. See the Web Service Scripts documentation for more information on how to create these scripts. Once a script has been created, you can call it by using the name of the script, without the .py extension. For example, if you have a Pulse script called GetFarmStatistics.py, you would call it using: http://[myhost]:8080/GetFarmStatistics

Some scripts can take arguments. To include arguments, you need to place a ? between the command name and the first argument, and then a & between additional arguments. Here is an example of how you would pass arg1, arg2, and arg3 as separate arguments to the GetFarmStatistics.py script: http://[myhost]:8080/GetFarmStatistics?arg1&arg2&arg3

The way the results are displayed depends on the format in which they are returned. Again, see the Pulse Web Service Scripting documentation for more information.
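As a rough illustration only, the sketch below shows the general shape of such a script. It assumes the same __main__ entry-point convention used by Deadline's command line scripts and that the URL arguments arrive as the function's parameters; the real signature and return handling are described in the Web Service Scripts documentation, so treat this purely as a placeholder.

def __main__( *args ):
    # Hypothetical GetFarmStatistics.py body: echo the arguments passed in the URL.
    # A real script would use Pulse's Python API to gather farm data instead.
    return "Received " + str( len( args ) ) + " argument(s): " + ", ".join( args )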

6.11 Job Transferring

6.11.1 Overview

If you have multiple office locations that each have their own Deadline Repository, it is possible to transfer Jobs between them. This can be handy if one office’s farm is sitting idle while the other is completely swamped.


Note though that Deadline will only transfer over the files that are submitted with the Job, which in most cases is just the scene file. You must ensure that all assets the scene requires and all output paths that it writes to exist in the remote location before transferring the Job.

6.11.2 Setting Up a Transfer

Before you can transfer a Job, it must be in the Suspended, Completed, or Failed state. Just right-click on the Job, and select ‘Scripts’ -> ‘TransferSubmission’. A Transfer Job window will be displayed.

You’ll notice that you’re actually submitting another Job that will transfer the original Job. The general Deadline options are explained in the Job Submission documentation. The Job Transfer specific options are:
• Frame List and Frames Per Task: This is the frame list for the original Job that will be transferred. It will default to the values for the original Job, but you can change them if you only want to transfer a subset of frames.
• New Repository: This is the path to the remote Repository that the original Job will be transferred to. Note that the Slaves that the transfer Job will be running on must be able to see this path in order to transfer the original Job to the new repository.
• Compress Files During Transfer: If enabled, the original Job’s files will be compressed during the transfer.
• Suspend Remote Job After Transfer: If enabled, the original Job will be submitted in the Suspended state to the new Repository.
• Email Results After Transfer: If enabled, you will be emailed when the original Job has been successfully transferred. Note that this requires you to have your email notification options set up properly.
• Remove Local Job After Transfer: If enabled, the original Job in the local Repository will be deleted after the Job has been successfully transferred to the remote Repository.
Once you have your options set, click the Submit button to submit the transfer Job.

6.11.3 Global Transfer Options

Job Transfers are handled by a JobTransfer plugin, which has a few configurable options that affect all transfers. To change the JobTransfer plugin options, open the Monitor and select ‘Tools’ -> ‘Configure Plugins’ as a Super User, and then select the JobTransfer plugin from the list on the left.

The following options are available: • Notification Email(s): The email(s) where successful Job Transfer reports will be sent, so that sys admins can keep track of all successfully transferred Jobs. Leave blank to disable this feature. Use commas to specify more than one email address.


CHAPTER SEVEN

SCRIPTING

7.1 Scripting Overview

7.1.1 Overview

Scripts can be used to customize various aspects of Deadline, including creating custom plug-ins, submitting jobs to the farm, or automating specific tasks after a job completes. The scripting language that Deadline uses is Python 2.7, which is supported using Python for .NET. In addition to supporting native cPython modules, Python for .NET allows your scripts to make use of the .NET Libraries.

7.1.2 Custom Repository Folder

If desired, custom scripts and plugins can be placed in the ‘custom’ folder in the Repository. This folder contains subfolders for different plugins and scripts, allowing you to customize the following areas of Deadline:
• Application Plugins
• Event Plugins
• Cloud Plugins
• Monitor Scripts
• Job Scripts
• Web Service Scripts
Note that any scripts or plugins in the ‘custom’ folder will not be affected when upgrading the Repository. During the install process, the Repository installer also creates a backup of the ‘custom’ directory, together with the other Deadline directories, in the ‘../backup/[timeStamp]/custom’ and/or ‘../backup/[mostRecent]/custom’ directory. In addition, any scripts or plugins in the ‘custom’ folder will override any scripts or plugins that are shipped with Deadline if they share the same name. If you want to check out the scripts and plugins that are shipped with Deadline, you can find them in the ‘events’, ‘plugins’, and ‘scripts’ folders in the Repository. There is also an option for a job to load its Application Plug-in from another location, which can be set in the Job Properties. This can be useful when testing plugins before updating them directly in the Repository. An example layout of the ‘custom’ folder is shown below.
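For example (the plug-in and event names are placeholders used throughout this chapter), a custom application plug-in and a custom event plug-in would sit in the ‘custom’ folder like this:

<repository>\custom\plugins\MyPlugin\MyPlugin.dlinit
<repository>\custom\plugins\MyPlugin\MyPlugin.py
<repository>\custom\events\MyEvent\MyEvent.dlinit
<repository>\custom\events\MyEvent\MyEvent.py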

7.1.3 Scripting Reference

The full Deadline Scripting Reference can be downloaded in CHM or PDF format from the Deadline Downloads page. There are also many scripts and plug-ins that are shipped with Deadline, which you can use as a reference. These scripts can be found in the following folders in the Repository:


• cloud (cloud plug-ins)
• events (event plug-ins)
• plugins (application plug-ins)
• scripts (Monitor and web service scripts)

7.1.4 Running Scripts from the Command Line

To run scripts from the command line, the only requirement is that you define a __main__ function. This is the function called by the Command application when it executes the script.

def __main__( *args ):
    # Replace "pass" with code
    pass

If you save this script to a file called myscript.py, you can execute it using this command: deadlinecommand -ExecuteScript "myscript.py"

If you are running the script in a headless environment where there is no display, you should use this command instead: deadlinecommand -ExecuteScriptNoGui "myscript.py"

The only difference between these commands is that ExecuteScriptNoGui doesn’t pre-import any of the user interface modules so that it can run in a headless environment. If your script doesn’t use any user interface modules, then you can use ExecuteScriptNoGui regardless of whether or not you’re in a headless environment.
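As a trivial but complete illustration, the script below simply echoes back whatever arguments the Command application passes to __main__. Since it uses no user interface modules, either of the commands above could run it:

def __main__( *args ):
    # Print how many arguments were received, then each one on its own line.
    print( "Received " + str( len( args ) ) + " argument(s)" )
    for arg in args:
        print( arg )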

7.2 Application Plugins

7.2.1 Overview

All of Deadline’s plug-ins are written in Python, which means that it’s easy to create your own plug-ins or customize the existing ones. See the Scripting Overview documentation for more information, and links to the Deadline Scripting reference. Note that because the Python scripts for application plug-ins will be executed in a non-interactive way, it is important that your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input. When a plugin is loaded the log will show where the plugin is being loaded from.

7.2.2 General Plug-in Information

There are two types of plug-ins that can be created:
• Simple
• Advanced
Simple plug-ins provide the basics to wrap a command line application, and are typically used to build up command line arguments to pass to the application. Advanced plug-ins provide more control, and are typically used when running a simple command line application isn’t enough. Other than the plug-in Python script itself though, Simple and Advanced plug-ins are very similar.


7.2.3 Creating a New Plug-in

This section covers the areas that Simple and Advanced plug-ins have in common. Specifics for Simple and Advanced plug-ins are covered later on. To create a new plug-in, start by creating a folder in the Repository’s custom\plugins folder and give it the name of your plug-in. See the Scripting Overview documentation for more information on the ‘custom’ folder in the Repository and how it’s used. For the sake of this document, we will call our new plug-in MyPlugin. All relevant script and configuration files for this plug-in are to be placed in this plug-in’s folder (some are required and some are optional).

The dlinit File - Required

The first required file is MyPlugin.dlinit, which is the main configuration file for this plug-in. It is a plain text file that defines a few general key=value plug-in properties, which include:
• About: A short description of the plug-in.
• ConcurrentTasks: Set to True or False (default is False). If tasks for this plug-in can render concurrently without interfering with each other, this can be set to True.
• DebugLogging: Set to True or False (default is False). If set to True, then debug plug-in logging will be printed out during rendering.
• DeprecatedMode: Set to True or False (default is False). Only set to True if you want a custom Python.NET plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later. More information on DeprecatedMode can be found later on.
It can also define key=value custom settings to be used by the plug-in. A common custom setting is the executable to use to render the job. For this example, our MyPlugin.dlinit file might look like this:

About=My Example Plugin for Deadline
ConcurrentTasks=True
MyPluginRenderExecutable=c:\path\to\my\executable.exe

The py File - Required

The other required file is MyPlugin.py, which is the main plug-in script file. It defines the main DeadlinePlugin class that contains the necessary code that Deadline uses to render a job. This is where Simple and Advanced plug-ins will differ, and the specifics for each can be found later on, but the template for this script file might look like this:

from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin( DeadlinePlugin ):

    # TODO: Place code here instead of "pass"
    pass

The first thing to note is that we’re importing the Deadline.Plugins namespace so that we can access the DeadlinePlugin class. The GetDeadlinePlugin() function is important, as it allows the Slave to get an instance of our MyPlugin class (which is extending the abstract DeadlinePlugin class). In Deadline 6.2 and later, the GetDeadlinePluginWithJob( job ) function can be defined as an alternative. It works the same as GetDeadlinePlugin(), except that it accepts an instance of the Job object that the plug-in is being loaded for. If neither of these functions is defined, the Slave will report an error when it tries to render the job. The MyPlugin class will need to implement certain callbacks based on the type of plug-in it is, and these callbacks must be hooked up in the MyPlugin constructor. One callback that all plug-ins should implement is the InitializeProcess function. There are many other callbacks that can be implemented, which are covered in the Events section for the DeadlinePlugin class in the Deadline Scripting reference. The CleanupDeadlinePlugin() function is also important, as it is necessary to clean up the plug-in when it is no longer in use. Typically, this is used to clean up any callbacks that were created when the plug-in was initialized. To start off, the InitializeProcess callback is typically used to set some general plug-in settings:

from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin( DeadlinePlugin ):

    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the plugin.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple

These are the common plug-in properties that can be set in the InitializeProcess callback. See the DeadlinePlugin class in the Deadline Scripting reference for additional properties.
• PluginType: The type of plug-in this is (PluginType.Simple/PluginType.Advanced).
• SingleFramesOnly: Set to True or False. Set to True if your plug-in can only work on one frame at a time, rather than a frame sequence.

The param File - Optional

The MyPlugin.param file is an optional file that is used by the Plugin Configuration dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying custom settings in the MyPlugin.dlinit file. After you’ve created this file, open the Monitor and enter Super User mode. Then select Tools -> Configure Plugins and look for your plug-in in the list on the left.

The file might look something like:

[MyPluginRenderExecutable]
Type=filename
Label=My Plugin Render Executable
Default=c:\path\to\my\executable.exe
Description=The path to the executable file used for rendering.


You’ll notice that the property name between the square brackets matches the MyPluginRenderExecutable custom setting we defined in our MyPlugin.dlinit file. This means that this control will change the MyPluginRenderExecutable setting. The available key=value pairs for the properties defined here are:
• Category: The category the control should go under.
• CategoryIndex: This determines the control’s order under its category. This does the same thing as Index.
• CategoryOrder: This determines the category’s order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
• Default: The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
• DefaultValue: The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
• Description: A short description of the property the control is for (displayed as a tooltip in the UI).
• DisableIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
• IgnoreIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
• Index: This determines the control’s order under its category. This does the same thing as CategoryIndex.
• Label: The control label.
• Required: If True, a control will be shown for this property even if it’s not defined in the dlinit file (True/False).
• Type: The type of control (see the list below).
These are the available controls:
• Boolean: A drop-down control that allows the selection of True or False.
• Color: Allows the selection of a color.
• Enum: A drop-down control that allows the selection of an item from a list.
• Enumeration: Same as Enum above.
• Filename: Allows the selection of an existing file.
• FilenameSave: Allows the selection of a new or existing file.
• Float: A floating point spinner control.
• Folder: Allows the selection of an existing folder.
• Integer: An integer spinner control.
• Label: A read-only text field.
• MultiFilename: Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
• MultiLineMultiFilename: Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
• MultiLineMultiFolder: Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
• MultiLineString: A text field with multiple lines.
• Password: A text field that masks the text.
• SlaveList: Allows the selection of existing Slaves, which are then separated by commas in the text field.
• String: A text field.
There are also key/value pairs for specific controls:


• DecimalPlaces: The number of decimal places for the Float controls.
• Filter: The filter string for the Filename, FilenameSave, or MultiFilename controls.
• Increment: The value to increment the Integer or Float controls by.
• Items: The semicolon separated list of items for the Enum control. This does the same thing as Values.
• Maximum: The maximum value for the Integer or Float controls.
• Minimum: The minimum value for the Integer or Float controls.
• Validator: A regular expression for the String control that is used to ensure the value is valid.
• Values: The semicolon separated list of items for the Enum control. This does the same thing as Items.
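For instance, combining a few of these keys, a hypothetical integer setting (say, a thread count defined in MyPlugin.dlinit as MyPluginThreads=4) could be exposed with an Integer control like this:

[MyPluginThreads]
Type=integer
Label=Render Threads
Category=Render Options
Default=4
Minimum=1
Maximum=64
Increment=1
Description=The number of threads the render executable should use.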

The options File - Optional

The MyPlugin.options file is an optional file that is used by the Job Properties dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying plug-in specific options as they appear in the plug-in info file that was submitted with the job. After you’ve created this file, you can right-click on a job in the Monitor that uses this plug-in and select Modify Properties. You should then see a MyPlugin page at the bottom of the list on the left which you can select to view these properties.

Often, these plug-in specific options are used to build up the arguments to be passed to the rendering application. Let’s assume that our render executable takes a “-verbose” argument that accepts a boolean parameter, and that the plug-in info file submitted with the job contains the following:

Verbose=True


Now we would like to be able to change this value from the Job Properties dialog in the Monitor, so our MyPlugin.options file might look like this:

[Verbose]
Type=boolean
Label=Verbose Logging
Description=If verbose logging is enabled.
Required=true
DisableIfBlank=false
DefaultValue=True

You’ll notice that the property name between the square brackets matches the Verbose setting in our plug-in info file. This means that this control will change the Verbose setting. The available key=value pairs for the properties defined here are the same as those defined for the param file above.

The ico File - Optional

The MyPlugin.ico file is an optional 16x16 icon file that can be used to easily identify jobs that use this plug-in in the Monitor. Typically, it is the plug-in application’s logo, or something else that represents the plug-in. If a plug-in does not have an icon file, a generic icon will be shown in the jobs list in the Monitor.

The JobPreLoad.py File - Optional

The JobPreLoad.py file is an optional script that will be executed by the Slave prior to loading a job that uses this plug-in. Note that in this case, the file does not share its name with the plug-in folder. This script can be used to do things like synchronize plug-ins or scripts prior to starting the render job. The only requirement for the JobPreLoad.py script is that you define a __main__ function, which is called by the Slave when it executes the script. It must accept a single parameter, which is the current instance of the DeadlinePlugin class. Here is an example script that copies a couple files from a server to the local machine, and sets some environment variables:

from System import *
from System.IO import *

def __main__( deadlinePlugin ):
    deadlinePlugin.LogInfo( "Copying some files" )
    File.Copy( r"\\server\files\file1.ext", r"C:\local\files\file1.ext", True )
    File.Copy( r"\\server\files\file2.ext", r"C:\local\files\file2.ext", True )

    deadlinePlugin.LogInfo( "Setting EnvVar1 to True" )
    deadlinePlugin.SetProcessEnvironmentVariable( "EnvVar1", "True" )

    deadlinePlugin.LogInfo( "Setting EnvVar2 to False" )
    deadlinePlugin.SetProcessEnvironmentVariable( "EnvVar2", "False" )

The PluginPreLoad.py File - Optional

The PluginPreLoad.py file is an optional script that will be executed by the Slave prior to executing any python script for the plug-in (MyPlugin.py or JobPreLoad.py), and any pre or post job or task script for the current job. Note that in this case, the file does not share its name with the plug-in folder. This script can be used to set up the Python environment prior to running any other python script, including setting sys.path to control where additional modules will be loaded from.
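A minimal sketch of such a script is shown below. It assumes the same __main__ entry-point convention as the other pre-load scripts (check the scripts shipped with Deadline for the exact signature that applies to your version), and the module path is purely a placeholder:

import sys

def __main__( *args ):
    # Hypothetical studio module location; prepend it so plug-in scripts can import from it.
    studioModules = r"\\server\pipeline\python\modules"
    if studioModules not in sys.path:
        sys.path.insert( 0, studioModules )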


7.2.4 Simple Plug-ins

A render job goes through three stages:
• StartJob: A job enters this stage when it is first picked up by a Slave.
• RenderTasks: A job can enter this stage many times (once for each task a Slave dequeues while it has the current job loaded).
• EndJob: A job enters this stage when a Slave is unloading the job.
Simple plug-ins only cover the RenderTasks stage, and are pretty straightforward. They are commonly used to render with applications that support simple command line rendering (running a command line executable and waiting for it to complete). For example, After Effects has a command line renderer called aerender.exe, which can be executed by the Slave to render specific frames of an After Effects project file.

Initialization

By default, a plug-in is considered to be a Simple plug-in, but you can explicitly set this in the InitializeProcess() callback (as explained above). You can also define settings specific to the simple plug-in, as well as any popup or stdout handlers that you need. These additional settings are covered in the ManagedProcess class in the Deadline Scripting reference (note that the DeadlinePlugin class inherits from the ManagedProcess class). For example:

from Deadline.Plugins import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin( DeadlinePlugin ):

    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback

        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple

        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True
        self.StdoutHandling = True
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.LogWarning( self.GetRegexMatch( 0 ) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.FailRender( "Detected an error: " + self.GetRegexMatch( 1 ) )

The AddStdoutHandlerCallback() function accepts a string parameter, which is a regular expression used to match against lines of stdout from the command line process. This function also returns a RegexHandlerCallback instance, which you can hook up a callback to that is called when a line of stdout is matched. This can all be done on one line, which is shown in the example above. Examples of handler callback functions are also shown in the example above. Within these handler functions, the GetRegexMatch() function can be used to get a specific match from the regular expression. The parameter passed to GetRegexMatch() is the index for the matches that were found. 0 returns the entire matched string, and 1, 2, etc returns the matched substrings (matches that are surrounded by round brackets). If there isn’t a corresponding substring, you’ll get an error (note that 0 is always a valid index). In HandleStdoutWarning(), 0 is the only valid index because there is no substring in round brackets in the regular expression. In HandleStdoutError(), 0 and 1 are valid. 0 will return the entire matched string, whereas 1 will return the substring in the round brackets. The AddPopupIgnorer() function accepts a string parameter, which is a regular expression. If a popup is displayed with a title that matches the given regular expression, the popup is simply ignored. Popup ignorers should only be used if the popup doesn’t halt the rendering because it is waiting for a button to be pressed. In the case where a button needs to be pressed to continue, popup handlers should be used instead. The AddPopupHandler() function takes two parameters: a regular expression string, and the button(s) to press (multiple buttons can be separated with semicolons).
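To make the index rule concrete, suppose the renderer printed progress lines such as "Progress: 42%" (a hypothetical output format). A handler with one captured group could report that percentage back to Deadline via SetProgress (see the DeadlinePlugin class in the Deadline Scripting reference):

# In InitializeProcess():
self.AddStdoutHandlerCallback( "Progress: ([0-9]+)%" ).HandleCallback += self.HandleStdoutProgress

# Elsewhere in the plug-in class:
def HandleStdoutProgress( self ):
    # GetRegexMatch(0) is the whole matched line; GetRegexMatch(1) is the captured percentage.
    self.SetProgress( float( self.GetRegexMatch( 1 ) ) )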


Render Executable and Arguments

The RenderExecutable() callback is used to get the path to the executable that will be used for rendering. This callback must be implemented in a Simple plug-in, or an error will occur. Continuing our example from above, we’ll use the path specified in the MyPlugin.dlinit file, and we can access it using the global GetConfigEntry() function. Another important (but optional) callback is the RenderArgument() callback. This callback should return the arguments you want to pass to the render executable. Typically, these arguments are built from values that are pulled from the DeadlinePlugin class (like the scene file name, or the start and end frame for the task), or from the plug-in info file that was submitted with the job using the GetPluginInfoEntry() function. If this callback is not implemented, then no arguments will be passed to the executable. After adding these callbacks, our example plug-in script now looks like this:

from Deadline.Plugins import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin( DeadlinePlugin ):

    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess
        self.RenderExecutableCallback += self.RenderExecutable
        self.RenderArgumentCallback += self.RenderArgument

    ## Clean up the plugin.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback

        del self.InitializeProcessCallback
        del self.RenderExecutableCallback
        del self.RenderArgumentCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple

        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True
        self.StdoutHandling = True
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.LogWarning( self.GetRegexMatch( 0 ) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.FailRender( "Detected an error: " + self.GetRegexMatch( 1 ) )

    ## Callback to get the executable used for rendering.
    def RenderExecutable( self ):
        return self.GetConfigEntry( "MyPluginRenderExecutable" )

    ## Callback to get the arguments that will be passed to the executable.
    def RenderArgument( self ):
        arguments = " -continueOnError -verbose " + self.GetPluginInfoEntry( "Verbose" )
        arguments += " -start " + str( self.GetStartFrame() ) + " -end " + str( self.GetEndFrame() )
        arguments += " -scene \"" + self.GetDataFilename() + "\""
        return arguments

There are many other callbacks that can be implemented for Simple plug-ins, which are covered in the Events section for the ManagedProcess class in the Deadline Scripting reference. The best place to find examples of Simple plug-ins is to look at some of the plug-ins that are shipped with Deadline. These range from the very basic (Blender), to the more complex (MayaCmd).

7.2.5 Advanced Plug-ins

To reiterate, a render job goes through three stages:
• StartJob: A job enters this stage when it is first picked up by a Slave.
• RenderTasks: A job can enter this stage many times (once for each task a Slave dequeues while it has the current job loaded).
• EndJob: A job enters this stage when a Slave is unloading the job.
Advanced plug-ins are more complex, as they control all three of these job stages. They are commonly used to render with applications that support some sort of slave/server mode that Deadline can interact with. Usually, this requires the application to be started during the StartJob phase, fed commands during the RenderTasks stage(s), and finally shut down during the EndJob stage. For example, the 3ds Max plug-in starts up 3dsmax in slave mode and forces it to load our Lightning plug-in. The Lightning plug-in listens for commands from Deadline and executes them as necessary. After rendering is complete, 3ds Max is shut down.

Initialization

To indicate that your plug-in is an Advanced plug-in, you need to set the PluginType property in the InitializeProcess() callback.

from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin( DeadlinePlugin ):

    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Advanced

Render Tasks

The RenderTasks() callback is the only required callback for Advanced plug-ins. If it is not implemented, an error will occur. It contains the code to be executed for each task that a Slave renders. This could involve launching applications, communicating with already running applications, or simply running a script to automate a particular task (like backing up a group of files). Other common callbacks for Advanced plug-ins are the StartJob() and EndJob() callbacks. The StartJob() callback can be used to start up an application, or to set some local variables that will be used in other callbacks. If the StartJob() callback is not implemented, then nothing is done during the StartJob phase. The EndJob() callback can be used to shut down a running application, or to clean up temporary files. If the EndJob() callback is not implemented, then nothing is done during the EndJob phase. In the example below, we will be launching our application during the StartJob phase. The benefit to this is that the application can be left running during the duration of the job, which eliminates the overhead of having to launch the application for each task. To launch and monitor the application, we will be implementing a ManagedProcess class, and calling it MyPluginProcess. This ManagedProcess class will define the render executable and command line arguments for launching the process we will be monitoring. Note that we aren’t passing it any frame information, as this needs to be handled in the RenderTasks() callback when it interacts with the process. After adding these three callbacks, and the MyPluginProcess class, our example code looks like this. Note that the RenderTasks() callback still needs code to allow it to interact with the running process launched in the StartJob() callback.

from Deadline.Plugins import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin( DeadlinePlugin ):

    ## Variable to hold the Managed Process object.
    Process = None

    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess
        self.StartJobCallback += self.StartJob
        self.RenderTasksCallback += self.RenderTasks
        self.EndJobCallback += self.EndJob

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback
        del self.StartJobCallback
        del self.RenderTasksCallback
        del self.EndJobCallback

        # Clean up the managed process object.
        if self.Process:
            self.Process.Cleanup()
            del self.Process

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Advanced

    ## Called by Deadline when the job starts.
    def StartJob( self ):
        self.Process = MyPluginProcess( self )
        self.StartMonitoredManagedProcess( "My Process", self.Process )

    ## Called by Deadline for each task the Slave renders.
    def RenderTasks( self ):
        # Do something to interact with the running process.
        pass

    ## Called by Deadline when the job ends.
    def EndJob( self ):
        self.ShutdownMonitoredManagedProcess( "My Process" )

######################################################################
## This is the ManagedProcess class that is launched above.
######################################################################
class MyPluginProcess( ManagedProcess ):
    deadlinePlugin = None

    ## Hook up the callbacks in the constructor.
    def __init__( self, deadlinePlugin ):
        self.deadlinePlugin = deadlinePlugin
        self.InitializeProcessCallback += self.InitializeProcess
        self.RenderExecutableCallback += self.RenderExecutable
        self.RenderArgumentCallback += self.RenderArgument

    ## Clean up the managed process.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback

        del self.InitializeProcessCallback
        del self.RenderExecutableCallback
        del self.RenderArgumentCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True
        self.StdoutHandling = True
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.deadlinePlugin.LogWarning( self.GetRegexMatch( 0 ) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.deadlinePlugin.FailRender( "Detected an error: " + self.GetRegexMatch( 1 ) )

    ## Callback to get the executable used for rendering.
    def RenderExecutable( self ):
        return self.deadlinePlugin.GetConfigEntry( "MyPluginRenderExecutable" )

    ## Callback to get the arguments that will be passed to the executable.
    def RenderArgument( self ):
        arguments = " -verbose " + self.deadlinePlugin.GetPluginInfoEntry( "Verbose" )
        arguments += " -scene \"" + self.deadlinePlugin.GetDataFilename() + "\""
        return arguments
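The RenderTasks() body above is intentionally left as a stub. For an Advanced plug-in that does not drive an external process at all (for example, the file backup scenario mentioned earlier), RenderTasks() could simply do its work inline; the sketch below uses placeholder paths:

    ## Called by Deadline for each task the Slave renders.
    def RenderTasks( self ):
        import shutil
        # Back up one hypothetical output file per frame in this task's frame range.
        for frame in range( self.GetStartFrame(), self.GetEndFrame() + 1 ):
            source = r"\\server\renders\shot_%04d.exr" % frame
            dest = r"\\backup\renders\shot_%04d.exr" % frame
            self.LogInfo( "Backing up " + source )
            shutil.copy2( source, dest )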

Because the Advanced plug-ins are much more complex than the Simple plug-ins, we recommend taking a look at the following plug-ins that are shipped with Deadline for examples: • 3dsmax • Fusion • Lightwave • MayaBatch • Modo • Nuke • SoftimageBatch

7.3 Event Plugins

7.3.1 Overview

Event plug-ins can be created to execute specific tasks in response to specific events in Deadline (like when a job is submitted or when it finishes). For example, event plug-ins can be used to communicate with in-house pipeline tools to update the state of shots or tasks, or they can be used to submit a post-processing job when another job finishes. All of Deadline’s event plug-ins are written in Python, which means that it’s easy to create your own plug-ins or customize the existing ones. See the Scripting Overview documentation for more information, and links to the Deadline Scripting reference. Note that because the Python scripts for event plug-ins will be executed in a non-interactive way, it is important that your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input. When an event is executed the log will show where the script is being loaded from.


7.3.2 Triggering Events

An event plug-in can respond to one or more of the following events:
• When a job is submitted
• When a job starts rendering
• When a job finishes rendering
• When a job is requeued
• When a job fails
• When a job is suspended
• When a suspended or failed job is resumed
• When a job is placed in the pending state
• When a job is released from a pending state
• When a job is deleted
• When a job error occurs during rendering
• When a job is about to be purged from the database
• When a house cleaning operation finishes
• When a repository repair operation finishes
• When a slave starts
• When a slave stops
• When a slave starts rendering
• When a slave starts a job
• When a slave is marked as stalled
By default, all jobs will trigger event plug-ins when they are submitted or change state. However, there is a job property that can be enabled to suppress events. In the Monitor, you can set the Suppress Events property under the Advanced tab in the Job Properties dialog. If you have a custom submission tool or script, you can specify the following in the job info file:

SuppressEvents=True
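For example, a minimal job info file for a manually submitted job might include the setting like this (the other keys shown are only illustrative; see the Manual Job Submission documentation for the full set of job info keys):

Plugin=MyPlugin
Name=My Job With Events Suppressed
Frames=1-100
SuppressEvents=True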

Note that events will be executed by different Deadline applications, depending on the context of the event. For example, the job submission event is processed by the Command application after the job has been submitted, while the job finished event is normally processed by the Slave that finishes the last task for the job. However, the job finished event could also be processed by the Monitor if manually marking a job as complete.

7.3.3 Creating an Event Plug-in

To create a custom event plug-in, you start by creating a folder in the Repository’s custom\events folder and giving it the name of your event plug-in. See the Scripting Overview documentation for more information on the ‘custom’ folder in the Repository and how it’s used. For the sake of this document, we will call our new event plug-in MyEvent. All relevant script and configuration files for this event plug-in are to be placed in this folder (some are required and some are optional).


The dlinit File - Required

The first required file is MyEvent.dlinit, which is the main configuration file for this event plug-in. It is a plain text file that defines a few general key=value event plug-in properties, which include:
• Enabled: Set to True or False (default is False). Only enabled event plug-ins will respond to events.
• DeprecatedMode: Set to True or False (default is False). Only set to True if you want a custom Python.NET event plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later. More information on DeprecatedMode can be found later on.
It can also define key=value custom settings to be used by the event plug-in. For example, if you are connecting to an in-house pipeline tool, you may want the URL and credentials to be configurable, in which case our MyEvent.dlinit file might look like this:

Enabled=True
PipelineURL=http://[myserver]/pipeline
PipelineUserName=myuser
PipelinePassword=mypassword

The py File - Required

The other required file is MyEvent.py, which is the main event plug-in script file. It defines the main DeadlineEventListener class that contains the necessary callbacks that will respond to specific events. The template for this script file might look like this:

from Deadline.Events import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return MyEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for MyEvent.
######################################################################
class MyEvent( DeadlineEventListener ):

    # TODO: Place code here to replace "pass"
    pass

The first thing to note is that we’re importing the Deadline.Events namespace so that we can access the DeadlineEventListener class. The GetDeadlineEventListener() function is important, as it allows Deadline to get an instance of our MyEvent class (which is extending the abstract DeadlineEventListener class). In Deadline 6.2 and later, the GetDeadlineEventListenerWithJobs( jobs ) function can be defined as an alternative. It works the same as GetDeadlineEventListener(), except that it accepts a list of the Job objects that the event plug-in is being loaded for. If neither of these functions is defined, Deadline will report an error when it tries to load the event plug-in. The MyEvent class will need to implement certain callbacks based on the events you want to respond to, and these callbacks must be hooked up in the MyEvent constructor. All callbacks are optional, but make sure to include at least one so that your event plug-in actually does something. For a list of all available callbacks, refer to the DeadlineEventListener class in the Deadline Scripting reference. The CleanupDeadlineEventListener() function is also important, as it is necessary to clean up the event plug-in when it is no longer in use. Typically, this is used to clean up any callbacks that were created when the event plug-in was initialized. After implementing a few functions, your MyEvent.py script file might look something like this:

from Deadline.Events import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return MyEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for MyEvent.
######################################################################
class MyEvent (DeadlineEventListener):

    def __init__( self ):
        # Set up the event callbacks here
        self.OnJobSubmittedCallback += self.OnJobSubmitted
        self.OnJobFinishedCallback += self.OnJobFinished

    def Cleanup( self ):
        del self.OnJobSubmittedCallback
        del self.OnJobFinishedCallback

    def OnJobSubmitted( self, job ):
        # TODO: Connect to pipeline site to notify it that a job has been submitted
        # for a particular shot or task.
        pass

    def OnJobFinished( self, job ):
        # TODO: Connect to pipeline site to notify it that the job for a particular
        # shot or task is complete.
        pass


The param File - Optional

The MyEvent.param file is an optional file that is used by the Event Configuration dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying custom settings in the MyEvent.dlinit file. After you’ve created this file, open the Monitor and enter Super User mode. Then select Tools -> Configure Events and look for your event plug-in in the list on the left.

The file might look something like:

[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this event plug-in should respond to events.

[PipelineURL]
Type=string
Label=Pipeline URL
Default=http://[myserver]/pipeline
Description=The URL for our pipeline website.

[PipelineUserName]
Type=string
Label=Pipeline User Name
Default=
Description=The user name for our pipeline website.

[PipelinePassword]
Type=string
Label=Pipeline Password
Default=
Description=The password for our pipeline website.

You'll notice that the property names between the square brackets match the custom keys we defined in our MyEvent.dlinit file. This means that these controls will change the corresponding settings. The available key=value pairs for the properties defined here are:

Category: The category the control should go under.
CategoryIndex: This determines the control's order under its category. This does the same thing as Index.
CategoryOrder: This determines the category's order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
Default: The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
DefaultValue: The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
Description: A short description of the property the control is for (displayed as a tooltip in the UI).
DisableIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
IgnoreIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
Index: This determines the control's order under its category. This does the same thing as CategoryIndex.
Label: The control label.
Required: If True, a control will be shown for this property even if it's not defined in the dlinit file (True/False).
Type: The type of control (see the list below).

These are the available controls:

Boolean: A drop-down control that allows the selection of True or False.
Color: Allows the selection of a color.
Enum: A drop-down control that allows the selection of an item from a list.
Enumeration: Same as Enum above.
Filename: Allows the selection of an existing file.
FilenameSave: Allows the selection of a new or existing file.
Float: A floating point spinner control.
Folder: Allows the selection of an existing folder.
Integer: An integer spinner control.
Label: A read-only text field.
MultiFilename: Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
MultiLineMultiFilename: Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
MultiLineMultiFolder: Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
MultiLineString: A text field with multiple lines.
Password: A text field that masks the text.
SlaveList: Allows the selection of existing Slaves, which are then separated by commas in the text field.
String: A text field.

There are also key/value pairs for specific controls:


DecimalPlaces: The number of decimal places for the Float controls.
Filter: The filter string for the Filename, FilenameSave, or MultiFilename controls.
Increment: The value to increment the Integer or Float controls by.
Items: The semicolon separated list of items for the Enum control. This does the same thing as Values.
Maximum: The maximum value for the Integer or Float controls.
Minimum: The minimum value for the Integer or Float controls.
Validator: A regular expression for the String control that is used to ensure the value is valid.
Values: The semicolon separated list of items for the Enum control. This does the same thing as Items.
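As an illustration of how these control-specific keys combine with the general ones, here is a hedged sketch of two additional .param entries. The MaxRetries and Mode setting names are made up for this example (they are not settings used by any plug-in described here), and matching keys would also need to exist in the plug-in's dlinit file:

[MaxRetries]
Type=integer
Category=Pipeline Options
CategoryOrder=0
Index=0
Label=Maximum Retries
Default=3
Minimum=0
Maximum=10
Increment=1
Description=How many times to retry the pipeline notification (example setting).

[Mode]
Type=enum
Category=Pipeline Options
CategoryOrder=0
Index=1
Label=Pipeline Mode
Items=Shot;Asset;Sequence
Default=Shot
Description=Which pipeline entity type to notify (example setting).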

7.3.4 Event Plug-in and Error Reports

Logs and reports can be stored with the job or the slave, depending on the event type.

Job Event Reports

Event types that start with “OnJob...” will save reports with the corresponding job. When an event plug-in that uses the LogInfo or LogWarning functions finishes executing, its log will be stored with the job's other render logs, which you can view in the Monitor by right-clicking on the job and selecting View Job Reports. When an error occurs in an event plug-in, an error report will also be stored with the job's other render errors, which you can view in the Monitor by right-clicking on the job and selecting View Job Reports.
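For example, here is a minimal sketch of an OnJobFinished callback that writes to the job's report. It assumes the LogInfo and LogWarning methods and the Job object's JobName property behave as described in the Deadline Scripting reference:

    def OnJobFinished( self, job ):
        # Everything logged here is stored with the job's reports, since this
        # is an "OnJob..." event. JobName is assumed to be the job's display name.
        self.LogInfo( "Job '%s' finished rendering." % job.JobName )
        self.LogWarning( "Example warning: pipeline notification is not implemented yet." )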

Slave Event Reports

Event types that start with “OnSlave...” will save reports with the corresponding slave. When an event plug-in that uses the LogInfo or LogWarning functions finishes executing, its log will be stored with the slave's other render logs, which you can view in the Monitor by right-clicking on the slave and selecting View Slave Reports. When an error occurs in an event plug-in, an error report will also be stored with the slave's other render errors, which you can view in the Monitor by right-clicking on the slave and selecting View Slave Reports.

7.3.5 Quicktime Generation Example

An event plug-in can be used to automatically submit a Quicktime job to create a movie from the rendered images of a job that just finished. An example of an event plug-in like this can be downloaded from the Miscellaneous Deadline Downloads Page. To install the event plug-in, just unzip the downloaded file to your Repository's custom/events folder.

Configuration Files

The QuicktimeGen.dlinit and QuicktimeGen.param files define a couple of settings that can be configured from the Monitor. Here you can specify a path to the Quicktime settings XML file you want to use. This settings file can be generated from the Submit Quicktime Job To Deadline submitter in the Monitor. The QuicktimeGen.dlinit file:

Enabled=True
QTSettings=\\ws-wpg-026\share\quicktime_export_settings.xml

The QuicktimeGen.param file:


[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this event plug-in should respond to events.

[QTSettings]
Type=filename
Label=QT Settings XML File
Default=
Description=The QT settings xml file.

7.4 Cloud Plugins

7.4.1 Overview

Cloud plug-ins can be created to allow Deadline to communicate with different cloud providers. All of Deadline's cloud plug-ins are written in Python, which means that it's easy to create your own plug-ins or customize the existing ones. You can also refer to the existing plug-ins in the Repository's cloud folder for examples of how they work. See the Scripting Overview documentation for more information, and links to the Deadline Scripting reference. Note that because the Python scripts for cloud plug-ins will be executed in a non-interactive way, it is important that your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input. When a cloud script is executed, the log will show where the script is being loaded from.

7.4.2 Creating a Cloud Plug-in

To create a custom cloud plug-in, you start by creating a folder in the Repository's custom\cloud folder and giving it the name of your cloud plug-in. See the Scripting Overview documentation for more information on the 'custom' folder in the Repository and how it's used. For the sake of this document, we will call our new cloud plug-in MyCloud. All relevant script and configuration files for this cloud plug-in are to be placed in this folder.

The py File

The first required file is MyCloud.py, which is the main cloud plug-in script file. It defines the main CloudPluginWrapper class that contains the necessary callbacks that will respond to specific commands. The template for this script file might look like this:

from Deadline.Cloud import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main CloudPluginWrapper class.
######################################################################
def GetCloudPluginWrapper():
    return MyCloud()

######################################################################
## This is the function that Deadline calls when the cloud plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupCloudPlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main CloudPluginWrapper class for MyCloud.
######################################################################
class MyCloud (CloudPluginWrapper):

    # TODO: Place code here instead of "pass"
    pass

The GetCloudPluginWrapper() function is important, as it allows Deadline to get an instance of our MyCloud class (which is extending the abstract CloudPluginWrapper class). If this function isn't defined, Deadline will report an error when it tries to load the cloud plug-in. Notice that we're importing the Deadline.Cloud namespace so that we can access the CloudPluginWrapper class. The MyCloud class will need to implement certain callbacks so that Deadline can get information from the cloud provider, and these callbacks must be hooked up in the MyCloud constructor. For a list of all available callbacks, refer to the CloudPluginWrapper class in the Deadline Scripting reference. After implementing a few functions, your MyCloud.py script file might look something like this:

from Deadline.Cloud import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main CloudPluginWrapper class.
######################################################################
def GetCloudPluginWrapper():
    return MyCloud()

######################################################################
## This is the function that Deadline calls when the cloud plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupCloudPlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main CloudPluginWrapper class for MyCloud.
######################################################################
class MyCloud (CloudPluginWrapper):

    def __init__( self ):
        # Set up our callbacks for cloud control
        self.VerifyAccessCallback += self.VerifyAccess
        self.AvailableHardwareTypesCallback += self.GetAvailableHardwareTypes
        self.AvailableOSImagesCallback += self.GetAvailableOSImages
        self.CreateInstancesCallback += self.CreateInstances
        self.TerminateInstancesCallback += self.TerminateInstances
        self.CloneInstanceCallback += self.CloneInstance
        self.GetActiveInstancesCallback += self.GetActiveInstances
        self.StopInstancesCallback += self.StopInstances
        self.StartInstancesCallback += self.StartInstances
        self.RebootInstancesCallback += self.RebootInstances

    def Cleanup( self ):
        # Clean up our callbacks for cloud control
        del self.VerifyAccessCallback
        del self.AvailableHardwareTypesCallback
        del self.AvailableOSImagesCallback
        del self.CreateInstancesCallback
        del self.TerminateInstancesCallback
        del self.CloneInstanceCallback
        del self.GetActiveInstancesCallback
        del self.StopInstancesCallback
        del self.StartInstancesCallback
        del self.RebootInstancesCallback

    def VerifyAccess( self ):
        # TODO: Return True if connection to cloud provider can be verified.
        pass

    def GetAvailableHardwareTypes( self ):
        # TODO: Return list of HardwareType objects representing the hardware
        # types supported by this provider.
        # Must be implemented for the Balancer to work.
        pass

    def GetAvailableOSImages( self ):
        # TODO: Return list of OSImage objects representing the OS images
        # supported by this provider.
        # Must be implemented for the Balancer to work.
        pass

    def GetActiveInstances( self ):
        # TODO: Return list of CloudInstance objects that are currently active.
        pass

    def CreateInstances( self, hardwareID, imageID, count ):
        # TODO: Start instances and return list of CloudInstance objects that
        # have been started.
        # Must be implemented for the Balancer to work.
        pass

    def TerminateInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # terminated successfully.
        # Must be implemented for the Balancer to work.
        pass

    def StopInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # stopped successfully.
        pass

    def StartInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # started successfully.
        pass

    def RebootInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # rebooted successfully.
        pass

The param File

The MyCloud.param file is an optional file that is used by the Cloud Provider Configuration dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying settings for this provider, which are then stored in the database. After you’ve created this file, open the Monitor and enter Super User mode. Then select Tools -> Configure Cloud Providers and click the Add button under the Cloud Region box to see your cloud plugin.

The file might look something like:

[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this cloud plug-in should be enabled.

[AccessID]
Type=string
Category=Options
CategoryOrder=0
Index=1
Label=Access ID
Default=
Description=Your Cloud Provider Access ID.

[SecretKey]
Type=password
Category=Options
CategoryOrder=0
Index=2
Label=Secret Key
Default=
Description=Your Cloud Provider Secret Key.


The available key=value pairs for the properties defined here are:

Category: The category the control should go under.
CategoryIndex: This determines the control's order under its category. This does the same thing as Index.
CategoryOrder: This determines the category's order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
Default: The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
DefaultValue: The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
Description: A short description of the property the control is for (displayed as a tooltip in the UI).
DisableIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
IgnoreIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
Index: This determines the control's order under its category. This does the same thing as CategoryIndex.
Label: The control label.
Required: If True, a control will be shown for this property even if it's not defined in the dlinit file (True/False).
Type: The type of control (see the list below).

These are the available controls:

Boolean: A drop-down control that allows the selection of True or False.
Color: Allows the selection of a color.
Enum: A drop-down control that allows the selection of an item from a list.
Enumeration: Same as Enum above.
Filename: Allows the selection of an existing file.
FilenameSave: Allows the selection of a new or existing file.
Float: A floating point spinner control.
Folder: Allows the selection of an existing folder.
Integer: An integer spinner control.
Label: A read-only text field.
MultiFilename: Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
MultiLineMultiFilename: Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
MultiLineMultiFolder: Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
MultiLineString: A text field with multiple lines.
Password: A text field that masks the text.
SlaveList: Allows the selection of existing Slaves, which are then separated by commas in the text field.
String: A text field.

There are also key/value pairs for specific controls:


DecimalPlaces: The number of decimal places for the Float controls.
Filter: The filter string for the Filename, FilenameSave, or MultiFilename controls.
Increment: The value to increment the Integer or Float controls by.
Items: The semicolon separated list of items for the Enum control. This does the same thing as Values.
Maximum: The maximum value for the Integer or Float controls.
Minimum: The minimum value for the Integer or Float controls.
Validator: A regular expression for the String control that is used to ensure the value is valid.
Values: The semicolon separated list of items for the Enum control. This does the same thing as Items.

7.5 Balancer Plugins

7.5.1 Overview

Balancer plugins can be created to customize the algorithm logic for the Balancer application. Balancer plugins are written in Python, which means that they can easily be created and customized. You can also refer to the default plugin in the Repository's balancer folder for a full example of how it works. See the Scripting Overview documentation for more information, and links to the Deadline Scripting reference.

7.5.2 Creating a Balancer Plug-in

To create a custom balancer plug-in, you start by creating a folder in the Repository's custom\balancer folder and giving it the name of your balancer plug-in. See the Scripting Overview documentation for more information on the 'custom' folder in the Repository and how it's used. For the sake of this document, we will call our new balancer plug-in MyBalancerAlgorithm. All relevant script and configuration files for this balancer plug-in are to be placed in this folder.

The py File

The first required file is MyBalancerAlgorithm.py, which is the main balancer plugin script. It defines the BalancerPluginWrapper class that contains all the necessary callbacks that will be used during a balancer cycle. The template for this script file might look like this:

from Deadline.Balancer import *

###########################################################################
## This is the function that Deadline calls to get an instance of the
## main BalancerPluginWrapper class.
###########################################################################
def GetBalancerPluginWrapper():
    return MyBalancerAlgorithm()

###########################################################################
## This is the main BalancerPluginWrapper class for MyBalancerAlgorithm.
###########################################################################
class MyBalancerAlgorithm (BalancerPluginWrapper):

    # TODO: Place code here instead of "pass"
    pass


The GetBalancerPluginWrapper() function is important, as it allows Deadline to get an instance of our MyBalancerAlgorithm class (which is extending the abstract BalancerPluginWrapper class). If this function isn't defined, Deadline will report an error when it tries to load the balancer plug-in. Notice that we're importing the Deadline.Balancer namespace so that we can access the BalancerPluginWrapper class. The MyBalancerAlgorithm class will need to implement the BalancerAlgorithm callback so that Deadline can know how to balance your farm, and this callback must be hooked up in the MyBalancerAlgorithm constructor. After implementing a few functions, your MyBalancerAlgorithm.py script file might look something like this:

from Deadline.Balancer import *

###########################################################################
## This is the function that Deadline calls to get an instance of the
## main BalancerPluginWrapper class.
###########################################################################
def GetBalancerPluginWrapper():
    return MyBalancerAlgorithm()

###########################################################################
## This is the main BalancerPluginWrapper class for MyBalancerAlgorithm.
###########################################################################
class MyBalancerAlgorithm (BalancerPluginWrapper):

    def __init__( self ):
        self.BalancerAlgorithmCallback += self.BalancerAlgorithm

    def BalancerAlgorithm( self, stateStruct ):
        # TODO: Return a target struct to the Balancer.
        pass

Here’s what a BalancerTargetStruct looks like:

///
/// The BalancerTargetStruct indicates the ideal number of VM instances that should be
/// running in each enabled Group of each CloudRegion.
/// The BalancerTargetStruct is populated by a Balancer Logic Plug-in.
///
public class BalancerTargetStruct
{
    public BalancerTargetStruct() { }

    public bool ErrorEncountered;    // Logic plug-in can set this to true to indicate that
                                     // an error occurred.
    public string ErrorMessage;      // Logic plug-in can convey an error message here
                                     // (ErrorEncountered should be set to true).
    public string Message;           // Logic plug-in can convey a non-error message here.

    public CloudRegionTargetStruct[] CloudRegionTargets;    // An array of cloud region targets.
    public DateTime Time;                                    // The time the structure was filled.
}

public class CloudRegionTargetStruct
{
    public CloudRegionTargetStruct() { }

    public string RegionID;                     // The unique ID of the region.
    public GroupTargetStruct[] GroupTargets;    // An array of Group targets.
}

public class GroupTargetStruct
{
    public GroupTargetStruct() { }

    public GroupTargetStruct(string Name, int Count)
    {
        this.Name = Name;
        this.Count = Count;
    }

    public string Name;    // The name of the group.
    public int Count;      // The target number of VM instances for the group.
}

The param File

The MyBalancerAlgorithm.param file is an optional file that is used in the Balancer Settings panel of the Repository Options dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying settings for this algorithm, which are then stored in the database. After you've created this file, open the Monitor and enter Super User mode. Then select Tools -> Repository Options -> Balancer Settings and choose MyBalancerAlgorithm from the drop-down to see your settings.

The dlinit File

The last required file is MyBalancerAlgorithm.dlinit, which is the main configuration file for this plugin. It is a plain text file that defines a few general key=value plug-in properties, which include:

About: A short description of the plug-in.
ConcurrentTasks: Set to True or False (default is False). If tasks for this plug-in can render concurrently without interfering with each other, this can be set to True.
DebugLogging: Set to True or False (default is False). If set to True, then debug plug-in logging will be printed out during rendering.
DeprecatedMode: Set to True or False (default is False). Only set to True if you want a custom Python.NET plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later. More information on DeprecatedMode can be found later on.

It can also define key=value custom settings to be used by the plug-in. For this example, our MyBalancerAlgorithm.dlinit file might look like this:

About=My Example Plugin for Deadline
SomeSortOfScript=c:\path\to\my\script.py

7.6 Monitor Scripts

7.6.1 Overview

There are several different types of Monitor scripts available. While the large majority of the ones shipping with Deadline are Submission Scripts used to submit new Jobs to the farm, the Monitor has the capability of running utility scripts in the context of specific Jobs, Tasks, Slaves, Limits, or even Reports.


Below, we go into more detail for each of the different types of Scripts, and how to create your own.

7.6.2 Scripting Reference

As with all other Deadline scripts, Monitor scripts use Python 2.7, which is supported using Python for .NET. This means that in addition to typical cPython modules, Python for .NET allows your scripts to make use of .NET Libraries, and Deadline’s own internal functions. The full Deadline Scripting Reference can be downloaded in CHM or PDF format from the Deadline Downloads page. Particular functions of note relevant to Monitor Scripting can be found in the aforementioned Scripting Reference, under the following sections: • Deadline.Scripting.MonitorUtils • Deadline.Scripting.JobUtils • Deadline.Scripting.SlaveUtils It can also be very helpful when developing your own Monitor Script to take a look at how our built-in Monitor Scripts of that type are structured.

7.6.3 General Script Template

We follow a fairly specific template when making any new built-in Monitor scripts. The template is loosely as follows:

• Define your __main__ function: This is the function that Deadline will call when invoking your script. This is mandatory, and your script will generate an error if it isn't defined.

def __main__( *args ):
    # Replace "pass"
    pass

• Build the submission UI: Typically done in the __main__ function by creating a ScriptDialog object, and adding controls to it. Each control's name must be unique, so that each control can be identified properly. You can also set the dialog's size (if not using a grid layout), the row and column (if using a grid layout), title, and a few other settings. For more details, see the ScriptDialog and ScriptControl sections of the Reference Manual. For an example of how to use the grid layout, see the Grid Layout Example Script documentation.
  – Define and Load Sticky Settings: Sticky settings are settings that persist after the dialog has been closed. They are defined by creating a string array that contains the names of the controls for which you want the settings to persist. After defining them, you can load them by calling the 'LoadSettings' function of your ScriptDialog.
  – Show the Dialog: The last thing you should do in your __main__ function is to show your ScriptDialog, by using its 'ShowDialog' function.
• Define Your Functions: Specify any functions that may be used by your script. These could just be helper functions, or event handlers that respond when UI values are modified.

Note that you don't necessarily need to follow this template, but the closer you stick to it, the more examples you'll have to draw on.

7.6.4 Monitor Scripts

There are many different types of scripts you can write for the Monitor, which are listed below. It is recommended that these scripts be created in the 'custom' folder in the Repository to avoid issues when upgrading your Repository in the future. See the Scripting Overview documentation for more information on the 'custom' folder in the Repository and how it's used. When a monitor script is executed, the log will show where the script is being loaded from.

Submission Scripts

Submission Scripts are used to create custom Submission dialogs, and ultimately submit new Jobs to Deadline. They are located in the 'Submit' menu of the Monitor's main menu bar, as well as the 'Submit' menu in the Launcher. Creating your own custom Submission dialog is quite simple, and the process is described below. To create new submission scripts, simply navigate to the 'custom\scripts\Submission' folder in your Repository. Then, create a new Python file named 'MySubmissionScript.py', where 'MySubmissionScript' is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.
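As a starting point, here is a hedged sketch of the submission step itself, with the dialog-building code omitted. It assumes ClientUtils.GetDeadlineTempPath() and ClientUtils.ExecuteCommandAndGetOutput() behave as described in the Deadline Scripting reference, and it targets the Command Line plug-in; the file names and plug-in info keys shown are illustrative, so check the Manual Job Submission documentation for the exact keys your plug-in expects:

import os
from System.Collections.Specialized import StringCollection
from Deadline.Scripting import *

def SubmitJob():
    # Write the job info and plugin info files to Deadline's temp folder,
    # then hand both to deadlinecommand for submission.
    tempPath = ClientUtils.GetDeadlineTempPath()

    jobInfoFile = os.path.join( tempPath, "my_job_info.job" )
    with open( jobInfoFile, "w" ) as fileHandle:
        fileHandle.write( "Plugin=CommandLine\n" )
        fileHandle.write( "Name=Submitted from a custom script\n" )
        fileHandle.write( "Frames=0\n" )

    pluginInfoFile = os.path.join( tempPath, "my_plugin_info.job" )
    with open( pluginInfoFile, "w" ) as fileHandle:
        # Example Command Line plug-in settings (verify the key names for your version).
        fileHandle.write( "Executable=c:\\Windows\\System32\\cmd.exe\n" )
        fileHandle.write( "Arguments=/c echo Hello from Deadline\n" )

    arguments = StringCollection()
    arguments.Add( jobInfoFile )
    arguments.Add( pluginInfoFile )

    results = ClientUtils.ExecuteCommandAndGetOutput( arguments )
    print( results )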

General Scripts

General scripts are used to perform any sort of custom action by selecting them from the Monitor’s (or Launcher’s) ‘Scripts’ menu. Under the hood, there technically isn’t anything different between General and Submission scripts. The only real difference is that they show up under different menus, which is just to help keep scripts semantically separated. To create new General scripts, simply navigate to the ‘custom\scripts\General’ folder in your Repository. Then, create a new Python file named ‘MyGeneralScript.py’, where ‘MyGeneralScript’ is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.

Job Scripts

Job Scripts are typically used to modify or to perform actions on a selected Job in the Monitor. They can be accessed by right-clicking an existing Job in the Job Panel, under the ‘Scripts’ sub-menu. To create new Job scripts, simply navigate to the ‘custom\scripts\Jobs’ folder in your Repository. Then, create a new Python file named ‘MyJobScript.py’, where ‘MyJobScript’ is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.
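For instance, here is a minimal sketch of a Job right-click script. It assumes MonitorUtils.GetSelectedJobs() and the Job object's JobName property are available as documented in the Deadline Scripting reference:

from Deadline.Scripting import *

def __main__( *args ):
    # Iterate the jobs currently selected in the Monitor and log their names.
    selectedJobs = MonitorUtils.GetSelectedJobs()
    for job in selectedJobs:
        print( "Selected job: %s" % job.JobName )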

Task Scripts

Task Scripts are typically used to modify or to perform actions on a selected Task in the Monitor. They can be accessed by right-clicking an existing Task in the Task Panel, under the ‘Scripts’ sub-menu. To create new Task scripts, simply navigate to the ‘custom\scripts\Tasks’ folder in your Repository. Then, create a new Python file named ‘MyTaskScript.py’, where ‘MyTaskScript’ is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.


Slave Scripts

Slave Scripts are typically used to modify or to perform actions on a selected Slave in the Monitor. They can be accessed by right-clicking an existing Slave in the Slave Panel, under the ‘Scripts’ sub-menu. To create new Slave scripts, simply navigate to the ‘custom\scripts\Slaves’ folder in your Repository. Then, create a new Python file named ‘MySlaveScript.py’, where ‘MySlaveScript’ is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.

Pulse Scripts

Pulse Scripts are typically used to modify or to perform actions on a selected Pulse in the Monitor. They can be accessed by right-clicking an existing Pulse in the Pulse Panel, under the ‘Scripts’ sub-menu. To create new Pulse scripts, simply navigate to the ‘custom\scripts\Pulse’ folder in your Repository. Then, create a new Python file named ‘MyPulseScript.py’, where ‘MyPulseScript’ is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.

Limit Scripts

Limit Scripts are typically used to modify or to perform actions on selected Limits in the Monitor. They can be accessed by right-clicking an existing Limit in the Limit Panel, under the 'Scripts' sub-menu. To create new Limit scripts, simply navigate to the 'custom\scripts\Limits' folder in your Repository. Then, create a new Python file named 'MyLimitScript.py', where 'MyLimitScript' is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.

Job Report Scripts

Job Report Scripts are typically used to modify or to perform actions on selected Job Reports in the Monitor. They can be accessed by right-clicking an existing Job Report in the Job Report Panel, under the ‘Scripts’ sub-menu. To create new Job Report scripts, simply navigate to the ‘custom\scripts\JobReports’ folder in your Repository. Then, create a new Python file named ‘MyJobReportScript.py’, where ‘MyJobReportScript’ is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.

Slave Report Scripts

Slave Report Scripts are typically used to modify or to perform actions on selected Slave Reports in the Monitor. They can be accessed by right-clicking an existing Slave Report in the Slave Report Panel, under the ‘Scripts’ sub-menu. To create new Slave Report scripts, simply navigate to the ‘custom\scripts\SlaveReports’ folder in your Repository. Then, create a new Python file named ‘MySlaveReportScript.py’, where ‘MySlaveReportScript’ is the name of your new script. Once created, you can follow the template outlined above in the General Script Template section to build up your script.


7.6.5 Customizing Script Display

As with any built-in script, once you've created your new Monitor Script you can change its Display Name, Keyboard Shortcut, Icon, and its position within the menu in the Repository Configuration. You can also control who can see (and use) your Submission Script by tweaking its access level in User Management. It is probably a good idea to disable access to it for most users until you have your new script in working order.

7.6.6 Grid Layout Example Script

Grid layouts allow your script dialog to dynamically resize its contents to fit the size of the dialog. Below are some examples of how to use the new grid layout to build a script dialog. First you must create a ScriptDialog object and start a grid. Once all controls have been added, you must end the grid:

dg = DeadlineScriptDialog()
dg.AddGrid()
#...
# Added controls go here
#...
dg.EndGrid()

Once you start a grid you can add controls to it by row and column. There is no need to specify how many rows or columns you want the grid to have; just specify the row and column where you want the control to be and the grid will grow to accommodate. Here is an example of adding a label and a text field to the dialog in the same row.

dg.AddGrid()
dg.AddControlToGrid("Label1","LabelControl","I'm a label.",0,0,"A tooltip", False)
dg.AddControlToGrid("TextBox1","TextControl","",0,1)
dg.EndGrid()

Here is an example of what this dialog would look like:

It is not possible to specify the size of the controls you want to add to the grid; however, it is also not necessary to do so. The contents of the grid(s) will automatically adjust themselves to share the size of the dialog. If you want certain elements to not grow within a row, you can disable their "expand" property. If you want a control to take more space, you can set the control to span multiple rows or columns using "rowSpan" and "colSpan", respectively. By default, controls have "expand" set and have their "colSpan" and "rowSpan" properties set to 1. This is an example of a dialog with two rows and four columns. The first row contains a label in the first column that is set to not grow any bigger than it needs to, and a text control that spans the next 3 columns and is allowed to grow. The second row contains three labels that are not allowed to grow in the first three columns, and a text control in the fourth column that can grow as needed.

dg.AddGrid()
dg.AddControlToGrid("L1","LabelControl","I'm a label.",0,0,"A tooltip", expand=False)
dg.AddControlToGrid("TextBox1","TextControl","",0,1, colSpan=3)
dg.AddControlToGrid("L2","LabelControl","I'm another label.",1,0,"A tooltip", expand=False)
dg.AddControlToGrid("L3","LabelControl","I'm another label.",1,1,"A tooltip", expand=False)
dg.AddControlToGrid("L4","LabelControl","I'm another label.",1,2,"A tooltip", expand=False)
dg.AddControlToGrid("TextBox2","TextControl","",1,3)
dg.EndGrid()

Here is an example of what this dialog would look like:

When you expand the dialog horizontally, only the text controls will grow in the above example. Nothing will grow, other than the dialog itself, when expanding vertically. Note that if you set all controls in a row to not expand, the cells in the grid that contain those controls will still expand without allowing any of the controls to expand with them. This will result in the dialog losing its layout when it is expanded. Here is an example of what this dialog would look like expanded horizontally:

Here is an example of what this dialog would look like expanded vertically:

Here is an example of what the dialog would look like expanded horizontally if all controls had “expand=False” set.


If you want to space controls out in the grid you can use labels filled with white space, or you can use horizontal spacers. Here is an example of adding two buttons to a dialog and keeping them to the far right of the dialog.

dg.AddGrid()
dg.AddHorizontalSpacerToGrid("DummyLabel",0,0)
ok = dg.AddControlToGrid("Ok","ButtonControl","OK",0,1, expand=False)
ok.ValueModified.connect(OkButtonPressed)
cancel = dg.AddControlToGrid("Cancel","ButtonControl","Cancel",0,2, expand=False)
cancel.ValueModified.connect(CancelButtonPressed)
dg.EndGrid()

Here is an example of what this dialog will look like when expanded horizontally:

All together, here is an example of a basic script dialog using grid layouts.

from DeadlineUI.Controls.Scripting.DeadlineScriptDialog import DeadlineScriptDialog

########################################################################
## Globals
########################################################################
dg = None

########################################################################
## Main Function Called By Deadline
########################################################################
def __main__( *args ):
    global dg

    dg = DeadlineScriptDialog()

    dg.SetTitle("Example Deadline Script")

    dg.AddGrid()
    dg.AddControlToGrid("L1","LabelControl","I'm a label.",0,0,"A tooltip", expand=False)
    dg.AddControlToGrid("TextBox1","TextControl","",0,1, colSpan=3)

    dg.AddControlToGrid("L2","LabelControl","I'm another label.",1,0,"A tooltip", expand=False)
    dg.AddControlToGrid("L3","LabelControl","I'm another label.",1,1,"A tooltip", expand=False)
    dg.AddControlToGrid("L4","LabelControl","I'm another label.",1,2,"A tooltip", expand=False)
    dg.AddControlToGrid("TextBox2","TextControl","",1,3)

    dg.EndGrid()

    # Adds an OK and Cancel button to the dialog
    dg.AddGrid()
    dg.AddHorizontalSpacerToGrid("DummyLabel",0,0)
    ok = dg.AddControlToGrid("Ok","ButtonControl","OK",0,1, expand=False)
    ok.ValueModified.connect(OkButtonPressed)
    cancel = dg.AddControlToGrid("Cancel","ButtonControl","Cancel",0,2, expand=False)
    cancel.ValueModified.connect(CancelButtonPressed)
    dg.EndGrid()

    dg.ShowDialog( True )

def CloseDialog():
    global dg

    dg.CloseDialog()

def CancelButtonPressed():
    CloseDialog()

def OkButtonPressed( *args ):
    global dg

    dg.ShowMessageBox("You pressed the OK button.","Button Pressed")

Here is what this dialog looks like:

7.7 Job Scripts

7.7.1 Overview

Job scripts and Dependency scripts can use Python to implement additional automation. Job scripts can be used to perform additional tasks during rendering, and Dependency scripts can control when jobs start rendering. Note that because the Python scripts will be executed in a non-interactive way, it is important that your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input. See the Scripting Overview documentation for more information, and links to the Deadline Scripting reference.

7.7.2 Job Scripts

Job scripts can be assigned to Jobs in order to automate certain tasks before a Job starts rendering (Pre-Job Script), after a Job finishes rendering (Post-Job Script), or before and after each individual Job Task is rendered (Pre-Task and Post-Task Scripts). After you create your scripts, you can assign them to a Job by right-clicking on the desired Job in the Monitor, and selecting 'Modify Job Properties'. The script options can be found under the 'Scripts' section of the Job Properties window. In addition to this, Job scripts can be specified by custom submitters by including them in the Job Info File on submission. Note that a full path to the script is required, so it is recommended that the script file be stored in a location that is accessible to all Slaves.

Creating Job Scripts

The only requirement for a Job script is that you define a __main__ function. This is the function that will be called by Deadline when it comes time to execute the script, and an instance of the DeadlinePlugin object will be passed as a parameter.

def __main__( *args ):
    # Replace "pass"
    pass

A common use for Post-Task scripts is to do some processing with the output image files. Here is a sample script that demonstrates how to get the output file names for the current task, and print them out to the render log:

import re
from System.IO import *
from Deadline.Scripting import *

def __main__( *args ):
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()

    outputDirectories = job.OutputDirectories
    outputFilenames = job.OutputFileNames

    paddingRegex = re.compile("[^ \\?#]*([\\?#]+).*")

    for i in range( 0, len(outputDirectories) ):
        outputDirectory = outputDirectories[i]
        outputFilename = outputFilenames[i]

        for frameNum in range( deadlinePlugin.GetStartFrame(), deadlinePlugin.GetEndFrame() + 1 ):
            outputPath = Path.Combine( outputDirectory, outputFilename ).replace( "//", "/" )

            m = re.match( paddingRegex, outputPath )
            if( m != None ):
                padding = m.group(1)
                frame = StringUtils.ToZeroPaddedString( frameNum, len(padding), False )
                outputPath = outputPath.replace( padding, frame )

            deadlinePlugin.LogInfo( "Output file: " + outputPath )

7.7.3 Dependency Scripts

Dependency scripts can be used to control when a job starts rendering. For example, the script could connect to an internal pipeline database to see if the job has been approved to start rendering. After you create your dependency scripts, you can assign them to a Job by right-clicking on the desired Job in the Monitor, and selecting 'Modify Job Properties'. The Script Dependencies options can be found under the 'Scripts' section of the Job Properties window. In addition to this, dependency scripts can be specified by custom submitters by including them in the Job Info File on submission. Note that a full path to the script is required, so it is recommended that the script file be stored in a location that is accessible to all Slaves.


Creating Dependency Scripts

The only requirement for a dependency script is that you define a __main__ function. This is the function that will be called by Deadline when it comes time to execute the script to determine if a job should be released or not. For jobs without Frame Dependencies enabled, only the job ID will be passed as a parameter. The __main__ function should then return True if the job should be released or False if it shouldn't be. For jobs with Frame Dependencies enabled, the job ID will be passed as the first parameter, and a list of pending task IDs will be passed as the second parameter. The __main__ function should then return the list of task IDs that should be released, or an empty list if none should be released. Here is a very simple example that will work regardless of whether Frame Dependencies are enabled or not:

def __main__( jobId, taskIds=None ):
    if not taskIds:
        # Frame Dependencies are disabled
        releaseJob = False

        # figure out if job should be released

        return releaseJob
    else:
        # Frame Dependencies are enabled
        tasksToRelease = []

        # figure out which tasks should be released, and append their IDs to the array

        return tasksToRelease

By giving the taskIds parameter a default of None, it allows the script to function regardless of whether Frame Dependencies are enabled or not. You can check if "taskIds" is None, and if it is, you know that Frame Dependencies are disabled.

7.8 Web Service Scripts

7.8.1 Overview

Web service scripts allow you to retrieve cached data from Pulse and display it in any way you see fit. See the Web Service documentation for more information on Pulse's web service feature and how you can use it to call scripts and commands.

7.8.2 Creating Web Service Scripts

Custom web service scripts can be created in the 'custom\scripts\WebService' folder in your repository. See the Scripting Overview documentation for more information on the 'custom' folder in the Repository and how it's used. Just place any new scripts directly into this folder, and they will be available to the Web Service. Script file names should not contain any spaces, and should end in a '.py' extension (i.e., they must be Python scripts).

The __main__ Function

All web service scripts must define a __main__ function that accepts *args (a tuple containing 2 items). This is the function that will be called when Pulse executes the script. Note that if you decide not to accept args, and an argument string is passed to your script in the URL, it will result in an exception being thrown. The function should also return a string value, which is used to display the results. The string can be HTML, XML, plain text, etc.

def __main__( *args ):
    results = ""

    #...
    # append data to results
    #...

    return results

Supporting Arguments

Arguments can be passed to web service scripts as a tuple with 2 items, and can be accepted in two different ways. The first way is to simply accept args, which will be an array of length 2. The other way is to accept the tuple as two separate variables, for instance (dlArgs, qsArgs) for Deadline arguments and query string arguments. In the first case, args[0] is equivalent to dlArgs (Deadline arguments), and args[1] is equivalent to qsArgs (Query String Arguments).

Deadline Arguments

The Pulse Web Service will automatically pass your script a dictionary as the first item in the args tuple. The Dictionary will contain at least one key ("Authenticated"), but may contain more if the user authenticated with the Pulse Web Service. Currently, if the user has not authenticated, the Dictionary will only contain the "Authenticated" key, with a value of 'False'. However, if the user has authenticated, it will also contain the "UserName" key, with a value of the user executing the script.

Query String Arguments

Arguments are passed to your script by a query string defined in the URL, and can be in one of the following forms:

Key/Value Pairs: This is the preferred method of passing arguments. Arguments in this form will look something like this at the end of the URL:

?key0=value0&key1=value1

List of Values: Arguments in this form will instead look something like this:

?value0&value1

The query string will be passed to the Python script as a NameValueCollection and it will be the second item of the tuple passed to your script’s __main__ function.
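Putting the two argument sources together, here is a hedged sketch of a web service script that echoes its arguments back as plain text. It assumes the dictionary keys described above and standard NameValueCollection indexing; key names in the query string are whatever the caller supplies:

def __main__( dlArgs, qsArgs ):
    # dlArgs is the Deadline arguments dictionary ("Authenticated", plus
    # "UserName" when authenticated); qsArgs is the NameValueCollection
    # built from the query string in the URL.
    results = "Authenticated: %s\n" % dlArgs["Authenticated"]
    for key in qsArgs.Keys:
        results += "%s = %s\n" % ( key, qsArgs[key] )
    return results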

Relevant API Functions

For functions that will be relevant to most Web Service scripts, see the ‘Deadline.PulseUtils’ section of the Deadline Scripting Reference document, available from the Downloads Section.

7.8.3 Calling Web Service Scripts

Once the script has been created, you can call it using Pulse’s Web Service feature. See the Pulse Web Service Documentation for more information on how to set this up. For example, if you have a Web Service script called ‘GetFarmStatistics.py’, you would call it using the following URL (where [myhost] is the hostname pointing to your Pulse machine):


http://[myhost]:8080/GetFarmStatistics

Some scripts can take arguments, as detailed in the previous section. To include arguments, you need to place a '?' between the base URL and the first argument, with '&' separating additional arguments. Here is an example of how you would pass 'arg1', 'arg2', and 'arg3' as a list of arguments to the GetFarmStatistics.py script:

http://[myhost]:8080/GetFarmStatistics?arg1&arg2&arg3

Here is an example of how you would pass values for arguments named 'arg1', 'arg2', and 'arg3' in the form of key-value pairs:

http://[myhost]:8080/GetFarmStatistics?arg1=value1&arg2=value2&arg3=value3

The way the results of the script will be displayed is entirely dependent on the format in which the Script returns them.

7.9 Standalone Python API

7.9.1 Overview

The Standalone Python API can be used in Python for communicating with the HTTP API (documented in REST Overview). In order to use the HTTP API, you must have Pulse running with the Pulse Web Service enabled on a machine whose address and port number you know. For a list of the API's functions and how they are used, go to the Deadline Downloads page and download the documentation.

7.9.2 Set-up

In order to use the Standalone Python API you must have Python 2.7 or later installed. Copy the "Deadline" folder containing the Standalone Python API from \\your\repository\api\python to the "site-packages" folder of your Python installation, and the API is ready to use.
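If you would rather not copy files into site-packages, a common alternative is to point Python at the Repository copy at runtime. This is only a sketch, and the repository path shown is a placeholder for your own:

import sys

# Make the Standalone Python API importable straight from the Repository
# (replace the path with your own repository's api/python folder).
sys.path.append( r"\\your\repository\api\python" )

import Deadline.DeadlineConnect as Connect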

7.9.3 Using the API

A DeadlineCon object must be created, which is used to communicate with Pulse to send and receive requests. First enter "import Deadline.DeadlineConnect as Connect", then create your connection object with "connectionObject = Connect.DeadlineCon('PulseName', PulsePortNumber)", where 'PulseName' is the DNS name or IP address of the machine currently running Pulse (and the Web Service) and configured as the Pulse machine in the [Repository Options], and 'PulsePortNumber' is the Pulse Web Service Port Number as configured in the [Repository Options] - [Pulse Settings] - [Web Service] tab. By default it is 8080. The "connectionObject" variable can now be used to communicate requests to Pulse.

Example: Getting group names and suspending a job

>>> from Deadline.DeadlineConnect import DeadlineCon as Connect
>>> con = Connect('PulseName', 8080)
>>> con.Groups.GetGroupNames()
[u'none', u'group1', u'group2', u'group3']
>>> jobId = validjobID
>>> con.Jobs.SuspendJob(jobId)
'Success'


Documentation for all the possible API functions can be found at the Deadline Downloads page.

7.9.4 Authenticating

If your Web Service has authentication enabled, then you must set up authentication for the Python API. This can be achieved through the "EnableAuthentication" and "SetAuthenticationCredentials" functions. Setting your authentication credentials allows the Python API to use them for as long as that instance of Python is running.

>>> from Deadline.DeadlineConnect import DeadlineCon as Connect
>>> con = Connect('PulseName', 8080)
>>> con.Groups.GetGroupNames()
"Error: HTTP Status Code 401. Authentication with the Web Service failed. Please ensure that the authentication credentials are set, are correct, and that authentication mode is enabled."
>>> con.AuthenticationModeEnabled()
False
>>> con.EnableAuthentication(True)
>>> con.AuthenticationModeEnabled()
True
>>> con.SetAuthenticationCredentials("username","password")
>>> con.Groups.GetGroupNames()
[u'none', u'group1', u'group2', u'group3']

By default, "SetAuthenticationCredentials" also enables authentication, so it is not actually necessary to explicitly call "EnableAuthentication" as well. If you want to store your credentials without enabling authentication, you may do so using the optional third parameter.

>>> con.SetAuthenticationCredentials("username","password", False)

7.9.5 API Functions

All of the Standalone Python API functions return a Python dictionary, a Python list, or a Python string. Lists often contain dictionaries. Examples: Getting a list, a list containing dictionaries, a dictionary, and a string back.

>>> groupNames = con.Groups.GetGroupNames()
>>> groupNames[0]
group1
>>> jobs = con.Jobs.GetJobs()
>>> jobs[0]['FailedChunks']
12
>>> task = con.Tasks.GetJobTask(jobId,0)
>>> task["Errs"]
8
>>> root = con.Repository.GetRootDirectory()
>>> root
'C:/DeadlineRepository'

Example: Getting a job, changing the pool and priority then saving it.

>>> job = con.Jobs.GetJob(jobId)
>>> str(job['Props']['Pool'])
none
>>> job['Props']['Pool'] = unicode('jobPool')
>>> str(job['Props']['Pool'])
jobPool
>>> print str(job['Props']['Pri'])
50
>>> job['Props']['Pri'] = 75
>>> str(job['Props']['Pri'])
75
>>> con.Jobs.SaveJob(job)
'Success'
>>> job = con.Jobs.GetJob(jobId)
>>> str(job['Props']['Pool']) + ' ' + str(job['Props']['Pri'])
jobPool 75

Example: Submitting a job using Python dictionaries

import Deadline.DeadlineConnect as Connect

if __name__ == '__main__':

    Deadline = Connect.DeadlineCon('PulseName', 8080)

    JobInfo = {
        "Name": "Submitted via Python",
        "UserName": "UserName",
        "Frames": "0-1",
        "Plugin": "VraySpawner"
    }

    PluginInfo = {
        "Version": "Max2014"
    }

    try:
        newJob = Deadline.Jobs.SubmitJob(JobInfo, PluginInfo)
        print newJob
    except:
        print "Sorry, Web Service is currently down!"


CHAPTER EIGHT

REST API

8.1 REST Overview

8.1.1 Overview

The RESTful HTTP API can be used to interact with an instance of Pulse. HTTP requests can be made to request information from the database, store new data, alter existing data, or remove entries from the database. Requests to the API can be categorized by the type of data you are attempting to access and by the type of HTTP request you are using to access said data. In order to use the HTTP API you must have the Web Service running on a machine whose address and port number you know. Requests that alter data are primarily POST or PUT messages, and they typically return text stating whether they succeeded or if there was an error. Requests made to retrieve data are done using GET messages and return JavaScript Object Notation (JSON) formatted objects if successful, and text explaining the error if not. Some POST or PUT messages will return JSON objects as well, but usually only if there is information about the action that the user may need (for example, a request to create a new object may return the object's primary key on creation). Requests made to remove data are typically done using DELETE messages and return text stating whether they succeeded or if there was an error, just like POST and PUT messages. In the event of an error message being returned, the HTTP Status Code will also be set to describe the error.

8.1.2 Request Types

• Jobs
• Job Reports
• Groups
• Pools
• Limits
• Repository
• Pulse
• Slaves
• Tasks
• Task Reports
• Users
• Balancer


8.1.3 Request Formats and Responses

• GET
Request for some data. These messages are constructed entirely within the URL. Successful requests will usually return a JSON object; failed requests will return a brief error message along with the HTTP Status Code. Some GET requests return plain text on success.
• PUT
Typically a request to modify some data. These messages use the URL to specify the type of data to alter, and use the message body for the data being stored to the database. The message body must be a JSON object, although how this object must be built depends on the data being modified. PUT messages for data that does not exist will often fail, but in some cases will act as a POST. Successful requests will usually return text stating success. Failed requests will return a brief error message along with the HTTP Status Code. Some PUT messages return JSON objects, usually when data has been created instead of altered.
• POST
Request to create some data. These messages use the URL to specify the type of data to create, and use the message body for the data being stored to the database. The message body must be a JSON object, although how this object must be built depends on the data being created. POST messages for data that already exists will fail. Successful requests will usually return text stating success. Failed requests will return a brief error message along with the HTTP Status Code. Some POST messages return JSON objects.
• DELETE
Request to delete some data. These messages are constructed entirely within the URL. Successful requests will usually return text stating success. Failed requests will return a brief error message along with the HTTP Status Code.
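As a rough illustration of these request formats (a sketch, not part of Deadline itself), the snippet below issues a GET request with the third-party Python requests library and checks the HTTP Status Code described in the next section. The host name, port, and job ID are placeholders.

import requests  # third-party HTTP client; any HTTP library can be used instead

base = "http://hostname:8080"  # placeholder Web Service address and port

# GET requests are constructed entirely within the URL and normally return JSON.
resp = requests.get(base + "/api/jobs?JobID=validjobidhere")
if resp.status_code == 200:
    job = resp.json()  # JSON object describing the requested job
else:
    print(resp.status_code, resp.text)  # brief error message plus the HTTP Status Code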

8.1.4 HTTP Status Codes

The following are the HTTP Status Codes that can be returned, and what they signify in Deadline.
• 200 - OK: Request completed without error. Note that this does not always mean the request modified everything as intended. Example: trying to send a “complete” message to a completed job will do nothing and return this status code. Another example: trying to release a job from pending when the job is not pending will return this status code and do nothing.
• 400 - Bad Request: Request could not be completed due to incorrect request message structure in either the URL or the body of the request message.
• 404 - Not Found: Requested data could not be found, or requested command could not be found.
• 405 - Method Not Allowed: Requested operation could not be completed using the request format given.
• 500 - Internal Server Error: Request message could not be interpreted properly, or the action being attempted caused an exception in Deadline.
• 501 - Not Implemented: Request type is not supported. For example, a JobReport PUT request would return this because only GET is supported.

8.1.5 Additional Information

If a request is made for a JSON object and an empty JSON object is returned, then the information provided for the request did not match any entry in the repository.

Adding extra key-value pairs to a JSON object for a request that does not specify their use can have surprising consequences. Keys that are not used by any command will be ignored, but be sure to read the documentation for each possible query for each request type before building a JSON object for your query, as some commands are identical other than the presence of a single key and have vastly different effects.

If a documented query requires a JSON object that you do not know how to construct, it is often possible to do a GET query for the same object type and receive the JSON format that the query expects.

A query that returns “Success” does not imply that the actions your query requested occurred. Some actions are impossible, but do not warrant an error message (for example, sending a Suspend message to a Suspended job, or deleting a Slave that does not exist or was already deleted).
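As a hedged example of this GET-then-save workflow (the host name, port, and job ID below are placeholders, and the third-party requests library is only one way to issue the calls), the snippet fetches a job to learn the expected JSON layout, edits one field, and sends the same object back with the "save" command documented in the Jobs section below.

import json
import requests  # third-party HTTP client, used here only for illustration

base = "http://hostname:8080"  # placeholder Web Service address
job_id = "validjobidhere"      # placeholder job ID

# GET the job first; the returned JSON is already in the shape that "save" expects.
job = requests.get(base + "/api/jobs?JobID=" + job_id).json()
if isinstance(job, list):  # the service may wrap the result in a list
    job = job[0]

job["Props"]["Pri"] = 75   # raise the priority, as in the Standalone Python API example

body = {"Command": "save", "Job": job}
print(requests.put(base + "/api/jobs", data=json.dumps(body)).text)  # "Success"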

8.2 Jobs

8.2.1 Overview

Job requests can be used to set and retrieve information for one or many jobs. Job requests support GET, PUT, POST and DELETE request types. For more about these request types and their uses see the Request Formats and Responses documentation.

8.2.2 Requests and Responses

List of possible requests for Jobs. All PUT and POST requests may also return a 400 Bad Request error if there was no message body in the request. All PUT requests may also return a 400 Bad Request error message if the command key is not present in the message body’s JSON object. All PUT requests may also return a 500 Internal Server Error error message if the command key in the message body contained an invalid command. Get All The Jobs URL: http://hostname:portnumber/api/jobs Request Type: GET Message Body: N/A Response: JSON object containing all the job information for every job in the repository. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get All The Job IDs URL: http://hostname:portnumber/api/jobs?IdOnly=true Request Type: GET Message Body: N/A Response: JSON object containing all the job IDs in the repository.


Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Job Gets job info for the given job ID. URL: http://hostname:portnumber/api/jobs?JobID=validjobidhere Request Type: GET Message Body: N/A Response: JSON object containing all the job information for the job ID provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Save Job Saves the job info provided. Job info must be in JSON format. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = save • Job = JSON object containing the job info Response: “Success” Possible Errors: • 400 Bad Request: There was no Job entry in the JSON object in the message body. • 500 Internal Server Error: An exception occurred within the Deadline code. Suspend Job Puts the job with the matching ID into the Suspended state. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = suspend • JobID = the ID of the Job to be suspended Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Suspend Job: Non-rendering tasks Puts the job with the matching ID into the Suspended state. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory:


• Command = suspendnonrendering • JobID = the ID of the Job to be suspended Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Resume Job Resumes the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = resume • JobID = the ID of the Job to be resumed Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Resume Failed Job Resumes the failed job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = resumefailed • JobID = the ID of the failed Job to be resumed Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository.
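The job state commands above (suspend, resume, and so on) all share the same PUT pattern. A minimal sketch, assuming a Web Service at a placeholder address and the third-party requests library:

import json
import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address
job_id = "validjobidhere"      # placeholder job ID

# Each state change is a PUT with a JSON body containing a Command key and the JobID.
for command in ("suspend", "resume"):
    body = {"Command": command, "JobID": job_id}
    resp = requests.put(base + "/api/jobs", data=json.dumps(body))
    print(command, resp.text)  # "Success" on success, otherwise an error message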


Requeue Job Requeues the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = requeue • JobID = the ID of the Job to be requeued Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Archive Job Archives the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = archive • JobID = the ID of the Job to be archived Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Import Job Imports the job path provided. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = import • File = the file location of the archived job/s (May be an array) The following keys are optional: • DeleteFile = true (deletes the archive file/s after importing) Response: The job ids of the imported jobs and of the jobs that were not imported.


Possible Errors: • 400 Bad Request: There was no File path provided. • 500 Internal Server Error: An exception occurred within the Deadline code. Pend Job Puts the job with the ID that matches the provided ID in the pending state. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = pend • JobID = the ID of the Job to be put in the pending state Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Release Pending Job Releases the job with the ID that matches the provided ID from the pending state. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = releasepending • JobID = the ID of the Job to be release from the pending state Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Complete Job Marks the job with the ID that matches the provided ID as complete. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = complete • JobID = the ID of the Job to be marked as complete


Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Fail Job Marks the job with the ID that matches the provided ID as failed. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = fail • JobID = the ID of the Job to be marked as failed Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Update Job Submission Date Updates the Submission Date for the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = updatesubmissiondate • JobID = the ID of the Job to have the submission date updated for Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Set Job Machine Limit Sets the Job Machine Limit for the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body:


JSON object where the following keys are mandatory: • Command = setjobmachinelimit • JobID = the ID of the Job The following keys are optional: • Limit = the new job machine limit, must be an integer • SlaveList = the slave/s to be set as the slave list (May be an array) • WhiteListFlag = boolean : sets the whitelistflag to true or false • Progress = Floating point number for the release percentage Response: “Success” Possible Errors: • 400 Bad Request: There was no JobID entry in the JSON object in the message body. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Add Slaves To Job Machine Limit List Adds the provided Slaves to the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = addslavestojobmachinelimitlist • JobID = the ID of the Job • SlaveList = the slave/s to be added to the slave list (May be an array) Response: “Success” Possible Errors: • 400 Bad Request: – There was no JobID entry in the JSON object in the message body, or – There needs to be at least one Slave passed. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Remove Slaves From Job Machine Limit List Removes the provided Slaves from the Job Machine Limit List for the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory:


• Command = removeslavesfromjobmachinelimitlist • JobID = the ID of the Job • SlaveList = the slave/s to be removed from the slave list (May be an array) Response: “Success” Possible Errors: • 400 Bad Request: – There was no JobID entry in the JSON object in the message body, or – There needs to be at least one Slave passed. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Set Job Machine Limit Listed Slaves Sets provided Slaves as Job Machine Limit Listed Slaves for the Job whose ID matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = setjobmachinelimitlistedslaves • JobID = the ID of the Job • SlaveList = the slave/s to be set as the slave list (May be an array) Response: “Success” Possible Errors: • 400 Bad Request: – There was no JobID entry in the JSON object in the message body, or – There needs to be at least one Slave passed. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Set Job Machine Limit White List Flag Sets Job Machine Limit White List Flag for the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = setjobmachinelimitwhitelistflag • JobID = the ID of the Job • WhiteListFlag = boolean : sets the whitelistflag to true or false


Response: “Success” Possible Errors: • 400 Bad Request: – There was no JobID entry in the JSON object in the message body, or – Must pass a boolean WhiteListFlag. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Set Job Machine Limit Sets Job Machine Limit Maximum for the job with the ID that matches the provided ID. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = setjobmachinelimitmaximum • JobID = the ID of the Job • Limit = the new job machine limit, must be an integer Response: “Success” Possible Errors: • 400 Bad Request: – There was no JobID entry in the JSON object in the message body, or – Must pass an integer Limit • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Submit Job Submits a job using the job info provided. URL: http://hostname:portnumber/api/jobs Request Type: POST Message Body: JSON object where the following keys are mandatory: • JobInfo = JSON object containing the Job Info • PluginInfo = JSON object containing the Plugin Info • AuxFiles = Array of Auxiliary File paths (May be empty, but must be provided) • IdOnly = Set to “true” to only return the job ID (defaults to “false”) Response: JSON object containing the new Job that was submitted or the Job ID Possible Errors: • 400 Bad Request: Missing one or more of the mandatory keys listed above.


• 500 Internal Server Error: – An exception occurred within the Deadline code, or – Could not access the file path specified in NetworkRoot. Delete Jobs Deletes the job corresponding to the job ID provided. URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIdsToDelete Request Type: DELETE Message Body: N/A Response: “Success” Possible Errors: • 400 Bad Request: Need to provide at least one job ID to delete. • 500 Internal Server Error: An exception occurred within the Deadline code. Get Job Details Gets the Job Details, similar to the Job Details panel, for the Jobs corresponding to the provided Job IDs. URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIds&Details=true Request Type: GET Message Body: N/A Response: A JSON object containing the Job Details. Possible Errors: • 400 Bad Request: Need to provide at least one job ID to get details for. • 500 Internal Server Error: An exception occurred within the Deadline code. Get Deleted Jobs Gets the Deleted Jobs that correspond to the provided Job IDs. URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIds&Deleted=true Request Type: GET Message Body: N/A Response: A JSON object containing the deleted Jobs. Possible Errors: • 400 Bad Request: Need to provide at least one deleted job ID. • 500 Internal Server Error: An exception occurred within the Deadline code. Get All Deleted Jobs Gets all the Deleted Jobs. URL: http://hostname:portnumber/api/jobs?Deleted=true Request Type: GET Message Body: N/A Response: A JSON object containing the deleted Jobs. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Purge Deleted Jobs Purges the Deleted Jobs that correspond to the provided Job IDs. URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIdsToDelete&Purge=true


Request Type: DELETE Message Body: N/A Response: “Success” Possible Errors: • 400 Bad Request: Need to provide at least one job ID to delete. • 500 Internal Server Error: An exception occurred within the Deadline code. Undelete Jobs Undeletes the Deleted Jobs that correspond to the provided Job IDs. URL: http://hostname:portnumber/api/jobs Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = undelete • JobID/s = job ID/list of job IDs to undelete Response: “Success” Possible Errors: • 400 Bad Request: Need to provide at least one job ID to delete. • 500 Internal Server Error: An exception occurred within the Deadline code.
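The Submit Job request described earlier in this section accepts the same JobInfo and PluginInfo dictionaries used by the Standalone Python API. A minimal sketch, assuming a Web Service at a placeholder address and the third-party requests library:

import json
import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address

body = {
    "JobInfo": {
        "Name": "Submitted via REST",
        "UserName": "UserName",
        "Frames": "0-1",
        "Plugin": "VraySpawner"
    },
    "PluginInfo": {"Version": "Max2014"},
    "AuxFiles": [],     # mandatory, even when there are no auxiliary files
    "IdOnly": "true"    # return only the new job ID rather than the full job object
}
resp = requests.post(base + "/api/jobs", data=json.dumps(body))
print(resp.status_code, resp.text)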

8.2.3 Job Property Values

Values for some Job properties are represented by numbers. Those properties and their possible values are listed below.
Stat (Status)
• 0 = Unknown
• 1 = Active
• 2 = Suspended
• 3 = Completed
• 4 = Failed
• 6 = Pending
Note that an active job can either be idle or rendering. Use the RenderingChunks property to determine if anything is rendering.
Timeout (OnTaskTimeout)
• 0 = Both
• 1 = Error
• 2 = Notify
OnComp (OnJobComplete)
• 0 = Archive
• 1 = Delete
• 2 = Nothing
Schd (ScheduledType)
• 0 = None
• 1 = Once
• 2 = Daily
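Because these properties come back as numbers, a small lookup table can make query results easier to read. The sketch below is one possible mapping based on the values above; the connection call is the same one shown in the Standalone Python API examples.

from collections import Counter
from Deadline.DeadlineConnect import DeadlineCon as Connect  # Standalone Python API

JOB_STATUS = {0: "Unknown", 1: "Active", 2: "Suspended",
              3: "Completed", 4: "Failed", 6: "Pending"}

con = Connect('PulseName', 8080)  # placeholder Pulse name and Web Service port
jobs = con.Jobs.GetJobs()
counts = Counter(JOB_STATUS.get(job.get("Stat"), "Unknown") for job in jobs)
print(counts)  # e.g. how many jobs are Active, Completed, Failed, ...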

8.3 Job Reports

8.3.1 Overview

Job Report requests can be used to retrieve Job Reports for a Job using the GET request type. PUT, POST and DELETE are not supported and sending a message of any of these types will result in a 501 Not Implemented error message. For more about these request types and their uses see the Request Formats and Responses documentation.

8.3.2 Requests and Responses

List of possible requests for Job Reports. It is possible to get a 400 Bad Request error message for any of the requests if the value for Data is incorrect.

Get All Job Reports
Gets all the Job Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=all&JobID=validJobID or http://hostname:portnumber/api/jobreports?JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job reports for the requested job, or a message stating that there are no reports for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.

Get Job Error Reports
Gets all the Job Error Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=error&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job error reports for the requested job, or a message stating that there are no error reports for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.

Get Job Log Reports
Gets all the Job Log Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=log&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job log reports for the requested job, or a message stating that there are no log reports for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.

Get Job Requeue Reports
Gets all the Job Requeue Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=requeue&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job requeue reports for the requested job, or a message stating that there are no requeue reports for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.

Get Job History Entries
Gets all the Job History Entries for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=history&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job history entries for the requested job, or a message stating that there are no history entries for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.
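A short sketch of one of these queries, assuming a Web Service at a placeholder address and the third-party requests library; Data=log, Data=requeue, and Data=history follow the same pattern.

import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address

params = {"Data": "error", "JobID": "validJobID"}  # placeholder job ID
reports = requests.get(base + "/api/jobreports", params=params).json()
# If the job has no error reports, the service returns a message instead of a report list.
print(reports)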


8.3.3 Job Report Property Values

Values for some Job Report properties are represented by numbers. Those properties and their possible values are listed below.
Type (ReportType)
• 0 = LogReport
• 1 = ErrorReport
• 2 = RequeueReport

8.4 Tasks

8.4.1 Overview

Task requests can be used to set and retrieve Task information using GET and PUT request types. POST and DELETE are not supported and sending a message of either of these types will result in a 501 Not Implemented error message. For more about these request types and their uses see the Request Formats and Responses documentation.

8.4.2 Requests and Responses

List of possible requests for Tasks. For all PUT requests it is possible to return a 400 Bad Request error message if the message body is empty or if no command key is provided. All requests may return a 400 Bad request error message if no Job ID is provided or a 500 Internal Server Error if the Job ID provided does not correspond to any Job in the repository. Get Task IDs Gets all the Task IDs for the Job that corresponds to the Job ID provided. URL: http://hostname:portnumber/api/tasks?IdOnly=true&JobID=aValidJobID Request Type: GET Message Body: N/A Response: JSON object containing all the Task IDs for the Job. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Task Gets the Task that correspond to the Task ID provided for the Job that corresponds to the Job ID provided. URL: http://hostname:portnumber/api/tasks?TaskID=oneValidTaskID&JobID=aValidJobID Request Type: GET Message Body: N/A Response: JSON object containing the Task information for the requested Task. Possible Errors: • 400 Bad Request: – No Task ID provided, or – Task ID must be an integer value.


• 500 Internal Server Error: An exception occurred within the Deadline code. Get All Tasks Gets the Tasks for the Job that corresponds to the Job ID provided. URL: http://hostname:portnumber/api/tasks?JobID=aValidJobID Request Type: GET Message Body: N/A Response: JSON object containing the Task information for all the Job Tasks. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Requeue Tasks Requeues the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID provided. If no Task IDs are provided, all Job tasks will be requeued. URL: http://hostname:portnumber/api/tasks Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = requeue • JobID = the id of the Job The following keys are optional: • TaskList = integer Task ID/s (May be an Array) Response: “Success” Possible Errors: • 400 Bad Request: TaskList contains entries, but none of them are valid integers. • 404 Not Found: Requested Task ID does not correspond to a Task for the Job. • 500 Internal Server Error: An exception occurred within the Deadline code. Complete Tasks Completes the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID provided. If no Task IDs are provided, all Job tasks will be completed. URL: http://hostname:portnumber/api/tasks Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = complete • JobID = the id of the Job The following keys are optional: • TaskList = integer Task ID/s (May be an Array) Response: “Success” Possible Errors:


• 400 Bad Request: TaskList contains entries, but none of them are valid integers. • 404 Not Found: Requested Task ID does not correspond to a Task for the Job. • 500 Internal Server Error: An exception occurred within the Deadline code. Suspend Tasks Suspend the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID provided. If no Task IDs are provided, all Job tasks will be suspended. URL: http://hostname:portnumber/api/tasks Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = suspend • JobID = the id of the Job The following keys are optional: • TaskList = integer Task ID/s (May be an Array) Response: “Success” Possible Errors: • 400 Bad Request: TaskList contains entries, but none of them are valid integers. • 404 Not Found: Requested Task ID does not correspond to a Task for the Job. • 500 Internal Server Error: An exception occurred within the Deadline code. Fail Tasks Fails the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID pro- vided. If no Task IDs are provided, all Job tasks will be failed. URL: http://hostname:portnumber/api/tasks Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = fail • JobID = the id of the Job The following keys are optional: • TaskList = integer Task ID/s (May be an Array) Response: “Success” Possible Errors: • 400 Bad Request: TaskList contains entries, but none of them are valid integers. • 404 Not Found: Requested Task ID does not correspond to a Task for the Job. • 500 Internal Server Error: An exception occurred within the Deadline code. Resume Failed Tasks


Resumes the Failed Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID provided. If no Task IDs are provided, all Job failed tasks will be resumed. URL: http://hostname:portnumber/api/tasks Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = resumefailed • JobID = the id of the Job The following keys are optional: • TaskList = integer Task ID/s (May be an Array) Response: “Success” Possible Errors: • 400 Bad Request: TaskList contains entries, but none of them are valid integers. • 404 Not Found: Requested Task ID does not correspond to a Task for the Job. • 500 Internal Server Error: An exception occurred within the Deadline code. Pend Tasks Pends the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID provided. If no Task IDs are provided, all Job tasks will be pended. URL: http://hostname:portnumber/api/tasks Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = pend • JobID = the id of the Job The following keys are optional: • TaskList = integer Task ID/s (May be an Array) Response: “Success” Possible Errors: • 400 Bad Request: TaskList contains entries, but none of them are valid integers. • 404 Not Found: Requested Task ID does not correspond to a Task for the Job. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Trying to pend a task for a Suspended Job. Release Pending Tasks Releases the pending Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID provided. If no Task IDs are provided, all Job pending tasks will be released. URL: http://hostname:portnumber/api/tasks


Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = releasepending • JobID = the id of the Job The following keys are optional: • TaskList = integer Task ID/s (May be an Array) Response: “Success” Possible Errors: • 400 Bad Request: TaskList contains entries, but none of them are valid integers. • 404 Not Found: Requested Task ID does not correspond to a Task for the Job. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Trying to release a task from pending for a Suspended Job.
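A minimal sketch of the task PUT pattern above, assuming a Web Service at a placeholder address and the third-party requests library; omitting TaskList applies the command to every task in the job.

import json
import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address

# Requeue tasks 3 and 4 of a job (placeholder job ID).
body = {"Command": "requeue", "JobID": "aValidJobID", "TaskList": [3, 4]}
print(requests.put(base + "/api/tasks", data=json.dumps(body)).text)  # "Success"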

8.4.3 Task Property Values

Values for some Task properties are represented by numbers. Those properties and their possible values are listed below.
Stat (Status)
• 1 = Unknown
• 2 = Queued
• 3 = Suspended
• 4 = Rendering
• 5 = Completed
• 6 = Failed
• 8 = Pending

8.5 Task Reports

8.5.1 Overview

Task Report requests can be used to retrieve Task Reports for a Job Task using the GET request type. PUT, POST and DELETE are not supported and sending a message of any of these types will result in a 501 Not Implemented error message. For more about these request types and their uses see the Request Formats and Responses documentation.


8.5.2 Requests and Responses

List of possible requests for Task Reports. It is possible to get a 400 Bad Request error message for any of the requests if the value for Data is incorrect. All requests may return a 400 Bad request error message if no Job ID is provided or a 500 Internal Server Error if the Job ID provided does not correspond to any Job in the repository. All requests may also return a 400 Bad Request error message if the Task ID was not provided, or was not valid, or was not an integer. Get All Task Reports Gets all the Task Reports for the Job Task that corresponds to the provided Job ID and provided Task ID. URL: http://hostname:portnumber/api/taskreports?Data=all&JobID=validJobID&TaskID=validTaskID http://hostname:portnumber/api/taskreports?JobID=validJobID&TaskID=validTaskID Request Type: GET Message Body: N/A Response: JSON object containing all the Task reports for the requested Job Task, or a message stating that there are no reports for the Job Task. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Task Error Reports Gets all the Task Error Reports for the Job Task that corresponds to the provided Job ID and provided Task ID. URL: http://hostname:portnumber/api/taskreports?Data=error&JobID=validJobID&TaskID=validTaskID Request Type: GET Message Body: N/A Response: JSON object containing the Task error reports for the requested Job Task, or a message stating that there are no error reports for the Job Task. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Task Log Reports Gets all the Task Log Reports for the Job Task that corresponds to the provided Job ID and provided Task ID. URL: http://hostname:portnumber/api/taskreports?Data=log&JobID=validJobID&TaskID=validTaskID Request Type: GET Message Body: N/A Response: JSON object containing the Task log reports for the requested Job Task, or a message stating that there are no log reports for the Job Task. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Task Requeue Reports Gets all the Task Requeue Reports for the Job Task that corresponds to the provided Job ID and provided Task ID. URL: http://hostname:portnumber/api/taskreports?Data=requeue&JobID=validJobID&TaskID=validTaskID Request Type: GET Message Body: N/A


Response: JSON object containing the Task requeue reports for the requested Job Task, or a message stating that there are no requeue reports for the Job Task. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
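A short sketch of one of these queries, assuming a Web Service at a placeholder address and the third-party requests library; the other Data values listed above work the same way.

import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address

params = {"Data": "log", "JobID": "validJobID", "TaskID": 0}  # placeholder IDs
reports = requests.get(base + "/api/taskreports", params=params).json()
# If the task has no log reports, the service returns a message instead of a report list.
print(reports)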

8.5.3 Task Report Property Values

Values for some Task Report properties are represented by numbers. Those properties and their possible values are listed below.
Type (ReportType)
• 0 = LogReport
• 1 = ErrorReport
• 2 = RequeueReport

8.6 Slaves

8.6.1 Overview

Slave requests can be used to set or retrieve Slave information. Slave requests support GET, PUT and DELETE request types. POST is not supported and sending such a message will result in a 501 Not Implemented error message. For more about these request types and their uses see the Request Formats and Responses documentation.

8.6.2 Requests and Responses

List of possible requests for Slaves. For all PUT requests it is possible to return a 400 Bad Request error message if there is no message body or if the command key is not set. PUT requests may also return a 500 Internal Server Error message if the command key is set to an invalid command. Get Slave Names Gets all the Slave names. URL: http://hostname:portnumber/api/slaves?NamesOnly=true Request Type: GET Message Body: N/A Response: JSON object containing all the Slave names. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Slaves’ InfoSettings Gets the InfoSettings for every Slave name provided. URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=infosettings Request Type: GET Message Body: N/A Response: JSON object containing the Slave InfoSettings for all the Slave names provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
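The Slave queries in this section share the same Name and Data query parameters. A minimal sketch, assuming a Web Service at a placeholder address, hypothetical slave names passed as a comma-separated list, and the third-party requests library:

import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address

# All Slave names in the repository.
names = requests.get(base + "/api/slaves?NamesOnly=true").json()

# Combined info and settings for two hypothetical slaves.
params = {"Name": "slave01,slave02", "Data": "infosettings"}
info = requests.get(base + "/api/slaves", params=params).json()
print(len(names), "slaves;", len(info), "records returned")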


Get All Slaves’ InfoSettings Gets the InfoSettings for every Slave. URL: http://hostname:portnumber/api/slaves?Data=infosettings Request Type: GET Message Body: N/A Response: JSON object containing the Slave InfoSettings for all the Slaves. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Slaves’ Information Gets the Slave Information for every Slave name provided. URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=info Request Type: GET Message Body: N/A Response: JSON object containing the Slave Information for all the Slave names provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get All Slaves’ Information Gets the Slave Information for every Slave. URL: http://hostname:portnumber/api/slaves?Data=info Request Type: GET Message Body: N/A Response: JSON object containing the Slave Information for all the Slaves. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Slaves’ Settings Gets the Slave Settings for every Slave name provided. URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=settings Request Type: GET Message Body: N/A Response: JSON object containing the Slave Settings for all the Slave names provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get All Slaves’ Settings Gets the Slave Settings for every Slave. URL: http://hostname:portnumber/api/slaves?Data=settings Request Type: GET Message Body: N/A Response: JSON object containing the Slave Settings for all the Slaves. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Save Slave Information


Saves the Slave Information provided. URL: http://hostname:portnumber/api/slaves Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = saveinfo • SlaveInfo = JSON object containing the Slave information to save. Response: “Success” Possible Errors: • 400 Bad Request: JSON object containing Slave Information was not provided. • 500 Internal Server Error: An exception occurred within the Deadline code. Save Slave Settings Saves the Slave Settings provided. URL: http://hostname:portnumber/api/slaves Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = savesettings • SlaveInfo = JSON object containing the Slave Settings to save. Response: “Success” Possible Errors: • 400 Bad Request: JSON object containing Slave Settings was not provided. • 500 Internal Server Error: An exception occurred within the Deadline code. Delete Slaves Deletes every Slave that corresponds to a Slave name provided. URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames Request Type: DELETE Message Body: N/A Response: “Success” Possible Errors: • 400 Bad Request: Need to provide at least one Slave name to delete. • 500 Internal Server Error: An exception occurred within the Deadline code. Get Slaves’ Reports Gets all Slave Reports for all Slave names provided. URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=reports Request Type: GET


Message Body: N/A Response: JSON object containing all Slave Reports for all Slave names provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Slave Reports For All Slaves Gets all Slave Reports for all Slaves. URL: http://hostname:portnumber/api/slaves?Data=reports Request Type: GET Message Body: N/A Response: JSON object containing all Slave Reports for all Slave names provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Slaves’ History Gets all Slave History Entries for all Slave names provided. URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=history Request Type: GET Message Body: N/A Response: JSON object containing all Slave History Entries for all Slave names provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Slave History For All Slaves Gets all Slave History Entries for all Slaves. URL: http://hostname:portnumber/api/slaves?Data=history Request Type: GET Message Body: N/A Response: JSON object containing all Slave History for all Slaves. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Slave Names Rendering Job Gets all Slave names rendering Job that corresponds to Job ID provided. URL: http://hostname:portnumber/api/slavesrenderingjob?JobID=validJobID Request Type: GET Message Body: N/A Response: JSON object all the Slave names rendering the Job. Possible Errors: • 400 Bad Request: No Job ID was provided. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Get Host Names of Machines Rendering Job


Gets all machine host names for slaves rendering Job that corresponds to Job ID provided. URL: http://hostname:portnumber/api/machinessrenderingjob?JobID=validJobID Request Type: GET Message Body: N/A Response: JSON object containing all the host names. Possible Errors: • 400 Bad Request: No Job ID was provided. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository. Get IP Address of Machines Rendering Job Gets all machine IP addresses for slaves rendering Job that corresponds to Job ID provided. URL: http://hostname:portnumber/api/machinessrenderingjob?JobID=validJobID&GetIpAddress=true Request Type: GET Message Body: N/A Response: JSON object containing all the IP addresses. Possible Errors: • 400 Bad Request: No Job ID was provided. • 500 Internal Server Error: – An exception occurred within the Deadline code, or – Job ID provided does not correspond to a Job in the repository.
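A minimal sketch of the two rendering-job queries above, assuming a Web Service at a placeholder address and the third-party requests library:

import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address
job_id = "validJobID"          # placeholder job ID

# Slave names currently rendering the job...
slaves = requests.get(base + "/api/slavesrenderingjob?JobID=" + job_id).json()

# ...and the IP addresses of the machines those slaves are running on.
ips = requests.get(base + "/api/machinessrenderingjob?JobID=" + job_id + "&GetIpAddress=true").json()
print(slaves, ips)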

8.6.3 Slave Property Values

Values for some Slave Info, Settings, and Report properties are represented by numbers. Those properties and their possible values are listed below.
Stat (SlaveStatus)
• 0 = Unknown
• 1 = Rendering
• 2 = Idle
• 3 = Offline
• 4 = Stalled
• 8 = StartingJob
Type (ReportType)
• 0 = LogReport
• 1 = ErrorReport
• 2 = RequeueReport


8.7 Pulse

8.7.1 Overview

Pulse requests can be used to set and retrieve Pulse information using GET and PUT. POST and DELETE are not supported and sending a message of either of these types will result in a 501 Not Implemented error message. For more about these request types and their uses see the Request Formats and Responses documentation.

8.7.2 Requests and Responses

List of possible requests for Pulse. Get Pulse Names Gets all the Pulse names. URL: http://hostname:portnumber/api/pulse?NamesOnly=true Request Type: GET Message Body: N/A Response: JSON object containing all the Pulse names. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Pulse Information Gets the Pulse information for the Pulse names provided. URL: http://hostname:portnumber/api/pulse?Info=true&Names=oneOrMorePulseNamesOR http://hostname:portnumber/api/pulse?Info=true&Name=onePulseName Request Type: GET Message Body: N/A Response: JSON object containing all the Pulse information for the requested Pulse names. Possible Errors: • 404 Not Found: Pulse name provided does not exist (can only occur if you use Name= ) • 500 Internal Server Error: An exception occurred within the Deadline code. Save Pulse Information Saves the Pulse information provided. URL: http://hostname:portnumber/api/pulse Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = saveinfo • PulseInfo = JSON object containing all the Pulse information. Response: “Success” Possible Errors: • 400 Bad Request: Did not provide a Pulse Information JSON object


• 500 Internal Server Error: An exception occurred within the Deadline code. Get Pulse Settings Gets the Pulse settings for the Pulse names provided. URL: http://hostname:portnumber/api/pulse?Settings=true&Names=oneOrMorePulseNamesOR http://hostname:portnumber/api/pulse?Settings=true&Name=onePulseName Request Type: GET Message Body: N/A Response: JSON object containing all the Pulse settings for the requested Pulse names. Possible Errors: • 404 Not Found: Pulse name provided does not exist (can only occur if you use Name= ) • 500 Internal Server Error: An exception occurred within the Deadline code. Save Pulse Settings Saves the Pulse settings provided. URL: http://hostname:portnumber/api/pulse Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = savesettings • PulseSettings = JSON object containing all the Pulse information. Response: “Success” Possible Errors: • 400 Bad Request: Did not provide a Pulse Information JSON object • 500 Internal Server Error: An exception occurred within the Deadline code. Get Pulse InfoSettings Gets the Pulse information and settings for the Pulse names provided. URL: http://hostname:portnumber/api/pulse?Names=oneOrMorePulseNamesOR http://hostname:portnumber/api/pulse?Name=onePulseName Request Type: GET Message Body: N/A Response: JSON object containing all the Pulse information and settings for the requested Pulse names. Possible Errors: • 404 Not Found: Pulse name provided does not exist (can only occur if you use Name= ) • 500 Internal Server Error: An exception occurred within the Deadline code.
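A minimal sketch of the settings round trip above (GET the current settings to obtain a correctly shaped JSON object, then save them back), assuming a placeholder Pulse name, a Web Service at a placeholder address, and the third-party requests library:

import json
import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address
pulse = "onePulseName"         # placeholder Pulse name

settings = requests.get(base + "/api/pulse?Settings=true&Name=" + pulse).json()
if isinstance(settings, list):  # the service may wrap the result in a list
    settings = settings[0]

body = {"Command": "savesettings", "PulseSettings": settings}
print(requests.put(base + "/api/pulse", data=json.dumps(body)).text)  # "Success"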


8.7.3 Pulse Property Values

Values for some Pulse properties are represented by numbers. Those properties and their possible values are listed below.
Stat (PulseStatus)
• 0 = Unknown
• 1 = Running
• 2 = Offline
• 4 = Stalled

8.8 Balancer

8.8.1 Overview

Balancer requests can be used to set and retrieve Balancer information using GET and PUT. POST and DELETE are not supported and sending a message of either of these types will result in a 501 Not Implemented error message. For more about these request types and their uses see the Request Formats and Responses documentation.

8.8.2 Requests and Responses

List of possible requests for Balancer. Get Balancer Names Gets all the Balancer names. URL: http://hostname:portnumber/api/balancer?NamesOnly=true Request Type: GET Message Body: N/A Response: JSON object containing all the Balancer names. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Balancer Information Gets the Balancer information for the Balancer names provided. URL: http://hostname:portnumber/api/balancer?Info=true&Names=oneOrMoreBalancerNamesOR http://hostname:portnumber/api/balancer?Info=true&Name=oneBalancerName Request Type: GET Message Body: N/A Response: JSON object containing all the Balancer information for the requested Balancer names. Possible Errors: • 404 Not Found: Balancer name provided does not exist (can only occur if you use Name= ) • 500 Internal Server Error: An exception occurred within the Deadline code. Save Balancer Information


Saves the Balancer information provided. URL: http://hostname:portnumber/api/balancer Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = saveinfo • BalancerInfo = JSON object containing all the Balancer information. Response: “Success” Possible Errors: • 400 Bad Request: Did not provide a Balancer Information JSON object • 500 Internal Server Error: An exception occurred within the Deadline code. Get Balancer Settings Gets the Balancer settings for the Balancer names provided. URL: http://hostname:portnumber/api/balancer?Settings=true&Names=oneOrMoreBalancerNamesOR http://hostname:portnumber/api/balancer?Settings=true&Name=oneBalancerName Request Type: GET Message Body: N/A Response: JSON object containing all the Balancer settings for the requested Balancer names. Possible Errors: • 404 Not Found: Balancer name provided does not exist (can only occur if you use Name= ) • 500 Internal Server Error: An exception occurred within the Deadline code. Save Balancer Settings Saves the Balancer settings provided. URL: http://hostname:portnumber/api/balancer Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = savesettings • BalancerSettings = JSON object containing all the Balancer information. Response: “Success” Possible Errors: • 400 Bad Request: Did not provide a Balancer Information JSON object • 500 Internal Server Error: An exception occurred within the Deadline code. Get Balancer InfoSettings Gets the Balancer information and settings for the Balancer names provided. URL: http://hostname:portnumber/api/balancer?Names=oneOrMoreBalancerNamesOR http://hostname:portnumber/api/balancer?Name=oneBalancerName


Request Type: GET Message Body: N/A Response: JSON object containing all the Balancer information and settings for the requested Balancer names. Possible Errors: • 404 Not Found: Balancer name provided does not exist (can only occur if you use Name= ) • 500 Internal Server Error: An exception occurred within the Deadline code.

8.8.3 Balancer Property Values

Values for some Balancer properties are represented by numbers. Those properties and their possible values are listed below.
Stat (BalancerStatus)
• 0 = Unknown
• 1 = Running
• 2 = Offline
• 4 = Stalled

8.9 Limits

8.9.1 Overview

Limit Group requests can be used to set and retrieve information about one or many Limit Groups. Limit Group requests support GET, PUT, POST and DELETE request types. For more about these request types and their uses see the Request Formats and Responses documentation.

8.9.2 Requests and Responses

List of possible requests for Limit Groups. All PUT and POST requests can return a 400 Bad Request error message if no message body is passed, or if no command key is present in the message body. All PUT and POST requests may also return a 500 Internal Server Error error message if the command key in the message body contained an invalid command. Get Limit Group Names Gets the names of all Limit Groups in the repository. URL: http://hostname:portnumber/api/limitgroups?NamesOnly=true Request Type: GET Message Body: N/A Response: JSON object containing all the Limit Group names. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Limit Groups Gets the Limit Groups for the provided Limit Group names. URL: http://hostname:portnumber/api/limitgroups?Names=listOfOneOrMoreLimitGroupNames http://hostname:portnumber/api/limitgroups?Name=aSingleLimitGroupName


Request Type: GET Message Body: N/A Response: JSON object containing the requested Limit Group/s Possible Errors: • 404 Not Found: There is no Limit Group with provided Name (this can only occur if a single name is passed) • 500 Internal Server Error: An exception occurred within the Deadline code. Get All Limit Groups Gets the names of all Limit Groups in the repository. URL: http://hostname:portnumber/api/limitgroups Request Type: GET Message Body: N/A Response: JSON object containing all the Limit Groups. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Set Limit Group Sets the Limit, Slave List, White List Flag, Release Percentage and/or Excluded Slaves for an existing Limit Group, or creates a new Limit Group with the provided properties. URL: http://hostname:portnumber/api/limitgroups Request Type: PUT/POST Message Body: JSON object where the following keys are mandatory: • Command = set • Name = name of Limit Group The following keys are optional: • Limit= integer limit • Slaves = list of slave names to include • SlavesEx = list of slave names to exclude • RelPer = floating point number for release percentage • White = boolean white list flag Response: “Success” Possible Errors: • 400 Bad Request: No name provided for the Limit Group • 500 Internal Server Error: An exception occurred within the Deadline code. Save Limit Group Updates a Limit Group using a JSON object containing all the Limit Group information. URL: http://hostname:portnumber/api/limitgroups Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = save


• LimitGroup = JSON object containing all relevant Limit Group information Response: “Success” Possible Errors: • 400 Bad Request: No valid Limit Group object provided. • 500 Internal Server Error: An exception occurred within the Deadline code. Reset Limit Group Resets the counts for a Limit Group. URL: http://hostname:portnumber/api/limitgroups Request Type: PUT Message Body: JSON object where the following keys are mandatory: • Command = save • Name = name of Limit Group Response: “Success” Possible Errors: • 400 Bad Request: No name provided for the Limit Group • 404 Not Found: Provided Limit Group name does not correspond to a Limit Group in the repository. • 500 Internal Server Error: An exception occurred within the Deadline code. Delete Limit Groups Deletes the Limit Groups for the provided Limit Group names. URL: http://hostname:portnumber/api/limitgroups Request Type: DELETE Message Body: N/A Response: JSON object containing the requested Limit Group/s Possible Errors: • 400 Bad Request: Must provide at least one Limit Group name to delete. • 500 Internal Server Error: An exception occurred within the Deadline code.
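A minimal sketch of the Set Limit Group request above, assuming a Web Service at a placeholder address, a hypothetical Limit Group name and slave name, and the third-party requests library:

import json
import requests  # third-party HTTP client (not part of Deadline)

base = "http://hostname:8080"  # placeholder Web Service address

# Create (or update) a Limit Group using the optional keys listed above.
body = {
    "Command": "set",
    "Name": "nuke_license",   # hypothetical Limit Group name
    "Limit": 10,              # integer limit
    "White": False,           # white list flag
    "Slaves": ["slave01"],    # hypothetical slave to include
    "RelPer": 90.0            # release percentage
}
print(requests.post(base + "/api/limitgroups", data=json.dumps(body)).text)  # "Success"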

8.9.3 Limit Group Property Values

Values for some Limit Group properties are represented by numbers. Those properties and their possible values are listed below.
Type (LimitGroupType)
• 0 = General
• 1 = JobSpecific
• 2 = MachineSpecific
StubLevel (currently not used)
• 0 = Slave
• 1 = Task
• 2 = Machine

8.10 Users

8.10.1 Overview

User requests can be used to set and retrieve information for one or many Users. User requests support GET, PUT, POST and DELETE request types. For more about these request types and their uses see the Request Formats and Responses documentation.

8.10.2 Request and Responses

List of possible requests for Users. Get User Names Gets all the User names. URL: http://hostname:portnumber/api/users?NamesOnly=true Request Type: GET Message Body: N/A Response: JSON object containing all the User names. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get Users Gets all the User information for the provided User names. URL: http://hostname:portnumber/api/users?Name=oneOrMoreValidUserNames Request Type: GET Message Body: N/A Response: JSON object containing all the User information for the Users provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Get All Users Gets all the Users. URL: http://hostname:portnumber/api/users Request Type: GET Message Body: N/A Response: JSON object containing all the User information for the Users provided. Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code. Save User Saves the User Information provided. URL: http://hostname:portnumber/api/users Request Type: PUT/POST


Message Body: JSON object containing all the User Information to save.
Response: “Success” for PUT, the User name and ID for POST.
Possible Errors:
• 400 Bad Request:
– No user information provided, or
– No User name provided, or
– User info already exists (POST error only).
• 500 Internal Server Error: An exception occurred within the Deadline code.
Delete User
Deletes the Users corresponding to the User names provided.
URL: http://hostname:portnumber/api/users?Name=oneOrMoreValidUserNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors:
• 400 Bad Request:
– No user information provided, or
– No User names provided.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get User Group Names
Gets all the User Group names.
URL: http://hostname:portnumber/api/usergroups
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User Group names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get User Names For User Group
Gets all the User names for the User Group that corresponds to the provided User Group name.
URL: http://hostname:portnumber/api/usergroups?Name=oneValidUserGroupName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User names in the User Group.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get User Group Names For User


Gets all the User Group names for the User corresponding to the provided User name.
URL: http://hostname:portnumber/api/usergroups?User=oneValidUserName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User Group names for the User.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Users To User Groups
Adds the Users corresponding to the User names provided to the User Groups corresponding with the User Group names provided.
URL: http://hostname:portnumber/api/usergroups
Request Type: PUT
Message Body: JSON object where the following keys are mandatory:
• Command = add
• User = the user name/s to add (May be an Array)
• Group = the user group name/s to add to (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: Missing one or more of the required keys in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Command key does not contain a valid command string, or
– None of the provided User names correspond to real Users, or
– None of the provided User Group names correspond to real User Groups.
Remove Users From User Groups
Removes the Users corresponding to the User names provided from the User Groups corresponding with the User Group names provided.
URL: http://hostname:portnumber/api/usergroups
Request Type: PUT
Message Body: JSON object where the following keys are mandatory:
• Command = remove
• User = the user name/s to remove (May be an Array)
• Group = the user group name/s to remove from (May be an Array)
Response: “Success”
Possible Errors:


• 400 Bad Request: Missing one or more of the required keys in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Command key does not contain a valid command string, or
– None of the provided User names correspond to real Users, or
– None of the provided User Group names correspond to real User Groups.
Set Users For User Groups
Sets the Users corresponding to the User names provided for the User Groups corresponding with the User Group names provided.
URL: http://hostname:portnumber/api/usergroups
Request Type: PUT
Message Body: JSON object where the following keys are mandatory:
• Command = set
• User = the user name/s to set (May be an Array)
• Group = the user group name/s to set the users for (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: Missing one or more of the required keys in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Command key does not contain a valid command string, or
– None of the provided User names correspond to real Users, or
– None of the provided User Group names correspond to real User Groups.
Create New User Groups
Creates and saves new user groups with the given names.
URL: http://hostname:portnumber/api/usergroups
Request Type: POST
Message Body: JSON object where the following keys are mandatory:
• Group = the user group name/s to create (array)
Response: “Success”
Possible Errors:
• 400 Bad Request: Missing one or more of the required keys in the JSON object in the message body.


• 500 Internal Server Error: An exception occurred within the Deadline code.
Delete User Groups
Deletes the user group with the given name.
URL: http://hostname:portnumber/api/usergroups?Name=user+group+name+to+delete
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors:
• 400 Bad Request: Must provide a user group name to delete.
• 500 Internal Server Error: An exception occurred within the Deadline code.
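As an illustration, the request below adds two existing Users to an existing User Group with a single PUT. It is a minimal sketch using the third-party Python requests library; the host, port, user names and group name are placeholders:

    import requests

    base = "http://hostname:8082"  # placeholder Web Service address

    # Add the users "jsmith" and "adoe" to the "lighting" user group.
    body = {
        "Command": "add",
        "User": ["jsmith", "adoe"],
        "Group": "lighting",
    }

    response = requests.put(base + "/api/usergroups", json=body)
    response.raise_for_status()  # a 400/500 error raises an HTTPError
    print(response.text)         # expects "Success"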

8.11 Repository

8.11.1 Overview

Repository requests can be used to retrieve Repository information, such as directories or paths, using the GET request type. Repository requests can also be used for adding history entries for jobs, slaves or the repository using the POST request type. PUT and DELETE are not supported and sending a message of either of these types will result in a 501 Not Implemented error message. For more about these request types and their uses see the Request Formats and Responses documentation.

8.11.2 Requests and Responses

List of possible requests for the Repository.
Get Root Directory
URL: http://hostname:portnumber/api/repository?Directory=root
Request Type: GET
Message Body: N/A
Response: JSON Object containing the root directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Bin Directory
URL: http://hostname:portnumber/api/repository?Directory=bin
Request Type: GET
Message Body: N/A
Response: JSON Object containing the bin directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.


• 404 Not Found: Requested Directory could not be found.
Get Settings Directory
URL: http://hostname:portnumber/api/repository?Directory=settings
Request Type: GET
Message Body: N/A
Response: JSON Object containing the settings directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Events Directory
URL: http://hostname:portnumber/api/repository?Directory=events
Request Type: GET
Message Body: N/A
Response: JSON Object containing the events directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Custom Events Directory
URL: http://hostname:portnumber/api/repository?Directory=customevents
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom events directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Plugins Directory
URL: http://hostname:portnumber/api/repository?Directory=plugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the plugins directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Custom Plugins Directory


URL: http://hostname:portnumber/api/repository?Directory=customplugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom plugins directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Scripts Directory
URL: http://hostname:portnumber/api/repository?Directory=scripts
Request Type: GET
Message Body: N/A
Response: JSON Object containing the scripts directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Custom Scripts Directory
URL: http://hostname:portnumber/api/repository?Directory=customscripts
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom scripts directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=job&JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON Object containing the auxiliary path for the provided job id, or a message stating that the path is not set.
Possible Errors:
• 400 Bad Request:
– Must provide a Directory or an Auxiliary Path to find, or
– Must provide a Job ID.
• 404 Not Found:
– Requested Directory could not be found, or


– Job ID provided does not correspond to a Job in the repository.
Get Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=alternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the alternate auxiliary path, or a message stating that the path is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Windows Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=windowsalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the Windows alternate auxiliary path, or a message stating that the path is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Linux Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=linuxalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the Linux alternate auxiliary path, or a message stating that the path is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Mac Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=macalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the Mac alternate auxiliary path, or a message stating that the path is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
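The directory and auxiliary path requests above are all simple GETs. A minimal sketch using the third-party Python requests library follows; the host and port are placeholders:

    import requests

    base = "http://hostname:8082"  # placeholder Web Service address

    # Ask the Repository for its root directory.
    response = requests.get(base + "/api/repository", params={"Directory": "root"})
    print(response.json())  # JSON object containing the root directory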


Get Maximum Priority
URL: http://hostname:portnumber/api/maximumpriority
Request Type: GET
Message Body: N/A
Response: JSON Object containing the Maximum Priority.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Plugin Names
URL: http://hostname:portnumber/api/plugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the plugin names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Plugin Event Names
URL: http://hostname:portnumber/api/plugins?EventNames=true
Request Type: GET
Message Body: N/A
Response: JSON Object containing the plugin event names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Database Connection String
URL: http://hostname:portnumber/api/repository?DatabaseConnection
Request Type: GET
Message Body: N/A
Response: The Database Connection string in the form of: (server:port,server:port...).
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Job History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body: JSON object where the following keys are mandatory:
• Command = jobhistoryentry
• JobID = The job id string.
• Entry = The entry string to be added.
Response: “Success”
Possible Errors:
• 400 Bad Request:
– JSON object was not provided in message body, or


– The provided JSON object is missing some values.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Add Slave History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body: JSON object where the following keys are mandatory:
• Command = slavehistoryentry
• SlaveName = The slave name.
• Entry = The entry string to be added.
Response: “Success”
Possible Errors:
• 400 Bad Request:
– JSON object was not provided in message body, or
– The provided JSON object is missing some values.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Add Repository History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body: JSON object where the following keys are mandatory:
• Command = repositoryhistoryentry
• Entry = The entry string to be added.
Response: “Success”
Possible Errors:
• 400 Bad Request:
– JSON object was not provided in message body, or
– The provided JSON object is missing some values.
• 500 Internal Server Error: An exception occurred within the Deadline code.
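For example, a pipeline script could record a note against a job's history with a single POST. The following is a minimal sketch using the third-party Python requests library; the host, port, job ID and entry text are placeholders:

    import requests

    base = "http://hostname:8082"  # placeholder Web Service address

    # Append a custom entry to the history of an existing job.
    body = {
        "Command": "jobhistoryentry",
        "JobID": "aValidJobID",  # placeholder job id
        "Entry": "Approved by the lighting lead.",
    }

    response = requests.post(base + "/api/repository", json=body)
    print(response.text)  # expects "Success"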

8.12 Pools

8.12.1 Overview

Pool requests can be used to set and retrieve information for one or many Pools. Pool requests support GET, PUT, POST and DELETE request types. For more about these request types and their uses see the Request Formats and Responses documentation.


8.12.2 Requests and Responses

List of possible requests for Pools.
Get Pool Names
Gets Pool Names.
URL: http://hostname:portnumber/api/pools
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pool names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves For Pools
Gets all the Slave names for the provided Pool names.
URL: http://hostname:portnumber/api/pools?Pool=listOfOneOrMorePoolNames
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave names that are in the provided Pools.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Pools
Creates new Pools using the provided Pool names.
URL: http://hostname:portnumber/api/pools
Request Type: POST
Message Body: JSON object that must contain the following keys:
• Pool = pool name/s (May be an Array)
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Pools
Removes all pools not provided and creates any provided pools that did not exist.
URL: http://hostname:portnumber/api/pools
Request Type: POST
Message Body: JSON object that must contain the following keys:
• Pool = pool name/s (May be an Array)
• OverWrite = true
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Pools to Slaves
Adds the provided Pools to the assigned pools for each provided Slave. For both Pools and Slaves, only the names are required.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:


JSON object that must contain the following keys:
• Slave = slave name/s (May be an Array)
• Pool = pool name/s (May be an Array)
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Pools for Slaves
Sets the provided Pools as the assigned pools for each provided Slave. For both Pools and Slaves, only the names are required.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body: JSON object that must contain the following keys:
• Slave = slave name/s (May be an Array)
• Pool = pool name/s (May be an Array)
• OverWrite = true
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Purge Pools
Purges all obsolete pools using the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body: JSON object that must contain the following keys:
• OverWrite = true
• ReplacementPool = pool name to replace the pools being purged
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Pool name provided does not exist.
Set and Purge Pools
Sets the list of pools to the provided list of pool names, creating them if necessary. Purges all the obsolete pools using the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body: JSON object that must contain the following keys:
• OverWrite = true
• ReplacementPool = pool name to replace the pools being purged
• Pool = the pool/s provided for setting; the replacement pool must be in this pool list or must be “none” (May be an Array)


Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Pool name provided does not exist.
Add and Purge Pools
Adds the list of provided pools, creating them if necessary. Purges all the obsolete pools using the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body: JSON object that must contain the following keys:
• OverWrite = true
• ReplacementPool = pool name to replace the pools being purged
• Pool = the pool/s provided for adding (May be an Array)
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Pool name provided does not exist.
Delete Pools
Deletes all Pools with the provided Pool names.
URL: http://hostname:portnumber/api/pools?Pool=listOfOneOrMorePoolNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Delete Pools From Slaves
Deletes all Pools from the Slaves’ list of pools.
URL: http://hostname:portnumber/api/pools?Pool=listOfOneOrMorePoolNames&Slaves=ListOfOneOrMoreSlaveNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
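As an illustration, the requests below create a pool and then assign it to two Slaves. This is a minimal sketch using the third-party Python requests library; the host, port, pool name and slave names are placeholders:

    import requests

    base = "http://hostname:8082"  # placeholder Web Service address

    # Create a new pool named "comp".
    requests.post(base + "/api/pools", json={"Pool": "comp"})

    # Add the "comp" pool to the pool lists of two slaves.
    body = {"Slave": ["render-01", "render-02"], "Pool": "comp"}
    response = requests.put(base + "/api/pools", json=body)
    print(response.text)  # expects "Success"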

8.13 Groups

8.13.1 Overview

Group requests can be used to set and retrieve information for one or many Groups. Group requests support GET, PUT, POST and DELETE request types. For more about these request types and their uses see the Request Formats and Responses documentation.


8.13.2 Requests and Responses

List of possible requests for Groups.
Get Group Names
Gets Group Names.
URL: http://hostname:portnumber/api/groups
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Group names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves For Groups
Gets all the Slave names for the provided Group names.
URL: http://hostname:portnumber/api/groups?Group=listOfOneOrMoreGroupNames
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave names that are in the provided Groups.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Groups
Creates new Groups using the provided Group names.
URL: http://hostname:portnumber/api/groups
Request Type: POST
Message Body: JSON object that must contain the following keys:
• Group = group name/s (May be an Array)
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Groups
Removes all groups not provided and creates any provided groups that did not exist.
URL: http://hostname:portnumber/api/groups
Request Type: POST
Message Body: JSON object that must contain the following keys:
• Group = group name/s (May be an Array)
• OverWrite = true
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Groups to Slaves
Adds the provided Groups to the assigned groups for each provided Slave. For both Groups and Slaves, only the names are required.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:


JSON object that must contain the following keys:
• Slave = slave name/s (May be an Array)
• Group = group name/s (May be an Array)
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Groups for Slaves
Sets the provided Groups as the assigned groups for each provided Slave. For both Groups and Slaves, only the names are required.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body: JSON object that must contain the following keys:
• Slave = slave name/s (May be an Array)
• Group = group name/s (May be an Array)
• OverWrite = true
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Purge Groups
Purges all obsolete groups using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body: JSON object that must contain the following keys:
• OverWrite = true
• ReplacementGroup = group name to replace the groups being purged
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Group name provided does not exist.
Set and Purge Groups
Sets the list of groups to the provided list of group names, creating them if necessary. Purges all the obsolete groups using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body: JSON object that must contain the following keys:
• OverWrite = true
• ReplacementGroup = group name to replace the groups being purged
• Group = the group/s provided for setting; the replacement group must be in this group list or must be “none” (May be an Array)


Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Group name provided does not exist.
Add and Purge Groups
Adds the list of provided groups, creating them if necessary. Purges all the obsolete groups using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body: JSON object that must contain the following keys:
• OverWrite = true
• ReplacementGroup = group name to replace the groups being purged
• Group = the group/s provided for adding (May be an Array)
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Group name provided does not exist.
Delete Groups
Deletes all Groups with the provided Group names.
URL: http://hostname:portnumber/api/groups?Group=listOfOneOrMoreGroupNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Delete Groups From Slaves
Deletes all Groups from the Slaves’ list of groups.
URL: http://hostname:portnumber/api/groups?Group=listOfOneOrMoreGroupNames&Slaves=ListOfOneOrMoreSlaveNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
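Group requests mirror the pool requests above. For example, removing a group from specific Slaves is a DELETE with query-string parameters. This is a minimal sketch using the third-party Python requests library; the host, port, group and slave names are placeholders, and the comma-separated list format is an assumption:

    import requests

    base = "http://hostname:8082"  # placeholder Web Service address

    # Remove the "gpu" group from two slaves' group lists.
    params = {"Group": "gpu", "Slaves": "render-01,render-02"}
    response = requests.delete(base + "/api/groups", params=params)
    print(response.text)  # expects "Success"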


CHAPTER NINE

APPLICATION PLUGINS

9.1 3ds Command

9.1.1 Job Submission

You can submit jobs from within 3ds Max by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within 3ds Max, select the Deadline (3dsCmd) menu item that you created during the integrated submission script setup.



Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The 3ds Command specific options are:
• Force Build: You can force 32 bit or 64 bit rendering.
• Path Config: Allows you to specify an alternate path file in the MXP format that the slaves can use to find bitmaps that are not found on the primary map paths.
• Show Virtual Frame Buffer: Enable the virtual frame buffer during rendering.
• Apply VideoPost To Scene: Whether or not to use VideoPost during rendering.
• Continue On Errors: Enable to have the 3ds command line renderer ignore errors during rendering.
• Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network location.
• Gamma Correction: Enable to apply gamma correction during rendering.
• Split Rendering: Enable split rendering. Specify the number of strips to split the frame into, as well as the overlap you want to use.
• VRay/Mental Ray DBR: Enable this option to offload a VRay or Mental Ray DBR render to Deadline. See the VRay/Mental Ray DBR section for more information.
• Run Sanity Check On Submission: Check for scene problems during submission.

VRay/Mental Ray off-load DBR

You can offload a VRay or Mental Ray DBR job to Deadline by enabling the Distributed Rendering option in your VRay or Mental Ray settings, and by enabling the VRay/Mental Ray DBR checkbox in the submission dialog. With this option enabled, a job will be submitted with its task count equal to the number of Slaves you specify, and it will render the current frame in the scene file. The slave that picks up task 0 will be the “master”, and will wait until all other tasks are picked up by other slaves. Once the other tasks have been picked up, the “master” will update its local VRay or Mental Ray config file with the names of the machines that are rendering the other tasks. It will then start the distributed render by connecting to the other machines. Note that the render will not start until ALL tasks have been picked up by a slave.
It is recommended to set up VRay DBR or Mental Ray DBR for 3ds Max and verify it is working correctly prior to submitting a DBR off-load job to Deadline. RTT (Render To Texture) is not supported with distributed bucket rendering. If running multiple Deadline slaves on one machine, having two or more of these slaves pick up different DBR jobs concurrently, as either master or slave, is not supported.
Notes for VRay DBR:
• Ensure VRay is the currently assigned renderer in the 3ds Max scene file prior to submission.
• You must have the Distributed Rendering option enabled in your VRay settings under the Settings tab.
• Ensure the “Save servers in the scene” (“Save hosts in the scene” in VRay v2) option in the VRay distributed rendering settings is DISABLED, as otherwise it will ignore the vray_dr.cfg file list!
• Ensure the “Max servers” value is set to 0. When set to 0, all listed servers will be used.
• It is recommended to disable the “Use local host” checkbox to reduce network traffic on the “master” machine when using a large number of slaves (5+). If disabled, the “master” machine only organises the DBR process, sending rendering tasks to the Deadline slaves. This is particularly important if you intend to use the VRay v3+ “Transfer missing assets” feature. Note that Windows 7 has a limitation of a maximum of 20 other machines concurrently connecting to the “master” machine.


• VRay v3.00.0x has a bug in DBR where, when “Use local host” is unchecked, it still demands a render node license. This is resolved in a newer version of VRay. Please contact Chaos Group for more information.
• The slaves will launch the VRay Spawner executable found in the 3ds Max root directory. Do NOT install the VRay Spawner as a service on the master or slave machines. Additionally, Drive Mappings are unsupported when running as a service.
• The vray_dr.cfg file in 3ds Max’s plugcfg directory must be writeable so that the “master” machine can update it. This is typically located in the user profile directory, in which case it will be writeable already.
• Chaos Group recommends that each machine to be used for DBR has previously rendered at least one other 3ds Max job prior to trying DBR on the same machine.
• Ensure all slaves can correctly access any mapped drives or resolve all UNC paths to obtain any assets required by the 3ds Max scene file to render successfully. Use the Deadline Mapped Drives feature to ensure the necessary drive mappings are in place.
• Default lights are not supported by Chaos Group in DBR mode and will not render.
• Ensure you have sufficient VRay DR licenses if processing multiple VRay DBR jobs through Deadline concurrently. Use the Deadline Limits feature to limit the number of licenses being used at any time.
• Ensure the necessary VRay executables and TCP/UDP ports have been allowed to pass through the Windows Firewall. Please consult the VRay user manual for specific information.
• VRay does NOT currently support, in 3ds Max, the ability to dynamically add or remove DBR slaves once the DBR render has started on the “master” slave.
Notes for Mental Ray DBR:
• Ensure Mental Ray is the currently assigned renderer in the 3ds Max scene file prior to submission.
• You must have the Distributed Render option enabled in your Mental Ray settings under the Processing tab.
• The Mental Ray Satellite service must be running on your slave machines. It is installed by default during the 3ds Max installation.
• The max.rayhosts file must be writeable so that the “master” machine can update it. Its location is different for different versions of 3ds Max:
– 2010 and earlier: in the “mentalray” folder in the 3ds Max root directory.
– 2011 and 2012: in the “mentalimages” folder in the 3ds Max root directory.
– 2013 and later: in the “NVIDIA” folder in the 3ds Max root directory.
• Ensure the “Use Placeholder Objects” checkbox is enabled in the “Translator Options” rollout of the “Processing” tab. When placeholder objects are enabled, geometry is sent to the renderer only on demand.
• Ensure “Bucket Order” is set to “Hilbert” in the “Options” section of the “Sampling Quality” rollout of the “Renderer” tab. With Hilbert order, the sequence of buckets to render uses the fewest number of data transfers.
• Contour shading is not supported with distributed bucket rendering.
• Autodesk Mental Ray licensing in 3ds Max is restricted. Autodesk says “Satellite processors allow any owner of a 3ds Max license to freely use up to four slave machines (with up to four processors each and an unlimited number of cores) to render an image using distributed bucket rendering, not counting the one, two, or four processors on the master system that runs 3ds Max.” Mental Ray Standalone licensing can be used to go beyond this license limit. Use the Deadline Limits feature to limit the number of licenses being used at any time if required.
• Ensure the necessary Mental Ray executables and TCP/UDP ports have been allowed to pass through the Windows Firewall. Please consult the user manual for specific information.


Sanity Check

The 3ds Command Sanity Check script defines a set of functions to be called to ensure that the scene submission does not contain typical errors like wrong render view and frame range settings, incorrect output path, etc. The Sanity Check is enabled by the Run Sanity Check Automatically Before Submission checkbox in the User Options group of controls in the Submit To Deadline (3dsmaxCmd) dialog. You can also run the Sanity Check manually by clicking the Run Now! button.

The dialog contains the following elements:
• The upper area (Error Report) lists the problems found in the current scene.
• The lower area (Feedback Messages) lists the actions the Sanity Check performs and gives feedback to the user. The latest message is always on top.
• Between the two areas, there is a summary text line listing the total number of errors and a color indicator of the current Sanity Check state. When red, the Sanity Check will not allow a job submission to be performed.
The Error Report
The left column of the Error Report displays a checkbox and the type of the error. The checkbox determines whether the error will be taken into account by the final result of the check. Currently, there are 3 types of errors:
• FATAL: The error cannot be fixed automatically and requires manual changes to the scene itself. A job submission with such an error would be pointless. The state of the checkbox is ignored and assumed always checked.
• Can Be Fixed: The error can be fixed automatically or manually. If the checkbox is active, the error contributes to the final result. If unchecked, the error is ignored and handled as a warning.
• Warning: The problem might not require fixing, but could be of importance to the user. It is not taken into account by the final result (the state of the checkbox is ignored and assumed always unchecked).
Repairing Errors


Right-clicking an Error Message in the Error Report window will cause an associated repair function to be executed and/or a Report Message to be output in the Feedback Messages window. This difference was caused by the switch to DotNet controls, which handle double-clicks as checked events, changing the checkbox state in front of the error instead.
Updating the Error Report
You can rerun/update the Sanity Check in one of the following ways:
• Clicking the dialog anywhere outside of the two message areas will rerun the Sanity Check and update all messages.
• Double-clicking any Message in the Feedback Messages window will rerun the Sanity Check and update all messages.
• Repairing an error will also automatically rerun the Sanity Check.
• Pressing the Run Now! button in the Submit To Deadline dialog will update the Sanity Check.
The following Sanity Checks have been implemented in the current version:


• FATAL: “The scene does not contain ANY objects!”
Description: The scene is empty and should not be sent to Deadline.
Fix: Load a valid scene or create/merge objects, then try again.
• Can Be Fixed: “The current Scene Name is Untitled.”
Description: The scene has never been saved to a MAX file. While it is possible to submit an untitled scene to Deadline, it is not a good practice.
Fix: Double-click the error message to open a Save As dialog and save to disk.
• Can Be Fixed: “The current view is NOT a camera.”
Description: The active viewport is not a camera viewport.
Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.
• Can Be Fixed: “The current Camera has NO MultiPass Motion Blur.”
Description: The active viewport is a camera, but the camera has no MultiPass Motion Blur enabled. NOTE that this check is performed for the Default Scanline Renderer and Brazil only, as Mental Ray, Entropy, Renderman and VRay usually do not require MultiPass Motion Blur.
Fix: Double-click the error message to enable MultiPass Motion Blur in the current camera.
• Can Be Fixed: “The current Camera’s MultiPass Motion Blur Duration is NOT 0.5!”
Description: By default, the MPass MBlur Duration value is 1.0. At our facility though, the default value is 0.5.
Fix: Double-click the error message to set the MultiPass Motion Blur Duration of the current camera to 0.5.
• Can Be Fixed: “The Render Time Output is set to SINGLE FRAME!”
Description: While it is ok to send a single frame to Deadline, users are sending animations in 99% of the cases.
Fix: Double-click the error message to set the Render Time Output to “Active Time Segment”. The Render Dialog will open so you can check the options and set to Range or Frames instead.
• Can Be Fixed: “The Render Output Path appears to point at a LOCAL DRIVE!”
Description: While it is technically possible to save locally on each Slave, this is a bad idea - all Slaves should write their output to a central location on the network. Currently, disks C:, D: and E: are considered local and will be tested against the output path.
Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.
• Can Be Fixed: “The Render Output File Name ends with a DIGIT - trailing numbers might fail.”
Description: The name to be saved to ends with one, two or three digits. Rendering to this file name will append 4 more digits and make loading sequential files in other applications hard or impossible. This check is performed only when the type is not AVI or MOV and will ignore 4 trailing digits, which will be replaced by 3dsmax correctly when rendering to sequential files.
Fix: Double-click the error message to add an underscore _ to the end of the file name; for example z:\temp\test123.tga will be changed to z:\temp\test123_.tga.
• Warning: “The Render Output Path is NOT DEFINED!”
Description: No frames will be saved to disk. This is allowed if you want to output render elements only.
Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.
• Warning: “The Render Output is set to a MOVIE format.”
Description: The file extension is set to an AVI or MOV format. In the current version of Deadline, this would result in a sequence of single frame MOV files rendered by separate slaves. In the future, the behaviour might be changed to render a single MOV or AVI file on a single slave as one Task.
Fix: Double-click the error message to open the Render Dialog and select a single frame output format, then double-click again to retest.

This list will be extended to include future checks and can be edited by 3rd parties by adding new definitions and functions to the original script. Documentation on extending the script will be published later. Please email suggestions for enhancements and additional test cases to Deadline Support.

9.1.2 Plug-in Configuration

You can configure the 3ds Command plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the 3ds Command plug-in from the list on the left.

Render Executables
• 3ds Max Cmd Executable: The path to the 3dsmaxcmd.exe executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.
Render Options
• 3ds Cmd Verbosity Level: The verbose level (0-5).
VRay DBR and Mental Ray Satellite Rendering
• Use IP Addresses: If offloading a VRay DBR or Mental Ray Satellite render to Deadline, Deadline will update the appropriate config file with the host names of the machines that are running the VRay Spawner or Satellite service. If this is enabled, the IP addresses of the machines will be used instead.

9.1.3 Integrated Submission Script Setup

The following procedure describes how to install the integrated Autodesk 3ds Command submission script. The integrated submission script allows for submitting 3ds Command Line render jobs to Deadline directly from within

the 3ds Max editing GUI. The integrated render job submission script and the following installation procedure have been tested with 3ds Max versions 2010 and later (including Design editions).
Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work. However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this patch, it means that you must submit your 3ds Max 2012 jobs from the Monitor.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at /submission/3dsmax/Installers
Manual Installation of the Submission Script
• Copy [Repository]/submission/3dsCmd/Client/Deadline3dsCmdClient.mcr to [3ds Max Install Directory]/MacroScripts. If you don’t have a MacroScripts folder in your 3ds Max install directory, check to see if you have a UI/Macroscripts folder instead, and copy the Deadline3dsCmdClient.mcr file there if you do.
• Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms

9.1.4 FAQ

Which versions of 3ds Max are supported?
The 3dsCommand plugin has been tested with 3ds Max 2010 and later (including Design editions). Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work. However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this patch, it means that you must submit your 3ds Max 2012 jobs from the Monitor.
When should I use the 3dsCommand plugin to render Max jobs instead of the original?
This plugin should only be used when a particular feature doesn’t work with our normal 3dsmax plugin. For example, there was a time when using the 3dsCommand plugin was the only way to render scenes that made use of VRay’s Frame Buffer features. Note that the 3dsCommand plugin has fewer features in the submission dialog, and the error handling isn’t as robust. In addition, using 3dsCommand causes 3ds Max to take extra time to start up because 3dsmaxcmd.exe needs to be launched for each task, so renders might take a little extra time to complete.
Is PSoft’s Pencil+ render effects plugin supported?
Yes. Ensure the render output and render element output directory paths all exist on the file server before rendering commences. Please note at least Pencil+ v3.1 is required if you are using the alternative 3dsmax (Lightning) plugin in Deadline. Note, you will require the correct network render license from PSoft for each Deadline Slave, which is not the same as the full workstation license of Pencil+.

9.1.5 Error Messages And Meanings

This is a collection of known 3ds Command error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.


9.2 3ds Max

9.2.1 Job Submission

You can submit jobs from within 3ds Max after installing the integrated Submit Max To Deadline (SMTD) script, or you can submit them from the Monitor. The instructions for installing the integrated SMTD script can be found further down this page. You can also submit jobs from within RPManager, the Render Pass Manager for 3ds Max. The instructions for installing the integrated submitter for RPManager can also be found further down the page. To submit from within 3ds Max, select the Deadline menu item that you created during the integrated submission script setup.



If you are submitting from RPManager, just select the Network tab in RPManager after setting up the integrated submitter.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The 3ds Max specific options are as follows.
Scene File Submission Options
• SAVE and Submit Current Scene File with the Job to the REPOSITORY: The current scene will be saved to a temporary file which will be sent with the job and will be stored in the Job’s folder in the Repository.
• SAVE and Submit Current Scene File to GLOBAL NETWORK PATH: The current scene will be saved to a temporary file which will be copied to a Globally-Defined Alternative Network Location (e.g. a dedicated file server). It is specified in [Repository]\submission\3dsmax\SubmitMaxToDeadline_Defaults.ini under [GlobalSettings] as the SubmitSceneGlobalBasePath key (see the example snippet after this list). It will be referenced by the Job via its path only. This will reduce the load on the Repository server.
• SAVE and Submit Current Scene File to USER-DEFINED NETWORK PATH: The current scene will be saved to a temporary file which will be copied to a User-Defined Alternative Network Location (e.g. a dedicated file


server) stored as a local setting. It will be referenced by the Job via its path only. This will reduce the load on the Repository server.
• DO NOT SAVE And Use Current Scene’s ORIGINAL NETWORK PATH: The current scene will NOT be saved, but the original file it was opened from will be referenced by the job. Assuming the file resides on a dedicated file server, this will speed up submission and rendering significantly, but current changes to the scene objects will be ignored.
Sanity Check
• Run Sanity Check Automatically Before Submission: This option forces Submit To Deadline to perform a Sanity Check before submitting the job. The Sanity Check is implemented as a separate set of scripted functions which can be enhanced by 3rd parties to meet specific studio needs. For more information, please refer to the Sanity Check section.
• Run Sanity Check Now!: This button performs a Sanity Check without submitting a job. Any potential problems will be reported and can be fixed before actually submitting the job.
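The global network path referenced above is read from SubmitMaxToDeadline_Defaults.ini. A minimal illustrative snippet is shown below; the section and key names come from this page, while the server path is only a placeholder and any other keys in that file are left untouched:

    [GlobalSettings]
    SubmitSceneGlobalBasePath=\\fileserver\projects\deadline_scenes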

Job Tab

Job Options

• Render Task Chunk Size (Frames Per Task): Defines the number of Tasks (Frames) to be processed at once by a Slave.
• Limit Number of Machines Rendering Concurrently: When checked, only the number of Slaves specified by the [Machines] value will be allowed to dequeue the job. When unchecked, any number of Slaves can work on the job.
• Machines: Defines the number of Slaves that will be allowed to dequeue the job at the same time.
• Out-Of-Order Rendering Every Nth Frame: Deadline will render every Nth frame based on the order selected in the drop down box. This option can be very useful when rendering long test animations - you can render a rough animation containing every Nth frame early enough to detect any major issues before all frames have been rendered, or in cases where the major action happens in the end of the sequence, reverse the rendering order.
• Log: Print Frame Sequence to the Log File, then double-click the feedback window to open the Log and Copy & Paste into the Monitor > Job’s Frame Range.


• Render Preview Job First: When the checkbox is checked, two jobs will be submitted. The first job will have [PREVIEW FRAMES] added to its name, have a priority of 100, and will render only N frames based on the spinner’s value. The step will be calculated internally. If the spinner is set to 2, the first and the last frame will be rendered. With a value of 3, the first, middle and last frames will be rendered and so on. The second job will have [REST OF FRAMES] added to its name, will be DEPENDENT on the first job, and will start rendering once the preview frames job has finished. It will have the priority specified in the dialog, and render all frames not included in the preview job.
• Priority+: Defines the Priority Increase for the PREVIEW job. For example, if the Job Priority is set to 50 and this value is +5, the PREVIEW job will be submitted with a Priority of 55 and the REST job with 50.
• Dependent: When checked, the [REST OF FRAMES] Job will be made dependent on the [PREVIEW FRAMES] Job. When unchecked, the [REST OF FRAMES] Job will use the same dependencies (none or custom) as the [PREVIEW FRAMES] Job.
• Frames: Defines the number of frames to be submitted as a PREVIEW job. The frames will be taken at equal intervals; for example, a value of 2 will send the first and last frames, a value of 3 will send the first, middle and last frames, and so on.
• Task Timeout: When checked, a task will be requeued if it runs longer than the specified time. This is useful when the typical rendering time of the job is known from previous submissions and will prevent stalling.
• Enable Auto Task Timeout: Enables the Auto Task Timeout option.
• Restart 3ds Max Between Tasks: When unchecked (default), 3ds Max will be kept in memory for the duration of the given job’s processing. This can reduce render time significantly as multiple Tasks can be rendered in sequence without reloading 3ds Max. When checked, 3ds Max will be restarted between tasks, thus releasing all memory and resetting the scene settings at the cost of startup time.
• Enforce Sequential Rendering: When checked, the Tasks will be processed in ascending order in order to reduce the performance hit from History-Dependent calculations, for example from particle systems. When unchecked, Tasks can be picked up by Slaves in any order. Recommended for Particle Rendering.
• Submit Visible Objects Only: This option should be used at your own risk, as it is heavily dependent on the content of your scene. In most cases, it can be used to submit only a subset of the current scene to Deadline, skipping all hidden objects that would not render anyway. This feature will be automatically disabled if the current scene contains any Scene XRefs. The feature will create an incorrect file if any of the scene objects depend INDIRECTLY on hidden objects.
• Concurrent Tasks: Defines the number of Tasks a single Slave can pick up at once (by launching multiple instances of 3ds Max on the same machine). Note that only one Deadline license will be used, but if rendering in Workstation Mode, multiple licenses of 3ds Max might be required. This is useful to maximize performance when the tasks don’t saturate all CPUs at 100% and don’t use up all memory. Typically, as a rule of thumb, this feature is NOT required as 3ds Max uses 100% of the CPUs during rendering.
• Limit Tasks To Slave’s Task Limit: When checked, the number of Concurrent Tasks will be limited by the Slave’s Task Limit, which is typically set to the number of available CPUs. For example, if Concurrent Tasks is set to 16 but a Slave has 8 cores, only 8 concurrent tasks will be processed.
• On Job Completion: Defines the action to perform when the job has completed rendering successfully. The job can be either left untouched, ARCHIVED to improve Repository performance, or automatically DELETED from the Repository.
• Submit Job As Suspended: When checked, the Job will be submitted to the Repository as Suspended. It will require manual user intervention before becoming active.
• Force 3ds Max Build: This drop-down list allows you to specify which build of 3ds Max (32 bit vs. 64 bit) to use when rendering the job. The list will be greyed out when running in 3ds Max 8 or earlier.
• Make Force 3ds Max Build Sticky: When the checkbox is unchecked, the “Force 3ds Max Build” drop-down list selection will NOT persist between sessions and will behave as documented above in the “Default” section.


When the checkbox is checked, the “Force 3ds Max Build” drop-down list selection will persist between sessions. For example, if you are submitting from a 64 bit build of 3ds Max to an older network consisting of only 32 bit builds, you can set the drop-down list to “32bit” once and lock that setting by checking “Make Force 3ds Max Build Sticky”.
Job Dependencies
When the checkbox is checked and one or more jobs have been selected from the multi-list box, the job will be set to the Pending state and will start rendering when all jobs it depends on have finished rendering. Use the Get Jobs List button to populate the Job List and the Filter options with job data from the Repository.

RPM Pass Dependencies - Global Setup
This option is ONLY available when submitting jobs from RPManager. If enabled, all passes that are submitted will be dependent on the passes selected in this rollout.


Job Scheduling
Enable job scheduling. See the Scheduling section of the Modifying Job Properties documentation for more information on the available options.

Job Failure Detection
Override the job failure detection settings. See the Failure Detection section of the Modifying Job Properties documentation for more information on the available options.

Render Tab

3ds Max Rendering


• Use Alternate Plugin.ini file: By default, 3ds Max will launch using the default plugin.ini file in the local installation. You can use this option to select an alternative plugin.ini file to use instead. Alternative plugin.ini files can be added to [Repository]\plugins\3dsmax, and then they will appear in the drop down box in the submitter (see the Custom Plugin.ini File Creation section for more information). If you have the [Default] option selected, it is equivalent to having this feature disabled.
• Fail On Black Frames: This option can be used to fail the render if a certain portion of the output image or its render elements is black. The Black Pixel % defines the minimum percentage of the image’s pixels that must be black in order for the image to be considered black. If each of R, G and B is less than or equal to the Threshold, and the alpha is not between the Threshold and (1.0 - Threshold), then the pixel is considered black. If the Threshold is greater than or equal to 0.5, then the alpha value has no effect.
• Override Bitmap Pager Setting While Rendering: You can specify if you want the 3dsmax Bitmap Pager setting to be enabled or disabled.
• Submit External Files With Scene: Whether the external files (bitmaps, xrefs etc.) will be submitted with the scene or not.
• Merge Object XRefs: If object XRefs will be merged during submission.
• Merge Scene XRefs: If scene XRefs will be merged during submission.
• Force 3dsmax Workstation Mode (Uses up a 3dsmax License): Used mainly for testing and debugging purposes and should be left unchecked. When this option is unchecked, 3ds Max will be started in Slave mode without the User Interface, which does not require a 3ds Max license. When checked, 3ds Max will be launched in full Interactive mode and will require a license. Note that Workstation mode is set automatically when submitting MAXScripts to Deadline.
• Enable Silent Mode: This option is only available when Force Workstation Mode is checked. It can help suppress some popups that 3ds Max displays (although some popups like to ignore this setting).
• Ignore Missing External File Errors: Missing external files could mean that the 3ds Max scene will render incorrectly (with textures missing etc). In some cases though, missing external files could be ignored - for example, if the job is meant for test rendering only. If you want the job to fail if a missing external resource is detected, uncheck this checkbox.


• Ignore Missing UVW Errors: Missing UVWs could mean that some 3ds Max objects would render incorrectly (with wrong texture mapping etc). In some cases though, missing UVWs could be ignored (for example, if the job is meant for test rendering).
• Ignore Missing XRef Errors: Missing XRefs could mean that the 3ds Max scene cannot be loaded correctly. In some cases though, missing XRefs could be ignored. If you want the job to fail if a missing XRef message is detected at startup, keep this checkbox unchecked.
• Ignore Missing DLL Errors: Missing DLLs could mean that the 3ds Max scene cannot be loaded or rendered correctly. In some cases though, missing DLLs could be ignored. If you want the job to fail if a missing DLL message is detected at startup, keep this checkbox unchecked.
• Do Not Save Render Element Files: Enable this option to have Deadline skip the saving of Render Element image files during rendering (the elements themselves are still rendered).
• Show Virtual Frame Buffer: If checked, the 3ds Max frame buffer will be displayed on the slave during rendering.
• Disable Progress Update Timeout: Enable this option to disable progress update checking. This is useful for renders like Fume FX sims that don’t constantly supply progress to 3dsmax.
• Disable Frame Rendering: Enable this option to skip the rendering process. This is useful for renders like Fume FX sims that don’t actually require any rendering.
• Restart Renderer Between Frames: This option can be used to force Deadline to restart the renderer after each frame to avoid some potential problems with specific renderers. Enabling this option has little to no impact on the actual render times. This feature should be ENABLED to resolve VRay renders where typically the beauty pass renders correctly but the Render Elements are all black or perhaps seem to be swapped around. When enabled, the C++ Lightning plugin (unique to Deadline) will unload the renderer plugins and then reload them instantly. This has the effect of forcing a memory purge and helps to improve renderer stability, as well as ensure the lowest possible memory footprint. This can be helpful when rendering close to the physical memory limit of a machine. Ensure this feature is DISABLED if you are sending FG/LC/IM caching map type jobs to the farm, as the renderer will get reset for each frame and the FG/LC/IM file(s) won’t get incrementally increased with the additional data per frame.
• Disable Multipass Effects: Enable this option to skip over multipass effects if they are enabled for the camera to be rendered.
• VRay/Mental Ray DBR: Enable this option to offload a VRay or Mental Ray DBR render to Deadline. See the VRay/Mental Ray DBR section for more information.
• Job Is Interruptible: If enabled, this job will be cancelled if a job with higher priority is submitted to the queue.
• Apply Custom Material To Scene: If checked, all geometry objects in the scene will be assigned one of the user-defined materials available in the drop down box.
3ds Max Gamma Options

• Gamma Correction: Enable to apply gamma correction during rendering.
3ds Max Pathing Options


• Remove Filename Padding: If checked, the output filename will be (for example) "output.tga" instead of "output0000.tga". This feature should only be used when rendering single frames. If you render a range of frames with this option checked, each frame will overwrite the previously written frame.
• Force Strict Output Naming: If checked, the output image filename is automatically modified to include the scene's name. For example, if the scene name was myScene.max and the output image path was \\myServer\images\output.tga, the output image path would be changed to \\myServer\images\myScene\myScene.tga. If the new output image path doesn't exist, it is created by the 3dsmax plugin before rendering begins.
• Purify Filenames: If checked, all render output including Render Elements will be purged of any illegal characters as defined by "PurifyCharacterCodes" in the "SubmitMaxToDeadline_Defaults.ini" file.
• Force Lower-Case Filenames: If checked, all render output including Render Elements will be forced to have a lowercase filename.
• Update Render Elements' Paths: Each Render Element has its own output path which is independent from the render output path. When this option is unchecked, changing the output path will NOT update the Render Elements' paths, and the Elements could be written to the wrong path, possibly overwriting existing passes from a previous render. When checked, the paths will be updated to point at sub-folders of the current Render Output path with names based on the name and class of the Render Element. The actual file name will be left unchanged.
• Also Update RE's Filenames: If enabled, the Render Element file names will also be updated along with their paths.
• Include RE Name in Paths: If enabled, the new Render Element files will be placed in a folder that contains the RE name.
• Include RE Name in Filenames: If enabled, the new Render Element files will contain the RE name in the file name.
• Include RE Type in Paths: If enabled, the new Render Element files will be placed in a folder that contains the RE type.
• Include RE Type in Filenames: If enabled, the new Render Element files will contain the RE type in the file name.
• Permanent RE Path Changes: When this checkbox is checked and the above option is also enabled, changes to the Render Elements' paths will be permanent (in other words, after the submission all paths will point at the new locations created for the job). When unchecked, the changes will be performed temporarily during the submission, but the old path names will be restored right after the submission.
• Rebuild Render Elements: If checked, Render Elements will be automatically removed and rebuilt during submission to try to work around known 3ds Max issues.


• Include Local Paths With Job: (Thinkbox internal use only) Currently not hooked up to any functionality.
• Use Alternate Path: Allows you to specify an alternate path file in the MXP format that the slaves can use to find bitmaps that are not found on the primary map paths.
Render Output

• Save File: Specify the render output. Note that this updates the 3ds Max Render Output dialog, and is meant as a convenience to update the output file.
Autodesk ME Image Sequence (IMSQ) Creation
• Create Image Sequence (IMSQ) File: If checked, an Autodesk IMSQ file will be created from the output files at the output location.
• Copy IMSQ File On Completion: If checked, the IMSQ file will be copied to the location specified in the text field.

Options Tab

User Options

• Enable Local Rendering: If checked, Deadline will render the frames locally before copying them over to the final network location.
• One Cpu Per Task: Forces each task of the job to only use a single CPU. This can be useful when doing single threaded renders and the Concurrent Tasks setting is greater than 1.
• Automatically Update Job Name When Scene File Name Changes: If checked, the Job Name setting in the submission dialog will automatically match the file name of the scene loaded. So if you load a new scene, the Job Name will change accordingly.


• Override Renderer's Low Priority Thread Option (Brazil r/s, V-Ray): When checked, the Low Priority Thread option of the renderers supporting this feature will be forced to false during the submission. Both Brazil r/s and V-Ray provide a feature to launch the renderer in a low priority thread mode. This is useful when working with multiple applications on a workstation and the rendering should continue in the background without consuming all CPU resources. When submitting a job though, this should generally be disabled since we want all slaves to work at 100% CPU load.
• Clear Material Editor In The Submitted File: Clears the material editor in the submitted file during submission.
• Delete Empty State Sets In The Submitted File: Deletes any empty State Sets in the submitted file during submission and resets the State Sets dialog/UI. This works around an Autodesk bug when running 3ds Max as a service.
• Warn about Missing External Files on Submission: When checked, a warning will be issued if the scene being submitted contains any missing external files (bitmaps, etc.). Depending on the state of the 'Ignore Missing External File Errors' checkbox under the Render tab, such files might not cause the job to fail, but could cause the result to look wrong. When unchecked, scenes with missing external files will be submitted without any warnings.
• Warn about Copying External Files with Job: A warning is only issued if the file count is greater than 100 or the total size is greater than 1024 MB. Both values can be configured to a studio's needs.
• Override 3ds Max Language: If enabled, you can choose a language to force during rendering.
Export Renderer-Specific Advanced Settings

If this option is enabled for a specific renderer, you will be able to modify a variety of settings for that renderer after submission from the Monitor. To modify these settings from the Monitor, right-click on the job and select Modify Properties, then select the 3dsmax tab.
Submission Timeouts

• Job Submission Timeout in seconds: This value spinner defines how many seconds to wait for the external Submitter application to return from the Job submission before stopping the attempt with a timeout message.


• Quicktime Submission Timeout in seconds: This value spinner defines how many seconds to wait for the external Submitter application to return from the Quicktime submission before stopping the attempt with a timeout message.
• Data Collection Timeout in seconds: This value spinner defines how many seconds to wait for the external Submitter application to return from data collecting before stopping the attempt with a timeout message. Data collecting includes collecting Pools, Categories, Limit Groups, Slave Lists, Slave Info, Jobs, etc.

Limits Tab

Blacklist/Whitelist Slaves
Set the whitelist or blacklist for the job. See the Scheduling section of the Modifying Job Properties documentation for more information on the available options.

Limits


Set the Limits that the job requires. See the Scheduling section of the Modifying Job Properties documentation for more information on the available options.

StateSets Tab

Select the State Sets you want to submit to Deadline. This option is only available in 3ds Max 2012 (Subscription Advantage Pack 1) and later.


Shotgun/Draft Tab

Integration
The available Integration options are explained in the Draft and Integration documentation.

Draft Post-Render Processing
The available Draft/Integration options are explained in the Draft and Integration documentation.

Extra Info
These are some extra arbitrary properties that can be set for the job. Note that some of these are reserved when enabling the Shotgun or Draft settings.


Scripts Tab

Run Python Scripts

• Run Pre-Job Script: Specify the path to a Python script to execute when the job initially starts rendering.
• Run Post-Job Script: Specify the path to a Python script to execute when the job finishes rendering.
• Run Pre-Task Script: Specify the path to a Python script to execute before each task starts rendering.
• Run Post-Task Script: Specify the path to a Python script to execute after each task finishes rendering.


Run Maxscript Script

• Submit Script Job: This checkbox lets you turn the submission into a MAXScript job. When checked, the scene will NOT be rendered; instead, the specified MAXScript code will be executed for the specified frames. Options that collide with the submission of a MAXScript Job, like "Tile Rendering" and "Render Preview Job First", will be disabled or ignored.
• Single Task: This checkbox lets you run the MAXScript Job on one slave only. When checked, the job will be submitted with a single task specified for frame 1. This is useful when the script itself will perform some operations on ALL frames in the scene, or when per-frame operations are not needed at all. When unchecked, the frame range specified in the Render Scene Dialog of 3ds Max will be used to create the corresponding number of Tasks. In this case, all related controls in the Job tab will also be taken into account.
• Workstation Mode: This checkbox is a duplicate of the one under the Render tab (checking one will affect the other). MAXScript Jobs that require file I/O (loading and saving of 3ds Max files) or commands that require the 3ds Max UI to be present, such as manipulating the modifier stack, HAVE TO be run in Workstation mode (using up a 3ds Max license on the Slave). MAXScript Jobs that do not require file I/O or 3ds Max UI functionality can be run in Slave mode on any number of machines without using up 3ds Max licenses.
• New Script From Template: This button creates a new MAXScript without any execution code, but with all the necessary template code to run a MAXScript Job on Deadline.
• Pick Script: This button lets you select an existing script from disk to use for the MAXScript Job. It is advisable to use scripts created from the Template file using the "New Script From Template" button.
• Edit MAXScript File: This button lets you open the current script file (if any) for editing.
• Run Pre-Load Script: This checkbox lets you run a MAXScript specified in the text field below it BEFORE the 3ds Max scene is loaded for rendering by the Slave.
• Run Post-Load Script: This checkbox lets you run a MAXScript specified in the text field below it AFTER the 3ds Max scene is loaded for rendering by the Slave.


• Run Pre-Frame Script: This checkbox lets you run a MAXScript specified in the text field below it BEFORE the Slave renders a frame.
• Run Post-Frame Script: This checkbox lets you run a MAXScript specified in the text field below it AFTER the Slave renders a frame.
• Post-Submission Function Call: This field can be used by TDs to enter an arbitrary user-defined MAXScript expression (NOT a path to a script!) which will be executed after the submission has finished. This can be used to trigger the execution of user-defined functions or to press a button in a 3rd party script (see the sketch below for an example). In the screenshot, the expression presses a button in a globally defined rollout which is part of an in-house scene management script. If you want to execute a multi-line script after each submission, you could enter fileIn "c:\temp\somescript.ms" in this field and the content of the specified file will be evaluated. The content of this field is sticky and saved in the local INI file - it will persist between sessions until replaced or removed manually.
The MAXScript Job Template file is located in the Repository under \submission\3dsmax\MAXScriptJobTemplate.ms. When the button is pressed, a copy of the template file with a name pattern "MAXScriptJob_TheSceneName_XXXX.ms" will be created in the \3dsmax#\scripts\SubmitMaxToDeadline folder, where XXXX is a random ID and 3dsmax# is the name of the 3ds Max root folder. The script file will open in 3ds Max for editing. You can add the code to be executed in the marked area and save to disk. The file name of the new template will be set as the current MAXScript Job file automatically. If a file name is already selected in the UI, you will be prompted about replacing it first.
Deadline exposes an interface to MAXScript, which allows you to gather information about the job being rendered. See the MAXScript Interface documentation for the available functions and properties.
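As an illustration (not a definitive recipe), the Post-Submission Function Call expression below calls a hypothetical user-defined global function. MyPipelineNotify is a placeholder for whatever function an in-house script might define; maxFilePath and maxFileName are standard MAXScript globals. Only the expression itself (the second line) would be entered in the field:

-- hypothetical pipeline hook; MyPipelineNotify is not part of Deadline or 3ds Max
::MyPipelineNotify (maxFilePath + maxFileName)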

Tiles Tab

Tile & Region Rendering Options
• Region Rendering Mode: This drop-down list controls the various rendering modes:
– FULL FRAME Rendering, All Region Options DISABLED - this is the default mode of the Submitter. No region rendering will be performed and the whole image will be rendered.

– SINGLE FRAME, MULTI-REGION 'Jigsaw' Rendering - Single Job, Regions As Tasks - this mode allows one or more regions to be defined and rendered on one or more network machines. Each region can be optionally sub-divided into a grid of sub-regions to split between machines. The resulting fragments will then be combined into a new single image, or optionally composited over a previous version of the full image using DRAFT. This mode is recommended for large format single frame rendering. Note that the current frame specified by the 3ds Max TIME SLIDER will be rendered, regardless of the Render Dialog Time settings.
– ANIMATION, MULTI-REGION 'Jigsaw' Rendering - One Job Per Region, Frames As Tasks - this mode allows one or more regions to be defined and rendered on one or more network machines. Each region can be optionally sub-divided into a grid of sub-regions to split between machines. Each region can be optionally animated over time by hand or by using the automatic tracking features. The resulting fragments from each frame will then be combined into a new single image, or optionally composited over a previous version of the full image using DRAFT. This mode is recommended for animated sequences where multiple small portions of the scene are changing relative to the previous render iteration.
– SINGLE FRAME TILE Rendering - Single Job, Tiles As Tasks - this mode splits the final single image into multiple equally-sized regions (Tiles). Each Tile will be rendered by a different machine and the final image can be assembled either using DRAFT, or by the legacy command line Tile Assembler. This mode is recommended when the whole image needs to be re-rendered, but you want to split it between multiple machines.
– ANIMATION, TILE Rendering - One Job Per Tile, Frames As Tasks -


– 3DS MAX REGION Rendering - Single Job, Frames As Tasks -
• Cleanup Tiles After Assembly: When checked, the Tile image files will be removed after the final image has been assembled. Keep this unchecked if you intend to resubmit some of the tiles and expect them to re-assemble with the previous ones.
• Pixel Padding: Default is 4 pixels. This is the number of pixels to be added on each side of the region or tile to ensure better stitching through some overlapping. Especially when rendering Global Illumination, it might be necessary to render tiles with significant overlapping to avoid artifacts.
• Copy Draft Config Files To Output Folder: When checked, the configuration files for Draft Assembly jobs will be duplicated in the output folder(s) for archiving purposes. The actual assembling will be performed using the copies stored in the Job Auxiliary Files folder. Use this option if you want to preserve a copy next to the assembled frames even after the Jobs have been deleted from the Deadline Repository.
• Draft Assembly Job Error On Missing Tiles: When unchecked, missing region or tile fragments will not cause errors and will simply be ignored, leaving either black background or the previous image's pixels in the assembled image. When checked, the Assembly will only succeed if all requested input images have been found and actually put together.
• Override Pool, Group, Priority for Assembly Job: When enabled, the Assembly Pool, Secondary Pool, Group and Priority settings will be used for the Assembly Job instead of the main job's settings.
The output formats that are supported by the Tile Assembler jobs are BMP, DDS, EXR, JPG, JPE, JPEG, PNG, RGB, RGBA, SGI, TGA, TIF, and TIFF.
Jigsaw [Single-Frame | Animation] Multi-Region Rendering

This rollout contains all controls related to defining, managing and animating multiple regions for the 'Jigsaw' modes. The rollout title will change to include an ACTIVE: prefix and the "Single-Frame" or "Animation" token when the respective mode is selected in the Region Rendering Mode drop-down list (see above).
• UPDATE List: Press this button to refresh the ListView.
• LOAD/SAVE File...: Click to open a menu with the following options:
– LOAD Regions From Disk Preset File...: Selecting this option will open a file open dialog and let you select a previously saved Regions Preset. Any existing regions will be replaced by the ones from the file.
– MERGE Regions From Disk Preset File...: Selecting this option will open a file open dialog and let you select a previously saved Regions Preset. Any existing regions will be preserved, and the file regions will be appended to the end of the list.
– SAVE Regions To Disk Preset File...: Only enabled if there are valid regions on the list. When selected, a file save dialog will open and let you save the current regions list to a disk preset for later loading or merging in the same or different projects.
• GET From Camera...: If the current view is a Camera, a list of region definitions stored in the current view's Camera will be displayed, allowing you to replace the current region list with the stored one. If the current view is not a Camera view, a warning message will be shown asking you to select a Camera view. If the current view's Camera does not have any regions stored in it, nothing will happen.
• STORE In Camera...: If the current view is a Camera, a list of region definitions stored in the current view's Camera will be displayed, with the added option to Save New Preset... in a new "slot". Alternatively, you can select any of the previously stored "slots" to override or update. The Notes text specified in the Notes: field below will be used to describe the preset. Also, additional information including the number of regions, the user, machine name, date and time and the MAX scene name will be stored with the preset.
• Notes: Enter a description of the current Region set to be used when saving a Preset to disk or camera. When a preset is loaded, the field will display the notes stored with the preset.
• ADD New Region: Creates a new region and appends it to the list. If objects are selected in the scene, the region will be automatically resized to frame the selection. If nothing is selected, the Region will be set to the full image size.
• CREATE From...: Click to open a context menu with several multi-region creation options:
• Create from SCENE SELECTION...: Select one or more objects in the scene and pick this option to create one region for each object in the selection. Note that regions might overlap or be completely redundant depending on the size and location of the selected objects - use the OPTIMIZE options below to reduce them.
• Create from TILES GRID...: Pick this option to create one region for each tile specified in the Tiles rollout. For example, if Tiles in X is set to 4 and Tiles in Y is 3, 12 regions resembling the Tile Grid will be created. Note that once the regions are created, some of them can be merged together, while others can be subdivided or split as needed to distribute regions with different content and size to different machines, providing more flexibility than the original Tiles mode.
• Create from 3DS MAX REGION...: Create a region with the size specified by the 3ds Max Region gizmo.
• OPTIMAL FILL Of Empty Areas: After the grid is created, two passes are performed: first a Horizontal Fill where regions are merged horizontally to produce wider regions, then a Vertical Fill merging regions with shared horizontal edges. The result is the fewest tiles, equivalent to manually merging any neighbor tiles with shared edges in Maya Jigsaw. Thus, it is the top (recommended) option.
• HORIZONTAL FILL Of Empty Areas: After creating the grid, a pass is performed over all regions to find neighbors sharing vertical edges. When two regions share an edge and the same top and bottom corner, they get merged. This is equivalent to the Maya Jigsaw behavior, producing wider regions where possible, but leaving a lot of horizontal edges between tiles with the same width.
• VERTICAL FILL Of Empty Areas: After creating the grid, a pass is performed to merge neighboring regions sharing a horizontal edge with the same left/right corners. The result is the opposite of the Horizontal Fill - a lot of tall regions.


• GRID FILL Of Empty Areas: Takes the horizontal and vertical coordinates of all tiles and creates a grid that contains them all. No merging of regions will be performed.
• OPTIMIZE Regions, Overlap Threshold > 25%: Compare the overlapping of all highlighted regions and, if the overlapping area is more than 25% of the size of the smaller of the two, combine the two regions into a single region. Repeat for all regions until no overlapping can be detected.
• OPTIMIZE Regions, Overlap Threshold > 50%: Same as the previous option, but with a larger overlap threshold.
• OPTIMIZE Regions, Overlap Threshold > 75%: Same as the previous options, but with an even larger overlap threshold.
• Clone LEFT|RIGHT: Select a single region in the list and click with the Left Mouse Button to clone the region to the left, or the Right Mouse Button to clone to the right. The height will be retained. The width will be clamped automatically if the new copy is partially outside the screen.
• Clone UP|DOWN: Select a single region in the list and click with the Left Mouse Button to clone the region up, or the Right Mouse Button to clone down. The width will be retained. The height will be clamped automatically if the new copy is partially outside the screen.
• FIT to N Objects / Fit Padding Value: Highlight exactly one region in the list and select one or more objects in the scene, then click with the Left Mouse Button to perform a precise vertex-based Fit to the selection, or click with the Right Mouse Button to perform a quick bounding-box based Fit to the selection. Click the small button with the number to the right to select the Padding Percentage to use when fitting in either mode.
• TRACK Region...: Left-click to open the Track dialog in Vertex-based mode for the currently selected region and scene objects. Right-click for Bounding Box-based mode. While you can switch the mode in the dialog itself, both the radio buttons and the Padding % values will be adjusted for faster access according to the mouse button pressed.
• SELECT | INVERT: Left-click to highlight all regions on the list. Right-click to invert the current selection.
• DELETE Regions: Click to delete the highlighted regions on the list.
• SET Keyframe: Highlight one or more regions and click this button to set a keyframe with the current region settings at the current time.
• << PREVIOUS Key: Click to change the time slider to the previous key of the highlighted region(s), in case there are such keys.
• NEXT Key >>: Click to change the time slider to the next key of the highlighted region(s), in case there are such keys.
• DELETE Keyframe: Click to delete the keys (if any) of the highlighted regions. If there is no key on the current frame, nothing will happen. Use in conjunction with Previous/Next Key navigation to delete actually existing keys.
• Regions ListView: The list view is the main display of the current region settings. It provides several columns and a set of controls under each column for editing the values on the list:
– On # column: Shows a checkbox to toggle a region on and off for rendering, and the index of the region.
– X and Y columns: These two columns display the coordinates of the upper left corner of the Region. Note that internally the values are stored in relative screen coordinates, but in the list they are shown in current output resolution pixel coordinates for convenience. Changing the output resolution in the Render Setup dialog and pressing the UPDATE List button will recalculate the pixel coordinates accordingly.
– Width and Height columns: These two columns display the width and height of the region in pixels. Like the upper left corner’s X and Y coordinates, they are stored internally as relative screen coordinates and are shown as pixels for convenience.


– Tiles column: Each region can additionally be subdivided horizontally and vertically into a grid of sub-tiles, each to be rendered by a different network machine. This column shows the number of tiles of the region; the default is 1x1.
– Keys column: This column shows the number of animation keys recorded for the region. By default regions have no animation keys and will show "–" in the column unless animated manually or via the Tracking option.
– Locked column: After Tracking, the region will be locked automatically to avoid accidental changes to its position and size. You can also lock the region manually if you want to prevent it from being moved accidentally.
– Notes column: This column displays auto-generated or user-defined notes for each region. When a region is created, it might be given a name based on the object it was fitted to, the original region it was cloned or split from, etc. You can enter descriptive notes to explain what every region was meant for.
• UNDO... / REDO...: Most operations performed in the Multi-Region rollout will create undo records automatically. The Undo buffer is saved to disk in a similar form as the presets, and you can undo or redo individual steps by left-clicking the button, or multiple steps at once by right-clicking and selecting from a list.
• HOLD: Not all operations produce a valid undo record. If you feel that the next operation might be dangerous, you can press the HOLD button to force the creation of an Undo record at the current point to ensure you can return to it in case the following operations don't produce desirable results.
• SPLIT To Tiles: Pressing this button will split the highlighted region into new regions according to the Tiles settings, assuming they are higher than 1x1 subdivisions. You can use this feature together with the Tiles controls to quickly produce a grid of independent regions from a single large region. For example, if you create a single region with no scene selection, it will have the size of the full screen. Enter Tile values like 4 and 3 and hit SPLIT To Tiles to produce a grid of 12 regions.
• MERGE Selected: Highlight two or more regions to merge them into a single region. The regions don't necessarily have to touch or overlap - the minimum and maximum extents of all regions will be found and they will be replaced by a single region with that position and size.
• Summary Field: This field displays information about the number of regions and sub-regions (tiles), the number of pixels to be rendered by these regions, and the percentage of pixels that would be rendered compared to the full image.
• Assemble Over... drop-down list: This list provides the assembly compositing options:
– Assemble Over EMPTY Background: The regions will be assembled into a new image using a black empty background with zero alpha.
– Compose Over PREVIOUS OUTPUT Image: The regions will be assembled over the previously rendered (or assembled) image matching the current output filename (if it exists). If such an image does not exist, the regions will be assembled over an empty background.
– Compose Over CUSTOM SINGLE Image: The regions will be assembled over a user-defined bitmap specified with the controls below. The same image will be used on all frames if an animation is rendered.
– Compose Over CUSTOM Image SEQUENCE: The regions will be assembled over a user-defined image sequence specified with the controls below. Each frame will use the corresponding frame from the image sequence.
• Pick Custom Background Image: Press this button to select the custom image or image sequence to be used in the last compositing modes above. Make sure you specify a network location that can be accessed by the Draft jobs on Deadline performing the Assembly!
[Single-Frame | Animation] Tile Rendering


• Tiles In X / Tiles In Y: These values specify the number of tiles horizontally and vertically. The total number of tiles (and jobs) to be rendered is calculated as X*Y and is displayed in the UI.
• Show Tiles In Viewport: Enables the tile display gizmo.
• Tile Pixel Padding: This value defines the number of pixels to overlap between tiles. By default it is set to 0, but when rendering Global Illumination, it might be necessary to render tiles with significant overlapping to avoid artifacts.
• Re-Render User-Defined Tiles: When checked, only user-defined tiles will be submitted for re-rendering. Use the [Specify Tiles To Re-render...] check-button to open a dialog and select the tiles to be rendered.
• Specify Tiles To Re-render: When checked, a dialog to select the tiles to be re-rendered will open. To close the dialog, either uncheck the button or press the [X] button on the dialog's title bar.
• Enable Blowup Mode: If enabled, tile rendering will work by zooming in on the region and rendering it at a smaller resolution. Then that region is blown up to bring it to the correct resolution. This has been known to help save memory when rendering large high resolution images.
• Submit All Tiles As A Single Job: By default, a separate job is submitted for each tile (this allows for tile rendering of a sequence of frames). For easier management of single frame tile rendering, you can choose to submit all the tiles as a single job.
• Submit Dependent Assembly Job: When rendering a single tile job, you can also submit a dependent assembly job to assemble the image when the main tile job completes.
• Use Draft For Assembly: If enabled, Draft will be used to assemble the images. Note that you'll need a Draft license from Thinkbox.
Region Rendering


When enabled, only the specified region will be rendered and, depending on the region type selected, it can be cropped or blown up as well. If the Enable Distributed Tiles Rendering checkbox is checked, it will be unchecked automatically. This option REPLACES the "Crop" option in the Render mode drop-down list in the 3ds Max UI. In other words, the 3ds Max option does not have to be selected for Region Rendering to be performed on Deadline. The region can be specified either using the CornerX, CornerY, Width and Height spinners, or by getting the current region from the active viewport. To do so, set the Render mode drop-down list to either Region or Crop, press the Render icon and drag the region marker to specify the desired size. Then press ESC to cancel and press the Get Region From Active View button to capture the new values.

Misc Tab

Quicktime Generation From Rendered Frame Sequence

Create a Quicktime movie from the frames rendered by a 3ds Max job. See the Quicktime documentation for more information on the available options.
Render To Texture

This option enables texture baking through Deadline. Use the Add, Remove, and Clear All buttons to add and remove objects from the list of objects to bake.
• One Object Per Task: If enabled, each RTT object will be allocated to an individual task, thereby allowing multiple machines to carry out RTT processing simultaneously.
Batch Submission

• Use Data from 3ds Max Batch Render: This checkbox enables Batch Submission using the 3ds Max Batch Render dialog settings. If checked, a single MASTER job will be sent to Deadline, which in turn will "spawn" all necessary BATCH jobs.
• Open Dialog: This button opens the 3ds Max Batch Render dialog in version 8 and higher.
• Update Info: This button reads the 3ds Max Batch Render dialog settings and displays the number of enabled vs. defined Views.

Sanity Check

The 3ds Max Sanity Check script defines a set of functions to be called to ensure that the scene submission does not contain typical errors like wrong render view and frame range settings, incorrect output path, etc. The Sanity Check is enabled by the Run Sanity Check Automatically Before Submission checkbox in the User Options group of controls in the Submit To Deadline (3dsmax) dialog. You can also run the Sanity Check manually by clicking the Run Now! button.


The dialog contains the following elements:
• The upper area (Error Report) lists the problems found in the current scene.
• The lower area (Feedback Messages) lists the actions the Sanity Check performs and gives feedback to the user. The latest message is always on top.
• Between the two areas, there is a summary text line listing the total number of errors and a color indicator of the current Sanity Check state. When red, the Sanity Check will not allow a job submission to be performed.
The Error Report
The left column of the Error Report displays a checkbox and the type of the error. The checkbox determines whether the error will be taken into account by the final result of the check. Currently, there are 3 types of errors:
• FATAL: The error cannot be fixed automatically and requires manual changes to the scene itself. A job submission with such an error would be pointless. The state of the checkbox is ignored and considered always checked.
• Can Be Fixed: The error can be fixed automatically or manually. If the checkbox is active, the error contributes to the final result. If unchecked, the error is ignored and handled as a warning.
• Warning: The problem might not require fixing, but could be of importance to the user. It is not taken into account by the final result (the state of the checkbox is ignored and considered always unchecked).
Repairing Errors
Right-clicking an Error Message in the Error Report window will cause an associated repair function to be executed and/or a Report Message to be output in the Feedback Messages window. This differs from the older double-click behavior because of the switch to DotNet controls, which handle double-clicks as checked events and change the checkbox state in front of the error instead.
Updating the Error Report
You can rerun/update the Sanity Check in one of the following ways:


• Clicking the dialog anywhere outside of the two message areas will rerun the Sanity Check and update all messages.
• Double-clicking any Message in the Feedback Messages window will rerun the Sanity Check and update all messages.
• Repairing an error will also automatically rerun the Sanity Check.
• Pressing the Run Now! button in the Submit To Deadline dialog will update the Sanity Check.
The following Sanity Checks have been implemented in the current version:


• FATAL: "The scene does not contain ANY objects!"
  Description: The scene is empty and should not be sent to Deadline.
  Fix: Load a valid scene or create/merge objects, then try again.
• Can Be Fixed: "The current Scene Name is Untitled."
  Description: The scene has never been saved to a MAX file. While it is possible to submit an untitled scene to Deadline, it is not a good practice.
  Fix: Double-click the error message to open a Save As dialog and save to disk.
• Can Be Fixed: "The current view is NOT a camera."
  Description: The active viewport is not a camera viewport.
  Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.
• Can Be Fixed: "The current Camera has NO MultiPass Motion Blur"
  Description: The active viewport is a camera, but the camera has no MultiPass Motion Blur enabled. NOTE that this check is performed for the Default Scanline Renderer and Brazil only, as Mental Ray, Entropy, Renderman and VRay usually do not require MultiPass Motion Blur.
  Fix: Double-click the error message to enable MultiPass Motion Blur in the current camera.
• Can Be Fixed: "The current Camera's MultiPass Motion Blur Duration is NOT 0.5!"
  Description: By default, the MPass MBlur Duration value is 1.0. At our facility though, the default value is 0.5.
  Fix: Double-click the error message to set the MultiPass Motion Blur Duration of the current camera to 0.5.
• Can Be Fixed: "The Render Time Output is set to SINGLE FRAME!"
  Description: While it is ok to send a single frame to Deadline, users are sending animations in 99% of the cases.
  Fix: Double-click the error message to set the Render Time Output to "Active Time Segment". The Render Dialog will open so you can check the options and set it to Range or Frames instead.
• Can Be Fixed: "The Render Output Path appears to point at a LOCAL DRIVE!"
  Description: While it is technically possible to save locally on each Slave, this is a bad idea - all Slaves should write their output to a central location on the network. Currently, disks C:, D: and E: are considered local and will be tested against the output path.
  Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.
• Can Be Fixed: "The Render Output File Name ends with a DIGIT - trailing numbers might fail."
  Description: The name to be saved to ends with one, two or three digits. Rendering to this file name will append 4 more digits and make loading sequential files in other applications hard or impossible. This check is performed only when the type is not AVI or MOV, and will ignore 4 trailing digits which will be replaced by 3dsmax correctly when rendering to sequential files.
  Fix: Double-click the error message to add an underscore _ to the end of the file name; for example, z:\temp\test123.tga will be changed to z:\temp\test123_.tga.
• Warning: "The Render Output Path is NOT DEFINED!"
  Description: No frames will be saved to disk. This is allowed if you want to output render elements only.
  Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.
• Warning: "The Render Output is set to a MOVIE format."
  Description: The file extension is set to an AVI or MOV format. In the current version of Deadline, this would result in a sequence of single frame MOV files rendered by separate slaves. In the future, the behaviour might be changed to render a single MOV or AVI file on a single slave as one Task.
  Fix: Double-click the error message to open the Render Dialog and select a single frame output format, then double-click again to retest.

This list will be extended to include future checks and can be edited by 3rd parties by adding new definitions and functions to the original script. Documentation on extending the script will be published later. Please email suggestions for enhancements and additional test cases to Deadline Support.

9.2.2 VRay/Mental Ray off-load DBR

You can offload a VRay or Mental Ray DBR job to Deadline by enabling the Distributed Rendering option in your VRay or Mental Ray settings, and by enabling the VRay/Mental Ray DBR checkbox in the submission dialog (under the Render tab). With this option enabled, a job will be submitted with its task count equal to the number of Slaves you specify, and it will render the current frame in the scene file.
The slave that picks up task 0 will be the "master", and will wait until all other tasks are picked up by other slaves. Once the other tasks have been picked up, the "master" will update its local VRay or Mental Ray config file with the names of the machines that are rendering the other tasks. It will then start the distributed render by connecting to the other machines. Note that the render will not start until ALL tasks have been picked up by a slave.
It is recommended to set up VRay DBR or Mental Ray DBR for 3ds Max and verify it is working correctly prior to submitting a DBR off-load job to Deadline. RTT (Render To Texture) is not supported with distributed bucket rendering. If multiple Deadline slaves run on one machine, it is not supported for two or more of those slaves to concurrently pick up different DBR jobs, whether as master or as slave.
Notes for VRay DBR:
• You MUST have the Force Workstation Mode option enabled in the submission dialog (under the Render tab). This means that the "master" will use up a 3ds Max license. If you don't want to use a 3ds Max license, you can submit to the 3ds Command plugin instead.
• Ensure VRay is the currently assigned renderer in the 3ds Max scene file prior to submission.
• You must have the Distributed Rendering option enabled in your VRay settings under the Settings tab.
• Ensure the "Save servers in the scene" ("Save hosts in the scene" in VRay v2) option in the VRay distributed rendering settings is DISABLED, as otherwise it will ignore the vray_dr.cfg file list!
• Ensure the "Max servers" value is set to 0. When set to 0, all listed servers will be used.
• It is recommended to disable the "Use local host" checkbox to reduce network traffic on the "master" machine when using a large number of slaves (5+). If disabled, the "master" machine only organises the DBR process, sending rendering tasks to the Deadline slaves. This is particularly important if you intend to use the VRay v3+ "Transfer missing assets" feature. Note that the Windows 7 OS has a limitation of a maximum of 20 other machines concurrently 'connecting' to the "master" machine.
• VRay v3.00.0x has a bug in DBR where, even with "Use local host" unchecked, it still demands a render node license. This is resolved in a newer version of VRay. Please contact Chaos Group for more information.
• The slaves will launch the VRay Spawner executable found in the 3ds Max root directory. Do NOT install the VRay Spawner as a service on the master or slave machines. Additionally, Drive Mappings are unsupported when running as a service.
• The vray_dr.cfg file in the 3ds Max plugcfg directory must be writeable so that the "master" machine can update it. This is typically located in the user profile directory, in which case it will be writeable already.
• Chaos Group recommend that each machine to be used for DBR has previously rendered at least one other 3ds Max job prior to trying DBR on the same machine.
• Ensure all slaves can correctly access any mapped drives or resolve all UNC paths to obtain any assets required by the 3ds Max scene file to render successfully. Use the Deadline Mapped Drives feature to ensure the necessary drive mappings are in place.
• Default lights are not supported by Chaos Group in DBR mode and will not render.


• Ensure you have sufficient VRay DR licenses if processing multiple VRay DBR jobs through Deadline concurrently. Use the Deadline Limits feature to limit the number of licenses being used at any time.
• Ensure the necessary VRay executables and TCP/UDP ports have been allowed to pass through the Windows Firewall. Please consult the VRay user manual for specific information.
• VRay does NOT currently support the ability in 3ds Max to dynamically add or remove DBR slaves from a DBR render that has already started on the "master" slave.
Notes for Mental Ray DBR:
• Ensure Mental Ray is the currently assigned renderer in the 3ds Max scene file prior to submission.
• You must have the Distributed Render option enabled in your Mental Ray settings under the Processing tab.
• The Mental Ray Satellite service must be running on your slave machines. It is installed by default during the 3ds Max 2014 or earlier installation. Note that Autodesk changed this default from 3ds Max 2015 onwards: the Mental Ray Satellite Service is installed as part of the install process but is NOT automatically started, so you will need to start it manually the very first time. See the AREA blog post about Distributed Bucket Rendering in 3ds Max 2015.
• The max.rayhosts file must be writeable so that the "master" machine can update it. Its location is different for different versions of 3ds Max:
– 2010 and earlier: It will be in the "mentalray" folder in the 3ds Max root directory.
– 2011 and 2012: It will be in the "mentalimages" folder in the 3ds Max root directory.
– 2013 and later: It will be in the "NVIDIA" folder in the 3ds Max root directory.
• Ensure the "Use Placeholder Objects" checkbox is enabled in the "Translator Options" rollout of the "Processing" tab. When placeholder objects are enabled, geometry is sent to the renderer only on demand.
• Ensure "Bucket Order" is set to "Hilbert" in the "Options" section of the "Sampling Quality" rollout of the "Renderer" tab. With Hilbert order, the sequence of buckets to render uses the fewest number of data transfers.
• Contour shading is not supported with distributed bucket rendering.
• Autodesk Mental Ray licensing in 3ds Max is restricted. Autodesk says "Satellite processors allow any owner of a 3ds Max license to freely use up to four slave machines (with up to four processors each and an unlimited number of cores) to render an image using distributed bucket rendering, not counting the one, two, or four processors on the master system that runs 3ds Max." Mental Ray Standalone licensing can be used to go beyond this license limit. Use the Deadline Limits feature to limit the number of licenses being used at any time if required.
• Ensure the necessary Mental Ray executables and TCP/UDP ports have been allowed to pass through the Windows Firewall. Please consult the Autodesk 3ds Max user manual for specific information.

9.2.3 Plug-in Configuration

You can configure the 3dsmax plugin settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the 3dsmax plugin from the list on the left.


3ds Max Render Executables
• 3ds Max Executable: The path to the 3ds Max executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.
3ds Max Design Render Executables
• 3ds Max Design Executable: The path to the 3ds Max Design executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.
Render Options
• Alternate Plugin ini File: Location of the alternate plugin ini file.
• Fail On Existing 3dsmax Process: Prevent Deadline from rendering when 3dsmax is already open.
• Run Render Sanity Check: If enabled, Deadline will do a quick sanity check with 3dsmaxcmd.exe prior to rendering to ensure 3dsmax is properly set up for network rendering.
• Kill ADSK Comms Center Process: If enabled, Deadline will kill the Autodesk Communications Center process if it's running during network rendering.
• Disable Saving Output To Alternate File Name: If enabled, Deadline won't try to rename the output file(s) if it is unable to save the output to its default file name.
Timeouts
• Timeout For Loading 3dsmax: Maximum time for 3dsmax to load, in seconds.
• Timeout For Starting A Job: Maximum time for 3dsmax to start a job, in seconds.
• Timeout For Progress Updates: Maximum time before a progress update times out, in seconds.
VRay DBR and Mental Ray Satellite Rendering


• Use IP Addresses: If offloading a VRay DBR or Mental Ray Satellite render to Deadline, Deadline will update the appropriate config file with the host names of the machines that are running the VRay Spawner or Satellite service. If this is enabled, the IP addresses of the machines will be used instead.

9.2.4 Firewall Considerations

Here is a non-exhaustive list of specific 3ds Max related application executables which should be granted access to pass through the Windows Firewall for all applicable policy scopes (Windows - domain, private, public) and both inbound and outbound rules (where [3dsMax install directory] is the 3ds Max install directory):
• [3dsMax install directory]/3dsmax.exe
• [3dsMax install directory]/3dsmaxcmd.exe
• [3dsMax install directory]/maxadapter.adp.exe
• [3dsMax install directory]/vrayspawnerYYYY.exe, where YYYY is the year, such as "2015" (only applicable if VRay is installed)
• [3dsMax install directory]/python/python.exe
• [3dsMax install directory]/python/pythonw.exe
Autodesk Communication Center (InfoCenter) path (dependent on the 3ds Max version being used):
• 3dsMax 2009-2010: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr1.exe
• 3dsMax 2011: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr\lib\WSCommCntr2.exe
• 3dsMax 2012: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr3\lib\WSCommCntr3.exe
• 3dsMax 2013-2015: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr4\lib\WSCommCntr4.exe
It is recommended to always start 3ds Max for the very first time with Administrative rights to ensure the application is fully initialized correctly. This can also be achieved by right-clicking the 3dsmax.exe application and selecting "Run as administrator".

9.2.5 Integrated Submission Script Setup

The following procedures describe how to install the integrated Autodesk 3ds Max submission script. The integrated submission script allows for submitting 3ds Max render jobs to Deadline directly from within the 3ds Max editing GUI. The integrated render job submission script and the following installation procedure have been tested with 3ds Max versions 2010 and later (including Design editions). You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at /submission/3dsmax/Installers
Manual Installation of the Submission Script
• Copy [Repository]/submission/3dsmax/Client/Deadline3dsMaxClient.mcr to [3ds Max Install Directory]/MacroScripts. If you don't have a MacroScripts folder in your 3ds Max install directory, check to see if you have a UI/Macroscripts folder instead, and copy the Deadline3dsMaxClient.mcr file there if you do.
• Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms.
• Launch 3ds Max, and find the new Deadline menu.


RPManager Script Setup

To install the 3ds Max integrated submission script in RPManager, just follow these steps:
• Create a Deadline7 folder in [maxdir]\scripts\RPManager
• Copy [repo]\submission\3dsmaxRPM\Client\Deadline3dsMaxRPMClient.ms to [maxdir]\scripts\RPManager\Deadline7\Deadline3dsMaxRPMClient.ms
• In RPManager, select Customize -> Preferences to open the preferences window
• In the Network Manager section, choose Custom Submit in the drop down, and then choose the Deadline3dsMaxRPMClient.ms script you copied over

• Click OK to close the preferences, and then click on the Network tab to see the submitter

9.2.6 Advanced Features For Technical Directors

MAXScript Interface

When running a MAXScript job through Deadline, there is an interface called DeadlineUtil which you can use to get information about the job being rendered. The API for the interface between MAXScript and Deadline is as follows:
Functions


• string GetAuxFilename( int index ): Gets the file with the given index that was submitted with the job.
• string GetJobInfoEntry( string key ): Gets a value from the plugin info file that was submitted with the job, and returns an empty string if the key doesn't exist.
• string GetOutputFilename( int index ): Gets the output file name for the job at the given index.
• string GetSubmitInfoEntry( string key ): Gets a value from the job info file that was submitted with the job, and returns an empty string if the key doesn't exist.
• int GetSubmitInfoEntryElementCount( string key ): If the job info entry is an array, this gets the number of elements in that array.
• string GetSubmitInfoEntryElement( int index, string key ): If the job info entry is an array, this gets the element at the given index.
• void FailRender( string message ): Fails the render with the given error message.
• void LogMessage( string message ): Logs the message to the slave log.
• void SetProgress( float percent ): Sets the progress of the render in the slave UI.
• void SetTitle( string title ): Sets the render status message in the slave UI.
• void WarnMessage( string message ): Logs a warning message to the slave log.
Properties
• int CurrentFrame: Gets the current frame.
• int CurrentTask: Gets the current task ID.
• string JobsDataFolder: Gets the local folder on the slave where the Deadline job files are copied to.
• string PluginsFolder: Gets the local folder on the slave where the Deadline plugin files are copied to.
• string SceneFileName: Gets the file name of the loaded 3ds Max scene.
• string SceneFilePath: Gets the file path of the loaded 3ds Max scene.
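To illustrate how these calls fit together, here is a minimal, hypothetical MAXScript sketch of the kind of code that could go in the execution area of a MAXScript Job (created via the New Script From Template button). It only uses the DeadlineUtil functions and properties listed above; the "WhoToGreet" key is a made-up example entry, not a standard Deadline key:

(
    -- log some information about the current job and task (illustrative sketch only)
    DeadlineUtil.LogMessage ("Rendering scene: " + DeadlineUtil.SceneFileName)
    DeadlineUtil.LogMessage ("Current frame: " + (DeadlineUtil.CurrentFrame as string))

    -- "WhoToGreet" is a hypothetical key that would have to exist in the plugin info file
    local greetTarget = DeadlineUtil.GetJobInfoEntry "WhoToGreet"
    if greetTarget == "" then greetTarget = "world"

    DeadlineUtil.SetTitle ("Greeting " + greetTarget)
    DeadlineUtil.SetProgress 50.0
    DeadlineUtil.LogMessage ("Hello, " + greetTarget + "!")
    DeadlineUtil.SetProgress 100.0
)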

Submitter’s Sticky Settings and Factory Defaults

The latest version of the Submit Max To Deadline script allows the user to control the stickiness of most User Interface controls and, in the case of non-sticky settings, the defaults to be used. In previous versions of SMTD, both the stickiness and the defaults were hard-coded.
Overview
Two INI files located in the Repository in the folder \submission\3dsmax control the stickiness and the defaults:
• SubmitMaxToDeadline_StickySettings.ini - this file can be used to define which controls in the SMTD UI will be stored locally in an INI file ("sticky") and which will be reset to defaults after a restart of the Submitter.
• SubmitMaxToDeadline_Defaults.ini - this file can be used to define the default settings of those controls set to non-sticky in the other file.
In addition, a local copy of the SubmitMaxToDeadline_StickySettings.ini file can be saved in a user's application data folder. This file will OVERRIDE the stickiness settings in the Repository and can contain a sub-set of the definitions in the global file.
Details
When SMTD is initializing, it will perform the following operations:
1. The SMTDSettings Struct will be initialized to the factory defaults of all settings.
2. Each UI setting will be initially assumed to be sticky.
3. The global Stickiness definition file is searched for a key matching the current UI setting's name.


• If the key is set to "false", the setting is not sticky.
• If the key is set to anything but "false", the setting is sticky.
• If the key does not exist, the stickiness still defaults to the initial value of true.
4. A local Stickiness definition file is searched for a key matching the current UI setting's name.
• If the key is set to "false", the setting is not sticky and overrides whatever was found in the global file.
• If the key is set to anything but "false", the setting is sticky, overriding whatever was found in the global file.
• If the key does not exist in the local file, the last known value (initial or from the global file) remains in power.
5. At this point, SMTD knows whether the setting is sticky or not. Now it gets the global default value:
• If a matching key exists in the file SubmitMaxToDeadline_Defaults.ini, the setting is initialized to its value.
• If no matching key exists in the global defaults file, the original factory default defined in the SMTDStruct definition will remain in power.
• If the setting is sticky, SMTD loads the last known value from the local INI file. If the value turns out to be invalid or not set, it uses the default instead.
• If the setting is not sticky, the default loaded from the global defaults file or, if no such default was loaded, the factory default will be assigned to the setting.
When the User Interface is created, the stickiness info from the local and global files will determine whether a star (*) character will be added to the control's name, reflecting the current stickiness settings.
Using this new feature, a facility can customize the submitter globally to default to the preferred settings and keep certain settings sticky so their values can be determined by the artists. In addition, single users can override the company-wide stickiness settings using a local file if they feel their workflow requires a different setup.

Custom Job Name Controls

There are two ways to customize the job name. You can use keys in the job name that are replaced with actual values (like $scene), or you can have the job name be generated from a list of shows, shots, etc. You will then be able to use the [>>] button to the right of the Job Name field to select these custom job names.

Generate Job Name From Keys

There is a file in the ..\submission\3dsmax\ folder in your Repository called SubmitMaxToDeadline_NameFormats.ini. In addition, a local copy of the SubmitMaxToDeadline_NameFormats.ini file can be saved in a user’s application data folder. This file will OVERRIDE the name formats in the Repository and can contain a sub-set of the definitions in the global file. This file will contain some key-value pairs such as:

$scene=(getfilenamefile(maxfilename))
$date=((filterstring (localtime) " ")[1])
$deadlineusername=(SMTDFunctions.GetDeadlineUser())
$username=(sysInfo.username)
$camera=(local theCam = viewport.getCamera() if (isValidNode theCam) then theCam.name else "")
$outputfilename=(if (rendOutputFilename != "") then (filenameFromPath rendOutputFilename) else "")
$outputfile=(if (rendOutputFilename != "") then (getFilenameFile rendOutputFilename) else "")
$outputtype=(if (rendOutputFilename != "") then (getFilenameType rendOutputFilename) else "")
$maxversion=(((maxVersion())[1]/1000) as string)
$frames=(if rendTimeType == 1 then ((currentTime.frame as integer) as string) else (if rendTimeType == 2 then (substituteString ((animationRange.start as string) + "-" + (animationRange.end as string) + "x" + (rendNthFrame as string)) "f" "") else (if rendTimeType == 3 then (substituteString ((rendStart as string) + "-" + (rendEnd as string) + "x" + (rendNthFrame as string)) "f" "") else (rendPickupFrames))))
$passname=(if RPMdata != undefined then ( if (RPMdata.GetPassCount() > 0) then (RPMdata.GetPassName ((RPMdata.GetPassSelection())[1])) else "No Passes Found" ) else "RPManager Not Found" )
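For instance, a facility could append its own key to this file. The $renderer key below is not one of the keys shipped with SMTD; it is shown only to illustrate the format, and any MAXScript expression that evaluates to a string will work:

$renderer=((classof renderers.current) as string)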


The key to the left of = is the string that will be replaced in the job name. The value to the right of the = is the MAXScript code that is executed to return the replacement string (note that the value must be returned as a string). So if you use $scene in your job name, it will be swapped out for the scene file name. You can append additional key-value pairs or modify the existing ones as you see fit. By default, the [>>] button will already have $scene or $outputfilename as selectable options.

You can then create an optional JobNames.ini file in the 3dsmax submission folder, with each line representing an option. For example:

$scene
$outputfilename
$scene_$camera_$username
$maxversion_$date

These options will then be available for selection in the submission dialog. This allows for all sorts of customization with regards to the job name.

Generate Job Name For Shows

This advanced feature allows the addition of custom project, sequence, shot and pass names to the [>>] list to the right of the Job Name field. Producers in larger facilities could provide full shot lists via a central set of files in the Repository, allowing users to pick existing shot names and ensuring consistent naming conventions independent from the 3ds Max scene naming.

To create a new set of files, go to the ..\submission\3dsmax\ folder in your Repository and create the following files:

Projects.ini - This file describes the projects currently available for Custom Job Naming. Each Project is defined as a Category inside this file, with two keys: Name and ShortName. For example:

[SomeProject]
Name=Some Project in 3D
ShortName=SP
[AnotherProject]
Name=Another Project
ShortName=AP

SomeProject.ini - This is a file whose name should match exactly the Category name inside the file Projects.ini and contains the actual sequence, shot and pass description of the particular project. One file is expected for each project definition inside the Projects.ini file. For example:

[SP_SS_010]
Beauty=true
Diffuse=true
Normals=true
ZDepth=true
Utility=true
[SP_SS_150]
Beauty=true
Diffuse=true
Utility=true
[SP_SO_020]
Beauty=true
[SP_SO_030]
Beauty=true


The Submitter will parse this file and try to collect the Sequences by matching the prefix of the shot names. For example, in the above file it will collect two sequences - SP_SS and SP_SO - then build a list of shots within each sequence, and a list of passes within each shot. When the [>>] button is pressed, the context menu will contain the name of each project and will provide a cascade of sub-menus for its sequences, shots and passes.

If you select the entry SomeProject > SP_SS > SP_SS_150 > Diffuse, the resulting Job Name will be “SP_SS_150_Diffuse”.

You can enter as many projects into your Projects.ini file as you want and provide one INI file for each project describing all its shots and passes. If an INI file is missing, no data will be displayed for that project.

Custom Comment Controls

Just like job names, you can use keys in the comment field that are replaced with actual values (like $scene).

There is a file in the ..\submission\3dsmax\ folder in your Repository called SubmitMaxToDeadline_CommentFormats.ini. In addition, a local copy of the SubmitMaxToDeadline_CommentFormats.ini file can be saved in a user’s application data folder. This file will OVERRIDE the comment formats in the Repository and can contain a sub-set of the definitions in the global file. This file will contain some key-value pairs such as:

$default=("3ds Max " + SMTDFunctions.getMaxVersion() + " Scene Submission")
$scene=(getfilenamefile(maxfilename))
$date=((filterstring (localtime) " ")[1])
$deadlineusername=(SMTDFunctions.GetDeadlineUser())
$username=(sysInfo.username)
$camera=(local theCam = viewport.getCamera() if (isValidNode theCam) then theCam.name else "")
$outputfilename=(if (rendOutputFilename != "") then (filenameFromPath rendOutputFilename) else "")
$outputfile=(if (rendOutputFilename != "") then (getFilenameFile rendOutputFilename) else "")
$outputtype=(if (rendOutputFilename != "") then (getFilenameType rendOutputFilename) else "")
$maxversion=(((maxVersion())[1]/1000) as string)


The key to the left of = is the string that will be replaced in the comment. The value to the right of the = is the MAXScript code that is executed to return the replacement string (note that the value must be returned as a string). So if you use $scene in your comment, it will be swapped out for the scene file name. You can append additional key-value pairs or modify the existing ones as you see fit. By default, the [>>] button will already have $default.

You can then create an optional Comments.ini file in the 3dsmax submission folder, with each line representing an option. For example:

$default
$scene
$outputfilename
$scene_$camera_$username
$maxversion_$date

These options will then be available for selection in the submission dialog. This allows for all sorts of customization with regards to the comment field.

Auto-Suggest Category and Priority Mechanism

This feature has been implemented to help Producers suggest categories and priorities based on Shot and Sequence signatures which are part of the 3ds Max Scene Name. This feature DOES NOT ENFORCE the Category and Priority for the job; it only suggests a value based on project guidelines - the Category and Priority can be changed manually after the suggestion.

To use this feature, you have to edit the file called “SubmitMaxToDeadline_CategoryPatterns.ms” located in the Repository in the \submission\3dsmax folder. As a shortcut, you can press the Edit Patterns... button in the Options tab of the Submitter - the file will open in the built-in MAXScript Editor.

The file defines a global array variable called SMTD_CategoryPatterns which will be used by the Submitter to perform pattern matching on the Job Name and try to find a corresponding Category and optionally a priority value in the array. The array can contain one or more sub-arrays, each one representing a separate pattern definition. Every pattern sub-array consists of four array elements:

• The first element is an array containing zero, one or more string patterns using * wildcards. These strings will be used to pattern match the Job Name. If it matches, it will be considered for adding to the Category and for changing the Priority. If the sub-array is empty, all jobs will be considered matching the pattern.
• The second element is also an array containing similar pattern strings. These strings will be used to EXCLUDE jobs matching these patterns from being considered for this Category and Priority. If the sub-array is empty, no exclusion matching will be performed.
• The third element contains the EXACT name (Case Sensitive!) of the category to be set if the Job Name matches the patterns. If the category specified here does not match any of the categories defined via the Monitor, no action will be performed.
• The fourth element specifies the Priority to give the job if it matches the patterns. If the value is -1, the existing priority will NOT be changed.

The pattern array can contain any number of pattern definitions. The higher a definition is on the list, the higher its priority - if a Job Name matches multiple pattern definitions, only the first one will be used. The pattern matching will be performed only if the checkbox Auto-Suggest Job Category and Priority in the Options Tab is checked. It will be performed when the dialog first opens or when the Job Name is changed.

An example:


• Let’s assume that a VFX facility is working on a project called “SomeProject” with multiple sequences labelled “AB”, “CD” and “EF”.
• The network manager has created categories called “SomeProject”, “AB_Sequence”, “CD_Sequence”, “EF_Sequence” and “High_Priority” via the Monitor.
• The Producers have instructed the Artists to name their 3ds Max files “SP_AB_XXX_YYY_”, where SP stands for “SomeProject” and “AB” is the label of the sequence, followed by the scene and shot numbers.
• Now we want to set up the Submitter to suggest the right Categories for all Max files sent to Deadline based on these naming conventions.
• We want jobs from the CD sequence to be set to a Priority of 60 unless they are from the scene with number “007”.
• We want jobs from the AB sequence to be set to a Priority of 50.
• We don’t want to enforce any priority for jobs from sequence EF.
• Also, we want shots from the “AB” sequence with scene number “123” and from the “EF” sequence with shot number “038” to be sent at highest priority and added to the special “High_Priority” category for easier filtering in the Monitor.
• Finally, we want to make sure that any SP project files that do not contain a sequence label are added to the general “SomeProject” category with lower priority.

To implement these rules, we could create the following definitions in the “SubmitMaxToDeadline_CategoryPatterns.ms” file - press the Edit Patterns... button in the Options tab to open it:

SMTD_CategoryPatterns = #(
    #(#("*AB_123*","*EF_*_038*"), #(), "High_Priority", 100),
    #(#("*AB_*"), #(), "AB_Sequence", 50),
    #(#("*CD_*"), #("*CD_007_*"), "CD_Sequence", 60),
    #(#("*EF_*"), #(), "EF_Sequence", -1),
    #(#("SP_*"), #(), "SomeProject", 30)
)

• The first pattern specifies that files from the “AB” sequence, scene “123”, and from the “EF” sequence, shot “038” (regardless of scene number), will be suggested as Category “High_Priority” and set Priority to 100.
• The second pattern specifies all AB jobs to have a priority of 50 and be added to Category “AB_Sequence”. Since the special case of AB_123 has been handled in the previous pattern, this will not apply to it.
• The third pattern sets jobs that contain “CD_” in their name but NOT the signature “CD_007_” to the “CD_Sequence” Category and sets the Priority to 60.
• The fourth pattern sets jobs that contain “EF_” in their name to the “EF_Sequence” Category but does not change the priority (-1).
• The fifth pattern specifies that any jobs that have not matched the above rules but still start with the “SP_” signature should be added to the “SomeProject” Category and set to a low priority of 30.

Note that since we used “*” instead of “SP_” at the beginning of the first 4 patterns, even if the job is not named correctly with the project prefix “SP_”, the pattern will still correctly match the job name.

Custom Plugin.ini File Creation

This section covers the Alternate Plugin.ini feature in the 3ds Max Rendering rollout (under the Render tab).

Alternate Plugin.ini File


The plugin.ini list will show a list of alternative plug-in configuration files located in the Repository. By default, there will be no alternative plugin.ini files defined in the repository. The list will show only one entry called [Default], which will cause all slaves to render using their own local plugin.ini configuration and is equivalent to having the Use Custom Plugin.ini File option unchecked.

To define an alternative plugin.ini, copy a local configuration file from one of the slaves to [Repository]\plugins\3dsmax in the repository. Edit the name of the file by adding a description to it. For example, plugin_brazil.ini, plugin_vray.ini, plugin_fr.ini, plugin_mentalray.ini, etc. Open the file and edit its content to include the plug-ins you want and exclude the ones you don’t want to use in the specific case (an illustrative sketch of the file format appears at the end of this section). The next time you launch Submit To Deadline, the list will show all alternative files whose names start with “plugin” and end with ”.ini”. The list will be alphabetically sorted, with [Default] always on top. You can then select an alternative plugin.ini file manually from the list.

Pressing the Edit Plugin.ini File button will open the currently selected alternative configuration file in a MAXScript Editor window for quick browsing and editing, except when [Default] is selected. Pressing the Browse Directory button will open Windows Explorer, taking you directly to the plug-ins directory containing the alternative plugin.ini files. Note that if you create a new plugin.ini file, you will have to restart the Submit To Deadline script to update the list.

Since the alternative plug-in configuration file is located in the Repository and will be used by all slave machines, the plug-in paths specified inside the alternative plugin.ini will be used as LOCAL paths by each slave. There are two possible installation configurations that would work with alternative plug-ins (you could mix the two methods, but it’s not recommended):

• Centralized Plug-ins Repository: In this case, all 3dsmax plug-ins used in the network are located at a centralized location, with all Slaves mapping a drive letter to the central plug-in location and loading the SAME copy of the plug-in. In this case, the alternative plugin.ini should also specify the common drive letter of the plug-in repository.
• Local Plug-ins: To avoid slow 3dsmax booting in networks with heavy traffic, some studios (including ones we used to work for) deploy local versions of the plug-ins. Every slave’s 3dsmax installation contains a full set of all necessary plug-ins (which could potentially be automatically synchronized to a central repository to keep all machines up-to-date). In this case, the alternative plugin.ini files should use the LOCAL drive letter of the 3dsmax installation, and all Slaves’ 3dsmax copies MUST be installed on the same partition, or at least have the plug-ins directory on the same drive, for example, “C:”.

Auto-Detect Plugin.ini For Current Renderer

When enabled, the following operations will be performed:

1. When you check the checkbox, the current renderer assigned to the scene will be queried.
2. The first 3 characters of the renderer’s name will be compared to a list of known renderers.
3. If the renderer is not on the list, the alternative list will be reset to [Default].
4. If the renderer is the Default Scanline Renderer of 3dsmax, the alternative list will be reset to [Default].
5. If the renderer is a known renderer, the plugin*.ini file that matches its name will be selected.
Supported renderers for auto-suggesting an alternative configuration are:

• Brazil: plugin*.ini should contain “brazil” in its name (i.e. plugin_brazil.ini, plugin-brazil.ini, plugin-brazil_1_2.ini, etc).
• Entropy: plugin*.ini should contain “entropy” in its name (i.e. plugin_entropy.ini, plugin-entropy.ini, pluginentropy.ini, etc).
• finalRender: plugin*.ini should contain “fr” or “final” in its name (i.e. plugin_fr.ini, plugin-finalrender.ini, plugin_finalRender_Stage1.ini, etc).


• MaxMan: plugin*.ini should contain “maxman” in its name (i.e. plugin_maxman.ini, plugin-maxman.ini, pluginmaxman001.ini, etc).
• mentalRay: plugin*.ini should contain “mr” or “mental” in its name (i.e. plugin_mr.ini, plugin-mentalray.ini, plugin_mental33.ini, etc).
• VRay: plugin*.ini should contain “vray” in its name (i.e. plugin_vray.ini, plugin-vray.ini, pluginvray109.ini, etc).

Notes:

• In 3dsmax 5 and higher, opening a MAX file while the Auto-Detect option is checked will trigger a callback which will perform the above check automatically and switch the plugin.ini to match the renderer used by the scene.
• In 3dsmax 6 and higher, changing the renderer via the “Current Renderers” rollout of the Render dialog will also trigger the auto-suggesting mechanism.
• You can override the automatic settings anytime by disabling the Auto-Detect option and selecting from the list manually.
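For reference, a 3ds Max plugin.ini is a standard INI file whose [Directories] section lists plug-in folders as “description=path” entries, one per line. The snippet below is only an illustrative sketch (the paths and the extra VRay entry are examples, not required values) - in practice, start from a copy of a working local plugin.ini as described above:

[Directories]
Standard MAX plug-ins=C:\Program Files\Autodesk\3ds Max 2015\stdplugs\
Additional MAX plug-ins=C:\Program Files\Autodesk\3ds Max 2015\plugins\
VRay plug-ins=C:\Program Files\Autodesk\3ds Max 2015\plugins\vrayplugins\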

9.2.7 FAQ

Which versions of 3ds Max are supported?

3ds Max versions 2010 and later are all supported (including Design editions). Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work. However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this patch, you must submit your 3ds Max 2012 jobs from the Deadline Monitor.

Which 3ds Max renderers are supported?

Deadline should already be compatible with all 3ds Max renderers, but it has been explicitly tested with Scanline, MentalRay, Brazil, VRay, Corona, finalRender, and Maxwell. If you have successfully used a 3ds Max renderer that is not on this list, please email Deadline Support.

Does Backburner need to be installed to render with Deadline?

Yes. Backburner installs the necessary files that are needed for command line and network rendering, so it must be installed to render with Deadline.

Does the 3ds Max plugin support Tile Rendering?

Yes. See the Tile Rendering section of the submission dialog documentation for more details.

Does the 3ds Max plugin support multiple arbitrary sized, multi-resolution Tile Rendering for both stills or animations and automatic re-assembly, including the use of multi-channel image formats and Render Elements (incl. VRay VFB specific image files)?

Yes. We call it ‘Jigsaw’ and it’s unique to the Deadline system! See the Tile Rendering section of the submission dialog documentation for more details.

Does the 3ds Max plugin support Batch Rendering?

Yes. See the Batch Rendering section of the submission dialog documentation for more details.

Is PSoft’s Pencil+ render effects plugin supported?

Yes. Please note that at least Pencil+ v3.1 is required to resolve an issue with the line element render element failing to be rendered. Note that you will require the correct network render license from PSoft for each Deadline Slave, or render with a Deadline Slave that already has a full workstation license of Pencil+ installed.


When I submit a render with a locked viewport, Deadline sometimes renders a different viewport.

Prior to the release of 3ds Max 2009, the locked viewport feature wasn’t exposed to the 3ds Max SDK, so it was impossible for Deadline to know whether a viewport is locked or not. Now that the feature has been exposed, we are working to improve Deadline’s locked viewport support. However, in the 3ds Max 2010 SDK, there is a bug that prevents us from supporting it completely (Autodesk is aware of this bug). As of 3ds Max 2015, this bug is now resolved. For earlier versions, we can only continue to recommend that users avoid relying on the locked viewport feature, and instead ensure that the viewport they want to render is selected before submitting the job. The SMTD sanity check continues to provide a warning for those versions of 3ds Max where the locked viewport SDK bug still exists.

When Deadline is running as a service, 3ds Max 2015 render jobs crash during startup.

This can happen if the new Scene (Content) Explorer is docked. This is a known issue with 3ds Max network rendering when it is launched by a program running as a service. See this AREA blog post about running 3ds Max 2015 as a service for a workaround and more information.

Can I mix 3ds Max and 3ds Max Design jobs in Deadline?

Yes. ADSK have introduced (April 2014) a new system environment variable you can set which will make all jobs from 3ds Max and 3ds Max Design appear as 3ds Max jobs: set “MIX_MAX_DESIGN_BB” to “1” to enable this feature. Note that Windows typically requires a machine restart or log-off/log-on for the new environment setting value to become available once set. ADSK have confirmed this works for 3ds Max 2015 and 3ds Max Design 2015 with Backburner 2015.0.1. It may also work with the 2014 SP5 version of 3ds Max and 3ds Max Design, with Backburner 2015.0.1. See this AREA blog post about mixing 3ds Max and 3ds Max Design on a render farm for more information. Note that Backburner Manager or Server are NOT required to make this system work in Deadline.

When I submit a render job that uses more than one default light, only one default light gets rendered.

The workaround for this problem is to add the default lights to the scene before submitting the job. This can be done from within 3ds Max by selecting Create Menu -> Lights -> Standard Lights -> Add Default Lights to Scene.

Is it possible to submit MAXScripts to Deadline instead of just a *.max scene?

Yes. Deadline supports MAXScript jobs from the Scripts tab in the submission dialog.

Does Deadline’s custom interface for rendering with 3ds Max use workstation licenses?

No. Deadline’s custom interface for rendering with 3ds Max does not use any workstation licenses when running on slaves. However, if you have the Force Workstation Mode option checked in the submission dialog, a workstation license will be used.

Slaves are rendering their first frame/tile correctly, but subsequent frames and render elements have problems or are rendered black.

Try enabling the option to “Restart Renderer Between Frames” in the submission dialog before submission, or in the job properties dialog after submission. We have found that this works 99% of the time in these cases. When enabled, the C++ Lightning plugin (unique to Deadline) will unload the renderer plugins and then reload them instantly. This has the effect of forcing a memory purge and helps to improve renderer stability, as well as ensure the lowest possible memory footprint. This can be helpful when rendering close to the physical memory limit of a machine. See the note below for when this feature should be disabled.

VRay Light-Cache / Irradiance Maps are not the correct file size or seem to be getting reset between incremental frames on Deadline but calculate correctly when executed locally.


Ensure the option “Restart Renderer Between Frames” is DISABLED if you are sending FG/LC/IM caching map type jobs to the farm. Otherwise the renderer will get reset for each frame, and the FG/LC/IM file(s) won’t get incrementally increased with the additional data per frame and will only contain the data from the last frame calculated (the resulting file size will be too small as well).

3dsMax Point Cache Files dropping geometry in renders randomly

Sometimes 3dsMax can drop point cache geometry in renders, in an almost random fashion that affects only certain rigs. Typically, but not exclusively, this happens on the 2nd assigned frame processed by a particular slave. Ensure the option “Restart Renderer Between Frames” is DISABLED in the submission dialog before submission, or in the job properties dialog after submission. We have found that this works 99% of the time in these cases.

When rendering with VRay/Brazil, it appears as if some maps are not being displayed properly.

Try enabling the option to “Restart Renderer Between Frames” in the submission dialog before submission, or in the job properties dialog after submission. We have found that this works 99% of the time in these cases.

Tile rendering with a Mental Ray camera shader known as “wraparound” results in an incorrect final image. How can I fix this?

This is another situation where enabling the option to “Restart Renderer Between Frames” in the submission dialog seems to fix the problem.

When tile rendering with a renderer that supports global/secondary illumination, I get bucket stamps (different lighting conditions in each tile) on the final image.

Try calculating the irradiance/final gather light caching map first in one pass at full resolution. Then perform your tile render on a scene that reads the irradiance/final gather map created at full resolution. If creating the map at full resolution is impossible, then you can create it in the tiles, but you need to make sure the tiles overlap each other (use Deadline’s tile/jigsaw padding to help here) and make sure to use the irradiance/final gather map method that appends to the map file. Alternatively, you could consider using the VRay/Mental Ray DBR off-load system to accelerate the calculation of the light caching map. In summary: you create (pre-calculate) the secondary/global illumination map first, then run the final render in tiles as a second job. Deadline job dependencies can be used here to release the second job as soon as the first job successfully completes the lighting pre-calculation.

Can I perform Distributed Bucket Rendering (DBR) with VRay or VRay RT?

Yes. A special ‘reserve’ job is submitted that will run the VRay Spawner/VRay Standalone process on the render nodes. Once the VRay Spawner/VRay Standalone process is running, these nodes will be able to participate in distributed rendering. Please see the VRay Distributed Rendering (DBR) Plug-in Guide for more information.

Can I fully off-load 3dsMax VRay or Mental Ray DBR rendering from my machine?

Yes, see the VRay/Mental Ray DBR section for more information. The advantages of off-loading a VRay DBR job fully from your workstation include releasing your local workstation to carry out other processing tasks and helping to accelerate the irradiance map/photon cache calculation process, as the VRay DBR system supports distributing this across multiple machines. A risk/disadvantage to this way of working is that if a single machine currently being used to calculate a DBR bucket crashes/fails for an unknown reason, then the whole process will fail at its current stage and start from the beginning again.

Can I Perform Fume FX Simulations With Deadline?

Yes. To do so, follow these steps:

1. Your render nodes need to have Fume FX licensed properly, either with “full” or “simulation” licenses. This requirement is the same as if you were rendering with Backburner.


2. Before you launch the 3dsmax submission script, make sure that the Fume FX NetRender toggle button is “ON” in the Fume FX options in 3dsmax.
3. Before you submit the job, make sure the “Disable Progress Update Timeout” option is enabled under the Render tab in the 3dsmax submission window.
4. Note that Fume FX uses its own frame range (in the Fume FX settings/prefs), so submit the Max scene file to Deadline as a single frame/task.

Can I force a render to use a specific language?

Yes, using the option located in the “User Options” tab of SMTD, or in the “Advanced Options” tab of the monitor submission (2013+ only). This will change the default on the machine it is rendered on to the chosen language. Note that the change is permanent on the machine until 3dsMax is restarted with a different language forced. You can manually force the language to be changed back via the language-specific shortcuts in the start menu, which effectively start 3dsMax with the language flag. In this example, EN-US (default) is forced:

"C:/Program Files/Autodesk/3ds Max 2015/3dsmax.exe" /Language=ENU

When submitting to Deadline, non-ASCII characters in output paths, camera names, etc, are not being sent to Deadline properly.

You need to enable the “Save strings in legacy non-scene files using UTF8” property in the Preference Settings in 3ds Max. After enabling this, the Deadline submission files will be saved as UTF8 and therefore non-ASCII characters will be saved properly. See the Character Encoding Defaults in 3ds Max section in the 3ds Max Character Encoding documentation for more information.

9.2.8 Error Messages and Meanings

This is a collection of known 3ds Max error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know.

Note that when an error occurs in a Max render, we parse the Max render log (Max.log) for any messages that might explain the problem and include them in the error message. Some examples are:

• ERR: An unexpected exception has occurred in the network renderer and it is terminating.
• ERR: Missing dll: BrMaxPluginMgr.dlu
• ERR: [V-Ray] UNHANDLED EXCEPTION: Preparing ray server Last marker is at .srcvrayrenderer.cpp

3dsmax startup: Error getting connection from 3dsmax: 3dsmax startup: Deadline/3dsmax startup error: lightningMax*.dlx does not appear to have loaded on 3dsmax startup, check that it is the right version and installed to the right place.

You likely need to install the appropriate Visual C++ Redistributable package, which can be done as part of the Client installation.


3dsmax startup: Error getting connection from 3dsmax: Monitored managed process “3dsmaxProcess” has exited or been terminated. Full error message:

3dsmax startup: Error getting connection from 3dsmax: Monitored managed process "3dsmaxProcess" has exited or been terminated.
2012/08/24 14:48:40 DBG: Starting network
2012/08/24 14:48:40 DBG: Calling NetRenderPreLoad
2012/08/24 14:48:40 DBG: in NetWorkerPreLoad. jobFile: ; jobname: C:\Users\root\AppData\Local\Thinkbox\Deadline\slave\Simul005\plugins\deadlineStartupMax2013.max
2012/08/24 14:48:40 DBG: in NetWorkerPreLoad. LoadLib() failed
2012/08/24 14:48:40 DBG: NetRenderPreLoad failed
2012/08/24 14:48:40 ERR: Error loading *.max file
2012/08/24 14:49:10 INF: SYSTEM: Production renderer is changed to Default Scanline Renderer. Previous messages are cleared.
2012/08/24 14:49:10 DBG: Stop network

This is a known issue with 3ds Max, and can occur when IPv6 is enabled on the render node. The issue can be fixed by disabling IPv6 on the machines, or by disabling the IPv6 to IPv4 tunnel. See this Area blog post about IPv6 errors for more information.

Could not delete old lightning.dlx... This file may be locked by a copy of 3ds max

Usually this is because a 3dsmax.exe process didn’t quit or get killed properly. Looking in Task Manager on the slaves reporting the message for a 3dsmax.exe process and killing it is the solution.

3dsmax crashed in GetCoreInterFace()->LoadFromFile()

There are a number of things that can be tried to diagnose the issue:

• Try opening the file on a machine where it crashed. You may already have done this.
• Try rendering a frame of it on a machine where it crashed, using the 3dsmaxcmd.exe renderer. This will make it open the file in slave mode and possibly give an idea of what’s failing.


• Submit the job to run in workstation mode. In workstation mode there’s often more diagnostic output. There’s a checkbox in the submission script for this.
• If you’re comfortable sending us the .max file which is crashing, we’d be happy to diagnose the issue here.
• Try stripping down the max file by deleting objects and seeing if it still crashes then.

Trapped SEH Exception in CurRendererRenderFrame(): Access Violation

An Access Violation means that when rendering the frame, Max either ran out of memory, or memory became corrupted. The stack trace in the error message usually shows which plugin the error occurred in. If that doesn’t help track down the issue, try stripping down the max file by deleting objects and seeing if the error still occurs.

3dsmax: Trapped SEH Exception in LoadFromFile(): Access Violation

An Access Violation means that when loading the scene, Max either ran out of memory, or memory became corrupted. The stack trace in the error message usually shows which plugin the error occurred in. If that doesn’t help track down the issue, try stripping down the max file by deleting objects and seeing if the error still occurs.

3dsmax: PNG Plugin: PNG Library Internal Error

3dsMax Render Elements can become corrupt or be placed in a bad state with regard to the image file format plugin being used to save each Render Element to your file server. This issue is not limited to the PNG file format (TGA, TIF) but is common. A known option, which has been known to fix the issue in most circumstances, is to rebuild the render elements by deleting and re-creating them in the 3dsmax scene file. This feature is automated in SMTD if you enable the checkbox “Rebuild Render Elements” under the “Render” tab -> “3ds Max Pathing Options”.

RenderTask: 3dsmax exited unexpectedly (it may have crashed, or someone may have terminated)

This generic error message means that max crashed and exited before the actual error could be propagated up to Deadline. Often when you see this error, it helps to look through the rest of the error reports for that job to see if they contain any information that’s more specific.

RenderTask: 3dsmax may have crashed (recv: socket error trying to receive data: WSAError code 10054)

This generic error message means that max crashed and exited before the actual error could be propagated up to Deadline. Often when you see this error, it helps to look through the rest of the error reports for that job to see if they contain any information that’s more specific.

3dsmax startup: Error getting connection from 3dsmax: 3dsmax startup: Deadline/3dsmax startup error: lightningMax*.dlx does not appear to have loaded on 3dsmax startup, check that it is the right version and installed to the right place.

This error is likely the side effect of another error, but the original error wasn’t propagated to Deadline properly. Often when you see this error, it helps to look through the rest of the error reports for that job to see if they contain any information that’s more specific.

3dsmax startup: Max exited unexpectedly. Check that 1) max starts up with no dialog messages and in the case of 3dsmax 6, 2) 3dsmaxcmd.exe produces the message ‘Error opening scene file: “”’ when run with no command line arguments

This message is often the result of an issue with the way Max starts up. Try starting 3ds Max on the slave machine that produced the error to see if it starts up properly. Also try running 3dsmaxcmd.exe from the command line prompt to see if it produces the message ‘Error opening scene file: “”’ when run with no command line arguments. If it doesn’t produce this message, there may be a problem with the Max install or how it’s configured. Sometimes reinstalling Max is the best solution.

The 3dsmax command line renderer, ...\3dsmaxcmd.exe, hung during the verification of the 3ds max install


Try running 3dsmaxcmd.exe from the command line prompt to see if it pops up an error dialog or crashes, which is often the cause of this error message. If this is the case, there may be a problem with the Max install or with how it is configured. Sometimes reinstalling Max is the best solution.

3dsmax: Failed to load max file: ”...”

There could be many reasons why Max would fail to load the scene file. Check for ERR or WRN messages included in the error message for information that might explain the problem. Often, this error is the result of a missing plugin or dll.

Error: “3ds Max The Assembly Autodesk.Max.Wrappers.dll encountered an error while loading”

This is a specific 3ds Max 2015 crash when you try to launch the program. Ensure you perform a Windows update and get the latest updates for Windows 7 or 8. Additionally, install the update for Autodesk 3ds Max 2015 Service Pack 1 and Security Fix. See this ADSK Knowledge post for more information.

Error message: 3dsmax adapter error : Autodesk 3dsMax 17.2 reported error: Could not find the specified file in DefaultSettingsParser::parse() ; Could not find the specified file in DefaultSettingsParser::parse() ;

The error “Could not find the specified file in DefaultSettingsParser::parse() ;” occurs if you don’t have the Populate Data installed on each of your Deadline Slave machines. To resolve the issue you need to ensure that the Populate Data is installed on all the render machines. You can run the 3dsMax_2015_PopulateData.msi installer from the “\x64\PDATA\” folder of the 3ds Max 2015 installer. In case there was a previous install of the Populate Data on the machine, please delete the following folder before installing: “C:\Program Files\Common Files\Autodesk Shared\PeoplePower\2.0\”. See this Area blog post for more information.

Unexpected exception (Error in bm->OpenOutput(): error code 12)

Ensure all instances of 3dsMax are running a consistent LANGUAGE. By default 3dsMax ships with the LANGUAGE code set to “ENU” - “US English”, and this is recommended for the majority of customers. If you are using a 3rd party plugin in 3dsMax, please contact the plugin developer to verify that their plugin is capable of running as a different language inside of 3dsMax. Note that the majority of 3rd party plugins are still only developed to work in “ENU”. Please see this FAQ for more information regarding options to control the LANGUAGE: 3dsMax Language Code FAQ.

Exception: Failed to render the frame.

There could be many reasons why Max would fail to render the frame. Check for ERR or WRN messages included in the error message for information that might explain the problem.

DBG: in Init. nrGetIface() failed

This error message is often an indication that 3dsmax or backburner is out of date on the machine. Updating both to the latest service packs should fix the problem.

ERROR: ImageMagick: Invalid bit depth for RGB image ‘[path to tile/region render output image]’

This error is due to the old TileAssembler executable not supporting certain bit depth images, such as VRay’s Render Elements “Reflection”, “Refraction” and “Alpha” when saved from the VRay Frame Buffer (VFB). Please note that the Tile Assembler plugin is EOL (End-Of-Life/deprecated). Please use the newer “Draft Tile Assembler” plugin (the Use Draft for Assembly checkbox option in SMTD) when rendering using the older tile system to ensure all image types/bit depths are correctly assembled. Draft Tile Assembler jobs can also be submitted independently if you already have the *.config file(s); this is explained further in the Draft Tile Assembler documentation.


9.3 After Effects

9.3.1 Job Submission

You can submit jobs from within After Effects by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within After Effects, select File -> Run Script -> DeadlineAfterEffectsClient.jsx.



Project Configuration

In After Effects, place the comps you want to render in the Render Queue (CTRL+ALT+0). Due to an issue with the Render Queue, if you have more than one comp with the same name, only the settings from the first one will be used (whether they are checked or not). It is important that all comps in the Render Queue have unique names, and our submission script will notify you if they do not. Each comp that is in the Render Queue and that has a check mark next to it will be submitted as a separate job to Deadline.

Note that under the comp’s Output Module settings, the Use Comp Frame Number check box must be checked. If this is not done, every frame in the submitted comp will try to write to the same file.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. Note that the Draft/Integration options are only available in After Effects CS4 and later. The After Effects specific options are:

• Use Comp Name As Job Name: If enabled, the job’s name will be the Comp name.


• Use Frame List From The Comp: Check this option to use the frame range defined for the comp.
• Render The First And Last Frames Of The Comp First: Enable this option to render the first and last frames first, followed by the remaining frames in the comp’s frame list. Note that this ignores the Frame List setting in the submission dialog.
• Comps Are Dependent On Previous Comps: If enabled, the job for each comp in the render queue will be dependent on the job for the comp ahead of it. This is useful if a comp in the render queue uses footage rendered by a comp ahead of it.
• Submit The Entire Render Queue As One Job With A Single Task: Use this option when the entire render queue needs to be rendered all at once because some queue items are dependent on others or use proxies. Note though that only one machine will be able to work on this job.
• Ignore Missing Layer Dependencies: If enabled, Deadline will ignore errors due to missing layer dependencies.
• Ignore Missing Effects References: If enabled, Deadline will ignore errors due to missing effect references.
• Fail On Warning Messages: If enabled, Deadline will fail the job whenever After Effects prints out a warning message.
• Export XML Project File: Enable to export the project file as an XML file for Deadline to render (After Effects CS4 and later). The original project file will be restored after submission. If the current project file is already an XML file, this will do nothing.
• Continue On Missing Footage: If enabled, rendering will not stop when missing footage is detected.
• Enable Local Rendering: If enabled, Deadline will render the frames locally before copying them over to the final network location.
• Multi-Process Rendering: Enable multi-process rendering.

The following After Effects specific options are only available in After Effects CS4 and later:

• Multi-Machine Rendering: This mode submits a special job where each task represents the full frame range. The slaves will all work on the same frame range, but if “Skip existing frames” is enabled for the comps, they will skip frames that other slaves are already rendering.
  – This mode requires “Skip existing frames” to be enabled for each comp in the Render Queue.
  – Set the number of tasks to be the number of slaves you want working simultaneously on the render.
  – This mode ignores the Frame List, Machine Limit, and Frames Per Task settings.
  – This mode does not support Local Rendering or Output File Checking.
• Minimum Output File Size: If an output image’s file size is less than what’s specified, the task is requeued (specify 0 for no limit).
• Enable Memory Management: Whether or not to use the memory management options.
• Image Cache %: The maximum amount of memory After Effects will use to cache frames.
• Max Memory %: The maximum amount of memory After Effects can use overall.

Layer Submission

In addition to normal job submission, you also have the option to submit layers in your After Effects project as separate jobs. To do so, first select the layers you want to submit. Then run the submission script, set the submission options mentioned above as usual, and press the Submit Selected Layers button. This will bring up the layers window.


The layer submission options are:

• Render With Unselected Layers: Specify the unselected layers that will render with each of the selected layers.
• Layer Name Parsing: Allows you to specify how the layer names should be formatted. You can then grab parts of the formatting and stick them in either the output name or the subfolder format box with square brackets. So, for example, if you’re naming your layers something like “ops024_a_diff”, you could put “[graphic]_[layer]_[pass]” in this box. Then in the subfolder box, you could put “[graphic]\[layer]\v001\[pass]”, which would give you “ops024\a\v001\diff” as the subfolder structure.
• Render Settings: Which render settings to use.
• Output Module: Which output module to use.
• Output Format: How the output file name should be formatted.
• Output Folder: Where the output files should be rendered to.
• Use Subfolders: Enable this to render each layer to its own subfolder. If this is enabled, you must also specify the subfolder format.


9.3.2 Cross-Platform Rendering Considerations

In order to perform cross-platform rendering with After Effects, you must set up Mapped Paths so that Deadline can swap out the Project and Output file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by selecting Tools -> Configure Repository. You’ll find the Mapped Paths Setup in the list on the left.

You then have two options on how to set up your After Effects project file. The traditional way is to ensure that your After Effects project file is on a network shared location, and that any footage or assets that the project uses are in the same folder or in sub-folders. Then when you submit the job, you must make sure that the option to submit the project file with the job is disabled. If you leave it enabled, the project file will be copied to and loaded from the Slave’s local machine, and thus won’t be able to find the footage.

You also have the option to save your After Effects project as an AEPX file, which is just an XML file. Deadline will automatically detect that an AEPX file has been submitted, and will swap out paths within the file itself (because it is just plain text). This way, you don’t have to worry about setting up the project structure described in the first option. Note though that all the asset paths still need to be network accessible.

9.3.3 Plug-in Configuration

You can configure the After Effects plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the After Effects plug-in from the list on the left.

Render Executables

• After Effects Executable: The path to the After Effects aerender executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

Render Options


• Fail On Existing After Effects Process: Prevent Deadline from rendering when After Effects is already open.
• Force Rendering In English: You can configure the After Effects plug-in to force After Effects to render in English. This is useful if you are rendering with a non-English version of After Effects, because it ensures that Deadline’s progress gathering and error checking function properly (since they are currently based on English output from the After Effects renderer).

Font Folder Synchronization

You can configure the After Effects plug-in to synchronize fonts from a central location whenever a new After Effects job is picked up by a Slave. You can have Deadline synchronize the fonts to one or more local directories. The following settings are available:

• Network Windows Font Folder: The network Windows Font folder used for synchronization.
• Local Windows Font Folder: The local Windows Font folder to synchronize with the network Font folder. Enter alternative paths on separate lines.
• Network Mac OSX Font Folder: The network Mac OSX Font folder used for synchronization.
• Local Mac OSX Font Folder: The local Mac OSX Font folder to synchronize with the network Font folder. Enter alternative paths on separate lines.

On Windows, the fonts synchronized by Deadline are only available to After Effects and other applications temporarily until the machine is rebooted. Also, on Vista and higher, there are access restrictions to installing fonts within the “C:\Windows\Fonts” directory and associated font registry keys. This restriction applies to all users (Admin or otherwise), and only an application running as the TrustedInstaller user will have access to the “fonts” directory by default. Because of this, we recommend using “C:\Windows\Temp” on newer versions of Windows. After Effects will be able to find these fonts regardless.

Path Mapping For aepx Project Files (For Mixed Farms)

• Enable Path Mapping For aepx Files: If enabled, a temporary aepx file will be created locally on the slave for rendering and Deadline will do path mapping directly in the aepx file.

9.3.4 Integrated Submission Script Setup

The following procedures describe how to install the integrated After Effects submission script. This script allows for submitting After Effects render jobs to Deadline directly from within the After Effects editing GUI. The script and the following installation procedure have been tested with After Effects CS3 and later.

You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/AfterEffects/Installers

Manual Installation of the Submission Script

• Copy [Repository]\submission\AfterEffects\Client\DeadlineAfterEffectsClient.jsx to [After Effects Install Directory]\Support Files\Scripts
• After starting up After Effects, make sure that under Edit -> Preferences -> General, the Allow Scripts to Write Files and Access Network option is enabled. This is necessary so that the submission script can create the necessary files to submit to Deadline.


Custom Sanity Check

A CustomSanityChecks.jsx file can be created alongside the main SubmitAEToDeadline.jsx submission script (in [Repository]\submission\AfterEffects\Main), and will be evaluated if it exists. This script will let you set any of the initial properties in the submission script prior to displaying the submission window. You can also use it to run your own checks and display errors or warnings to the user. Here is a very simple example of what this script could look like:

{
    initDepartment = "The Best Department";
    initPriority = 33;
    initConcurrentTasks = 2;

    alert( "You are in a custom sanity check!" );
}

9.3.5 FAQ

Which versions of After Effects are supported?

After Effects CS3 and later are supported.

Why is there no Advanced tab in the integrated submission script for After Effects CS3?


Tabs are only supported in CS4 and later, so the Advanced tab and its options are not available in CS3 and earlier.

Does network rendering with After Effects require a full After Effects license?

In After Effects CS5.0 and earlier, a license is not required. In After Effects CS5.5, a full license is required. In After Effects CS6.0 and later, a license isn’t required if you enable “non-royalty-bearing” mode.

Rendering through Deadline seems to take longer than rendering through After Effects locally.

After Effects needs to be restarted at the beginning of each frame, and this loading time results in the render taking longer than expected. If you know ahead of time that your frames will render quickly, it is recommended to submit your frames in groups of 5 or 10. This way, After Effects will only load at the beginning of each group of frames, instead of at the beginning of every frame.

When rendering a job, only the images from the first task are saved, and subsequent tasks just seem to overwrite those initial image files.

In the comp’s Output Module Settings, make sure that the “Use Comp Frame Number” checkbox is checked. Check out step 1 here for complete details.

I get the error that the specified comp cannot be found when rendering, but it is in the render queue.

This can occur for a number of reasons, most of which are related to the name of the comp. Examples are names with two spaces next to each other, or names with apostrophes in them. Try using only alphanumeric characters and underscores in comp names and output paths to see if that resolves the issue.

Why do the comps in the After Effects Render Queue require unique names?

Due to an issue with the Render Queue, if you have more than one comp with the same name, only the settings from the first one will be used (whether they are checked or not). It is important that all comps in the Render Queue have unique names, and our submission script will notify you if they do not.

9.3.6 Error Messages and Meanings

This is a collection of known After Effects error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know.

Exception during render: Renderer returned non-zero error code, -1073741819

The error code -1073741819 is equivalent to 0xC0000005, which represents a Memory Access Violation error. So After Effects is either running out of memory, or memory has become corrupt. If you find that your frames are still being rendered, you can modify the After Effects plugin to ignore this error. Just add the following function to the AfterEffectsPlugin class in AfterEffects.py, which can be found in [Repository]/plugins/AfterEffects.

def CheckExitCode( self, exitCode ):
    if exitCode != 0:
        if exitCode == -1073741819:
            LogInfo( "Ignoring exit code -1073741819" )
        else:
            FailRender( "Renderer returned non-zero error code %d." % exitCode )

You can find another example of the CheckExitCode function in MayaCmd.py, which can be found in [Repository]/plugins/MayaCmd.

aerender ERROR: No comp was found with the given name.


This can occur for a number of reasons, most of which are related to the name of the comp. Examples are names with two spaces next to each other, or names with apostrophes in them. Try using only alphanumeric characters and underscores in comp names and output paths to see if that resolves the issue.

Exception during render: Renderer returned non-zero error code, 1

aerender ERROR: An existing connection was forcibly closed by the remote host. Unable to receive at line 287
aerender ERROR: After Effects can not render for aerender. Another instance of aerender, or another script, may be running; or, AE may be waiting for response from a modal dialog, or for a render to complete. Try running aerender without the -reuse flag to invoke a separate instance of After Effects.

It is unknown what the exact cause of this error is, but it is likely that After Effects is simply crashing or running out of memory. If you are rendering with Concurrent Tasks set to a value greater than 1, try reducing the number and see if that helps. The Knoll Light Factory plugin has also been known to cause this error message when it can’t get a license.

9.4 Anime Studio

9.4.1 Job Submission

You can submit Anime Studio Standalone jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Anime Studio specific options are:


• Anime Studio File: The scene file (*.anme) to be rendered.
• Output File: The path to where the rendered images will be saved.
• Add Format Suffix: If this option is enabled, the format name will be appended to the file name of the output path. Version 9.5 and later.
• Version: The version of Anime Studio to render with.
• Layer Comp: Render a specific layer comp, or select All to render all layer comps to separate files.

Additional Rendering Options:

• Antialiased Edges: Normally, Anime Studio renders your shapes with smoothed edges. Uncheck this box to turn this feature off.
• Apply Shape Effects: If this box is unchecked, Anime Studio will skip shape effects like shading, texture fills, and gradients.
• Apply Layer Effects: If this box is unchecked, Anime Studio will skip layer effects like layer shadows and layer transparency.
• Render At Half Dimensions: Check this box to render a smaller version of your movie. This makes rendering faster if you just want a quick preview, and is useful for making smaller movies for the web.
• Render At Half Frame Rate: Check this box to skip every other frame in the animation. This makes rendering faster, and results in smaller movie files.
• Reduced Particles: Some particle effects require hundreds of particles to achieve their effect. Check this box to render fewer particles. The effect may not look as good, but will render much faster if all you need is a preview.
• Extra-smooth Images: Renders image layers with a higher quality level. Exporting takes longer with this option on.
• Use NTSC-safe Colors: Automatically limits colors to be NTSC safe. This is only an approximation - you should still do some testing to make sure your animation looks good on a TV monitor.
• Do Not Premultiply Alpha Channel: Useful if you plan to composite your Anime Studio animation with other elements in a video editing program.

QT Options:

• Video Codec: The video codec (leave blank to not specify one). Version 10 and later.
• Quality: The quality of the export. Version 10 and later. 0 = Minimum, 1 = Low, 2 = Normal, 3 = High, 4 = Max, 5 = Lossless.
• Depth: The pixel depth of the export. Version 10 and later.

iPhone/iPad Movie Options:

• Format: The available formats for m4v movies.

AVI Options:

• Format: The available formats for avi movies.

SWF Options:

• Variable Line Widths: Exports variable line widths to SWF.

9.4.2 Plug-in Configuration

You can configure the Anime Studio plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Anime Studio plug-in from the list on the left.


Render Executables

• Anime Studio Executable: The path to the Anime Studio executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes. An example of listing multiple paths is shown below.
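For example (these install paths are purely illustrative; use the actual locations on your farm), listing one path per line lets mixed Windows and Mac render nodes share one configuration, since a later path is only used when the earlier ones do not exist on a given machine:

C:\Program Files\Smith Micro\Anime Studio Pro 9.5\AnimeStudioPro.exe
C:\Program Files (x86)\Smith Micro\Anime Studio Pro 9\AnimeStudioPro.exe
/Applications/Anime Studio Pro 9.5/Anime Studio Pro.app/Contents/MacOS/Anime Studio Pro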

9.4.3 FAQ

Which versions of Anime Studio are supported by Deadline?

Anime Studio 8 and later are supported.

9.4.4 Error Messages And Meanings

This is a collection of known Anime Studio error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.5 Arion Standalone

9.5.1 Job Submission

You can submit Arion jobs from the Monitor. Note that Arion’s RenderWarrior application does not support animations and therefore only single Arion files may be submitted. Arion animations can be rendered through the Arion live plugins.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Integration options are explained in the Integration documentation. The Arion specific options are:

• Arion File: The Arion scene that will be rendered. Can be a .rcs or .obj file.
• LDR Output File: The name of the rendered LDR output file. If no output file is specified, a default image file will be saved beside the Arion file.
• HDR Output File: The name of the rendered HDR output file. If no output file is specified, a default image file will be saved beside the Arion file.
• Passes: If enabled, Arion will render until the specified number of passes have completed.
• Minutes: If enabled, Arion will render until the specified number of minutes have passed.
• Threads: The number of threads that Arion will use to render the input file. If no threads are specified, a default of one will be used.
• Command Line Args: Here you can specify additional command line arguments. Arion accepts command line arguments in the format "-arg:value".
• Channels: Each channel enabled will generate a different image appended with the channel name.

If both Passes and Minutes are specified, Arion will finish rendering when the first limit is reached. If neither are enabled, Arion will render indefinitely and the job will have to be stopped manually.

9.5.2 Plug-in Configuration

You can configure the Arion plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Arion plug-in from the list on the left.


Render Executables

• Arion Engine Executable: The path to the Arion engine executable file used for rendering. Enter alternative paths on separate lines.

9.5.3 FAQ

Which versions of Arion are supported?

Only the Arion 2 Standalone is supported.

Are there any issues with referencing a file in the global input folder when one or more other files exist with the same name?

Yes. When there is a file in the scene that has the same name as a file in another subdirectory, the network renderer will reference the first file with that name that it finds. It ignores the direct path to the correct subdirectory.

Can I render multiple channels?

Yes! The Arion submitter supports the selection of individual channels.

How can I pass additional information to Arion?

The Command Line Args field allows you to specify additional arguments to Arion. For example, typing "-h:100 -w:100" in the Command Line Args field will tell Arion to change the image size to 100px by 100px. To find out more information about additional command line arguments, please visit Arion's website.

Can I submit Arion animations?

The Arion 2 Standalone does not support animations and can only render single images. Arion does still support animations through their Live plugins.

9.5.4 Error Messages and Meanings

This is a collection of known Arion error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.6 Arnold Standalone

9.6.1 Job Submission

You can submit Arnold Standalone jobs from the Monitor.


Set up your Arnold Files

Before you can submit an Arnold Standalone job, you must export your scene into .ass files.

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Arnold specific options are (an illustrative command line follows the list):

• Arnold File: The Arnold file(s) to be rendered.
  – If you are submitting a sequence of .ass files, select one of the numbered frames in the sequence, and the frame range will automatically be detected if Calculate Frames From Arnold File is enabled. The frames you choose to render should correspond to the numbers in the .ass files.
• Output File: The output file. If left blank, Arnold will save the output to the location defined in the .ass file.
• Version: Choose the Beta or Release version of Arnold to render with (these can be configured in the Arnold plugin configuration).
• Threads: The number of threads to use for rendering.
• Verbosity: The verbosity level for the render output.
• Enable Local Rendering: If enabled, Deadline will render the frames locally before copying them over to the final network location.
• Command Line Args: Specify additional command line arguments you would like to pass to the Arnold renderer.
• Additional Plugin Folders: Specify up to three additional plugin folders that Arnold should use when rendering.
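For reference, these options are ultimately passed to Arnold's kick renderer. The following is only an illustrative sketch of such a command line; the paths are hypothetical, and the exact flags the plugin uses can differ between Arnold versions, so treat it as an assumption rather than the exact line Deadline builds:

kick -nostdin -dw -t 8 -v 2 -l \\server\arnold\shaders -i \\server\exports\shot010.0010.ass -o \\server\renders\shot010.0010.exr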

9.6.2 Plug-in Configuration

You can configure the Arnold plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Arnold plug-in from the list on the left.


Render Executables

• Arnold Kick Executable: The path to the Arnold kick executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.6.3 FAQ

Is Arnold Standalone supported by Deadline?

Yes.

Can I submit a sequence of Arnold .ass files that each contain one frame?

Yes, this is supported.

9.6.4 Error Messages and Meanings

This is a collection of known Arnold error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.7 Blender

9.7.1 Job Submission

You can submit jobs from within Blender by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within Blender 2.5 and later, select Render -> Submit To Deadline. For previous versions of Blender, you must submit from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Blender specific options are:


• Threads: The number of threads to use for rendering.
• Build To Force: You can force 32 bit or 64 bit rendering.

9.7.2 Plug-in Configuration

You can configure the Blender plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Blender plug-in from the list on the left.

Render Executables

• Blender Executable: The path to the Blender executable file used for rendering. Enter alternative paths on separate lines.

Output

• Suppress Verbose Progress Output To Log: When enabled, this will prevent excessive progress logging to the Slave and task logs.

9.7.3 Integrated Submission Script Setup

The following procedures describe how to install the integrated Blender submission script. This script allows for submitting Blender render jobs to Deadline directly from within the Blender editing GUI. Note that this script only works with Blender 2.5 and later. You can submit to older versions of Blender from the Monitor.

You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/Blender/Installers
• In Blender, select File -> User Preferences, and then select the Add-Ons tab.


• Click on the Render filter on the left, and check the box next to the Render: Submit To Deadline add-on.

Manual Installation of the Submission Script

• In Blender, select File -> User Preferences, and then select the Add-Ons tab.


• Click the Install Add-On button at the bottom, browse to [Repository]\submission\Blender\Client, and select the DeadlineBlenderClient.py script. Then press the Install Add-On button to install it. Note that on Windows, you may not be able to browse the UNC repository path, in which case you can just copy [Repository]\submission\Blender\Client\DeadlineBlenderClient.py locally to your machine before pointing the Add-On installer to it.

• Then click on the Render filter on the left, and check the box next to the Render: Submit To Deadline add-on.


• After closing the User Preferences window, the Submit To Deadline option should now be in your Render menu.

9.7.4 FAQ

Which versions of Blender are supported?

Blender 2.x is currently supported.

9.7.5 Error Messages And Meanings

This is a collection of known Blender error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.8 Cinema 4D

9.8.1 Job Submission

You can submit jobs from within Cinema 4D by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within Cinema 4D, select Python -> Plugins -> Submit To Deadline.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Cinema 4D specific options are:

• Threads To Use: The number of threads to use for rendering.
• Build To Force: Force rendering in 32 bit or 64 bit.


• Export Project Before Submission: If your project is local, or you are rendering in a cross-platform environment, you may find it useful to export your project to a network directory before the job is submitted.
• Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network location.

9.8.2 Cross-Platform Rendering Considerations

In order to perform cross-platform rendering with Cinema 4D, you must set up Mapped Paths so that Deadline can swap out the Scene and Output file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list on the left.

When submitting the Cinema 4D job for rendering, you should enable the Export Project Before Submission option, and choose a network location when prompted for the export path. This will strip any absolute asset paths and make them relative to the scene file, and will also ensure the option to submit the Cinema 4D scene file with the job is disabled.

If you don't enable the Export Project Before Submission option, you need to manually export the project to a network location. Then, you must submit the exported scene file from the Submit menu in the Monitor, and you need to specify the output and/or multipass output paths in the submitter. Make sure the option to submit the Cinema 4D scene file with the job is disabled. If you leave it enabled, the scene file will be copied to and loaded from the Slave's local machine, which will break the relative asset paths.

9.8.3 Plug-in Configuration

You can configure the Cinema 4D plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Cinema 4D plug-in from the list on the left.


Render Executables

• C4D Executable: The path to the C4D executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.8.4 Integrated Submission Script Setup

The following procedures describe how to install the integrated Cinema 4D submission script. This script allows for submitting Cinema 4D render jobs to Deadline directly from within the Cinema 4D editing GUI.

You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/Cinema4D/Installers

Manual Installation of the Submission Script

• Copy [Repository]/submission/Cinema4D/Client/DeadlineC4DClient.pyp to [Cinema 4D Install Directory]/plugins.
• Restart Cinema 4D, and the Submit To Deadline menu should be available from the Python -> Plugins menu.

Custom Sanity Check

A CustomSanityChecks.py file can be created alongside the main SubmitC4DToDeadline.py submission script (in [Repository]\submission\Cinema4D\Main), and will be evaluated if it exists. This script will let you set any of the initial properties in the submission script prior to displaying the submission window. You can also use it to run your own checks and display errors or warnings to the user. Here is a very simple example of what this script could look like:

import c4d
from c4d import gui

def RunSanityCheck( dialog ):
    dialog.SetString( dialog.DepartmentBoxID, "The Best Department!" )
    dialog.SetLong( dialog.PriorityBoxID, 33 )
    dialog.SetLong( dialog.ConcurrentTasksBoxID, 2 )

    gui.MessageDialog( "This is a custom sanity check!" )

    return True

The available dialog IDs can be found in the SubmitC4DToDeadline.py script mentioned above. They are defined near the top of the SubmitC4DToDeadlineDialog class. These can be used to set the initial values in the submission dialog. Finally, if the RunSanityCheck method returns False, the submission will be cancelled.
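As a further illustration, a CustomSanityChecks.py can also cancel the submission outright. The check below (requiring the scene to have been saved) is only a hypothetical example of a condition you could query through the Cinema 4D Python API; any check of your own can take its place:

import c4d
from c4d import gui

def RunSanityCheck( dialog ):
    # Illustrative check only: block submission if the active document has never been saved.
    doc = c4d.documents.GetActiveDocument()
    if doc.GetDocumentPath() == "":
        gui.MessageDialog( "Please save your scene before submitting it to Deadline." )
        return False

    return True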

9.8.5 FAQ

Which versions of Cinema 4D are supported?

Cinema 4D 12 and later are supported.

When I use Adobe Illustrator files as textures, the render fails with "Asset missing".


While Cinema 4D is able to use AI files in workstation mode, there are often problems when rendering in command line mode. Convert the AI files to another known type such as TIFF or JPEG before using them.

Sometimes when I open the submission dialog in Cinema 4D, the pool list or group list are empty.

Simply close the submission dialog and reopen it to repopulate the lists.

Does rendering with Cinema 4D with Deadline use up a full Cinema 4D license?

There are separate Cinema 4D command line licenses that are required to render with Deadline. Please contact Maxon for more information regarding licensing requirements.

Can Deadline render with Cinema 4D's Net Render Client software?

No. It isn't possible for 3rd party software such as Deadline to control Cinema 4D's Net Render Client, which is why Deadline uses the command line renderer.

I have copied over the SubmitToDeadline.pyp file, but the integrated submission script does not show up under the Python menu.

This is likely caused by some failure in the script. Check your repository path to ensure the client is able to read and write to that folder. Using the Python console within C4D may provide more specific hints.

My frames never seem to finish rendering. When I check the slave machine, it doesn't appear to be doing anything.

This can occur if Cinema 4D hasn't been licensed yet. Try starting Cinema 4D normally on the machine and see if you are prompted for a license. If you are, configure everything and then try rendering on that machine again.

9.8.6 Error Messages And Meanings

This is a collection of known Cinema 4D error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.9 Cinema 4D Team Render

9.9.1 Job Submission

You can submit jobs from within Cinema 4D by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within Cinema 4D, select Python -> Plugins -> Submit Team Render To Deadline.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Cinema 4D Team Render specific options are:

• Render Client Count: The number of render clients to use.
• Security Token: The security token that the Team Render application will use on the slaves (it will be generated automatically if left blank).


Rendering

After you've configured your submission options, press the Reserve Clients button to submit the Team Render job. After the job has been submitted, you can press the Update Clients button to update the job's ID and Status in the submitter. As nodes pick up the job, pressing the Update Clients button will also show them in the Active Servers list.

Cinema 4D's Team Render Machines window will also appear after pressing the Reserve Clients button, and will show you the Team Render machines that are currently available. Before you can render with them though, you must verify them by following these steps:

1. Copy the Security Token from the submitter to the clipboard (use the Copy to Clipboard button).
2. Right-click on each machine in the Team Render Machines window and select the Verify option, then paste the Security Token and press OK.

When you are ready to render, select the Team Render To Picture Viewer option in C4D’s Render menu to start rendering.

9.9.2 Plug-in Configuration

You can configure the Cinema 4D Team Render plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Cinema 4D plug-in from the list on the left.


Cinema 4D Options

• C4D Team Render Executable: The path to the Cinema 4D Team Render Client executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.9.3 Integrated Submission Script Setup

The following procedures describe how to install the integrated Cinema 4D Team Render submission script. This script allows for submitting Cinema 4D Team Render jobs to Deadline directly from within the Cinema 4D editing GUI.

You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/Cinema4DTeamRender/Installers

Manual Installation of the Submission Script

• Copy [Repository]/submission/Cinema4DTeamRender/Client/DeadlineC4DTeamRenderClient.pyp to [Cinema 4D Install Directory]/plugins.
• Restart Cinema 4D, and the Submit Team Render To Deadline menu item should be available from the Python -> Plugins menu.

9.9.4 FAQ

Which versions of Cinema 4D are supported?


Cinema 4D 15 and later are supported.

9.9.5 Error Messages And Meanings

This is a collection of known Cinema 4D error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.10 Clarisse iFX

9.10.1 Job Submission

You can submit jobs from within Clarisse iFX by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within Clarisse iFX, click on the custom toolbar item you created during the integrated submission script setup. You will first be prompted to specify a file to export the render archive to.

After you specify the render archive file, the submitter will come up with the Render Archive and Frame List fields already populated.


Note that if you are submitting from the Monitor, you will have to manually export your render archive from inside Clarisse iFX, and then browse to the Render Archive file in the Monitor submitter.

Submission Options

The general Deadline options are explained in the Job Submission documentation. The Clarisse iFX specific options are:

• Threads: The number of threads to use for rendering. If set to 0, the value in the Clarisse configuration file will be used.
• Verbose Logging: Enables verbose logging during rendering.


9.10.2 Plug-in Configuration

You can configure the Clarisse iFX plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Clarisse plug-in from the list on the left.

Render Executables

• CRender Executable: The path to Clarisse's crender executable file used for rendering. Enter alternative paths on separate lines.

Configuration Options

• Global Config File: A global configuration file to be used for rendering. If left blank, the Clarisse.cfg file in the user home directory will be used instead.
• Module Paths: Additional paths to search for modules (one path per line).
• Search Paths: Additional paths to search for includes (one path per line).

9.10.3 Integrated Submission Script Setup

The following procedures describe how to install the integrated Clarisse iFX submission script. This script allows for submitting Clarisse iFX render jobs to Deadline directly from within the Clarisse iFX editing GUI.

You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/Clarisse/Installers

Manual Installation of the Submission Script

• In Clarisse iFX, right-click on the toolbar at the top and select Add Item.


• In the Add New Item dialog, set the following properties:
  – Title: Submit To Deadline
  – Category: Custom
  – Category Custom: Deadline
  – Script Path: Choose the DeadlineClarisseClient.py script from [Repository]\submission\Clarisse\Client


• Click Add, and you should now see a Deadline tab in the toolbar with a button that you can click on to submit the job.

9.10.4 FAQ

Which versions of Clarisse iFX are supported?


The “crender” application is used for rendering, so any version of Clarisse iFX that includes this applica- tion is supported.

9.10.5 Error Messages and Meanings

This is a collection of known Clarisse iFX error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.11 Combustion

9.11.1 Job Submission

You can submit Combustion jobs from the Monitor.


Workspace Configuration

• In Combustion, when you are ready to submit your workspace, open the Render Queue by selecting File -> Render... (CTRL+R).
• Select which items you want to render in the box on the left.


• Configure your output settings under the tab Output Settings.

• Under the tab Global Settings, specify an Input Folder (a shared folder where all the footage for your workspace can be found) and an Output Folder (a shared folder where the output will be dumped). Note that Combustion will search any subfolders in your Input Folder for footage as well.


• Close the Render Queue and save your workspace.

Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Combustion specific options are:

• Workspace File: The Combustion workspace file to be rendered.
• Output Operator: Select the output operator in the workspace file to render. The render will fail if the operator cannot be found.
• Version: The version of Combustion to render with.
• Skip Existing Frames: Skip over existing frames during rendering (version 4 and later only).
• Use Only One CPU To Render: Limit rendering to one CPU (version 4 and later only).

9.11.2 Plug-in Configuration

You can configure the Combustion plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Combustion plug-in from the list on the left.

Render Executables

• Combustion Executable: The path to the ShellRender executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.


9.11.3 FAQ

Which versions of Combustion are supported?

Combustion 4 and later are supported.

All my input footage is spread out over the network, so how do I specify a single Input Folder during submission?

When Combustion is given an Input Folder, it will search all subfolders for the required footage until the footage is found. So if you have a root folder that all of your footage branches off from, you should specify that root as the Input Folder.

Are there any issues with referencing a file in the global input folder when one or more other files exist with the same name?

Yes. When there is a file in the scene that has the same name as a file in another subdirectory, the network renderer will reference the first file with that name that it finds. It ignores the direct path to the correct subdirectory.

Can Deadline render multiple outputs?

No. Only one output can be enabled in your Combustion workspace. If no outputs are enabled, or multiple outputs are enabled, the workspace cannot be submitted to Deadline.

When rendering, I receive a pop up error message. Since rendering is supposed to be silent, should I not be getting error messages like this in the first place?

Make sure that you're using ShellRenderer.exe as the render executable (combustion.exe starts up Combustion normally, while ShellRenderer.exe is the command line rendering application). You can make the switch in the Plugin Configuration (Tools -> Configure Plugins in the Monitor while in super user mode).

Why isn't path mapping working properly between Windows and Mac?

On the Mac, the Combustion workspace file saves network paths in the form share:\\folder\..., so you have to set up your Path Mapping settings in the Repository options accordingly.

9.11.4 Error Messages And Meanings

This is a collection of known Combustion error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.12 Command Line

9.12.1 Job Submission

Arbitrary command line jobs can be submitted to Deadline that will execute the same command line for each frame of the job. To submit arbitrary command line jobs, refer to the Manual Job Submission documentation. To submit from the Monitor, refer to the documentation below.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Command Line specific options are:

• Job Type: Choose a normal job or maintenance job. A normal job will let you specify an arbitrary frame list, but a maintenance job requires a start frame and an end frame.
• Executable: The executable to use for rendering.
• Arguments: The arguments to pass to the executable. Use the Start Frame and End Frame buttons to add their corresponding tags to the end of the current arguments. See the Manual Job Submission documentation for more information on these tags.
• Frame Tag Padding: Determines the amount of frame padding to be added to the Start and End Frame tags.
• Start Up Folder: The folder that the executable will be started in. If left blank, the executable's folder will be used instead.

9.12.2 Plug-in Configuration

The Command Line plug-in does not require any configuration.

9.12.3 FAQ

How do I handle paths in the arguments with spaces in them?

Use double-quotes around the path. For example, "T:\projects\path with spaces\project.ext".

Do I need to use the tags?

These are only needed when submitting manually from the command line. When using the Monitor submitter, you can just type in the double-quote character in the Arguments field.

9.13 Command Script

9.13.1 Job Submission

You can submit Command Script jobs from the Monitor. Command Script can execute a series of command lines, which can be configured to do anything from rendering to folder synchronization.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Command Script specific options are:

• Commands To Execute: Specify a list of commands to execute by either typing them in, or by loading them from a file. You also have the option to save the current list of commands to a file. To insert file or folder paths into the Commands field, use the Insert File Path or Insert Folder Path buttons.
• Startup Directory: The directory where each command will start up. This is optional, and if left blank, the executable's directory will be used as the startup directory.
• Commands Per Task: Number of commands that will be executed for each task.

9.13.2 Manual Submission

Command Script jobs can also be manually submitted from the command line.

Submission File Setup

Three files are required for submission:

• the Job Info File
• the Plugin Info File
• the Command file

The Job Info file contains the general job options, which are explained in the Job Submission documentation. The Plugin Info file contains one line (this is the directory where each command will start up):

StartupDirectory=...

The Command file contains the list of commands to run. There should be one command per line, and no lines should be left blank. If your executable path has a space in it, make sure to put quotes around the path. The idea is that one frame in the job represents one command in the Command file. For example, let's say that your Command file contains the following:

"C:\Program Files\Executable1.exe" "C:\Program Files\Executable1.exe"-param1 "C:\Program Files\Executable1.exe" "C:\Program Files\Executable1.exe"-param1-param2 "C:\Program Files\Executable1.exe"

Because there are five commands, the Frames specified in the Job Info File should be set to 0-4. If the ChunkSize is set to 1, then a separate task will be created for each command. When a slave dequeues a task, it will run the command that is on the corresponding line number in the Command file. Note that the frame range specified must start at 0.

If you wish to run the commands in the order that they appear in the Command file, you can do so by setting the MachineLimit in the Job Info File to 1. Only one machine will render the job at a given time, thus dequeuing each task in order. However, if a task throws an error, the slave will move on to dequeue the next task.

To submit the job, run the following command (where DEADLINE_BIN is the path to the Deadline bin directory):

DEADLINE_BIN\deadlinecommand JobInfoFile PluginInfoFile CommandFile
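For reference, a minimal pair of submission files for the five-command example above could look like the following. This is only a sketch: the file names and the StartupDirectory value are hypothetical, the plugin name is assumed to match the CommandScript plugin folder in your repository, and optional Job Info keys such as Pool or Priority can be added as described in the Job Submission documentation.

command_job_info.txt:

Plugin=CommandScript
Name=Command Script Example
Frames=0-4
ChunkSize=1
MachineLimit=1

command_plugin_info.txt:

StartupDirectory=C:\Temp

The job would then be submitted with:

DEADLINE_BIN\deadlinecommand command_job_info.txt command_plugin_info.txt commands.txt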


Manual Submission Example

This example demonstrates how you can render a single frame from a Maya scene with different options, and direct the output to a specific location. To get the submission script, download Example Script For Command Script Plugin from the Miscellaneous Deadline Downloads Page. To run the script, run the following command (you must have Perl installed):

Perl SubmitMayaCommandScript.pl "SceneFile.mb" FrameNumber "OutputPath"

9.13.3 Plug-in Configuration

The Command Script plug-in does not require any configuration.

9.13.4 FAQ

Can I use executables with spaces in the path? Yes, just add quotes around the executable path.

9.14 Composite

9.14.1 Job Submission

You can submit jobs from within Composite by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within Composite, select the version you would like to submit, hit render, and choose the Background option when prompted.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Integration options are explained in the Integration documentation. The Composite specific options are:

• Project File: The Composite .txproject file.
• Composition: Path to the composition that you want to submit.
• Composition Version: The version of the current composition selected.
• Users ini file: The path to the user.ini file for this composition.
• Version: The version of Composite to use.
• Build to Force: Force 32 bit or 64 bit rendering.

9.14.2 Plug-in Configuration

You can configure the Composite plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Composite plug-in from the list on the left.


Render Executables

• Composite Executable: The path to the txrender executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.14.3 Integrated Submission Script Setup

The following procedures describe how to set up the integrated Composite submission script. This script allows for submitting Composite render jobs to Deadline directly from within the Composite editing GUI.

You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/Composite/Installers

Manual Installation of the Submission Script

• Copy [Repository]\submission\Composite\Client\DeadlineCompositeClient.py to [Composite Install Directory]\resources\scripts\
• Set up the Custom Render Action.
  – In Composite, under the Edit menu select Edit -> Project Preferences
  – In the opened dialog select the Render Actions tab
  – Under Render Actions, right click and select New
  – Name the new action 'Deadline'
  – Enter the following for the Render Command (all on one line):


” “/DeadlineCompositeClient.py” -d “” -u “” -c “” -v “” -o “” -s “” -e “” – There are two additional options you can add to this line:

* -r “COMPOSITE_VERSION” (where COMPOSITE_VERSION is the version of Composite, like 2012)

* -b “COMPOSITE_BUILD” (where COMPOSITE_BUILD is the bitness of Composite, which can be set to None, 32bit, or 64bit)

– In the Render window, select ‘Deadline’ as the Action and press Start.

9.14.4 FAQ

Which versions of Composite are supported? Composite 2010 and later are supported.

9.14.5 Error Messages and Meanings

This is a collection of known Composite error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know.


Currently, no error messages have been reported for this plug-in.

9.15 Corona Standalone

9.15.1 Job Submission

You can submit Corona jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Corona specific options are:

• Corona Scene: The Corona scene that will be rendered. Must be a .scn file.


• Output File Directory: The directory for the output to be saved to.
• Output File Name: The prefix for the output file names. If not specified, it defaults to "output".
• Frame List: The list of frames to be rendered. Each frame will be rendered to a separate output file.
• Single Frame Job: If selected, the job is a single frame job.
• Configuration File(s): Add any configuration files for Corona here. Configuration files are processed in the order they are listed.

9.15.2 Plug-in Configuration

You can configure the Corona plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Corona plug-in from the list on the left.

Render Executables

• Corona Executable: The path to the Corona standalone executable file used for rendering. Enter alternative paths on separate lines.

9.15.3 Error Messages and Meanings

This is a collection of known Corona error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.


9.16 DJV

9.16.1 Job Submission

You can submit DJV jobs from the Monitor. You can use the Submit menu, or you can right-click on a job and select Scripts -> Submit DJV Quicktime Job To Deadline to automatically populate some fields in the DJV submitter based on the job’s output.

Submission Options

The general submission options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. You can get more information about the DJV specific options by hovering your mouse over the label for each setting. The Settings buttons can be used to quickly save and load presets, or reset the settings back to their defaults.

9.16.2 Plug-in Configuration

You can configure the DJV plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the DJV plug-in from the list on the left.

DJV Executables

• DJV Executable: The path to the djv_convert executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.16.3 FAQ

Is DJV supported by Deadline?

Yes.

Can I create Apple Quicktime mov files with DJV?

Yes. On Windows, you must use the 32-bit version of DJV only. The LibQuicktime based codecs are only available in DJV v1.0.1 or later AND only on Linux. As an alternative, you can also use Thinkbox's Draft product (an image/movie creation automation toolkit), which is included in Deadline and is licensed against your active Deadline support subscription. See Draft for more information.

Can I create EXR files compressed with DreamWorks Animation's DWAA or DWAB compression?

Yes, but this is only supported in DJV v1.0.01 or later.


9.16.4 Error Messages and Meanings

This is a collection of known DJV error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know.

[ -auto_tag] and [ -tag Name Value] options not working in DJV plugin

DJV has a bug that causes it to crash, which is currently stopping these 2 command line flag options from working. The code has been commented out in the DJV plugin and can be re-enabled once the bug is fixed by the DJV developer.

Various command line options failing in DJV

Many of the [djv_convert] command line flags are broken due to "spaces" being present between the flag options in DJV versions earlier than v1.0.1. This is all resolved in DJV v1.0.1 and later, so it is recommended to use at least this version (wrapping the flag options with additional quotation marks does not resolve the issue, as it's a bug in the actual [djv_convert] command line args parser function).

9.17 Draft

9.17.1 Job Submission

There are many ways to submit Draft jobs to Deadline. As always, you can simply submit a Draft job from within the Monitor from the Submit menu. In addition, we’ve also added a right-click job script to the Monitor, which will allow you to submit a Draft job based on an existing job. This will pull over output information from the original job, and fill in Draft parameters automatically where possible.


On top of the Monitor scripts, you can also get set up to submit Draft jobs directly from Shotgun. This will again pull over as much information as possible, this time from the Shotgun database, in order to pre-fill several of the Draft parameter fields. See the Integrated Submission Script Setup section below for more details on this.

We've also added a Draft section to all of our other submitters. Submitting a Draft job from any of these uses our Draft Event Plug-in to submit a Draft job based on the job currently being submitted (this is similar in concept to the right-click job script described above). The Draft job will get automatically created upon completion of the original job.

9.17.2 Submission Options

The general Deadline options are available in the Draft submitters, and are explained in the Job Submission documentation. Draft-specific options are explained below. It should be noted, however, that given the nature of Draft scripts, not all of these parameters will be used by all scripts. They can even feasibly be used for different purposes than listed here. A sketch of how a Draft script might consume these parameters follows the list.

• Draft Script: This is the Draft script (or Template) that you want to run.
• Input File: Indicates where the input file(s) for the Draft Script can be found. What kind of file this is will depend entirely on the Draft Script itself. Passed to the Draft script as 'inFile'.
• Output Folder: Indicates where the output file(s) of the Draft Script will be placed. Can be a relative path, in which case it will be relative to the input. This is passed to the Draft script as 'outFolder'.
• Output File Name: As above, the type of file this is will depend entirely on the selected Draft Script. Passed to the Draft script as 'outFile'.
• Frame List: The list of Frames that the Draft Script should work with. Passed to the Draft Script as 'frameList', 'firstFrame', and 'lastFrame'.
• User: The name of the user that is submitting the job. Typically used by the Draft script for frame annotations. Passed to the Draft script as 'username'.
• Entity: The name of the entity being submitted. Typically used by the Draft script for frame annotations. Passed to the Draft script as 'entity'.
• Version: The version of the entity being submitted. Typically used by the Draft script for frame annotations. Passed to the Draft script as 'version'.
• Additional Args: Any additional command line arguments that you wish to pass to the Draft script should be listed here. Appended to arguments listed above.
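For illustration only, here is a minimal sketch of how a Draft script could read the parameters described above, assuming they are handed to the script as key=value pairs on its command line (an assumption on our part; the Draft templates shipped with Deadline include their own parameter-parsing helpers, which you would normally use instead, and the actual Draft image/encoding calls are omitted here):

import sys

# Collect key=value arguments, e.g. inFile=... outFolder=... frameList=1-100 username=jsmith
params = {}
for arg in sys.argv[1:]:
    key, _, value = arg.partition( "=" )
    params[key] = value

inFile = params.get( "inFile", "" )
outFolder = params.get( "outFolder", "" )
outFile = params.get( "outFile", "" )
frameList = params.get( "frameList", "" )
username = params.get( "username", "" )

print( "Processing %s (frames %s) for %s" % ( inFile, frameList, username ) )
# ... the actual Draft processing would go here ...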

9.17.3 Plug-in Configuration

The Draft plug-in does not require any configuration.

9.17.4 Integrated Submission Script Setup

All of our integrated submission scripts have been updated to have a Draft section, in order to submit dependent Draft jobs. In addition to this, we also have created scripts to allow you to submit a Draft job directly from Shotgun.

Shotgun Action Menu Item

The best way to install the Draft Submission menu item in Shotgun is to use the automated setup script included in the Monitor. To access this, select Scripts -> Install Integration Submission Scripts from the Monitor’s menu. From there, click the ‘Install’ button next to the Draft entry. It should be noted that this functionality is currently only available on the Windows version, and requires administrator privileges to run successfully. It should also be noted that while this script will create the ‘Submit Draft Job’ entry in Shotgun for everyone to see, this must still be done on each machine that will be submitting Draft jobs.


9.18 Draft Tile Assembler

9.18.1 Job Submission

You can submit Draft tile assembler jobs from the Monitor. Normally, these jobs are submitted as dependent jobs for your original tile jobs, but you can submit them manually if you wish.

Submission Options

The general Deadline options are explained in the Job Submission documentation. The Draft Tile Assembler specific options are:

• Input Config File: The file that will control a majority of the assembly.


• Error on Missing File: If enabled, the job will error if any of the tiles in the config file are missing.
• Cleanup Tiles: If enabled, the job will delete all of the tile files after the assembly is complete.
• Build To Force: You can force 32 bit or 64 bit rendering.

Config File Setup

The config file is a plain text file that uses Key/Value pairs (key=value) to control the Draft tile assembly. An example config file is shown after the list.

• TileCount=<#>: The number of tiles that are going to be assembled.
• DistanceAsPixels=<True/False>: Distances are provided in pixels or in a 0.0-1.0 percentage range (defaults to True).
• ImageFolder=<True/False>: If enabled, the assembler will try to assemble all files within a folder.
• BackgroundSource=<filename>: If provided, the assembler will attempt to assemble the new tiles over the specified image.
• TilesCropped=<True/False>: If disabled, the assembler will crop the tiles before assembling them.
• ImageHeight=<#>: The height of the final image. This will be ignored if a background is provided. If this is not provided and the tiles are not cropped, then the first tile will be used to determine the final image size.
• ImageWidth=<#>: The width of the final image. This will be ignored if a background is provided. If this is not provided and the tiles are not cropped, then the first tile will be used to determine the final image size.
• Tile<#>Filename=<filename>: The file name of the tile to be assembled. (Only used if Image Folder is False, 0 indexed)
• Tile<#>X=<#>: The X coordinate for the tile that is to be assembled. 0 at the left side.
• Tile<#>Y=<#>: The Y coordinate for the tile that is to be assembled. 0 at the bottom.
• Tile<#>Width=<#>: The width of the tile that is to be cropped. (Only used if TilesCropped is false)
• Tile<#>Height=<#>: The height of the tile that is to be cropped. (Only used if TilesCropped is false)
• ImageFolder=<folder>: The folder that you would like to assemble images from. (Only used if ImageFolder is true)
• ImagePadding=<#>: The amount of padding on the file names within the folder. (Only used if ImageFolder is true)
• ImageExtension=<extension>: The extension of the files to be assembled. (Only used if ImageFolder is true)
• Tile<#>Prefix=<prefix>: The prefix that the file must contain. (Only used if ImageFolder is true)
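For example, a config file that assembles four pre-cropped 960x540 tiles into a single 1920x1080 frame might look like the following (the file names and dimensions are illustrative only, and only the keys documented above are used):

TileCount=4
DistanceAsPixels=True
TilesCropped=True
ImageWidth=1920
ImageHeight=1080
Tile0Filename=\\server\renders\tile_0_0.exr
Tile0X=0
Tile0Y=0
Tile1Filename=\\server\renders\tile_1_0.exr
Tile1X=960
Tile1Y=0
Tile2Filename=\\server\renders\tile_0_1.exr
Tile2X=0
Tile2Y=540
Tile3Filename=\\server\renders\tile_1_1.exr
Tile3X=960
Tile3Y=540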

9.18.2 Plug-in Configuration

The Draft Tile Assembler plug-in does not require any configuration.

9.18.3 FAQ

There are no FAQ entries at this time.


9.18.4 Error Messages And Meanings

This is a collection of known Draft Tile Assembler error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.19 FFmpeg

9.19.1 Job Submission

You can submit FFmpeg jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The FFmpeg specific options are:


• Input File: The input file.
• Input Arguments: Additional command line arguments for the input file.
• Replace Frame in Input File(s) With Padding: If enabled, the frame number in the file name will be replaced by frame padding before being passed to FFmpeg. This should be enabled if you are passing a sequence of images as input.
• Output File: The output file.
• Output Arguments: Additional command line arguments for the output file.
• Additional Arguments: Additional general command line arguments.
• Additional Input Files: Specify up to 9 additional input files. You can give each file its own arguments, or use the same arguments as the main input file.
• FFmpeg Preset Files: Specify preset files for video, audio, or subtitle.

Example values for these fields are shown after the list.
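As an illustration only (the paths are hypothetical, and the encoding flags are just one common choice rather than a recommendation), converting a rendered image sequence to an H.264 movie might use values like these:

Input File: \\server\renders\shot010\shot010_0001.png
Input Arguments: -framerate 24
Replace Frame in Input File(s) With Padding: enabled
Output File: \\server\renders\shot010\shot010.mp4
Output Arguments: -c:v libx264 -pix_fmt yuv420p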

9.19.2 Plug-in Configuration

You can configure the FFmpeg plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the FFmpeg plug-in from the list on the left.

Render Executables

• FFmpeg Executable: The path to the FFmpeg executable file used for rendering. Enter alternative paths on separate lines.


9.19.3 FAQ

Currently, there are no FAQs for this plug-in.

9.19.4 Error Messages and Meanings

This is a collection of known FFmpeg error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.20 Fusion

9.20.1 Job Submission

You can submit jobs from within Fusion by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within Fusion, select Script -> DeadlineFusionClient.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation.


• Render First And Last Frames First: The first and last frame of the flow/comp will be rendered first, followed by the remaining frames in the sequence. Note that the Frame List above is ignored if this box is checked (the frame list is pulled from the flow/comp itself).
• Build: Force 32 or 64 bit rendering.
• Proxy: The proxy level to use.
• High Quality Mode: Whether or not to render with high quality.
• Check Saver Output: If checked, Deadline will check all savers to ensure they have saved their image file.
• Command Line Mode: Render using separate command line calls instead of keeping the scene loaded in memory between tasks. Using this feature disables the High Quality, Proxy, and Check Saver Output options. This uses the FusionCmd plug-in, instead of the Fusion one.

9.20.2 Plug-in Configuration

You can configure the Fusion and FusionCmd plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Fusion plug-in from the list on the left.

Fusion

Fusion Options

• Fusion Render Executable: The path to the Fusion Render Slave executable used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.
• Fusion Wait For Executable: If you use a proxy RenderSlave.exe, set this to the name of the renamed original. For example, it might be set to RenderSlave_original.exe. Leave blank to disable this feature.


• Fusion Version To Enforce: Deadline will only render Fusion jobs on slaves running this version of Fusion. Use a ; to separate alternative versions. Leave blank to disable this feature.
• Fusion Slave Preference File: The path to a global RenderSlave.prefs preference file that is copied over before starting the Render Slave. Leave blank to disable this feature.
General Fusion Options
• Load Comp Timeout: Maximum time for Fusion to load a comp, in seconds.
• Script Connect Timeout: Amount of time allowed for Fusion to start up and accept a script connection, in seconds.

FusionCmd

• Fusion Render Executable: The path to the Fusion Console Slave executable used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.
• Fusion Slave Preference File: The path to a global RenderSlave.prefs preference file that is copied over before starting the Render Slave. Leave blank to disable this feature.

9.20.3 Integrated Submission Script Setup

The following procedures describe how to install the integrated Fusion submission script. This script allows for submitting Fusion render jobs to Deadline directly from within the Fusion editing GUI.
Submitter Installer
• Run the Submitter Installer located at /submission/Fusion/Installers


Manual Installation of the Submission Script
• Copy [Repository]/submission/Fusion/Client/DeadlineFusionClient.eyeonscript to [Fusion Install Directory]/Scripts/Comp
• Restart Fusion to find the DeadlineFusionClient option in the Script menu.

Custom Sanity Check Setup

In the [Repository]/submission/Fusion/Main folder, you can create a file called CustomSanityChecks.eyeonscript. This script will be called by the main Fusion submission script before submission, and can be used to perform sanity checks. Within this script file, you must define this function, which is called by the main script:

function CustomDeadlineSanityChecks(comp)
    local message = ""

    ...

    return message
end

All your checks should be placed within this function. This function should return a message that contains the sanity check warnings. If an empty message is returned, then it is assumed the sanity check was a success and no warning is displayed to the user. Here is a simple example that checks if any CineFusion tools are being used in the comp file:

function CustomDeadlineSanityChecks(comp)
    local message = ""

    ------ RULE: Check to make sure CineFusion is disabled ------
    cinefusionAttrs = fusion:GetRegAttrs("CineFusion")
    if not (cinefusionAttrs == nil) then
        cinefusion_regID = cinefusionAttrs.REGS_ID
        local i = nil
        for i, v in comp:GetToolList() do
            if (v:GetID() == cinefusion_regID) and (v:GetAttrs().TOOLB_PassThrough == false) then
                message = message .. "CineFusion '" .. v:GetAttrs().TOOLS_Name .. "' should be disabled\n"
            end
        end
    end

    return message
end

9.20.4 FAQ

Which versions of Fusion are supported?
Fusion 5 and later are supported.
What's the difference between the Fusion and FusionCmd plugins?
The Fusion plugin starts the Fusion Render Node in server mode and uses eyeonscript to communicate with the Fusion renderer. Fusion and the comp remain loaded in memory between tasks to reduce overhead. This is usually the preferred way of rendering with Fusion.


The FusionCmd plugin renders with Fusion by executing command lines, and can be used by selecting the Command Line mode option in the Fusion submitter. Because Fusion needs to be launched for each task, there is some additional overhead when using this plugin. In addition, the Proxy, High Quality, and Saver Output Checking features are not supported in this mode. However, this mode tends to print out better debugging information when there are problems (especially when Fusion complains that it can't load the comp), so we recommend using it to help figure out problems that may be occurring when using the Fusion plugin.
Can I use both workstation and render node licenses to render jobs in Deadline?
You can use workstation licenses to render, you just need to do a little tweaking to get this to work nicely. In the Plugin Configuration settings, you need to specify two paths for the render executable option. The first path will be the render node path, and the second will be the actual Fusion executable path. You then have to make sure that the render node is not installed on your workstations. Because you have specified two paths, Deadline will only use the second path if the first one doesn't exist, which is why the render nodes can't be installed on your workstations.
Why is it not possible to have 2 instances of Fusion running?
With Fusion there is only one TCP/IP port to which eyeonscript (the scripting language used to run Fusion renders on a slave computer) can connect. If Fusion is open on a slave computer then the port will be in use, and the Fusion Render Node will have to wait for the port to become available before rendering of Fusion jobs on that slave can begin.
Fusion alone renders fine, but with Deadline, the slaves are failing on the last frame.
This is usually accompanied by this error message:

INFO: Checking file \\path\to\filename####.ext
INFO: Saver "SaverName" did not produce output file.
INFO: Expected file "\\path\to\filename####.ext" to exist.

The issue likely has to do with the processing of fields as opposed to full frames. When processing your output as fields, the frames are rendered in two halves (for example, frame 1 would be rendered as 1.0 and 1.5). This error often occurs when the Global Timeline is not set to include the second half of the final frame. Simply adding a .5 to the Global End Time should resolve this issue. For example, let us assume that you are processing fields and your output range is 0 - 100. If the Global Timeline is set to be 0.0 - 100.0, Fusion will render everything, but Deadline will fail on the last frame. If the Global Timeline is set to be 0.0 - 100.5, Deadline will render everything just fine. Is there a way to increase Deadline’s efficiency when rendering Fusion frames that only take a few seconds to complete? Rendering these frames in groups (groups of 5 for example) tends to reduce the job’s overall rendering time. The group size can be set in the Fusion submission dialog using the Task Group Size option. Does Fusion cache data between frames on the network, in the same way it does when rendering sequences locally? Deadline renders each block of frames using the eyeonscript comp.render function. The Fusion Render Node is kept running between each block rendered, so when Fusion caches static results, it can be used by the next block of frames to be rendered on the same machine. Fusion seems to be taking a long time to start up when rendering. What can I do to fix this? If you are running Fusion off a remote share, this can occur when there is a large number of files in the Autosave folder. If this is the case, deleting the files in the Autosave folder should fix the problem. Can I use relative paths in my Fusion comp when rendering with Deadline?


If your comp is on a network location, and everything is relative to that network path, you can use relative paths if you choose the option to not submit the comp file with the job. In this case, the slaves will load the comp directly over the network, and there shouldn’t be any problems with the relative paths. Just make sure that your render nodes resolve the paths the same way your workstation does.

9.20.5 Error Messages and Meanings

This is a collection of known Fusion error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know.
Exception during render: Failed to load the comp "[flowname].comp" in startjob.eyeonscript
This error usually occurs because the render node is missing a plug-in that is referenced by the flow in question. Often this is because there is a plug-in installed on the machine from which the job was submitted that is not in the Fusion Render Node plug-in directory on the slave machine. It is important to remember that the Fusion Render Node has a different plug-in store than Fusion, even on the same machine, thus one should ensure that the needed plug-ins are copied/installed in both locations.
Exception during render: The fusion renderer reported that the render failed. Scroll down to the bottom of the log below for more details.
This can occur for a number of reasons, but often Fusion will print out the cause of the error. In the error log window, scroll to the end of the Slave Log capture, which is near the bottom of the error message window, and there will be a part which looks something like the following message. This particular message indicates that a font was missing on the machine.

INFO: Render started at Wed 8:17PM (Range: 198 to 198)
INFO:
INFO: Comments: Could not find font "SwitzerlandCondensed"
INFO:
INFO: Saver 1 failed at time 198
INFO:
INFO: Render failed at Wed 8:18PM! Last frame rendered: (none)!
INFO:
INFO: Render failed

We've usually found that the problem behind this error was a plug-in that was installed for Fusion, but not for the Fusion Render Node. Try updating your Fusion Render Node plug-ins to match your Fusion plug-ins exactly, and check whether the error still occurs.
Exception during render: Eyeonscript failed to make a connection in startjob.eyeonscript - check that Eyeonscript is set to no login required
In order to connect to the Fusion Render Node and communicate with it, Deadline uses eyeonscript, the scripting language provided for Fusion. The script connects to the Fusion Render Node via a socket connection, which by default requires a login username and password to connect. In order for Deadline to be able to render using a given Fusion Render Node, you must change its settings so that it no longer requires the username and password. This is done by running the Fusion Render Node, right-clicking on the icon it creates, and choosing Preferences. From there, pick the Script option, and you will see radio buttons, one of which says 'No login required'. Make sure that is the option selected, then click Save to save the preferences, and exit the Fusion Render Node.

9.21 Generation

9.21.1 Job Submission

You can submit comp jobs to Fusion from within Generation by installing the integrated submission script. The instructions for installing the integrated submission script can be found further down this page. In Generation, select the comp(s) you want to submit, and then right-click and select Submit.


This will bring up the submission window. Note that the submission window is only shown once, and all jobs that are submitted will use the same job settings.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Fusion options are:
• Use Frame List In Comp: Uses the frame list defined in the comp files instead of the Frame List setting. If you are submitting more than one comp from Generation, you should leave this option enabled unless you want the Frame List setting to be used for each comp.
• Proxy: The proxy level to use.
• High Quality Mode: Whether or not to render with high quality.
• Check Output: If checked, Deadline will check all savers to ensure they have saved their image file.
• Version: The version of Fusion to render with.
• Build: Force 32 or 64 bit rendering.
• Command Line Mode: Render using separate command line calls instead of keeping the scene loaded in memory between tasks. Using this feature disables the High Quality, Proxy, and Check Saver Output options. This uses the FusionCmd plug-in, instead of the Fusion one.

9.21.2 Plug-in Configuration

The Generation submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for information on con- figuring the Fusion plug-in.

9.21.3 Integrated Submission Script Setup

The following procedures describe how to install the integrated Generation submission script. This script allows for submitting Generation comp jobs to Deadline directly from within the Generation editing GUI.
Submitter Installer
• Run the Submitter Installer located at /submission/Generation/Installers
Manual Installation of the Submission Script
• Copy [Repository]\submission\Generation\Client\DeadlineGenerationClient.lua to the Generation scripts folder ([Generation Install Folder]\scripts\generation).
• In the Generation program data folder (%PROGRAMDATA%\eyeon\Generation), you'll need to edit your Generation.cfg file. If you currently do not have a Generation.cfg file, create an empty one. Open your Generation.cfg file and add this:

SCRIPT_FARMSUBMIT="scripts\generation\DeadlineGenerationClient.lua"

• Save the file. The next time you start up Generation, this script will be used when you select the Submit option for the selected comps.

9.21.4 FAQ

Which versions of Generation are supported? Generation 2 and later are supported.

9.21.5 Error Messages and Meanings

The Generation submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for Fusion error messages and meanings.


9.22 Hiero

9.22.1 Job Submission

You can submit transcoding jobs to Nuke from within Hiero by installing the integrated submission script. The instructions for installing the integrated submission script can be found further down this page. To submit from within Hiero, open the Export window from the File menu, or by right-clicking on a sequence. Then choose the Submit To Deadline option in the Render Background Tasks drop down and press Export.

This will bring up the submission window. Note that the submission window is only shown once, and all jobs that are submitted will use the same job settings.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Nuke specific options are: • Render With NukeX: Enable this option if you want to render with NukeX instead of Nuke. • Render Threads: The number of threads to use for rendering. • Continue On Error: If enabled, Nuke will attempt to keep rendering if an error occurs. • Maximum RAM Usage: The maximum RAM usage (in MB) to be used for rendering. • Use Batch Mode: If enabled, Deadline will keep the Nuke file loaded in memory between tasks. • Build To Force: Force 32 or 64 bit rendering.


9.22.2 Cross-Platform Rendering Considerations

The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for cross-platform rendering considerations.

9.22.3 Plug-in Configuration

The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for information on configuring the Nuke plug-in.

9.22.4 Integrated Submission Script Setup

The following procedures describe how to install the integrated Hiero submission script. This script allows for submitting Hiero transcoding jobs to Deadline directly from within the Hiero editing GUI. These jobs are then rendered using the Nuke plugin.
Submitter Installer
• Run the Submitter Installer located at /submission/Hiero/Installers
Manual Installation of the Submission Script
• Go to your .hiero user folder (~/.hiero or %USERPROFILE%\.hiero) and create a folder called "Python" if it doesn't exist.
• Open the "Python" folder and create another folder called "Startup" if it doesn't exist.
• Copy [Repository]\submission\Hiero\Client\DeadlineHieroClient.py to the "Startup" folder (~/.hiero/Python/Startup or %USERPROFILE%\.hiero\Python\Startup).
The next time you launch Hiero, there should be a Submit To Deadline option in the Hiero Export window, in the Render Background Tasks drop down.

9.22.5 FAQ

The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for additional FAQs related to Nuke. Which versions of Hiero are supported? Hiero 1.0 and later are supported. How does the Deadline submission script for Hiero work? The submission script submits transcoding jobs from Hiero to Deadline, which are rendered with the Nuke plugin.

9.22.6 Error Messages and Meanings

The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for Nuke error messages and meanings.


9.23 Houdini

9.23.1 Job Submission

You can submit jobs from within Houdini by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within Houdini, select ‘Render’ -> ‘Submit To Deadline’.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Houdini specific options are:


• ROP To Render:
  – Choose: Allows you to choose your ROP from the drop-down to the right.
  – Selected: Allows you to render each ROP that you currently have selected in Houdini (in the order that you selected them).
  – All: Allows you to render every ROP in the Houdini file.
• Houdini File: The Houdini file to be rendered. Note that anything within the hip file that points to an outside file absolutely must have an absolute address on a shared network location, or the render nodes won't be able to find the external references.
• Output File: The output filename.
• Render Node: You must manually enter a renderer (output driver) to use. Note that the full node path must be specified, so if your output driver is 'mantra1', it's likely that the full node path would be '/out/mantra1' (see the sketch after this list).
• Version: The version of Houdini to use.
• Build to Force: Force 32 or 64 bit rendering.
• Override Export IFD: Enable this option to export IFD files from the Houdini scene file. Specify the path to the output IFD files here.
• Submit Dependent Mantra Standalone Job: Enable this option to submit a dependent Mantra standalone job that will render the exported IFD files. See the section below for more details.
• Ignore Inputs: If enabled, only the specified ROP will be rendered. No dependencies will be rendered.
• Submit Houdini Scene: If this option is enabled, the scene file will be submitted with the job, and then copied locally to the slave machine during rendering.
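If you are unsure of the full node path for the Render Node field, a minimal hython sketch like the one below can list the output drivers in a scene. It assumes Houdini's hou Python module is available and uses an example network path for the hip file.

# Minimal sketch: list the full paths of the output drivers in a hip file so the
# exact node path (e.g. "/out/mantra1") can be copied into the Render Node field.
# The hip file path here is an assumption for illustration.
import hou

hou.hipFile.load("//server/projects/shot010/shot010.hip")
for rop in hou.node("/out").children():
    print(rop.path())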

IFD Exporting and Mantra Standalone

The Houdini submitter allows you to submit a job that will export the scene to IFD files, and then submit a dependent Mantra Standalone job to render the exported IFD files.


When submitting from the Monitor, you just need to enable the Override Export IFD option. When submitting from within Houdini using the integrated submission script, you must first make sure that the ROPs you wish to export have the Disk File option enabled in their properties, and then enable the Submit Dependent Mantra Standalone Job option in the submitter. Note that if a ROP does not have the Disk File setting enabled, it will simply render the image, and no dependent Mantra Standalone job will be submitted.


The general Deadline options for the Mantra Standalone job are explained in the Job Submission documentation. The Mantra Standalone specific options are:
• Mantra Threads: The number of threads to use for the Mantra standalone job.

9.23.2 Plug-in Configuration

You can configure the Houdini plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Houdini plug-in from the list on the left.


Render Executables • Hython Executable: The path to the hython executable. It can be found in the Houdini bin folder. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes. Licensing Options • Slaves To Use Escape License: A list of slaves that should use a Houdini Escape license instead of a Batch license. Use a , to separate multiple slave names, for example: slave001,slave002,slave003

9.23.3 Integrated Submission Script Setup

The following procedures describe how to set up the integrated Houdini submission script for Deadline. This script has been tested with Houdini 9 and later.
Submitter Installer
• Run the Submitter Installer located at /submission/houdini/Installers
Manual Installation of the Submission Script
On Windows or Linux, copy the client script to the Houdini install directory:
• If the folder [Houdini Install Directory]\houdini\scripts\deadline\ doesn't exist, create it.
• Copy [Repository]\submission\Houdini\Client\DeadlineHoudiniClient.py to [Houdini Install Directory]\houdini\scripts\deadline\DeadlineHoudiniClient.py
On Mac OSX, copy the client script to the Houdini Framework folder:
• If the folder [Houdini Framework]/Versions/[Houdini Version]/Resources/houdini/scripts/deadline/ doesn't exist, create it.


• Copy [Repository]\submission\Houdini\Client\DeadlineHoudiniClient.py to [Houdini Framework]/Versions/[Houdini Version]/Resources/houdini/scripts/deadline/DeadlineHoudiniClient.py
• The Houdini Framework folder can typically be found in /Library/Frameworks/Houdini.Framework
Add a menu item to execute the script
• Open the file [Houdini Install Directory]/houdini/MainMenuCommon in a text editor.
• Add the following in between the and tags, and make sure it is added after the closing tag.

render_menu $HFS/houdini/scripts/deadline/DeadlineHoudiniClient.py

For example, this is what the last few lines of your MainMenuCommon file might look like:

render_menu $HFS/houdini/scripts/deadline/DeadlineHoudiniClient.py

9.23.4 FAQ

Which versions of Houdini are supported by Deadline? Houdini 9 and later are supported. To render with Houdini 7 or 8, use the Mantra Plug-in. Which Houdini license(s) are required to render with Deadline? Deadline uses Hython to render, which uses hbatch licenses. If those are not available, it will try to use a Master License instead.

9.23.5 Error Messages and Meanings

This is a collection of known Houdini error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.


9.24 Lightwave

9.24.1 Job Submission

You can submit jobs from within Lightwave by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within Lightwave, select the Render Tab and click the SubmitToDeadline button on the left.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Lightwave specific options are:
• Content Directory: The Lightwave Content directory. Refer to your Lightwave documentation for more information.
• Config Directory: The Lightwave Config directory. Refer to your Lightwave documentation for more information.
• Force Build: For Lightwave 9 and later, force rendering in 32 bit or 64 bit.
• Use FPrime Renderer: Enable this option if you want to use the FPrime renderer instead of the normal Lightwave renderer.
• Use ScreamerNet Rendering: ScreamerNet rendering keeps the Lightwave scene loaded in memory between frames, which reduces overhead time when rendering.
Notes:
• At the moment, there is no support for rendering animation (movie) files. Any animation options will be ignored, and an RGB output and/or Alpha output must be specified in order to submit to Deadline.


• In the Scene file, some versions of Lightwave use a number to specify the output file type and some use the actual file type extension (.tif, .tga, etc). In the versions that use the actual file type extension, individual rendered images can be viewed from the Monitor task list by right-clicking on them. • For information on how to properly set up your network for Lightwave rendering, see the ScreamerNet section of your Lightwave documentation. When Lightwave is properly configured for ScreamerNet rendering, it will then render properly through Deadline.

9.24.2 Cross-Platform Rendering Considerations

In order to perform cross-platform rendering with Lightwave, you must set up Mapped Paths so that Deadline can swap out file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list on the left.

From here, you can set the list of executables that will be used for rendering. To get a more detailed description of each setting, simply hover the mouse cursor over a setting and a tool tip will be displayed.

9.24.3 Integrated Submission Script Setup

This section describes how to install the integrated render job submission script for Lightwave. This script allows for submitting Lightwave render jobs to Deadline directly from within the Lightwave editing GUI. Note that on Mac OSX, this script is only supported by the Universal Binary versions of Lightwave. • Click the Utilities tab. Find the Plugins section on the right and click the Add Plug-ins button. Select the DeadlineLightwaveClient.ls file found in [Repository]\submission\Lightwave\Client.


• Click the Edit menu in the top-left corner and select the Edit Menu Layout... option.


• In the Command list on the left, expand the Plug-ins section in Lightwave 8 or the Additional section in Lightwave 9 and later, and find the DeadlineLightwaveClient plugin. Drag and drop it into the Menus list in the Render section. Click Done.

• Click the Render tab. There should be a DeadlineLightwaveClient button on the right. If there is not, check to make sure you placed the DeadlineLightwaveClient plugin in the correct section.


9.24.4 FAQ

Which versions of Lightwave are supported?
Lightwave versions 8 and later are supported. On Mac OSX, both the PPC and Universal Binary versions work. However, the integrated Lightwave submission script only works with the Universal Binary version.
Lightwave 10 integrated submitter crashes with Deadline 5.0 and older on Mac OSX.
Due to an API change in LightWave, previous integrated submission scripts will not work under LightWave 10 on OSX. This is fixed in Deadline 5.1.
Does Deadline support the FPrime renderer?
Yes. FPrime has its own net rendering application called wsn.exe, which can be configured in the Lightwave plugin configuration. When you submit your Lightwave job, just make sure to have the Use FPrime Renderer option checked.
When rendering with FPrime, I get an error that it can't create a temporary config directory.
This can occur when the job is using a shared Config folder on the network. FPrime tries to create a temporary config directory in this shared folder, and this can fail if many slaves are trying to access that Config folder at the same time. To avoid this problem, we suggest enabling the FPrime Use Local Config option in the Lightwave Plugin Configuration, which can be accessed from the Monitor while in Super User mode by selecting Tools -> Configure Plugins. When this option is enabled, Deadline will copy the contents of the shared Config folder to a local folder, and this is the Config folder that FPrime will use.
What does the Use ScreamerNet Rendering option in the submission dialog do?
When using ScreamerNet rendering, the Lightwave scene is kept loaded in memory between each frame for a job, which greatly reduces the overhead of having to load the scene at the beginning of each frame.
Does Deadline work if one renames the Lightwave configuration files in the configuration directory?
Currently, Deadline assumes that you have not renamed the Lightwave configuration files in the Lightwave configuration directory.

9.24.5 Error Messages and Meanings

This is a collection of known Lightwave error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.25 LuxRender

9.25.1 Job Submission

You can submit LuxRender jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The LuxRender specific options are: • LXS File: The file to render. • Threads: The number of threads to use. Specify 0 to use the same number of threads as there are CPUs.

9.25.2 Plug-in Configuration

You can configure the LuxRender plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the LuxRender plug-in from the list on the left.


Render Executables • Luxrender Executable: The path to the luxconsole executable file used for rendering. Enter alternative paths on separate lines.

9.25.3 FAQ

Is LuxRender supported by Deadline? Yes.

9.25.4 Error Messages and Meanings

This is a collection of known LuxRender error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.26 Mantra Standalone

9.26.1 Job Submission

You can submit Mantra Standalone jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Mantra specific options are: • IFD File: Specify the Mantra IFD file(s) to render.


– If you are submitting a sequence of .IFD files, select one of the numbered frames in the sequence, and the frame range will automatically be detected if Calculate Frames From IFD File is enabled. The frames you choose to render should correspond to the numbers in the .IFD files. • Output File: The output file path. • Version: The Mantra version to render with. • Threads: The number of threads to use for rendering. • Additional Arguments: Additional command line arguments to pass to the renderer.
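To illustrate what the frame range detection described above amounts to, the following minimal sketch scans a folder for a padded IFD sequence and reports the frame numbers found. This is not the submitter's actual code; the file naming pattern and path are assumptions.

# Hypothetical sketch: derive the frame range from a numbered .ifd sequence
# such as shot010.0001.ifd ... shot010.0100.ifd. Path and pattern are assumed.
import glob
import re

frames = []
for path in glob.glob("//server/ifds/shot010.*.ifd"):
    match = re.search(r"\.(\d+)\.ifd$", path)
    if match:
        frames.append(int(match.group(1)))

if frames:
    print("Detected frame range: %d-%d" % (min(frames), max(frames)))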

9.26.2 Plug-in Configuration

You can configure the Mantra plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Mantra plug-in from the list on the left.

Render Executables • Mantra Executable: The path to the Mantra executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes. Path Mapping (For Mixed Farms) • Enable Path Mapping: If enabled, a temporary IFD file will be created locally on the slave for rendering because Deadline does the path mapping directly in the IFD file. This feature can be turned off if there are no Path Mapping entries defined in the Repository Options.

9.26.3 FAQ

Which versions of Mantra are supported by Deadline?


Mantra for Houdini 7 and later is supported by Deadline.

9.26.4 Error Messages and Meanings

This is a collection of known Mantra error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.27 Maxwell

9.27.1 Job Submission

You can submit Maxwell jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Maxwell specific options are: • Maxwell File(s): The Maxwell files to be rendered. Can be a single file, or a sequence of files. • Version: The version of Maxwell to render with.


• Verbosity: Set the amount of information that Maxwell should output while rendering.
• Single Frame Job: This should be checked if you're submitting a single Maxwell file only.
• Build To Force: Force 32 bit or 64 bit rendering.
• Render Threads: The number of threads to use during rendering. Specify 0 to use the default setting.
• Cooperative Rendering: Enable this to use Maxwell's co-op rendering feature to render the same image across multiple computers. You can then use Maxwell to combine the resulting output after the rendering has completed.
• Split Co-op Renders Into Separate Jobs: By default, a co-op render is submitted as a single job, where each task represents a different seed. If this option is enabled, a separate job will represent each seed.
• Auto-Merge Files: Enable this option to auto-merge the co-op renders into the final image.
• Fail On Missing Intermediate Files: If enabled, the auto-merge will fail if any co-op renders are missing.
• Delete Intermediate Files: If enabled, the co-op renders will be deleted after the final image is merged together.
• Output MXI File: Optionally configure the output path for the MXI file, which can be used to resume the render later. Note that this is required for co-op rendering.
• Output Image File: Optionally configure the output path for the image file.
• Enable Local Rendering: If enabled, Deadline will save the output locally and then copy it to the final network location.
• Resume Rendering From MXI File: If enabled, Maxwell will use the specified MXI file to resume the render if it exists. If you suspend the job in Deadline, it will pick up from where it left off when it resumes.
• Override Time: Enable to override the Time setting in the Maxwell file.
• Override Sampling: Enable to override the Sampling setting in the Maxwell file. If the Adjust For Cooperative Rendering option is enabled, the sampling level given to each slave will be reduced accordingly to ensure that the final merged sampling level will match the requested one.
• Additional Arguments: Additional command line arguments to pass to the renderer.

Resuming a Render

When specifying an MXI file, you now have the option to have Maxwell use it to resume a render job if that MXI file already exists. This means that if you suspend a Maxwell job from the Monitor mid-render, it will resume from where it left off when you resume the job.

9.27.2 Plug-in Configuration

You can configure the Maxwell plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Maxwell plug-in from the list on the left.


General Maxwell Options • Slaves To Use Interactive License: A list of slaves that should use an interactive Maxwell license instead of a render license. Use a , to separate multiple slave names, for example: slave001,slave002,slave003 Maxwell Version Options • Render Executable: The path to the Maxwell executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes. • Merge Executable: The path to the Maxwell executable file used for merging. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.27.3 FAQ

Which version of Maxwell is supported by Deadline? Versions 2 and later are supported. Is Co-op Rendering supported? Yes. Can I resume from a previous Maxwell render? If you have the Resume Rendering From MXI File option enabled when submitting the job, Maxwell will use the specified MXI file to resume the render if it exists. If you suspend the job in Deadline, it will pick up from where it left off when it resumes.


9.27.4 Error Messages and Meanings

This is a collection of known Maxwell error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.28 Maya

9.28.1 Job Submission

You can submit jobs from within Maya by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page.



To submit from within Maya, select the Thinkbox shelf and press the green button there. If the green icon is missing, you can delete the shelf and restart Maya to get it back.

Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Maya specific options are: • Camera: Select the camera to render with. Leaving this blank will force Deadline to render using the default camera settings (including multiple camera outputs). • Project Path: The Maya project folder (this should be a shared folder on the network). • Output Path: The folder where your output will be dumped (this should be a shared folder on the network). • Maya Build: Force 32 bit or 64 bit rendering. • Use MayaBatch Plugin: This uses our new MayaBatch plugin that keeps the scene loaded in memory between frames, thus reducing the overhead of rendering the job. This plugin is no longer considered experimental. • Ignore Error Code 211: This allows a Maya task to finish successfully even if the Maya command line renderer returns the non-zero error code 211 (not available when using the MayaBatch plugin). Sometimes Maya will return this error code even after successfully saving the rendered images. • Startup Script: Maya will source the specified script file on startup (only available when using the MayaBatch plugin). • Command Line Args: Specify additional command line arguments to pass to the Maya command line renderer (not available when using the MayaBatch plugin). • Deadline Job Type: Select the type of Maya job you want to submit. The available options are covered in the next few sections.

Maya Render Job

If rendering a normal Maya job, select the Maya Render Job type.


General Options
The following options are available:
• Threads: The maximum number of CPUs per machine to render with.
• Frame Number Offset: Uses Maya's frame renumbering option to offset the frames that are rendered.
• Submit Render Layers As Separate Jobs: Enable to submit each layer in your scene as a separate job.
• Override Layer Job Settings: If submitting each layer as a separate job, enable this option to override the job name, frame list, and task size for each layer. When enabled, the override dialog will appear after you press Submit.


• Submit Cameras As Separate Jobs: Enable to submit each camera as a separate job.
• Ignore Default Cameras: Enable to have Deadline skip over cameras like persp, top, etc., when submitting each camera as a separate job (even if those cameras are set to renderable).
• Enable Local Rendering: If enabled, Deadline will render the frames locally before copying them over to the final network location. This has been known to improve the speed of Maya rendering in some cases.
• Strict Error Checking: Enable this option to have Deadline fail Maya jobs when Maya prints out any "error" or "warning" messages. If disabled, Deadline will only fail on messages that it knows are fatal.
• Render Half Frames: If checked, frames will be split into two using a step of 0.5. Note that frame 0 will save out images 0 and 1, frame 1 will save out images 2 and 3, frame 2 will save out images 4 and 5, etc.
Region Rendering Options
Setting up a region rendering job:


• Enable Region Rendering: If enabled, the frame will be split into multiple tiles that are rendered individually and can be assembled afterwards.
• Region Render Type: If set to Jigsaw Rendering, then the submissions will use Jigsaw, otherwise a grid of tiles will be used.
• Submit All Tiles as a single Job: If enabled, a single frame will be submitted with all tiles in a single job; otherwise each tile will be submitted as a separate job, with each frame being a separate frame.
• Submit Dependent Assembly Job: Submit a job, dependent on the region job, that will assemble the tiles. If doing a Jigsaw animation, a separate assembly job will be created for each differently named output file.
• Cleanup Tiles after Assembly: If selected, the tiles will be deleted after assembly.
• Error on missing Tiles: If enabled, the assembly job will fail if any of the tiles are missing.
• Assemble Over: Determines what the Draft Tile Assembler should assemble over, be it a blank image, the previous output, or a specified file.
• Error on Missing Background: If enabled, the job will fail if the background file is missing.
• You can submit a dependent assembly job to assemble the image when the main tile job completes. If using Draft for the assembly, you'll need a license from Thinkbox. Otherwise, the output formats that are supported are BMP, DDS, EXR, JPG, JPE, JPEG, PNG, RGB, RGBA, SGI, TGA, TIF, and TIFF.
• Note that the Error On Missing Tiles option only applies to Draft assemblies.
Renderer Specific Options
If rendering with Mental Ray, there is an additional Mental Ray Options section under the Maya Options:
• Mental Ray Verbosity: Set the verbosity level for Mental Ray renders.
• Auto Memory Limit: If enabled, Mental Ray will automatically detect the optimal memory limit when rendering.
• Memory Limit: Soft limit (in MB) for the memory used by Mental Ray (specify 0 for unlimited memory).
If rendering with VRay, there is an additional VRay Options section under the Maya Options:
• Auto Memory Limit Detection: If enabled, Deadline will automatically detect the dynamic memory limit for VRay prior to rendering.
• Memory Buffer: Deadline subtracts this value from the system's unused memory to determine the dynamic memory limit for VRay.
If rendering with Redshift, there will be an additional Redshift Options section under the Maya Options:
• GPUs Per Task: If set to 0 (the default), then Redshift will be responsible for choosing the GPUs to use for rendering. If this is set to 1 or greater, then each task for the job will be assigned specific GPUs. This can be used in combination with concurrent tasks to get a distribution over the GPUs (see the sketch after this list).
• For example:
  – if this is set to 1, then tasks rendered by the Slave's thread 0 would use GPU 0, thread 1 would use GPU 1, etc.
  – if this is set to 2, then tasks rendered by the Slave's thread 0 would use GPUs {0,1}, thread 1 would use GPUs {2,3}, etc.
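The sketch below simply restates the GPU affinity pattern from the example above in code form, assuming a 0-based thread index and a GPUs Per Task value of 1 or greater; it is for illustration and is not taken from the plugin itself.

# Illustrative sketch of the thread-to-GPU mapping described above.
# thread_index is the Slave's concurrent task thread (0, 1, 2, ...).
def gpus_for_thread(thread_index, gpus_per_task):
    start = thread_index * gpus_per_task
    return list(range(start, start + gpus_per_task))

print(gpus_for_thread(0, 2))  # [0, 1]
print(gpus_for_thread(1, 2))  # [2, 3]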

Mental Ray Export Job

If rendering a Mental Ray Export job, select the Mental Ray Export Job type.


The following options are available:
• Output File: The full filename of the Mental Ray files that will be exported. Padding is handled automatically by the exporter.
• Export Settings: This opens up the Mental Ray export settings dialog where you can configure the remaining settings. Note that this dialog must be open when you submit the job.


You have the option to submit a dependent Mental Ray Standalone job that will render the exported mi files after the export job finishes. Settings like pool, priority, group, etc, will be the same as the export job, but there are some Mental Ray specific job options that you can specify as well.

VRay Export Job

If rendering a VRay Export job, select the VRay Export Job type.


The following options are available: • Output File: The full file name of the VRay files that will be exported (padding is handled automatically by the exporter). • VRay Render Job: You have the option to submit a dependent VRay Standalone job that will render the exported vrscene files after the export job finishes. Settings like pool, priority, group, etc, will be the same as the export job, but there are some VRay specific job options that you can specify here. • Vrimg2Exr Render Job: If you are submitting a dependent VRay Standalone job, and the output format is vrimg, you have the option to submit a dependent job that will convert the vrimg files to exr files, using VRay’s vrimg2exr application.

Renderman Export Job

If rendering a Renderman Export job, select the Renderman Export Job type.

The following options are available: • Threads: The number of threads to use for exporting. Specify 0 to automatically use the optimal number of threads. • PRMan Render Job: You have the option to submit a dependent PRMan job that will render the exported rib files after the export job finishes. Settings like pool, priority, group, etc, will be the same as the export job, but there are some PRMan specific job options that you can specify here.


Arnold Export Job

If rendering an Arnold Export job, select the Arnold Export Job type.

The following options are available: • Arnold Render Job: You have the option to submit a dependent Arnold Standalone job that will render the exported .ass files after the export job finishes. Settings like pool, priority, group, etc, will be the same as the export job, but there are some Arnold specific job options that you can specify here.

9.28.2 Cross-Platform Rendering Considerations

In order to perform cross-platform rendering with Maya, you must set up Mapped Paths so that Deadline can swap out the Project and Output paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by selecting Tools -> Configure Repository. You'll find the Mapped Paths Setup in the list on the left.
As long as all paths used in your Maya scene are relative to the Project and Output paths, and those paths are network accessible, you should have no problems performing cross-platform renders. However, if you are using absolute paths in your Maya scene file, it is possible for Deadline to swap them as well, but you must save your scene file as a Maya ASCII (.ma) file. Because .ma files are ASCII files, Deadline can read them and swap out paths as necessary. If they're saved as Maya Binary (.mb) files, they can't be read, and can't have their paths swapped.
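For illustration only, the snippet below shows the kind of text substitution that path mapping performs on an ASCII .ma file; it is a minimal sketch, not Deadline's implementation, and the two path prefixes and file names are assumptions.

# Minimal sketch of swapping an absolute Windows-style path prefix for a Linux
# one inside a Maya ASCII scene. The prefixes and file names are assumptions.
windows_prefix = "//fileserver/projects/"
linux_prefix = "/mnt/projects/"

with open("shot010.ma", "r") as scene_file:
    scene_text = scene_file.read()

scene_text = scene_text.replace(windows_prefix, linux_prefix)

with open("shot010_mapped.ma", "w") as mapped_file:
    mapped_file.write(scene_text)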

9.28.3 Plug-in Configuration

You can configure the MayaBatch and MayaCmd plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Maya plug-in from the list on the left.


MayaBatch

Render Executables
• Maya Executable: The path to the Maya executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.
Maxwell For Maya (version 2 and later)
• Slaves To Use Interactive License: A list of slaves that should use an interactive Maxwell license instead of a render license. Use a , to separate multiple slave names, for example: slave001,slave002,slave003
Path Mapping For ma Scene Files (For Mixed Farms)
• Enable Path Mapping For ma Files: If enabled, a temporary ma file will be created locally on the slave for rendering and Deadline will do path mapping directly in the ma file.
Debugging
• Log Script Contents To Render Log: If enabled, the full script that Deadline is passing to Maya will be written to the render log. This is useful for debugging purposes.


MayaCmd

Render Executables
• Maya Executable: The path to the Maya executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.
Maxwell For Maya (version 2 and later)
• Slaves To Use Interactive License: A list of slaves that should use an interactive Maxwell license instead of a render license. Use a , to separate multiple slave names, for example: slave001,slave002,slave003
Path Mapping For ma Scene Files (For Mixed Farms)
• Enable Path Mapping For ma Files: If enabled, a temporary ma file will be created locally on the slave for rendering and Deadline will do path mapping directly in the ma file.

9.28.4 Integrated Submission Script Setup

The following procedures describe how to install the integrated Maya submission script. This script allows for submitting Maya render jobs to Deadline directly from within the Maya editing GUI. The script and the following installation procedure have been tested with Maya 2010 and later.
Submitter Installer
• Run the Submitter Installer located at /submission/Maya/Installers
Manual Installation of the Submission Script
On Windows, copy the file [Repository]\submission\Maya\Client\DeadlineMayaClient.mel to [Maya Install Directory]\scripts\startup. If you do not have a userSetup.mel in [My Documents]\maya\scripts, copy the file [Repository]\submission\Maya\Client\userSetup.mel to [My Documents]\maya\scripts. If you have a userSetup.mel file, add the following line to the end of this file:


source "DeadlineMayaClient.mel";

On Mac OS X, copy the file [Repository]/submission/Maya/Client/DeadlineMayaClient.mel to [Maya Install Directory]/Maya.app/Contents/scripts/startup. If you do not have a userSetup.mel in /Users/[USERNAME]/Library/Preferences/Autodesk/maya/scripts, copy the file [Repository]/submission/Maya/Client/userSetup.mel to /Users/[USERNAME]/Library/Preferences/Autodesk/maya/scripts. If you have a userSetup.mel file, add the following line to the end of this file:

source "DeadlineMayaClient.mel";

On Linux, copy the file [Repository]/submission/Maya/Client/DeadlineMayaClient.mel to [Maya Install Directory]/Maya.app/Contents/scripts/startup. If you do not have a userSetup.mel in /home/[USERNAME]/maya/scripts, copy the file [Repository]/submission/Maya/Client/userSetup.mel to /home/[USERNAME]/maya/scripts. If you have a userSetup.mel file, add the following line to the end of this file:

source "DeadlineMayaClient.mel";

The next time Maya is started, a Deadline shelf should appear with a green button that can be clicked on to launch the submitter. If you don't see the Deadline shelf, it's likely that Maya is loading another userSetup.mel file from somewhere. Maya can only load one userSetup.mel file, so you either have to configure Maya to point to the file mentioned above, or you have to modify the file that Maya is currently using as explained above. To figure out which userSetup.mel file Maya is using, open up Maya and then open up the Script Editor. Run this command:

whatIs userSetup.mel

Custom Sanity Check

You can create a CustomSanityChecks.mel file alongside the main SubmitMayaToDeadline.mel in the [Repository]\submission\Maya\Main folder, and it can be used to set defaults in the submission script before it is displayed. For example, here is a script that can set the default Limit Groups based on the renderer:

AddStringAttribute( "deadlineLimitGroups" );
if( GetCurrentRenderer() == "mentalRay" )
    setAttr defaultRenderGlobals.deadlineLimitGroups -type "string" "mental_ray_for_maya";
else if( GetCurrentRenderer() == "vray" )
    setAttr defaultRenderGlobals.deadlineLimitGroups -type "string" "vray_for_maya";
else
    setAttr defaultRenderGlobals.deadlineLimitGroups -type "string" "";


The available Deadline globals are defined in the SavePersistentDeadlineOptions function in the SubmitMayaToDeadline.mel script. These can be used to set the initial values in the submission dialog.
You can also create a CustomPostSanityChecks.mel file alongside the main SubmitMayaToDeadline.mel in the [Repository]\submission\Maya\Main folder. It can be used to run some additional checks after the user clicks the Submit button in the submitter. It must define a global proc called CustomPostSanityCheck() that takes no arguments, and must return 0 or 1. If 1 is returned, the submission process will continue, otherwise it will be aborted. Here is an example script:

global proc CustomPostSanityCheck()
{
    // Don't allow mayaSoftware jobs to be submitted
    if( GetCurrentRenderer() == "mayaSoftware" )
        return 0;

    return 1;
}

9.28.5 FAQ

Do I need to install Maya on each machine that will render, along with all required 3rd party plugins?
Yes. Traditionally, Maya and all required scripts and 3rd party plugins should always be installed and licensed (where applicable) on each machine that is intended to network render. However, VFX studios tend to operate a Linux OS platform and take advantage of installing software onto a centralized file server that, importantly, has the performance to support this configuration; all local machines can then be configured to point at this central location. Additionally, 3rd party plugins/scripts can then be added to this central server path location in combination with floating licenses. This level of custom deployment and configuration is beyond the scope of Thinkbox support, and you would be best advised to engage an approved Autodesk reseller or Autodesk directly on best practices here. Here are some links which may be of assistance:
How to install Maya on a network share
Maya Environment Variables
If you are able to install and successfully run Maya and all your plugins/scripts from a network location in your studio, then Deadline will be able to support network rendering from this location as well. Simply update the MayaBatch and MayaCmd plugins with the new executable path location using Deadline Monitor: click on "Tools" -> "Super User Mode" -> "Configure Plugins..." -> "MayaBatch" or "MayaCmd".
Which versions of Maya are supported?
Maya versions 2010 and later are all supported.
Which Maya renderers are supported?
All Maya renderers should work fine with Deadline. The renderers that are known to work with Deadline are 3Delight, Arnold, Caustic Visualizer, Final Render, Gelato, Maxwell, MayaSoftware, MayaHardware, MayaVector, Mental Ray, Mental Ray Exporter, Octane, Renderman, Turtle, and VRay. If you see a Maya renderer that's not on this list, email Deadline Support and let us know!
Does the Maya plugin support Tile Rendering?
Yes. See the "Region Rendering Options" section above for more details.
Does the Maya plugin support multiple arbitrarily sized, multi-resolution Tile Rendering for both stills and animations, with automatic re-assembly, including the use of multi-channel image formats and arbitrary Render Passes (incl. VRay/Arnold/MR support)?
Yes. We call it 'Jigsaw' and it's unique to the Deadline system! See the "Region Rendering Options" section above for more details.


Which Maya application should I select as the render executable in the MayaCmd plugin configuration?
Select the Render.exe application. This is Maya's command line renderer.
Which Maya application should I select as the render executable in the MayaBatch plugin configuration?
Select the MayaBatch.exe application. This is Maya's batch renderer.
What is the MayaBatch plugin, and how is it different than the MayaCmd plugin?
This plugin keeps the Maya scene loaded in memory between frames, thus reducing the overhead of rendering the job. This is the recommended plugin to use, but if you run into any problems, you can always try using the MayaCmd plugin.
Why is each task of my job rendering the same frame(s)?
This happens if you have the Renumber Frames option enabled in your Maya render settings. Each task is a separate batch, and if Renumber Frames is enabled, each batch will start at that frame number.
I have a multi-core machine, but when rendering the machine isn't using 100% of the CPU. What can I do?
When submitting the job, set the Threads option to 0. This will instruct Maya to use the optimal number of threads when rendering based on the machine's core count.
Does Deadline support Maya render layers?
Yes. You can either submit one job that renders all the layers, or you can submit a single job per layer.
Can I render scenes that use Maya Fur?
A recommended setup for Maya is to have your project folder in a shared location that all of your machines can see (whether it be a Windows folder share or a mapped path), and then create your Maya scene in this project folder. This way, when you submit the job, you can specify the shared project path in the submission dialog, and all of your slave machines will be able to see it (and therefore see the Maya Fur folders within the project folder).
Can I make use of the particle cache during network renders?
Yes you can. All that is necessary is to make your scene's project directory network-accessible by your slaves. For a guide to setting up particle caches, check out the guide on the ResPower website that describes the proper set-up procedure for the Maya particle cache.
When clicking on one of the folder browser buttons in the Maya submission dialog, I sometimes get an error.
There is an article on this problem. It's a .NET problem that seems to randomly occur when the user specifies a path of more than 130 characters, but it looks like Microsoft provides a hotfix for it.
When submitting the job from Maya, if I check the Submit Each Render Layer As A Separate Job box, no jobs are submitted when I click Submit.
The render layers you want to submit need to be set to renderable (the letter 'R' needs to be there next to the render layer) for the submitter to submit the layer. Note that render layers should not be confused with display layers. Deadline only deals with render layers. It is not using the Maya option to render only the content of a specific display layer.
I'm trying to render a certain frame range from Maya, but Deadline is rendering the entire frame range set in the Maya render globals.
If you have the Submit Each Render Layer As A Separate Job box checked, Deadline grabs the frame information from each individual layer's render globals when submitting the job. If unchecked, Deadline will use the info from the Frame List in the submission dialog.
Rendering Maya scenes with Deadline is taking forever in comparison to a local render of the same file.


One thing you can try is ensuring that the Local Rendering option is enabled when submitting the job to Deadline. This forces Maya to render the frame locally, and then copy it to the final destination afterwards. This has been known to improve rendering speeds.
How do I configure Mental Ray Satellite to render Mental Ray for Maya jobs with Deadline?
1. Choose a satellite master machine, then modify the maya.rayhosts file of that machine so that it uses the slaves you want.
2. Only put the master machine in Deadline.
3. Submit a job, and make sure that the job will be picked up by the master machine you have set up. Use pools to do so.
4. In the job property page of the Maya job, in the Maya tab, you can add the following line in the additional arguments field: -rnm 1
This -rnm 1 means "render no master true", which will force the master not to participate in the rendering but only submit and receive the render tasks. You will get better results this way. You could also use -rnm 0, which means "render no master false", and force 1 CPU on the master (if your master is a dual CPU machine) so you have 1 CPU free on the master to dispatch the tasks. In short, you should always have 1 CPU free on the master machine for dispatching or your render times will suffer.
Can I submit MEL or Python Maya script files to Deadline?
Yes, you can submit your own custom scripts from the Advanced tab in the Maya submission script in the Monitor Submit menu.
Can I perform Fume FX simulations with Deadline?
Yes, and it's supported by both our MayaBatch and MayaCmd plugins. To do so, follow these steps:
1. This requires FumeFX for Maya v3.5.4 or later.
2. Ensure you have "MayaSoftware" selected as the renderer in Maya.
3. Before you launch the Maya submission script, make sure that the Fume FX NetRender (Backburner) toggle button is "ON" in the FumeFX options in Maya.
4. Fume FX output paths must be UNC paths or use a mapped drive letter (Windows).
5. The Deadline Slave must have either a Fume FX "full" or Fume FX "simulation" license available and authorized. If you wish to use a "Sim Only Mode" license, you can switch via the "Fume FX Prefs Dialog" in Maya prior to Deadline submission. Note that you must restart Maya for this license mode change to be committed. Do this before submitting to Deadline if you need to use a Sim only license on a Deadline Slave.
6. Submit any arbitrary Maya single frame to begin the Fume FX simulation (Fume FX uses its own frame range). However, note that Maya will render whatever single frame the Fume FX job was submitted on at the end of the simulation.
7. Please see the Fume FX for Maya help manual for more details on the above requirements.
How can I region render large VRay scenes?
By changing the memory frame buffer on the VRay Common tab of the render settings to None, you will be able to render larger tiles since VRay does not crop the tiles.

9.28.6 Error Messages and Meanings

This is a collection of known Maya error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know.

Error in StartJob: Error in RenderExecutable: 64 bit Maya ####_0 render executable was not found in the semicolon separated list "C:/Program Files/Autodesk/Maya####/bin/MayaBatch.exe;/usr/autodesk/maya####/bin/maya". The path to the render executable can be configured from the Plugin Configuration in the Deadline Monitor. (Where #### is the year of the Maya release, such as "2015".)
Using the Deadline Monitor, click "Tools" -> "Super User Mode" -> "Configure Plugins..." -> "MayaBatch" or "MayaCmd". The above error message indicates that Deadline has been unable to locate the correct local install path of Maya on the machine which generated the error message. Deadline already ships with all the known, default install path locations for Maya across the 3 supported operating systems, so you shouldn't have to edit these paths unless you have installed Maya to a custom location. Please note the subtle differences between the executables used by MayaBatch and MayaCmd, and also the slightly different file path locations depending on the OS. If in doubt, please contact Deadline Support for further assistance.
Error in RenderTasks: Monitored managed process "MayaBatch" has exited or been terminated.
This is the most common Maya error in Deadline. The "MayaBatch" plugin, which keeps the Maya scene file open between tasks (frames), will sometimes not display the full stack trace (error message) regarding the root cause of your issue. To obtain a full error message, re-submit the Maya file via either the Monitor -> "Submit Maya Job To Deadline" -> "Advanced Options" -> uncheck the "Use MayaBatch Plugin" checkbox, or alternatively in the in-app Maya submission UI -> "Render Options" -> uncheck the "Use MayaBatch Plugin" checkbox. Allow this job to run and fail in Deadline, and a more comprehensive error message should now be available. It is this comprehensive error message that should be sent to Deadline Support if further assistance is required.
Exception during render: Error: (Mayatomr) : could not get a license
Mental Ray is reporting that it can't find a license. Mental Ray requires an additional license for network rendering, whereas renderers such as mayaSoftware and mayaHardware simply use your Maya license. Certain versions of Maya come with satellite licenses for Mental Ray, but this requires some additional setting up to enable network rendering. It's probably best to contact the Maya support team about this.
Exception during render: Renderer returned non-zero error code, 211
When Maya prints this error message, it usually means that Maya can't access a particular path, either because it doesn't exist or because Maya doesn't have the necessary read/write permissions to access it. This error tends to occur when Maya is loading the scene or other referenced data, or when saving the final output images. When you get this error, you should check the slave log that is included with the error report. If it is a path problem, Maya shows which path it wasn't able to access. Check to make sure that the slave machine rendering the job can see the path, and that it has the necessary permissions to read/write to it. If it's not a path problem, the slave log should still provide some useful information that can help explain the problem. There is also the case where Maya exits with this error code after successfully rendering the images. If this is the case, there are two things to try:
1. When you submit the job, enable the option to ignore error code 211.
2. When you submit the job, enable the MayaBatch option. Deadline doesn't check error codes in this case.
Exception during render: Error: Cannot find procedure "getStrokeUVFromPoly"
This error can occur when rendering with paint effects. When you write prerender/postrender scripts, be sure to use Maya commands and not function wrappers that the GUI posts, since a huge number of functions don't get loaded when rendering in batch mode.


For a quick fix, add the following before the call to the prerender script's main functions:

source "getStrokes";

Turtle: The system cannot find the path specified.
When Turtle is installed, it sets some environment variables. However, Deadline will not recognize these variables until the Launcher (the application in the Windows tray) is restarted. Restarting the Launcher will fix this problem.
Exception during render: Renderer returned non-zero error code, -1073741819
The error code -1073741819 is equivalent to 0xC0000005, which represents a Memory Access Violation error. So Maya is either running out of memory, or memory is becoming corrupt. Take a look at the full render log to see if Maya prints out any information prior to the crash that might explain the problem.
mental ray: out of memory
Try tweaking your Memory and Performance settings in the mental ray tab of the Maya Render Settings window. Try increasing the Physical Memory setting (if you have the extra RAM); a common suggestion is to set it to 80% of your available RAM. You could also try tweaking the Acceleration Method settings. Another thing you can try is trimming down your scene so that it uses less memory.

9.29 Mental Ray Standalone

9.29.1 Job Submission

You can submit Mental Ray Standalone jobs from the Monitor.


Setup your Mental Ray Files

Before you can submit a Mental Ray Standalone job, you must export your scene into .mi files. You can export into either one .mi file with all your frames in it, or one .mi file per frame.

Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Mental Ray specific options are: • Mental Ray File: The Mental Ray file(s) to be rendered. – If you are submitting with one frame per .mi file, select one of the numbered frames in the sequence, and the Monitor will automatically detect the frame range. In this case, you should leave the checkbox marked Separate Input MI Files Per Frame checked. The frames you choose to render should correspond to the numbers on the .mi files. – If your .mi file contains all the frames you wish to render, you should leave the Separate Input MI Files Per Frame box unchecked. In this case, you must specify the Input MI File Start Frame, which is the first frame in the input MI file being rendered, as it is used to offset the frame range being passed to the mental ray renderer. You may then specify the frame range as normal. • Output Folder: The location to which your output files will be written. • Separate Input MI Files Per Frame: Should be checked if you are submitting a sequence of MI files that represent a single frame each. • Frame Offset: The first frame in the input MI file being rendered, which is used to offset the frame range being passed to the mental ray renderer. • Threads: The number of threads to use for rendering. • Verbosity: Control how much information Mental Ray prints out during rendering. • Build To Force: You can force 32 or 64 bit rendering. • Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network location. • Command Line Args: Specify additional command line arguments you would like to pass to the mental ray renderer.
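If you prefer to submit these jobs without the Monitor, the Manual Job Submission documentation describes how a job info file and a plugin info file can be handed to deadlinecommand. The following is only a rough sketch of that approach for a Mental Ray Standalone job; the plugin name and the plugin info key names (MIFile, Threads, CommandLineOptions) are assumptions used for illustration, so verify them against the Mental Ray plugin files in your repository before relying on them:

# Minimal sketch of a manual Mental Ray Standalone submission via deadlinecommand.
# Assumes deadlinecommand is on the PATH; the plugin info keys below are
# illustrative guesses, not confirmed names -- check your repository's plugin.
import subprocess

job_info = """Plugin=MentalRay
Name=MI Sequence Render
Frames=1-100
Priority=50
Pool=none
"""

plugin_info = """MIFile=\\\\server\\projects\\shot010\\frames\\shot010_0001.mi
Threads=0
CommandLineOptions=
"""

with open("mr_job_info.job", "w") as f:
    f.write(job_info)
with open("mr_plugin_info.job", "w") as f:
    f.write(plugin_info)

# deadlinecommand <job info file> <plugin info file>
subprocess.call(["deadlinecommand", "mr_job_info.job", "mr_plugin_info.job"])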

9.29.2 Plug-in Configuration

You can configure the Mental Ray plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Mental Ray plug-in from the list on the left.


Render Executables • Render Executable: The path to the Mental Ray Standalone executable file used for rendering. Enter alternative paths on separate lines. Render Options • Error Codes To Ignore: Mental Ray error codes that Deadline should ignore and instead assume the render has finished successfully. Use a ; to separate the error codes. • Treat Exit Code 1 As Error: If enabled, an exit code of 1 will be treated as an error instead of a success.

9.29.3 FAQ

Is Mental Ray Standalone supported by Deadline?
Yes.
Can I submit a sequence of MI files that each contain one frame, or must I submit a single MI file that contains all the frames?
Deadline supports both methods.
When rendering a single MI file that contains all the frames, the frame range I tell Deadline to render doesn't match up with the files that are actually rendered.
When submitting a single MI file that contains all the frames, make sure the Input MI File Start Frame option is set to the first frame that is in the MI file. This value is used to offset the frame range being passed to the mental ray renderer.
Mental Ray is printing out an error that is causing Deadline to fail the render, but when I render from the command line outside of Deadline, the error is still printed out, but the render finishes successfully.


By default, Deadline fails a Mental Ray job whenever it prints out an error. However, you can configure the Mental Ray plugin to ignore certain error codes, which are printed out alongside the error in the error log.
After a frame is rendered, Deadline takes a long time releasing the task before it moves on to another. What's going on?
This can occur when a single MI file that contains all the frames is submitted to Deadline. Try exporting your frames to a sequence of MI files (one per frame) and submit the sequence of MI files to Deadline instead.

9.29.4 Error Messages and Meanings

This is a collection of known Mental Ray error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.30 Messiah

9.30.1 Job Submission

You can submit jobs from within Messiah by installing the integrated submission plugin, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within Messiah, select the Customize tab, and then from the drop down, select Submit To Deadline. Click the Submit Messiah Job button to launch the submitter.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Messiah specific options are: • Messiah File: The scene file to render. • Content Folder: This is the folder that contains the Messiah scene assets. It is recommended that you have a network accessible content folder when network rendering with Messiah. • Output Folder: The folder where the output files will be saved (including images from all enabled buffers). If left blank, the output folders in the scene file will be used. • Threads: The number of threads to use for rendering. • Build To Force: The build of Messiah to force. • Frame Resolution: Override the width and height of the output images. If a value is set to 0, the value from the scene file will be respected. • Antialiasing: Override the antialiasing settings in the scene file.

9.30.2 Plug-in Configuration

You can configure the Messiah plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Messiah plug-in from the list on the left.


Messiah Settings • Messiah Host Library: The path to the messiahHOST.dll library. Enter alternative paths on separate lines.

9.30.3 Integrated Submission Script Setup

The following procedures describe how to install the integrated Messiah submission plugin. This plugin allows for submitting Messiah render jobs to Deadline directly from within the Messiah editing GUI. Note that this has only been tested with Messiah version 5. Submitter Installer • Run the Submitter Installer located at /submission/Messiah/Installers Manual Installation of the Submission Script Messiah 32 Bit • Copy [Repository]\submission\Messiah\Client\DeadlineMessiahClient32.mp to [Messiah 32 Bit Install Directory]\Plugins. • Restart Messiah • You'll find the 'Submit To Deadline' option in the drop down under the Customize tab. Messiah 64 Bit • Copy [Repository]\submission\Messiah\Client\DeadlineMessiahClient64.mp to [Messiah 64 Bit Install Directory]\Plugins. • Restart Messiah • You'll find the 'Submit To Deadline' option in the drop down under the Customize tab.


9.30.4 FAQ

Which versions of Messiah are supported by Deadline? Messiah 5 is currently supported.

9.30.5 Error Messages and Meanings

This is a collection of known Messiah error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.31 MetaFuze

9.31.1 Job Submission

You can submit MetaFuze jobs from the Monitor.


Setup your MetaFuze Batch File

Export your MetaFuze transcode as a batch XML file. Multiple files may be submitted to Deadline at once. Be sure to set the output path in MetaFuze to a network drive accessible by both yourself and your slave machines.

Submission Options

The general Deadline options are explained in the Job Submission documentation. The MetaFuze specific options are: • Input File(s): The MetaFuze XML files to transcode. Multiple files may be selected in the file browser and added to the list. • Submit Each File In Folder As A Separate Job: Submits each file (including non-XML files) in the folder of the selected file as a job.


9.31.2 Plug-in Configuration

You can configure the MetaFuze plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the MetaFuze plug-in from the list on the left.

Render Executables • MetaFuze Executable: The path to the MetaFuze executable file used for rendering. Enter alternative paths on separate lines.

9.31.3 FAQ

Is MetaFuze supported by Deadline? Yes.

9.31.4 Error Messages and Meanings

This is a collection of known MetaFuze error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.


9.32 MetaRender

9.32.1 Job Submission

You can submit MetaRender jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The MetaRender specific options are: • Input File: The input file. It could be a movie file or part of an image sequence. • Output File: The file or image sequence name that MetaRender will write to. • Encoding Profile: The path to the encoding profile saved with the Profile Editor. • Burn File (optional): Superimpose the specified burn-in template over the output frames. • Rendering Mode: Select CPU or GPU. • Strip Alpha Channel: Strips the alpha channel from the input sequence during conversion. • Threads: The number of render threads to use (CPU mode only). • Draft Mode: Speed up rendering for non-critical color work (GPU mode only). • Render CPU Masks: Uses high quality mask rendering instead of low quality GPU-based masks (GPU/Draft mode only). • Write Flex File: Writes a flex file for the entire timeline. • Render Takes Into Subfolders: If the Flex File option is enabled, render takes into subfolders. • Core Command Args: Specify additional Core Command Line arguments (the basic command line options for all IRIDAS applications). • MetaRender Args: Specify additional MetaRender-specific command line arguments.

9.32.2 Plug-in Configuration

You can configure the MetaRender plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the MetaRender plug-in from the list on the left.


Render Executables • Meta Render Executable: The path to the Meta Render executable file used for rendering. Enter alternative paths on separate lines.

9.32.3 FAQ

Is MetaRender supported by Deadline? Yes.

9.32.4 Error Messages and Meanings

This is a collection of known MetaRender error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.33 modo

9.33.1 Job Submission

You can submit jobs from within modo by using the integrated submitter (7xx and up) or by running the SubmitModoToDeadline.pl script, or you can submit them from the Monitor.


To run the integrated submitter within modo 7xx or later, after it’s been installed: • Render -> Submit To Deadline To run the integrated submitter within modo 6xx or earlier, after it’s been installed: • Under the system menu, choose Run Script • Choose the DeadlineModoClient.pl script from [Repository]\submission\Modo\Client


– Alternatively, you can also copy this script to your local machine and run it from there. You should do this if the path to your Deadline repository is a UNC path and you are running modo on Windows OS.

Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The modo specific options are: • Pass Group: The pass group to render, or blank to not render a pass group. This option is only available for modo 6xx or later. Note: when using the 7xx (or later) integrated submitter, the Monitor-only options below are configured within modo itself; e.g. render threads can be specified in Preferences -> Final Rendering. Monitor-only options: • modo File: The scene file to render. • Render Threads: The number of render threads to use. Specify 0 to use automatic render threads. • Version: The version of modo to render with. • Build To Force: You can force 32 or 64 bit rendering with this option. • Output Folder: Specify an output folder. Note that specifying the output is optional unless you want to use tile rendering. • Output File Prefix: If specifying an output folder, specify the prefix for the output file name (extension is not required). • Output Format: If specifying an output folder, specify the format of the output images.
Region Rendering Options
Setting up a region rendering job:


• Enable Region Rendering: If enabled, the frame will be split into multiple tiles that are rendered individually and can be assembled after. • Region Render Type: If set to Jigsaw Rendering then the submissions will use Jigsaw, otherwise it will use a grid of tiles. • Submit All Tiles as a single Job: If enabled, a single frame will be submitted with all tiles in a single job, otherwise each tile will be submitted as a separate job with each frame being a separate frame. • Submit Dependent Assembly Job: Submit a job dependent on the region job that will assemble the tiles; if doing jigsaw animation, a separate job will be created for each different named output file. • Cleanup Tiles after Assembly: If selected, the tiles will be deleted after assembly. • Error on missing Tiles: If enabled, then if any of the tiles are missing the assembly job will fail. • Assemble Over: Determine what the Draft Tile Assembler should assemble over, be it a blank image, the previous output, or a specified file. • Error on Missing Background: If enabled, then if the background file is missing the job will fail. • Open Jigsaw Panel: Opens the Jigsaw UI. • Reset Jigsaw Background: Resets the background of the jigsaw regions. • Save Jigsaw Regions: Saves the Jigsaw Regions to the scene file. • Load Jigsaw Regions: Loads the saved Jigsaw Regions and sends them to the open panel.

9.33.2 Interactive Distributed Rendering

You can submit interactive modo Distributed Rendering jobs to Deadline. The instructions for installing the integrated submission script can be found further down this page. The interactive submitter will submit a special modo server job to reserve render nodes. Note that this feature is only supported in modo 7xx and later.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The modo Distributed Rendering specific options are: • Maximum Servers: The maximum number of modo Servers to reserve for distributed rendering. • Use Server IP Address instead of Host Name: If checked, the Active Servers list will show the server IP addresses instead of host names.

Rendering

After you’ve configured your submission options, press the Reserve Servers button to submit the modo Server job. After the job has been submitted, you can press the Update Servers button to update the job’s ID and Status in the submitter. As nodes pick up the job, pressing the Update Servers button will also show them in the Active Servers list. Once you are happy with the server list, press Start Render to start distributed rendering. Note that the modo Server process can sometimes take a little while to initialize. This means that a server in the Active Server list could have started the modo Server, but it’s not fully initialized yet. If this is the case, it’s probably best to

wait a minute or so after the last server has shown up before pressing Start Render. After the render is finished, you can press Release Servers or close the submitter to mark the modo Server job as complete so that the render nodes can move on to another job.

9.33.3 Network Rendering Considerations

This Article provides some useful information for setting up modo for network rendering.

9.33.4 Cross-Platform Rendering Considerations

In order to perform cross-platform rendering with modo, you must setup Mapped Paths so that Deadline can swap out the Scene and Output file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by selecting Tools -> Configure Repository. You’ll find the Mapped Paths Setup in the list on the left. You must also ensure that your modo project file is on a network shared location, and that any footage or assets that the project uses is in the same folder. Then when you submit the job to Deadline, you must make sure that the option to submit the scene file with the job is disabled. If you leave it enabled, the scene file will be copied to and loaded from the Slave’s local machine, and thus won’t be able to find the footage.
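Conceptually, a mapped path entry is a prefix swap that Deadline applies to the paths stored with the job. The sketch below is not Deadline's implementation; it only illustrates, with assumed example prefixes, what happens when a Windows-submitted modo scene path is handed to a Linux slave:

# Illustration only: how a mapped path entry conceptually rewrites a path.
# The Windows and Linux prefixes below are assumed examples.
def map_path(path, windows_prefix=r"\\server\projects", linux_prefix="/mnt/projects"):
    if path.lower().startswith(windows_prefix.lower()):
        return linux_prefix + path[len(windows_prefix):].replace("\\", "/")
    return path

print(map_path(r"\\server\projects\shots\sh010\sh010.lxo"))
# -> /mnt/projects/shots/sh010/sh010.lxo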

9.33.5 Plug-in Configuration

You can configure the modo plug-in settings from the Monitor. While in super user mode, select Tools -> Plugins Configuration and select the modo plug-in from the list on the left.

Render Executables • modo Executable: The path to the modo executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.


Geometry Cache • Auto Set Geometry Cache: Enable this option to have Deadline automatically set the modo geometry cache before rendering (based on the geometry cache buffer below). • Geometry Cache Buffer (MB): When auto-setting the geometry cache, Deadline subtracts this buffer amount from the system's total memory to calculate what the geometry cache should be set to.
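As a quick worked example of the calculation described above (the numbers are made up for illustration and are not values Deadline uses):

# Example only: auto-set geometry cache = total system memory - configured buffer.
total_memory_mb = 32768            # a render node with 32 GB of RAM
geometry_cache_buffer_mb = 4096    # the Geometry Cache Buffer setting
geometry_cache_mb = total_memory_mb - geometry_cache_buffer_mb
print(geometry_cache_mb)           # 28672 MB would be used for the geometry cache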

9.33.6 Integrated Submission Script Setup

This section describes how to install the integrated submission scripts for modo. The integrated submission scripts and the following installation procedures have been tested with modo 7xx and later. Submitter Installer • Run the Submitter Installer located at /submission/Modo/Installers Manual Installation of the Submission Script 7xx or later: • Open modo, and select System -> Open User Scripts Folder. • Copy the DeadlineModo folder from \\your\repository\submission\Modo\Client to this User Scripts folder. • Restart modo, and you should find the Submit To Deadline menu item in your Render menu. 6xx or earlier: • Under the system menu, choose Run Script • Choose the DeadlineModoClient.pl script from [Repository]\submission\Modo\Client – Alternatively, you can also copy this script to your local machine and run it from there. You should do this if the path to your Deadline repository is a UNC path and you are running modo on Windows OS.

Distributed Rendering Script Setup

Submitter Installer • Run the Submitter Installer located at /submission/ModoDBR/Installers Manual Installation of the Submission Script 7xx or later only: • Open modo, and select System -> Open User Scripts Folder. • Copy the DeadlineModoDBR folder from \\your\repository\submission\ModoDBR\Client to this User Scripts folder. • Restart modo, and you should find the Submit To Deadline: Modo DBR menu item in your Render menu.

9.33.7 FAQ

Which versions of modo are supported? Modo 3xx and later are supported. Which versions of modo can I use for interactive distributed rendering?


Modo 7xx and later are supported.
When rendering with modo on Windows, it hangs after printing out "@start modo_cl [48460] Luxology LLC".
We're not sure of the cause, but a known fix is to copy the 'perl58.dll' from the 'extra' folder into the main modo install directory ("C:\Program Files\Luxology\modo601").
When rendering with modo on Mac OSX, the Slave icon in the Dock changes to the modo icon, and the render gets stuck.
This is a known problem that can occur when the Slave application is launched by double-clicking it in Finder. There are a few known workarounds:
1. Start the Launcher application, and launch the Slave from the Launcher's Launch menu.
2. Launch the Slave from the terminal by simply running 'DEADLINE_BIN/deadlineslave' or 'DEADLINE_BIN/deadlinelauncher -slave', where DEADLINE_BIN is the Deadline bin folder.
3. Use 'modo' as the render executable instead of 'modo_cl'.
When tile rendering, each tile is rendered, but there is image data in the "unrendered" region of each tile.
This happens when there is a cached image in the modo frame buffer. Open up modo on the offending render node(s) and delete all cached images to fix the problem.

9.33.8 Plugin Error Messages

This is a collection of known modo error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.34 Naiad

9.34.1 Job Submission

You can submit jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Naiad specific options are: • Naiad File: The Naiad file to simulate.

Naiad Simulation Job

• Submit Simulation Job: Enable to submit a Simulation job to Deadline. • Run Simulation On A Single Machine: If enabled, the simulation job will be submitted as a single task consisting of all frames so that a single machine runs the entire simulation. • Threads: The number of render threads to use. Specify 0 to let Naiad determine the number of threads to use. • Enable Verbose Logging: Enables verbose logging during the simulation.

EMP to PRT Conversion Job

• Submit an EMP to PRT Conversion Job: Enable to submit a PRT Conversion job to Deadline. – If you are also submitting a simulation job, this job will use the EMP files created by the simulation job. – If you are not submitting a simulation job, the EMP files must already exist. • EMP Body Name: The EMP body name. • EMP Body File Name: The path to the EMP files to be converted.

9.34.2 Plug-in Configuration

You can configure the Naiad plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Naiad plug-in from the list on the left.


Naiad Executables • Simulation Executable: The path to the command line client executable file used for simulation. Enter alternative paths on separate lines. • Emp to Prt Executable: The path to the emp2prt executable file used for emp conversion. Enter alternative paths on separate lines.

9.34.3 FAQ

Is Naiad supported by Deadline? Yes.

9.34.4 Error Messages and Meanings

This is a collection of known Naiad error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.


9.35 Nuke

9.35.1 Job Submission

You can submit jobs from within Nuke by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within Nuke, select Submit To Deadline from the Thinkbox menu.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Nuke specific options are: • Render With NukeX: Enable this option if you want to render with NukeX instead of Nuke. • Use Batch Mode: If enabled, Deadline will keep the Nuke file loaded in memory between tasks.


• Render Threads: The number of threads to use for rendering. • Use The GPU For Rendering: Whether Nuke should also use the GPU for rendering (Nuke 7 and later only). • Maximum RAM Usage: The maximum RAM usage (in MB) to be used for rendering. • Enforce Write Node Render Order: Forces Nuke to obey the render order of Write nodes. • Minimum Stack Size: The minimum stack size (in MB) to be used for rendering. Set to 0 to not enforce a minimum stack size. • Continue On Error: If enabled, Nuke will attempt to keep rendering if an error occurs. • Use Performance Profiler: If enabled, Nuke will profile the performance of the Nuke script while rendering and create an XML file per task for later analysis (Nuke 9 and later only). • XML Directory: If Use Performance Profiler is enabled, this is the directory on the network where the performance profile XML files will be saved. • Render in Proxy Mode: If enabled, Nuke will render using the proxy file paths. • Choose Views To Render: Enable this option to choose which view(s) to render. By default, all views are rendered. • Submit Write Nodes As Separate Jobs: Each write node is submitted as a separate job. • Use Node's Frame List: If submitting each write node as a separate job or task, enable this to pull the frame range from the write node, instead of using the global frame range. • Set Dependencies Based on Write Node Render Order: When submitting write nodes as separate jobs, this option will make the separate jobs dependent on each other based on write node render order. • Submit Write Nodes As Separate Tasks For The Same Job: Enable to submit a job where each task for the job represents a different write node, and all frames for that write node are rendered by its corresponding task. • Selected Nodes Only: If enabled, only the selected Write nodes will be rendered. • Nodes With 'Read File' Enabled Only: If enabled, only the Write nodes that have the 'Read File' option enabled will be rendered. The Submit Write Nodes As Separate Tasks For The Same Job option can be useful if you have a bunch of write nodes in a Nuke script to output different Quicktime movies. You can enable this option, and bump up the Concurrent Tasks value to allow machines to process multiple write nodes concurrently. Since Quicktime generation only uses a single thread, you can get much better throughput with this option on multi-core machines.

9.35.2 Cross-Platform Rendering Considerations

In order to perform cross-platform rendering with Nuke, you must setup Mapped Paths so that Deadline can swap out Read Node and Write Node file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by selecting Tools -> Configure Repository. You’ll find the Mapped Paths Setup in the list on the left.

9.35.3 Plug-in Configuration

You can configure the Nuke plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Nuke plug-in from the list on the left.


Render Executables • Nuke Executable: The path to the Nuke executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes. Licensing Options • Slaves To Use Interactive License: A list of slaves that should use an interactive Nuke license instead of a render license. Use a , to separate multiple slave names, for example: slave001,slave002,slave003 OFX Cache • Prepare OFX Cache Before Rendering: If enabled, Deadline will try to create the temporary ofxplugincache folder before rendering, which helps ensure that comps that use OFX plugins render properly. Path Mapping (For Mixed Farms) • Enable Path Mapping: If enabled, a temporary Nuke file will be created locally on the slave for rendering because Deadline does the path mapping directly in the Nuke file. This feature can be turned off if there are no Path Mapping entries defined in the Repository Options.

9.35.4 Integrated Submission Script Setup

The following procedures describe how to install the integrated Nuke submission script. This script allows for submitting Nuke render jobs to Deadline directly from within the Nuke editing GUI. Note that this has only been tested with Nuke version 6 and later. Submitter Installer • Run the Submitter Installer located at /submission/Nuke/Installers Manual Installation of the Submission Script


• Copy [Repository]\submission\Nuke\Client\DeadlineNukeClient.py to your .nuke user folder (~/.nuke or %USERPROFILE%\.nuke) • If menu.py does not exist in your .nuke user folder, copy [Repository]\submission\Nuke\Client\menu.py to your .nuke user folder • If menu.py does exist, then open it in a text editor and add the following lines of code:

import DeadlineNukeClient
menubar = nuke.menu("Nuke")
tbmenu = menubar.addMenu("&Thinkbox")
tbmenu.addCommand("Submit Nuke To Deadline", DeadlineNukeClient.main, "")

The next time you launch Nuke, there should be a Thinkbox menu with the option to Submit Nuke to Deadline.

Custom Sanity Check

A CustomSanityChecks.py file can be created alongside the main SubmitNukeToDeadline.py submission script (in [Repository]\submission\Nuke\Main), and will be evaluated if it exists. This script will let you set any of the initial properties in the submission script prior to displaying the submission window. You can also use it to run your own checks and display errors or warnings to the user. Here is a very simple example of what this script could look like:

import nuke
import DeadlineGlobals

def RunSanityCheck():
    DeadlineGlobals.initDepartment = "The Best Department!"
    DeadlineGlobals.initPriority = 33
    DeadlineGlobals.initConcurrentTasks = 2

    nuke.message("This is a custom sanity check!")

    return True

The DeadlineGlobals module can be found in the same folder as the SubmitNukeToDeadline.py script mentioned above. It just contains the list of global variables that you can set, which are then used by the submission script to set the initial values in the submission dialog. Simply open DeadlineGlobals.py in a text editor to view the global variables. Finally, if the RunSanityCheck method returns False, the submission will be cancelled.

9.35.5 FAQ

Which versions of Nuke are supported? Nuke 6 and later are supported. Can I render with NukeX instead of Nuke? Yes. Simply enable the Render With NukeX option when submitting your Nuke job. What’s the benefit to using Batch Mode? If enabled, Deadline will keep the Nuke file loaded in memory between tasks. This can help reduce overhead, because the Nuke file is only loaded once when the job is started on the Slave. Can I use 3rd party plugins such as Peregrine Lab’s Bokeh plugin with Nuke via Deadline?


Yes. Ensure all machines have the same plugin software installed locally or via a network where applicable (depending on your studio pipeline/software deployment management, and whether the plugin in question supports deployment in this manner). Ensure any necessary environment variables are also present on each Slave. In the case of Peregrine Lab's Bokeh plugin, ensure the Slaves have the environment variable available: "PEREGRINE_LICENSE=port@hostname", where port & hostname are your license server details. Environment variables can be set for a job in Deadline, and these variables will be applied to the rendering process' environment. These variables can be set in the Job Properties in the Monitor, and they can be set during Manual Job Submission.

9.35.6 Error Messages and Meanings

This is a collection of known Nuke error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.36 Octane Standalone

9.36.1 Job Submission

You can submit Octane Standalone jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Octane specific options are:


• Octane Scene File: Specify the Octane scene file(s) to render. If you have an animation with one OCS file per frame, you just need to select one of the OCS files from the sequence. • Output File: The output file path. This is optional and can be left blank. • Render Target: Select the target to render. This list is automatically populated based on the selected OCS file. • Single Frame Job: Check this option if you are submitting a single frame to render, as opposed to an animation consisting of a sequence of OCS files. • Override Sampling: Overrides the maximum samples in the OCS file. • Command Line Args: Additional command line arguments to pass to the renderer.

9.36.2 Plug-in Configuration

You can configure the Octane plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Octane plug-in from the list on the left.

Render Executables • Octane Executable: The path to the Octane executable file used for rendering. Enter alternative paths on separate lines.

9.36.3 FAQ

Is Octane Standalone supported by Deadline? Yes!


9.36.4 Error Messages and Meanings

This is a collection of known Octane error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.37 PRMan (Renderman Pro Server)

9.37.1 Job Submission

You can submit PRMan jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The PRMan specific options are:


• RIB Files: The RIB files to be rendered (can be ASCII or binary formatted). These files should be network accessible. • Working Directory: The working directory used during rendering. This is required if your RIB files contain relative paths. • Threads: The number of threads to use for rendering. Set to 0 to let PRMan automatically determine the optimal thread count. • Additional Arguments: Specify additional command line arguments you would like to pass to the PRMan renderer.

9.37.2 Plug-in Configuration

You can configure the PRMan plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the PRMan plug-in from the list on the left.

Render Executables • PRMan Executable: The path to the PRMan executable file used for rendering. Enter alternative paths on separate lines.

9.37.3 FAQ

Is PRMan supported by Deadline? Yes. Is PRMan’s folder structure where each frame has its own folder supported by Deadline?


Yes. Deadline can render rib files that are in separate folders per frame, and can also render rib files that are all stored in the same folder.

9.37.4 Error Messages and Meanings

This is a collection of known PRMan error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.38 Puppet

9.38.1 Job Submission

You can submit Puppet jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Puppet specific options are: • Verbose Output: Prints very detailed output when the job is run.

9.38.2 Plug-in Configuration

You can configure the Puppet plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Puppet plug-in from the list on the left.


Options • Puppet Batch: The path to the Puppet executable file. Enter alternative paths on separate lines.

9.39 Python

9.39.1 Job Submission

You can submit Python jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Python specific options are: • Script File: The script you want to submit. • Arguments: The arguments to pass to the script. Leave blank if the script takes no arguments. • Version: The version of Python to use.
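As a simple illustration of the kind of script that can be submitted, the sketch below processes a range of frames passed to it through the Arguments field. Everything in it (the file name, the argument layout, and the per-frame work) is an assumption for the example, so adapt it to whatever your own script expects:

# process_frames.py -- illustrative script for a Deadline Python job.
# Assumes the start and end frame are passed as two arguments, e.g. "1 100".
import sys

def process_frame(frame):
    # Replace this with the real per-frame work (conversion, caching, etc.).
    print("Processing frame %d" % frame)

if __name__ == "__main__":
    start_frame = int(sys.argv[1])
    end_frame = int(sys.argv[2])
    for frame in range(start_frame, end_frame + 1):
        process_frame(frame)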

9.39.2 Plug-in Configuration

You can configure the Python plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Python plug-in from the list on the left. To get a description of each setting, simply hover the mouse cursor over a setting and a tool tip will be displayed.


Python Executables • Python Executable: The path to the Python executable file used. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.39.3 FAQ

Which versions of Python are supported? Python 2.3 to 3.2 are all supported. Additional versions can be added when necessary.

9.40 Quicktime Generation

9.40.1 Job Submission - Apple Quicktime

Jobs can be submitted from the Monitor. You can use the Submit menu, or you can right-click on a job and select Scripts -> Quicktime Submission to automatically populate some fields in the Quicktime submitter based on the job's output. Note that in order to use this Quicktime plugin, you MUST have Quicktime version 7.04 or later installed on your slaves, as well as on any workstations from which Quicktime jobs will be submitted.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Quicktime specific options are: • Input Images: The frames you would like to generate the Quicktime from. If a sequence of frames exist in the same folder, Deadline will automatically collect the range of the frames and will set the Frame Range field accordingly.


• Output Movie File: The name of the Quicktime to be generated. • Frames: The frame range used to generate the Quicktime. • Codec: The codec format to use for the Quicktime. • Frame Rate: The frame rate of the Quicktime. • Audio File: Specify an audio file to be added to the Quicktime movie. Leave blank to disable this feature. • Settings File: The Quicktime settings file to use. Press the Create Settings button to create a new Quicktime settings file.

9.40.2 Plug-in Configuration

The Apple Quicktime plug-in does not require any configuration.


9.40.3 FAQ

Which version of Apple Quicktime is required to create Quicktime movies with Deadline using the Apple Quicktime renderer?
Apple Quicktime version 7.04 or later is required. It must be installed on all slaves that will be rendering Quicktime movies, as well as on any machines from which Quicktime jobs will be submitted. You can download the latest version of Quicktime from here.
Can I submit an Apple Quicktime job from Windows to run on Mac OSX, or vice versa?
No, because the export settings are saved out differently on each operating system. The Windows Quicktime generator doesn't recognize settings that are exported on a Mac, and vice versa. We hope to find a solution for this in the future, but for now you should ensure that your Quicktime job renders on the same operating system from which it was submitted (using groups, pools, machine lists, etc).
Can multiple machines work together to render a single movie file?
No, this is not possible. This is why Quicktime Generation jobs should always consist of a single task that contains all the frames to be included in the movie file.
When submitting an Apple Quicktime job, an error message pops up when I click the Submit button.

This error pops up when you have an older version of Apple Quicktime installed. Installing the latest version should fix the problem.

9.40.4 Error Messages and Meanings

This is a collection of known Quicktime Generation error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Exception during render: Error: Class not registered - Make sure that QuickTime version 7.04 or later is installed on this machine This error occurs when you have an older version of Apple Quicktime installed. Installing the latest version should fix the problem. Exception during render: Renderer returned non-zero error code, 128 The Apple Quicktime renderer is crashing for some reason. Check to make sure you have the latest version of Apple Quicktime installed.

9.41 Realflow

9.41.1 Job Submission

You can submit jobs from within RealFlow by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within RealFlow 5 or later, select Commands -> System Commands -> SubmitToDeadline.py. To submit from within RealFlow 4, select ‘Scripts’ -> ‘User Scripts’ -> ‘Deadline’ -> ‘Submit To Deadline’.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Realflow specific options are: • Submit IDOC Jobs: Enable to submit separate IDOC jobs for each IDOC name specified. Separate multiple IDOC names with commas. For example: IDOC01,IDOC02,IDOC03 • Start Rendering At [Start Frame - 1]: Enable this option if RealFlow rendering should start at the frame preceding the Start Frame. For example, if you are rendering frames 1-100, but you need to pass 0-100 to RealFlow, then you should enable this option. • Use One Machine Only: Forces the entire job to be rendered on one machine. If this is enabled, the Machine Limit, Task Chunk Size and Concurrent Tasks settings will be ignored. • Version: The version of RealFlow to render with. • Build: Force 32 bit or 64 bit rendering. • Rendering Threads: The number of threads to use during simulation. • Reset Scene: If this option is enabled, the scene will be reset before the simulation starts. • Generate Mesh: This option will generate the mesh for a scene where particle cache files were created previously. • Use Particle Cache: If you have created particle cache files for a specific frame and you want to resume your simulation from that frame, you have to use this option. The starting cached frame is the Start Frame entered above. • Render Preview: Enable this option to create a Maxwell Render preview.

9.41.2 Plugin Configuration

You can configure the RealFlow plug-in settings from the Monitor. While in super user mode, select Tools -> Plugins Configuration and select the RealFlow plug-in from the list on the left.


Render Executables

• Realflow Executable: The path to the Realflow executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.
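For example, when your render nodes have RealFlow installed in different locations (or you support more than one version or platform), the setting can contain one full path per line. The install locations and executable name below are placeholders only, not the actual RealFlow defaults; substitute the paths used on your own nodes. Typically, the first listed path that exists on the rendering machine is the one that gets used:

    C:\Program Files\Next Limit\RealFlow 2014\realflownode.exe
    C:\Program Files\Next Limit\RealFlow 2013\realflownode.exe
    /usr/local/RealFlow/realflownode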

9.41.3 Integrated Submission Script Setup

The following procedures describe how to install the integrated RealFlow submission script. This script allows for submitting RealFlow render jobs to Deadline directly from within the RealFlow editing GUI.

Submitter Installer

• Run the Submitter Installer located at /submission/RealFlow/Installers

Manual Installation of the Submission Script

RealFlow 5 and Later:

• Copy [Repository]\submission\RealFlow\Client\DeadlineRealFlowClient.py to [RealFlow Install Directory]\scripts.
• Launch RealFlow.
• Now you can select Commands -> System Commands -> DeadlineRealFlowClient.py.


RealFlow 4:

• Copy [Repository]\submission\RealFlow\Client\DeadlineRealFlowClient.py to [RealFlow Install Directory]\scripts.
• Launch RealFlow and select Scripts -> Add.


• In the Add Script dialog, for the Name enter ‘Submit To Deadline’, and for the Script enter the path to the DeadlineRealFlowClient.py file that you just copied over. Then click the New Folder button and name the folder ‘Deadline’. Then select the ‘Deadline’ folder and click OK.

• Now you can select ‘Scripts’ -> ‘User Scripts’ -> ‘Deadline’ -> ‘Submit To Deadline’ to launch the submission dialog.

9.41.4 FAQ

What versions of RealFlow are supported by Deadline?

RealFlow versions 3 and later are supported. The integrated submission script is only supported in RealFlow 4 and later. RealFlow 3 jobs can still be submitted from the Monitor.

Does rendering with RealFlow require a separate license?

Yes. You need separate command line licenses to render.

Can I render separate IDOCs from the same scene across different machines?

Yes. You can specify which IDOCs you want to render in the submitter, and a separate job will be submitted for each one.

Why is RealFlow looking for the particle cache on the local C: instead of on the network?


This is likely happening because you are choosing to submit the RealFlow file with the job. This means the file is copied locally to the slave machines, which is why they are looking for the cache locally. If you disable the option to submit the file with the job, the slave machines should be able to find the cache properly.

9.41.5 Error Messages and Meanings

This is a collection of known RealFlow error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know.

Exception during render: [RealFlow Error]: License file not found.

RealFlow requires a separate license for network rendering, and that licensing system needs to be set up before you can render through Deadline.

9.42 REDLine

9.42.1 Job Submission

You can submit REDLine jobs from the Monitor. REDLine is the command line tool that ships with Redcine-X, and previously with REDAlert.



Submission Options

The general Deadline options are explained in the Job Submission documentation. The REDLine specific options are:

• Input R3D File: Specify the R3D file you want to render.
• Output Folder: The folder where the output files will be saved.
• Output Filename: The prefix portion of the output filename. It is not necessary to specify the extension.
• Output Format: The output format. This will determine the filename extension.
• Render Resolution: The resolution to render at.
• Make Output Subfolder: Makes a subdirectory for each output.
• Frame List: The list of frames to render if rendering an animation.
• Renumber Start Frame: The new start frame number (optional).
• Frames Per Task: The number of frames per task.
• Submit Input R3D File With Job: If checked, the input file is submitted with the job to the repository.

Deadline basically supports all the options that are available in the Redcine-X application. It also supports the ability to specify RSX files to use when rendering, so you can set your options in Redcine-X and then use them to render the job through Deadline. Please refer to your Redcine-X documentation for more information about these additional render options.

9.42.2 Plug-in Configuration

You can configure the REDLine plug-in settings from the Deadline Monitor. While in super user mode, select Tools -> Configure Plugins and select the REDLine plug-in from the list on the left.


Render Executables

• REDLine Executable: The path to the REDLine executable file used for rendering. Enter alternative paths on separate lines.

9.42.3 FAQ

Is Redcine-X/REDAlert supported by Deadline?

Yes. Both applications ship with a command line application called REDLine, which Deadline uses to render.

Which Operating System(s) can I render REDLine jobs with?

Currently, REDLine is available on Windows and OSX, so you can render REDLine jobs on these operating systems.

9.42.4 Error Messages and Meanings

This is a collection of known REDLine error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.43 Renderman (RIB)

9.43.1 Job Submission

You can submit Renderman jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. Note that a Draft job can only be submitted if Deadline is able to parse absolute Display paths from the selected rib file. If it cannot extract the output paths, it will let you know during submission so that you can disable the Draft job option.


The RIB specific options are:

• RIB Files: The RIB files to be rendered (can be ASCII or binary formatted). These files should be network accessible.
• Renderer: The renderer that will be used to render the RIB files.
• Additional Arguments: Specify additional command line arguments you would like to pass to the RIB renderer. See the documentation for your particular RIB renderer for additional arguments.

9.43.2 Plug-in Configuration

You can configure the RIB plug-in settings from the Monitor. While in super user mode, select Tools -> Plugins Configuration and select the RIB plugin from the list on the left.

Render Executables

• Executable: The path to the RIB executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each RIB renderer installed on your render nodes.

9.43.3 FAQ

Which RIB renderers are supported by Deadline?

The following renderers are supported:

• 3Delight
• Air
• Aqsis
• BMRT


• Entropy
• Pixie
• PRMan
• RenderDotC
• RenderPipe

If you use a RIB renderer that is not on this list, please contact Deadline Support and let us know.

9.43.4 Error Messages and Meanings

This is a collection of known RIB error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.44 Rendition

9.44.1 Job Submission

You can submit Rendition jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Rendition specific options are:

• Input MI File: The MI file to render. This needs to be on a network location, since Rendition often saves the output file(s) relative to the input MI file location.
• Output File: Optionally override the output file.
• Build To Force: Force 32 bit or 64 bit rendering.
• Skip Existing Frames: If existing images should be skipped.
• Additional Cmd Line Args: Additional command line arguments to pass to Rendition during rendering.
• Tile Rendering Options: Set up a Rendition tile rendering job. Note that this requires you to override the output file. Also make sure that the final image resolution settings are correct, because these are used to determine the size of the tiles to render. The output formats that are supported are BMP, DDS, EXR, JPG, JPE, JPEG, PNG, RGB, RGBA, SGI, TGA, TIF, and TIFF.

9.44.2 Plug-in Configuration

You can configure the Rendition plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Rendition plug-in from the list on the left.

Render Executables

• Rendition Executable: The path to the Rendition executable file used for rendering. Enter alternative paths on separate lines.

9.44.3 FAQ

Is Rendition supported by Deadline?

Yes.

Why do the image format options (like color depth) get reverted to defaults when rendering with Deadline?


This only happens when overriding the output file in the submission script. When we pass the output path to Rendition, it uses the default image format options for the output type. If you don’t want this to occur, simply don’t override the output file.

9.44.4 Error Messages and Meanings

This is a collection of known Rendition error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.45 Rhino

9.45.1 Job Submission

You can submit jobs from within Rhino by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within Rhino, left-click on the ‘Deadline’ button you created during the integrated submission script installation.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Rhino specific options are:

• Rhino File: The Rhino file to be rendered.
• Output File: The filename of the image(s) to be rendered.
• Renderer: Specify the renderer to use.
• Render Bongo Animation: If your Rhino file uses the Bongo animation plugin, you can enable a Bongo animation job.

9.45.2 Region Rendering

Region Rendering Options:

• Enable Region Rendering: If enabled, the image will be rendered in smaller pieces, which are then assembled afterwards.
• Use Jigsaw: Use Jigsaw to determine the regions.
• Tiles in X and Tiles in Y: The number of tiles to divide the job into if not using Jigsaw (see the sketch below).
• Submit Dependent Assembly Job: If enabled, a dependent job will be submitted to assemble the tiles into a single image.
• Cleanup Tiles After Assembly: If enabled, the tiles will be deleted after assembly.
• Error on Missing Tiles: If enabled, the assembly job will fail if any of the tiles are missing.
• Assemble Over: Determines what the tiles will be assembled over, be it nothing, a single image, or the same image as the final image.
• Error on Missing Background: Determines if the assembler should fail if the background image is missing.
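As a rough illustration of how the Tiles in X and Tiles in Y values divide a frame into pixel regions when Jigsaw is not used (this is only a sketch of the idea, not Deadline's or Rhino's actual implementation):

    # Illustrative only: split an image of the given size into a grid of tile regions.
    # Returns (left, top, right, bottom) pixel bounds for each tile.
    def tile_regions(width, height, tiles_x, tiles_y):
        regions = []
        for j in range(tiles_y):
            for i in range(tiles_x):
                left = i * width // tiles_x
                right = (i + 1) * width // tiles_x
                top = j * height // tiles_y
                bottom = (j + 1) * height // tiles_y
                regions.append((left, top, right, bottom))
        return regions

    # Example: a 1920x1080 render split into 2x2 tiles gives four 960x540 regions.
    print(tile_regions(1920, 1080, 2, 2))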


9.45.3 Supported Renderers

Deadline supports many of the Rhino renderers out of the box, including Rhino Render, Flamingo, VRay, Brazil, Penguin, and TreeFrog. If you are using a renderer that Deadline does not currently support, please email Deadline Support and let us know!


It is also possible to manually add new renderers to the list that Deadline supports. Go to \\your\repository\script\Submission\RhinoSubmission and open Renderers.ini in a text editor. You’ll see that this file contains the list of renderers that Deadline currently supports, one per line. Just add the missing renderer as a new line and save the file. Note that the name needs to match that of the renderer exactly!
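For example, if you use a hypothetical renderer called MyRenderer, the edited Renderers.ini might look something like the following, with MyRenderer added as the last line (the existing entries shown here are simply the supported renderers named above; the exact contents of your file may differ):

    Rhino Render
    Flamingo
    VRay
    Brazil
    Penguin
    TreeFrog
    MyRenderer

Remember that each name must exactly match the name the renderer reports in Rhino.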

9.45.4 Plug-in Configuration

You can configure the Rhino plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Rhino plug-in from the list on the left.

Render Executables

• Rhino Executable: The path to the Rhino executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.45.5 Integrated Submission Script Setup

The following procedures describe how to install the integrated Rhino submission script for different versions of Rhino. This script allows for submitting Rhino render Jobs to Deadline directly from within the Rhino editing GUI.

Rhino 5

The following installation procedure is intended and has been tested for Rhino 5.0. • In Rhino, select ‘Tools’ -> ‘Toolbar Layout’.


• Select the Toolbar Collection file that you want to add the Deadline submission button to, and then select 'File' -> 'Import Toolbars...'. Browse to [Repository]\submission\Rhino\Client\ and select the 'deadline.rui' file.
• Check the box next to 'Deadline' and press 'OK'.


• There should now be a toolbar with a 'Deadline' button on your screen, which you can dock anywhere you want.

– Left-click on the button to submit a Rhino Job to Deadline.
– Right-click on the button to launch the Monitor.

Rhino 4

The following installation procedure is intended and has been tested for Rhino 4.0. It is largely similar to the procedure described for Rhino 5 above, with some slight differences. • In Rhino, select Tools -> Toolbar Layout.


• Select the Toolbar collection file that you want to add the Deadline submission button to, then select Toolbar -> Import. Browse to [Repository]\submission\Rhino\Client\ and select the deadline.tb file.
• Check the box next to Deadline and press Import.


• Select File -> Save to save the changes to the selected Toolbar collection file.
• There should now be a toolbar with a 'Deadline' button on your screen, which you can dock.
– Left-click on the button to submit a Rhino job to Deadline.
– Right-click on the button to launch the Monitor.

9.45.6 FAQ

Which versions of Rhino are supported?

Rhino 4 and later are supported.

Does Rhino need to be licensed on each render node?

Yes.

Is the Bongo plugin for animation supported?

Yes. The Rhino submission dialog has the option to render a Bongo animation.

9.45.7 Error Messages and Meanings

This is a collection of known Rhino error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know.


Currently, no error messages have been reported for this plug-in.

9.46 RVIO

9.46.1 Job Submission

You can submit RVIO jobs from the Monitor, or you can right-click on a job and select ‘Scripts’ -> ‘Submit RVIO Job To Deadline’ to automatically populate some fields in the RVIO submitter based on the job’s output.

Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The RVIO submitter allows you to create and save Layers, each of which can contain one or two source images, an arbitrary number of audio files, and a list of overrides.

• Click the New button to add a new Layer.


• Click the Rename button to rename the selected Layer.
• Click the Remove button to remove the selected Layer.
• Click the Clear All button to remove all Layers.
• Click the Load Layers button to load saved Layers from disk.
• Click the Save Layers button to save the list of current Layers to disk.

For Layers, the only required setting is the Source 1 file(s). If specifying a sequence, you can set the range to the right of the file name (the same for the Source 2 file if specified). Note that the .rv file format is also supported as a Source file. For Audio Files, a comma separated list is used to allow the submission of multiple files.

Other than submitting at least one Layer, the only other required option is the Output File under the Output tab. See the RVIO Documentation for more information about the available options and overrides.

Codec Lists

The RVIO submitter pulls its codec settings from the Codecs.txt file in \\your\repository\scripts\submission\RVIOSubmission. The contents of this file were retrieved from running "rvio.exe -formats" in a command prompt. This means that if your installation of RVIO supports additional codecs that aren't available in the submitter, you can run the following command and then replace our original Codecs.txt file with the resulting one:

    rvio.exe -formats > Codecs.txt

9.46.2 Plug-in Configuration

You can configure the RVIO plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the RVIO plug-in from the list on the left.


Render Executables

• RVIO Executable: The path to the rvio executable file used for rendering. Enter alternative paths on separate lines.

9.46.3 FAQ

Is RVIO supported by Deadline? Yes.

9.46.4 Error Messages and Meanings

This is a collection of known RVIO error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.47 Salt

9.47.1 Job Submission

You can submit Salt jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Salt specific options are:

• Verbose Logging Level: The level of logging a Salt job will output.

9.47.2 Plug-in Configuration

You can configure the Salt plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Salt plug-in from the list on the left.


Options

• Salt Executable: The path to the Salt Executable. Enter alternative paths on separate lines.

9.48 Shake

9.48.1 Job Submission

You can submit Shake jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Shake specific options are:

• Shake Script File: The Shake script file to be rendered.


• CPUs: The number of CPUs to use during rendering.
• Additional Arguments: Additional arguments to pass to the Shake command line renderer.

9.48.2 Plug-in Configuration

You can configure the Shake plug-in settings from the Deadline Monitor. While in super user mode, select Tools -> Plugins Configuration and select the Shake plug-in from the list on the left.

Render Executables

• Shake Executable: The path to the Shake executable file used for rendering. Enter alternative paths on separate lines.

9.48.3 FAQ

Is Shake supported by Deadline? Yes.

9.48.4 Error Messages and Meanings

This is a collection of known Shake error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.


9.49 SketchUp

9.49.1 Job Submission

You can submit jobs from within SketchUp by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page. To submit from within SketchUp, select ‘Plugins’ -> ‘Submit To Deadline’.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The SketchUp specific options are:

• SketchUp File: The file to be exported.


• Export Directory: The export destination folder.
• Export File Prefix: The file prefix of the export.
• Export Type: The type of export (3D model, 2D image sequence or 2D image).
• Export Format: The file format of the export file (i.e. .3ds, .png, etc.).
• Frame Rate: Enabled if exporting an image sequence; determines the frequency of images (in frames per second).
• Compression Rate: Float compression factor for JPEG images (between 0.0 and 1.0; lower is smaller size, larger is better quality).
• Width: Width of image in pixels (if 0, uses information from SketchUp file).
• Height: Height of image in pixels (if 0, uses information from SketchUp file).
• Anti-alias: Enables image anti-aliasing.
• Transparent: Enables image transparency.
• Version: The version of SketchUp to use.

9.49.2 Plug-in Configuration

You can configure the SketchUp plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the SketchUp plug-in from the list on the left.

Render Executables

• SketchUp Executable: The path to the SketchUp executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.


9.49.3 Integrated Submission Script Setup

The following procedures describe how to set up the integrated SketchUp submission script for Deadline. This script has been tested with SketchUp 7 and later.

Submitter Installer

• Run the Submitter Installer located at /submission/SketchUp/Installers

Manual Installation of the Submission Script

Windows:

• Copy [Repository]/submission/SketchUp/Client/DeadlineSketchUpClient.rb to the [SketchUp Plugin Directory], which will look different depending on your version of SketchUp.
– For SketchUp 8 and earlier, the plug-in directory may look something like this: C:\Program Files\Google\Google SketchUp #\plugins
– For SketchUp 2013, the plug-in directory may look something like this: C:\Program Files\SketchUp\SketchUp 2013\plugins
– For SketchUp 2014, the plug-in directory may look something like this: C:\Users\[User]\AppData\Roaming\SketchUp\SketchUp 2014\SketchUp\Plugins

Mac OS X:

• Copy [Repository]/submission/SketchUp/Client/DeadlineSketchUpClient.rb to the [SketchUp Plugin Directory], which will look different depending on your version of SketchUp.
– For SketchUp 8 and earlier, the plug-in directory may look something like this: /Library/Application Support/Google SketchUp #/SketchUp/plugins
– For SketchUp 2013 and later, the plug-in directory may look something like this (note: it may have to be in the specific user's /Library/ directory as of 2014): /Library/Application Support/SketchUp #/plugins

9.49.4 FAQ

Which versions of SketchUp are supported by Deadline? The commercial versions of SketchUp 7 and later are supported.

9.49.5 Error Messages and Meanings

This is a collection of known SketchUp error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.50 Softimage

9.50.1 Job Submission

You can submit jobs from within Softimage by installing the integrated submission script, or you can submit them from the Monitor. The instructions for installing the integrated submission script can be found further down this page.


To submit from within Softimage, select the Render toolbar on the left and click Render -> Submit To Deadline.


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Softimage specific options are:

• Workgroup: Specify the workgroup that Softimage should use during rendering. Leave blank to ignore.
• Force Build: Force 32 bit or 64 bit rendering.
• Submit Softimage Scene File: The Softimage scene file will be submitted with the job. If your Softimage scene is stored in a project folder on the network, it is recommended that you leave this box unchecked.
• Threads: The number of render threads to use during rendering.
• Use Softimage Batch Plugin: This plugin keeps Softimage and the scene loaded in memory between tasks.


• Enable Local Rendering: If enabled, the frames will be rendered locally, and then will be copied to the final network location. Note that this feature doesn't support the "Skip Existing Frame" option.
• Skip Batch Licensing Check: If enabled, Softimage won't try to check out a Batch license during rendering. This allows you to use 3rd party renderers like VRay or Arnold without using a Softimage batch license.

Selecting passes to render:

• Select which passes you would like to render. A separate job is submitted for each pass. If no passes are selected, then the current pass is submitted. Note that if you are using FxTree Rendering, the passes are ignored.

Setting up a tile rendering job:


• Enable tile rendering to split up a frame into multiple tiles that are rendered individually. By default, a separate job is submitted for each tile (this allows for tile rendering of a sequence of frames). For easier management of single frame tile rendering, you can choose to submit all the tiles as a single job.
• You can submit a dependent assembly job to assemble the image when the main tile job completes. If using Draft for the assembly, you'll need a license from Thinkbox. Otherwise, the output formats that are supported are BMP, DDS, EXR, JPG, JPE, JPEG, PNG, RGB, RGBA, SGI, TGA, TIF, and TIFF.
• Note that the Error On Missing Tiles option only applies to Draft assemblies.
• Note that if you are using FxTree Rendering, the tile rendering settings are ignored.

Setting up an FxTree rendering job:

• Enable FxTree rendering to render a specific FxTree output node, which can be selected from the FxTree Output dropdown box. You can also set the frame offset for the output files. Some things to note are:
– The frame range to be rendered is pulled from the Frame List setting under the Submission Options tab.
– If you are rendering to a movie file, be sure to set the Group Size to the number of frames in your animation.


Notes:

• Softimage gives the option to specify file paths as being relative to the current directory or absolute. Deadline requires that all file paths be absolute.
• When specifying the image output, make sure to include the extension (.pic, .tga, etc) at the end so that you can view the individual rendered images from the task list in the Monitor.

Redshift Renderer Options

If submitting a Softimage scene that uses the Redshift renderer, there will be an additional option in the integrated submitter called GPUs Per Task. If set to 0 (the default), then Redshift will be responsible for choosing the GPUs to use for rendering. If this is set to 1 or greater, then each task for the job will be assigned specific GPUs. This can be used in combination with concurrent tasks to get a distribution over the GPUs. For example:

• If this is set to 1, then tasks rendered by the Slave's thread 0 would use GPU 0, thread 1 would use GPU 1, etc.
• If this is set to 2, then tasks rendered by the Slave's thread 0 would use GPUs {0,1}, thread 1 would use GPUs {2,3}, etc.
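The mapping described above amounts to: concurrent task thread N with a GPUs Per Task value of G uses GPUs N*G through N*G+G-1. The following is an illustrative sketch of that arithmetic only, not the plugin's actual code:

    # Illustrative sketch of the GPUs Per Task mapping described above.
    # thread_index is the slave's concurrent task thread (0, 1, 2, ...);
    # gpus_per_task is the GPUs Per Task value from the submitter.
    def gpus_for_thread(thread_index, gpus_per_task):
        first = thread_index * gpus_per_task
        return list(range(first, first + gpus_per_task))

    # With GPUs Per Task set to 2: thread 0 -> [0, 1], thread 1 -> [2, 3],
    # matching the example above.
    print(gpus_for_thread(0, 2), gpus_for_thread(1, 2))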

9.50.2 Cross-Platform Rendering Considerations

In order to perform cross-platform rendering with Softimage, you must setup Mapped Paths so that Deadline can swap out file paths where appropriate. You can access the Mapped Paths Setup in the Monitor while in super user mode by selecting Tools -> Configure Repository. You’ll find the Mapped Paths Setup in the list on the left.

9.50.3 Plug-in Configuration

You can configure the Softimage and SoftimageBatch plug-in settings from the Monitor. While in super user mode, select Tools -> Plugins Configuration and select the Softimage or SoftimageBatch plug-in from the list on the left.


Softimage

Render Executables

• Softimage Render Executable: The path to the XSIBatch.bat file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

Options

• Enable Strict Error Checking: If enabled, Deadline will fail the job in almost all cases whenever Softimage prints out 'ERROR' for whatever reason.
• Return Codes To Ignore: Error codes (other than 0) that Deadline should ignore and instead assume the render has finished successfully. Use a ; to separate the error codes.


SoftimageBatch

Render Executables

• Softimage Render Executable: The path to the XSIBatch.bat file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

Options

• Enable Strict Error Checking: If enabled, Deadline will fail the job in almost all cases whenever Softimage prints out 'ERROR' for whatever reason.
• Connection Timeout: The number of seconds to give the Deadline plugin and Softimage to establish a connection before the job fails.
• Timeout For Progress Updates: The number of seconds between Softimage progress updates before the job is failed. Set to 0 to disable this feature.

9.50.4 Integrated Submission Script Setup

The following procedures describe how to install the integrated Softimage submission script. This script allows for submitting Softimage render jobs to Deadline directly from within the Softimage editing GUI.

Earlier versions of Softimage might not ship with Python out of the box. In this case, follow these steps:

• Install the Python engine for Softimage. For more information, see the Softimage Python installation documentation.
• Check that the Python engine has been installed correctly. This can be done by opening up Softimage and selecting File -> Preferences. Under the Scripting preferences, you should have the option to select Python as the Script Language. If you don't see this option, then Python has not been installed correctly, and you should contact the Softimage support team.


Once Python is an available scripting option in Softimage, you can follow these steps to install the submission script. You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/Softimage/Installers

Manual Installation of the Submission Script

• Copy the file [Repository]/submission/Softimage/Client/DeadlineSoftimageClient.py to the folder [Softimage Install Directory]/Application/Plugins
• Launch Softimage. The submission script is automatically installed when Softimage starts up.

To make sure the script was installed correctly, select the Render toolbar on the left and click the Render button. A Submit To Deadline menu item should be available.


Custom Sanity Check

A CustomSanityChecks.py file can be created alongside the main Softimage submission scripts (in [Repository]\submission\Softimage\Main), and will be evaluated if it exists. This script will let you set any of the initial properties in the submission script prior to displaying the submission window. You can also use it to run your own checks and display errors or warnings to the user. Here is a very simple example of what this script could look like:

    import win32com.client
    Application = win32com.client.Dispatch('XSI.Application')

    def RunSanityCheck( opSet ):
        opSet.Parameters("DepartmentTextBox").Value = "The Best Department!"
        opSet.Parameters("PriorityNumeric").Value = 33
        opSet.Parameters("BatchBox").Value = True

        Application.LogMessage("This is a custom sanity check!")

        return True

The opSet parameters can be found in the SoftimageToDeadline.py script in the Main folder mentioned above. Look for the following line in the script:

    opSet = Application.ActiveSceneRoot.AddProperty("CustomProperty", False, "SubmitSoftimageToDeadline")

After this line, all the available parameters are added to the opSet. These can be used to set the initial values in the submission dialog. Finally, if the RunSanityCheck method returns False, the submission will be canceled.
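As a further, purely hypothetical example, a studio could use the same mechanism to enforce a default priority cap before the dialog is shown. The sketch below only touches parameters already shown above; the MAX_PRIORITY value is an assumed studio-specific setting, not a Deadline default:

    import win32com.client
    Application = win32com.client.Dispatch('XSI.Application')

    MAX_PRIORITY = 75  # assumed studio-specific cap, not a Deadline default

    def RunSanityCheck( opSet ):
        # Cap the initial priority shown in the submission dialog.
        if opSet.Parameters("PriorityNumeric").Value > MAX_PRIORITY:
            opSet.Parameters("PriorityNumeric").Value = MAX_PRIORITY
            Application.LogMessage("Priority capped at %d by CustomSanityChecks.py" % MAX_PRIORITY)

        # Always default to the Softimage Batch plugin.
        opSet.Parameters("BatchBox").Value = True

        # Returning False here would cancel the submission entirely.
        return True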

9.50.5 FAQ

Which versions of Softimage are supported?

Softimage versions 2010 and later are supported.

What is the difference between the Softimage and SoftimageBatch plug-ins?

The SoftimageBatch plug-in keeps the scene loaded in memory between subsequent tasks for the same job. This saves on the overhead of having to load Softimage and the scene file for each task. The Softimage plug-in uses standard command line rendering, and should only be used if you experience problems with the SoftimageBatch plug-in.

Is FxTree rendering supported?

Yes. Simply enable FxTree rendering in the submission dialog and specify the FxTree and Output Node you want to render.

Is the Arnold renderer for Softimage supported?

Yes. Deadline supports the Arnold plug-in for Softimage, as well as Arnold's standalone renderer (kick.exe). For more information on rendering Arnold Standalone jobs, see the Arnold Standalone Plug-in Guide.

Can Softimage script jobs be submitted to Deadline?

Yes. Deadline provides very basic support for script jobs, though there is currently no interface to submit them. The option for submitting a script job can be specified in the plug-in info file.

After installing the Softimage integrated submission script, Softimage fails to load (it goes to a white screen and hangs).

We have heard of this problem before, but we have not been able to reproduce it. The workaround for this problem is to remove the script from the plugins folder, and manually path to the submission script plugin after starting Softimage.

When Deadline renders the job, Softimage isn't able to find anything in the scene's project folder.

If your Softimage scene file is saved in a project folder on the network, leave the Submit Softimage Scene File check box unchecked in the submission dialog. This allows Deadline to load the Softimage scene in the context of its project folder.

I have Softimage configured to save output to a network share, but when Deadline renders the scene, the render slaves save their output to their local C drive rather than to the network share.

There are two possible solutions:


1. If your Softimage scene file is saved in a project folder on the network, leave the Submit Softimage Scene File check box unchecked in the submission dialog. This is the recommended solution.
2. Specify the full resolved path for the scene output directory, instead of something like [Project Path]\Render_Pictures.

Rendering with Deadline seems a lot slower than rendering through Softimage itself.

If you're submitting your jobs with the Use Softimage Batch option disabled, then Softimage needs to be restarted and the scene needs to be reloaded for every task in the job, which can add a lot of overhead to the render time, especially if cached data needs to be loaded. To speed up your renders, you can increase the task group size (aka chunk size) from 1 to 5 or 10. This way, the scene is loaded once for every 5 or 10 frames. Increasing the chunk size like this is recommended if you know ahead of time that your frames will only take seconds to render, or if a large amount of cached data needs to be loaded.

9.50.6 Error Messages and Meanings

This is a collection of known Softimage error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know.

Exception during render: Renderer returned non-zero error code, -1073741819

The error code -1073741819 is equivalent to 0xC0000005, which represents a Memory Access Violation error. So Softimage is either running out of memory, or memory is becoming corrupt. If you find that your frames are still being rendered, you can configure the Softimage plugin configuration to ignore this error.

Exception during render: ERROR : 2000 - Library not found: ...

This can occur if the Slave application's environment variables haven't been updated. Try rebooting the machine and see if that fixes the problem.

ERROR : 2004 - Invalid pointer - [line 2]

You can work around this by renaming the ICEFlow plugin (Application\Plugins\ICEFlow.dll). This plugin manages the transfer of data between Softimage and Maya (the one-click ICE workflow).

9.51 Terragen

9.51.1 Job Submission

You can submit Terragen jobs from the Monitor.



Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Terragen specific options are:

• Project File: The Terragen project file to render.
• Render Node: Select the render node to render. Leave blank to use the default in the project.
• Output: Override the output path in the project file. If rendering a sequence of frames, remember to include the %04d format in the output file name so that padding is added to each frame.
• Extra Output: Override the extra output path in the project file. If rendering a sequence of frames, remember to include the IMAGETYPE.%04d format in the output file name so that padding is added to each frame.
• Enable Local Rendering: If enabled, the frames will be rendered locally, and will then be copied to the final network location. Note that this requires that an Output file be specified above.
• Version: The version of Terragen.

9.51.2 Plug-in Configuration

You can configure the Terragen plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Terragen plug-in from the list on the left.

Render Executables

• Terragen CLI Executable: The path to the Terragen executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.


9.51.3 FAQ

Which versions of Terragen are supported?

The commercial versions of Terragen 2 and later are supported.

9.51.4 Error Messages and Meanings

This is a collection of known Terragen error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.52 Tile Assembler

9.52.1 Job Submission

You can submit Tile assembler jobs from the Monitor. Normally, these jobs are submitted as dependent jobs for your original tile jobs, but you can submit them manually if you wish. Please note that the Tile Assembler plugin is EOL (End-Of-Life/deprecated) and we recommend using the newer Draft Tile Assembler plugin for all tile/region assembly duties.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Tile Assembler specific options are:

• Input Tile Files: Select just one of your image tile files from a group to perform the tile assembly for. The files should have the format [PREFIX]_tile_[I]x[J]_[X]x[Y].[EXTENSION]. For example, r:\projects\deadline\Tests\example_tile_1x1_2x1_0000.exr. Ensure the filenames match this naming convention (a small parsing sketch follows this list).
• Tiles Are Uncropped: Enable this option if a tile consists of the full resolution of the image, with only a part of it rendered.


• Ignore Overlap: If assembling uncropped tiles, enable this option to ignore any overlap that exists for the given tiles. For example, if two tiles share a few pixels between them.
• Clean Up Tile Files After Assembly: Enable to automatically delete the tile files after successfully assembling the final image.
• Opaque Opacity: Use this option if non-exr tiles use opaque opacity in 'empty' pixels.
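As a rough illustration of the naming convention described above (a sketch only, not the assembler's actual code), the tile indices and grid size can be pulled out of a filename like this:

    import re

    # Illustrative only: find the tile token in a
    # [PREFIX]_tile_[I]x[J]_[X]x[Y].[EXTENSION] style filename and return
    # the tile indices and the grid size.
    TILE_TOKEN = re.compile(r"_tile_(\d+)x(\d+)_(\d+)x(\d+)")

    def parse_tile_filename(filename):
        match = TILE_TOKEN.search(filename)
        if match is None:
            raise ValueError("No tile token found in: %s" % filename)
        i, j, x, y = (int(value) for value in match.groups())
        return {"tile": (i, j), "grid": (x, y)}

    # The example path from above: tile (1,1) of a 2x1 grid.
    print(parse_tile_filename(r"r:\projects\deadline\Tests\example_tile_1x1_2x1_0000.exr"))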

9.52.2 Plug-in Configuration

The Tile Assembler plug-in does not require any configuration.

9.52.3 FAQ

Is the Tile Assembler plugin still officially supported in Deadline? No. Please note that the Tile Assembler plugin is EOL (End-Of-Life) and we recommend using the newer Draft Tile Assembler plugin for all tile/region assembly duties. The old Tile Assembler system is still available in Monitor and via some of the in-app tile rendering submission scripts and will still work. However, it is now deprecated, so please do not build any in-house tools around Tile Assembler. The newer Draft Tile Assembler contains all the features of the old Tile Assembler and more! Tile Assembler will be removed at an undetermined date in the future. You have been warned!

9.52.4 Error Messages And Meanings

This is a collection of known Tile Assembler error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know. Please note that the Tile Assembler plugin is EOL (End-Of-Life) and we recommend using the newer Draft Tile Assembler plugin for all tile/region assembly duties.

ERROR: ImageMagick: Invalid bit depth for RGB image '[path to tile/region render output image]'

This error is due to the TileAssembler executable not supporting certain bit depth images, such as VRay's render elements "Reflection", "Refraction" and "Alpha" when saved from the VRay Frame Buffer (VFB). Please use the newer Draft Tile Assembler plugin to ensure all image types/bit depths are correctly assembled. Draft Tile Assembler jobs can also be submitted independently if you already have the *.config file(s); this is explained further in the Draft Tile Assembler documentation.

9.53 VRay Distributed Rendering

9.53.1 Interactive Distributed Rendering

You can submit interactive VRay DBR jobs from 3ds Max, Maya, or Softimage. The instructions for installing the integrated submission script can be found further down this page. The interactive submitter will submit a VRay Spawner job (VRay standalone for Maya, Softimage, 3dsMaxRT, Rhino, Sketchup) to reserve render nodes, and the submitter will automatically update the VRay server list.

Do NOT execute or install the Chaos Group VRaySpawner (VRaySpawner/VRaySpawner RT/VRay standalone) executable as a background service (NT service/daemon). Deadline is more flexible here and will spawn the VRaySpawner/standalone executable as a child process of the Deadline Slave. This makes our system more flexible and resilient to crashes, because when we terminate the VRay DBR job in the Deadline queue, the Deadline Slave application will 'cleanly' tidy up VRaySpawner/standalone and, more importantly, any DCC application (3dsMax/Maya) or standalone instances which it in turn has spawned as a child process. This can be helpful if VRay DBR becomes unstable and a user wishes to reset the system remotely. You can simply re-queue or delete/complete the current DBR job or re-submit.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The VRay DBR specific options are:

• Maximum Servers: The maximum number of VRay Servers to reserve for distributed rendering.
• Port Number (Softimage/Maya/3dsMax/3dsMaxRT only): The port number that VRay will use for distributed rendering. In the case of Softimage, this is necessary because Softimage uses VRay standalone for distributed rendering and the default port number for VRay in Softimage is different from the default port number in VRay standalone. The port number needs to be identical on all machines, including the workstation machine, for a particular DCC application to communicate correctly. It is recommended to disable any client firewall whilst initial testing/configuration is carried out. Typically, opening TCP/UDP ports in the range 20200-20300 will cover all VRay implementations for DBR.
• Use Server IP Address instead of Host Name: If checked, the Active Servers list will show the server IP addresses instead of host names.
• Automatically Update Server List (3dsMax only): When un-checked, this option stops the automatic refresh of the active servers list based on the current Deadline queue.
• Complete Job After Render (3dsMax only): When checked, as soon as the DR session has completed (max quick render finished), the Deadline job will be marked as complete in the queue.


Rendering

After you’ve configured your submission options, press the Reserve Servers button to submit the VRay Spawner job. The job’s ID and Status will be tracked in the submitter, and as nodes pick up the job, they will show up in the Active Servers list. Once you are happy with the server list, press Start Render (3ds Max and Maya) or Render Current Pass/Render All Passes (Softimage) to start distributed rendering. Note that the VRay Spawner/VRay standalone process can sometimes take a little while to initialize. This means that a server in the Active Server list could have started the VRay Spawner, but it’s not fully initialized yet. If this is the case, it’s probably best to wait a minute or so after the last server has shown up before pressing Start Render. After the render is finished, you can press Release Servers or close the submitter UI (Setup VRay DBR With Deadline) to mark the VRay Spawner/VRay standalone job as complete so that the render nodes can move on to another job.

9.53.2 VRay Spawner/VRay standalone Submission

You can also submit VRay Spawner/VRay standalone jobs from the Monitor, which can be used to reserve render nodes for distributed rendering. Note that if you submit the job via the Monitor submission script, you will need to manually configure/update your local workstation settings to point to the correct, corresponding Deadline slaves over an identical port number.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The VRay DBR specific options are:

• Maximum Servers: The maximum number of VRay Servers to reserve for distributed rendering.
• Application: The application you will be initiating the distributed render from.
• Version: The version of the application, if applicable.
• Port Number (Softimage/Maya/3dsMaxRT only): The port number that VRay will use for distributed rendering. In the case of Softimage, this is necessary because Softimage uses VRay standalone for distributed rendering and the default port number for VRay in Softimage is different from the default port number in VRay standalone. The port number needs to be identical on all machines, including the workstation machine, for a particular DCC application to communicate correctly. It is recommended to disable any client firewall whilst initial testing/configuration is carried out. Typically, opening TCP/UDP ports in the range 20200-20300 will cover all VRay implementations for DBR.

Rendering

After you've configured your submission options, press the Submit button to submit the VRay Spawner/VRay standalone job. Note that this doesn't start any rendering, it just allows the VRay Spawner/VRay standalone application to start up on nodes in the farm. Once you're happy with the nodes that have picked up the job, you can initiate the distributed render manually from within the application (ie: Rhino or Sketchup). This will likely require manually configuring your VRay Server list. After the distributed render has finished, remember to mark the job as complete or delete it so that the nodes can move on to other jobs.

9.53.3 Plug-in Configuration

You can configure the VRay Spawner/VRay standalone plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the VRaySpawner plug-in from the list on the left.

VRay Executables

Here you can specify the executable used for rendering for the different versions of VRay.

DR Process Handling

• Handle Existing DR/DBR Process: Only one instance of the same DR process running over the same port is possible. This option allows Deadline to either fail the task if this is the case, or attempt to kill the currently running process, to allow the Deadline managed DR process to run successfully. Note, if the process is set to kill and does indeed kill a currently present process, but the process seems to auto-restart even after killing, then this indicates the process is already running as a service and the service will need to be stopped by your IT staff. Do NOT install as a service, as Deadline can NOT support this configuration.

DR Session Timeout (unsupported in 3dsMax)

• DR Session Auto Timeout Enable: If enabled, when a DR session has successfully completed on a slave, the task on the slave will be marked as complete after the DR session auto timeout period in seconds has been reached (Default: False).
• DR Session Auto Timeout (Seconds): This is the timeout period (Default: 30 seconds) after which a DR session will time out and be marked as complete by a slave.

9.53.4 Integrated Submission Script Setup

There are integrated VRay DBR submission scripts for 3ds Max, Maya, and Softimage. The installation process for these scripts can be found below. You can also submit VRay Spawner jobs for Rhino and Sketchup from the Monitor. In this case, the render nodes will simply be reserved for DBR, and the distributed rendering process itself will have to be initiated manually from within Rhino or Sketchup.

3ds Max

The following procedures describe how to install the integrated VRay DBR submission script for 3ds Max. The integrated submission script and the following installation procedure have been tested with Max versions 2012 and later (including Design editions).

Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work. However, this bug has been addressed in 3ds Max 2012 Hotfix 1.

You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/3dsmaxVRayDBR/Installers

Manual Installation of the Submission Script

• Copy [Repository]/submission/3dsmaxVRayDBR/Client/Deadline3dsmaxVRayDBRClient.mcr to [3ds Max Install Directory]/MacroScripts. If you don't have a MacroScripts folder in your 3ds Max install directory, check to see if you have a UI/Macroscripts folder instead, and copy the file there if you do.
• Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms


Maya

The following procedure describes how to install the integrated VRay DBR submission script for Maya. The integrated submission script and the following installation procedure have been tested with Maya versions 2012 and later.

You can either run the Submitter installer or manually install the submission script.

Submitter Installer

• Run the Submitter Installer located at /submission/MayaVRayDBR/Installers

Manual Installation of the Submission Script

• On Windows, copy the file [Repository]\submission\MayaVRayDBR\Client\DeadlineMayaVRayDBRClient.mel to [Maya Install Directory]\scripts\startup. If you do not have a userSetup.mel in [My Documents]\maya\scripts, copy the file [Repository]\submission\MayaVRayDBR\Client\userSetup.mel to [My Documents]\maya\scripts. If you have a userSetup.mel file, add the following line to the end of this file:


source "DeadlineMayaVRayDBRClient.mel";

• On Mac OS X, copy the file [Repository]/submission/MayaVRayDBR/Client/DeadlineMayaVRayDBRClient.mel to [Maya Install Directory]/Maya.app/Contents/scripts/startup. If you do not have a userSetup.mel in /Users/[USERNAME]/Library/Preferences/Autodesk/maya/scripts, copy the file [Repository]/submission/MayaVRayDBR/Client/userSetup.mel to /Users/[USERNAME]/Library/Preferences/Autodesk/maya/scripts. If you have a userSetup.mel file, add the following line to the end of this file:

source "DeadlineMayaVRayDBRClient.mel";

• On Linux, copy the file [Repository]/submission/MayaVRayDBR/Client/DeadlineMayaVRayDBRClient.mel to [Maya Install Directory]/Maya.app/Contents/scripts/startup. If you do not have a userSetup.mel in /home/[USERNAME]/maya/scripts, copy the file [Reposi- tory]/submission/MayaVRayDBR/Client/userSetup.mel to /home/[USERNAME]/maya/scripts. If you have a userSetup.mel file, add the following line to the end of this file:

source "DeadlineMayaVRayDBRClient.mel";

• The next time Maya is started, a Deadline shelf should appear with an orange button that can be clicked on to launch the submitter.

• If you don’t see the Deadline shelf, it’s likely that Maya is loading another userSetup.mel file from somewhere. Maya can only load one userSetup.mel file, so you either have to configure Maya to point to the file mentioned above, or you have to modify the file that Maya is currently using as explained above. To figure out which userSetup.mel file Maya is using, open up Maya and then open up the Script Editor. Run this command:

whatIs userSetup.mel

Softimage

The following procedure describes how to install the integrated VRay DBR submission script for Softimage. The integrated submission script and the following installation procedure have been tested with Softimage versions 2012 and later. You can either run the Submitter installer or manually install the submission script.

Submitter Installer
• Run the Submitter Installer located at /submission/SoftimageVRayDBR/Installers

Manual Installation of the Submission Script
• Copy the file [Repository]/submission/SoftimageVRayDBR/Client/DeadlineSoftimageVRayDBRClient.py to the folder [Softimage Install Directory]/Application/Plugins


• Launch Softimage. The submission script is automatically installed when Softimage starts up. To make sure the script was installed correctly, select the Render toolbar on the left and click the Render button. A Setup VRay DBR With Deadline menu item should be available.

9.53.5 FAQ

Is VRay Distributed Rendering (DBR) supported?
Yes. A special 'reserve' job is submitted that runs the VRay Spawner/VRay Standalone process on the render nodes. Once the VRay Spawner/VRay Standalone process is running, these nodes can participate in distributed rendering.

Which versions of VRay DBR are supported?
VRay DBR interactive rendering is supported for 3ds Max, Maya, and Softimage 2012-2015. You can also submit VRay Spawner jobs for Rhino and SketchUp from the Monitor. In the latter case, the render nodes will simply be reserved for DBR, and the distributed rendering process itself will have to be initiated manually from within Rhino or SketchUp.

The VRay Slave or VRay Spawner application fails to start manually?
During initial configuration of VRay DBR, and for any future debugging, it is recommended to disable any firewall and anti-virus software on the DBR master host machine as well as on all render slave machines that are intended to participate in the DBR render. We suggest you manually get VRay DBR up and running in your studio pipeline to verify all is well before introducing Deadline as the framework that handles the Spawner/Slave process.

Is Backburner required for 3ds Max based VRay DBR rendering via Deadline?
Yes. Normal 3ds Max rendering via Deadline requires the Backburner DLLs to be present on a system, and this is the same prerequisite for VRay DBR rendering to work correctly. Ensure you have the latest/corresponding version of Backburner so that it supports the version of 3ds Max you are using. You can submit a normal 3ds Max render job to verify that Backburner and 3ds Max rendering via Deadline are operating correctly before attempting to configure VRay DBR rendering. Use the Deadline job report to verify that correctly matched versions of Backburner and 3ds Max are in use.

3dsmax.exe starts (via vrayspawnerYYYY.exe) in the taskbar (minimized) but then instantly disappears?
VRay DBR rendering requires Deadline to have rendered at least one normal 3ds Max render job on the slave machine prior to attempting DBR rendering via vrayspawnerYYYY.exe. To test whether this is the issue, try to manually start the vrayspawnerYYYY.exe program from the Start menu (Start menu > Programs > Chaos Group > V-Ray for 3dsmax > Distributed rendering > Launch V-Ray DR spawner). It will automatically try to find the 3dsmax.exe file and start it in server mode. You should end up with 3ds Max minimized in the taskbar with the title "vraydummyYYYY.max". If 3ds Max stays open without closing, then VRay DBR is working correctly. If you see the 3ds Max window flashing on the taskbar and then instantly disappearing, right-click on the V-Ray DR spawner icon in the taskbar tray, select "Exit" to close the DR spawner application, and submit a regular Deadline 3ds Max render job with this machine running the Deadline Slave. After that, try to start the V-Ray DR spawner again.

Do I need to run or install the vrayspawner (or RT/vrayslave/vray standalone) executable as a service/daemon on each machine?
No. Do NOT execute or install the Chaos Group VRaySpawner (VRaySpawner/VRaySpawner RT/VRay Standalone) executable as a background service (NT service/daemon). Deadline will spawn the VRaySpawner/Standalone executable as a child process of the Deadline Slave. This makes the system more flexible and resilient to crashes: when the VRay DBR job is terminated in the Deadline queue, the Deadline Slave application cleanly tidies up VRaySpawner/Standalone and, more importantly, any DCC application (3ds Max/Maya) or standalone instances that it in turn has spawned as child processes. This can be helpful if VRay DBR becomes unstable and a user wishes to reset the system remotely. You can simply re-queue, delete, complete, or re-submit the current DBR job.

Can I force VRaySpawner to run over a certain port?
Yes. Set the system environment variable "VRAY_DR_CONTROLPORT" to the required port number, or, for some supported applications, use the Port Number option exposed in our Monitor/in-app submitters. Note: the port numbers were changed in VRay for 3ds Max between version 2 and version 3. Consult the VRay user manual for more information.

VRay DBR rendering seems a little unstable sometimes, or my machine slows down dramatically!
Depending on the number of slave machines being used (note that desktop Windows editions such as Windows 7 limit concurrent incoming connections to 20), the size of the scene and asset files being moved around, and your network/file storage configuration, it may help to disable your local machine from participating in the DR render process. Depending on the 3D application and VRay version used, there might be a "Use local host" or "Don't use local machine" checkbox option, which can help to reduce the load on your local machine.

Can I fully off-load 3ds Max VRay or Mental Ray DBR rendering from my machine?
Yes, this is supported in the 3ds Max plugin. See the VRay/Mental Ray DBR section for more information.

9.53.6 Error Messages and Meanings

This is a collection of known VRay DBR error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know.

error: Failed to start network server: Failed to open listening port (98)
VRay.exe/vrayslave has been configured as a service/daemon on the machine generating this error message, possibly during the VRay/Maya install process, and this is conflicting with Deadline trying to spawn the same process on the same TCP port (default: 20207). On Linux, check the contents of the file "/usr/autodesk/maya20##-x64/vray/bin/vrayslave" (where ## is the Maya version) for a line entry such as "/usr/autodesk/maya2014-x64/vray/bin/vray.bin $* -server -portNumber=20207". This line entry should not be present. Note that the VRay Spawner plugin is unable to attach to an already running process, hence the vray executable must NOT already be running. Do NOT execute or install VRay as a service. Deadline will spawn the executable as a child process of the Deadline Slave.

9.54 VRay Standalone

9.54.1 Job Submission

You can submit VRay Standalone jobs from the Monitor.



Setup your vrscene Files

Before you can submit a VRay Standalone job, you must export your scene into .vrscene files. You can export into either one .vrscene file with all your frames in it, or one .vrscene file per frame.

Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The VRay specific options are:
• VRay File: The VRay file (*.vrscene) to be rendered. If you are submitting a sequence of vrscene files (one file per frame), you only need to select one vrscene file from the sequence.
• Output File: Optionally override the output file name.
• Separate Input vrscene Files Per Frame: Select this option if you are submitting a sequence of vrscene files (one file per frame).
• Threads: The number of threads to use for rendering. Specify 0 to use the optimal number of threads.
• Command Line Args: Specify additional command line arguments you would like to pass to the VRay renderer.
• Vrimg2Exr Options: If you are saving out vrimg files, you can submit a dependent Vrimg2Exr job that will convert the vrimg files to exr files.
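Under the hood, these options end up as plain key=value pairs in the job info and plugin info files described in the Manual Job Submission documentation. The following is a minimal, illustrative sketch of submitting a VRay Standalone job from a script; the plugin name and the plugin info keys (InputFilename, Threads) are assumptions for illustration rather than a verified list, so check the Manual Job Submission documentation and the Vray plugin in your repository for the exact names. It also assumes deadlinecommand is on the PATH.

import os
import subprocess
import tempfile

def submit_vray_job(vrscene, frames="1-100", threads=0):
    # Generic job info file. Plugin, Name and Frames are standard job info keys;
    # the plugin name "Vray" is assumed here and should be confirmed against your repository.
    job_info = os.path.join(tempfile.gettempdir(), "vray_job_info.job")
    with open(job_info, "w") as f:
        f.write("Plugin=Vray\n")
        f.write("Name=VRay Standalone test\n")
        f.write("Frames=%s\n" % frames)

    # Plugin info file. The keys below are illustrative assumptions, not a verified list.
    plugin_info = os.path.join(tempfile.gettempdir(), "vray_plugin_info.job")
    with open(plugin_info, "w") as f:
        f.write("InputFilename=%s\n" % vrscene)
        f.write("Threads=%d\n" % threads)  # 0 = use the optimal number of threads

    # deadlinecommand accepts the job info file and plugin info file as its two arguments.
    return subprocess.call(["deadlinecommand", job_info, plugin_info])

if __name__ == "__main__":
    submit_vray_job(r"\\fileserver\renders\shot010\shot010.vrscene", frames="1-10")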

9.54.2 Plug-in Configuration

You can configure the VRay plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the VRay plug-in from the list on the left.


Render Executables • VRay Executable: The path to the VRay executable file used for rendering. Enter alternative paths on separate lines. Path Mapping For vrscene Files (For Mixed Farms) • Enable Path Mapping For vrscene Files: If enabled, a temporary vrscene file will be created locally on the slave for rendering and Deadline will do path mapping directly in the vrscene file.

9.54.3 FAQ

Is VRay Standalone supported? Yes.

9.54.4 Error Messages and Meanings

This is a collection of known VRay error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.55 VRay Ply2Vrmesh

9.55.1 Job Submission

You can submit Ply2Vrmesh jobs from the Submit menu in the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Ply2Vrmesh specific options are:
• Input File: The file to be converted.
• Output File: Optionally override the output file name. If left blank, the output name will be the same as the input name (with the vrmesh extension).
• Append: Appends the information as a new frame to the .vrmesh file.
• Merge Voxels: Merges objects before voxelization to reduce overlapping voxels.
• Smooth Angle: A floating-point number that specifies the angle (in degrees) used to decide whether normals should be smoothed. If present, it automatically enables the -smoothNormals flag.
• Smooth Normals: Generates smooth vertex normals. Only valid for .obj and .geo files; always enabled for .bin files.
• Map Channel: Stores the UVW coordinates to the specified mapping channel (default is 1). Only valid for .obj and .geo files. When exporting a mesh that will be used in Maya, this must currently be set to 0 or the textures on the mesh will not render properly.
• FPS: A floating-point number that specifies the frames per second at which a .geo or .bin file is exported, so that vertex velocities can be scaled accordingly. The default is 24.0.
• Preview Faces: Specifies the maximum number of faces in the .vrmesh preview information. Default is 9973 faces.
• Faces Per Voxel: Specifies the maximum number of faces per voxel in the resulting .vrmesh file. Default is 10000 faces.
• Preview Hairs: Specifies the maximum number of hairs in the .vrmesh preview information. Default is 500 hairs.
• Segments Per Voxel: Specifies the maximum number of segments per voxel in the resulting .vrmesh file. Default is 64000 segments.
• Hair Width Multiplier: Specifies the multiplier used to scale hair widths in the resulting .vrmesh file. Default is 1.0.
• Preview Particles: Specifies the maximum number of particles in the .vrmesh preview information. Default is 20000 particles.
• Particles Per Voxel: Specifies the maximum number of particles per voxel in the resulting .vrmesh file. Default is 64000 particles.
• Particle Width Multiplier: Specifies the multiplier used to scale particles in the resulting .vrmesh file. Default is 1.0.
• Velocity Attr Name: Specifies the name of the point attribute which should be used to generate the velocity channel. By default the 'v' attribute is used.
• Disable Color Set Packing: Only valid for .geo and .bgeo files; disables the packing of float1 and float2 attributes in vertex color sets.
• Material IDs: Only valid for .geo files; assigns material IDs based on the primitive groups in the file.
• Flip Normals: Reverses the face/vertex normals. Only valid for .obj, .geo and .bin files.
• Flip Vertex Normals: Reverses the vertex normals. Only valid for .obj, .geo and .bin files.
• Flip Face Normals: Reverses the face normals. Only valid for .obj, .geo and .bin files.
• Flip YZ: Swaps the Y/Z axes. Needed for some programs, e.g. ZBrush. Valid for .ply, .obj, .geo and .bin files.
• Flip Y Positive Z: Same as -flipYZ but does not reverse the sign of the Z coordinate.
• Flip X Positive Z: Same as -flipYPosZ but swaps the X/Z axes.
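For reference, the converter can also be run by hand outside of Deadline. The sketch below invokes ply2vrmesh with two of the flags named in the list above; the executable name and file paths are placeholders (use the executable path configured in the Ply2Vrmesh plugin), and the exact argument order should be checked against the ply2vrmesh documentation for your V-Ray version.

import subprocess

# Convert an OBJ file to a .vrmesh by hand, using two of the flags described above.
# "ply2vrmesh" and the file paths are placeholders; point them at your own installation and files.
subprocess.check_call([
    "ply2vrmesh",        # or the full path configured in the Ply2Vrmesh plugin settings
    "teapot.obj",        # input file
    "teapot.vrmesh",     # output file
    "-smoothNormals",    # generate smooth vertex normals
    "-flipYZ",           # swap the Y/Z axes (e.g. for ZBrush exports)
])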

9.55.2 Plug-in Configuration

You can configure the Ply2Vrmesh plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Ply2Vrmesh plug-in from the list on the left.


Render Executables • Ply2Vrmesh Executable: The path to the ply2vrmesh.exe executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.55.3 FAQ

Which versions of Ply2Vrmesh are supported? Ply2Vrmesh for VRay 2 and 3 are currently supported.

9.55.4 Error Messages and Meanings

This is a collection of known Ply2Vrmesh error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.56 VRay Vrimg2Exr

9.56.1 Job Submission

You can submit Vrimg2Exr jobs from the Monitor. You can use the Submit menu, or you can use a job’s right-click Scripts menu to automatically populate some fields in the Vrimg2Exr submitter based on the job’s output.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The Vrimg2Exr specific options are:


• VRay Image File: The VRay image file(s) to be converted. If you are submitting a sequence of files, you only need to select one vrimg file from the sequence.
• Output File: Optionally override the output file name (do not specify padding). If left blank, the output name will be the same as the input name (with the exr extension).
• Frame List: The list of frames to convert.
• Specify Channel: Enable this option to read the specified channel from the vrimg file and write it as the RGB channel in the output file.
• Crop EXR Data Window: Enable this option to auto-crop the EXR data window.
• Set Gamma: Enable this option to apply the specified gamma correction to the RGB colors before writing to the exr file.
• Store EXR Data as 16-bit (Half): Enable this option to store the data in the .exr file as 16-bit floating point numbers instead of 32-bit floating point numbers.
• Set Buffer Size: Enable this option to set the maximum allocated buffer size per channel in megabytes. If the image does not fit into the max buffer size, it is converted in several passes.
• Convert RGB Data to the sRGB Color Space: Enable this option to convert the RGB data from the vrimg file to the sRGB color space (instead of linear RGB space) before writing to the exr file.
• Set Compression: Enable this option to set the compression type. The Zip method is used by default.
• Delete Input vrimg Files After Conversion: Enable this option to delete the input vrimg file after the conversion has finished.

9.56.2 Plug-in Configuration

You can configure the Vrimg2Exr plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Vrimg2Exr plug-in from the list on the left.


Render Executables • Vrimg2Exr Executable: The path to the vrimg2exr.exe executable file used for rendering. Enter alternative paths on separate lines.

9.56.3 FAQ

Is Vrimg2Exr supported? Yes.

9.56.4 Error Messages and Meanings

This is a collection of known Vrimg2Exr error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.

9.57 Vue

9.57.1 Job Submission

You can submit jobs from within Vue, or you can submit them from the Monitor.


Submitting from Vue

If you are submitting a single frame from within Vue, select Render -> Render Options, then do the following: • Find the Renderer section, select RenderBull/RenderNode Network, then press the Edit button. • In the Options dialog that pops up, enter the submission command described below.


• You can also enter the folder you want the temporary Vue scene file saved in during submission. By default, you should be able to leave this blank. Press OK when finished. • Press Render to bring up the submission dialog.

If you are submitting an animation from within Vue, select Animation -> Animation Render Options, then do the following: • Find the Renderer section, select Network Rendering/RenderNode Network, then press the Edit button. • In the Options dialog that pops up, enter the submission command described below. • You can also enter the folder you want the temporary Vue scene file saved in during submission. By default, you should be able to leave this blank. Press OK when finished. • Press Render Animation to bring up the submission dialog.


This is the submission command used to submit a job from within Vue. Make sure it is entered as one line, and make sure to set the deadlinecommand.exe and repository paths correctly. Note that the last two arguments '10' and '64bit' are optional, and are used to automatically populate the Version and Build settings respectively. Check the Vue submission dialog in the Monitor for the available options for Version and Build.

"[Client Bin Folder]\deadlinecommand.exe" -executescript [Repository]\scripts\submission\VueSubmission\VueSubmission.py "[FILE_PATH]" "[SCENE_NAME]" "[NUM_FRAMES]" 10 64bit


Submission Options

The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options are explained in the Draft and Integration documentation. The Vue specific options are: • Vue File: The Vue scene file to be rendered. • Render animation sequence: Whether or not to render the full animation. • Version: The version of Vue to render with. • Build To Force: Force 32 bit or 64 bit rendering.

9.57.2 Plug-in Configuration

You can configure the Vue plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the Vue plug-in from the list on the left.

Render Executables • Vue Executable: The path to the Vue executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your render nodes.

9.57.3 FAQ

Which versions of Vue are supported?
Vue 6 and later are supported (Infinite and xStream editions).

I have Vue render node licenses, but when I render with Deadline, I get the error "No serial number found".
If you have render node licenses for Vue, you need to use the *RenderNode.exe executable (ie: Vue 9 xStream RenderNode.exe) instead of the StandaloneRenderer.eon executable for rendering.

9.57.4 Error Messages and Meanings

This is a collection of known Vue error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline Support and let us know.

Unable to initialize application - Check render log for more information.
Check the render log for the job to see if this additional information is printed out:

STDOUT: Initializing...Error STDOUT: No serial number found STDOUT: Unable to initialize application. Exiting.

If this is the case, it means that Vue can’t get a license. If you have render node licenses for Vue, you need to use the *RenderNode.exe executable (ie: Vue 9 xStream RenderNode.exe) instead of the StandaloneRenderer.eon executable for rendering.

9.58 xNormal

9.58.1 Job Submission

You can submit xNormal jobs from the Monitor.


Submission Options

The general Deadline options are explained in the Job Submission documentation. The xNormal specific options are: • XML File: The xNormal XML file to render. • Build To Force: Force 32 bit or 64 bit rendering.

9.58.2 Plug-in Configuration

You can configure the xNormal plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the xNormal plug-in from the list on the left.


Render Executables • xNormal Executable: The path to the xNormal executable file used for rendering. Enter alternative paths on separate lines.

9.58.3 FAQ

Is xNormal supported? Yes.

9.58.4 Error Messages and Meanings

This is a collection of known xNormal error messages and their meanings, as well as possible solutions. We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline Support and let us know. Currently, no error messages have been reported for this plug-in.


CHAPTER TEN

EVENT PLUGINS

10.1 Draft

10.1.1 Overview

Draft is a tool that provides simple compositing functionality. It is implemented as a Python library, which exposes functionality for use in Python scripts. Draft is designed to be tightly integrated with Deadline, but it can also be used as a standalone tool. Using Deadline's Draft plugin, artists can automatically perform simple compositing operations on rendered frames after a render job finishes, convert them to a different image format, or generate QuickTimes for dailies.
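To give a rough idea of what a Draft script looks like, here is a minimal sketch that reads each rendered frame and writes it back out as a JPEG. The file paths and frame range are made-up placeholders standing in for the parameters Deadline normally passes to a Draft template, and it assumes the Draft Python module is importable and exposes Image.ReadFromFile and WriteToFile as described in the Draft documentation.

# Minimal Draft sketch: convert a sequence of EXR frames to JPEG previews.
# Paths and the frame range are placeholders for the parameters Deadline passes to templates.
import Draft

in_pattern = "//fileserver/renders/shot010/beauty.%04d.exr"
out_pattern = "//fileserver/renders/shot010/preview.%04d.jpg"

for frame in range(1, 11):
    image = Draft.Image.ReadFromFile(in_pattern % frame)
    image.WriteToFile(out_pattern % frame)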

10.1.2 Submitting Dependent Draft Jobs

When submitting jobs to Deadline through any of our integrated submitters, you now have the option to have Deadline create a dependent Draft Job once the submitted job is done rendering; this is where the Draft Event Plugin comes into play.

The options available here are similar to those discussed in the Draft Plugin section. Although it might appear as though there are fewer options here than in the Monitor submitter, all the same information will get passed to the Draft template. This approach simply allows us to automatically pull much of the needed information directly from the scene file and from information filled in elsewhere in the submitter.


10.1.3 Setup

Since Draft is shipped alongside Deadline, there is not a whole lot of configuration needed for this event plugin to work (beyond simply enabling it). There are, however, options that allow you to select the priority, group and pool to which the Draft event plugin will submit Draft jobs. To access these settings, simply enter Super User mode and select Tools -> Configure Events from the Monitor's menu. From there, select the Draft entry from the list on the left.

The Draft event plugin settings are:
• Enabled: If this event plugin is enabled.
• Draft Pool: The Pool to which the Draft jobs will be submitted. If blank, the original job's Pool will be re-used.
• Draft Group: The Group to which the Draft jobs will be submitted. If blank, the original job's Group will be re-used.
• Draft Limit: The Limit to which the Draft jobs will be submitted. If blank, no Limit will be used.
• Priority Offset: This offset will be added to the original job's priority in order to determine the Draft job's priority.
• Draft Output Folder: The folder in which to put the Draft output, relative to the Draft input folder.

10.2 ftrack

10.2.1 Overview

ftrack is a cloud-based Project Management tool that provides Production Tracking, Asset Management, and Team Collaboration tools to digital studios; see the ftrack website for more information.


Using Deadline’s ftrack event plugin, artists can automatically create new Asset Versions in ftrack when they submit a render Job to the farm. When a Job completes, Deadline will automatically update associated Asset Versions with a proper Status, Thumbnail, and Components (if the output location is known).

10.2.2 Creating Versions

Versions can either be created automatically on submission (using the ftrack Event Plugin), or done manually afterwards.

Automatic Version Creation

When you submit a new job to Deadline, you can have Deadline automatically create a new Asset Version in ftrack. This is done by connecting to ftrack during the submission process and selecting the Asset to which the Job should be tied. The majority of the submission scripts that ship with Deadline include the ftrack connection option. For this example, we will use Nuke, but the process is basically the same for each submission script. First, find the tab or panel with the Integration settings. For Nuke, this is under the Integration tab.


Choose ftrack from the Project Management drop down, and then press the Connect button to bring up Deadline’s ftrack browser. Enter your ftrack Login Name and press Connect. If the connection is successful, Deadline will collect the list of Projects and Tasks you are assigned to. If there are problems connecting, Deadline will try to display the appropriate error message to help you diagnose the problem.



After you have selected a Task and Asset, you must specify a Version Description.


After you have configured the Version information, press OK to return to the Nuke submitter. The ftrack settings will now contain the Version information you just specified. To include this information with the job, leave the Create New Version option enabled. If you want to change the Version name or description before submitting, you can do so without reconnecting to ftrack.


You can now press OK to submit the job. If the ftrack event plugin is configured to create the new version during Submission, the log report from the ftrack event plugin will show the Version’s ID. Otherwise, the Version won’t be created in ftrack until the job completes. You can view the log report for the job by right-clicking on the job in the Monitor and selecting View Job Reports.


Manual Version Creation

You can also create an Asset Version and tie it to a Job after submission, from the Deadline Monitor. To do this, simply right-click on the job and select ‘Scripts’ -> ‘Create FTrack Version’. This will bring up an ftrack browser so that you can connect, pick the appropriate asset, and set a description. After specifying the required information, just press OK and the new Version should be created.


Selecting An Existing Version

Some of our submission scripts can edit an existing ftrack version. For example, our Quicktime submitter allows you to select an existing ftrack Version to upload the movie to when the job completes.


Choose ftrack from the Project Management drop down, and then press the Connect button to bring up Deadline’s ftrack browser. Enter your ftrack Login Name and press Connect. If the connection is successful, Deadline will collect the list of Tasks you are assigned to. If there are problems connecting, Deadline will try to display the appropriate error message to help you diagnose the problem.



After you have selected a Task and Asset press OK to return to the Quicktime submitter. The ftrack settings will now contain the Version information you just specified. To upload the movie file to the selected Version, leave the Create New Version option enabled.


You can now press OK to submit the job. When the job finishes, the rendered movie will automatically be uploaded to the selected Version.

10.2.3 Setup

In order to be able to create versions within Deadline, you must first follow the steps below to set up Deadline's connection to ftrack.


Create API Key

The first thing you need to do is create an API Key in ftrack. This will be used by Deadline to authenticate when connecting to the ftrack API. To create a new API Key, you need to navigate to the ‘API Keys’ page, located under the ‘Security’ header of ftrack’s Settings section. Once this page is displayed, press the ‘Create’ button to create a new key; while you could re-use an existing key, it is recommended that you create a separate one for Deadline.

The name of the key doesn’t matter much (as long as it’s descriptive), but make sure Enabled is set to ‘On’ and that you select the ‘API’ role. Once you’ve filled in all the values, click the ‘Create’ button to finalize the key’s creation.

Once you’ve created the new entry, take note of its ‘Key’ value – you will need this when configuring Deadline in the next step.

Configure Deadline

Once you’ve created an API Key as detailed above, you can now set up the Event Plugin to connect to ftrack. To perform this setup, you need to enter Super User Mode (from the Tools menu), and then select ‘Tools’ -> ‘Configure Events’. Once in the Event Plugin Configuration window, select ‘FTrack’ from the list on the left.


This is where you will configure all the ftrack-relevant settings in Deadline. There are several different categories of settings you can configure; they are described in more detail below.

Options
This section contains general high-level options that control the behaviour of Deadline's ftrack integration.
• Enabled: This will turn Deadline's ftrack integration on/off. In order for this feature to function properly, this must be set to 'True'.
• Create Version On Submission: This setting controls when an Asset Version is created in ftrack. If this is 'True', it will be created when a Job is submitted. If this is 'False', the Asset Version will only be created when the Job is completed.

Connection Settings
This section contains information that Deadline uses to connect to the ftrack API; these settings must be configured properly in order for this feature to work at all.
• FTrack URL: This is the URL which you use to connect to your ftrack installation.
• FTrack Proxy: The proxy you use to connect to ftrack. This is only relevant if you use a proxy; if in doubt, leave this field blank.
• FTrack API Key: This is where you must enter the API Key created in the 'Create API Key' step.

Version Status Mappings
This section contains mappings from Deadline Job Statuses to ftrack Asset Version Statuses. These are not necessary, but if specified, Deadline will update the status of Asset Versions as Deadline Jobs change status (based on the mappings provided).

Rename ExtraInfo Columns

The ftrack integration uses ExtraInfo columns 0-5 to display relevant information about the Asset Versions that are tied to Deadline Jobs. Given that “ExtraInfo0” isn’t exactly a descriptive name for what that column is being used for in this context, many people find it useful to rename these columns to be more descriptive. To do so, you must be in Super User mode and select ‘Tools’ -> ‘Repository Options’. You must then go to the ‘Job Settings’ section, and select the ‘Extra Properties’ tab; from here you’ll be able to change these column names to something more appropriate.

10.3 Puppet

10.3.1 Overview

Puppet is a configuration management system that can be used to keep applications and plugins synched across your render nodes. See the Puppet Labs Website for more information. The Puppet event plugin that ships with Deadline can be used to run a Puppet update on a slave when it starts and when it becomes idle, thus allowing you to keep your render nodes in sync without interrupting jobs that are currently rendering. Note that Puppet must already be configured to work outside of Deadline. Once your Puppet system is set up, you can then enable the Puppet event plugin for Deadline to automatically trigger Puppet updates.

10.3.2 Setup

Some configuration is needed to use the Puppet event plugin. To access these settings, simply enter Super User mode and select Tools -> Configure Events from the Monitor's menu. From there, select the Puppet entry from the list on the left.


The Puppet event plugin settings are: • Enabled: If this event plugin is enabled. • Puppet Path: The path to the Puppet executable file. Enter alternative paths on separate lines. • Verbose: If enabled, the puppet update will have verbose logging enabled.

10.4 Salt

10.4.1 Overview

Salt (or SaltStack) is a configuration management system that can be used to keep applications and plugins synched across your render nodes. See the SaltStack Website for more information. The Salt event plugin that ships with Deadline can be used to run a Salt update on a slave when it starts and when it becomes idle, thus allowing you to keep your render nodes in sync without interrupting jobs that are currently rendering. Note that Salt must already be configured to work outside of Deadline. Once your Salt system is set up, you can then enable the Salt event plugin for Deadline to automatically trigger Salt updates.

10.4.2 Setup

Some configuration is needed to use the Salt event plugin. To access these settings, simply enter Super User mode and select Tools -> Configure Events from the Monitor's menu. From there, select the Salt entry from the list on the left.


The Salt event plugin settings are: • Enabled: If this event plugin is enabled. • Salt Exe: The path to the Salt Executable. Enter alternative paths on separate lines. • Logging: The level of verbose logging Salt will provide.

10.5 Shotgun

10.5.1 Overview

Shotgun is a customizable web-based Production Tracking system for digital studios, and is developed by Shotgun Software. Using Deadline’s Shotgun event plug-in, artists can automatically create new Versions for Shots or Tasks in Shotgun when they submit a render job to the farm. When the job finishes, Deadline can automatically update the Version by uploading a thumbnail and marking it as complete or pending for review.


10.5.2 Creating Versions

Versions can either be created automatically on submission (using the Shotgun Event Plugin), or done manually afterwards.

Automatic Version Creation

When you submit a new job to Deadline, you can have Deadline automatically create a new Version in Shotgun. This is done by connecting to Shotgun prior to submitting the job and choosing the Task that the job is for. The majority of the submission scripts that ship with Deadline include the Shotgun connection option. For this example, we will use Nuke, but the process is basically the same for each submission script. First, find the tab or panel with the Integration settings. For Nuke, this is under the Integration tab.


Choose Shotgun from the Project Management drop down, and then press the Connect button to bring up Deadline’s Shotgun browser. Enter your Shotgun Login Name and press Connect. If the connection is successful, Deadline will collect the list of Tasks you are assigned to. If there are problems connecting, Deadline will try to display the appropriate error message to help you diagnose the problem.



After you have selected a Task, you must specify a Version name and a description. If you have configured Version name templates in the Shotgun event plugin configuration, you can select one from the drop down. You can also manually type in the version name instead.


After you have configured the Version information, press OK to return to the Nuke submitter. The Shotgun settings will now contain the Version information you just specified. To include this information with the job, leave the Create New Version option enabled. If you want to change the Version name or description before submitting, you can do so without reconnecting to Shotgun.


You can now press OK to submit the job. If the Shotgun event plugin is configured to create the new version during Submission, the log report from the Shotgun event plugin will show the Version’s ID. Otherwise, the Version won’t be created in Shotgun until the job completes. You can view the log report for the job by right-clicking on the job in the Monitor and selecting View Job Reports.


Manual Version Creation

To manually create a Version from a completed job, right-click on the job in the Deadline Monitor and select Scripts -> Create Shotgun Version. This will bring up the Shotgun browser so that you can connect, pick the appropriate Task, and specify a Version name and description. After specifying the appropriate information, press OK to create the new version.


Selecting An Existing Version

Some of our submission scripts can edit an existing Shotgun version. For example, our Quicktime submitter allows you to select an existing Shotgun Version to upload the movie to when the job completes.


Choose Shotgun from the Project Management drop down, and then press the Connect button to bring up Deadline’s Shotgun browser. Enter your Shotgun Login Name and press Connect. If the connection is successful, Deadline will collect the list of Tasks you are assigned to. If there are problems connecting, Deadline will try to display the appropriate error message to help you diagnose the problem.



After you have selected a Task, you can select a Version for that Task. Then press OK to return to the Quicktime submitter. The Shotgun settings will now contain the Version information you just specified. To upload the movie file to the selected Version, leave the Create New Version option enabled.

You can now press OK to submit the job. When the job finishes, the rendered movie will automatically be uploaded to the selected Version.


10.5.3 Advanced Workflow Mode

When setting up the Shotgun event plugin, you can enable an Advanced Workflow Mode. This mode allows you to create Versions by selecting a Task, or by selecting a Project and Entity. Studios that don’t use the Task-centric approach will probably find the Advanced Workflow Mode more suitable to their needs.

10.5.4 Setup

Follow these steps to set up Deadline's connection to Shotgun.

Create the API Script in Shotgun

In Shotgun, you must first create a new API script so that Deadline can communicate with Shotgun. This can be done from the Admin menu.


After the Scripts page is displayed, press the [+] button to create a new script, and enter the following information in the window that appears. If you can’t see one or more of the following fields, use the More Fields drop down to show them. • Script Name: deadline_integration • Description: Script for Deadline integration • Version: 1.0 • Permission Group: API Admin

After you have created the new script, click on the deadline_integration link in the Scripts list and note the value in the Application Key field (it's a long key consisting of alphanumeric characters). You'll need this key when configuring Deadline's Shotgun connection in the next step.
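If you want to sanity-check the script name and Application Key before entering them into Deadline, you can do so with the Shotgun Python API (shotgun_api3). This is an optional check, not something Deadline requires; the URL and key below are placeholders for your own site and the key you just noted.

# Optional sanity check of the API script credentials using the Shotgun Python API.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",      # placeholder: your Shotgun URL
    script_name="deadline_integration",
    api_key="0123456789abcdef0123456789abcdef"   # placeholder: the Application Key noted above
)

# Any trivial query will do; bad credentials raise shotgun_api3.AuthenticationFault.
print(sg.find_one("Project", [], ["name"]))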


Configure the Shotgun Connection

After you have created the Deadline API Script in Shotgun, you can now configure the Shotgun event plug-in from the Deadline Monitor. Enter Super User Mode from the Tools menu, and then select Tools -> Configure Events.


The event plugin settings are split up into a few sections. The most important sections are Options and Connection Settings, as these control how Deadline connects to Shotgun. In most cases, the Field and Value Mapping sections can be left alone because they map to fields that exist in a default Shotgun installation. Only studios that have deeply customized their Shotgun installations might have to change the Field and Value Mapping settings.

Options
• Enabled: The Shotgun event plugin must be enabled before Deadline can connect to Shotgun.
• Create Version On Submission: If enabled, Deadline will create the Shotgun Version at the time of submission and update its status as the job progresses. Otherwise, the Version will only be created once the job completes.
• Enable Advanced Workflow: If enabled, the user can select a Project and Entity instead of just a Task.
• Thumbnail Frame: The frame to upload to Shotgun as a thumbnail.
• Convert Thumbnails with Draft: Whether or not to attempt to use Draft to convert the Thumbnail frame prior to upload.
• Thumbnail Conversion Format: The format to convert the Thumbnail to prior to upload (see above).
• Version Templates: Presets for Version names that users can select from (one per line). Available tokens include ${project}, ${shot}, ${task}, ${user}, and ${jobid} (see the short expansion sketch after this settings list). For example:
  – ${project} - ${shot} - ${task}
  – ${project}_${shot}_${task} (${jobid})
• Enable Verbose Errors: Whether or not detailed (technical) error information should be displayed when errors occur while connecting to Shotgun.

Connection Settings
• Shotgun URL: Your Shotgun URL.
• Shotgun Proxy: Your proxy (if you use one).
• API Script Name: The name of the API script you created in Shotgun earlier (deadline_integration).
• API Application Key: The key from the script you created in Shotgun earlier (it's a long key consisting of alphanumeric characters).

Shotgun Field Mappings
These are the Version fields that Deadline expects to exist in Shotgun. The default values match those from a default Shotgun installation, so you will only have to edit these settings if you have customized the Version Field names in your Shotgun installation. Note that some of the Fields you can specify aren't created by default in Shotgun. You will have to manually create those fields in Shotgun and specify their names here if you wish to use them. Examples of such fields are Deadline Job ID and Average/Total Render Time.

Status Value Mappings
These are the Version status values that Deadline expects to exist in Shotgun. The default values match those from a default Shotgun installation, so you will only have to edit these settings if you have customized the Version Status values in your Shotgun installation.

Draft Field Mappings
• Draft Template Field: The field code for a Task field that contains a Draft Template relevant to the Task. If this is specified, Deadline can automatically pull in the specified template at submission time.
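The Version name templates use simple ${token} placeholders, so their behaviour can be illustrated with Python's own string.Template. This is only an illustration of how a template expands; the sample values are made up and this is not Deadline's internal implementation.

# Illustration only: how a Version name template with ${...} tokens expands.
from string import Template

template = Template("${project}_${shot}_${task} (${jobid})")
sample_values = {
    "project": "SpaceChase",
    "shot": "sc010_0040",
    "task": "lighting",
    "user": "jsmith",
    "jobid": "54a1b2c3d4e5f60718293a4b",
}
# Extra keys (such as "user") that the template does not reference are simply ignored.
print(template.substitute(sample_values))
# SpaceChase_sc010_0040_lighting (54a1b2c3d4e5f60718293a4b)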

Test the Shotgun Connection

After you have configured the Shotgun connection, you can test it from the Deadline Monitor by selecting Scripts -> TestIntegrationConnection. This will bring up the Test Integration Connection dialog.


Choose Shotgun from the Project Management drop down, and then press Connect. If the connection is successful, Deadline will collect the list of Tasks you are assigned to. If there are problems connecting, Deadline will try to display the appropriate error message to help you diagnose the problem.


Set up Shotgun Columns in the Deadline Monitor

Deadline uses the job Extra Info properties 0 to 5 for Shotgun specific settings, and you can configure the columns in the Job List in the Monitor to properly represent these settings. In the Monitor, enter Super User mode from the Tools menu, and then select Tools -> Configure Repository Options. Find the Job Settings section and click on the Extra Properties tab. It will show the following:


Rename the Extra Info properties as shown in the following image. After committing these changes, you will now be able to see these Shotgun specific columns in the Job List in the Monitor.


10.5.5 FAQ

Which editions of Shotgun does Deadline support?
Deadline supports the Studio and Partner editions of Shotgun, because those editions include the necessary API access.

Which versions of Shotgun does Deadline support?
Deadline supports Shotgun 2.3 and later.

Which version of the Shotgun API does Deadline use?
Deadline 7.0 ships with version 3.0.17 of the Python Shotgun API.


CHAPTER ELEVEN

CLOUD PLUGINS

11.1 Amazon EC2

11.1.1 Overview

The Amazon EC2 plugin for Deadline allows for communication between Deadline and the EC2 service. It works with both the Cloud Panel in the Monitor and the Deadline Balancer application.

11.1.2 Configuration

Before you can configure the Amazon EC2 plugin for Deadline, you must add Amazon as a provider in the Cloud Providers dialog in the Monitor. The Amazon EC2 plugin requires only a few credentials before it can be used in Deadline. These can be collected from the Amazon EC2 web site (see the image below).


Configuration Settings

Options
• Enabled: Enables the region.
• Access Key ID: Your EC2 access key.
• Secret Access Key: Your EC2 secret key.
• Region: The EC2 region you want to use.
• Account Number: Your EC2 account number. Used to filter the image list.
• Instance Types: List of the Hardware Types used on EC2. Make sure you use types that are supported by Amazon. You can find a list of them here.

Customization
• Instance Name: The name of the instances that are started by the Balancer. We add some random hex values to the end for uniqueness.

VM Configuration
• Key Pair Name: The Key Pair to be used for the instance.
• Subnet ID: ID of the Subnet to start instances in.

11.2 Google Cloud

11.2.1 Overview

The Google plugin for Deadline allows for communication between Deadline and the Google Cloud service. It works with both the Cloud Panel in the Monitor and the Deadline Balancer application.

11.2.2 Configuration

Before you can configure the Google Cloud plugin for Deadline, you must add Google Cloud as a provider in the Cloud Providers dialog in the Monitor. The Google plugin requires only a few credentials before it can be used in Deadline (see the image below). You can download the Client Secrets file from the API->Credentials section of the Google Compute console.


Configuration Settings

Credentials
• OAuth 2.0 Data: Path to your oauth2.dat file. This won't exist until you Verify Access (or try to use the plugin) for the first time. The API will download it after you grant it access.
• Client Secrets: The Client Secrets file you downloaded from the Google Cloud Console.
• Project: Name of your Google Cloud Project.

Options
• Enabled: Enables the region for use in Deadline.
• Zone: The Google zone to connect to.
• Port: The port number used for the connection.

Customization
• Instance Name: Name used when starting new instances. We add some random hex values to the end for uniqueness.

11.3 Microsoft Azure

11.3.1 Overview

The Azure plugin for Deadline allows for communication between Deadline and the Azure service. It works with both the Cloud Panel in the Monitor and the Deadline Balancer application.

11.3.2 Configuration

Before you can configure the Azure plugin for Deadline, you must add Azure as a provider in the Cloud Providers dialog in the Monitor. The Azure plugin requires only a few credentials before it can be used in Deadline (see the image below). You’ll also have to create and upload a Management Certificate.


Configuration Settings

General
• Enabled: Enables the Cloud Region for use.

Credentials
• Subscription ID: Your access ID for your Azure account.
• Certificate Path: Path to your Azure Certificate.
• VHD Blob Storage: The URL of your Blob Storage.
• Blob Storage Password: Password for Blob Storage, if you have one.

Customization
• Instance Name: Name used when starting new instances. We add some random hex values to the end for uniqueness.

VM Config
• Affinity Group: The Affinity Group to start instances in. Can be used instead of Location.
• Location: The Location to start instances in. Can be used instead of Affinity Group.
• Virtual Network: The virtual network that the instance will be a part of.
• Subnet Name: Name of the subnet that the instance will be in.
• VM Login User: The user name used to log in to the instance.
• VM Login Password: The password used to log in to the instance.


11.4 OpenStack

11.4.1 Overview

The Openstack plugin for Deadline allows for communication between Deadline and an Openstack server. It works with both the Cloud Panel in the Monitor and the Deadline Balancer application.

11.4.2 Configuration

Before you can configure the OpenStack plugin for Deadline, you must add OpenStack as a provider in the Cloud Providers dialog in the Monitor. The Openstack plugin requires only a few credentials before it can be used in Deadline (see image below).

Configuration Settings

Options
• Enabled: Enabling the plugin makes it visible in the Cloud Panel.
• User Name: Your Openstack user name.
• Password: The password for your Openstack account.
• Keystone Endpoint: The endpoint of the Openstack server. This is listed as Identity in the Access & Security section of the Openstack project.
• Tenant Name: The Project Name.

Customization
• Instance Name: The name of newly created instances. We add some random characters on the end for uniqueness.


11.5 vCenter

The vCenter plugin for Deadline allows for communication between Deadline and a vCenter server. It only works with the Cloud Panel in the Monitor. It does not work with the Deadline Balancer application.

11.5.1 Configuration

Before you can configure the vCenter plugin for Deadline, you must add vCenter as a provider in the Cloud Providers dialog in the Monitor. The vCenter plugin requires only a few credentials before it can be used in Deadline.

Configuration Settings

Options • Enabled: Enables the Cloud Region for use in Deadline. • vCenter Server: The name of the vCenter Server you want to connect to. • User Name: vCenter user name. • Password: Password for vCenter. Customization • Instance Name: Name used when starting new instances. We add some random hex values to the end for uniqueness.

CHAPTER TWELVE

RELEASE NOTES

12.1 Deadline 7.0.0.54 Release Notes

12.1.1 Overview

Deadline 7 is the latest version of Thinkbox Software's scalable high-volume compute management solution. It features built-in VMX (Virtual Machine Extension) capabilities, which allow artists, architects and engineers to harness resources in both public and private clouds. In addition to enhanced cloud support, Deadline 7 expands support for the Jigsaw multi-region rendering feature, which can now be accessed in 3ds Max, Maya, modo, and Rhino. Deadline 7 also introduces Draft 1.2, an update to Thinkbox's lightweight compositing and video processing plug-in designed to automate typical post-render tasks such as image format conversion as well as the creation of animated videos and QuickTimes, contact sheets, and watermark elements on exported images. Finally, Deadline 7 introduces a wealth of new features, enhancements, and bug fixes, which are detailed below. Note that a new 7.0 license is required to run this version. If you have a license for Deadline 6.2 or earlier, you will need an updated license. In addition, the version of Draft that ships with Deadline 7 needs a new 1.2 license. If you have a license for Draft 1.1 or earlier, you will need an updated license.

12.1.2 Highlighted Features

VMX (Virtual Machine eXtension)

With built-in VMX (Virtual Machine eXtension) and pluggable cloud support, Deadline 7 can interact with private and public cloud solutions out of the box, including Amazon EC2, Microsoft Azure and OpenStack, among others. The new Deadline Balancer application can start and shut down virtual instances on demand based on the jobs in the queue, the current budget settings, or other custom algorithms. Multiple cloud solutions can be used simultaneously, along with classic non-cloud render node and workstation rendering.

Updated to MongoDB 2.6.3

Deadline now ships with MongoDB 2.6.3, with version 2.6.1 being the new minimum requirement for Deadline 7. Deadline utilizes MongoDB’s new timestamp feature to significantly reduce the number of write queries performed during normal operation. Not only does this improve performance under heavier loads, but it also allows Deadline to support MongoDB’s Sharding feature. Sharding can be used to create a cluster of MongoDB instances, allowing the database server to scale horizontally by adding more nodes to the cluster. Deadline’s Replica Set support has been improved as well. Previously, you had to specify each node in your Replica Set when specifying the MongoDB server name. Now, you can also include the Replica Set name.


Updated User Interface

Deadline’s User Interface libraries have been updated to Qt 5, and the Deadline applications now use Qt’s new Fusion theme for a more modern look and feel. The Fusion theme provides better scaling at larger resolutions, and it also provides more color contrast. The Monitor also uses new progress bars to show the progress for jobs. The progress bars show the state of every task for the job, not just the complete versus incomplete tasks. This allows you to see the overall state of all the tasks at a glance. Finally, updating to Qt 5 also addresses issues that Qt 4 had with Wacom tablets.

Python Upgraded to 2.7.8

Deadline now ships with Python 2.7.8. Note that this shouldn’t affect any existing scripts that you use with Deadline. In addition, the Deadline applications no longer set the PYTHONHOME and PYTHONPATH environment variables for their current session. This means that any applications launched from a Deadline application will no longer inherit these modified variables, which should avoid compatibility issues if those other applications use a different version of Python.

Draft Upgraded to 1.2.3.57201

Deadline now ships with Draft 1.2.3.57201. Note that this shouldn’t affect any existing Draft template scripts that you use with Deadline. Also note that if you are using Draft 1.1 or earlier, you will need an updated Draft license. Below is a list of what’s new in Draft 1.2.3.57201:

Python Version
• The Python version that Draft requires is now Python 2.7.
FFmpeg Version
• FFmpeg libraries have been updated to version 2.3.
OpenColorIO Improvements
• Use config.ocio and ColorSpaces / Roles to create OCIO color processors for color correcting images.
• Create OCIO color processors directly from your favourite LUT files... see http://opencolorio.org/FAQ.html for the full list of LUT formats supported.
ASC CDL Improvements
• A fully standard-compliant implementation of ASC CDL LUTs. (The clamping step in OCIO’s ASC CDL implementation is not currently standard-compliant.)
• Added ASC CDL and OCIO LUT example templates.
WebM Improvements
• Added support for WebM files (VP8 video codec, Vorbis audio).
EXR Improvements
• Improved error message when trying to open an EXR file that doesn’t exist.
Unicode Improvements
• Draft now supports unicode filenames and text annotations (see the sketch after this list)!
• Note: We need to modify the DraftParamParser.py library so that unicode strings aren’t mangled at the Deadline/Draft boundary, but once they’re in, Draft handles them properly.
Licensing Improvements
• Draft licenses are now more flexible! Most Draft features require only that a license be present. Actual checkout of licenses now happens only while videos are being encoded or decoded.
• “Lost connection to license server” no longer pops up dialog boxes on Windows.
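To illustrate the unicode support, here is a minimal sketch of a standalone Draft script. It assumes the Draft Python module that ships with Deadline is on the Python path; the file names are hypothetical.

    # -*- coding: utf-8 -*-
    # Minimal sketch: read an EXR with a unicode filename and write out a PNG copy.
    import Draft

    in_path = u"renders/séquence_0001.exr"    # hypothetical unicode path
    out_path = u"renders/séquence_0001.png"

    img = Draft.Image.ReadFromFile(in_path)
    img.WriteToFile(out_path)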

Mono Upgraded to 3.8

Deadline now runs against Mono 3.8 on Linux and Mac OSX, which helps improve stability. In addition, the Mac OSX version of Mono is now 64-bit. This new version is bundled with the Linux and Mac OSX Client and Repository installers.

Mono Included in Linux Installers

Mono is now installed automatically as part of the installation procedure on Linux. It is installed to the Deadline installation folder, and won’t impact any existing Mono installations. Now Mono no longer needs to be installed manually on Linux prior to installing Deadline.

Updated Slave Licensing Model

When running multiple slaves on a single machine, they will now share a single license instead of needing one license per slave instance. In addition, the slaves will only hold onto their license while they are rendering. When they become idle, they will return their license.

Customizable Styles for Deadline Applications

The new Styles configuration panel in the Monitor options allows you to customize the color of the Deadline applications. Simply specify a palette color and the User Interface will automatically use lighter and darker variants of that color where necessary. In addition, the font style and size can be configured as well. Finally, you can export styles and share them with other users.

New Batch Property for Grouping Jobs

A new Batch property has been added to jobs that allows jobs to be grouped together in the Job List. All jobs with the same Batch name will be grouped under that Batch name, and the Batch name can be expanded or collapsed to show and hide all the jobs, respectively. Jobs in the same Batch will also be grouped together in the Job Dependency View. Finally, the properties for the jobs in the same Batch can be modified by simply right-clicking on the Batch item in the Job List or the Job Dependency View.
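For reference, when submitting jobs manually (see the Manual Job Submission chapter), the Batch name can be set in the job info file. The sketch below assumes the job info key is named BatchName; the plugin, job name, and frame range are hypothetical.

    Plugin=MayaBatch
    Name=Shot010_beauty
    BatchName=Shot010
    Frames=1-100

Every job submitted with the same BatchName value would then be collapsed under a single Batch entry in the Job List.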

New Graphs in the Monitor

New graphs have been added to the Monitor. The Jobs panel can show pie charts based on the job pool, secondary pool, group, user, and plugin. The Tasks panel can show graphs representing the task render times, image sizes, CPU usage, and memory usage. The Slaves panel can now show bar charts that show how many slaves are in certain pools and groups. The Job Reports panel can now show a pie chart that shows the percentage of errors generated by each slave.


Customizable Default Layout for Panels in Monitor

A default layout for panels in the Monitor can now be saved, and when a new panel is opened, it will use the saved default layout. So now you can set up your favourite default layouts for the Job list, Task list, etc., and not have to worry about setting them up again when you open new panels. In addition, you can now save the layout from a panel to disk and load it in again. This allows you to share a layout from your Monitor with someone else.

Job Dependency Improvements

Job dependencies are now more flexible than ever. Individual dependencies can have notes attached to them, and they can also have their own overrides for the Frame Offset and Resume On... settings. The Job Dependency view in the Monitor has also been updated to show these per-dependency settings. In addition, there is now a new feature in the Dependency View that allows you to test the dependencies and see which ones pass and which ones do not. Finally, the look of the nodes in the Dependency View has been updated.

Limit Improvements

Limits are now much more flexible than they were before. Previously, one Limit Stub per Slave was used up when a Slave rendered a job that required that Limit. This is still supported, but now, a Limit can be configured so that one Limit Stub per Task is used up, or one Limit Stub per Machine is used up. The per Task option is useful if you are rendering with an application that requires one license per instance, and you are rendering more than one concurrent task at a time. The per Machine option is useful if you are rendering with an application that only requires a single license per machine, regardless of how many instances are running on that machine.

Improvements to Pool and Group Management

The Slave list in the Pool and Group Management dialogs can now be filtered, and all columns in the list are now available. In addition, you can now right-click on specific slaves in the Slave list in the Monitor to modify Pools and Groups for the selected slaves only.

Suspend Tasks

Deadline now supports the ability to suspend and resume individual tasks. This can be useful if you want to postpone or skip the rendering of specific tasks.

Slave Scheduling Improvements and Idle Detection

Deadline’s Slave Scheduling feature has undergone a major overhaul. Previously it was part of Power Management and controlled by Pulse, but now it is a standalone feature that is controlled by the Launcher application that runs on every Client machine. This means that Pulse is no longer required to use the Slave Scheduling feature. There are also new features that have been added to Slave Scheduling. If a slave is scheduled to start on a machine, a notification message will now pop up for 30 seconds indicating that the slave is scheduled to start. If someone is still using the machine, they can choose to delay the start of the slave for a certain amount of time. Another addition is the new option to enforce the slave schedule. If enabled, the Launcher will keep restarting the slave if it is shut down during a period of time that it is supposed to be running.


Finally, Slave Scheduling can now be configured to launch the slave if the machine has been idle for a certain amount of time (“idle” means no keyboard or mouse input). There are also additional criteria that can be checked before launching the slave, including the machine’s current memory and CPU usage, the current logged in user, and the processes currently running on the machine. This system can also stop the slave automatically when the machine is no longer idle. Note that Idle Detection can be set in the Slave Scheduling settings, or on a per-slave basis in the Slave Settings dialog in the Monitor. It can also be set in the new Local Slave Control dialog so that users can configure if their local slave should launch when the machine becomes idle.

Job Dequeuing Mode

Slaves now have a new Job Dequeuing mode that controls which jobs a slave dequeues based on how the job was submitted. By default, a slave will dequeue any job, but it can be configured to only dequeue jobs submitted from the same machine that the slave is running on, or submitted by specific users. The Job Dequeuing Mode can be configured in the Slave Settings dialog in the Monitor. It can also be set in the new Local Slave Control dialog so that users can configure if their local slave should only render their own jobs, or if they want to help another user render their jobs.

Local Slave Controls

The Monitor and Launcher applications now have a new dialog that can be used to control the slave running on the local machine. It can be used to start and stop the slave, or connect to the slave’s log. This is useful if the slave is running as a service on the machine. In addition, you can set up the slave to launch if the machine has been idle for a certain amount of time (“idle” means no keyboard or mouse input). It can also stop the slave automatically when the machine is no longer idle. Finally, the slave’s Job Dequeuing Mode can be configured here. By default, a slave will dequeue any job, but it can be configured to only dequeue jobs submitted from the same machine, or submitted by specific users. This is useful if a user wants their slave to only render their jobs, or they want to help another user render their jobs. Note that the Idle Detection and Job Dequeuing Mode settings can also be changed by administrators for all slaves. In addition, the Local Slave Controls feature can be disabled by administrators if they don’t want users to be able to control their local slaves.

Render As User

A new option has been added to Deadline to render jobs with the account that is associated with the job’s user. The account information can be configured in the Deadline user settings. On Windows, the user’s login name, domain, and password are required. On Linux and Mac OSX, just the user’s login name is required, but the Slave must run as root so that the Slave has permission to launch the rendering process as another user.

Improved Slave Statistics

Additional statistical information is now gathered for individual slaves, including the slave’s running time, rendering time, and idle time. It also includes information about the number of tasks the slave has completed, the number of errors it has reported, and its average Memory and CPU usage. As with job statistics, Pulse does not need to be running to gather this information.


Pulse Redundancy

You can now run multiple instances of Pulse on separate machines as backups in case your Primary Pulse instance goes down. If the Primary Pulse goes offline or becomes stalled, Deadline’s Repository Repair operation can elect another running instance of Pulse as the Primary, and the Slaves will automatically connect to the new Primary instance. Note that when multiple Pulse instances are running, only the Primary Pulse is used by the Slaves for Throttling. In addition, only the Primary Pulse is used to perform Housecleaning, Power Management, and Statistics Gathering. However, you can connect to any Pulse instance to use the Web Service.

New Events and Asynchronous Job Events

New events have been added to the Event Plugin system. The first is the OnHouseCleaning event, which triggers whenever Deadline performs Housecleaning. This allows you to set up event plugins to do custom cron-job style operations within Deadline.

In addition, there are four new events that trigger when a slave changes state: OnSlaveStarted, OnSlaveStopped, OnSlaveRendering, and OnSlaveStartingJob. As an example, an event plugin could be written to have slaves automatically add themselves to Groups when they start up based on some custom criteria, or an event plugin could be written to have slaves perform maintenance checks when they become idle.

Finally, there is now an option to process many types of job events asynchronously. The benefit is that job events will no longer slow down batch operations in the Monitor (for example, deleting 1000 jobs will be much faster if you are using event plugins because those events will be processed later). These job events are queued up in the Database and Deadline’s Pending Job Scan will process them at regular intervals. Because they are placed in a queue, they will still be processed in the same order that they were triggered. Note that if this option is enabled, some events are still processed synchronously, like the OnJobSubmitted and OnJobStarted events.
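As a rough illustration, a minimal event plugin that hooks the new OnHouseCleaning and OnSlaveStarted events might look like the sketch below. The plugin name and log messages are hypothetical, and the callback signatures shown are assumptions; see the Event Plugins chapter for the authoritative structure.

    # MyCustomEvent.py - minimal event plugin sketch using the new Deadline 7 events.
    from Deadline.Events import DeadlineEventListener

    def GetDeadlineEventListener():
        return MyCustomEventListener()

    def CleanupDeadlineEventListener(eventListener):
        eventListener.Cleanup()

    class MyCustomEventListener(DeadlineEventListener):
        def __init__(self):
            # Hook up the new callbacks introduced in Deadline 7.
            self.OnHouseCleaningCallback += self.OnHouseCleaning
            self.OnSlaveStartedCallback += self.OnSlaveStarted

        def Cleanup(self):
            del self.OnHouseCleaningCallback
            del self.OnSlaveStartedCallback

        def OnHouseCleaning(self):
            # Runs every time Housecleaning is performed (cron-job style work goes here).
            self.LogInfo("Housecleaning was performed")

        def OnSlaveStarted(self, slaveName):
            # Runs when a slave starts up (assumed to receive the slave name).
            self.LogInfo("Slave '%s' has started" % slaveName)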

Auto Configuration Overhaul

The Auto Configuration feature has undergone a couple of significant changes. The first is that all Deadline applications can now pull the Auto Configuration settings, instead of just the Slave. This means that Auto Configuration can now be used to automatically configure workstations, not just render nodes. The second change is with how Auto Configuration works. Previously, all Auto Configuration settings were pulled from Pulse. Now, only the Repository Path is pulled from Pulse, and the other settings are pulled when the Deadline application connects to the Repository. The benefit to this is that most of the Auto Configuration settings will work without Pulse running. Finally, Auto Configuration rule sets can now be enabled or disabled, so you no longer have to delete a rule set if you want to remove it temporarily.

Region Awareness

Regions can now be configured in Deadline, and users and slaves can be assigned to a specific region. Currently, this is useful for Path Mapping, and allows you to map paths differently based on the region that the users or slaves are in. Note that when VMX launches a slave, it will automatically be added to the region associated with the cloud provider settings.

Grid-Based Script Dialogs

New grid-based functions have been added to the DeadlineScriptDialog class, which make it easier to create custom dialogs. Instead of setting the width and height when adding new controls to a row, you can instead add them to a grid and indicate which row and column the control should go in. Optionally, you can also indicate how many rows and columns the control should occupy. By being part of a grid, the controls will now grow and shrink dynamically based on the size of the dialog and the size of the font.
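A minimal sketch of a Monitor script using the grid-based functions is shown below. The control names, labels, and values are hypothetical, and the import path assumes the standard Deadline 7 Monitor scripting environment; see the Monitor Scripts chapter for the full API.

    # Minimal sketch of a grid-based script dialog for a Monitor script.
    from DeadlineUI.Controls.Scripting.DeadlineScriptDialog import DeadlineScriptDialog

    def __main__(*args):
        scriptDialog = DeadlineScriptDialog()
        scriptDialog.SetTitle("Grid Example")

        scriptDialog.AddGrid()
        # Controls are placed by (row, column) instead of explicit widths and heights.
        scriptDialog.AddControlToGrid("NameLabel", "LabelControl", "Job Name", 0, 0)
        scriptDialog.AddControlToGrid("NameBox", "TextControl", "Untitled", 0, 1)
        scriptDialog.AddControlToGrid("FramesLabel", "LabelControl", "Frame List", 1, 0)
        scriptDialog.AddControlToGrid("FramesBox", "TextControl", "1-100", 1, 1)
        scriptDialog.EndGrid()

        scriptDialog.ShowDialog(True)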

FTrack Integration

The Deadline/FTrack integration enables a seamless render and review data flow. When Deadline starts a render, an Asset Version is automatically created within FTrack using key metadata. When the render is complete, Deadline automatically updates the created Version appropriately – a thumbnail image is uploaded, components are created from the Job’s output paths (taking advantage of FTrack’s location plugins), and the Version is flagged for Review. In doing so, Deadline provides a seamless transition from Job Submission to Review process, without artists needing to monitor their renders.

Jigsaw for Maya, modo, and Rhino

Jigsaw, which was previously only available for 3ds Max, is now available for Maya, modo, and Rhino. It gives you more control over the tiles and/or regions that you are submitting to Deadline. This feature uses Thinkbox Software’s Draft library to assemble the final image instead of the old TileAssembler.exe application. Note that Draft requires a license, so contact Thinkbox Sales if you don’t already have a Draft license.

Submission Script Installers

Submission script installers can now be found in each application folder in the Submission folder in the Repository. These allow for most of the submission scripts to be installed automatically, instead of having to manually copy over files.

Support for Salt and Puppet

Application and Event plugins have been added to support the Salt and Puppet automation applications. Jobs can be submitted to the application plugin to update software and machine configurations on specific machines, while the event plugins can be used to update all of your machines when the slave running on them becomes idle.

Updated Application Support

Support has been added for After Effects CC 2014, Arnold for Houdini, Cinema 4D 16, Corona, Fusion 7, Nuke 9, Realflow 2014, and SketchUp 2015.

12.1.3 Complete Release Notes

General Improvements

• Added the new VMX (Virtual Machine eXtension) system to Deadline.
• Upgraded Python to version 2.7.8.
• Added FTrack support, and updated many job submission scripts to connect to FTrack.
• Added new event to Event Plugins that triggers every time Housecleaning is performed. This is useful for performing custom cron-job style operations within Deadline.
• Added new events to Event Plugins that trigger when a slave starts, stops, starts rendering, and becomes idle.


• Added option to process many of the job events asynchronously to improve performance (particularly in the Monitor).
• Added application and event plugins for Puppet and Salt automation applications.
• Users, slaves, and pulse can now be added to regions, which affects how Path Mapping is performed for them (regions can be configured in the Repository Options).
• Path mapping can now be associated with regions so that different path mappings can be set for different regions.
• There is now an option in slave scheduling to keep the slave running during scheduled hours.
• Housecleaning and the Pending Job scan are now performed on a more regular basis by the Slaves when Pulse isn’t running.
• During the Pending Job Scan, the task dependency check now handles a missing __main__ function in the dependency script properly.
• Fixed a typo where the Pending Job Scan would refer to itself as Housecleaning.
• Fixed an encoding issue when saving and loading job and slave reports.
• Added new slave statistics gathering that logs more information about individual slaves.
• Added new vCenter Cloud plugin.
• Limits can now be configured with different usage levels. They can be per task, per slave, or per machine. Previously, they could only be per slave.
• Bumped up the maximum thread/CPU setting limit in the submission scripts.
• The Deadline temp folder on the Client machines now gets cleaned up on a regular basis.
• Split out the critical Housecleaning operations into a new Repository Repair operation (orphaned task and limit stub checking, stalled slave checking, and available DB connection checking).
• The randomness of the housecleaning checks has been removed to make the system more reliable and predictable.
• Fixed some cases where timestamps were still using 12 hour clocks.
• Added IP address/hostnames to the power management logging.
• Fixed a bug that prevented Deadline from shutting down an OSX machine.
• Most integrated submitter client scripts now print out where they’re getting the main script file from prior to running the main script.
• Housecleaning can now detect if a task is waiting to start, but the slave hasn’t updated its state to show that it’s rendering that task.
• Fixed a bug that prevented the timeout from triggering when running the housecleaning operations as separate processes.
• Added an option for splitting the output from the different housecleaning operations to separate logs.
• Fixed how the timestamps look when connecting to a remote slave/pulse/balancer log.
• Job event triggers now fire properly when changing states of individual tasks.
• Improved performance when checking pending jobs with frame dependencies.
• The Deadline applications no longer set the PYTHONHOME and PYTHONPATH environment variables for their current session.
• The error message that is displayed when auto-archiving a job fails now shows the job ID instead of the job name.


• Housecleaning only loads event plugins once when deleting or archiving completed jobs.
• When purging jobs in housecleaning, the event plugins are only loaded once per batch.
• Added a Machine Startup option in Power Management to not send the command to the machine to launch the slave.
• Added user group permission option to disable job submission (enabled by default).
• Added stalled Pulse and Balancer detection to housecleaning.
• Removed a misleading message that was printed when getting the user from deadline.ini and one wasn’t defined yet.
• Housecleaning, pending job scan, and repository repair are no longer run as a separate process by default.

Installer Improvements

• Mono is now shipped with the Linux installers, so it is no longer required for Mono to be installed prior to installing Deadline.
• Added the major version number to the shortcuts created on Windows, and to the uninstaller shortcuts created on all operating systems.
• Added command line option to Client installer to set the NoGuiMode setting.
• When setting up the database, the Repository installer now checks to make sure the database version is the minimum supported version.
• The Repository installer now checks to make sure it’s not installing over an existing repository that’s a different version.
• The Repository installer now sets the default database name to include the major Deadline version number.
• The Repository installer now creates a repository.ini file in the repository install directory which contains the Version information.
• The Windows Repository installer now ships with both the standard and legacy versions of MongoDB. The standard version will be installed on Windows Server 2008 R2 and later, and the legacy version will be installed on older versions of Windows.
• The Repository uninstaller now removes all subfolders except for the “custom” one.
• Fixed a bug in the Client Installer that was causing the license server entry to be reset if the repository directory was invalid.
• The MongoDB service name and port can now be customized in the Repository installer, and their defaults are based on the current Deadline version.
• The Windows client installer now creates a DeadlineLauncher# registry key to start the Launcher on login (where # is the major version number). This allows different versions of the Launcher to start on login.
• Fixed a bug in the Repository installer that was causing “Password:” to be set for the user name in dbConnect.xml on OSX.
• Fixed some errors when running the Repository installer in unattended mode.
• Installers on OSX are now signed with codesign v2 so that Gatekeeper doesn’t flag them on OSX 10.9.5.
• The replica set name and MongoDB password fields in the Repository installer are now wider.
• The Mono.Posix and Mono.Security dlls are no longer installed with the Linux version of Deadline.
• The api, balancer, cloud, and draft folders in the repository are now backed up during an upgrade.


• Windows installers are now code-signed.
• The settings folder is now backed up by the Repository Installer.
• The “slavedatadir” command line option for the client installer is now visible in the usage instructions.

Repository Improvements

• Archived jobs are now stored in subfolders based on the year and month they are submitted.
• Job reports are now stored in a subfolder with the job’s ID, which improves performance when deleting reports for a job.
• The License Server is no longer installed in the Repository. It can be downloaded from the Thinkbox website.
• There are now submission script installers in each application folder in the Submission folder in the Repository.
• Lock files are no longer used in the Repository to ensure that operations like Housecleaning and Repository Repair are only done by one application at a time.
• There are now separate 32 and 64 bit versions of the Windows bin.zip file in the repository. This is so that we can ship platform-specific libraries as part of the auto-upgrade in the future if necessary.

Database Improvements

• Upgraded minimum MongoDB requirement to 2.6.1 (although the Repository installer ships with 2.6.3).
• Deadline now uses MongoDB’s new timestamp feature to reduce the number of write operations it performs.
• Split many collections into separate databases to improve performance.
• Using the new timestamp feature allows Deadline to support Sharding.
• A Replica Set Name can now be specified when configuring the database connection settings.
• Improved how passwords are saved in the database.
• When saving a new job, if there is a job with that ID in the Deleted Job Collection, it is now removed from the Deleted Job Collection.
• A config file is now installed to the Database folder, and this can be modified to configure how MongoDB runs.
• Unexpected MongoDB exceptions now include the stacktrace and exception type.
• Reduced the number of database writes that occur when deleting jobs, slaves, pulses, balancers, and limits.
• Reduced bandwidth when checking if a job or slave exists in the database.
• Fixed a bug where too many asynchronous calls to the database could result in connection errors.
• When adding history entries, the saving of the new entries and the purging of the old ones is now done in one query instead of two.
• Added a “locking” collection to the database that is used instead of lock files to ensure that operations like Housecleaning and Repository Repair are only done by one application at a time.

Job Improvements

• Individual tasks for jobs can now be suspended or resumed.
• Added ability to render jobs using the account for the user that submitted the job.
• Job dependencies are more flexible, and can have per-dependency overrides and notes.


• Added new OnTaskTimeout option to mark a task as complete.
• Added optional timeout option for the Starting phase of a job.
• Added a job timeout option to calculate the task timeout based on the number of frames for the current task.
• Jobs with custom plugin locations specified no longer need the custom plugin to be in the repository to be submitted and resubmitted.
• Sequential jobs are no longer dropped for higher priority jobs. Once a slave picks up a sequential job, it will keep rendering it until the job is complete or the render is canceled.
• Added a job task buffer value that can be applied to balanced or weighted algorithms to help prevent slaves from jumping between jobs to keep things balanced.
• Fixed a bug where pre job scripts were not necessarily finishing before regular tasks were started.
• Failed jobs with a post job script no longer remain stuck in the queued state.
• Added option to submit a job with a start time delay by specifying a JobDelay=dd:hh:mm:ss value in the job info file. The delay value is represented by the number of days, hours, minutes, and seconds, all separated by colons (see the example after this list).
• Fixed a bug when undeleting a job that could cause the job’s task counts to be incorrect.
• Jobs now have a limit on the number of error reports that can be generated. A job with the maximum number of errors will fail and cannot be resumed until some reports are deleted. This number is configurable in the Job Settings in the Repository Options.
• Fixed a bug that could cause interruptible jobs to be interrupted for another job of equal priority.
• Improved the performance of how job tasks are updated in some cases.
• Added support for jobs to have their own custom event plugin directory to load event plugins from.
• Added a job option to override the number of days before the job is automatically cleaned up.
• Added an option to completely override auto-cleanup settings, which means you can choose to disable auto-cleanup for a job if it’s enabled in the Repository Options.
• A history event is now logged when a job is failed because it reached the error limit.
• If a job with a post job task is frame dependent, the post job task now only gets released if all the other tasks are complete. This fixes the problem of the job showing up as Queued in the Monitor because the post job task is queued, but the rest of the tasks are a combination of pending/completed/failed.
• During job submission, if the job’s user doesn’t exist, default user settings are now created for them.
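For reference, a minimal job info file using the new JobDelay property might look like the sketch below; the plugin, job name, and frame range are hypothetical. The delay value shown means 0 days, 2 hours, 30 minutes, and 0 seconds.

    Plugin=CommandLine
    Name=Delayed_Example_Job
    Frames=1-10
    JobDelay=00:02:30:00

The file would then be submitted together with a plugin info file via deadlinecommand, as described in the Manual Job Submission chapter.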

Client Application Improvements

• Upgraded user interface libraries to Qt 5, which fixes some known Wacom Tablet issues.
• The Deadline applications now use the new Qt Fusion theme for a more modern and scalable look.
• All Deadline applications can now pull the Auto Configuration settings, instead of just the slave.
• All Auto Configuration settings, except for the Repository path, are now pulled directly from the Repository. Only the Repository Path is still pulled from Pulse.
• The color and font used in all Deadline applications can now be customized from the Monitor.
• All Deadline application command line arguments now support any number of leading dashes (for example, “deadlinemonitor -console” or “deadlineslave --help”).


• Added a NoGuiMode setting to the deadline.ini file. It’s set to False by default, but if True, then the launcher, slave, and pulse will always run in nogui mode, regardless of whether the -nogui flag is passed.
• All logs for the Deadline applications and for jobs now have timestamps.
• The LaunchPulseAtStartup and LaunchBalancerAtStartup settings are now stored in the system deadline.ini file, not the per user one.
• The Monitor, Pulse, and Balancer listening ports and process IDs are now stored in separate ini files, not the system deadline.ini file. This means that a symlinked deadline.ini file can now be shared between multiple machines.
• Added the major version number to the app packages on OSX.
• Fixed some bad logic when the applications try to determine if they should run in GUI mode or not.
• Fixed a typo in the dbConnect.xml error that would be shown if the Client application couldn’t find or read the dbConnect.xml file.
• The look of the disabled text in labels now matches Qt’s default look.
• On OSX, any popups that appear when the splash screen is visible now appear in front of the splash screen.
• On Windows, a task bar item is now visible when the splash screen is visible.
• Improved the Connection Error message when a Deadline application cannot connect to the Repository or Database.
• Menus that are too long for the screen are now scrollable.

Launcher Improvements

• The Launcher now controls the scheduled starting and stopping of slaves.
• The Launcher displays a popup message when a slave is scheduled to start, allowing a user to delay launching the slave if they are still using the machine.
• The Launcher can detect if the system is idle and launch the slave. It can also stop the slave when the system is no longer idle.
• Added new Local Slave Settings dialog to the Launcher menu to control the local slave and configure its Idle Detection and Job Dequeuing Mode settings.
• The Launcher system tray icon now shows the Deadline version number in the tooltip.
• The launcher now waits 5 minutes after starting before it starts checking if it should restart a stalled slave. This ensures that if the launcher is set to launch the slave at startup, and that slave previously stalled, the slave will have a chance to clean up after itself. Otherwise, the launcher might try to launch the slave multiple times.
• Added new “-shutdownall” command line option to launcher, which shuts down the slaves, pulse, and balancer before shutting down the launcher.
• On Linux, Deadline’s init.d script now shuts down the slaves and the launcher during a reboot/shutdown, which ensures the slaves check their licenses back in. Pulse and the balancer are shut down if they are running as well.
• On Linux, fixed some other issues in Deadline’s init.d script.
• The Restart Slave If Stalled option is now disabled by default.
• Fixed some bugs in the Launcher init script on Linux.
• Cleaned up the output of a successful remote command.
• The Launcher can now process multiple remote commands simultaneously.


• Added the -balancer command line option to launch the Balancer through the Launcher.
• A LaunchBalancerAtStartup=true entry can be added to the system deadline.ini file to have the Launcher start the Balancer when the Launcher starts.
• When running as a service on Windows, the Launcher now properly shuts down the slave when the machine is shut down, which ensures the slaves check their licenses back in. Pulse and the balancer are shut down if they are running as well.
• Added new optional entries to the deadline.ini file to have the launcher keep pulse and balancer running (KeepPulseRunning=true and KeepBalancerRunning=true).
• Added a “-slavenames” command line option to the launcher to be used with “-slave” to launch slaves with specific names by specifying a comma-separated list of slave names.
• Added “-upgrade” command line option to launcher to simply trigger an upgrade if it’s required.
• Up to 5 attempts are now made during an auto-upgrade to copy over the binaries, with an increasing interval between attempts.
• When the launcher checks for upgrades, it now performs an upgrade if the local Version file is missing (but the network one exists).
• When doing an automatic upgrade, the launcher now copies the bootstrap files to the system’s temp directory, instead of using the Deadline temp directory.

Monitor Improvements

General
• The UI Lock can now be toggled on and off using the shortcut “ALT+`”.
• Font sizes are now consistent for all column headers in the lists in the Monitor.
• Added new graphs to the Monitor.
• There are no longer artifacts in the images when saving graphs to disk.
• Default list layouts can now be saved for each panel in the Monitor. These defaults are used when new panels are opened.
• List layouts for each panel in the Monitor can be saved to disk and opened again later.
• The lists no longer auto-scroll horizontally when clicking on a column that is only partially visible.
• Added ability to add Separators when customizing Script Menus.
• The Monitor now gives the user the option to save the Location and Size when pinning a layout or saving a layout to disk.
• When right-clicking on the column headers for a list to show hidden columns, the column will now appear where the mouse cursor is instead of at the end.
• Added search history to the search boxes in the Monitor. The search history can be cleared from the down-arrow menu for each list.
• The default sizes for the Manage Pools and Manage Groups dialogs are now bigger.
• The Slave list in the Pool and Group Management dialogs can now be filtered, and all columns in the list are now available.
• Fixed a bug when deleting groups and pools from the Manage Pool and Group Dialog that was preventing deletion of a single pool or group, or deleting them all if one was selected for deletion.


• The Slave Scheduling feature has been broken out of the Power Management dialog and now has its own configuration dialog.
• Added new Repository Options panel to create regions.
• In Repository Options, moved the database threshold to the Notifications panel, and grouped it with the database email address setting.
• Added an option to the Email Notification panel in the Repository Options to enable/disable auto-generating email addresses for new users. If enabled, the email address will be based on the SMTP server unless a postfix override is specified.
• The statistics panel in the repository options now has all of its settings in a group box.
• Added a toggle to the FarmOverviewReport to switch between percentages and counts for the graphs.
• Repository options dialog now notes that it can take up to 10 minutes for the settings to propagate.
• Improved the tooltips in the Repository Options dialog.
• Fixed a typo in the House Cleaning panel in the Repository Options.
• Updated Repository Options, Job Properties, Slave Properties, and Monitor Options dialogs so that each panel takes up a bit more space.
• New rows created in the Path Mappings, Drive Mappings, and Monitor Layout panels in the Repository Options now have the correct height.
• Added a button in the Repository Options dialog to reset all settings back to factory defaults.
• In the Repository Options, all performance-related settings are now on a new Performance panel. Use the new Auto Adjust spinner control to automatically pick good default settings based on the number of Slaves in your farm.
• Fixed a bug in the Auto Configuration page in the Repository Options that occurred when the last entry in the Auto Configuration list was deleted.
• Auto Configuration rule sets can now be enabled or disabled.
• Manage User, Manage Groups and Manage Pools dialogs no longer close the Name dialog if an invalid name is entered.
• When a new user group, pool or group is created, it is automatically selected.
• Fixed an error that could occur when deleting multiple users at the same time.
• The Farm Statistics dialog now has a drop down to choose an interval, rather than 4 separate buttons.
• The Configure Cloud Providers dialog now initializes the cloud plugins before displaying to improve performance when viewing the settings for different cloud plugins.
• Added Import Settings option to the Tools menu, which allows you to import settings from other Repositories running a minimum of Deadline 6.
• Added new Local Slave Settings dialog to the Tools menu and the main toolbar to control the local slave and configure its Idle Detection and Job Dequeuing Mode settings.
• Improved layouts of controls in Plugin and Event Plugin configuration dialogs.
• Features that require Pulse now mention it in their respective property dialogs.
• Updated all Monitor scripts to use the new grid-based system for the script dialogs.


• All Monitor submission scripts now save their sticky settings if the dialog is closed using the “X” button, or if Alt+F4 is pressed.
• Added additional command line arguments for the Monitor to set specific Monitor Options at startup.
• Fixed the filter types for some columns in the slave list, job report list, and slave report list.
• If a Remote Control command succeeds, the result will now be “Connection Accepted” instead of just being empty.
• Added Monitor option to show when the last house cleaning and pending job scan operations were performed in the Monitor status bar. If they haven’t been performed for more than 10 minutes, they will be highlighted in red.
• Added Monitor option to enable slave pinging (it’s now disabled by default).
• Fixed some Remote Control commands that were not checking if they should be using the slave’s IP address, or a machine name or IP address override.
• Fixed a bug where trying to send a Remote Command to an unknown host would hang indefinitely on Linux and OSX.
• When executing a remote command, if the process returns a non-zero exit code, then the result is returned as a failure instead of a success.
• Fixed a “ManageListForm” error.
• The limit dialog and the power management dialogs now disable the name field instead of just making it read-only when in edit mode.
• Added settings in the Repository Options to control how long the local Launcher and Balancer logs should be kept for.
• Fixed a layout issue in the multi-line file browser control in the Plugin Configuration dialogs.
• Resetting the Repository Options in the dialog is now visually smoother.
• Added a panel menu item to reset the default list layout back to the original default.
• Fixed an error when removing users from the User Group permissions dialog.
• When cloning an existing user group, the clone is selected automatically.
• Increased the default height for the Manage Users dialog.
• Added Monitor Option settings to configure the double click task behavior for rendering, completed and failed tasks.
• The scripts menus are now hidden when right-clicking on a panel with nothing selected.
• The job scheduling weight settings in the Repository Options now have 4 decimal places instead of 2.
• Updated the icon/script sync icon to be the “refresh” icon.
• Added View menu option to show/hide the main toolbar.
• Cleaned up the layout of the View menu a bit.
• Graph names are now shown in the panel titles when they are showing a graph.
• The splitter for job reports, slave reports, and remote command panels no longer moves when resizing the panel.
• Fixed a leak caused by the context menus in the panels.
• Fixed a bug in the Auto Job Timeout settings in the Repository Options that caused the Timeout Multiplier to be disabled when it shouldn’t be.


• When restarting the Monitor, the location of the splitters for all panels is now restored properly from the previous session.
• When switching between saved layouts, the Monitor is now hidden and shown to ensure that the location of the splitters is restored properly.
• Fixed a bug in the Manage Users dialog where the password confirmation fields were not being verified on accept.
• Tweaked some labels in the idle shutdown and machine startup tabs in the power management dialog.
• Cleaned up the error message when a job import fails due to the job already existing.
• Deleting a ruleset in the auto configuration panel of the repository options now resets all controls to their defaults.
• When creating new Path Mappings in the Repository Options, they are no longer case-sensitive by default.
• Plugin and Event configuration settings are now sanitized when they are saved.
• Added a new general TestIntegrationConnection script to the General script menu that can be used to test connecting to Shotgun or ftrack, and it shows the results.
• Added stacktraces to the error messages if the Monitor can’t update its data cache.
• Added Repository Configuration settings for maximum repository, slave, job, pulse, and balancer history entries.
• Double clicking the title bar of a floating panel in the Monitor now maximizes it on Windows.
• Repository history entries are now logged when changing Repository Options.
• Fixed a bug when collapsing and expanding group boxes in the Configure Plugins/Events dialogs.
• Improved the performance of bulk delete operations in the Monitor.
• Improved the default widths of some of the columns for lists in the Repository Options.
• When switching between the global pinned monitor layouts, the local pinned layout settings (column layouts and filters) are ignored so that they do not get clobbered.
• Fixed a typo in the Application Logging panel in the Repository Options.
• Fixed a layout bug in the Plugin Configuration if CategoryOrder was specified in the .options file of a plugin.
• Fixed some errors when editing idle shutdown overrides, and when editing existing thermal shutdown sensors and overrides.
• When connecting to a remote log from the Monitor, it now connects to the correct machine if the Monitor is connected to a different repository than the one stored in the deadline.ini file.
• CMD+R shortcuts now work properly on OSX (ie: resume job, resume task).
Jobs and Tasks
• Added new progress bars to the job list to show the state of all tasks for the job at a glance.
• Jobs with only a single task now show better job progress in the Job list.
• Fixed some issues that caused the job counts in the job list to be incorrect.
• Fixed some issues where requeue reports weren’t getting created properly for jobs.
• Improved layout of controls in the plugin-specific properties in the Job Properties dialog.
• Selecting multiple jobs and modifying their properties only overwrites shared properties for dependencies, extra info variables and environment variables.
• All dependency related job properties are now in the Dependencies panel in the job properties dialog, instead of being spread across three separate panels.


• The job timeout panel in the job properties now lets you specify a timeout in terms of hours, minutes, and seconds.
• Fixed a color control bug in the plugin-specific job properties that would cause the property to appear as modified when pressing Cancel on the color picker dialog.
• Split up the job history logging to be more granular when modifying certain Job Properties.
• Jobs can now be grouped together in the Job list if they share the same Batch name.
• Improved the performance of the Quick Filters for the job list.
• User name quick filters now have “Me (userName)” as the entry for the current user, and it will be the first user in the list.
• Changed the right-click menu item text in the quick filters to avoid confusion.
• Added an option when suspending a job to only suspend the non-rendering tasks for the job.
• Updated the Transfer Job script to include some missing job properties that weren’t getting transferred.
• Fixed an error that could show up in the Console when closing the Job Details panel.
• The Explore Output menu in the job and task list no longer shows any duplicate paths.
• The task list now shows the current CPU and RAM information for a rendering task.
• Added right-click menu item to task list to suspend/resume individual tasks.
• Swapped the default location of the Startup Time and Render Time columns in the task list.
• The Job Dependency nodes in the Monitor have also been updated to show per-dependency settings.
• Added a new feature to the Job Dependency View to test the dependencies and see which ones pass and which ones do not.
• The backgrounds for the graphs and the Job Dependency View now match the look of the rest of the Monitor.
• Jobs can now be grouped together in the Job Dependency View if they share the same Batch name.
• The layout can now be pinned for the Job Dependency View panel.
• You can now select multiple jobs in the job list and have them show up in the job dependency view.
• The Job report list in the Monitor now has columns that show Memory and CPU usage information.
• Moved the Explore Path menu for non-job nodes to the main context menu in the Job Dependency View, and fixed a bug that caused it to be disabled when it shouldn’t be.
• Cleaned up the error message that is shown when changing the frame range for a job and the new task count exceeds the maximum allowed.
• Added ability to pin and save quick filters.
• The archive job path is now remembered within a session (it will revert back to the default repository folder the next time the Monitor is restarted).
• Added new Cleanup panel to job properties window (for auto-cleanup override settings).
• Added option to auto-filter Job Reports based on the selected Task.
• Added option to switch Job Reports panel to a horizontal orientation.
• Added a Render Status column to task list, which shows the same information that the Task Render Status column in the slave list shows.
• Fixed some layout and font-size issues in the job dependency drag and drop dialog.
• Fixed a bug that could cause output paths in the job/task context menus to show double path separators.


• Event plugins are only loaded once when archiving a batch of jobs.
• Fixed a bug when parsing the frame padding of an output path that contained multiple sections of padding characters.
• The Task ID columns in the Task and Job Report lists are now string filters instead of integer filters.
• Capped the job and task sub-menu length for viewing output and auxiliary files to 50 menu items.
• Deleting jobs from the Monitor now logs to the repository history.
• If a job report can’t be loaded, the error message is now shown in the job report viewer.
• Task progress bars are now only visible for completed and rendering tasks.
• Task progress bars no longer change color based on the task’s state, although they will still match the completed job color when the task is complete.
• Disabled ability to resubmit tasks for Tile and Maintenance jobs.
• For tile jobs, the tile numbers under the Frame column in the Task List now start at 1 instead of 0.
• Fixed a bug in Job Properties where editing a job’s existing Script Dependencies wasn’t being committed properly when pressing OK.
• Fixed some errors when removing multiple asset or script dependencies from their respective lists in the job properties.
• Fixed spelling of “interruptiple” in the job properties dialog.
• Fixed a bug in the Job Dependency View that could lock up the Monitor when clicking on different jobs.
• When resubmitting a job that was scheduled to start at a certain time, the flag that indicates if the job has been resumed already is now reset.
Slaves and Pulse
• You can now right-click on specific slaves in the Slave list in the Monitor to modify Pools and Groups for the selected slaves only.
• Added job icon to the Job Name column in the slave list.
• The Slave list now shows which Limits the slaves are whitelisted, blacklisted, and excluded for.
• The Slave report list in the Monitor now has columns that show Memory and CPU usage information.
• The utilization value in the slave list now takes into account rendering and idle slaves (necessary if there are multiple slaves running on the same machine, but not all are rendering).
• If the slave list is filtered, the utilization will show the total utilization, as well as the utilization for just the visible slaves.
• Fixed a bug where the utilization would only update if you click on a slave in the list.
• Cleaned up the utilization text a bit so that it’s easier to read.
• Added option for viewing history to the Pulse list.
• Moved the Modify Pools/Groups menu items in the slave list menu below the Modify Slave Properties menu item.
• The slave list now shows the time a slave has been in its current state for all states (previously it would only show this for rendering slaves).
• A warning now appears when trying to shut down the local machine from the slave list, instead of failing silently.
• Added option to switch Slave Reports panel to a horizontal orientation.


• If a slave report can’t be loaded, the error message is now shown in the slave report viewer.
• When deleting a pulse, the history entry is now logged in the repository history.
• The Mark Slave As Offline menu item is now shown if the slave is in the StartingJob state.
• Fixed a bug where history entries for saving slave settings weren’t logged if only one slave was selected.
• The Job Candidate Filter in the Slave list now handles jobs with empty whitelists properly.
• The Slave Reports panel now shows render logs in addition to render errors.
• Added new graphs to Slave Reports panel.
• Added Connect Host, Primary, and Region columns to pulse list.
• Pulse settings can now be modified from the pulse list.
• The pulse list is now used to connect to the pulse log, instead of the Tools menu.
Limits and Cloud
• The Limit list shows who the current stub holders are if that Limit is in use.
• The Limit list now has a new column that shows the Usage Level for the Limit.
• The Limit property dialog now has an option to set the Usage Level for the Limit.
• Many context menu items in the Cloud panel (ie: starting and stopping instances) are now performed asynchronously.
• User group permissions can now be set for the Cloud panel.
• The cloud panel will show dialog boxes if an error occurs when interacting with the cloud instances.
• Cloud plugin data is now only loaded and updated if the Cloud panel is being displayed.
• Added some messages to the cloud commands so you get some feedback when a command is successful.
Console and Remote Commands
• Fixed a timestamp bug in the Console panel.
• The Remote Commands panel is now enabled by default in the User Group Permissions (so that the Monitor’s Local Slave Controls can display it).
• Fixed a spacing inconsistency between the timestamp and the text in the Monitor’s Console panel.

Slave Improvements

• Multiple slaves on a single machine now share one license, instead of requiring one license each.
• Slaves now return their license when they become idle.
• New Idle Detection settings can be set per slave. They can be used to launch the slave when the machine is idle and/or stop the slave when the machine is in use again.
• New Job Dequeuing Mode settings can be set per slave. They can be used to force slaves running on workstations to only pick up jobs submitted from the same machine, or by specific users.
• Slaves can now be added to regions, which mainly affects how the slave applies Path Mappings.
• The slave system tray icon now shows the Deadline version number in the tooltip.
• Added timestamps when streaming the slave log.
• Fixed a startup bug on Linux and Mac OSX that could result in multiple slaves with the same name starting up on the same machine.


• Improved how the slave picks its IP address on Windows and Linux so that it picks a network interface with a gateway (the Mac OSX version already did this).
• If a slave is initially running in Free Mode and it later gets a license, the License information in the slave UI and the slave list in the Monitor will be updated appropriately.
• When a slave can’t connect to a license server, it only tries to do auto-discovery every 5 minutes so that it doesn’t saturate the network.
• The slave now queries the machine’s CPU speed at regular intervals while it’s running, instead of just caching the value it gets at startup. This is useful for machines with CPU speeds that dynamically change while the system is running.
• Fixed a bug that was not checking the Job failure detection settings when a plugin failed to sync its files.
• When searching for a job, we no longer prune jobs that have a QueuedChunk count less than or equal to 0. This helps ensure that if a job’s state gets messed up, queued tasks will still be dequeued for that job.
• When searching for a job, the slave will now cache any Limits that it failed to acquire, and ignore other jobs that require the same limit during that search.
• The idle interval between job searches is now calculated based on the percentage of the idle slaves in the farm. The interval increases as more slaves become idle.
• Improved the message printed by the slave when it is doing a self-cleanup because it didn’t close properly the previous session.
• Made limit stub returning a little more robust.
• Improved verbose log messages when the slave is looking for a higher priority job.
• Fixed a bug that allowed the slave to move on to another task before finishing saving the log for the current task.
• Significantly improved how the slave handles large amounts of stdout from the rendering process (in terms of both performance and memory usage).
• Improved speed and reduced database load when a slave is processing limit groups while searching for a job to render.
• Fixed a null reference exception when the slave would check if it needed to return limit stubs based on progress, and the limit no longer existed.
• The check that the slave makes to see if it needs to return limit stubs based on progress is now done every few minutes instead of every second.
• If the dlinit file is not found after a plugin sync, the slave will try three more times and then throw an exception.
• When dequeuing a job, the slave now returns job limit stubs immediately if it can’t find any tasks for that job.
• When dequeuing a job, the slave will check if the job has any queued tasks available before trying to get a task for it.
• When updating the job state information during rendering, the slave no longer reads the full job object back from the database.
• Slaves only do partial updating of their state when possible to reduce bandwidth.
• Fixed a bug that could cause the slave to crash during shutdown.
• Fixed a bug that would result in only partial logs for a task that rendered across different days.
• The local task logs have been renamed to ensure they are unique to the slave and render thread that is rendering them.
• Any orphaned local task logs are now cleaned up the next time that render thread renders a task.


• Fixed a bug that could cause a render to fail if the job’s name changed between tasks.
• Added some additional logging just before the slave exits.
• Slaves now save their own copy of the task report, which can be viewed from the Slave Reports panel in the Monitor.
• Fixed some text fields in the Slave UI that weren’t read-only.
• Fixed some typos in some error messages.
• During each job scan, the slaves now cache whether a plugin supports concurrent tasks to avoid repeatedly reloading that information from the repository.

Pulse Improvements

• A primary Pulse can now be configured, which is the one that the slaves will connect to. Only the primary instance of Pulse will do things like housecleaning and the pending job scan.
• If the primary Pulse is offline or stalled, the repository repair operation can elect another running Pulse as the primary. This can be enabled in the repository repair settings in the Repository Options.
• Fixed some text fields in the Pulse UI that weren’t read-only.
• Pulse no longer controls the Slave Scheduling feature. It is now handled by the Launcher.
• Pulse now only sends the Repository Path for Auto Configuration requests. The other settings are pulled from the Repository after the Deadline applications have connected to it.
• The Pulse system tray icon now shows the Deadline version number in the tooltip.
• In Power Management, Idle Shutdown now takes into account whether there are multiple slaves running on the same machine.
• Added a field to the Pulse UI that shows the state of the web service.
• The slave can now be shut down with “deadlineslave -s” when it hasn’t connected to a repository yet.
• Added more information to the Pulse throttling messages, such as the slave name, job ID, number of requests, and throttle limit.
• Tweaked the web service’s new and delete user group functions so that they no longer return error codes for certain outcomes.
• Fixed bugs in some REST API functions that could cause Pulse to crash.
• Added a catch to prevent REST API functions from causing Pulse to crash.
• Changed some of the error messages that were inconsistent with the rest of the REST API.
• Pulse no longer prints out an error when favicon.ico is requested from the web service.
• Cleaned up the web service messages when the command is an invalid API command, and when no command is specified.
• Added the Access-Control-Allow-Origin header to Web Service responses.
• The OPTIONS request type is now supported by the Web Service.
• When deleting from the REST API, we now log to the repository history, not the job’s history.
• Added support to the REST API for grabbing only certain job properties in a request for jobs, to reduce the amount of data getting passed around.
• Fixed a bug in the Machine Startup feature of Power Management that would result in no slaves being woken up for a job with an empty whitelist.
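Several of the Web Service and REST API changes above can be exercised through the Standalone Python API covered in the Scripting chapter. The following is a minimal sketch, assuming the Web Service is reachable on the machine running Pulse; the host name, port number, and job dictionary keys used here are illustrative assumptions and should be verified against the REST API and Standalone Python API sections of this manual:

    # Minimal sketch: list jobs through the Web Service using the Standalone Python API.
    # 'pulse-hostname', port 8082, and the 'Props'/'Name' keys are assumptions for illustration.
    from Deadline.DeadlineConnect import DeadlineCon

    # Connect to the machine hosting Pulse and the Web Service.
    con = DeadlineCon('pulse-hostname', 8082)

    # Fetch all jobs; each job is returned as a dictionary of job properties.
    jobs = con.Jobs.GetJobs()
    for job in jobs:
        # Print each job's name, falling back if the expected keys are absent.
        print(job.get('Props', {}).get('Name', '<unnamed>'))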


Command Improvements

• All command line options now support any number of leading dashes (for example, “deadlinecommand.exe -pools” or “deadlinecommand.exe --groups”).
• Added new commands to suspend/resume individual tasks.
• Added a new command to suspend all non-rendering tasks for a job.
• Fixed some bugs with the RenderJob command line option.
• Fixed some bugs with the JobStatistics command line option.
• Added some User Group command line options.
• Fixed the RemoteControl command to properly print out results.
• Updated the help text for the ChangeRepository command line option to mention the optional Repository Path argument.
• The RemoteControl command options are no longer case sensitive.
• Added the GetJobDetails command to print the job details that are shown in the Job Details panel in the Monitor.
• Added GetVersion and GetMajorVersion commands.
• Added commands that can be used to configure the Cloud plugins, group mappings, regions, etc.
• Added DeadlineCommand commands for adding job, slave and repository history entries.
• Added a command line option to DoHouseCleaning and DoRepositoryRepair to choose which mode to run.
• Added command line commands for performing path mapping.
• Removed the JobCleanup command line option, since the DoHouseCleaning command can do this.
• The DoPendingJobScan command line option can now take an optional region parameter that is used for path mapping when checking asset and script dependencies.
• Added the SlaveExists command to check if a slave exists.
• Deadline Command no longer checks if the collection indices in the database need to be created (the other Deadline applications still handle this).
• The ChangeRepository command no longer tries to load the Qt libraries if it is being passed the repository path as a command line option.
• The ChangeLicenseServer command no longer tries to load the Qt libraries if it is being passed the license server as a command line option.
• The ChangeUser command no longer tries to load the Qt libraries if it is being passed the user name as a command line option.
• Fixed a bug with commands that accept a repository path as an argument. The bug would cause Deadline Command to crash if the repository path was quoted and ended with a backslash character (ie: “\\server\repository\”).
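To give a feel for how some of the options above can be driven from a script, here is a minimal sketch that wraps deadlinecommand with Python’s subprocess module. It assumes deadlinecommand is on the PATH, and “render-01” is a placeholder slave name; output formats may vary between versions:

    # Minimal sketch: calling a few of the deadlinecommand options mentioned above.
    # Assumes deadlinecommand is on the PATH; "render-01" is a placeholder slave name.
    import subprocess

    def run_deadline_command(*args):
        # Run deadlinecommand with the given arguments and return its stdout as text.
        output = subprocess.check_output(["deadlinecommand"] + list(args))
        return output.decode("utf-8").strip()

    # Leading dashes are now flexible, so "-pools" and "--pools" behave the same.
    print(run_deadline_command("-pools"))

    # New informational commands added in this release.
    print(run_deadline_command("GetVersion"))

    # SlaveExists checks whether a slave with the given name is registered.
    print(run_deadline_command("SlaveExists", "render-01"))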

Scripting Improvements

• Added new events to the Event Plugin API: OnHouseCleaning, OnSlaveStarted, OnSlaveStopped, OnSlaveRendering, and OnSlaveStartingJob (a sketch of an event plugin using some of the new triggers appears at the end of this section).
• Added new grid-based control options to the DeadlineScriptDialog class, which make it easier to create custom interfaces in the Deadline scripts.
• Updated the cloud plugins to not swallow their errors when creating instances.


• Exposed some errors that happened when Cloud/Balancer plugin files were missing or spelled incorrectly.
• Added a function to get the database connection string.
• Added a function to change a job’s frame list.
• The default for ConcurrentTasks in a plugin’s dlinit file is now True.
• Added API commands to launch processes with a specific user account.
• Made some improvements to the way Python exceptions are printed out.
• Fixed some issues with how Python stdout and stderr were redirected to the Deadline logs.
• Added new API commands to suspend all non-rendering tasks for a job.
• When a plugin, event, cloud, or Monitor script is executed, the log will now show where the script is being loaded from.
• Added an “EnabledStickySaving” function to the DeadlineScriptDialog class that can be used to automatically save sticky settings when the dialog is closed.
• Improved some function documentation for the API.
• The Slave Stdout Limit is now applied to ManagedProcess objects created in plugin scripts. Before, it was only applied to the main DeadlinePlugin object.
• Fixed a bug that prevented module import errors from showing the actual Python error.
• Added some additional User Group functions.
• The RGB spinners in the Color script control now resize when the control size changes.
• Added the ClientUtils.CreateScriptTempFolder() function to create a temporary folder for the script that is automatically cleaned up.
• Fixed how the value is set for the RadioControl script control.
• Fixed a bug with getting the disabled slave count in the GetFarmStatisticsEx.py web service script.
• Added an OnJobPurged event trigger that gets called right before a job gets purged from the database.
• Added an OnSlaveStalled callback for event plugins.
• Added functions to the REST API and the standalone Python module to get the job details that are shown in the Job Details panel in the Monitor.
• Added support to the REST API and the standalone Python module to undelete deleted jobs, purge deleted jobs, get deleted jobs, and get deleted job IDs.
• Added the RepositoryUtils.GetJobDetails() function.
• Added RepositoryUtils functions to get deleted job IDs and to undelete jobs.
• Fixed a bug that prevented JobUtils.CalculateJobStatistics() from working in non-Monitor scripts.
• PYTHONHOME and PYTHONPATH are now properly set to the system’s values in RunProcess for the event plugins.
• The GetConfigEntry and GetConfigEntryWithDefault functions for plugins now trim whitespace off the values.
• Added support to the Standalone Python API for doing basic authentication with the Web Service.
• Added missing documentation for the SlaveUtils.GetMachineIPAddresses() API function.
• Added the RepositoryUtils.SlaveExists() function to check if a slave exists.
• Fixed a bug where the OnJobFinished callback for Event plugins wasn’t always getting the updated job object.


• Added some missing properties to the Doxygen docs for BalancerInfo, PulseInfo, SlaveInfo, and SlaveSettings.
• SlaveHostMachineIPAddressOverride in SlaveSettings now represents the correct value.
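As referenced above, the following is a minimal sketch of an event plugin that reacts to some of the new event triggers (OnSlaveStarted, OnSlaveStalled, and OnJobPurged). It assumes the standard event plugin layout described in the Event Plugins section of the Scripting chapter; the callback wiring and argument names shown here are assumptions and should be verified against that section:

    # Minimal sketch of an event plugin hooking the new OnSlaveStarted, OnSlaveStalled,
    # and OnJobPurged triggers. Callback argument names are assumptions; verify them
    # against the Event Plugins documentation.
    from Deadline.Events import DeadlineEventListener
    from Deadline.Scripting import ClientUtils

    def GetDeadlineEventListener():
        # Deadline calls this factory function to obtain the listener instance.
        return MyEventListener()

    def CleanupDeadlineEventListener(listener):
        listener.Cleanup()

    class MyEventListener(DeadlineEventListener):
        def __init__(self):
            # Hook up the callbacks this plugin cares about.
            self.OnSlaveStartedCallback += self.OnSlaveStarted
            self.OnSlaveStalledCallback += self.OnSlaveStalled
            self.OnJobPurgedCallback += self.OnJobPurged

        def Cleanup(self):
            del self.OnSlaveStartedCallback
            del self.OnSlaveStalledCallback
            del self.OnJobPurgedCallback

        def OnSlaveStarted(self, slaveName):
            ClientUtils.LogText("Slave started: %s" % slaveName)

        def OnSlaveStalled(self, slaveName):
            ClientUtils.LogText("Slave stalled: %s" % slaveName)

        def OnJobPurged(self, job):
            # Called right before the job is purged from the database.
            ClientUtils.LogText("Job about to be purged: %s" % job.JobName)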

Application Plugin Improvements

3ds Max Improvements
• Updated SMTD version numbers to 7.0.
• Fixed a SMTD initialization error.
• When copying external files, SMTD no longer tries to copy over missing files.
• Added a 3dsMax2015_sp2 & Extension_1 dictionary entry to the 3dsmax plugin.
• Default/sticky settings can now be set in SMTD for the ExtraInfo fields.
• Removed (x86) references in the 3dsmaxcmd plugin for Max 2014 & Max 2015.
• Made some improvements to the RTT (Render To Texture) feature in SMTD, including the option to bake one object per task.
• Fixed a bug in FumeFX string handling in the 3dsmax plugin.
• Updated SMTD to handle blowup mode properly.
• Updated the Region manipulator in SMTD to keep the aspect ratio while in blowup mode.
• When offloading Mental Ray DBR jobs, the job will now use a temporary max.rayhosts file, rather than modify the original.
• Added a workaround to prevent 3ds Max 2015 from crashing when it’s rendering as a service.
• Fixed some layout issues in SMTD.
• Fixed some layout issues in the VRay DBR submitter.
• Added better error messages to SMTD if the main script from the repository can’t be loaded.
• Added some new SMTD sanity checks (CheckForOutputPathLength, CheckForREPathLength, CheckForDuplicateREPaths, CheckForObjectNames, CheckForCorruptGroup).
• Fixed a bug in the 3ds Max 2015 workspace workaround that caused it to fail if the workspace directory doesn’t exist.
• Fixed a bug that affected the tile assembly of frames rendered using the VRay frame buffer.
• Fixed a tile assembly issue with VRay MultiMatte render elements.
• Updated the 3dsmax plugin dict in 3dsmax.py to clearly inform users which versions of 3ds Max are broken with Deadline.
• Changed the maxTileAssembler command to use “HiddenDOSCommand” to hide the console window on the slave.
• SMTD - Added the ability for the [PREVIEW] job to enable/disable its parent dependency on the [REMAINING] frames job.
• SMTD - When rendering a single frame tile or single frame jigsaw job, OutputFilename# is now frame specific instead of ####.
• SMTD - Fixed an issue where RE paths were not output to the Monitor OutputFilename# if VRay “Separate Render Channels” was enabled.
• SMTD - Re-worked the logic for when VRay REs are output as “Separate Render Channels” via the VFB to the Monitor OutputFilename#.


• SMTD - When the script file “SubmitMaxToDeadline_RemapLocalToNetworkPath.ms” is not found in the Repository, the option for network remapping will be hidden from the UI.
• SMTD - Improved the “Generate Quicktime .MOV File” drop-down UI positions.
• SMTD - Last edited dates are now shown in the SMTD ABOUT dialog.
• SMTD - 3dsmax.options - The “Size” category was renamed to “Render Size” so it appears next to the other “Render...” categories in alphabetical order in job properties.
• SMTD - Updated the “RebuildRenderElements” function.
• SMTD - Added the ability to override the Tile/Jigsaw Assembler Pool, Secondary Pool, Group & Priority to be different from the main 3ds Max job.
• SMTD - Added sticky/default ini entries for the Assembler Override settings.
• SMTD - Empty State Sets are now purged from the scene during submission via SMTD.
• SMTD - Submit a subset of Tiles for rendering with the Draft Assembler, and they will all assemble over a black background.
• SMTD - Re-queue some of the “Completed” tasks of the above job to render more tiles that were not requested as Custom Tiles during submission - the Draft Tile Assembler will successfully integrate them into the final image(s).
• SMTD - Submit with “Clean Up Tiles” checked multiple times - each time, the new tiles will successfully assemble over the previous output image used as the background.
• SMTD - The Jigsaw standalone feature “Fill Regions” was backported to the SMTD Jigsaw MX version, with extended functionality.
• SMTD - Added an option to permanently rename render elements during submission.
• SMTD - Fixed a bunch of bugs that could affect how the output paths are passed to Deadline.
• SMTD - The State Sets dialog is now automatically closed during submission and re-opened afterwards.
• SMTD - The “Sequencer Mode” State Set dialog docking is now supported while the State Sets dialog is automatically closed during submission.
• SMTD - Properly handles versions earlier than Max 2015 that do not have the “IsMainFrameVisible” property available in the State Sets object.
After Effects Improvements
• Added support for After Effects CC 2014.
• Improved the error message if the wrong submission script is installed on the client machine.
• Relaxed the output path sanity check in the integrated submitter so that it doesn’t prevent you from submitting a job that outputs to a folder that doesn’t exist yet.
Arnold Standalone Improvements
• Added the -dp flag to the render arguments to speed up the rendering.
Cinema 4D Improvements
• All multi-pass paths are now included when submitting from the integrated submitter, allowing you to open these output files from the Monitor.
• Fixed a bug in how the integrated submitter gets the output file name in cases where the output name scheme doesn’t start with a period.
• A Team Render submitter is now available that lets you launch Team Render on slaves and connect to them to perform an interactive render.


CommandLine Improvements
• Path Mapping is now performed on the arguments for CommandLine jobs.
CommandScript Improvements
• Path Mapping is now performed on the arguments for CommandScript jobs.
Corona Improvements
• Added support for Corona Standalone.
DJV Improvements
• Re-worked the DJV plugin & submission script to handle the new DJV v1.0.1, which changed the majority of its command line flags in this release.
• Fixed a couple of bugs when using the job right-click script to submit a DJV job.
Draft Improvements
• Added Path Mapping support to the Draft Tile Assembler.
• Updated Draft to version 1.2.3.57201. Note that if you are using Draft 1.1 or earlier, you will need an updated Draft license.
• Updated the Draft Tile Assembler Monitor submission script to be able to add all of the plugin submission options.
• Updated the Draft Tile submitter to fix a visual bug.
• Improved the error message when the Draft Tile Assembler can’t load input tiles.
FFmpeg Improvements
• Path mapping is now applied to the preset files.
• The FFmpeg plugin now enforces the correct path separators based on the OS.
• Fixed some typos in the FFmpeg submitter in the Monitor.
Fusion Improvements
• Added support for Fusion 7.
• Updated the Fusion plugin icon.
Hiero Improvements
• Fixed how we get the start and end frame for a clip in the Hiero submitter.
Houdini Improvements
• Fixed some bad logic when checking the output file in the Houdini submitter.
• Fixed an error when loading the sticky SubmitSuspended property in the integrated Houdini submitter.
• The integrated submitter now includes the current ROP name with the job name.
• Improved Arnold for Houdini support.
Lightwave Improvements
• Updated the Path Mapping tooltip in the Lightwave plugin to mention that it can be disabled if there are no Path Mapping entries defined in the Repository Options.
• Jobs submitted from Lightwave 11.8 now render properly.
Mantra Standalone Improvements
• The “mantra: Bad Alembic Archive” error message is now caught during rendering.


• Updated the Path Mapping tooltip in the Mantra plugin to mention that it can be disabled if there are no Path Mapping entries defined in the Repository Options.
Maya Improvements
• Added Jigsaw support to Maya.
• Removed unnecessary 32 bit paths from the MayaBatch and MayaCmd plugin configurations.
• Added a new stdout handler to catch a Maya licensing error.
• Fixed some text cutoff issues in the integrated submitter on Mac OSX Mavericks.
• Added overrides for the height and width of the render output to the Monitor submitter.
• Fixed a FumeFX Wavelet Sim issue for MayaBatch & MayaCmd.
• Fixed an Arnold for Maya verbosity flag bug.
• Fixed some issues when using tile rendering with VRay.
• VRay render elements are now supported when using the Draft Tile Assembler.
• Arnold AOVs are now supported by tile rendering.
• Added multichannel EXR support for Jigsaw and Draft Tile rendering.
• Fixed the default Maya executable paths on OSX.
• Added an explanation to the tooltip for the frame list control in the integrated submitter for why it would be disabled.
• Fixed some VRay related bugs in the integrated Maya submitter due to differences between VRay 2 and VRay 3.
Mental Ray Standalone Improvements
• Added a plugin configuration option to treat exit code 1 as error or success.
modo Improvements
• Added Jigsaw support to modo.
• Added an option to the modo Monitor submitter to specify the output pattern.
• Added a warning message to the modo Monitor submitter that overriding output and using Tile Rendering has limitations, and that the integrated submitter should be used in certain cases.
• Fixed a bug in the integrated modo submitter that prevented it from working in modo 801.
Nuke Improvements
• Added support for Nuke 9.
• Updated the Nuke plugin to properly handle frame counts in the batch node when given write node names.
• Fixed a bug that could crop up when setting the environment in the Nuke submitter prior to launching deadlinecommand.
• Added a Render Using Proxy Mode option to the Nuke submitter.
• Removed the Build option from the Nuke submitter, since the versions of Nuke that Deadline supports are 64-bit only.
• Fixed an error that could occur if PrepForOFX is not defined in the Nuke.dlinit file.
• The integrated Nuke submitter now includes output paths for all Views so that they can be viewed from the Monitor.
• The integrated Nuke submitter now displays a warning if you are trying to submit a job that has no Views.


• Updated the names given to the Knobs created by the integrated submitter, which seems to address some instability issues that could come up.
• The secondary pool setting is now sticky in the integrated submitter.
• Fixed a bug with Nuke path mapping that would mess up embedded TCL in the output path.
• Updated the Path Mapping tooltip in the Nuke plugin to mention that it can be disabled if there are no Path Mapping entries defined in the Repository Options.
• The integrated Nuke submitter now handles TCL embedded in the output path properly when passing the paths to Deadline to view the output from the Monitor.
• Fixed an error in the submitter when the Nuke comp has proxy mode enabled.
• In the Nuke submitter, Deadline’s settings are now created in a “Deadline” tab, instead of just using the default User tab. The settings have more readable names too.
• Added a Performance Profiling option to the submitter (Nuke 9 and later).
• Changed the layout of the submitter controls a bit.
• Fixed an issue with loading Shotgun and FTrack KVPs from the Nuke script file.
Puppet Improvements
• Added support for Puppet jobs.
Python Improvements
• Path separators for the script path are now set per OS after Path Mapping has taken place.
Quicktime Improvements
• Fixed an error in the job right-click script to submit a Quicktime job.
Realflow Improvements
• Added support for Realflow 2014.
• Improved Hybrido simulation progress reporting.
Rhino Improvements
• Added Jigsaw support to Rhino.
• Added Tile Rendering support to Rhino.
• Updated the default Rhino 5 executable path.
• When Rhino starts up, the “Enter” button is now pressed to work around a case where Rhino wouldn’t start rendering.
Salt Improvements
• Added support for Salt jobs.
SketchUp Improvements
• Added support for SketchUp 2015.
• Increased the width of the export directory and prefix fields in the submitter.
Vray DBR Improvements
• Added a task timeout option to all the DBR submission scripts. When the timeout is reached, the task will be marked as complete so that the slave can move on to something else.
• In the Monitor submitter, the application version number is now sticky between sessions.


• The 3ds Max and Maya DBR submitters now disable VRay distributed rendering when closing if the submitter had automatically enabled it.
• The 3ds Max DBR submitter can now automatically mark the spawner job as complete when the rendering finishes.
• Fixed how the Maya VRay DBR submitter creates a new shelf if there isn’t already a Deadline shelf.
• In the Monitor submitter, the port label visibility is now toggled on/off based on the currently selected application, which properly refreshes the UI.
• The default VRay spawner paths for 3ds Max Design are now included.
• Added a timeout setting for all supported applications except 3ds Max (3ds Max RT is supported though).
• Added an option for how to handle the case where a VRay DR process is already running on the machine.
• The Port number can now be specified for 3ds Max.
• 3ds Max RT is now properly supported.
• Updated height of VRay dialog in Softimage.
• In the Ply2Vrmesh submitter, the attribute field is now wider.

Event Plugin Improvements

ftrack Event Improvements
• Added ftrack support to most of the submission scripts.
Shotgun Event Improvements
• Updated the Shotgun API to version 3.0.17.
• Added functionality to upload a filmstrip and an H264 QuickTime movie to Shotgun when a job finishes rendering.

12.2 Deadline 7.0.1.3 Release Notes

12.2.1 Overview

This is a patch release for Deadline 7.0. It fixes a few important bugs that were discovered shortly after Deadline 7.0 was released.
A bug with how the Slaves updated their state in the database had a significant impact on database performance. In order to fix this bug, we had to change how the Slaves update their state, and as a result the Slave list in the Monitor will show that your Slaves are in an “Unknown” state until all your machines (Slaves, Monitors, Pulse, etc) are running Deadline 7.0.1. Once all machines are running the same version, the Slaves will appear properly in the Monitor again.
See the Deadline 7.0.0.54 Release Notes for the full release notes.
Note that a 7.0 license is still required to run this version. If you have a license for Deadline 6.2 or earlier, you will need an updated license. In addition, the version of Draft that ships with Deadline 7 needs a new 1.2 license. If you have a license for Draft 1.1 or earlier, you will need an updated license.


12.2.2 Complete Release Notes

Monitor Improvements

• Fixed some bugs in the Dependency panel in the Job Properties dialog.
• When resubmitting a job from the Monitor, you can no longer set the frames per task to 0, which would result in an error during submission.

Slave Improvements

• Fixed a bug with how slaves update their state in the database, which had a negative impact on performance.
• The slave no longer prints out logging before and after each successful license checkout (errors are still printed out).
• The slave no longer updates its state in the database multiple times when shutting down.
• Reduced the frequency at which the slaves check if housecleaning needs to be done. Now, they only check at the same interval that Pulse would be performing the housecleaning operations, instead of before each task search.

Application Plugin Improvements

CommandScript Improvements
• Fixed a syntax error in the CommandScript plugin.
Maya Improvements
• Fixed a couple of bugs that affected how the integrated submitter handled some VRay render elements.
Mental Ray Standalone Improvements
• Fixed a syntax error in the MentalRay plugin.

12.3 Deadline 7.0.2.3 Release Notes

12.3.1 Overview

This is the second patch release for Deadline 7.0, which fixes a few bugs, and adds support for Lightwave 2015.
See the following pages for the full release notes:
• Deadline 7.0.0.54 Release Notes
• Deadline 7.0.1.3 Release Notes
Note that a 7.0 license is still required to run this version. If you have a license for Deadline 6.2 or earlier, you will need an updated license. In addition, the version of Draft that ships with Deadline 7 needs a new 1.2 license. If you have a license for Draft 1.1 or earlier, you will need an updated license.

12.3.2 Complete Release Notes

Installer Improvements

• The Repository installer now sets the version number correctly in the repository.ini file.


• The submission script installers no longer create a rollback folder in the Repository folder.

Launcher Improvements

• Fixed an error on Linux when checking how long the system has been idle in a headless environment.

Slave Improvements

• Fixed a bug that caused the slave to report that it had a permanent license in some cases when it couldn’t check out a valid license.

Pulse Improvements

• Fixed a bug that prevented a Primary Pulse from performing the Pending Job Scan on Linux and OSX.

Application Plugin Improvements

3ds Max Improvements
• Fixed a bug for 3ds Max 2015 when checking the visibility of the SceneExplorer prior to rendering.
Cinema 4D Team Render Improvements
• The C4D Team Render plugin now works properly with C4D 15 and 16.
• Removed the security token file location options from the plugin configuration, since they aren’t needed.
• The security token file is now created in the correct location on OSX.
• Improved the error message that occurs if the security token file can’t be created (often due to permissions).
• Moved the “Copy to Clipboard” button next to the security token field in the integrated submitter.
• Increased the button widths at the bottom of the integrated submitter to fix some text cutoff issues.
• If the security token is blank when submitting the job, it is now populated with the token that is automatically generated.
• The Team Render submission script installer now supports C4D 16.
• The security token can no longer be modified from the Monitor after the job has been submitted.
Combustion Improvements
• Path mapping is now performed on the scene file path (if the scene isn’t being submitted with the job).
Lightwave Improvements
• Added support for Lightwave 2015.
• Fixed a bug that prevented the integrated submitter from working with Lightwave 2015.
modo Improvements
• Permissions are now set properly by the modo submitter installer, which allows modo to recognize the Deadline submitter when loading.


12.4 Deadline 7.0.3.0 Release Notes

12.4.1 Overview

This is the third patch release for Deadline 7.0. It fixes a critical side-effect in the feature that allows you to pick an alternate folder for job auxiliary files in the Job Settings in the Repository Options.
Without this update, Deadline can delete any existing subfolders in the chosen folder if their name doesn’t match an ID of a job that is still in the queue. This isn’t a problem if you choose an empty folder (which is recommended), but if you choose a folder with existing subfolders, those subfolders will get deleted. This update ensures that only subfolders with names that represent a valid job ID can be deleted by Deadline.
See the following pages for the full release notes:
• Deadline 7.0.0.54 Release Notes
• Deadline 7.0.1.3 Release Notes
• Deadline 7.0.2.3 Release Notes
Note that a 7.0 license is still required to run this version. If you have a license for Deadline 6.2 or earlier, you will need an updated license. In addition, the version of Draft that ships with Deadline 7 needs a new 1.2 license. If you have a license for Draft 1.1 or earlier, you will need an updated license.
