While working at Deakin Motion Lab (now Fika Entertainment) I developed a virtual production pipeline with a colleague.
Over the course of ~3 years we developed Unity editor tools, plugins, and linking software that enabled the creation of a wide variety of animated content. The pipeline remains internal to the lab, but this page details some of the work that I produced for it.
I do not have any control over the source, nor the capacity to share the tools with anyone.
Traditional screen production is structured into sequences, each containing a number of shots. A shot might contain multiple environments, characters, props, animations, lighting, cameras, and effects. Managing all of these assets is a technical challenge that we worked to solve: in a stock Unity project, all of this data must be opened and assigned manually. We introduced concepts such as the following (see the sketch after the list):
• Hierarchical asset management.
• Automated asset and multi-scene loading.
• Asset shortcuts for further asset reuse.
• Per-Take Asset-Component addressing.
• Improved folder metadata.
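The real types remain internal to the lab, but a minimal sketch of how the Loader and shortcut concepts could be modelled might look like this. Every name here is hypothetical:

```csharp
using UnityEngine;

// Hypothetical sketch only; these are not the pipeline's real types.
// A Loader marks a folder as one loadable unit in the hierarchy, and a
// Shortcut points at content stored elsewhere so assets can be reused
// without duplication.
[CreateAssetMenu(menuName = "Pipeline/Loader")]
public class Loader : ScriptableObject
{
    [Tooltip("Also load assets found in subfolders of this Loader's folder.")]
    public bool recursive = true;
}

[CreateAssetMenu(menuName = "Pipeline/Shortcut")]
public class Shortcut : ScriptableObject
{
    [Tooltip("Asset located elsewhere in the hierarchy, loaded as if it were here.")]
    public Object target;
}
```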
The Manager window lets you create and manage customisable folder structures, their Loaders, and their assets.
A common operation performed on-set to prepare for the next take is duplicating the current scene setup and clearing previously recorded data from the scene.
Performing these repetitive operations, especially in a time-sensitive environment, carries a large risk of human error. By simplifying shot duplication, we eliminated this risk and reduced the time the process takes.
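As an illustration of the idea only, with invented paths and names (the real tool derived everything from the shot/take hierarchy):

```csharp
using UnityEditor;

public static class TakeUtility
{
    // Sketch of the on-set "next take" operation: duplicate the current
    // take's folder, then clear previously recorded data from the copy.
    [MenuItem("Pipeline/Duplicate Current Take")]
    private static void DuplicateTake()
    {
        // Illustrative paths; the real tool resolved these automatically.
        if (AssetDatabase.CopyAsset("Assets/Shots/Shot010/Take01",
                                    "Assets/Shots/Shot010/Take02"))
        {
            // Here the duplicated Timeline would be stripped of recorded
            // clips so the new take starts clean.
            AssetDatabase.SaveAssets();
        }
    }
}
```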
We facilitated recording into Timeline during Play mode, providing a centralised location for all recordable objects in a shot.
Animation could easily be replaced after cleanup, as recording was synchronised with external programs via a Timecode provider.
This system was tightly integrated with Timeline, and was the primary touchpoint when working on set.
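The pipeline's recorder is internal, but Unity's Editor-only GameObjectRecorder illustrates the Play-mode recording building block; something in this spirit, with the pipeline's own recorder additionally writing into Timeline and taking start/stop cues from the Timecode provider:

```csharp
using UnityEditor.Animations;
using UnityEngine;

// Editor-only sketch: sample a performer's transforms every frame in Play
// mode and save the result as an AnimationClip when recording stops.
public class PerformanceRecorder : MonoBehaviour
{
    public AnimationClip destination; // Assumed to be created beforehand.
    private GameObjectRecorder recorder;

    private void OnEnable()
    {
        recorder = new GameObjectRecorder(gameObject);
        recorder.BindComponentsOfType<Transform>(gameObject, true);
    }

    private void LateUpdate()
    {
        recorder.TakeSnapshot(Time.deltaTime); // One sample per rendered frame.
    }

    private void OnDisable()
    {
        if (recorder.isRecording)
            recorder.SaveToClip(destination); // Clip then lands on a Timeline track.
    }
}
```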
Performance capture may contain motion capture, facial capture, and external reference audio and video. Each plugin in a project has a different interface, and their components are located across multiple scenes. Server connections are often configured on a per-component basis, so changes need to be synchronised in multiple locations.
We centralised and consolidated Connections of all types into a single interface that managed their data and lifetimes seamlessly.
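Reduced to a hypothetical minimum, the shape of that consolidation could look like this (the real plugin interfaces all differ; these names are invented):

```csharp
using System.Collections.Generic;

// Hypothetical minimum of the consolidated Connection layer: each capture
// plugin is wrapped behind a common surface so one manager owns
// configuration and lifetime.
public interface IConnection
{
    string Address { get; set; }
    bool IsConnected { get; }
    void Connect();
    void Disconnect();
}

public static class Connections
{
    private static readonly List<IConnection> all = new List<IConnection>();

    public static void Register(IConnection connection) => all.Add(connection);

    // Retarget and reconnect everything from one place before a shoot.
    public static void ConnectAll() => all.ForEach(c => c.Connect());
}
```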
This was the most separable part of the platform and was broadly utilised outside of the scope of virtual production.
We enabled the discovery and loading of any asset found in a project structure according to a set of customisable rules. By default, loading walked the hierarchy beneath a Loader's directory, processing every relevant Asset.
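Stripped of the rules engine, that default behaviour amounts to something like this sketch (not the shipped code):

```csharp
using UnityEditor;
using UnityEngine;

public static class HierarchicalLoading
{
    // Find every asset beneath a Loader's folder and hand each one to
    // whichever handler claims it. FindAssets searches folders recursively.
    public static void LoadUnder(string loaderFolder)
    {
        foreach (string guid in AssetDatabase.FindAssets("", new[] { loaderFolder }))
        {
            string path = AssetDatabase.GUIDToAssetPath(guid);
            var asset = AssetDatabase.LoadAssetAtPath<Object>(path);
            Debug.Log($"Processing {asset.GetType().Name} at {path}");
            // The real system dispatched here: scenes loaded additively,
            // prefabs instanced, shortcuts followed, and so on.
        }
    }
}
```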
This was very suitable for internal usage, but the next evolution would ditch both the hierarchical nature of the loading and the direct use of the AssetDatabase. Utilising Assets and Addressables instead should make complex runtime loading, and even programmable actions, easy to assemble.
Recording and managing large amounts of performance-capture data is a difficult challenge.
We provided components for refining what gets recorded to the Timeline, filtering out unimportant data, and activating objects that are present on stage.
This system proved fruitful and became the backbone for connecting in-scene objects into the pipeline; it also handled concerns like controlling GameObject active state, supplying per-object recording configuration, and generating Timeline Tracks and their Groups.
Of the whole pipeline, this system went through the most refactoring, as we worked to reduce its assumptions about project setup.
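The Track and Group generation can be sketched like so, under assumed names (the real components carried per-object recording configuration as well):

```csharp
using System.Linq;
using UnityEngine.Timeline;

public static class TrackGeneration
{
    // Find-or-create a Group for the object's category, then create the
    // object's Track beneath it. The real system drove this from the
    // in-scene components described above.
    public static AnimationTrack TrackFor(TimelineAsset timeline, string group, string track)
    {
        GroupTrack parent = timeline.GetRootTracks()
            .OfType<GroupTrack>()
            .FirstOrDefault(g => g.name == group)
            ?? timeline.CreateTrack<GroupTrack>(null, group);
        return timeline.CreateTrack<AnimationTrack>(parent, track);
    }
}
```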
Every aspect of the pipeline was extensible.
Loading behaviour, asset handling, new IO, synchronisation with external hardware and software, custom hierarchies and indexing, utilities, and even queries for the Wizard could all be customised.
This was the most powerful part of the pipeline: it enabled the flexibility between projects, and it is what would allow any studio to take up the tools as their own.
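As a flavour of what one such extension point could look like (hypothetical; the real seams were more numerous and more specific):

```csharp
// Hypothetical extension point: implement this in a project and the
// pipeline discovers it, letting a studio define how its own asset types
// are loaded and unloaded without touching pipeline code.
public interface IAssetHandler
{
    bool CanHandle(UnityEngine.Object asset);
    void Load(UnityEngine.Object asset);
    void Unload(UnityEngine.Object asset);
}
```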
Remote integrates Cinemachine's power into a project in a way that any director or editor can understand and operate.
Interact with a virtual camera to re-record and edit cameras on or off the stage.
This separation allows a director to focus on actor performance on shoot day, optionally deferring camera-work to periods of greater downtime.
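Under Cinemachine (2.x API), the "cut" itself is simple, which is what makes a director-friendly surface over it feasible; a sketch of the building block only:

```csharp
using Cinemachine;

public static class RemoteCameras
{
    // Cinemachine's brain always shows the highest-priority live virtual
    // camera, so a cut or blend is just a priority change. A UI like
    // Remote could wrap operations like this in director-friendly controls.
    public static void CutTo(CinemachineVirtualCamera vcam)
    {
        vcam.m_Priority = 100; // Assumes other cameras sit below 100.
    }
}
```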
Sync discovered Unity-specific dependencies (beyond the scope of Asset Packages) and enabled their intelligent transfer across projects.
Sync also allowed two projects to stay in partial synchronisation: if your director wanted a VR project running alongside, your main project was no longer affected.
This proved helpful in transferring assets cross-project without any tedium.
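The discovery building block is available in the AssetDatabase; a sketch of that step only, with the real tool layering its partial-synchronisation logic on top:

```csharp
using UnityEditor;

public static class SyncSketch
{
    // Sketch of the discovery step: collect everything an asset pulls in,
    // then export the set for the sibling project. The real tool went
    // further, tracking what each project already had.
    public static void Transfer(string assetPath, string packagePath)
    {
        string[] dependencies = AssetDatabase.GetDependencies(assetPath, true);
        AssetDatabase.ExportPackage(dependencies, packagePath,
                                    ExportPackageOptions.Default);
    }
}
```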
We utilised the Package Manager to streamline deployment. The modular structure allowed users to pick and choose what elements of the pipeline they needed for their project.
The unification of these systems delivered a streamlined environment for fast, iterative production of creative content.
The consistent UI and UX meant that developers already familiar with Unity quickly became accustomed to working with the pipeline's systems.
The pipeline has been used by teams creating a broad spectrum of creative content:
• ABC's Minibeast Heroes - Six TV episodes and one 360 video, 2m30s each. Rendered with V-Ray, with one shot rendered in Unity using Octane. Shooting took 4 days, and live-edit roughcuts were produced on-stage.
• "Snapcracker" - An internal short developed with external IP, under NDA. The 3m50s final animation was produced in 2 weeks and rendered with Unity HDRP.
• The Adventures of Aunty Ada - A TV pitch by Deakin Motion Lab.
• Public demos at ACMI and Screen Forever.
Creating a linking colour scheme and icon set proved extremely valuable: giving each area of the pipeline its own swatch, and applying it to windows, inspectors, icons, and documentation, greatly sped up how people acquired information and worked across systems.
Unifying windows and systems into one was rarely a positive; instead, provide quick methods to switch contexts.
Very early on we defined a flexible set of rules for creating folder structures that could be iterated with shots and takes. The loading system we implemented piggybacked off this hierarchy to do its work.
This system relied heavily on the AssetDatabase and folder structures, and build-time usage of the pipeline was never explored. The system served our needs perfectly, and went above and beyond in its flexibility and extensibility. We had shortcut assets that could point to other content, and the inspector UI would display how and why this content would be loaded.
With Addressables entering the fray, it's likely that asynchronous loading and an asset-focused approach would be appropriate for the next evolution of such a system. An assetised approach for loading a set of varied content in a user-defined manner could also be valuable outside of virtual production.
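A sketch of what that asset-focused loading could look like with Addressables (none of this shipped; the label and logging are illustrative):

```csharp
using UnityEngine;
using UnityEngine.AddressableAssets;

public static class AddressableLoading
{
    // Load a user-defined set of content by label, asynchronously, with no
    // dependence on folder layout or the Editor-only AssetDatabase.
    public static void LoadSet(string label)
    {
        var handle = Addressables.LoadAssetsAsync<GameObject>(label, null);
        handle.Completed += h =>
            Debug.Log($"Loaded {h.Result.Count} assets for '{label}'");
    }
}
```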
The system used for loosely binding in-scene Objects to their Timeline Track counterparts was the design I iterated on most. The main goals driving that iteration were visible functionality and operational clarity.
With a complex system that has many embedded layers, it's important to both simplify operation, and also enable understanding of back-end implications.
In the end the UI gave a broad overview of the composition of the elements, whilst still allowing selective expansion of any relevant element. It was a task to keep this UI from becoming monolithic or muddled, but iterating on a set of visual principles that became unified across the whole pipeline meant the inspector is one I'm proud to show.
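The core trick of loose binding is small, even if the surrounding UI was not; a sketch under assumed matching rules:

```csharp
using System.Linq;
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

public static class LooseBinding
{
    // Match animation tracks to scene objects by name instead of holding
    // hard scene references, so the same Timeline survives scene reloads
    // and take duplication. The real matching rules were richer than this.
    public static void Rebind(PlayableDirector director, TimelineAsset timeline)
    {
        foreach (var track in timeline.GetOutputTracks().OfType<AnimationTrack>())
        {
            GameObject match = GameObject.Find(track.name);
            if (match != null)
                director.SetGenericBinding(track, match.GetComponent<Animator>());
        }
    }
}
```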
You can't build this sort of tooling without frequently running into Unity's issues.
I made many systems to disable or augment built-in functionality.
I could go on for pages about major and minor decisions that had impacts on workflow and how these systems were planned and developed, but I think I'll leave it there for now.