Real-Time Compositing Workflow for LED Volume Set Extensions

Research Team

Kevin Santos
Spencer Idenouye
James Rowan
Emerson Chan

Partners

StradaXR
NRC IRAP
CTO

Impact

  • Unreal Engine 5 Virtual Production set extension
  • Live compositing
  • Camera calibration
  • Color accuracy evaluations

Developing Advanced Workflows for Live Object Compositing and Virtual Set Extensions in LED Volumes

StradaXR wanted to explore live compositing techniques inside Unreal Engine 5 to enable LED wall set extensions during virtual productions. The goal was to establish a robust pipeline using Unreal Engine 5’s nDisplay and Composure plugins, integrated with OptiTrack for accurate camera and object tracking, to achieve realistic set extensions. The methodology involved developing the integrated workflow, rigorous testing to ensure real-time performance and visual fidelity, and comprehensive documentation and training. Key challenges included optimizing camera tracking accuracy, minimizing system latency, and maintaining color consistency across the pipeline.

The Need for Advanced Real-Time Compositing in Virtual Production

The film and virtual production industries are constantly seeking ways to enhance visual effects workflows, particularly within immersive LED volume environments. Live object compositing, often referred to as Real-time Compositing (RTC), is a key area for development, enabling the seamless blending of physical objects and actors with virtual environments and set extensions. This capability is crucial for creating realistic augmented reality (AR) experiences directly in camera during live shoots.

This project aimed to push the boundaries of real-time virtual production techniques by establishing a robust workflow and pipeline for live object compositing. The primary goal was to integrate essential tools like Unreal Engine’s nDisplay and Composure plugins with the OptiTrack motion capture system to achieve realistic set extension under live shot conditions. A secondary but vital goal was to maintain real-time performance and high visual fidelity, so that live objects blend seamlessly with the virtual environments displayed on the LED volume.

Implementing an Integrated Unreal Engine Pipeline

The project utilized an in-camera visual effects (ICVFX) pipeline within Unreal Engine 5. This pipeline integrated several key technologies:

  • Unreal Engine 5: The core platform for virtual environment rendering and compositing.
  • nDisplay: Utilized for rendering virtual environments across the LED volume display surfaces.
  • Composure: Unreal Engine’s real-time compositing plugin used to blend live video feeds and virtual elements.
  • OptiTrack: A motion capture system integrated to provide accurate camera and object tracking data.
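
The interaction between these components can be summarized as a per-frame loop: OptiTrack supplies a tracked camera pose, nDisplay renders the virtual environment from that pose, and Composure blends the live camera feed with the rendered set extension. The Python sketch below is a minimal, engine-agnostic illustration of that loop; the class, function, and method names are hypothetical stand-ins for the corresponding Unreal Engine and OptiTrack interfaces, not part of the delivered pipeline.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """One tracked camera sample (hypothetical OptiTrack data)."""
    position: tuple   # (x, y, z) in stage space
    rotation: tuple   # (pitch, yaw, roll) in degrees
    timecode: str     # LTC timecode of the sample, e.g. "01:02:03:04"

def render_set_extension(pose: CameraPose):
    """Stand-in for the nDisplay render of the virtual environment
    from the tracked camera's point of view."""
    ...

def composite(live_plate, virtual_extension):
    """Stand-in for the Composure layer stack: the live camera feed
    blended with the rendered set extension."""
    ...

def per_frame_update(tracker, camera_feed):
    pose = tracker.latest_pose()                  # hypothetical tracking client call
    extension = render_set_extension(pose)        # virtual environment for this pose
    plate = camera_feed.frame_at(pose.timecode)   # live plate matched by timecode
    return composite(plate, extension)            # blended output for the frame
```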


The development process followed a structured milestone approach:

  1. Workflow Design and Pipeline Development:
    This milestone covered the integration design for nDisplay, Composure, and OptiTrack: defining processes for bringing physical objects into virtual environments, identifying the necessary software and hardware components, selecting synchronization configurations such as LTC/genlock, establishing real-time compositing requirements, implementing the OptiTrack integration with Unreal Engine 5, and evaluating lens calibration for accurate tracking (see the lens-distortion sketch after this list). An environmental scan also compared Composure against other compositing tools such as Nuke.
  2. Integration and Testing:
    The developed pipeline was integrated into the existing Unreal Engine 5 environment and verified for compatibility. Rigorous testing confirmed seamless operation, focusing on real-time object tracking, set extension, and AR compositing on the LED volume. Stress tests evaluated performance under various conditions, assessing latency, frame rate, rendering artifacts, and compositing fidelity. This phase also established timecode and genlock synchronization to reduce latency between the camera and Unreal Engine (a timecode-offset sketch follows this list).
  3. Documentation and Training:
    The entire workflow and pipeline were comprehensively documented, including setup procedures, troubleshooting guides, and best practices, with a specific focus on using OptiTrack for set extension. Training sessions were provided to key personnel to ensure proficiency in operating the integrated nDisplay, Composure, and OptiTrack setup.
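
Lens calibration was one of the evaluation points in the workflow design stage. As an illustration of what such a calibration produces, the sketch below applies the widely used Brown-Conrady radial/tangential distortion model to a normalized image coordinate; the coefficient values are placeholders for illustration only, not measured results from this project.

```python
def distort_point(x, y, k1, k2, k3, p1, p2):
    """Apply a Brown-Conrady distortion model to a normalized image
    coordinate (x, y). k1..k3 are radial coefficients; p1, p2 are
    tangential coefficients."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# Placeholder coefficients; a real calibration solves for these per lens
# and focus/zoom setting, so that tracked CG lines up with the plate.
print(distort_point(0.25, 0.10, k1=-0.12, k2=0.03, k3=0.0, p1=0.001, p2=-0.0005))
```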
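
For the latency and synchronization tests, one useful measurement is the offset, in frames, between the timecode stamped on a live camera frame and the timecode of the tracking data used for the matching composite. The sketch below shows that arithmetic for a 24 fps LTC timecode; the function names are illustrative and assume a non-drop-frame format.

```python
def timecode_to_frames(tc: str, fps: int = 24) -> int:
    """Convert an 'HH:MM:SS:FF' LTC timecode string to a frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def offset_in_frames(camera_tc: str, engine_tc: str, fps: int = 24) -> int:
    """Frames by which the engine-side data lags the camera frame.
    A small, stable offset can be compensated with a fixed delay;
    a drifting offset points at a genlock/timecode problem."""
    return timecode_to_frames(camera_tc, fps) - timecode_to_frames(engine_tc, fps)

# Example: the composite used tracking data two frames older than the plate.
print(offset_in_frames("01:00:10:05", "01:00:10:03"))  # -> 2
```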


Throughout the project, challenges related to achieving consistent lens distortion calibration, managing positional skating, and minimizing latency were addressed through dedicated camera testing and synchronization steps. Color management for the LED wall was also a critical component: the team explored baseline configurations, custom calibration using tools like OpenVPCal, and Unreal Engine’s built-in color correction tools to ensure consistency and facilitate the blend between virtual and physical elements. Best practices for OptiTrack tracking, including camera focus, exposure, and marker configurations (passive vs. active), were also evaluated.
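
One common output of an LED wall calibration pass of this kind is a 3x3 color matrix plus per-channel gains applied to the content sent to the wall. The sketch below shows the shape of that correction applied to a linear RGB value; the matrix and gain numbers are invented for illustration and would in practice come from a measured calibration (for example, an OpenVPCal session), not from this write-up.

```python
# Hypothetical calibration output: a 3x3 color matrix and per-channel gains.
CALIBRATION_MATRIX = [
    [0.98, 0.015, 0.005],
    [0.02, 0.960, 0.020],
    [0.00, 0.010, 0.990],
]
CHANNEL_GAINS = (1.00, 0.97, 1.02)

def correct_pixel(rgb):
    """Apply the 3x3 matrix, then per-channel gain, to a linear RGB triplet."""
    mixed = [
        sum(CALIBRATION_MATRIX[row][col] * rgb[col] for col in range(3))
        for row in range(3)
    ]
    return tuple(value * gain for value, gain in zip(mixed, CHANNEL_GAINS))

# An 18% grey test patch before and after correction.
print(correct_pixel((0.18, 0.18, 0.18)))
```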

Project Outcomes, Deliverables, and Our Findings

This project successfully established a workflow and pipeline integrating nDisplay, Composure, and OptiTrack for real-time object compositing, contributing valuable insights to virtual production techniques, including best practices for passive versus active tracking markers.

The SIRT team implemented and showcased the project results at the StradaXR studio, and delivered in-depth documentation so that StradaXR can continue using the developed workflow beyond the project duration.