News · 2021 · 3 min read
disguise democratises virtual production with new solutions for integrating Unreal Engine
The latest upgrade to the disguise software addresses the continued demand for high quality, immersive content by enhancing its integration of Unreal Engine.
The latest release of our software delivers never-before-seen detail and frame rates across extended reality (xR) and virtual production.
2020 saw us become a major player in virtual production for film and television and a market leader in xR, powering more than 200 xR productions in 35 countries across broadcast, corporate and education spaces.
The latest xR software features include:
- ACES colour management
- DirectX12 (DX12)
- Nvidia Deep Learning Super Sampling (DLSS) support
In combination, these features allow users to deliver images of the highest quality via 10-bit HDR processing of Unreal content through disguise RenderStream, a proprietary IP protocol that allows Unreal content to be hosted on dedicated disguise hardware.
Download our latest software release
“This approach of scale-out rendering in conjunction with features such as DX12 support, DLSS and the ACES colour management workflow allows virtual production customers to pursue truly photorealistic scenes and content without compromising on creative delivery,” says Joe Bleasdale, Product Manager at disguise.
ACES (the Academy Color Encoding System) is one of the leading colour management systems used in the film and VFX space. Its integration means users can now access its powerful workflow end-to-end through the disguise system, enabling unparalleled colour quality for motion picture images regardless of source.
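To make the colour-management idea concrete, here is a minimal sketch of one step in an ACES-style workflow: converting a linear-light sRGB pixel into the ACES2065-1 (AP0) colour space using the commonly published Bradford-adapted conversion matrix. A full ACES pipeline also involves input and output transforms (IDTs/ODTs), which are omitted here.

```python
# Illustrative sketch, not the disguise implementation: applying the
# commonly published linear-sRGB -> ACES2065-1 (AP0) 3x3 conversion matrix.
SRGB_TO_ACES2065_1 = [
    [0.4397010, 0.3829780, 0.1773350],
    [0.0897923, 0.8134230, 0.0967616],
    [0.0175440, 0.1115440, 0.8707040],
]

def srgb_linear_to_aces(rgb):
    """Multiply a linear-light RGB triple by the conversion matrix."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_ACES2065_1)

# Each matrix row sums to ~1.0, so reference white (1, 1, 1) stays
# (very nearly) white after conversion - a quick sanity check that the
# transform preserves the white point.
print(srgb_linear_to_aces((1.0, 1.0, 1.0)))
```

Working in a wide-gamut, scene-referred space like AP0 is what gives colourists the extra dynamic range mentioned in the quote below.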
“In my last shoots, I’ve been working with the ACES workflow, and the results have been really good. This feature is a great tool for all of us in the production process - the VFX team has less work to transform the plates, and I have more dynamic range to adjust in the screens. I've been doing some tests with a colourist to compare, and the conclusion is that it is working perfectly.”
David Monguet, CTO, MO+MO Film Services
disguise and Nvidia’s Deep Learning Super Sampling
We’ve also implemented a pipeline that supports Unreal Engine running DLSS - an AI rendering technology from Nvidia that renders content at a lower resolution and then up-samples it using a GPU-accelerated deep learning model to reconstruct the image at the target resolution. The use of DLSS was pioneered by disguise partner Orca Studios, a virtual production and VFX studio in Barcelona.
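A back-of-the-envelope calculation shows why this saves so much GPU work: shading cost is roughly proportional to the number of pixels rendered, so rendering internally at a lower resolution and reconstructing to the target resolution shades only a fraction of the pixels. The 1.5x-per-axis scale factor below is illustrative (it corresponds to, for example, rendering at 2560x1440 and reconstructing to 3840x2160).

```python
# Illustrative arithmetic only - actual DLSS scale factors depend on the
# quality mode chosen in Unreal Engine.
def rendered_pixel_fraction(target_w, target_h, scale_per_axis):
    """Fraction of target pixels actually shaded at the internal resolution."""
    internal_w = target_w / scale_per_axis
    internal_h = target_h / scale_per_axis
    return (internal_w * internal_h) / (target_w * target_h)

frac = rendered_pixel_fraction(3840, 2160, 1.5)
print(f"Pixels actually shaded: {frac:.0%} of the 4K target")  # -> 44%
```

Shading well under half the pixels is what frees up the headroom for expensive effects such as ray-traced global illumination.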
“We adopted an early implementation of DLSS into Unreal Engine, which allows us to get a critical boost in performance in the more demanding scenes with ray tracing global illumination that would otherwise not hit the noise and performance limits necessary for high-end productions. The disguise integration of DLSS and Unreal Engine made this a seamless process.”
Adrian Pueyo, VFX Supervisor, Orca Studios
Our product team at disguise have also added DirectX12 support for Unreal Engine, unlocking advanced rendering features such as ray tracing. Users can now capture high-quality reflections, refractions and accurate shadows to deliver best-in-class photorealistic content. These new features enhance current virtual production workflows and pave the way for more exciting advances to come.
“Our users are always looking for ways to deliver images of the highest quality, highest detail and highest frame rate, but this is often constrained by the finite GPU power of their rendering system,” says Peter Kirkup, Global Technical Solutions Manager at disguise. “Our next release will introduce users to cluster rendering, an integration solution for Unreal’s nDisplay. With this, we aim to do two things – simplify the configuration of render clusters and separate out the render clusters from the final pixel delivery machines. By separating those two, we are able to scale them independently so you can add more render nodes for more render power or add more output nodes if you just have a bigger canvas.”
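The scale-out idea Kirkup describes can be sketched as a toy model - this is not disguise's actual implementation, and the node names and functions are hypothetical - in which render nodes and output nodes form separate pools that scale independently: frames are spread across render nodes for more render power, while the final canvas is split across output nodes to cover a bigger surface.

```python
# Toy model of independently scalable render and output pools
# (hypothetical sketch, not the disguise/nDisplay implementation).

def assign_frames(frame_ids, render_nodes):
    """Distribute frames round-robin across the render pool."""
    return {f: render_nodes[i % len(render_nodes)] for i, f in enumerate(frame_ids)}

def split_canvas(canvas_width, output_nodes):
    """Give each output node an equal horizontal slice of the canvas."""
    slice_w = canvas_width // len(output_nodes)
    return {node: (i * slice_w, (i + 1) * slice_w) for i, node in enumerate(output_nodes)}

# Add render nodes for more render power...
work = assign_frames(range(6), ["render-1", "render-2", "render-3"])
# ...or add output nodes for a bigger canvas - the two pools scale separately.
slices = split_canvas(7680, ["out-1", "out-2"])
print(work)
print(slices)  # each output node drives half of a 7680-px-wide canvas
```

Because the two pools are decoupled, doubling the canvas width only requires more output nodes, and a heavier scene only requires more render nodes.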