
UE5.1 adds M&E tools to an impressive UE5 release


The majority of the improvements in the UE5.0 release focused on Epic's core gaming business, but with version UE5.1, Epic Games offers a suite of media and entertainment (M&E) tools, especially in the area of Virtual Production (VP).

Virtual Production

With over 300 LED VP stages now in operation, of which roughly 50 are high-end VFX stages and another 50 are high-end broadcast stages, the UE5.1 release offers some serious improvements. We spoke to the Epic team to get the latest details and a clearer view of the future from the direction UE5.1 is setting.

For example, in this release there is now an MVP version of Lumen, providing indirect lighting and shadowing based on just the lights or the position of bounce cards. Previously, stage crews required a baking step that could pause production, interrupting the creative flow. Currently, the number of lights supported is modestly estimated at about five to seven lights in total, depending on the graphics cards (an estimate based on the Epic team's own production setup of dual A6000 cards). While this is helpful, the direction Epic has set in allowing real-time Lumen lighting on stages is impressive, and a more extensive Lumen solution will inevitably make VP more productive.

There is also an improved DMX ecosystem in UE5.1, with enhanced support for the MVR format covering fixtures, plots, and patches, for cases when both Unreal Engine and a lighting console need to share DMX data. Combined with the DMX GDTF support introduced in UE5, this allows physical and digital lighting to be effectively controlled as one.
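Under the hood, MVR is an open interchange format: an .mvr file is a ZIP archive containing a GeneralSceneDescription.xml that describes the fixtures, their GDTF types, and their DMX patch. As a rough illustration of the data being exchanged between a console and Unreal (a generic sketch of the published spec, not Epic's importer), a few lines of Python can list the fixtures in a plot:

```python
# Rough sketch: list fixtures in an MVR file, which is a ZIP archive
# holding GeneralSceneDescription.xml per the open MVR specification.
# This illustrates the data being shared, not Unreal's actual importer.
import zipfile
import xml.etree.ElementTree as ET

def list_mvr_fixtures(mvr_path: str) -> None:
    with zipfile.ZipFile(mvr_path) as archive:
        with archive.open("GeneralSceneDescription.xml") as f:
            root = ET.parse(f).getroot()
    # Each Fixture node carries a name, a linked GDTF fixture type,
    # and its absolute DMX patch address.
    for fixture in root.iter("Fixture"):
        name = fixture.get("name")
        gdtf = fixture.findtext("GDTFSpec")
        address = fixture.findtext("Addresses/Address")
        print(f"{name}: {gdtf} @ DMX {address}")

list_mvr_fixtures("stage_plot.mvr")  # hypothetical file name
```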

Color Correction On Set

There is a new dedicated In-Camera VFX Editor that supports a range of VP workflows. A corresponding iOS application that mirrors the editor's functionality, but with a UI adjusted for touch interaction, is planned for release within the next few weeks. You can also construct powerful custom browser-based remote controls more quickly and easily. It is easy to imagine people developing set-dressing tools or specialist interfaces for controlling interactive elements.
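Those browser-based tools sit on top of Unreal's Remote Control API, which exposes an HTTP endpoint (port 30010 by default) that any web page or script can use to get and set properties on objects in the level. As a hedged sketch, with a purely hypothetical object path, driving a stage light's intensity from Python looks something like this:

```python
# Sketch of setting a property over Unreal's Remote Control HTTP API
# (enabled via the Remote Control plugin, default port 30010).
# The object path below is a hypothetical placeholder.
import json
import urllib.request

def set_remote_property(object_path: str, property_name: str, value) -> int:
    payload = json.dumps({
        "objectPath": object_path,
        "access": "WRITE_ACCESS",
        "propertyName": property_name,
        "propertyValue": {property_name: value},
    }).encode()
    req = urllib.request.Request(
        "http://localhost:30010/remote/object/property",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# e.g. dim a key light on the stage (hypothetical actor path):
set_remote_property(
    "/Game/Stage/Main.Main:PersistentLevel.KeyLight.LightComponent0",
    "Intensity",
    5000.0,
)
```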

The new editor hosts an interface to an improved Light Card system, displayed as a preview of the nDisplay wall. This allows a range of visualizations of the stage and the ability to save templates. How soon until a DOP and gaffer turn up on set with their 'personal custom tools' or 'lighting kit' on a USB stick?

The stage can be viewed in many projections, including lat-long, as shown above.

Color corrections (CC) are critical to VP stage workflows, as they allow for perfect blending between the LED panels and the physical set. The In-Camera VFX Editor has new Color Correction Windows (CCWs) that enable adjustments to be applied to anything on the stage. They are similar to DaVinci Resolve Power Windows (without the secondaries). They provide the ability to apply color corrections volumetrically, or to a 2D region coupled with a depth range. Epic is also apparently working with third-party CC panel manufacturers to allow a brain bar on the stage to control the CC with a Tangent CC panel.
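Conceptually, a CCW is just a spatial mask on the grade. As a toy illustration only (a NumPy stand-in, not Epic's implementation), applying a gain inside a 2D window coupled with a depth range looks like this:

```python
# Toy illustration of a region-plus-depth color correction window:
# apply a gain only where a 2D window and a depth range both hold.
# NumPy stand-in for the concept, not Unreal's actual CCW code.
import numpy as np

def apply_ccw(rgb, depth, window, depth_range, gain):
    """rgb: HxWx3 float image; depth: HxW scene depth;
    window: (x0, y0, x1, y1) pixel rect; depth_range: (near, far)."""
    x0, y0, x1, y1 = window
    near, far = depth_range
    mask = np.zeros(depth.shape, dtype=bool)
    mask[y0:y1, x0:x1] = True                   # 2D window region
    mask &= (depth >= near) & (depth <= far)    # coupled depth range
    out = rgb.copy()
    out[mask] *= gain                           # simple primary gain
    return out

# e.g. warm up only the mid-ground set piece inside the window:
frame = np.random.rand(1080, 1920, 3)
z = np.random.rand(1080, 1920) * 100.0
graded = apply_ccw(frame, z, (600, 200, 1300, 900), (10.0, 30.0),
                   np.array([1.1, 1.0, 0.9]))
```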

VCam

UE5.1's virtual camera system has been overhauled with a new underlying framework that utilizes Epic's Pixel Streaming technology for improved responsiveness and reliability, and an updated UI with a modern camera-focused design that will be more familiar to camera operators. Pixel Streaming means people can connect using any modern web browser on their platform of choice, such as a tablet or mobile phone, and stream the rendered frames and audio directly from UE5. There is no need for users to install or download anything; it is just like streaming a video from YouTube or Netflix, except that users can also interact with the application using touch input and even custom HTML5 UI pages. The UE5.1 tools allow for much better 'VR scouting', for example, and incorporate new and improved UMG widgets. One can now imagine a DOP operating a virtual camera while an assistant stands beside them, pulling focus via a separate iPhone.

Animation

Now in Beta, the Machine Learning (ML) Deformer generates high-fidelity approximations of nonlinear deformers, complex proprietary rigs, or any arbitrary deformation by using a custom Maya plugin to train a machine learning model, which in turn runs in real time in UE5.1. This enables you to simulate film-quality deformations, such as flexing muscles, bulging veins, and sliding skin.

The ML Deformer in UE5.1 approximates any complex deformation model that can be captured externally, greatly improving the quality of real-time characters. The ML Deformer internally leverages Unreal's Neural Network Inference (NNI) system as well as the Deformer Graph system to keep everything on the GPU.

By replacing the previous Vertex Delta Model with the Neural Morph Model, trained deformations take up significantly less memory while maintaining high-quality results, often just a few megabytes of additional data for a character. The model generates blendshapes at training time and drives the blendshape coefficients at runtime, rather than having the runtime neural net compute every vertex. This improvement brings performance gains as well.
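To make that concrete, here is a minimal conceptual sketch of a Neural-Morph-style deformer (our illustration, not Epic's code): a tiny network maps the pose to a handful of blendshape coefficients, and the deformed mesh is then a cheap linear blend of morph targets generated at training time, so no per-vertex network evaluation is needed at runtime:

```python
# Conceptual sketch of a Neural-Morph-style deformer (not Epic's code):
# a small MLP predicts blendshape coefficients from the pose, and the
# mesh is deformed by blending morph targets baked at training time.
import numpy as np

rng = np.random.default_rng(0)
NUM_VERTS, NUM_MORPHS, POSE_DIM = 20_000, 64, 3 * 52  # illustrative sizes

# Stored with the character after training (a few MB of morph data,
# versus per-vertex deltas for every training pose):
base_mesh = rng.standard_normal((NUM_VERTS, 3))
morph_targets = rng.standard_normal((NUM_MORPHS, NUM_VERTS, 3)) * 0.01

# Tiny runtime MLP: pose (joint rotations) -> morph coefficients.
W1 = rng.standard_normal((POSE_DIM, 128)) * 0.1
W2 = rng.standard_normal((128, NUM_MORPHS)) * 0.1

def deform(pose: np.ndarray) -> np.ndarray:
    hidden = np.maximum(W1.T @ pose, 0.0)   # ReLU layer
    coeffs = W2.T @ hidden                  # blendshape coefficients
    # Cheap linear blend of morphs; no per-vertex network evaluation.
    return base_mesh + np.tensordot(coeffs, morph_targets, axes=1)

verts = deform(rng.standard_normal(POSE_DIM))
print(verts.shape)  # (20000, 3)
```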

The reality is that most VP stages will not have the processing headroom to introduce a lot of character animation into the mix, but clearly, this is what both Epic and their customers want. Pixomondo did achieve this in their very specific application of a crowd in the Caledon spot we featured recently, and hopefully, as GPUs and technology advance, more such projects will be possible.

See our fxguide story on Pixomondo’s VP crowd solution

Sequencer

Sequencer, Unreal Engine's multi-track nonlinear animation editor, has added much-requested support for constraints, including Position, Rotation, and Look-at. With these, an animator can quickly and easily create and animate relationships between any Control Rigs or Actors: for example, making a camera always follow a character; keeping a character's hands on a steering wheel; animating a clown juggling balls; or constraining a cowboy's hips so that he sits naturally in the saddle as the horse moves, while his hands hold the reins.
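For a sense of what a Look-at constraint computes each frame, here is a toy version of the underlying math (illustrative only, not Unreal's solver): the constrained object's rotation is re-derived every frame from the target's animated position.

```python
# Toy look-at constraint (illustrative math, not Unreal's solver):
# orient a camera so its forward axis points at a target each frame.
import numpy as np

def look_at_rotation(eye, target, up=np.array([0.0, 0.0, 1.0])):
    fwd = target - eye
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(up, fwd)
    right = right / np.linalg.norm(right)
    true_up = np.cross(fwd, right)
    # Columns are the forward/right/up basis of the new rotation.
    return np.stack([fwd, right, true_up], axis=1)

# Each frame the constraint recomputes the camera rotation from the
# character's animated position, so the camera "always follows":
cam_rot = look_at_rotation(np.array([0.0, -5.0, 2.0]),
                           np.array([0.0, 0.0, 1.7]))
```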

Sequencer also sees additional functionality exposed through Blueprint and Python scripting, along with new UI/UX for increased stability and extensibility, and improved animation authoring and editing workflows.
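As a taste of that scripting surface, here is a hedged sketch using the established MovieScene Python API in the Unreal editor (the asset path is hypothetical, and the new 5.1 constraint-specific calls may differ from what is shown being set up here):

```python
# Hedged sketch of Sequencer Python scripting inside the Unreal editor.
# The asset path is hypothetical; this uses the established binding and
# track API rather than the new 5.1 constraint-specific calls.
import unreal

sequence = unreal.load_asset("/Game/Cinematics/ShotA")  # a LevelSequence
actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]

# Bind the actor into the sequence and give it an animatable
# transform track with one section spanning frames 0-240.
binding = sequence.add_possessable(actor)
track = binding.add_track(unreal.MovieScene3DTransformTrack)
section = track.add_section()
section.set_range(0, 240)
```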

Future Glimpses

New broadcast, production, and distribution data flexibility

One of the future approaches beyond UE5.1 to achieve greater performance is to allow the inner view frustum to be generated by a dedicated card, versus the current approach of dividing the panels across the GPU cards. Using new media-over-IP protocols such as SMPTE ST 2110, and as-yet-unreleased features, Epic will be able to put direct and dedicated processing power where the camera sees it.

LED panels

Naturally, stage quality is driven by the quality of the LED panels themselves. Here too there are exciting improvements on the horizon, beyond simple three-LED RGB screens. Current panels are not able to produce the consistency of spectral response that a team might want, especially in the reds, which is naturally where many skin tones land technically. But away from Epic's announcement, Kino Flo has devised the Mimik VR, which is effectively a lighting panel fed by a video signal that can light the foreground correctly in terms of spectral response. Shown at IBC in September, the 10mm-pitch LED panels can output 10,000 nits, roughly ten times that of a normal LED panel. Naturally, with a coarse 10mm pitch, they are currently of more use as lighting sources than as in-camera VFX panels, but again they offer a tempting glimpse of what might be coming. The Mimik VR has a very high refresh rate (31kHz) for shooting at high frame rates (~240fps), and it provides 16-bit HDR support via synchronized Megapixel VR Helios LED processors.

Lumen at 60fps

One of the main indicators of future VP is the performance UE5.1 provides on gaming consoles, outside of VP. Below is a clip rendered with Lumen showing dynamic global illumination and reflections. It demonstrates the Nanite virtualized micropolygon geometry system and Virtual Shadow Maps (VSM), currently designed to support games running at 60fps on next-gen consoles and capable PCs: insanely fast-paced graphics and detailed simulations running without latency. One can only dream of this level of performance on a VP stage… but dream we all do.

(Note: the playback on our site is at 30fps)

More on VP at Pixomondo

Pixomondo recently discussed scanning for Star Trek: Strange New Worlds at Unreal Fest '22. This was not done with UE5.1, but it shows their pipeline and use of UE in Virtual Production. It provides unique insights into how their team uses LIDAR and photogrammetry techniques to blend VFX and practical elements into seamless images.