Key considerations
Rendering images and video with the Unreal Engine for non-interactive use is becoming increasingly popular in industries such as automotive, architectural visualisation, and film and television. Unreal Engine containers make it possible to perform these rendering tasks at scale in the cloud.
You will need to start the Unreal Editor with the -RenderOffscreen command-line flag in order to perform offscreen rendering inside a container.
The Unreal Engine does not currently support raytracing using Vulkan on Linux. If your use case requires real-time raytracing, then you will need to use GPU accelerated Windows containers.
Rendering video frames with Sequencer using Vulkan on Linux renders correctly but currently prints a series of non-fatal errors to the Engine’s log output. These errors do not appear to interfere with the quality of the rendered output.
If you are rendering individual images on demand and transmitting them to users in response to incoming requests then you might be interested in exposing this functionality directly via an Unreal-powered microservice.
Implementation guidelines

Rendering individual images
The framebuffer can be captured programmatically from within the Unreal Engine itself, either via the HighResShot console command or the classes from the MovieSceneCapture module. How you choose to trigger the image capture will depend on the specific details of your use case.
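As a rough illustration, a capture could be triggered from gameplay code by issuing the HighResShot console command through the engine's console command API. This is a sketch rather than a complete capture pipeline: AMyCaptureActor is a hypothetical actor class, and the resolution is an arbitrary example value.

```cpp
// Sketch: trigger a high-resolution screenshot from inside the engine.
// AMyCaptureActor is a hypothetical actor class; the HighResShot
// console command itself is part of the stock Unreal Engine.
#include "Kismet/KismetSystemLibrary.h"

void AMyCaptureActor::CaptureFrame()
{
    // Capture the framebuffer at 1920x1080; the engine writes the
    // resulting image to the project's Saved/Screenshots directory.
    UKismetSystemLibrary::ExecuteConsoleCommand(
        GetWorld(), TEXT("HighResShot 1920x1080"), nullptr);
}
```

For more control over output format and timing, the classes from the MovieSceneCapture module mentioned above provide a programmatic alternative to the console command.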
Rendering video frames
Cinematic sequences created using the Unreal Engine's Sequencer system can be rendered to a set of video frames from the command-line using the flags described in the Command Line Arguments for Rendering Movies page of the official Unreal Engine documentation. These same flags also work inside Unreal Engine containers, albeit with the necessary addition of the -RenderOffscreen flag.
As an example, the following commands can be run from your Unreal project's root directory (the one containing the .uproject file) to render a video sequence inside a GPU accelerated Linux container:
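A minimal sketch of such an invocation is shown below. The container image name, the Engine installation path, and the Level Sequence asset path are all placeholders that will differ in your environment; the Sequencer flags themselves are the ones documented on the Unreal Engine page referenced above.

```bash
# Sketch: render a Level Sequence to PNG frames inside a GPU accelerated
# Linux container. The image name, Engine path, project name and sequence
# asset path are placeholders for this example.
docker run --rm --gpus=all -v "$(pwd):/project" \
  "example/unreal-engine:latest" \
  /home/ue4/UnrealEngine/Engine/Binaries/Linux/UE4Editor \
  /project/MyProject.uproject \
  -game -NoLoadingScreen -RenderOffscreen \
  -MovieSceneCaptureType="/Script/MovieSceneCapture.AutomatedLevelSequenceCapture" \
  -LevelSequence="/Game/Sequences/MySequence" \
  -MovieFolder="/project/Frames" \
  -MovieFormat=PNG -MovieFrameRate=30 \
  -ResX=1920 -ResY=1080 -ForceRes
```

The bind-mounted project directory receives the rendered frames, which can then be encoded into a video file with an external tool once the container exits.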