image14-virtual-production-explained.jpg

The stage is your world: virtual production technology explained


Virtual production is transforming movie making — but what is it, and how does it work? Join us for a deep dive into the ultimate combo of CG and live-action.

 

  • Virtual production combines real-time CGI, in-camera visual effects, and live-action filmmaking.
  • Technologies like LED walls, motion capture, and virtual cinematography create immersive environments.
  • This approach is used in major productions such as The Mandalorian and Avatar: The Way of Water, and Chaos' own short "Ray Tracing FTW."


Virtual production is a hot topic in filmmaking — but it’s also an evolution of movie magic that’s as old as the (Hollywood) hills. If you’ve seen a recent movie or television series, big or small, there’s a high chance that it used virtual production technology for some of its shots.

As its name suggests, virtual production is the coming together of two components: virtual computer-generated three-dimensional places, props, and even characters, and the production process of making a movie or television series, which is the point at which actors and scenes are filmed. This virtual production combo allows filmmakers to explore and capture computer-generated (CG) worlds with the same techniques they’re used to in real-world shoots.

image11-virtual-production-explained.jpg
The virtual production stage for "Ray Tracing FTW"

The evolution of virtual production

As we’ll find, the virtual production process has its roots in some of the earliest techniques in filmmaking. The term virtual cinematography was first coined by John Gaeta, the VFX supervisor on The Matrix, to describe that film’s use of computer-generated imagery (CGI) to create a digital world with photorealistic sets and characters.

From here, computing power increased exponentially, and techniques developed that enabled better-quality visual effects. Real-time software and game engines, such as Chaos Vantage and Epic Games’ Unreal Engine, took advantage of leaps in hardware to bring photorealism to interactive rendering. At the same time, movies such as Gravity and Oblivion used computer-controlled light banks and projected backdrops to blur the lines between visual effects and practical sets.

Inevitably, real-time engines found their way into the Hollywood film production process. Fully CG animated movies and effects-heavy films, such as The Lion King and Avatar: The Way of Water, used virtual reality and virtual cameras to set up and interact with 3D environments and characters as if they were on a real set with real cameras, and used performance capture to help actors become animals or aliens in real time.

The final piece of the puzzle is in-camera visual effects (ICVFX). As the name suggests, these are CG visual effects shots that are captured in-camera. We’ll get into this more later, but by coupling powerful software and hardware with state-of-the-art LED walls, it’s possible to generate realistic, interactive backdrops that directors, cinematographers, and actors can react to. Examples of TV series and films that have used a virtual production pipeline include The Mandalorian, The Batman, Dune: Part Two, and House of the Dragon.

Why is virtual production important?

Virtual production solutions, particularly ICVFX, can dramatically alter VFX workflows. A traditional production process involves shooting against a colored screen, typically green or blue, which contrasts with actors’ skin tones. The screen is removed from the shot via a process called chroma keying and replaced with CGI in post-production.

Green- or blue-screen production processes produce seamless results, and they’ve become the dominant way to create VFX shots, giving filmmakers and VFX teams incredible flexibility and control over the minutest details. However, they can also result in long production times and drawn-out conversations between VFX supervisors, directors, and cinematographers as they render and re-render shots.
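
To make the keying step concrete, here is a minimal sketch of the idea, assuming a simple green-dominance test in Python with NumPy; production keyers are far more sophisticated, adding spill suppression and soft edges:

```python
import numpy as np

def chroma_key_matte(frame: np.ndarray, threshold: float = 1.15) -> np.ndarray:
    """Toy green-screen matte: 1.0 keeps the live action,
    0.0 marks green-screen pixels to be replaced with CGI."""
    rgb = frame.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Keep a pixel when green does not clearly dominate red and blue.
    keep = g < threshold * np.maximum(r, b)
    return keep.astype(np.float32)

# Compositing, per pixel:
# final = matte * live_action + (1 - matte) * cg_background
```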

ICVFX turns this process on its head. By shooting on a stage with an LED wall (which can be flat or slightly curved) or an LED volume (which is more immersive), VFX supervisors and artists can create virtual backdrops to film against. What’s more, they can make changes in the digital world that would be impossible on a practical shoot, such as the position of the sun or the location of a mountain.

image3-virtual-production-explained.jpg
Backstage alterations during the shooting of "Ray Tracing FTW"

Directors and cinematographers are given unprecedented flexibility, while actors have a better idea of their location, and aren't just stuck in front of a green screen. But, just like real-world shoots, virtual production requires detailed planning.

“By carefully managing the color and density values of the volume with the live foreground action, I was able to bake in the exact look I wanted. And since there was no guesswork to burden the special effects supervisor, the DI (Digital Intermediate) process was equally invigorating.”

Richard Crudo, Director of Photography, “Ray Tracing FTW”

Pre-production planning in virtual production


Pre-production is the phase in the filmmaking process when all the most important decisions are made. In addition to arranging everything from the cast to the catering, each shot is meticulously storyboarded and visualized based on the intentions of the screenplay, with input from the director, set designer, cinematographer, and VFX artists.

image1-virtual-production-explained.jpg
"Ray Tracing FTW's" pre-production included building an entire Western town to serve as the virtual set

In virtual production, this previsualization step is roughly the same, but the construction of the “set” takes place in a virtual environment instead of a physical space. This is handled by a specific team, called the virtual art department, which collaborates with everyone from VFX supervisors to carpenters to ensure a seamless blend of digital and real.

When the set is created in digital space, virtual scouting can take place. Via this process, filmmakers can explore the space with virtual reality headsets and work out the best angles and shots, just like they would with real-world locations. Movies such as The Jungle Book and Avengers: Endgame used real-time engines for on-set previews of how green screen shots would look in the finished movies.

 

Main virtual production techniques


Use of LED walls

Without good-quality LED screens, ICVFX would fall apart. These fundamentally rely on the same technology that displays imagery on your TV, monitor, or smartphone, with a few key differences:

  • No bezels: LED stages are created by building literal walls or volumes of multiple panels that seamlessly link together. Bezels are obviously a no-go in this environment, so LED screens feature edge-to-edge displays. 
  • Sturdy and versatile: The panels also have to be tough enough to withstand temperamental directors’ boots and versatile enough to mount in various scenarios, from large flat walls to cavernous volumes, where they can form part of the wall or ceiling.
  • Increased brightness: Film sets are surprisingly bright places. A specialized LED screen is much brighter so it's visible on camera, even with studio lights in front of it, and so it can cast realistic lighting and reflections on physical actors and props. It also has a matte finish to avoid unwanted reflections on the screen itself.
  • Excellent viewing angles: Have you ever noticed how cheaper monitors can look washed out when viewed from a certain angle? This effect would ruin a virtual production, so most LED panels are designed to be viewed from any angle.
  • Low pixel pitch: Likewise, the distance between pixels on the screen (known as pixel pitch) must be as low as possible. If the pixel pitch is too high, there’s a risk of a moiré effect when the camera is close to the screen, which can cause unwanted wavy or shimmering lines. Today, 1.8mm is considered a good pitch — but panels used in the production of Tomorrowland in 2015 featured a pitch as high as 11mm! (See the sketch after this list for the rough math.)
  • Color accuracy and calibration: Spot-on color reproduction is essential for seamlessly blending the LED wall or volume with real-world props and actors. For instance, a virtual desert shot might use real sand on set, so it’s crucial that the colors and lighting match accurately.
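
As promised above, here is a quick back-of-the-envelope sketch of the pixel-pitch math, using hypothetical wall dimensions and the common rule of thumb that minimum viewing distance in meters roughly equals pixel pitch in millimeters:

```python
# Hypothetical LED wall: dimensions and pitch are illustrative.
wall_width_m, wall_height_m = 20.0, 6.0
pixel_pitch_mm = 1.8  # distance between LED centers

# Total canvas resolution the render pipeline must fill.
px_wide = int(wall_width_m * 1000 / pixel_pitch_mm)   # ~11,111 px
px_high = int(wall_height_m * 1000 / pixel_pitch_mm)  # ~3,333 px

# Rule of thumb (a heuristic, not a hard optical limit): keep the
# camera at least ~1 m away per 1 mm of pitch to avoid moiré.
min_camera_distance_m = pixel_pitch_mm * 1.0

print(f"Canvas: {px_wide} x {px_high} px")
print(f"Camera no closer than ~{min_camera_distance_m:.1f} m")
```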


Rear projection and virtual sets

Rear projection is nothing new. Those unconvincing shots of actors driving cars against shaky backgrounds in old films used this technology, placing a car in front of a screen onto which footage of a road was projected from behind. Fortunately, it’s come a long way since then — but the fundamental principle is the same.

In an ICVFX production, a car can be placed in front of a moving CG landscape to create a convincing illusion that the vehicle is in motion. However, unlike old rear projection systems, the angle, lighting, and content of the backdrop can be changed if necessary. We used this technique to create the moving steam train footage for “Ray Tracing FTW.”

image13-virtual-production-explained.jpg
The "Ray Tracing FTW" train carriage set

Camera tracking and motion capture

Virtual production enables a new dimension of rear projection — literally. Because virtual environments are created three-dimensionally, they can be synchronized with the camera, enabling parallax movement. If the camera operator moves the camera up, down, or side-to-side, the background moves as it would in reality, with assets closer to the camera moving faster and distant elements moving more slowly. 
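
To get a rough feel for the numbers, here is a tiny sketch assuming a simple pinhole-camera model and made-up values; the point is just that on-screen shift falls off with distance:

```python
# Parallax: for a sideways camera move dx, a point at depth z shifts
# on screen by roughly f * dx / z — near objects move more than far ones.

focal_px = 1200.0    # hypothetical focal length, in pixels
camera_move_m = 0.5  # dolly the camera half a meter sideways

for depth_m in (2.0, 10.0, 100.0):
    shift_px = focal_px * camera_move_m / depth_m
    print(f"object at {depth_m:>5.0f} m shifts ~{shift_px:.0f} px")
```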

image7-virtual-production-explained.jpg
Shooting "Ray Tracing FTW"

Like the best magic tricks, this illusion is possible thanks to some really clever technology behind the scenes — and deft timing. To synchronize the on-screen display with the camera’s position, the camera itself has to be tracked. There are myriad options for tracking the camera:

  • Vicon and OptiTrack are motion capture systems that use reflectors or infrared LEDs, in conjunction with infrared cameras, to track the camera. This is known as outside-in tracking.
  • NCAM, Mo-Sys, and stYpe work the other way around, using an infrared sensor attached to the camera itself in conjunction with reflectors attached to the ceiling and walls of the set. These are referred to as inside-out tracking systems.
  • HTC VIVE Mars Camtrack and SteamVR Tracking repurpose virtual reality headset sensors for use in virtual production.
  • Lens metadata, such as focal length, focus, and zoom, can be captured via specialized systems such as Cooke /i Technology or ARRI LDS, and it’s built into stYpe, EZTrack, Mo-Sys, and more.


Each suits different budgets and setups, but they all essentially do the same thing: pass the rendering engine the position of the camera within a CG space so that the screen itself matches the viewpoint of the camera.
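
Under the hood, matching the screen to the camera’s viewpoint for a flat wall comes down to an asymmetric (off-axis) view frustum computed from the tracked camera position and the wall’s corners. Here is a minimal sketch based on the standard generalized perspective projection; real systems also handle curved volumes, lens distortion, and inner/outer frustums:

```python
import numpy as np

def off_axis_frustum(pa, pb, pc, pe, near):
    """Frustum extents for a flat LED wall seen from a tracked camera.

    pa, pb, pc: wall corners as float arrays (lower-left, lower-right, upper-left)
    pe:         tracked camera (eye) position
    Returns (left, right, bottom, top) on the near plane, ready for a
    glFrustum-style asymmetric projection matrix.
    """
    vr = (pb - pa) / np.linalg.norm(pb - pa)  # wall's right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)  # wall's up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                  # wall normal, toward the camera

    va, vb, vc = pa - pe, pb - pe, pc - pe    # eye-to-corner vectors
    d = -np.dot(va, vn)                       # camera's distance to the wall plane

    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top
```

As the tracked camera moves, the frustum is recomputed every frame, which is what produces the correct parallax on the wall.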

The rendering engine output is sent to a media server, such as those from Brompton Technology or Disguise, which translates it onto the displays. Because the displays have to operate at high frame rates (60Hz or above) to ensure smooth playback, it’s critical that everything is synchronized to the split-second, which is where a genlock system comes in.
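
The timing constraint is easy to quantify: at 60Hz, every step in the chain has to fit inside one genlocked frame. The per-stage latencies below are purely illustrative:

```python
refresh_hz = 60
frame_budget_ms = 1000 / refresh_hz  # ~16.7 ms per genlocked frame

# Hypothetical per-stage latencies for one frame of the pipeline.
pipeline_ms = {
    "camera tracking": 4.0,
    "real-time render": 8.0,
    "media server": 2.5,
    "LED processing": 1.5,
}

used = sum(pipeline_ms.values())
print(f"Budget {frame_budget_ms:.1f} ms, used {used:.1f} ms, "
      f"headroom {frame_budget_ms - used:.1f} ms")
```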


Real-time camera visual effects

Once they’ve lined up their perfect angle, directors, cinematographers, lighting designers, and CG supervisors and artists can tweak just about anything in the virtual world. Want to switch from spooky dusk to harsh midday sun? Or add a layer of snow to the environment? Or literally rearrange a city? With virtual production, it’s easy, and it reduces the need for extensive post-production.

image9-virtual-production-explained.jpg
Making live changes to the "Ray Tracing FTW" set in Chaos Arena

Virtual production tools

Unreal Engine and its role in virtual production

Unreal Engine has become a go-to tool for virtual production. Initially developed for videogames (it’s the same engine that runs Fortnite), Unreal Engine’s photorealistic output, limitless customization and interactivity, realistic physics and particle effects, plus integration with popular tools, have made it a solid choice.

Virtual production is only possible thanks to powerful hardware and software. Unreal Engine’s ability to output the vast, high-resolution imagery required for ICVFX is down to its nDisplay system, which uses multiple synchronized graphics processing units (GPUs) to distribute the rendering job.
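
The clustering idea itself is simple to picture: carve the wall’s huge canvas into viewports and give each synchronized render node one tile. The sketch below shows only the concept, not nDisplay’s actual configuration format:

```python
def tile_viewports(wall_px=(11008, 3328), nodes=4):
    """Split one large LED-wall canvas into equal vertical strips,
    one per synchronized render node/GPU."""
    width, height = wall_px
    strip = width // nodes
    return [
        {"node": i, "x": i * strip, "y": 0, "width": strip, "height": height}
        for i in range(nodes)
    ]

for viewport in tile_viewports():
    print(viewport)  # each node renders only its own strip, genlocked to the rest
```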


Other game engines used in virtual production

Unreal Engine rules the roost in virtual production, but Unity, a competing game engine, is also used in more niche applications such as online streaming. Unity doesn’t pack the same feature set as Unreal Engine, but it’s generally considered easier to use. In addition, Industrial Light & Magic, the innovative VFX studio behind Star Wars and Jurassic Park, has its own StageCraft system, which runs on its bespoke Helios rendering engine.

Another option is Chaos Arena. Built on the photorealistic rendering technology Chaos has spent 20 years developing for the VFX industry, Chaos Arena’s advantages are that it works seamlessly with the digital assets VFX artists have already created, and its real-time ray tracing produces physically accurate results.

image8-virtual-production-explained.jpg
The street set in Chaos Arena

“We’ve been developing Chaos Arena to change all that, so everyone from the artist to the DP can stop thinking about the technology, and just get back to the natural rhythms of filmmaking.”

Christopher Nichols, Director of Special Projects, Chaos Innovation Lab

Digital asset creation and management

Optimizing digital assets when using game-engine-based virtual production systems such as Unreal Engine is critical. Objects and environments can be created in conventional 3D rendering software such as 3ds Max, Maya, Houdini, or Blender, but they must be fine-tuned to work efficiently in a rasterized engine without diminishing frame rates. The poly count must be kept low, UVs should not overlap, and the level of detail (LOD) should be set according to the item’s visibility.
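
One common way to set LOD by visibility is to estimate an object’s projected size on screen and bucket it into detail levels. Here is a minimal sketch with illustrative thresholds, not taken from any particular engine:

```python
import math

def select_lod(object_radius_m: float, distance_m: float,
               fov_deg: float = 40.0, screen_height_px: int = 2160) -> int:
    """Return an LOD level (0 = full detail) from projected screen size."""
    # Approximate on-screen height of the object in pixels.
    half_view = distance_m * math.tan(math.radians(fov_deg / 2))
    pixels = (object_radius_m / half_view) * screen_height_px

    if pixels > 500:
        return 0  # hero detail
    if pixels > 150:
        return 1
    if pixels > 40:
        return 2
    return 3      # distant background: lowest-poly version or impostor
```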

With Chaos Arena, on the other hand, most assets can be used directly from digital content creation software. This saves time and means that the same asset can be used for both virtual production and post-production — whether it’s a single fire hydrant or an entire city.

image5-virtual-production-explained.jpg
The background building was taken from an online store and dropped into Chaos Arena in a matter of minutes

Just like any other production involving VFX, assets should be categorized and documented accurately so that it’s easy for multiple teams to find them, but fast virtual production workflows often require quick changes and more iterations.


Post-production techniques in virtual production

One of the biggest benefits of virtual production, particularly ICVFX, is the reduction in post-production requirements. Instead of shooting against a green or blue screen and then waiting weeks or even months for VFX artists and compositors to create and polish shots, filmmakers get shots that are almost ready when they come out of the camera. These shots are good enough for editors to work with and require far less post-production work.

image6-virtual-production-explained.jpg
A shot captured on-set with Chaos Arena

However, that doesn’t mean post-production is unnecessary. Computationally expensive effects such as fog, rain, or explosions are often enhanced or added in post, and sometimes the live-action footage captured in production is married to a new CG backdrop. This process is very similar to conventional VFX workflows, making use of rendering engines such as V-Ray to deliver high-definition imagery.

When motion capture is used, animators are also required to clean up or alter data. And while these productions are “filmed” using real-time game engines and virtual cameras, the final shots are rendered by taking the captured data into 3D software such as Autodesk Maya and then rendering it in an offline rendering engine like V-Ray.


The benefits of virtual production techniques

Cost efficiency

While constructing or hiring an ICVFX stage can be costly, it can save money overall. Instead of flying cast and crew to far-flung locations, those locations can be built within a virtual production environment. Props, set decorations, background vehicles, aircraft, and even spaceships can also be added with a click of a mouse, avoiding the costs associated with procuring, shipping, and customizing these items.


Time efficiency

Imagine being able to shoot on a city street without waiting for permits. Or capturing the exact icy hue of a South Pole research station exterior while you’re actually in an air-conditioned volume in midsummer LA. Or switching a set from a dusty interrogation room to the shoulder of Orion instantaneously, without waiting for the crew to dismantle and rebuild the set.

With virtual production, artists can also design multiple versions of the same asset, so it can be switched out instantly when filming begins.

With virtual production, the time savings can be as big as the cost savings.

image14-virtual-production-explained.jpg
The "Ray Tracing FTW" set

Environmental impact

It’s no secret that the traditional film industry has an enormous carbon footprint — but, thanks to associations such as the Sustainable Entertainment Alliance, productions are taking steps to reduce their environmental impact. Virtual production can be part of this process; according to virtual production company Quite Brilliant, shooting on a virtual stage produces just 0.74 tonnes of carbon versus a whopping 95 tonnes for a practical shoot.

Another benefit is that real-world locations can be accurately rebuilt in 3D, a process made even easier using 3D Gaussian Splatting. In turn, this reduces environmental damage caused by film crews and allows productions to be shot in ecologically delicate protected areas — albeit virtually.


Improved health and safety

Related are the safety advantages of virtual sets. Want to shoot next to a volcano? Or in outer space? Virtual production makes it easy to place your A-listers in precarious positions without physically endangering them. Virtual production can also create dangerous effects, such as explosions or car crashes, without requiring stunt and special effects crews.

image4-virtual-production-explained.jpg
No VFX industry legends were harmed in the making of this shot

The other advantage of virtual production stems from its initial success during the COVID-19 pandemic. Hopefully, we won’t face a viral outbreak on that scale again, but if we do, virtual sets with small crews can minimize the risk of transmission, and fewer trips mean less exposure to public spaces such as airports and hotels.

 

The future of virtual production

Virtual production is here to stay. The number of specialized LED stages around the world increased from just three in 2019 to over 300 in 2022, and the barriers to entry for this cutting-edge technology are continually falling.

“[Any] independent filmmaker has enough literature right now to build their own wall and test things out.”

Kathryn Brillhart, Virtual Production Supervisor & Cinematographer

Technology is also improving rapidly, and we can expect to see AI creep into virtual production workflows. Chaos Arena already uses NVIDIA’s AI upscaling technology to create photorealistic virtual environments at the high frame rates required in virtual production.

image2-virtual-production-explained.jpg
A virtual production shot from "Ray Tracing FTW"

Potential applications in other industries

Virtual production has scope far beyond the world of cinema, and similar techniques are already used for news and weather broadcasts. Imagine being able to step into a mock-up of your new Manhattan apartment and see how the view and lighting will change in any given season. Or attending a concert where the visuals dynamically change on the artist’s whim. Theme parks already use large screens to transport visitors to fantasy and science-fiction landscapes, but the technology used in virtual production could make these experiences more interactive and less on-rails.


The Takeaway

Now, you should have a good idea of how virtual production works — and how it’s changing Hollywood. 

The virtual production revolution has only just begun, and filmmakers are only just starting to tap into its limitless potential. For a new generation of Hollywood filmmakers who have grown up playing in virtual spaces, these production methods will feel like a natural extension of the virtual worlds that they’ve loved exploring, and the stories that are told with virtual production are likely to be as inventive and playful as the tech itself.


FAQ

What is the difference between virtual production and VFX?
Virtual production is a groundbreaking method that merges live-action footage with CGI in real-time, utilizing technologies such as ICVFX, motion capture, and virtual cinematography to create immersive environments directly on set. With ICVFX, realistic backdrops are displayed on LED walls or volumes, allowing actors and directors to interact with digital worlds as though they were physical. Motion capture translates actors' movements into digital characters, while virtual cinematography provides tools to frame and "film" virtual environments naturally. In contrast, VFX remains rooted in post-production, where CGI is painstakingly added to pre-filmed footage, requiring more time and less on-set collaboration.

What are the stages of virtual production?
Virtual production consists of three main stages: pre-production, production, and post-production. In pre-production, digital environments and assets are built by the virtual art department, with motion capture rehearsals and virtual cinematography helping to plan and visualize each shot. During production, live-action elements are filmed against LED walls using ICVFX, seamlessly blending physical and virtual worlds. Motion capture systems enhance performances by turning actors into digital characters in real-time. Post-production fine-tunes these elements, combining motion-captured data with final renders from virtual cinematography to deliver visually stunning results.

What is real-time virtual production?
Real-time virtual production leverages ICVFX, motion capture, and virtual cinematography to create dynamic environments that interact with live-action performances during filming. LED walls display virtual settings that adapt in real-time to camera movements, thanks to advanced tracking and rendering technology. Motion capture brings digital characters to life by capturing and projecting actors' movements instantly, while virtual cinematography lets directors explore and adjust virtual scenes with the same creative freedom as physical sets.

Virtual production just got real.

Explore Project Arena.
Henry-Winchester-profile-pic.jpg
About the author

Henry Winchester

Before becoming Chaos' content marketing manager, Henry contributed to magazines and websites including PC Gamer, Stuff, T3, ImagineFX, Creative Bloq, TechRadar, and many more. Henry loves movies, cycling, and outrageously expensive coffee.

Originally published: December 16, 2024.

