This document has been compiled to help students participating in the Erasmus+ co-funded programme EMEX (Emerging Media Exploration). Grant number: 2018-DE01-KA203-004282.
“Virtual production removes the barriers between live production and visual effects, so that they can occur simultaneously instead of inline.”
Kevin Baillie, Visual Effects Executive Producer
Virtual production is a broad term referring to a spectrum of real-time computer-aided production and visualisation film-making methods.
It's certainly not new! Directors have been calling upon computers to aid and visualise aspects of their productions since the late 1990s.
Some notable directors (and their productions) that pioneered the use of real-time computer-aided production and visualisation:
- George Lucas (Star Wars: Episode I)
- Steven Spielberg (A.I.)
- Peter Jackson (Lord of the Rings)
- Robert Zemeckis (The Polar Express)
- James Cameron (Avatar)
Large budget films today are exceedingly complex productions featuring many moving parts on highly compressed schedules.
The process is typically a linear affair, one that resembles an assembly line, encompassing development, pre-production, production, and post.
Iteration is challenging and costly while development efforts are frequently siloed off from one another, sometimes with incompatible resources.
For filmmakers, this problem manifests itself as uncertainty.
When a cinematographer has to take a best guess at the colour of their lighting to match an unseen green screen element, or a director doesn’t exactly know what the creature at the climax of their movie actually looks like, that’s uncertainty.
For editors, attempting to cut on action and timing in a sequence featuring in-progress visual effects is another form of uncertainty.
Arriving at a final version of a scene is often a long road involving the replacement of temporary or missing shots, adding the score, colour correction, and so on.
Achieving shot-to-shot continuity and fluidity is challenging with so many components in flux.
It's an overall process that feels disconnected and doesn't resolve itself until all shots become finalised.
Scale & Scope of Production
Scale and scope of production can also be affected by traditional visual effects workflows.
Although streaming and broadcast network series production values have become increasingly ambitious in recent years, they can’t always match the complexity of major theatrical releases.
This is due in part to cost but also to the realities of network release schedules, which only allow for a certain degree of spectacle to be created within the allotted time frame.
Put simply: time, cost, continuity, shared vision and a linear production flow are all problems…
ENTER VIRTUAL PRODUCTION
Virtual Production encourages a more iterative, nonlinear, and collaborative process.
It empowers the filmmakers to collaboratively iterate on visual details in the moment, not deferring all of these decisions to post.
Iteration begins much earlier in the production schedule. With a real-time graphics engine, high-quality imagery can be produced from the outset.
Instead of different teams creating incompatible assets siloed off from one another, assets are cross-compatible and usable from pre-visualisation through final outputs.
For filmmakers, the uncertainty of traditional pre-production and visual effects production is replaced with working imagery far closer to final pixel.
And because this high-quality imagery is produced via a real-time graphics engine, iteration and experimentation are simplified, cost-efficient, and agile.
The process feels much more connected and collaborative…
When it comes to editorial, virtual production also alleviates uncertainty by providing provisional imagery much closer to its final appearance and helping to eliminate missing or in-progress shots.
When shots that might have been previously photographed in front of a green screen are replaced with in-camera, LED wall visual effects, the editor has much more to work with…
… It becomes possible to edit shots and sequences featuring major visual effects the same way as traditional non-effects scenes.
VIRTUAL PRODUCTION TYPES
Virtual Production can be characterised as four main approaches:
- Visualisation
- Performance Capture
- Hybrid Green Screen Live
- Full Live (LED) Volume
Visualisation
Visualisation is probably the virtual production use case most filmmakers are already familiar with and have employed.
Visualisation can be thought of as prototype imagery created to convey the creative intent of a shot or sequence.
Visualisation can take the form of: pitchvis, previs, virtual scouting, techvis, stuntvis, and postvis.
The Lion King example looked at earlier is a key example of the multiple forms visualisation can take in the same production…
Performance Capture
Motion capture is the process of recording the movements of objects or actors and using that data to animate digital models in real time. When the capture includes the actor's face and more subtle expressions, it's often referred to as performance capture.
Body capture (or motion capture) is accomplished by the actor wearing a suit covered in markers that are tracked by special cameras or a suit with built-in sensors.
Facial capture involves using either depth-sensor cameras for “markerless” facial capture, or tracking markers drawn directly onto the performer’s face.
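At its core, solving body capture means turning raw marker positions into joint angles that can drive a digital skeleton. The sketch below is illustrative only (the marker names and positions are invented for the example); it computes a single joint angle from three tracked markers using plain vector maths:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at marker b, formed by the markers a-b-c.

    Each marker is an (x, y, z) position from the tracking system;
    the resulting angle can drive the matching joint on a digital rig.
    """
    v1 = [a[i] - b[i] for i in range(3)]  # vector from b to a
    v2 = [c[i] - b[i] for i in range(3)]  # vector from b to c
    dot = sum(p * q for p, q in zip(v1, v2))
    n1 = math.sqrt(sum(p * p for p in v1))
    n2 = math.sqrt(sum(p * p for p in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# One captured frame: shoulder, elbow and wrist markers (metres, made up)
shoulder = (0.0, 1.4, 0.0)
elbow = (0.3, 1.1, 0.0)
wrist = (0.6, 1.4, 0.0)
elbow_flex = joint_angle(shoulder, elbow, wrist)  # 90.0 for this pose
```

A real mocap solver repeats this kind of calculation for every joint on every frame, then streams the results to the engine so the digital character moves in step with the performer.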
Hybrid Green Screen Live
Hybrid virtual production is the use of camera tracking to composite green screen cinematography with CG elements.
This composite is either created as a live preview for the DoP and camera operator and completed in post-production, or intended as final pixel in camera.
This type of virtual production has been used for a while in live broadcast, especially in sports, but has also proliferated increasingly into feature and episodic production.
The two primary modes of hybrid virtual production are real-time (virtual studio) and post-produced (virtual set or set extension).
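Conceptually, the real-time composite is a chroma key: wherever the camera sees "mostly green", the CG background is substituted. The toy per-pixel version below is only a sketch (real keyers work on soft mattes, usually in YCbCr space, and the threshold here is an arbitrary assumption):

```python
def chroma_key(fg, bg, threshold=1.3):
    """Return the CG background pixel wherever the foreground is strongly green."""
    r, g, b = fg
    if g > threshold * max(r, b, 1):  # crude green-dominance test
        return bg
    return fg

def composite(fg_frame, bg_frame):
    """Composite two frames (nested lists of RGB tuples) pixel by pixel."""
    return [[chroma_key(f, b) for f, b in zip(fg_row, bg_row)]
            for fg_row, bg_row in zip(fg_frame, bg_frame)]

# A 1x2 frame: one green-screen pixel and one skin-tone pixel of the actor
fg = [[(20, 200, 30), (200, 150, 120)]]
bg = [[(0, 0, 255), (0, 0, 255)]]  # CG set rendered behind the actor
out = composite(fg, bg)  # the green pixel is replaced, the actor pixel kept
```

In a live hybrid setup, this substitution happens per frame at the camera's frame rate, with the CG background rendered from the tracked camera's position so that foreground and background stay in perspective.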
Full Live (LED) Volume
The use of image output from real-time engines to a live LED volume (sometimes also called a wall), in combination with camera tracking, to produce final-pixel imagery completely in camera represents the state of the art for virtual production.
The benefits of live imagery projected behind the actors are numerous. In some respects, it’s also the culmination of all of the previous development work done in the sphere of virtual production.
What’s more, working with final pixel in camera could remove the need for post-production visual effects in certain shots entirely…
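The camera tracking is what makes final pixel possible: the engine must know where the camera is so it can render the correct perspective onto the wall. As a simplified illustration (assuming a flat wall perpendicular to the camera's view axis; real volumes use full projection maths for curved walls and off-axis cameras), the region of the wall the camera actually sees, often called the inner frustum, can be found like this:

```python
import math

def inner_frustum_on_wall(cam_pos, h_fov_deg, v_fov_deg, wall_z):
    """Corners (x, y) of the region a tracked camera sees on a flat LED wall.

    cam_pos: (x, y, z) camera position from the tracking system.
    Assumes the camera looks straight down +z at a wall in the plane z = wall_z.
    Only this region needs rendering from the camera's exact perspective;
    the rest of the wall can show lower-priority imagery for lighting.
    """
    cx, cy, cz = cam_pos
    dist = wall_z - cz
    half_w = dist * math.tan(math.radians(h_fov_deg) / 2)
    half_h = dist * math.tan(math.radians(v_fov_deg) / 2)
    return [(cx - half_w, cy - half_h), (cx + half_w, cy - half_h),
            (cx + half_w, cy + half_h), (cx - half_w, cy + half_h)]

# Camera 4 m from the wall with a 90-degree horizontal field of view
corners = inner_frustum_on_wall((0.0, 1.7, 0.0), 90.0, 60.0, 4.0)
```

As the tracked camera moves, these corners are recomputed every frame, which is why the background on the wall shifts with correct parallax instead of looking like a flat backdrop.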
This type of production, whilst relatively new and very expensive, did come to the forefront of media attention with the arrival of the Disney+ Star Wars spin-off series: The Mandalorian.
The following video explains how LED screen volumes are shaping the future of visual-effects-heavy productions by allowing a virtual set and live action to be composited in real time. It also explains why this approach is such a huge advantage over the traditional method of shooting live action in front of a green screen:
On top of the benefits already covered, there are some less obvious, but still pertinent reasons a virtual production workflow is beneficial:
- Asset creation
- Real-time engine
- More malleable tools
- Remote/multiuser collaboration
- Real-time physics
- Real-world camera movement
- Asset tracking and data management
- Avoiding budget creep
THE COST OF VIRTUAL PRODUCTION
A recent study by On Set Facilities (a company dedicated to systems design for VP) provided ballpark costs for a studio wishing to implement a VP solution, based on the assumption that you already have a physical studio and most filmmaking equipment.
They categorised the costs based on the three main VP approaches (explored earlier, minus Visualisation)…
- Real-time in-camera VFX (LED wall stages, real-time reflections and dynamic real-time lighting): $3,000,000
- Real-time Augmented Production (real-time compositing of live actors with augmented-reality sets and graphics): $300,000
- Real-time Virtual Production (real-time animated digital humans, AI characters, virtual sets and VFX): $30,000
So why has virtual production started to play a role in so many productions and media industries recently?
Is the quality of the visual output REALLY good enough to challenge what’s possible with traditional visual effects?
There is still one last hurdle for real-time graphics to overcome to be a viable technology to compete with traditional CG rendered VFX…
Ray-traced global illumination, shadows, reflections and ambient occlusion have long been the limiting factors for real-time graphics quality…
… until recently…
The video below is a technical breakdown of how these effects were made to run in real time at an acceptable frame rate:
I’M INTERESTED, HOW DO I GET STARTED?
As a filmmaker, you would probably want to team up with other students who already have a grounding in real-time graphics and game engines…
If you want to start exploring on your own, Unreal Engine is FREE and Quixel Megascans (and its tools) are FREE, and both are incredibly well supported in terms of tutorials, training, communities and resources.
Regarding equipment and studio accessories, a very budget-conscious setup can be achieved for around £5,000 (€5,473). For this, you could achieve a basic live-composite green screen for set extension.
The video below demonstrates what a reasonably modest setup can yield. The author's channel is an excellent resource for getting started with the Hybrid Green-screen Live type of VP.
His latest video demonstrates (with a lot of depth of field blur) what can be achieved with a simple setup and free assets from the Unreal Engine Marketplace:
Additional Resource (NEW)
To help support students on the EMEX project, a YouTube channel called Virtual Production Academy has been created and populated with content to help beginners and enthusiasts set up a Virtual Production studio with no budget at all. The series then looks at what kit to add, and why, until there is a reasonably competent Green Screen Live Hybrid setup, taking into consideration the available budget, resources and space:
The first video from the playlist for this series: