

Over the past few decades, film production has become increasingly reliant on computer-generated imagery. It has allowed us to create cinematic marvels that would once have been unimaginable, but it has also resulted in sets that bear little resemblance to the final cut. That's why recent developments in virtual production are changing the game. With new technology, filmmakers can now see in real time what a shot will look like in its final form.
This innovative approach holds huge potential across film, TV, advertising, and live events. In this guide we explore why virtual production is growing in popularity and how it's switching up the whole production process.
Virtual production is a modern filmmaking technique that utilises an LED wall to display a digital background, seamlessly blending physical and digital elements in real time during the production process. Essentially, it allows real-world filmmaking and digital effects to occur simultaneously.
Its roots lie in older techniques like rear projection, where an image was projected onto the scene from behind. This method allowed driving scenes to be filmed without complicated on-location shoots, but shooting angles were severely limited.
The green screen came next: a great tool that enables editors to add visual effects and digital backgrounds in post-production.
However, the major downside of the green screen technique is that it forces the on-set cast and crew to imagine the final shot in their mind's eye. This makes elements like framing and lighting tricky to get right.
So how does virtual production solve this problem? The green screen is replaced by a huge LED screen which shows a photorealistic 3D world, allowing everyone on set to see what is happening in the background in real-time.
Virtual production changes the traditional filmmaking workflow by bringing post-production thinking much earlier into the process. Instead of waiting until filming has wrapped to see how visual effects will look, filmmakers begin designing digital environments during pre-production. These environments are then rendered in real time and displayed on LED walls during the shoot, allowing directors, cinematographers and actors to see the final composition as scenes are filmed.
Because the virtual background responds live to camera movement and lighting changes, creative decisions can be made on set rather than months later in post-production. This shift encourages closer collaboration between departments such as VFX, art direction and camera. This means fewer surprises later in the pipeline and a much more efficient production process overall.
Virtual production relies on a combination of cutting-edge technologies to seamlessly blend physical and digital elements during the filmmaking process. Some of the key technologies involved in virtual production include:
Real-time rendering engines: The virtual background displayed on the LED wall is generated by a real-time rendering engine. The most popular real-time 3D creation tool is Unreal Engine, developed by Epic Games (and originally built for video games).
The engine's Nanite system virtualises geometry, allowing extremely detailed meshes — including photogrammetry scans of real locations — to be rendered in real time. This results in near photo-realistic backgrounds that don't need to be built from scratch.
Virtual cameras: Virtual production requires on-set video cameras to be synced with virtual cameras inside Unreal Engine. The physical camera's movement is then tracked while filming so the background shifts realistically in relation to the subjects.
3D scanning and photogrammetry: These technologies are used to capture real-world objects and locations in 3D, which can integrate seamlessly into the virtual environment.
Motion capture: Motion capture technology captures actors' movements and facial expressions using specialised suits, markers, and cameras. You can use this data to animate digital characters or avatars.
Interactive lighting: Virtual production studios can simulate realistic lighting conditions within the virtual environment. This ensures that virtual elements and live-action footage are lit consistently and convincingly.
VR, AI and pre-visualisation tools: Virtual reality can be utilised for pre-visualisation, allowing filmmakers to explore and plan scenes in a virtual environment before actual production begins.
AI can be used for tasks like character animation, crowd simulation, and procedural generation of assets, enhancing the efficiency of virtual production.
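The virtual camera idea above can be illustrated with a small sketch. This is a hypothetical stand-in, not Unreal Engine's actual API: each frame, the tracked pose of the physical camera is copied onto the renderer's virtual camera (plus a calibration offset that aligns the stage origin with the virtual world), which is what makes the LED background shift with correct parallax.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Position (metres) and yaw/pitch/roll (degrees) of a camera."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

class VirtualCamera:
    """Stand-in for the render engine's camera object (hypothetical API)."""
    def __init__(self):
        self.pose = Pose(0, 0, 0, 0, 0, 0)

    def apply_tracked_pose(self, tracked: Pose, offset: Pose) -> None:
        # The tracking system reports the physical camera's pose each frame;
        # the offset calibrates the stage origin to the virtual world.
        self.pose = Pose(
            tracked.x + offset.x, tracked.y + offset.y, tracked.z + offset.z,
            tracked.yaw + offset.yaw, tracked.pitch + offset.pitch,
            tracked.roll + offset.roll,
        )

# Each frame: read the tracker, update the virtual camera, then render.
cam = VirtualCamera()
tracked = Pose(1.2, 0.0, 1.6, 15.0, -2.0, 0.0)  # e.g. from optical tracking
cam.apply_tracked_pose(tracked, Pose(0, 0, 0, 0, 0, 0))
```

Real systems layer lens calibration, timecode sync and frustum rendering on top, but the core loop is this simple pose hand-off.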
One groundbreaking element of the LED screen is its use as a light source. The lighting is far truer to a real outdoor environment because the screen emits light that reflects naturally onto the subjects, props, and set pieces.
The challenge with a green screen on the other hand, is the cinematographer needs to light the actors as closely as possible to match the background that will be added in post-production.
The LED screen excels at soft, diffused lighting but unfortunately it does have a hard time recreating natural sunlight. You'll notice scenes from virtual productions like 'The Mandalorian' are mostly overcast or shot at sunrise and sunset when the lighting is soft.
The other lighting advantage from virtual production is the level of control filmmakers have over lighting conditions. For example, sunrise and sunset happen very quickly in real life and therefore scenes often have to be shot over multiple days. However on a virtual production set a sunset scene can last for as many hours as needed.
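As a toy sketch of that control (the function names and the simplified sun model are mine, not from any production tool), "freezing" golden hour amounts to driving the virtual sun from a locked time-of-day parameter instead of the real clock.

```python
import math
from typing import Optional

def sun_elevation_deg(hour: float) -> float:
    """Very rough sun elevation for an idealised 12-hour day (06:00-18:00)."""
    return 90.0 * math.sin(math.pi * (hour - 6.0) / 12.0)

def scene_hour(real_hour: float, frozen_at: Optional[float]) -> float:
    # When frozen_at is set, the virtual sun ignores the real clock,
    # so the stage holds the same light for as long as the shoot needs.
    return frozen_at if frozen_at is not None else real_hour

# Shoot for three real hours while the scene stays locked at 17:30.
for real in (14.0, 15.5, 17.0):
    elevation = sun_elevation_deg(scene_hour(real, frozen_at=17.5))
```

On a real stage the same idea applies to the whole sky dome, not just one sun angle, but the principle is identical: lighting becomes a parameter the crew sets rather than a condition they chase.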
A crew can switch multiple sets within one shooting day on a virtual production. They can pack in so much more filming time than if they had to travel to multiple film locations.
Virtual production studios often have revolving stages too which can mean less setup time. Instead of physically moving the cameras to get a different angle, they can simply move the set with the push of a button.
Not having to worry about keying out a green screen also means the set designers can utilise atmospherics on set like rain or smoke machines. The advantage of capturing atmospherics live is they interact with the lighting and set realistically.
SuperScout Showcase lets you share your amazing locations across the internet, with a personally branded website that lets you build and show off your collections!

Virtual production can reduce costs by eliminating the need for extensive physical sets, location scouting, and travel. Because many of those costs are incurred upfront during pre-production, budgets also become more predictable, reducing the likelihood of unexpected expenses later in post-production.
Filmmaking is a long process that usually resembles a factory production line, starting at pre-production then moving on to production, and then post-production.
Unfortunately, this system is inefficient: any mistake made in production increases the time spent in post-production. For example, if the director cannot visualise the VFX creature meant to be in the shot, they may frame the scene incorrectly.
Real-time rendering and live compositing in virtual production can significantly reduce post-production time, allowing filmmakers to finalise shots more quickly.
Filmmakers can see the final composition of scenes as they are being filmed, enabling on-set adjustments and informed decision-making, which often leads to improved shot quality.
Virtual production studios offer unprecedented creative freedom. Filmmakers can experiment with different visual elements, camera angles, and lighting setups in real time, fostering innovation and unique storytelling approaches.
Unpredictable weather conditions, location availability, and other external factors can disrupt traditional shoots. Working in virtual production studios minimises these risks, providing a controlled environment for shooting.
Virtual sets can be more environmentally friendly by reducing the need for extensive travel, materials for building physical sets, and other resource-intensive processes.
One of the biggest challenges of this series was creating realistic alien-planet backdrops for the world of Star Wars. The Mandalorian's creator Jon Favreau says: “With Star Wars you are building on a rich legacy of innovation”.
Their solution came in the form of a virtual production stage known as "The Volume", where the sets felt like a three-dimensional environment — because they were 3D. The Mandalorian was one of the first projects to use this type of virtual production.
Traditionally, the majority of the sets would have been added in post-production, but here the team were considering set design at the same time as filming, a completely new way of working. They had to pull together a background and foreground that lived together harmoniously on The Volume.
The timeframes were much faster than with traditional green screen techniques. They found that a scene created in pre-production could be displayed on the screen within 24 hours of being finalised.
The idea to use virtual production for this European Netflix series came out of the struggle to shoot with an international cast and many locations during the pandemic. The first lockdown occurred just as they were in the writing process and it was obvious it was going to be impossible to travel and shoot on-location.
At first, the team didn't take the idea of using virtual production technology seriously; they were traditional filmmakers who hated working with green screen. But virtual production turned out to give the writers far more freedom to come up with storylines and scenes that they could not have shot the same way with traditional filmmaking methods.
Changing the order of the process and pulling post-production forward was new to everyone working on the show. The VFX team had to start work on the digital backgrounds before there was even a finished script. It required constant communication between the set design and VFX teams, and they held many virtual scouting sessions to decide on set placement and shooting angles.
Their stage was a 350-square-metre shooting area with 270 degrees of shooting angles and an industry-first revolving set. They were also able to use a rain rig to create real weather conditions, something that is impossible with green screen because atmospheric effects distort the screen and ruin the footage for post-production. The ship scenes looked so real that many of the actors felt seasick watching the waves in the background.
The show used a total of 30 different virtual sets which made up 30% of the season. They were able to do that all out of Germany and ensure the shooting schedule was reliable.
Matt Reeves, the writer/director of The Batman, believes that to make a production extra special you need as much in the shooting frame as possible to be "real"; in other words, the outcome is much better for the viewer when the film is shot in-camera. But how do you build Gotham for real and avoid anything that looks like bad VFX?
The city is practically its own character in the movie so the crew knew they needed to get the look and feel just right. In this case, virtual production could do what a green screen couldn't.
Creating cityscapes on a virtual stage was a new challenge but it paid off. One of the big advantages they discovered on The Batman virtual production stage was the ability to essentially freeze time. They could create a golden hour sunset and have it last as long as they needed to film the perfect shots.
During car chase scenes the actors also appreciated being able to act in cars on the stage while the real background played out around them on screen. It allowed them to react with perfect timing and at the right eye level to what was coming at them.
Despite its many advantages, virtual production does come with challenges. The upfront planning required can be intense, as digital environments often need to be developed well before shooting begins. This requires strong communication between creative and technical teams, as well as a willingness to lock certain decisions earlier than in traditional workflows.
There are also technical limitations to consider. LED walls excel at soft, ambient lighting but still struggle to convincingly recreate harsh midday sunlight. Additionally, virtual production stages require specialised facilities and experienced crews, which may not be available in all regions. Understanding these limitations is key to deciding when virtual production is the right tool for the job.
As technology continues to advance, virtual production is set to become a standard part of the filmmaking toolkit. Improvements in real-time rendering, LED resolution and camera tracking are already expanding what’s creatively possible, while costs are gradually becoming more accessible to a wider range of productions.
Beyond film and television, virtual production is also influencing advertising, live events and immersive experiences. While it won’t replace traditional location shooting entirely, virtual production is reshaping how stories are planned, filmed and brought to life.
Virtual production is reshaping how film and television are made, but it isn’t removing the need for real-world locations — it’s changing how they’re used. As productions increasingly blend physical sets with digital environments, the role of location scouting becomes less about finding a single perfect place and more about understanding how locations function within a wider production ecosystem.
As virtual production continues to evolve, collaboration between location teams, production designers and VFX departments will become increasingly important. Rather than replacing traditional scouting, virtual production expands the creative toolkit. It offers new opportunities to plan smarter, shoot more efficiently and tell richer visual stories.
SuperScout is your own private location library – upload locations in minutes, tag them with AI in seconds, then search and share with your team.

