by James Delhauer
Virtual production is an exciting new area of our industry. In the last five years, studios have invested heavily in stages and infrastructure built around integrating on-set performances with simulated environments in order to capture cinematic imagery in-camera. With this technology, an actor can be transported anywhere in the world, or even to worlds beyond. It has been a game changer, enabling extraordinary productions like 2013’s Oblivion, 2019’s The Mandalorian, and 2022’s Avatar: The Way of Water to deliver stunning images and truly immersive experiences for viewers.
I had the opportunity to sit down with Rene Amador, CEO of ARwall, an award-winning XR technology and service company specializing in virtual production.
Tell me a little bit about ARwall. What sort of work do you specialize in within our industry and what sorts of products/services do you offer?
We’re a combination of filmmakers, game developers, and visual effects artists. So, you can approach us and we’re able to address XR and virtual production development needs, creative needs, and production needs. We’re trying to be at the intersection of those points, so we have three different divisions.
The one that we’re most well-known for is ARwall Production Services. We specialize in XR LED virtual production, which is an in-camera effect. “In-camera effects” is basically the term we now use for what used to be called rear projection or process shooting. Adding XR virtual production to that means using camera tracking and real-time backdrops that shift as the camera moves and can also be manipulated in real time: the position of the objects, the color of the objects, the lighting, the time of day, the camera angle, that type of thing. These are all crucial to creating the mise-en-scène of a shot.
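For readers curious how a backdrop can “shift as the camera moves,” the core of the technique is an off-axis (asymmetric) projection: every frame, the virtual camera’s frustum is recomputed from the tracked lens position relative to the physical screen, so the background parallax stays correct in-camera. The sketch below is a minimal, self-contained illustration of that math, assuming a flat display with known corner positions; it is not ARwall’s code, and all of the values are hypothetical.

```cpp
// Minimal sketch of the off-axis projection math commonly used for
// camera-tracked backdrops (not ARwall's implementation). The frustum
// extents are recomputed from the tracked lens ("eye") position relative
// to the physical screen plane so the backdrop's parallax stays correct.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 v) {
    double len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Frustum extents projected onto the near plane.
struct Frustum { double left, right, bottom, top; };

Frustum offAxisFrustum(Vec3 lowerLeft, Vec3 lowerRight, Vec3 upperLeft,
                       Vec3 eye, double nearPlane) {
    Vec3 rightAxis = normalize(sub(lowerRight, lowerLeft)); // screen right
    Vec3 upAxis    = normalize(sub(upperLeft, lowerLeft));  // screen up
    Vec3 normal    = normalize(cross(rightAxis, upAxis));   // toward the camera

    // Vectors from the tracked eye point to three screen corners.
    Vec3 toLL = sub(lowerLeft, eye);
    Vec3 toLR = sub(lowerRight, eye);
    Vec3 toUL = sub(upperLeft, eye);

    double dist  = -dot(toLL, normal);   // perpendicular eye-to-screen distance
    double scale = nearPlane / dist;     // project extents onto the near plane
    return {dot(rightAxis, toLL) * scale, dot(rightAxis, toLR) * scale,
            dot(upAxis, toLL) * scale,    dot(upAxis, toUL) * scale};
}

int main() {
    // Hypothetical 4 m x 2.25 m wall centered at the origin, camera about
    // 3 m back and slightly off to one side; in practice the eye position
    // comes from the camera tracking system every frame.
    Vec3 ll{-2.0, -1.125, 0.0}, lr{2.0, -1.125, 0.0}, ul{-2.0, 1.125, 0.0};
    Vec3 eye{0.4, 0.1, 3.0};
    Frustum f = offAxisFrustum(ll, lr, ul, eye, 0.1);
    std::printf("l=%.3f r=%.3f b=%.3f t=%.3f\n", f.left, f.right, f.bottom, f.top);
    return 0;
}
```

Those extents would then feed a standard asymmetric projection matrix in whatever engine is rendering the wall, which is what makes the virtual environment appear locked to the real set through the lens.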
Another division is where we sell software products online. You can use the exact same plugins and software that we use on productions. I’m pretty sure we’re the only team that does that in the industry. We actually use the tools that we make.
Our newest division is Glowcraft Films, which is an escalation of the production services side of our business. Many of our clients need help getting projects off the ground because they’re new to the game of virtual production, so we will come in to co-produce and take on the virtual production, production design, and visual effects parts of the project. Glowcraft Films has an LED stage in Burbank where it can also service and house productions.
What are some productions or titles that your team has worked on?
We did the first season of Nightflyers for Netflix and NBCUniversal, which included a lot of shots of characters looking out of windows, with a lot happening outside those windows. The team on that project had a specific look in mind. I think it’s what most people would call a kind of alien look, with lots of high reflectivity, very high key, and high contrast. And fundamentally, a green screen visual effects workflow was not going to let them achieve that.
The next project we worked on, and I think the one we’re most proud of, is Muppets Haunted Mansion. That project actually won an Emmy for its art and scenic design, but I don’t think most people realize that 70% of the sets were fully virtual.
And then the third one I’m most proud of is a commercial we did for Crossfire: Sierra Squad, a Sony PlayStation VR2 video game.
How does ARwall differ from other companies specializing in virtual production?
We think of ourselves as a full-service virtual production company, so however you want to engage with virtual production, whether you want to be an owner with your own setups or you’re a producer, we’re there to help guide you through the process. We’re also trying to cut down the creative hurdles involved, so we’ve created software for that purpose. Then on the technical side, we’re trying to figure out what we do and don’t need in order to have the most flexibility within cost and budget. We’re always attacking every problem from two or three angles.
The result was that in 2020, we were able to boil the solution down to a prosumer kit that we released for $11,000, which saw large demand during the pandemic. The kits were mostly purchased by small commercial companies, educational institutions, or even small creators working at home, for productions using small LED video walls, projectors, or even 4K TVs. We learn the pain points clients are dealing with and steer product development toward addressing those problems. So, once we get them set up, they’re good to go. We’ve gotten people set up in under an hour and then never heard from them again.
Is there any difference in the workflow if you’re using an LED wall or projector or a TV?
No, not really; it’s all basically the same. The code in our system works from specific perspective points, like the lens of the camera, and that point might shift around based on what you’re displaying the image on, but the code driving the display is the same. Fundamentally, we’re pushing pixels through a cable. That’s all.
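His point that the display hardware barely matters comes down to how the surface is described to the renderer: an LED wall, a projection screen, and a 4K TV all reduce to a physical image plane plus a pixel resolution, and the same render path consumes that description. The sketch below illustrates the idea in isolation; the structure and names are assumptions for illustration, not ARwall’s actual interfaces.

```cpp
// Illustrative sketch (not ARwall's code): every display type is described
// the same way, so one render path serves an LED wall, a projector, or a TV.
#include <cstdio>
#include <string>
#include <vector>

struct DisplaySurface {
    std::string name;
    double widthMeters, heightMeters;  // physical size of the image plane
    int pixelsX, pixelsY;              // resolution pushed over the cable
};

// The render path only cares about the surface description (and the tracked
// lens position, omitted here); it is identical for every display type.
void renderFrame(const DisplaySurface& s) {
    std::printf("rendering %dx%d pixels onto '%s' (%.2f m x %.2f m)\n",
                s.pixelsX, s.pixelsY, s.name.c_str(),
                s.widthMeters, s.heightMeters);
}

int main() {
    // Hypothetical example surfaces.
    std::vector<DisplaySurface> surfaces = {
        {"LED wall", 12.0, 4.0, 7680, 2560},
        {"Projection screen", 4.0, 2.25, 3840, 2160},
        {"4K TV", 1.65, 0.93, 3840, 2160},
    };
    for (const auto& s : surfaces) renderFrame(s);
    return 0;
}
```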
How do you see virtual production continuing to evolve in the next 5-10 years?
We’re seeing major tech companies sniffing around and going, “This is an interesting sector.” We’re going to see AI productions try to eat up low-end content. There’s very little that will make these guys go away. One thing we can do is attempt to compete on the technical side as well. We can hire technical engineers to create a bulwark against the tech companies so that we can protect virtual production as one of the last bastions of physical production with sets and actors.
In the next five to ten years, I think you’ll see it become more accessible. With maybe four people on set, we’ll be able to do virtual production work in almost any location, including elements like explosions or big effects. And as this technology becomes more accessible and simpler to use, it will foster more focus on performance and creativity in getting the shot. In fact, I think a lot of sound stages may start to get very geared toward that purpose.
What sort of opportunities do you see virtual production creating for video engineers and workers in the coming years?
It’s important to understand that there are some professional broadcast practices, like genlock and color calibration, that will continue to be foundational. That’s going to remain. What’s sending and receiving that signal may shift, but the fundamentals of a GPU sending a signal to a pixel array will be the same. So, people need to understand that part of the workflow. What is going to change is the content playback flexibility. It will become simpler and more accessible. You may not get to the point where it’s as simple as a VCR’s play, fast-forward, and stop buttons, but we may get it down to nine to twelve dials. So, my recommendation is to become familiar with those tool sets, because those are what will be appealing to producers. Things like Unreal Editor, ARFX Pro Plugin, ARFX Scenepacks. I even really recommend checking out some of our competitors, like Asymmetry, which is a virtual production tool that doesn’t require Unreal Editor. And learn about Disguise server integration. Reach out to us and lay out your situation: “I’m 695. I’m excited about virtual production and want to get into playback.”
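To picture what “nine to twelve dials” might look like in practice, here is a hypothetical sketch of a reduced operator control surface for a real-time backdrop. None of these parameter names come from ARFX, Asymmetry, or Disguise; they only illustrate the kind of simplification being described.

```cpp
// Hypothetical "handful of dials" for real-time backdrop playback; the
// names and defaults are illustrative, not taken from any shipping product.
#include <cstdio>

struct BackdropDials {
    float timeOfDayHours     = 17.5f;   // virtual sun position
    float exposureStops      = 0.0f;    // overall backdrop brightness trim
    float colorTempKelvin    = 5600.0f; // backdrop white balance
    float sceneYawDegrees    = 0.0f;    // rotate the environment behind the set
    float sceneOffsetXMeters = 0.0f;    // slide the environment left/right
    float sceneOffsetYMeters = 0.0f;    // slide the environment forward/back
    float playbackSpeed      = 1.0f;    // 0 = hold frame, 1 = real time
    bool  frozen             = false;   // the backdrop's "stop button"
};

// Each frame, the playback system simply applies whatever the dials say.
void applyDials(const BackdropDials& d) {
    std::printf("t=%.2fh  %.0fK  yaw=%.1f  speed=%.2f%s\n",
                d.timeOfDayHours, d.colorTempKelvin,
                d.sceneYawDegrees, d.playbackSpeed,
                d.frozen ? "  [held]" : "");
}

int main() {
    BackdropDials dials;
    dials.timeOfDayHours  = 6.25f;   // dial the backdrop to dawn
    dials.colorTempKelvin = 4300.0f; // warm it up to match practicals
    applyDials(dials);
    return 0;
}
```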
Are ARFX Pro Plugin and ARFX Scenepacks from your company?
Yeah, ARFX Pro is a plugin for Unreal Engine 5 that unifies the workflow throughout the production process and is designed to work with the engine specifically for filmmaking. It has an in-engine options menu of the most-used controls; XR SYNC, a patented one-click XR filmmaking calibration system; low-latency tracking algorithms; saving/bookmarking; clip plane adjustments; color enhancements; cheat modes; trick shot modes; and integrations with LiveLink tracking systems. Basically, it lets you launch environments in a matter of seconds instead of hours, without needing to export to or run another piece of software that will eat up processing power.
There’s also ARFX, which we dropped the Pro designation from; that version supports smaller filmmakers and productions. The plugin is fully integrated, so you don’t need Unreal Editor or technical setup. This is what organically came out of the pandemic, since we couldn’t be on set. We had to give clients everything they needed to do it themselves while we supported them remotely. And then at NAB (National Association of Broadcasters), we showed off the integration of the ARFX app with our StudioBox. This lets you start doing virtual production using your iOS smartphone, even providing camera tracking connected to the Unreal Engine. So even people shooting on their phone can set up with something like a 4K TV and create tracked virtual environments for their productions.
ARFX Scenepacks is a set of Unreal-optimized virtual maps with an easy-to-use launcher that runs on existing 4K screens or projectors, so it’s an affordable entry point into virtual production.
I would like to thank Rene for taking the time to sit down with me and discuss the many changes occurring in the world of virtual production. Local 695 looks forward to partnering with ARwall to bring training and educational opportunities for our members. In the meantime, anyone interested in learning more about this growing area of our industry is encouraged to check out ARwall’s website at www.arwall.co.