It’s not surprising that Hollywood has been hit hard by the Covid-19 pandemic. After all, movie sets typically require hundreds of people coming together from all around the world to work in close proximity to one another. Early in the spring of 2020, production effectively shut down until further notice. But slowly, quietly, new movies—movies that were filmed during the pandemic—have begun surfacing. How? Filmmakers found ways to adapt, and now they’re getting even more tools to help them film safely.
Not every movie needs high-tech solutions, of course. Smaller films, like Netflix’s Malcolm & Marie or the recent Sundance flick How It Ends, are able to get by with smaller, quarantined crews. But for bigger, more complicated projects—the kind that need visual effects and lots of extras—tech is filling in the gaps on socially distant shoots. Here’s how.
One of the most innovative adaptations to date comes from Frame.io. The company is best known for providing web-based tools for teams to go over dailies and pass notes back and forth during the editing process. Today, though, Frame.io unveiled a new service: Camera to Cloud, which allows multiple people to start working on a shot the second the director films it, greatly reducing the number of people on set and increasing the number who can contribute from a safe, socially (very) distant place.
Here’s how it works: Let’s say you’ve got an 8K RED camera on set. Using the Camera to Cloud system, that rig would be connected to a transcoding box, such as a Teradek Cube 655, which takes that 8K video and transcodes it into a smaller, easier-to-view-and-share 1080p file. That box is also connected to the internet, as is a Sound Devices deck, which collects audio from all the microphones on set. As soon as someone yells “Cut!” the files begin uploading to the cloud, where anyone who has been granted access can review them.
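The flow described above can be sketched in a few lines of code. This is a purely illustrative mock-up, not Frame.io's actual API: the class names, the 15 Mbps proxy bitrate, and the take names are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical illustration of the on-set flow: camera -> transcoder -> cloud.
# All names and numbers here are invented for the sketch.

@dataclass
class Take:
    name: str
    resolution: str    # e.g. "8K"
    bitrate_mbps: float

def transcode_to_proxy(take: Take) -> Take:
    """Stand-in for the Teradek box: turn the big original into a 1080p proxy."""
    return Take(name=take.name + "_proxy", resolution="1080p", bitrate_mbps=15.0)

def upload_on_cut(takes: list[Take]) -> list[str]:
    """When someone yells 'Cut!', push each finished proxy to the cloud."""
    uploaded = []
    for take in takes:
        proxy = transcode_to_proxy(take)
        uploaded.append(proxy.name)  # in reality: an HTTPS upload to cloud storage
    return uploaded

print(upload_on_cut([Take("scene12_take3", "8K", 280.0)]))
# -> ['scene12_take3_proxy']
```

The key design idea is that only the small proxy travels over the network; the full-resolution originals stay on set until a hard drive reaches the editing bay.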
From there, people like executive producers and VFX supervisors can weigh in with notes in near real time. Even better, the system allows the movie’s editor to work on the film in tandem, even if they’re on the other side of the planet. As soon as the take is done, the video files (with separate but synced audio files) automatically appear in DaVinci Resolve, Final Cut, Adobe Premiere, or whatever editing software they’re using. From there, the editor can drop the last take into the timeline, apply effects and filters (such as keying out a green screen), and quickly export it back to Frame.io for everyone to check out and approve. Back on set, the director can review the new cut and leave notes that show up directly on the editor’s timeline with single-frame accuracy.
The files uploaded from the camera can range from 0.5 Mbps (think Zoom quality) to 15 Mbps (Netflix-ish), your choice. The higher end of that scale is generally more than enough for something like network news and could go to broadcast immediately. For films with a tight turnaround, the uploaded proxy files are edit-quality (and the audio files, which are much smaller, are the originals) and can be cut together immediately. When the hard drive with all the full-resolution files lands in the editing bay, they can be swapped into the edit with the click of a button.
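To get a feel for why those bitrates make over-the-air uploading practical, here's a quick back-of-the-envelope calculation. Bitrates are megabits per second, so dividing by 8 gives megabytes; the one-minute take used below is just an assumed example.

```python
# Rough proxy-file sizes at the two ends of the 0.5-15 Mbps range.
# Mbps is megabits per second; divide by 8 to convert to megabytes.

def proxy_size_mb(bitrate_mbps: float, seconds: float) -> float:
    return bitrate_mbps * seconds / 8

# For an assumed 60-second take:
print(proxy_size_mb(0.5, 60))   # 3.75 MB at the Zoom-quality end
print(proxy_size_mb(15.0, 60))  # 112.5 MB at the Netflix-ish end
```

Even at the high end, a minute of footage is roughly a hundred megabytes, which is why a consumer LTE or 5G hot spot can keep up with a day of shooting.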
Camera to Cloud has been used on one Hollywood production: Songbird. Last summer, the disaster flick, which landed on video on demand in December, became the first full production to beta test the service. In fact, Songbird was the first film to start production after the most stringent Covid-19 restrictions in California were lifted, so it had to do everything possible to minimize the crew, including shooting with RED’s smallest camera (the 8K Komodo) so the director of photography could also function as a camera operator. Meanwhile, six or more executives watched remotely as the shoot unfolded.
“We were all able to give notes while the filmmakers were in the moment,” says coproducer Max Votolato. “That’s as opposed to the usual 24-hour delay, where the crew would have to go back and set it all up again so they could make the adjustments.”
Incredibly, 90 percent of the uploading to the cloud happened over LTE and 5G using off-the-shelf hot spots and such. Only once, when shooting in the mountains without a cell signal, did they have to use a house’s Wi-Fi. The Teradek Cube has more than enough storage for a full day’s shoot, though, so even when filmmakers are off grid the device is ready to upload to the cloud as soon as it's back online.
Remote editing is just the beginning. Frame.io is putting the finishing touches on a livestream feature, so someone off set will be able to look through the main camera’s lens and weigh in on lighting, props, shot-framing, and more. That feature is expected to roll out in the near future as an addition to the subscription service, but Camera to Cloud is included in the cost for any Frame.io subscriber and is scheduled to go live next month.
Of course, larger studios have some rather wonderful toys of their own, and nowhere is that more obvious than at Industrial Light & Magic, Lucasfilm’s legendary visual effects department. You may have heard about the innovative Stagecraft LED setup that has been used extensively for backdrops on series like The Mandalorian (we’ll get to that in a moment), but that's just on set. For filmmakers needing a socially distant way to plan their production, ILM uses another tool: the Virtual Art Department, or VAD.
The VAD team—essentially a special-ops corps of artists and technicians highly skilled in location scouting—steps in as soon as the previsualization work (concept art, storyboarding) is done. When location scouting in normal times, the production designer, director, visual effects team, and other filmmakers might all hop on planes and pack into vans to inspect locations directly. Obviously, that isn’t safe or practical in a raging pandemic. So the VAD team has a fix: a tech setup that allows that group of filmmakers to collaborate in virtual reality. They’re able to scout locations as if they’re actually there and, once they’ve found their spot, frame up shots, make lighting decisions, and even change set pieces. Because both practical and digital elements are visible in the VR environment, they can make decisions about those, too.
Because the VAD setup uses Epic Games’ Unreal Engine, it lets filmmakers collaborate using any device that can run Unreal, from phones to laptops to Oculus VR headsets. Everybody involved can talk to each other in real time, point at things with a laser pointer, and pick things up and move them around. They even have a virtual version of their entire lens package loaded into the program, so the DP can set up exact shots. This gives the whole team a picture of what will need to be built, what can be used that already exists, and what has to be created digitally.
When they need to scout a real-world location, ILM sends out a very small team with specialized equipment to capture it. Say they want to shoot a room in a spooky old mansion. They will scan the entire space with Lidar, then photograph it from practically every angle, and stitch everything into a 360-degree digital model of that environment. Or, if it isn’t a real place—a cave on an alien planet that doesn’t exist, for instance—the VAD might build proxy models of the environment that the whole team can explore virtually. Then, once all the shots are planned out and the director and other filmmakers are happy with what they’ve got, those models will be turned over to ILM’s visual effects team to make a photorealistic environment using them as a foundation.
This works nicely in conjunction with the aforementioned Stagecraft LED, which is basically a big, high-definition TV that serves as an ever-changing digital backdrop. It’s 21 feet tall, 75 feet in diameter, and can surround 270 degrees of a set (including the ceiling). It can re-create a distant planet or an enemy base and calibrates itself so that it looks photorealistic for whichever camera is pointed at it. The images it produces are so sharp that more than 50 percent of the first season of The Mandalorian was shot using Stagecraft, and it completely eliminated the need for location shoots. (The images are also so pristine they often confuse humans. On a recent shoot, a Covid safety officer ran over to chastise two unmasked extras for standing too close together only to find out they were avatars on the LED wall.)
Actors love working on a Stagecraft set because, instead of imagining a scene while looking at a green screen, they have something to respond to. It also takes care of a lot of the lighting. For example, when shooting the recent Netflix movie The Midnight Sky, filmmakers used the LED screen instead of traditional lights. In this way, the actors can be lit by the environment just like they would be in real life—it doesn’t look like “movie lighting.” In the cases where they do want a little extra, though, Stagecraft can handle that, too.
To understand how that comes together practically, consider George Clooney. Early in production on The Midnight Sky, ILM’s VAD crew traveled to remote, snowy locations in Iceland and captured detailed scans of the environment. Those were then turned into Stagecraft images that allowed Clooney to sit on a set, look out the window of “Barbeau research station,” and see the same snowy landscape the audience saw in the final film. Everything was captured in-camera. The details might be altered and augmented, but it’s a massive leap forward for soundstages.
It's also just the beginning. Stagecraft was created before Covid hit, and both The Mandalorian and The Midnight Sky were shot prior to lockdowns, but the techniques ILM pioneered are now becoming even more valuable during pandemic productions, reducing crew sizes and allowing filmmakers to work together remotely. The Stagecraft set in London where Clooney's film was shot is now being used for The Batman; another one in Sydney became a home for Thor: Love and Thunder. Similarly, Pinewood Studios in Atlanta recently announced that it, too, will offer virtual production capabilities similar to Stagecraft's.
These, of course, are not the only Covid-conscious cinematic innovations we’ve seen in the last year. There are also technologies like Solo Cinebot, a robotic camera that can film actors remotely, and Crew in a Box, which is essentially a briefcase with a Blackmagic Pocket Cinema Camera 6K, a three-panel LED light, a teleprompter, a mic, and an ATEM Mini Pro, which allows the director to control everything (and direct the talent) remotely. That has enabled networks like NBC, ABC, and MTV to get super-high-quality footage of people (especially for interviews and such) while exercising maximum caution.
A lot of this tech was in development before the pandemic hit, but the rather dire circumstances brought on by Covid-19 have fast-tracked it, inspired new features, and spurred adoption. “We always talk about how 2020 was such a difficult year and all of the problems with it, but we made so much progress in 2020 with pushing all of these technologies,” Votolato says. “We were forced to make these discoveries, so in some ways we're probably seven years ahead of where we would be otherwise.”
Seven years ahead and light-years away.