
Audiences streaming the current season of “The Mandalorian” have seen the main characters fly through space and explore the murky mines of Mandalore thanks to the groundbreaking virtual production technology invented for the show.
Virtual production not only generates richly detailed sets displayed on giant LED screens instead of blank green screens, but also saves money by making production more efficient and accurate. Since “The Mandalorian” debuted in 2019, the technology has also been used in the films “Dune” and “The Batman.”
Learning technology from ‘The Mandalorian’
Now students at Arizona State University have access to this cutting-edge technology to tell their own stories. The Sidney Poitier New American Film School offers virtual production technology with extremely high-resolution LED wall and floor displays created by Planar Studios at the ASU California Center in downtown Los Angeles.
The Media and Immersive eXperience Center in downtown Mesa will offer the technology starting in the fall semester, according to Jake Pinholster, founding director of the MIX Center and executive dean of the Herberger Institute for Design and the Arts.
“In just a few years, this has become one of the most explosive and transformative trends in the film production industry because it has enormous positive consequences,” he said.
“It cuts post-production time. It makes it easier to pre-visualize and know what a shot will look like before you turn on the camera,” he said.
“Actors can see the environment and react to it. You can shoot a dawn scene all day.”
Industrial Light & Magic, the visual effects company founded by George Lucas, released a video explaining how it created the technology for “The Mandalorian,” which lets the digital world-building be adjusted in real time and saved. The method streamlines work that was previously split across the pre-production, production and post-production timelines.
The environments are created digitally and loaded onto the giant screens, where the actors can interact with what the audience sees. Previously, actors would work in front of a blank green screen and the digital effects were added during post-production.
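The effect depends on one thing happening every frame: the wall image is re-rendered for the tracked position of the physical camera, so virtual objects hold the correct angle and parallax on camera. A simplified 2D sketch of that geometry, hypothetical code rather than anything ILM has published, looks like this:

```python
# Simplified 2D sketch (hypothetical, not ILM's pipeline): where on the LED
# wall a virtual object must be drawn so the tracked camera sees it at the
# same angle a real object at that depth would have. Similar triangles: the
# camera-to-object ray crosses the wall plane at this x.

def wall_x(camera_x: float, object_x: float, object_depth: float,
           wall_depth: float) -> float:
    return camera_x + (object_x - camera_x) * wall_depth / object_depth

# A virtual mountain 100 m past the camera line; the wall stands 5 m away.
for cam_x in (0.0, 1.0, 2.0):  # the camera dollies sideways, 1 m at a time
    print(f"camera at {cam_x} m -> draw mountain at "
          f"{wall_x(cam_x, 10.0, 100.0, 5.0):.2f} m on the wall")

# Output: 0.50, 1.45, 2.40 -- the drawn spot chases the camera across the
# wall, so the mountain's apparent direction barely changes, exactly the
# parallax a distant real object would show. A static backdrop cannot do
# this, which is why the environment is re-rendered live, every frame, from
# the camera's tracked pose.
```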
Because ASU is among the first film schools to offer the technology, Pinholster and Nonny de la Peña, founder of ASU’s Narrative and Emerging Media program, are helping set standards for teaching the method. They are part of a working group of the Society of Motion Picture and Television Engineers.
“To some extent, there’s no industry standard for how this should be done because it’s still an experimental process,” Pinholster said.
“We are one of the first universities to train people in what will become the most important production technique.”
Courses in virtual production will be included in both the Narrative and Emerging Media program in Los Angeles and the undergraduate program at the MIX Center in Mesa.
De la Peña’s students in Los Angeles have been working with the Planar screens to tell both fiction and non-fiction stories.
“We use new technology in every possible way,” she said.
“We have students walking around carrying iPads scanning the building, and how do you tell the story with what you’ve scanned?”
The scans run on game engine technology, and once uploaded to the giant LED screens, the effect is immersive. Her students work on stories involving Shakespeare, drug abuse, water problems and baseball.
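The article doesn’t name the students’ exact toolchain, but a scan-to-screen pipeline of the kind described typically means loading the scanned mesh, cleaning it and scaling it to real-world size before the game engine takes over. A minimal sketch using the open-source trimesh library, with a made-up file name and sizes:

```python
# Hedged sketch of a scan-prep step; the article names only "game engine
# technology," so the library, file name and sizes here are assumptions.
import trimesh  # open-source mesh library, standing in for an engine importer

def prepare_scan(path: str, target_height_m: float = 3.0) -> trimesh.Trimesh:
    """Load an iPad scan export, drop stray scan debris, and scale it to a
    plausible real-world size so it reads correctly on the LED wall."""
    mesh = trimesh.load(path, force="mesh")
    mesh.remove_unreferenced_vertices()
    mesh.apply_scale(target_height_m / mesh.extents.max())
    return mesh

scene_mesh = prepare_scan("building_scan.glb")  # hypothetical scan file
print(scene_mesh.extents)  # sanity check before handing off to the engine
```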
De la Peña sees virtual production as the future, not only for films, but also for narrative journalism.
“You can have a reporter on the scene without being on the scene,” she said.
“If we want to make sure we have students prepared for the storytelling of the future, we have to teach them now.”
A crucial part of embracing new technology is deciding how to use it ethically. She and Mary Matheson, director and professor of practice at the film school, teach a class called “Diversity and Ethics in New Media.”
“Students are now learning about how (artificial intelligence) is trained, and whether that training is sexist and racist,” de la Peña said.
One of de la Peña’s students, Cameron Kostopoulos, debuted “Body of Mine VR,” an immersive virtual reality experience, at the South by Southwest festival March 12-14, where it won a jury prize. The experience places the viewer into another body for an exploration of gender dysphoria and trans identity.
Kostopoulos combined several technologies, including the Planar screens and VIVE, to create “Body of Mine VR,” which pairs body, face and eye tracking with audio interviews.
Kostopoulos, a gay man, grew up in Texas.
“Having been in the closet for pretty much my entire K–12 experience, looking back, I know how having certain spaces could have helped me,” he said.
“So because of that, I’m passionate about creating these spaces and those experiences for other queer youth who can benefit from them. And for cisgender people to learn about the trans experience and gain empathy.”
“Body of Mine VR” uses full-body motion capture and eye tracking, so at one point the viewer looks in a mirror and sees themselves blinking.
“I put everything together for a more intimate VR experience than what you would normally get with controllers,” said Kostopoulos, who is a writer, director and developer based in Los Angeles.
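Kostopoulos hasn’t published his code, but the combination he describes can be pictured as independent tracker streams, each on its own clock, updating one shared avatar state, so a blink caught by the eye tracker shows up on the mirrored avatar that same frame. A hypothetical sketch, not the project’s actual code:

```python
# Illustrative only: merging independently-clocked tracker streams (body
# pose, face blendshapes, eye state) into one avatar state per frame. All
# names and data shapes are invented for the sketch.
from dataclasses import dataclass, field

@dataclass
class AvatarState:
    joints: dict = field(default_factory=dict)       # e.g., {"head": (x, y, z)}
    blendshapes: dict = field(default_factory=dict)  # e.g., {"jawOpen": 0.2}
    eyes_closed: bool = False

def merge_trackers(state: AvatarState, body=None, face=None, eyes=None) -> AvatarState:
    """Each stream updates only the fields it owns; last-known values
    persist, which is what lets separately-clocked trackers compose."""
    if body is not None:
        state.joints.update(body)
    if face is not None:
        state.blendshapes.update(face)
    if eyes is not None:
        state.eyes_closed = eyes["closed"]
    return state

state = AvatarState()
state = merge_trackers(state, body={"head": (0.0, 1.7, 0.0)})
state = merge_trackers(state, eyes={"closed": True})  # blink arrives on its own clock
print(state.eyes_closed, state.joints)  # the mirror renders a blink this frame
```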
Combining all the new technology at the ASU California Center was a challenge.
“It’s basically supergluing a lot of cutting-edge stuff into our own makeshift tracking system,” he said.
“Because all the pieces of technology exist in isolated pockets, there aren’t many experiences that combine everything to make a fully immersive embodiment of a body in VR,” he said.
“There aren’t many tutorials I could follow and not many people who have worked on it.
“But getting it to finally work was totally worth it, and it ended up being super cool.”