Beckley Studio’s motion capture technology brings real-life movements to game scenes

Ellen Glover • Aug. 21, 2017
Musical theatre senior Scott Van Wye and theatre and drama junior Deborah Alix act out fight choreography for the cut scene in the Beckley Studio. (Courtesy photo)

Four actors dressed in head-to-toe black body suits with white reflective dot nodes spaced out across their bodies perform choreographed movements in Franklin Hall’s Beckley Studio.

A camera records the movements and transmits the sequence to Blade, a software that cleans the data and runs it through a filter to smooth and naturalize the captured movements.

The images then stream through Pegasus, a retargeting software that maps the actors’ choreography onto characters in Unreal, a video game engine.

The actors are now animated characters. Their choreographed moves appear in a video game.
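The cleanup step the article describes, smoothing noisy marker data before it reaches the game engine, can be illustrated with a toy filter. This is a hedged sketch, not Blade’s actual algorithm or API: the function name, data layout, and window size are all invented for illustration, and a moving average is just one of many filters such tools apply.

```python
# Illustrative sketch only: a simple moving-average filter over 3-D marker
# positions, standing in for the kind of smoothing a cleanup tool applies
# to raw motion capture data. Names and structure are hypothetical.
def smooth_markers(frames, window=3):
    """Average each frame's marker positions with its neighbors.

    frames: list of frames; each frame is a list of (x, y, z) marker tuples.
    Returns a new list of frames with frame-to-frame jitter reduced.
    """
    smoothed = []
    half = window // 2
    for i in range(len(frames)):
        # Clamp the averaging window at the start and end of the sequence.
        lo, hi = max(0, i - half), min(len(frames), i + half + 1)
        chunk = frames[lo:hi]
        frame = []
        for m in range(len(frames[i])):
            # Average this marker's x, y, z coordinates across the window.
            positions = [f[m] for f in chunk]
            frame.append(tuple(
                sum(p[k] for p in positions) / len(positions) for k in range(3)
            ))
        smoothed.append(frame)
    return smoothed
```

A spike in a single frame, such as a marker briefly misread by a camera, gets averaged against its neighbors instead of jerking the animated character.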

The Beckley Studio’s motion capture system allows student filmmakers and game designers to create digital animations based on live human actions, allowing for more lifelike representations. The Media School installed the technology in January.

Students and faculty from The Media School, the IUPUI School of Informatics and Computing, the Jacobs School of Music, the Department of Theatre, Drama, and Contemporary Dance and the kinesiology department in the School of Public Health collaborated on a video game cut scene using motion capture technology in the Beckley Studio. (Courtesy image)

“There is a huge need for motion capture techs right now,” said Rush Swope, a game art and modeling lecturer at The Media School. Before coming to IU, Swope used the technology at Blizzard, a California-based video game publisher and developer that is best known for its games World of Warcraft and Overwatch. “This isn’t something you can just download. You need to be in the studio working with the technology. It’s changed a lot since my days at Blizzard. The equipment we have here is top of the line.”

During spring semester, The Media School collaborated with four other units to create a five-minute animated video game cut scene using the system. The team consisted of about 50 people, many of whom are Media School students.

The scene was based on the storyline of Final Fantasy IX, a role-playing fantasy video game developed for PlayStation in 2000. Of the four actors in the scene, two are characters from the game, one is from the game’s epilogue and another is an original student creation. Although the story is rooted in an existing game, all of the choreography, dialogue, animation, sounds, music and art were created by the five departments working on the project.

IUPUI School of Informatics and Computing students contributed the visual effects. Jacobs School of Music students composed the music. The Department of Theatre, Drama, and Contemporary Dance handled the voice casting and acting. The kinesiology department in the School of Public Health choreographed the fight scenes. The Media School designed sound and characters.

“I never thought I was going to be able to touch anything like this,” said Morgan Anderson, BA’17, who was a senior studying telecommunications with a focus on design and production. She developed concept art and worked on 2-D animation for the original character and villain. She also helped design concepts for weapons used in the fight scenes.

The technology and equipment required to create a cut scene with motion capture are new not only to The Media School, but to the game industry, too.

The cut scene was based on the storyline from Final Fantasy IX, a role-playing fantasy video game developed for PlayStation in 2000, but all of the choreography, dialogue, animation, sounds, music and art were created by the five departments working on the project. (Courtesy image)

Both Anderson and Julian Povinelli, a senior majoring in game design, enjoyed the inventiveness of the project, despite the fact that the scene was based on a game that is almost 20 years old.

“We’re all pretty much learning as we go,” said Povinelli, who worked on 3-D models and rigging, the process of building a character’s skeleton so it moves like a natural physical structure. “This is all pretty new and, for a lot of us, this is the first test run to see what we can really do.”
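The idea behind rigging, that a character is animated through a joint hierarchy rather than as loose geometry, can be sketched in a few lines. This toy version is an illustration only: real rigs in production tools carry full rotations, skin weights and constraints, and none of these class or joint names come from the project described here.

```python
# Toy sketch of a rig's core idea: a joint hierarchy in which each
# joint's world position follows from its parent's. Hypothetical names;
# real rigging tools use full 3-D transforms, not bare 2-D offsets.
class Joint:
    def __init__(self, name, offset, parent=None):
        self.name = name          # e.g. "shoulder", "elbow", "wrist"
        self.offset = offset      # (x, y) offset from the parent joint
        self.parent = parent

    def world_position(self):
        # Walk up the hierarchy, accumulating offsets from the root down.
        if self.parent is None:
            return self.offset
        px, py = self.parent.world_position()
        return (px + self.offset[0], py + self.offset[1])

# A minimal arm: moving the shoulder carries the elbow and wrist with it,
# which is why characters are rigged before they can be animated.
shoulder = Joint("shoulder", (0, 5))
elbow = Joint("elbow", (2, 0), parent=shoulder)
wrist = Joint("wrist", (2, 0), parent=elbow)
```

Because each joint is defined relative to its parent, one change at the top of the hierarchy propagates naturally through the whole limb.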

Once the footage was captured and animated, it was up to the sound design team, led in this case by Norbert Herber, a senior lecturer in sound art and music at The Media School, to develop the scene’s soundscape. Herber recruited a team of sound designers from one of his classes, and they were in charge of creating sounds for the environment and character movements.

“The students have accumulated a sound effects library where they can go through raw matter and customize it for the given movement,” Herber said. “For example, students are using garden shovels to create the sound for the big metal claws on the bad guy.”

Motion capture technology in the Beckley Studio allowed artists to convert live human actions into game animations. (Courtesy image)

Herber explained that, although the brain and the eyes do a lot of the work when viewers watch animated pieces, the sound designers still have to manipulate the sounds they capture in order to match them to the size of the character or the action itself. They need to make sure that, in the end, the claws sound like scary claws, not gardening shovels. This, Herber explained, requires enormous attention to detail.
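One common way to make a small source read as a larger object is to lower its pitch by resampling, so a shovel scrape deepens into something claw-like. The sketch below is a hedged illustration of that general trick, not the students’ actual workflow or any specific tool; the function name and nearest-neighbor approach are assumptions made for brevity.

```python
# Illustrative sketch of pitch-lowering by resampling: stretching a
# sample buffer deepens its pitch, making a small sound source read as
# a larger object. Hypothetical function; real audio tools interpolate
# between samples and can preserve the clip's original duration.
def pitch_down(samples, factor):
    """Stretch `samples` by `factor` (> 1), lowering the pitch.

    samples: list of amplitude values from a recording.
    Returns a longer buffer; played at the same rate, it sounds deeper.
    """
    out_len = int(len(samples) * factor)
    # Nearest-neighbor resampling: each output sample maps back to the
    # input position i / factor.
    return [samples[int(i / factor)] for i in range(out_len)]
```

Stretching by a factor of two doubles the clip’s length and drops its pitch by roughly an octave, one of several transformations a designer might layer to turn garden shovels into metal claws.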

The finished product took months — including many long weekends — to complete.

“Students were definitely excited to see the nitty gritty technical side of animation, but a lot of them were thrown into the gauntlet quickly and got to see how tough animation really is,” Swope said.

Swope said The Media School will use the motion capture system for similar projects in the future.
