Warner Bros. Pictures / MGM
Simulation software that helped filmmakers create the character Gollum's skin for "The Hobbit: An Unexpected Journey" is one of the technologies recognized by the Academy of Motion Picture Arts and Sciences.
The goal of every movie is for the audience to suspend its collective disbelief and become immersed in the world created on screen. With special effects breakthroughs continuing to raise the bar for movie audiences, the technical folks behind the scenes are convening on Saturday to celebrate the science and engineering advances in moviemaking.
Audiences know that Daniel Day-Lewis is not really Abraham Lincoln and that Anne Hathaway is not Fantine, but when they watched "Lincoln" or "Les Miserables," they believed. Those kinds of accomplishments are traditionally honored by the Academy of Motion Picture Arts and Sciences at a ceremony held a couple of weeks before the televised Oscar extravaganza.
Saturday's Scientific and Technical Achievement awards ceremony will be at the Beverly Hills Hotel, co-hosted by Zoe Saldana and Chris Pine, who both starred in 2009's "Star Trek" reboot. Nine scientific and technical awards will honor a total of 25 innovators whose hardware and software have changed the process of moviemaking.
Numerous award winners spoke to Inside Science to explain the science, engineering, and mathematical tools behind the latest special-effects wonders:
Visual effects: Feathers and smoke
Even though Natalie Portman has tremendous acting ability, it was screen science that helped her sprout feathers during her final transformation from a woman to a swan in the 2010 film "Black Swan."
"The team at Look FX had been working for weeks trying to make it work," said Ross Shain, chief marketing officer at Imagineer Systems Ltd. "The end result had to show the effect starting on her back, neck and shoulders with the camera panning close up."
Imagineer Systems Ltd.
Mocha Planar Tracking Software helped Natalie Portman sprout feathers in the 2010 film "Black Swan."
With lots of camera movement and very few points to digitally attach feathers to Portman's arm, Look FX had tried all the tools it had, but nothing worked. So the team tried the Mocha planar tracking software, which was created to solve common technical problems and save time for visual-effects artists, editors, animators and colorists.
Developed by a team including Shain and fellow award winners Philip McLauchlan, Allan Jaenicke, and John-Paul Smith, the software essentially tracks the movement of each on-screen digital picture element, or pixel, during a scene. This allows an artist to have more control over the final look and movement of a visual effect. Almost instantly, Mocha allowed artists to take the image of swan skin and feathers that they had created, attach it to Portman's arm, and integrate the image into her skin.
"This allowed the reveal to happen," said Shain. "The result blew people away."
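The idea behind planar tracking can be sketched in a few lines of code. This is an illustrative toy only, not Mocha's implementation: instead of following isolated points, a planar tracker estimates one transform per frame for a whole surface, and anything digitally attached to that surface rides along. Real planar trackers fit full perspective homographies; this sketch fits a simpler 2-D affine transform from three tracked corners.

```python
# Toy planar-tracking sketch (illustrative only, not Mocha's algorithm):
# fit one affine transform per frame from three tracked corner points,
# then move an "attached" element (a feather) with the whole plane.

def fit_affine(src, dst):
    """Solve the 2-D affine map [a b tx; c d ty] sending src to dst (3 pairs)."""
    def solve3(rows):
        # Gaussian elimination on a 3x4 augmented matrix.
        # No pivoting: the demo points below are ordered so pivots are nonzero.
        m = [r[:] for r in rows]
        for i in range(3):
            p = m[i][i]
            m[i] = [v / p for v in m[i]]
            for j in range(3):
                if j != i:
                    f = m[j][i]
                    m[j] = [a - f * b for a, b in zip(m[j], m[i])]
        return [r[3] for r in m]
    xs = [[sx, sy, 1.0, dx] for (sx, sy), (dx, _) in zip(src, dst)]
    ys = [[sx, sy, 1.0, dy] for (sx, sy), (_, dy) in zip(src, dst)]
    return solve3(xs), solve3(ys)

def apply_affine(aff, pt):
    (a, b, tx), (c, d, ty) = aff
    x, y = pt
    return (a * x + b * y + tx, c * x + d * y + ty)

# Frame 1: three tracked corners of the plane; frame 2: same corners
# after the camera moves (here, a simple translation by (10, 5)).
src = [(100, 0), (0, 100), (0, 0)]
dst = [(110, 5), (10, 105), (10, 5)]
aff = fit_affine(src, dst)
feather = (50, 50)                # element attached to the plane
print(apply_affine(aff, feather))  # feather follows the plane, near (60, 55)
```

Because the whole surface constrains the fit, the track survives frames where any single point would be too blurry or featureless to follow on its own.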
Audiences are often blown away by large fiery explosions or billowing clouds of smoke.
Theodore Kim / UCSB
A still image shows how Wavelet Turbulence software can create wisps of virtual smoke that can take on any desired shape.
In the 2011 film "Hugo," as Hugo Cabret runs through the clock tower trying to escape the train station inspector, the wisps of smoke that provided him an extra cloak of invisibility were created by Theodore Kim, a computer scientist at the University of California, Santa Barbara, and fellow award winners Nils Thuerey, Markus Gross and Doug James. Their Wavelet Turbulence software makes it easier for artists to control the final look of smoke clouds and fiery flames on screen.
"While this work is highly technical, its ultimate goal is an aesthetic one," said Kim. "When many people think of math and science, the perception is often that it leaves no room for creativity or intuition. However, both played a tremendous role in the design and implementation of this software and in turn it aids others in their own creative work."
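The core trick can be illustrated with a toy sketch. To be clear, this is a loose stand-in, not Kim and colleagues' published algorithm: wavelet turbulence runs a cheap, low-resolution smoke simulation and then synthesizes plausible fine-scale detail when upscaling, instead of paying for a far more expensive high-resolution simulation. Here, a 1-D density slice is interpolated to higher resolution and crude per-octave noise (a stand-in for the actual band-limited wavelet noise) adds the missing small-scale detail.

```python
import random

# Illustrative sketch only -- not the published Wavelet Turbulence method.
# Idea: simulate coarse, upscale, then inject detail at the fine scales
# the coarse simulation could not resolve.

def upsample(coarse, factor):
    """Linearly interpolate a 1-D density field to higher resolution."""
    n = len(coarse)
    fine = []
    for i in range((n - 1) * factor + 1):
        t = i / factor
        lo = min(int(t), n - 2)
        frac = t - lo
        fine.append(coarse[lo] * (1 - frac) + coarse[lo + 1] * frac)
    return fine

def add_turbulent_detail(fine, octaves=3, amplitude=0.1, seed=0):
    """Layer noise, halving the amplitude each octave, so the added
    detail is concentrated at small scales (crudely mimicking the
    band-limited wavelet noise of the real method)."""
    rng = random.Random(seed)
    out = fine[:]
    amp = amplitude
    for _ in range(octaves):
        out = [d + amp * (rng.random() - 0.5) for d in out]
        amp *= 0.5
    return out

coarse = [0.0, 0.2, 0.9, 0.3, 0.0]        # toy low-res smoke density slice
fine = add_turbulent_detail(upsample(coarse, 4))
print(len(fine))                           # 17 samples recovered from 5
```

The artistic control Kim describes comes from exactly these kinds of knobs: the amplitude and scale of the injected detail can be tuned per shot without re-running the underlying simulation.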
CG skin and movement
Bringing to life a computer-generated character like Gollum from the 2012 film "The Hobbit: An Unexpected Journey" was a unique challenge, because part of what made him appear so lifelike had to do with his skin and his movements. To make this work, a team of artists and scientists from Weta Digital, including award winners Simon Clutterbuck, Richard Dorling and James Jacobs, developed an approach they call "Tissue: A Physically-Based Character Simulation Framework."
"The framework is used to construct and simulate the anatomical components of our digital creatures and characters," said Jacobs, a supervisor for creature special-effects.
With a similar goal in mind, a group at Centropolis FX, including awardees J.P. Lewis, Nickson Fong and Matt Cordner, created the pose space deformation, or "PSD," technique.
"PSD is an artist-friendly way to fix basic skinning problems with animation," said Cordner, an FX artist at Blizzard Entertainment. "It is an integral component of Weta's tissue framework."
PSD helps an artist pose a computer-generated arm into a specific position, such as an arm flexed making a muscle. The artist can then fix the skin's surface and save those settings for that specific pose. After the skin is fixed for all the poses in a scene, PSD incorporates all of that information so that as the arm moves from flexed to relaxed, the skin looks realistic throughout the motion.
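A minimal sketch can show the shape of that idea. This is a hypothetical toy, not the published PSD technique: real PSD interpolates sculpted corrections with radial basis functions over multi-joint pose vectors, while this sketch uses a single joint angle and simple inverse-distance weights. The principle is the same: the artist sculpts fixes at key poses, and the system blends them for every in-between pose.

```python
# Hypothetical, minimal pose-space-deformation-style sketch: blend
# artist-sculpted skin corrections by proximity in pose space.
# (Real PSD uses radial basis interpolation over full pose vectors.)

def psd_correction(pose, keyed):
    """Return the blended skin correction for an arbitrary pose.
    `keyed` is a list of (key_pose_angle, sculpted_correction) pairs."""
    weights, total = [], 0.0
    for key_pose, corr in keyed:
        if abs(pose - key_pose) < 1e-9:
            return corr                       # exactly at a keyed pose
        w = 1.0 / abs(pose - key_pose) ** 2   # nearer keys dominate
        weights.append((w, corr))
        total += w
    return sum(w * c for w, c in weights) / total

# Artist-sculpted offsets: no bulge at rest, full bicep bulge at 90 degrees.
keyed = [(0.0, 0.0), (90.0, 1.0)]
print(psd_correction(45.0, keyed))   # halfway pose blends to 0.5
```

The "artist-friendly" quality Cordner mentions is visible even here: the artist only sculpts the extremes, and the interpolation fills in every frame between them.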
Lighting: From a scene to city
In the 2001 animated film, "Shrek," creating a rose-colored sunset was part art and part science for the team who worked at PDI/Dreamworks, including Daniel Wexler, Lawrence Kesteloot and Drew Olbrich.
A screenshot shows the Light system at work.
"We made a tool for artists to help them achieve new levels of creativity," said Wexler, now chief executive officer of The11ers. "Lighters tell a story with light, and since a lighter's time is more valuable than a computer's time, we developed the Light system."
The Light system combines lighting and rendering into one tool. Lighting is the step in which the artist adds light to the scene, such as an illuminated desk lamp. Rendering generates the entire scene by forming an image that combines the lamp's light, the wood grain on the desk and the color of the wall. Combining the two lets the lighter see what the light looks like in the scene.
"Instead of having to wait hours between making a change to a scene and being able to view it, the artist is able to see changes in lighting in real time," said Wexler.
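One way to get that real-time feedback, sketched below as an assumed illustration rather than PDI/DreamWorks' actual system, is to cache everything in the frame that does not depend on the lights (geometry, textures, surface colors), so tweaking a light only recomputes the cheap lighting term per pixel.

```python
# Assumed illustration of interactive relighting (not the actual Light
# system): the expensive surface data is computed once and cached, so
# changing a light's intensity is a near-instant per-pixel multiply.

# Per-pixel surface data, computed once by the slow renderer.
surface = [
    {"albedo": 0.8},   # wood grain on the desk
    {"albedo": 0.5},   # wall color
]

def shade(surface, light_intensity):
    """Cheap relight pass: only the light term is recomputed."""
    return [round(p["albedo"] * light_intensity, 3) for p in surface]

print(shade(surface, 1.0))   # desk lamp fully on
print(shade(surface, 0.2))   # artist dims the lamp; feedback is instant
```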
Focusing on the lights in one room is one thing, but trying to light up five blocks of the New York City skyline is another. For Steve LaVietes, Brian Hall and Jeremy Selan at Sony Pictures Imageworks, creating Katana, a computer graphics scene management and lighting software, was a way to overcome the common problem of using up all of the computer's memory to generate large, complicated scenes.
"Katana is specialized for large-scale film production where there is lots of data or lots of team members involved," said LaVietes, a pipeline architect. He develops the software process that moves data between departments for final movie frame delivery. "The way Katana works, if I make a change to a scene, I only save that change, and you would see a flowchart of all the changes to this scene."
For example, an artist can produce a set of instructions for how the light will look streaming from an apartment window at night. Then, if the artist decides to turn that one window into an entire apartment building the length of a city block, Katana can apply the same lighting instructions across the much larger environment.
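The change-only, deferred style LaVietes describes can be sketched as follows. This is an assumed design for illustration, not Katana's real API: each edit is stored as a small operation in a chain rather than as an expanded scene, and nothing is evaluated until the final frame is needed, so one lighting recipe can fan out over a huge environment without duplicating data.

```python
# Assumed sketch of deferred, change-only scene editing (illustrative,
# not Katana's actual API): the "scene" is a list of operations, and the
# full scene only exists when the list is evaluated on demand.

def base_scene():
    return {"window_1": {"light": None}}

def set_light(recipe):
    """Record a lighting edit; applied to whatever the scene is later."""
    def op(scene):
        return {k: {**v, "light": recipe} for k, v in scene.items()}
    return op

def duplicate(name_prefix, count):
    """Record an expansion: one template object becomes many copies."""
    def op(scene):
        template = scene[name_prefix + "_1"]
        return {f"{name_prefix}_{i}": dict(template) for i in range(1, count + 1)}
    return op

# Only the edits are saved -- tiny, regardless of final scene size.
edits = [set_light("warm, flickering"), duplicate("window", 500)]

def evaluate(ops):
    scene = base_scene()
    for op in ops:
        scene = op(scene)
    return scene

final = evaluate(edits)
print(len(final), final["window_237"]["light"])  # 500 windows, one saved recipe
```

Storing the recipe instead of the result is also why memory stops being the bottleneck: five city blocks of windows cost no more to describe than one.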
Getting the lighting just right, making the characters appear lifelike and creating visual effects that take an audience's breath away are the goals of these Oscar award-winning screen scientists and ingenious engineers. When they do their job well, the audience doesn't even notice their work.
Emilie Lorditch is an editor and writer for Inside Science TV.
This report was originally published by Inside Science News Service as "Oscar Sci-Tech Awards Honor Ingenious Screen Science and Engineering." Copyright 2013 American Institute of Physics. Republished with permission.