
Disney rendered its new animated film ‘Big Hero 6’ on a 55,000-core supercomputer. The movie was made on a beta version of the renderer, says Hank Driskill, software Disney created from the ground up to handle the film’s impressive lighting. It’s just one of about three dozen tools the studio used to bring the robotics-friendly world of San Fransokyo to life. Some, like the program Tonic, originally created for Rapunzel’s hair in Tangled, are merely improved versions of software built for previous efforts, or “shows” as Disney calls them. Hyperion, however, represents the studio’s greatest and riskiest commitment to R&D in animation technology thus far. And its feasibility wasn’t always a sure thing, something Disney’s Chief Technology Officer Andy Hendrickson underscores when he says, “It’s the analog to building a car while you’re driving it.”

It took a team of about 10 people over two years to build Hyperion. The software allowed animators to eschew the incredibly time-consuming manual work that single-bounce indirect lighting demanded in favor of 10 to 20 bounces simulated by the software. It’s responsible for environmental effects — stuff most audiences might take for granted, like when they see Baymax, the soft, vinyl robot featured in the film, illuminated from behind. That seemingly mundane lighting trick is no small feat; it required the use of a 55,000-core supercomputer spread across four geographic locations.
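
To make the idea of light “bounces” concrete, here is a minimal, illustrative sketch of how a path-tracing renderer accumulates indirect light over several bounces. It is not Hyperion; the toy scene, constants and function names are all invented for illustration.

```python
import random

# Illustrative toy scene: every diffuse surface reflects 65% of incoming
# light, and only the light source itself emits anything.
LIGHT_EMISSION = 10.0      # brightness of the light source
SURFACE_ALBEDO = 0.65      # fraction of light each surface reflects
HIT_LIGHT_PROB = 0.3       # chance a random bounce direction reaches the light

def trace_path(max_bounces):
    """Follow one light path, accumulating whatever light it picks up."""
    radiance = 0.0
    throughput = 1.0               # how much energy the path still carries
    for _ in range(max_bounces):
        if random.random() < HIT_LIGHT_PROB:
            radiance += throughput * LIGHT_EMISSION
            break                  # path terminated at the light
        throughput *= SURFACE_ALBEDO   # energy lost at each diffuse bounce
    return radiance

def render_pixel(samples, max_bounces):
    """Average many random paths, as a path tracer does per pixel."""
    return sum(trace_path(max_bounces) for _ in range(samples)) / samples

if __name__ == "__main__":
    # One bounce versus many: extra bounces pick up indirect light that the
    # single-bounce estimate simply cannot see.
    print("1 bounce  :", round(render_pixel(100_000, 1), 3))
    print("20 bounces:", round(render_pixel(100_000, 20), 3))
```

The point of the toy example is only that allowing more bounces recovers indirect light a single-bounce estimate misses, at the cost of tracing far more paths, which is where the 55,000 cores come in.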

“This movie’s so complex that humans couldn’t actually handle the complexity. We have to come up with automated systems,” says Hendrickson. To manage that cluster and the 400,000-plus computing jobs it processes per day (roughly 1.1 million computational hours), his team created software called Coda, which treats the four render farms like a single supercomputer. If one or more of those thousands of jobs fails, Coda alerts the appropriate staffers via an iPhone app.
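
The article doesn’t describe how Coda works internally, so the sketch below only illustrates the general idea of pooling several render farms into one job list and flagging failed jobs to their owners. The class, the function names and the print-based “alert” are hypothetical stand-ins, not Disney’s actual system.

```python
from dataclasses import dataclass

@dataclass
class RenderJob:
    job_id: str
    farm: str          # which of the render farms ran the job
    status: str        # "queued", "running", "done" or "failed"
    owner: str         # the artist or TD responsible for the shot

def failed_jobs(farms):
    """Scan every farm as if it were one big queue and collect the failures."""
    return [job for farm in farms.values() for job in farm if job.status == "failed"]

def notify(job):
    # Hypothetical stand-in for a push notification to the owner's phone.
    print(f"ALERT -> {job.owner}: job {job.job_id} failed on {job.farm}")

if __name__ == "__main__":
    farms = {
        "farm_a": [RenderJob("shot_042_f0107", "farm_a", "done", "lighting_td")],
        "farm_b": [RenderJob("shot_101_f0220", "farm_b", "failed", "fx_td")],
    }
    for job in failed_jobs(farms):
        notify(job)
```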

Hendrickson says that Hyperion “could render Tangled from scratch every 10 days.”


Question

How important is technological convergence for institutions and audiences within a media area which you have studied?

SLIDE 1

CAPS: invented by Pixar, sold to Disney. Improved software/hardware. Disney used it in ‘Beauty and the Beast’ (1991). (Jeff Bridges)

SLIDE 2

Pixar enhances Disney’s animations; the production line became more efficient. One strengthens the other = large audiences, people trust the brand (before = hand-drawn animations).

SLIDE 3

For ‘A Bug’s Life’ the technology wasn’t good enough, so John Lasseter created a ‘crowd team’ to make it work.

SLIDE 4

Monsters University = 4 years (6 years for Brave). Pixar doubled the size of its “render farm” and created a realistic system of lighting known as “global illumination.”

SLIDE 5

Pixar improved its technology for ‘Monsters University’ compared to ‘Monsters Inc.’. Computers = 2X more powerful.

RenderMan

RenderMan, which is developed by Pixar, has faced increased competition from rival animation rendering programmes such as VRay and Arnold.

Although Pixar, which is owned by Disney, produces its own films, it licenses RenderMan to rival studios.

The 25-year-old software, RenderMan, is now at Version 21 and is available for free for non-commercial use.

When RenderMan was first discussed there was no notion of a ‘product’. The team had a charter to computerize Lucasfilm. “I had taken it on myself to produce a rendering system that would produce film-quality images,” explains Carpenter. “By that I mean something the computer would generate that we could intercut with live action so that you couldn’t tell who did what, and this is standard now; just go look at any standard movie.” That meant the renderer had to produce images as richly detailed as a live-action camera could capture, which was way beyond the state of the art at the time; the most you could hope for then was a few polygons, and they looked like plastic at best. “We had a lot of conversations, Rob Cook, Ed Catmull and I, and worked on the problem for most of a year or more. I had some ideas I brought with me from Boeing, Rob Cook had some ideas from Cornell and Ed had some ideas from Utah (University), and we worked these ideas around and came up with a synthesis that became the Reyes rendering algorithm. It was the foundation of what became the RenderMan renderer; before that it was the Reyes renderer,” recalls Carpenter.

44 of the last 47 movies that won the Oscar for visual effects used the RenderMan software.

Inside Out technology problems

Pixar is a technology company. With Inside Out, its newest feature due later this year, Pixar had its own unique set of technical challenges to overcome. A bigger vision led to scaling problems, and the duality of the film’s narrative meant creating not one but two worlds and visual languages, not to mention a main character made entirely of light.

One of the major technical hurdles to overcome was how to shoot the movie in a way that communicates the tumultuous, expressive world of emotions — yet can also transmit the subtleties and nuance of our ‘outside’ human world.

Typically, when you want to direct a camera in a virtual world for an animated film, you do it point-by-point. If the desired effect is a mechanical, tracking, dolly or even handheld shot, each of those is programmed in by a camera operator to mimic the real-world equivalent.
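
As a rough illustration of what programming a camera move “point-by-point” can look like, here is a minimal sketch of a dolly move built by interpolating between hand-placed keyframes. It is a generic technique, not Pixar’s actual camera tooling, and every number in it is invented.

```python
# Hand-placed keyframes: (frame number, camera x/y/z position).
KEYFRAMES = [
    (1,  (0.0, 1.6, 8.0)),   # start of the dolly move
    (48, (0.0, 1.6, 4.0)),   # push in toward the subject
    (96, (2.0, 1.6, 4.0)),   # drift sideways to reframe
]

def lerp(a, b, t):
    """Linear interpolation between two positions."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def camera_position(frame):
    """Interpolate the camera between the surrounding keyframes."""
    for (f0, p0), (f1, p1) in zip(KEYFRAMES, KEYFRAMES[1:]):
        if f0 <= frame <= f1:
            return lerp(p0, p1, (frame - f0) / (f1 - f0))
    return KEYFRAMES[-1][1]          # hold the last pose after the final key

if __name__ == "__main__":
    for frame in (1, 24, 48, 72, 96):
        print(frame, camera_position(frame))
```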

In addition, as the story progresses, the camera techniques move from a swooping, ’30s-style mechanical camera into a much more modern hand-operated camera style.

Lin had previously used the technique while making The Blue Umbrella, a short that ran before Pixar’s Monsters University. The version they used then was significantly improved for use on Inside Out.

“On Blue Umbrella it was really in its infancy. The gearbox that we have, that is actually built by one of our lead layout artists Adam [Habib]. He also built a focus ring, too, that can actually do live focusing, so that we can get that perfect focus more naturally. Everything we do has to be deliberate, and nothing is accidental.”

One of the main characters in the movie is Joy, voiced by Amy Poehler and featured prominently in the trailers and other marketing. Joy glows. Not only does she glow, but she’s actually a full-on light source.

Having a light bulb walking around in your scenes presented some difficulties to Pixar’s lighting staff.

Character lighting lead Angelique Reisch says that Joy’s glowing nature was one of the tougher technical challenges to overcome.

Finding Dory technology

Each new Pixar film employs newer and better technology, but Finding Dory introduces an unprecedented amount of new software to their production pipeline. The company’s chief technology officer Steve May, who worked on Finding Nemo as the supervisor of the shark sequence, says that the process of how they make films has changed a lot since then, but “mainly computers are way faster and algorithms are way better.” Finding Dory introduces three completely new technologies and major improvements in one of their older pieces of software.

While their rendering software RenderMan has been around since the late 1980s (used in all of their films as well as many films that have won Oscars in the past 15 years), an entirely new version of the software was employed on Dory. There are basically two kinds of light: direct light and indirect light (which bounces off something else). In Finding Nemo, Pixar could only afford direct light, and the lighting artists would create invisible light sources around the scene to mimic what indirect light might look like. That process is labor-intensive, as it’s all created and designed by hand.

Finding Dory is the first Pixar film to use the RenderMan RIS architecture software, which is able to create both the direct and indirect light. This allows the lighting team to spend less time trying to mimic reality and more time making creative decisions. Finding Dory has a lot of water, and the software is also able to deal with reflected and refracted light in water, which impacts the color of the water. With something more complicated like a splash, the software is able to reflect and refract light on every single drop of water. Each shot in Finding Dory has billions of individual light rays per frame, with probably ten reflections and refractions in each ray.

On Finding Nemo, creating the look of water meant a lot of work by hand. Now all of that can be created automatically in the software. The software is also able to create foam and aeration inside the water, which adds another 100 reflections and refractions — all of which was not possible before this film. Fish tanks are very complicated, and in Finding Nemo Pixar had three or four people work for six months to add in the reflections on the tank and water surface. This was done by hand for every shot of the tank that you see in that film. In Finding Dory, it’s all accomplished automatically.

One of the biggest challenges with lighting a computer-created scene is that it takes a lot of time to get feedback to see what a render will actually look like. Sometimes this takes minutes, but often it takes hours for a single-frame render. For Finding Dory, Pixar employed Katana, a DCC app which creates more reactive renders and which works with RenderMan through a plugin. As they move a light source in a scene, the artists see a noisy-looking live render of the results which gradually improves in resolution.
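
That “noisy-looking live render that gradually improves” is the behaviour of a progressive Monte Carlo renderer: it keeps averaging new samples into the same image, so the estimate sharpens the longer you wait. Below is a toy version of that feedback loop for a single pixel; it is not Katana or RenderMan, and the numbers are invented.

```python
import random

TRUE_PIXEL_VALUE = 0.42   # the brightness the render would converge to

def noisy_sample():
    """One Monte Carlo sample: correct on average, noisy individually."""
    return TRUE_PIXEL_VALUE + random.uniform(-0.3, 0.3)

def progressive_render(passes):
    """Average more and more samples; the estimate sharpens as passes accumulate."""
    running_sum = 0.0
    for n in range(1, passes + 1):
        running_sum += noisy_sample()
        estimate = running_sum / n
        if n in (1, 10, 100, 1000):
            print(f"after {n:4d} samples: {estimate:.3f}")

if __name__ == "__main__":
    progressive_render(1000)
```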

Pixar’s proprietary animation system Presto is not new: they’ve been using it since Brave. But they’ve added new abilities to the software for this film. Sulley in Monsters Inc. originally had tentacles, but Pixar was forced to get rid of them because they were too hard to animate. They have added a new ability to allow artists to simply draw tentacles rather than program tons of different points along each tentacle. The result is almost like bringing computer animation back to the hand-drawn days.

On Finding Nemo, they built blueprints to help them animate how each fish swims. On Finding Dory, they have created an animation recipe to help make the animation of the swimming more automatic, with controls to allow animators to change and alter how Dory swims. While it doesn’t create the swim animation automatically, it helps take a lot of the tedious work out of it so animators can concentrate on the creative side of it.
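
The article doesn’t explain how the swim “recipe” is built, but the general idea of procedural motion with artist-facing controls can be sketched as a sine-based tail cycle whose amplitude, frequency and phase an animator can override. Everything below is illustrative, not Pixar’s Presto.

```python
import math

def swim_cycle(frame, amplitude=0.5, frequency=1.5, phase=0.0, fps=24):
    """Tail angle (in radians) for a given frame of a procedural swim cycle.

    amplitude -- how far the tail sweeps
    frequency -- tail beats per second
    phase     -- offset so different fish don't swim in lockstep
    """
    t = frame / fps
    return amplitude * math.sin(2.0 * math.pi * frequency * t + phase)

if __name__ == "__main__":
    # The same recipe, retargeted by changing a couple of controls:
    calm    = [round(swim_cycle(f), 2) for f in range(6)]
    darting = [round(swim_cycle(f, amplitude=0.8, frequency=3.0), 2) for f in range(6)]
    print("calm   :", calm)
    print("darting:", darting)
```

The animator still decides how the character swims; the recipe only removes the tedium of posing every frame of the cycle by hand.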

Pixar

Computer hardware is any physical device used in or with your machine, whereas software is a collection of code installed onto your computer’s hard drive.

From its earliest days as a hardware developer, the company’s technology has been inextricably intertwined with the development of the computer graphics industry. Its early short films, like Luxo Jr, and its debut feature film, Toy Story, introduced the world to RenderMan, now the industry-standard software for rendering (the process of generating finished two-dimensional images from the geometry, surfacing and lighting data used to create a three-dimensional animation).

Films

Pixar’s subsequent films act like a timeline of technological developments in computer graphics. Building on the work of other researchers, 2001’s Monsters, Inc. introduced the on-screen representation of fur. Two years later, Finding Nemo pioneered new techniques in digital lighting, which were used to create realistic-looking water. The Incredibles and Ratatouille brought with them believable human characters, and advances in the simulation of crowds and fluids.