Mark Spatny, Supervising Producer for Stargate Films, acknowledges that it wasn't the most complex visual effects sequence of the show (that would be Season 3's cleaving of Tokyo—more on that below), but Heroes' first season contains scenes with a confounding mix of practical and virtual effects, including an episode in which Claire (played by Hayden Panettiere) battles a radioactive man (no, not The Simpsons' comic-book character) who burns her to a crisp. (If you're interested, check out Heroes Monday nights at 9:00 p.m. on NBC.)

"We had originally planned to do it practically, with quick cuts, and several Claire look-alikes in special effects make-up," Mark recounts. "First, we had to get Hayden's make-up right. We work with a great special effects make-up company and they know the difference between, for instance, radiation and electrical burns. They designed a look for Hayden with a shading technique for burned skin that would flake off, and then applied that look to her. She sat in make-up for eight hours!" Once the "Crispy Claire" look was established, the make-up team applied various stages of the look to the Claire look-alikes.

The VFX team did, indeed, fry the cheerleader through a mix of practical make-up and a digital double but, thankfully, she was able to save herself through her regenerative powers. Image courtesy of NBC Universal and Stargate Digital.

"But, when we got on location and shot the scene," says Mark, "no matter how much the Claire look-alikes looked like Claire, they didn't move like Claire—not exactly, as every human has a distinctive gait, almost like a fingerprint." So, after all that work, the practical effect was scrapped in favor of CG. "We came up with a solution that used most of Hayden's performance, and created a digital double for those few, fleeting on-screen moments when she's supposed to be engulfed in flames or charred to a crisp."

A 360-degree laser scan of Hayden was taken "neutral," without make-up, and then several still photos were taken of her as Claire, in hair and make-up, for reference. The team then took the live-action plate of Hayden's performance and, frame by frame, inserted the computer-animated 3D model in exact alignment, "almost like tracing paper," says Mark. Wherever the digital double appeared in the scene, the team applied texture, such as hair and skin tone, to the 3D model. "We use match-move software, like boujou, to make sure the actual and virtual Claires stayed in proper alignment and perspective as the camera and character move." Once the CG character matched exactly, the team could dissolve (an optical effect in which one image seems to dissolve into another) seamlessly between the two.
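The dissolve Mark describes is, at bottom, per-pixel linear interpolation between the live-action plate and the rendered double. A minimal sketch in Python with NumPy (the function name and array shapes are illustrative, not Stargate's actual pipeline):

```python
import numpy as np

def dissolve(practical: np.ndarray, digital: np.ndarray, t: float) -> np.ndarray:
    """Linear cross-dissolve: t=0 shows only the live-action plate,
    t=1 shows only the CG double, and values in between blend the two."""
    assert practical.shape == digital.shape, "plates must be aligned first"
    t = float(np.clip(t, 0.0, 1.0))
    return (1.0 - t) * practical + t * digital
```

The hard part, of course, is everything before this line of math: the match-move that keeps the two plates in exact alignment so the blend reads as one continuous performance.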

The Iceman Cometh

The team went from super-hot to stone cold when Tracy Strauss (Niki Sanders' identical triplet, also played by Ali Larter) gives a journalist the cold shoulder—literally, turning him into a human Popsicle® that shatters into a thousand pieces. That scene presented more than its share of artistic and technical challenges.

For starters, what does a human block of ice look like? And how does it shatter in a manner that looks realistic, considering no one has ever seen a frozen human shatter into a thousand pieces? (Remember, Robert Patrick's liquid-nitrogenic T-1000 in Terminator 2 was a machine.) "It all starts with a script break-down," says Mark, who, together with Eric Grenaudier, Heroes VFX Supervisor for Stargate Films, reads the script, noting every scene and sequence requiring visual effects shots. Typically, there are anywhere from 80 to 100 visual effects shots for each episode of the series, and Mark and Eric work together, along with their visual effects team, to get them done.

"We have a concept meeting with the director, producer, and other department heads, including make-up, wardrobe, and camera, to figure out the best way to approach the shot," says Mark, who is responsible for budget and scheduling. There are some effects that can be done practically, in camera, on a live set using make-up and prosthetics, for example. "We could have done the ice-man scene practically, with a sculpture," agrees Eric, who is ultimately responsible for the creative look of the effect. "In fact, another show did the same gag [visual effects parlance for a VFX shot or sequence] a few weeks later, and that's the approach they took, but we decided to do it CG."

What does the inside of a human, frozen solid, look like as it shatters into a thousand pieces? And does it really "shatter," or is it more "crumble" or "disintegrate"? Image courtesy of NBC Universal and Stargate Digital.

Explains Eric: "First, we shot the live-action plate on location in the parking lot where the scene takes place, right up to the point where Tracy touches the journalist with her new-found power and he begins to freeze." (The journalist was played by William Katt, best known as the super-cute '80s superhero in the sitcom The Greatest American Hero.) Literally, Katt froze at that moment, holding his body very still. "Then we took him out of the plate [the live-action footage], and Ali reacted to this empty space as if he were freezing, toppling, and shattering right in front of her." And that's when the VFX fun began.

"We did a 360-degree laser scan of Katt in his character's wardrobe to create an exact digital double that we imported into the 3D animation environment Maya," says Eric. This is the juncture at which artistry and technology really intersect. "It was open to everyone's interpretation, what a human block of ice looks like, and how it shatters," Mark notes. "Should he look as if he's been dipped in liquid nitrogen, or frozen at the molecular level? And, I know it sounds gross, but, when he shatters, is it into thousands of 'ice chips' of human blood, guts, and entrails, or does it look less life-like? And does he really shatter, or is it more 'crumble' or 'disintegrate'?"

According to Eric, the first order of business in the scene was to make the transition from live-action actor to 3D model. "We did this in the first 'frozen shot' of the scene," he explains, "by matte painting a layer of frost, condensation, and ice onto the live-action plate, giving a lot of thought to Katt's skin complexion and coloring, how it would change as his body froze." By the second shot of the scene, "we were into our digital double, who we lit virtually with exposure information captured from the location."

Eric and Mark's VFX team uses pre-visualization software ("pre-viz" in industry lingo) so the director and producers can see how the effect will unfold. Clip courtesy of NBC Universal and Stargate Digital. (Click to play. Movie will begin playing once loaded.)


To ensure the lighting of Katt's character—live-action and digital double—is consistent throughout the scene, even as he freezes over, the VFX team used a technique known as high dynamic range imagery (HDRI), which helps accurately render illumination, reflection, and shadow on 3D objects as they move through their environment. "Basically, you use a camera—I use a Canon 5D SLR," says Eric, "to capture a still scene in as many different stops, or exposures, as possible [typically, five under and five over the key exposure] in order to collect a dynamic range of data that enables virtual light to change as the virtual camera and objects move." A similar technique is used in panoramic photography.
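The bracketed exposures Eric describes can be merged into a single high-dynamic-range estimate of scene radiance. Below is a naive sketch, assuming linear pixel values in [0, 1]; the hat-shaped weighting is one common choice, and real HDR assembly also recovers the camera's response curve first:

```python
import numpy as np

def merge_hdr(exposures, times):
    """Naive HDR merge: each bracketed frame is divided by its exposure
    time to estimate radiance, weighted by a 'hat' function that trusts
    mid-tones more than clipped shadows or blown-out highlights."""
    acc = np.zeros_like(exposures[0], dtype=float)
    wsum = np.zeros_like(acc)
    for img, t in zip(exposures, times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # weight peaks at mid-gray (0.5)
        acc += w * (img / t)               # per-frame radiance estimate
        wsum += w
    return acc / np.maximum(wsum, 1e-8)    # weighted average, guard div-by-zero
```

A pixel that reads 0.5 at a one-second exposure and 0.25 at a half-second exposure both imply the same radiance, and the merge recovers it.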

The team then ran various computer simulations of how ice would form on the body, starting from when Tracy grabs Katt's character's arm and frost travels across his shoulder, engulfing his face and, eventually, encasing his body. At this point in the process, Mark was thinking a glass-like shatter of the character into big shards would be best. But when Eric saw the simulations, it became clear that he was envisioning something more "like glaciers calving," says Mark, "bigger chunks, with smaller particles. We really had to re-think our approach." When they got the consistency of the shatter correct, there was also the matter of the, um, matter. "Nobody had really thought about it till then, but he's human, so the chunks would be frozen meat and gristle and bone," Mark notes.

"To enhance the CG shots of this human ice block shattering into thousands of pieces, we shot pass after pass of practical liquid nitrogen in our studio, along with chunks of meat, and blended everything together in the computer," says Eric. According to Mark, the first completed sequence was "a little too gruesome, we had to tone it down," but they are both very proud of the result they were able to achieve, especially on the comparatively limited budget afforded television series (in comparison to motion pictures). Thankfully, they had a little more time to plot the complete destruction of Tokyo.

Tokyo Rising

"Toward the end of season one, we depicted the nuclear annihilation of New York City, so we thought we had some precedent for this scene of Tokyo ripped in two," says Mark. And though the Heroes VFX team is often able to concoct "cocktails," or recipes, for executing certain digital effects over and over, this was not one of those times. "For starters, this was a ground-level view of the destruction, the city literally being ripped in two, like a giant earthquake coming down the middle of the street, toward camera, destroying everything in its path—cars flying, people fleeing—a very different perspective from the long shots of the New York skyline incinerating." Everyone was in agreement that the scene had to be big, dramatic, and as realistic as possible—on a tight budget and schedule.

These screens show the nuclear annihilation of virtual New York in the background...
...and the composite with live-action actor in the foreground. Images courtesy of NBC Universal and Stargate Digital.

"We thought about starting with plates of a practical location, like Little Tokyo in Los Angeles," says Eric, "but quickly realized that, in order to render the scene effectively, our Tokyo would have to be CG, so we set about building a virtual landscape in Maya." Since Tokyo doesn't have a singular distinguishing landmark like the Eiffel Tower in Paris or the Golden Gate Bridge in San Francisco, the team settled on the city's famous neon signs (nearly a lead character in Lost in Translation) as signifiers. "We picked an actual street in Tokyo to recreate, and used photographic references from books to build a basic 3D architecture," adds Mark, "then found similar buildings in Los Angeles and photographed them as models for textures" (paint color, cracks, etc.). Stargate's 3D department populated the place with cars and other detritus, while the show's art department created non-copyrighted, original signage.

The live-action plate, with characters "Hiro," "Ando," and about fifty extras, was shot against a green screen. Green screen refers to a process in which foreground subjects (in this case, Hiro, Ando, and the extras) are photographed in front of an evenly lit, bright, pure green (or blue) background, later completely replaced with a digital image known as the background plate, which is composited (or digitally blended) in post-production using a software program like After Effects. As Eric notes: "The reason green and blue are used is that these are the only two hues not found in human skin tones." Sometimes, if a character's costume is green, the background will change to blue, and vice versa.
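The keying step can be sketched as a crude chroma key: any pixel where green clearly dominates red and blue is swapped for the background plate. Production keyers (like the ones in After Effects) handle soft edges, spill suppression, and motion blur far more carefully; this is only an illustration, with an invented threshold:

```python
import numpy as np

def chroma_key(frame, background, threshold=0.15):
    """Replace green-dominant pixels with the background plate.
    A pixel counts as 'green screen' when its green channel exceeds
    the larger of red and blue by more than `threshold`."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    is_green = (g - np.maximum(r, b)) > threshold  # boolean matte
    out = frame.copy()
    out[is_green] = background[is_green]           # composite over background
    return out
```

Skin tones fail the green-dominance test, which is exactly Eric's point about why green and blue are the screen colors of choice.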

The VFX team considered shooting a practical plate in Los Angeles' Little Tokyo, but decided to create a CG city, first by pre-visualizing, then shooting Hiro and Ando against a green screen, then compositing the actual actors and virtual destruction. Image courtesy of NBC Universal and Stargate Digital.

Mark credits the costume department with accurately outfitting all the extras, which the VFX team then duplicated into fleeing CG extras who manage to interact realistically with the environment as it's being destroyed. This scene was made possible by Massive, a software package that creates thousands—even millions—of "agents" (digital extras) who act as individuals through the use of fuzzy logic (programming that uses approximate rather than precise reasoning). "As the street erupts and all the buildings begin to crumble," says Eric, "we also had to think about how the electrical grid would respond in real life, and that, in turn, affected our virtual lighting of the scene as blackouts roll through the city."
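Fuzzy logic lets a crowd agent blend behaviors by degree rather than flip between them at a hard threshold. Here is a toy illustration of the idea, not Massive's actual rule engine; the speeds and distances are invented:

```python
def fuzzy_flee_speed(distance, walk=1.5, sprint=7.0):
    """Blend an agent's speed from fuzzy membership in 'near' and 'far':
    full sprint within 5 m of the danger, calm walk beyond 30 m, and a
    smooth mix in between (instead of a hard if/else on distance)."""
    near = max(0.0, min(1.0, (30.0 - distance) / 25.0))  # 1 at <=5 m, 0 at >=30 m
    far = 1.0 - near
    return near * sprint + far * walk
```

With thousands of agents each evaluating rules like this against slightly different inputs, the crowd reads as individuals rather than a copy-pasted mob.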

As the split rips up the pavement, cars fall into the chasm. One car gets hurled right at Hiro. And, of course, buildings collapse. "We studied footage of buildings collapsing during earthquakes and we all saw the Twin Towers tragically fall on 9/11," says Mark. Interestingly, one element that was difficult to get right was the dust kicked up in the wake of a building's collapse. "We know what real dust looks like, but we needed lots of trial and error, many computer simulations of various particle systems, plus varying degrees of wind movement and gravity, in order to render it realistically."

This sequence started with the live-action plate of actor William Katt, then transitioned to the plate with matte painting of ice and frost, and finally to an entirely CG digital double. Clip courtesy of NBC Universal and Stargate Digital.(Click to play. Movie will begin playing once loaded.)


(Particle systems are a method used in 3D computer graphics to simulate certain environmental effects such as rain, explosions, smoke, or dust. A particle system is made up of many small objects, or particles, with properties such as position, velocity, and color; the system applies its own rules to every particle, often interpolating those values over the lifetime of each one.)
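The parenthetical above can be made concrete with a tiny particle system: every particle carries position, velocity, and age; one rule set advances them all; and a property (here, opacity) is interpolated over each particle's lifetime. This is a minimal sketch, nowhere near a production dust simulation:

```python
class Particle:
    def __init__(self, pos, vel, lifetime):
        self.pos, self.vel = list(pos), list(vel)
        self.age, self.lifetime = 0.0, lifetime

    def alive(self):
        return self.age < self.lifetime

    def opacity(self):
        # A property interpolated over the particle's lifetime:
        # dust fades out linearly as it ages.
        return max(0.0, 1.0 - self.age / self.lifetime)

def step(particles, dt, wind=0.5, gravity=-9.8):
    """Apply the same rules to every particle: wind accelerates along x,
    gravity along y, and position integrates velocity (an Euler step)."""
    for p in particles:
        if not p.alive():
            continue
        p.vel[0] += wind * dt
        p.vel[1] += gravity * dt
        p.pos[0] += p.vel[0] * dt
        p.pos[1] += p.vel[1] * dt
        p.age += dt
```

Varying the wind and gravity parameters between simulation runs is exactly the kind of trial and error Mark describes for getting the dust to read as real.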

Despite all the simulations, though, the element wasn't quite right. According to Mark: "Usually, we're augmenting the practical with the virtual, but in this case, we had to augment our digital dust with stock footage of the real thing." The resulting dust composite was, well, magical. All the elements came together for a stunning sequence on which the team spent 100 man-days, and which took weeks to render on a dedicated render farm, even with optimization (by not rendering the non-camera-facing polygons).

Asset Management: The SQL

Both Mark and Eric credit Stargate Films' founder, Sam Nicholson, for having the vision to develop not only the studio and sound stage but also a proprietary digital pipeline and content management system that lets the VFX team deliver motion-picture-quality, Emmy®-nominated effects on episodic television budgets and schedules. "It's such a cool system," Mark raves. "When an artist finishes a shot, a copy of it pops up on my desktop. I can see, right away, whether the green screen is right, or the color balance or tracking are off." The same shot pops up on Eric's screen, and he can read Mark's notes in real-time and offer his own feedback.

Stargate's Adam Ealovega says the CMS evolved organically, as assets became digitized, along with a need to organize them online. "We started with a Filemaker database, which required lots of human input," Adam remembers. Of course, wherever there are humans, error is sure to follow. "When we were at a quiet stage in the project pipeline, files got updated and put in the right folders," Adam says. But, as critical deadlines approached and workflow sped up, updating the database became less of a priority. "When artists went to retrieve assets, file information was missing or out-of-date," Adam recalls, "eroding trust in the system." In creating Stargate's ever-evolving proprietary asset management system (using Microsoft® SQL), Adam and the Stargate technology team's objective was to make everything as automated as possible.

For instance, file naming conventions have been automated. The first iteration is "name1," the second iteration "name2," and so forth. "It used to be, an artist might think he was working on the final iteration, and gave it a '.final' extension," says Adam, "which was okay until another artist had a new version of the same file and named it ','" he chuckles. "I just thought, what's next? '.noireallymeanitthistime'?" In addition to consistent naming conventions, the asset management system hooks into primary applications like Maya and After Effects so that when an artist accesses a file, all the updating of the database is done behind the scenes. Most impressively, once a visual effects shot is returned from the render farm, a copy pops up on the desktop of every person who has touched it for review. According to Adam: "We estimate our system has saved countless man hours that, for us, are better spent focusing on the work."
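The numbered-iteration scheme Adam describes can be sketched in a few lines: scan the existing names, find the highest iteration, and hand back the next one. The stem "shot" and the flat list of names are illustrative; Stargate's system presumably does this inside its SQL-backed pipeline:

```python
import re

def next_version(existing, stem="shot"):
    """Automated naming: given existing names like 'shot1', 'shot2', ...,
    return the next iteration, so no artist ever reaches for '.final'."""
    pattern = re.compile(re.escape(stem) + r"(\d+)$")
    versions = [int(m.group(1)) for name in existing
                if (m := pattern.match(name))]   # ignore non-conforming names
    return f"{stem}{max(versions, default=0) + 1}"
```

Because the name is computed rather than typed, two artists can never disagree about which iteration is current.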

The company has also developed a new green screen technology called VBLive (for Virtual Backlot Live), a tool that lets filmmakers replace green screens with a digital projection background plate of any environment—stock footage or 3D—that can be instantly composited and recorded (through a digital camera) in real time. "We're having great success with this system in daytime drama," Mark reports, "helping shows create high-quality effects like car crashes and tornadoes in record time. We just completed 380 effects shots for All My Children." And that is downright, well, heroic.