Digital effects help TV ads score with the Super Bowl audience.
Super Bowl 2010 was a year to remember. With the Indianapolis Colts heavily favored, the New Orleans Saints still came marching in, thrilling die-hard fans in the Big Easy, as well as those across the nation rooting for the underdog to claim its first-ever world championship.
But Super Bowl commercials this year weren’t nearly as thrilling. Although the cost of a Super Bowl spot has skyrocketed, the number of eye-catching VFX commercials on display this year was paltry. Has the money gone to buying the time instead of creating the spots? This year’s Super Bowl commercials sold for an average of $3 million per 30-second spot, generating total ad revenue of $213 million. According to Mediaweek journalist Anthony Crupi, that is a 329 percent increase from CBS’s price of $700,400 for the 1990 game. Turning back the clock to 1967, a commercial running during the Super Bowl went for just $42,500.
Fast-forward to 2010, when the most popular commercial was the extraordinarily simple Google ad, which charted a budding romance, marriage, and family simply through search terms typed into a Google box on a computer screen.
It’s hard not to see this pullback from extravagantly visual spots as a response to the economic downturn. With thousands out of work, perhaps the brands reasoned, it would be unseemly to spend millions on a commercial. Yet many did—with dismal results.
Nonetheless, a handful of great-looking CG animation and visual effects spots did rise to the top, including one from Anheuser-Busch, a company that always seems to score high in terms of commercial appeal, as well as Super Bowl newcomer Vizio, which used CG technology combined with star power and pop culture to peddle its televisions. Animated animals are always a sure bet, as Honda and Monster.com realized with a squirrel and a beaver, respectively. And, with Valentine’s Day right around the corner, Teleflora got into the act and came up smelling rosy with animated flowers.
Bridge • Budweiser
Director: Paul Middleditch
Agency: DDB Chicago
Production company: aWhitelabelProduct
CG company: The Mill NY
In “Bridge,” the people of Any Town, USA, are distraught to learn that the Budweiser delivery truck can’t make it because the bridge is out. Determined to get their Budweiser, they form a human bridge to allow the truck to roll into town.
Westley Sarokin, lead Flame artist at The Mill NY and co-shoot supervisor with Yann Mabille, started pre-production on the spot in November, creating previs for the project using Autodesk’s Softimage. “In making a bridge out of people, we wanted to know what the bridge looks like and how it’s structured,” says Sarokin. “We went through many designs to come up with something that fell together but had a coherent shape.”
For this Budweiser Super Bowl spot, The Mill NY constructed a human bridge using live-action components and 200 digital people. The CG bridge was built in Softimage.
The goal was to shoot as many practical elements as possible to lend realism, says Sarokin. “We were trying to get a rough idea for shot-blocking for the storyboards, but because we didn’t know the terrain yet, we kept previs as generalized as possible to get a basic idea of the size of the bridge and how they wanted it to move,” he explains. “We went over all the animation possibilities.”
Santa Paula, California, a picturesque little town near Los Angeles, became Any Town for the shoot, while the bridge sequence was filmed on a farm a few miles away. According to Sarokin, he and the crew arrived at the shoot site a few days early and sat down with the agency and director to map out what they wanted to capture on film. The bridge would consist of the rails (people standing on the side), the road (people lying down horizontally), and the five main pillars supporting the bridge. “Shot by shot, we looked at which elements we could use for multiple shots, and which elements we needed to get practically,” he says.
Post-shoot, Sarokin and Mabille found they didn’t quite have everything they needed. “Every time you go out on set, you never really know what you’re going to get until you sit with an edit and see the footage come together,” says Sarokin. “On site, you do the best you can to match camera angles, lighting, and so on, but it’s not an exact science. There were a few circumstances where we needed an element.”
To fill in the missing pieces, the team set up a greenscreen shoot, using a Canon 5D Mark II camera, on the rooftop at The Mill. “We put people in the perspective we needed to use them as live-action components within the composite,” says Sarokin. “The most important thing was to expose things properly to match the time of day and lighting direction.”
In post, Sarokin was the lead visual effects artist and Mabille was in charge of 3D. “The main point was to have as many realistic-looking people as possible and that the animation of their interrelations made sense,” says Mabille, who used Softimage as his main 3D tool. All told, the group created nearly 200 digital extras.
“There are a couple of live-action plates of the base of the pillar that we used, and then we extended them with CG people,” Mabille describes. “And we built the smaller pillars entirely out of CG people. All the shots looking down at the bridge are re-created in CG as well, because we didn’t shoot any live-action people from this camera angle.”
Softimage’s Ice was used to refine the simulations and handle the massive amount of data. “Ice allowed us to organize and manage the information,” says Mabille. “We had to assimilate the whole bridge animation with the truck that presses down on the people, and had to have them all react accordingly and without stretching.”
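The no-stretch constraint Mabille describes can be sketched in a few lines. The Python below is purely illustrative—the names, stiffness values, and clamping scheme are assumptions, not details of The Mill’s actual ICE graph—but it captures the basic idea: segments of the bridge squash under the truck’s load, and the squash is clamped so nothing ever stretches past its rest length.

```python
# Hypothetical sketch of the constraint described above: bridge "people"
# compress under the truck's weight but are never allowed to stretch.
# All names and numbers are illustrative, not taken from the real setup.

def compress_under_load(rest_heights, load_profile, stiffness=0.05, max_squash=0.3):
    """Return new heights for each person-segment under a per-segment load.

    Segments squash proportionally to the load above them, clamped so a
    segment never drops below (1 - max_squash) of rest height and never
    rises above rest height (i.e., no stretching).
    """
    new_heights = []
    for rest, load in zip(rest_heights, load_profile):
        squash = stiffness * load          # naive linear spring response
        squash = min(squash, max_squash)   # don't over-compress
        squash = max(squash, 0.0)          # never stretch past rest length
        new_heights.append(rest * (1.0 - squash))
    return new_heights

# The truck's weight is concentrated over the middle of the span:
heights = compress_under_load([1.8] * 5, [0.0, 2.0, 10.0, 2.0, 0.0])
```

In a real crowd rig, each segment would also push its load down the pillar beneath it; the clamp is what keeps the heavily loaded middle of the span from visibly deforming the digital extras.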
The artists used The Foundry’s Nuke to do rough comps before sending them to Autodesk’s Flame for final compositing. “It lets us know if our 3D comps are going to work,” says Mabille.
Despite the pre-planning, The Mill NY had a mere three weeks to take the plates through to final delivery. Still, “our approach and the techniques we set up early on served us well,” says Sarokin. “We were able to make it look as amazing as possible and add as much detail as we could.”
Forge • Vizio
Director: Wally Pfister, ASC
Agency: Venables Bell & Partners
Production company: Independent Media, Inc.
CG company: MassMarket
In this hyperkinetic spot, Vizio shows how the worlds of Internet video and television come together, as robotic arms snatch up singer Beyonce and place her inside a cubicle. As the camera pans, we see numerous cubicles, inhabited by the Numa Numa guy, the Twitter bird, musician Tay Zonday, a Flickr sign, a Facebook page, characters from movies, and more.
The Vizio commercial combines star power, pop culture, live action, and CGI—the latter involving robotic arms and more created by MassMarket.
Getting a jump on the project, MassMarket producer Paul Roy and his previs team started work between Christmas and New Year’s in anticipation of the four-day shoot that took place the first week in January at the Santa Monica Airport’s Barker Hangar. Cinematographer Wally Pfister (The Dark Knight, The Prestige, Batman Begins) directed the commercial, his talents a perfect match for the spot’s moody look. “It was great working with Wally,” says Roy. “He is very familiar with 3D, greenscreen, and comps. He has done a lot of set extensions in the movies he’s done and worked with a lot of visual effects.”
The spot featured three sets of real cubicles, reveals Roy, but MassMarket then extended them with two computer-generated ones. “We added an extra row of them and then another set of five towards the back,” he says.
Then there were the robot arms, key elements in the spot. Early on, the decision was made—for reasons of budget and time—to create them with CGI. “It’s actually cheaper to do it in CG than practically,” Roy points out. The production built a one-ninth-scale arm that was filmed for lighting references.
“As soon as we had the design for the robot arm completed and approved, we built a CG version of that practical arm in the art department. We rigged it up so that it was ready before we finished shooting, which let us get started on the animation right away,” says Andrew Romatz, MassMarket visual effects supervisor for 3D.
The digital content creation toolset for the spot included Autodesk’s Maya, Adobe’s Photoshop, and Pixologic’s ZBrush, along with 2d3’s Boujou for camera tracking.
Romatz and John Shirley, who was VFX supervisor for 2D, were both in attendance at the shoot. “Getting lighting to match was important,” says Romatz. “A lot of times, it’s hard to get those reference shots during a busy shoot, but Wally [Pfister] was very accommodating. In fact, he shot a lot of reference shots in 35mm.”
On set, the group collected all the necessary data, captured HDR panoramic shots for lighting reference, and worked closely with the video playback person to check the grayscale and make sure it was the right angle. “It’s something you can’t foresee when you have a CG character pick up a [live] character,” Romatz explains. “You have to imagine timing and the impact of the CG, and anticipate how that will affect the live-action element so the actor can behave in a way that’ll look good.”
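Merging bracketed shots into the kind of HDR lighting reference Romatz mentions follows a well-known recipe. The sketch below is a simplified, hypothetical version of that merge—it assumes a linear camera response, which real pipelines first recover from the footage—showing how samples from several exposures are weighted and combined into one radiance estimate per pixel.

```python
# A minimal sketch of how bracketed exposures are merged into an HDR
# radiance value for image-based lighting reference. Assumes a linear
# camera response; real tools calibrate the response curve first.

def merge_exposures(pixel_values, shutter_times):
    """Estimate scene radiance for one pixel from several exposures.

    pixel_values: normalized samples in [0, 1], one per bracketed shot.
    shutter_times: matching exposure times in seconds.
    Samples near black or white carry little information, so a triangle
    ("hat") weight centered at mid-gray downweights them.
    """
    def hat(v):
        return 1.0 - abs(2.0 * v - 1.0)   # 0 at the clip points, 1 at 0.5

    num = sum(hat(v) * (v / t) for v, t in zip(pixel_values, shutter_times))
    den = sum(hat(v) for v in pixel_values)
    return num / den if den > 0 else 0.0

# Same scene point shot at 1/8s, 1/30s, and 1/125s:
radiance = merge_exposures([0.9, 0.5, 0.12], [1/8, 1/30, 1/125])
```

Done for every pixel of a panoramic bracket, this yields the HDR environment that drives reflections and lighting on CG elements like the robot arms.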
The team conducted numerous animation tests, working closely with Pfister to find the sweet spot. “Some of the arms have more of an attitude, some are smooth and slower in how they grasp and pick things up,” says Romatz.
Getting the lighting to work properly was the biggest hurdle. “The surface [of the robotic arms] was so flat and squared off at the edges, so it was challenging to keep the reflections from getting too broad or big,” Romatz says. “A lot of effort went to getting the reflections to look right,” he adds, and to this end, the team did several bright and dark passes. Meanwhile, rough pre-composites were done in The Foundry’s Nuke.
Lead Flame artist Thibault Debaveye notes that it was tricky to find the proper look. “We had to be careful not to go too dark,” he says. “We wanted it moody, but we also had to consider the final color look. In terms of integrating CG and live action, we went back and forth with making sure the colors and lighting directions matched perfectly, and that we were catching shadows from the live-action elements.”
In the end, however, the most demanding aspect of the spot was the one most often cited by VFX/animation houses working on Super Bowl commercials: the schedule. “Logistic issues arise from that,” says Roy. “We worked long hours every day and, on the last day, put in 22 hours. Getting everyone on the same page in three weeks is a big job.”
Squirrel • Honda
Director: Andy Hall
Agency: RPA
Production company: Elastic
CG company: A52
To promote the new Honda Cross Tour, agency RPA worked closely with Elastic, which does conceptual and directing work, and visual effects firm A52—two divisions of the same company. “We have a long-standing relationship with RPA,” explains Andy Hall, who was both director and head of CG on “Squirrel.” “They approached us for the launch of the new Honda Cross Tour, with the idea of going back to the fundamentals of design and aesthetics.”
A four-spot package—the final one of which was “Squirrel” for the Super Bowl—ensued. The design of the spot, which integrates animation and live action, is hard to miss: a low-polygon animation style heavily influenced by the design aesthetic of the 1960s. “Saul Bass was someone they referenced,” says Hall, referring to the legendary graphic designer/filmmaker. “We took the visual cue from that, and then emphasized it to be more contemporary.”
“Squirrel” reveals the capacious storage of the Honda Cross Tour via a squirrel that hides a pineapple, trophy, bowling ball, barbell, and armchair. “It’s about finding the ultimate place to put stuff … and that’s the car,” says Hall.
Some of the footage was already in the can when it came time to producing the Super Sunday spot. “We did two shoot-arounds of the car,” says Hall, “one early in the process, and then they decided to use another color of car for the ending, so we shot that in December.” All the live-action footage (which focused on the vehicle) was filmed in 35mm at South Bay Studios near Los Angeles.
The CG in “Squirrel” reflects Saul Bass’s animation style from the 1960s, with its low-polygon look, only in this case, it was crafted using state-of-the-art 3D tools, including Maya.
Hall’s background is in animation, so the process, for him, felt natural. “I storyboarded the whole thing in Adobe’s Photoshop and cut a rough animatic in Adobe’s After Effects based on the live action,” he says. “That became the footprint for timing and a cue for the camera angles I wanted to capture.”
Once the animatic was in place, Hall points out, he started blocking out the six shots and establishing the interactions within them. “We did several frames up-front for the agency so they could share them with their client, for their comfort level,” he says. “And there’s that comedic element they wanted to introduce as the objects become more and more ridiculous.”
Another striking aspect of the animation is its color palette. “That was established from the first spot of the package,” says Hall. “They wanted a distinct, vibrant look, and that’s followed through to the last spot, with consistency among all four. The last one, “Squirrel,” is focused on hues of oranges and reds, but aesthetically, they all complement one another.”
Although it had a simplistic 2D look, the animation was 3D, created entirely in Autodesk Maya. Compositing, however, was done in The Foundry’s Nuke. “Obviously, we use [Autodesk’s] Flame,” says Hall, “but it made more sense to use Nuke so we could work in floating point. That enabled us to push the minutiae between color ranges more easily. Also, we could tweak and adjust the light in Nuke, rather than going back and relighting in Maya.”
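Hall’s point about floating point is easy to demonstrate. The snippet below is not Nuke code—just a hypothetical illustration of why linear float pixels let artists pull exposure down after the fact, where an 8-bit clamp would already have destroyed the highlight detail.

```python
# A simple illustration (not Nuke code) of why floating-point compositing
# helps: exposure changes on linear float pixels preserve detail that an
# 8-bit-style clamp to 1.0 would have destroyed.

def expose(pixels, stops):
    """Apply an exposure change of `stops` to linear float pixel values."""
    gain = 2.0 ** stops
    return [p * gain for p in pixels]

# A highlight stored as 3.0 in float survives a -2 stop pull-down with
# its detail intact; the same pixel clamped to 1.0 comes back flat.
float_pixels = expose([3.0, 0.5], -2)                     # -> [0.75, 0.125]
clamped = expose([min(p, 1.0) for p in [3.0, 0.5]], -2)   # -> [0.25, 0.125]
```

The same headroom is what lets a compositor “relight” in 2D—brightening or dimming a render without banding—instead of going back to the 3D package.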
The animation itself went very smoothly, notes Hall. “We’re always trying to push the performance of the character, and there’s an arc in the piece,” he says. “The performance is based on the objects the squirrel is interacting with, and it becomes more extreme; so, it starts quiet and builds towards the end. It was a matter of finding that arc.”
Though the client’s aesthetic touchstone was Saul Bass, Hall found his own inspiration during the creation of the spot. “It was kind of referencing Chuck Jones in terms of the simplicity and the performance,” he says. “You’re not dealing with any dialog. It’s a low-res character captured by strong poses.”
Mr. Warmth • Teleflora
Director: Tim Hamilton
Agency: Fire Station
Production company: Go Films
CG company: Asylum
Just in time for the Super Bowl—and Valentine’s Day—Teleflora brought back its talking flowers in a story of comeuppance and comedy. A vain office worker, who snubs a timid colleague, is supremely self-satisfied to receive a box of flowers. But when she opens the box, she finds a mouthy, wilted tulip—voiced by none other than comedian Don Rickles—that berates the sender as well as the receiver. As the horrified woman slams the box shut, the mousy co-worker receives a beautiful bouquet of Teleflora flowers in a vase, presented personally by a Teleflora deliveryman.
Asylum modeled and then inserted four computer-generated tulips into a bouquet of real (albeit wilted) flowers. The artists then tweaked the color and location of the real and fake flowers.
Asylum created its first talking flower for Teleflora a year and a half ago to start off the campaign. For Super Bowl Sunday, there was pressure to come up with something that would up the ante. “And they did that with voice talent Don Rickles,” says Asylum executive producer Mike Pardee. “We also needed to up the ante with the look, by updating the textures, colors, and feel of the petals, not just to match the bouquet better, but also to make it feel more real.”
The spot was shot first with a stand-in voice talent, with the VFX team in attendance, to ensure that enough space was left inside the flower box to insert the CG flora during post. “We worked with the director to make sure the framing was right, so when we have the digital flowers in the bouquet, they’d have enough room to breathe and act,” says lead animator/CG supervisor Mike Warner.
The digital tulip that was the hero flower rises upward farther than the other flowers. “We pushed the drama of filling the frame further this time, so the flower could be more [theatrical],” says Warner. “In working with the actors, we find an eye line that will work for the director and also makes sense for us. And that’s not always an easy thing.”
Not all the flowers are digital, however. A florist and the art department put together a practical bouquet using real dead flowers. “The bulk of the bouquet is practical,” Warner says.
Two sticks in the bouquet served as tracking markers, and the digital team used Andersson Technologies’ SynthEyes to track the practical flowers, so the compositors could drop the digital flowers right on top of the bundle.
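Two markers are, in fact, enough to pin down a 2D similarity transform per frame—translation, rotation, and uniform scale. The sketch below is an illustrative stand-in, not the SynthEyes solver (which performs a full 3D match-move), but it shows the minimum math needed to make a CG element ride along with two tracked points.

```python
# Illustrative sketch (not the actual SynthEyes solve): two tracked 2D
# markers determine translation, rotation, and uniform scale per frame,
# which is what lets CG elements ride on top of the practical bouquet.

def solve_similarity(p1, p2, q1, q2):
    """Map marker positions (p1, p2) in a reference frame onto their
    tracked positions (q1, q2) in the current frame.

    Points are (x, y) tuples. Representing 2D points as complex numbers
    makes the similarity transform a one-liner: z -> a*z + b.
    """
    P1, P2 = complex(*p1), complex(*p2)
    Q1, Q2 = complex(*q1), complex(*q2)
    a = (Q2 - Q1) / (P2 - P1)     # encodes rotation + uniform scale
    b = Q1 - a * P1               # translation

    def apply(pt):
        z = a * complex(*pt) + b
        return (z.real, z.imag)
    return apply

# Markers drifted right by 10 pixels between frames; a CG flower placed
# between them should drift the same way:
xform = solve_similarity((0, 0), (100, 0), (10, 0), (110, 0))
```

Applied per frame, a transform like this locks the digital tulips to the jostling of the real bouquet without any hand animation.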
The four digital tulips were modeled and rigged in Autodesk’s Maya, with textures painted in Adobe’s Photoshop and Right Hemisphere’s Deep Paint. The color palette matched some of the existing flowers in the practical bouquet, although the crew had to fight the temptation to make the CG flowers more beautiful.
“In this case, we want them to look dead and ugly,” says 3D lead Jeff Werner. “There was an ugly red flower in the practical bouquet that we changed more to a brown, so our main hero tulip, which is red, could stand out.”
The animation was a process of iteration. “The danger of going too cartoony is that you want to avoid a lot of Warner Bros.-style squash and stretch,” says Werner. “Our part was to match that dialog just right so it felt like the flower was saying the dialog, and give it that little emotion to turn the flower into an actor. Between head motion and exaggerating the dialog with mouth motions, you get a lot more personality out of it.”
Asylum used The Foundry’s Nuke for pre-compositing, with the final composite work done in Autodesk’s Flame. Rendering, meanwhile, was done within Pixar’s RenderMan. Lighting artist Eric Pender also did pre-comps to set up the lighting, generating passes and mattes so the client could adjust the colors and tones in the final composite. “That’s key, to let the client have all that flexibility in the end,” says Pardee. “Also, having a voice like Don Rickles was fun.”
Fiddling Beaver • Monster.com
Director: Tom Kuntz
Agency: BBDO
Production company: MJZ
CG company: Framestore
In November, BBDO creative director/writer Steve McElligott and creative director/art director Jerome Marucci approached producer Anthony Curti with the concept of a fiddling beaver for a Monster.com commercial. They made two decisions: The beaver would be animatronic, and they would return to Framestore—where they had gone for two previous Super Bowl commercials—for all the CG creation and compositing.
“We wanted it to be realistic, to show a busy-beaver lifestyle and how this one beaver (an animatronic from AnimatedFX in Los Angeles) was an outcast,” says Curti. The spot called for an entire beaver world that had to be constructed quickly, so the group lost no time in contacting the VFX studio. Despite the quick contact, Framestore would have less than two and a half weeks to complete the work.
At the time the project started, there wasn’t a storyboard yet, says Framestore animation supervisor Kevin Rooney. “All we knew was there was a hero beaver playing the fiddle and there would be background beavers,” he says. With time ticking away, the Framestore team immediately began creating a generic CG beaver model to get the process moving. When the modelers received some photos of the animatronic from the shoot, they began to shape their creatures—which would appear in the background shots—to be more in tune with the look of the mechanical star. “What we didn’t know was how far the beavers would be away from camera, and if they’d be swimming or on dry land,” says CG supervisor Jenny Bichsel.
Next up was creating different fur grooms and various animation cycles for the CG animals. “We rigged our beaver models to do anything that might be asked of them,” says Rooney. “We didn’t spend much time setting up the faces because we knew there wouldn’t be much in terms of lip synching. [Rather], we focused on swimming, building dams, preening, and other beaver stuff. That way, when we got the board, we’d have an animation library built up and could hit the ground running.”
Using Maya, the Framestore team modeled and animated 23 CG beavers, matching their look to the animatronic star of the commercial.
Once the storyboard came in, Rooney split up the shots. “We’d already done the more naturalistic beaver animation,” he says. “Now we had to go back and tell a story through blocked performances and, in particular, a couple of close-up shots of digi-beavers. We had to be faithful to the animatronic fiddler. We had to make sure ours looked like beavers in terms of their performances, and yet get across character with body language and facial expressions.”
The Framestore crew crafted its CG creatures using Autodesk’s Maya for the modeling, rigging, and animation chores, with some pre-comping done in Apple’s Shake. Maya and Mental Images’ Mental Ray were used for the rendering.
In the end, Framestore created 23 digital beavers, with two different hairstyles (dry and wet), using its own grooming system and a newly developed fur shader. “That was more important for the close-up shots, where the beavers were quite large in frame,” says Bichsel. “We also had to make sure they had distinctive color differences. We were able to control the root and tip color of the fur and how coarse and shiny it was, so we manipulated all these channels to get them to look different. With our system, you can paint while looking in the 3D viewport and move the direction of hair, and it updates very quickly.” As a result, the team spent less time on troubleshooting, contends Bichsel, and more time on grooming, painting, and general creativity.
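The root and tip color control Bichsel mentions boils down to a blend along the strand. A minimal sketch follows, with illustrative colors and names—this is not Framestore’s actual fur shader, just the underlying idea of treating root color, tip color, and blend position as paintable channels.

```python
# A hedged sketch of the root/tip color control described above: each
# hair's color is a blend from a root color to a tip color along the
# strand's normalized length. Channel names here are illustrative.

def hair_color(root_rgb, tip_rgb, t):
    """Linearly blend from root color (t=0) to tip color (t=1)."""
    t = max(0.0, min(1.0, t))  # clamp the strand parameter
    return tuple(r + (p - r) * t for r, p in zip(root_rgb, tip_rgb))

# Dark brown roots, lighter tan tips -- sampled at the strand midpoint:
mid = hair_color((0.20, 0.12, 0.05), (0.60, 0.45, 0.25), 0.5)
```

Varying the root and tip colors (plus coarseness and shine) per beaver is what gives 23 digital animals distinct looks from one shared groom.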
Senior Flame artist Raul Ortego, meanwhile, spent most of his time doing rig removal. “It was quite simple but labor-intensive work,” he says. When the rig was behind the animatronic, the work was much easier. But sometimes the rig was in front, requiring a track (within Autodesk’s MatchMover) and the addition of fur. Some shots had camera moves that had to be replicated in Maya. For rotoscoping, the tool of choice was SilhouetteFX’s Silhouette.
“This was a really fast turnaround for creating digital creatures,” concludes Jenn Dewey, VFX producer. “The final animation was [done] a few days before we were supposed to finish lighting the whole spot, and that was two days before it shipped to Flame.” Nevertheless, the spot worked well.