Technique: Virtual Restoration
Issue: Volume 27, Issue 9 (September 2004)


One of the most stunning animations premiering in the Electronic Theater at SIGGRAPH 2004 was a computer graphics tour de force, titled "The Parthenon," which, in two and a half minutes, visually restores to its original glory the ancient Greek monument that has stood at the head of the Athenian Acropolis for two and a half millennia.

The film—created by a team led by Paul Debevec, executive producer of graphics research at the Institute for Creative Technologies (ICT) in Los Angeles and research assistant professor at USC—applied new techniques to reassemble and renew the Parthenon, considered to be one of the world's most architecturally refined structures.

Completed in 432 B.C. as a temple to the goddess Athena, the Parthenon has been damaged by wars and the elements over the centuries. Adding insult to injury, most of the classical statues and sculptures decorating the monument were removed in the early 1800s and placed in the British Museum, where they remain on display today.

To create this photoreal scene of the Parthenon, Paul Debevec's team applied several novel modeling and rendering techniques.

To create CG models of the sculptures, Debevec's group developed a 3D scanning system consisting of a standard 1024x768 video projector and a 1Kx1K video camera. With the device, researchers captured 2200 scans, accurate to approximately 1 mm, which they assembled into precise models. Next, they employed a 3D laser scanner, taking some 120 panoramic scans of the Parthenon and collecting more than six billion point measurements, which were assembled into a 90-million-polygon model.
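The principle behind such a projector-camera scanner is triangulation: once structured-light patterns reveal which projector ray illuminated a given camera pixel, the surface point lies where the camera ray and projector ray meet. The sketch below illustrates that geometry with a hypothetical calibrated rig (the origins, baseline, and test point are made up, not the team's actual setup):

```python
import numpy as np

def triangulate(o_cam, d_cam, o_proj, d_proj):
    """Midpoint of the shortest segment between the camera ray and the
    projector ray -- for a calibrated rig, the rays (nearly) intersect
    at the scanned surface point."""
    d1 = d_cam / np.linalg.norm(d_cam)
    d2 = d_proj / np.linalg.norm(d_proj)
    b = o_proj - o_cam
    c = d1 @ d2                                   # cosine of the angle between rays
    # Solve the least-squares conditions for the ray parameters t1, t2.
    t2 = (d2 @ b - c * (d1 @ b)) / (c * c - 1.0)
    t1 = d1 @ b + c * t2
    p1 = o_cam + t1 * d1                          # closest point on camera ray
    p2 = o_proj + t2 * d2                         # closest point on projector ray
    return 0.5 * (p1 + p2)

# Hypothetical rig: camera at the origin, projector 0.5 m to its right.
point = np.array([0.2, 0.1, 2.0])                 # surface point to recover
cam_origin = np.zeros(3)
proj_origin = np.array([0.5, 0.0, 0.0])
recovered = triangulate(cam_origin, point - cam_origin,
                        proj_origin, point - proj_origin)
```

Decoding which projector column lit each pixel (typically via Gray-code or phase-shift patterns) is what turns this two-ray intersection into millions of 3D samples per scan.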

Once the CG models were completed, the team devised a novel method of generating accurate color texture maps that overcomes the shortcomings of traditional techniques. Rather than simply using digital photographs—which record surface colors only as they appear under lighting conditions that can vary widely according to cloud cover, time of day, shadows, and so forth—the researchers recorded precise lighting measurements as they took each photo. The readings were taken with a new "light probe" device from Debevec's group that measures the incident illumination—that is, all the light arriving at a given point, including direct light from the sun as well as indirect light from the sky, clouds, and ground.

Using this data, the group applied an inverse global-illumination technique to "un-light" the photographs and reveal the true colors of the temple's surfaces.
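Under a simplifying Lambertian (diffuse) assumption, the heart of this "un-lighting" step reduces to a per-pixel division: a photograph records surface color multiplied by the incident irradiance, so dividing by the probe's measurement recovers a lighting-independent albedo. The sketch below uses invented values, not the team's data, and omits the full inverse-global-illumination machinery (interreflections, non-diffuse materials):

```python
import numpy as np

# Hypothetical 2x2-pixel "photographs" of the same marble surface.
# true_albedo is the lighting-independent color we want to recover.
true_albedo = np.array([[0.30, 0.55],
                        [0.72, 0.18]])

# Per-pixel incident irradiance measured by the light probe for two
# shooting conditions (illustrative values): direct sun vs. overcast sky.
irradiance_sunny = np.array([[2.1, 2.1],
                             [0.4, 2.1]])   # one pixel falls in shadow
irradiance_cloudy = np.full((2, 2), 0.6)

# What the camera records: albedo scaled by however much light arrived.
photo_sunny = true_albedo * irradiance_sunny
photo_cloudy = true_albedo * irradiance_cloudy

# "Un-lighting": divide out the measured illumination, pixel by pixel.
albedo_from_sunny = photo_sunny / irradiance_sunny
albedo_from_cloudy = photo_cloudy / irradiance_cloudy
```

The payoff is that two photos taken under very different skies yield the same surface colors, which is exactly what makes the texture maps reusable under arbitrary lighting.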


When taking digital photos of the Parthenon, the researchers simultaneously captured precise incident-illumination data using a new "light probe" device.

Another advantage of the approach is that the results can also be used to re-light and render the virtual scene under any desired illumination. So, rather than waiting in Greece to capture lighting with a desired combination of clouds and sun, the team returned to Los Angeles and recorded illumination from the roof of the ICT using a new high-dynamic-range (HDR) imaging process. According to Debevec, the technique is the first to directly capture the full dynamic range of natural illumination—from the dark predawn sky to the brilliant light of the midday sun—a range of more than 100,000 to one. The researchers then used the results to create the time-lapse lighting in their final renderings of the Parthenon.
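The multiple-exposure idea behind HDR capture can be sketched as follows. Assuming a linear camera response (published HDR methods also recover the sensor's response curve, which this sketch omits), each exposure gives a valid radiance estimate wherever its pixels are neither black nor saturated, and a weighted average merges the bracket into one radiance map. The exposure times and radiance values below are illustrative, not the team's measurements:

```python
import numpy as np

def merge_hdr(images, exposures):
    """Merge a bracket of clipped 8-bit exposures (linear sensor assumed)
    into a single high-dynamic-range radiance map."""
    images = np.asarray(images, dtype=float)
    # Hat-shaped weighting: trust mid-range pixels, distrust pixels that
    # are nearly black or nearly saturated in a given exposure.
    w = 1.0 - np.abs(images / 255.0 * 2.0 - 1.0)
    w = np.clip(w, 1e-4, None)                    # avoid divide-by-zero
    # Each exposure's radiance estimate: pixel value / exposure time.
    radiance_est = images / np.asarray(exposures, dtype=float)[:, None]
    return (w * radiance_est).sum(axis=0) / w.sum(axis=0)

# Four scene radiances spanning ~4 orders of magnitude (predawn to noon).
true_radiance = np.array([0.5, 20.0, 400.0, 9000.0])
exposures = [4.0, 0.25, 0.004, 0.0005]            # hypothetical bracket, seconds
# Simulate what an 8-bit sensor records: scaled radiance, clipped at 255.
images = [np.clip(true_radiance * t, 0.0, 255.0) for t in exposures]
hdr = merge_hdr(images, exposures)
```

No single exposure covers the whole range—the long one saturates on the bright values and the short one buries the dark ones in black—but the merged map recovers all four radiances.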

In "The Parthenon" animation, the lighting appears photoreal, even as the intensity of the illumination changes—moving from cloudy to sunlit during the rapid time-lapse effects—and as the virtual camera moves through the digital models of the Parthenon and the British Museum.

During the short film's climactic sequence, after the statues return to their original condition and location and the long-lost pieces of the monument are refitted, the camera pulls back to reveal in dazzling color and detail how the Parthenon may have appeared when it was first built. —Phil LoPiccolo

Computer Graphics World, September 2004
Author(s): Phil LoPiccolo