Maurizio Morone, formerly a design team coordinator at Pininfarina Extra, remembers the day he presented three alternative cell-phone prototypes to a client through a live Webcam feed. All three versions sat on his desktop, yet the prototypes didn’t exist in real life—at least not at the time of the meeting. They existed merely as digital mock-ups, surfaced in Robert McNeel & Associates’ Rhino software and detailed in PTC’s Pro/Engineer software. He was able to project these models onto his desk (which does exist in reality) using LinceoVR software from Seac02.
At Ford’s Immersive Vehicle Environment lab, engineers merge digital vehicles and physical structures to test new models.
The fancy term for what Morone did is “augmented reality,” sometimes also called “mixed reality.” Either way, it is where digital and physical realities merge, morphing into a hybrid environment. It’s different from superimposing a 3D digital model onto a 2D photograph. Placing a 3D object in the foreground with a 2D image in the back (which you can do in many 3D modeling and rendering programs nowadays) gives you a static image, not an interactive scene. In Morone’s setup, he could freely move the prototype on his desk by moving a small placard that represented the digital data. On the other end of the Webcam, his clients saw a cell phone on a desk, responding in real time to Morone’s movements and the Webcam’s focus.
Forward-thinking automakers, like Ford Motor Company, hope to deploy augmented reality too, as a way to test their new models’ comfort and ergonomics and to simulate manufacturing processes. Imagine being able to see, feel, and reach for the steering wheel, speedometer, and dashboard as though you were a driver of a certain height and stature before a physical prototype of the car
exists. What would be the ideal configuration for a 6-foot-2-inch male driver? What would it be for a 5-foot-6-inch female driver? And what’s the happy medium between the two? If a mechanic needs to inspect and service the drivetrain, can that person do so without coming in contact with electrical wires or straining his or her back? The best way to answer these types of questions is to evaluate the car in mixed reality, using the digital model of the design in progress.
Design software maker Autodesk, which has frequently used the tagline “experience it before it’s real,” has prototyped several interactive mixed-reality systems. A number of them are currently
installed in its customer briefing center in San Francisco, overlooking the Ferry Building and the San Francisco Bay. Here, you’ll find that you can walk into a luxury condo that exists only as an Autodesk Revit 3D model, or drop different 3D buildings into digital landscapes and inspect them as though you were standing before the full-size model.
In December 2009, when company CTO Jeff Kowalski addressed the Autodesk faithful at the annual Autodesk University conference, he prophesied, “In the future, your real world will be much more virtual, your virtual world will be much more real, and the future is coming faster than you think.”
The future arrived even faster than Kowalski predicted. While large-scale mixed-reality setups may take time to materialize, nothing prevents average consumers from diving into mixed reality right now. The software that Morone used to present his cell phones, for instance, can be had for as little as €25 ($30 US) per year—less than the price of a standard Webcam.
Pocketbook-friendly VR
If you’re an Alfa Romeo enthusiast with just €25 in your wallet, you might buy a glossy, full-color calendar commemorating the latest models of your favorite car. You’d be hard-pressed to find any Alfa Romeo memorabilia that costs less at the online Alfa Romeo Shop (http://shop.alfisti.net). Yet you’d be glad to know that for the same price, you can buy a copy of LinceoVR from Seac02, the company that boasts the Italian automaker as one of its clients.
In what might be one of the biggest understatements by a software developer, Seac02’s founder and CEO Andrea Carignano says, “Our market is a little bit wide.” The base version of his augmented-reality software is called LinceoVR for All—that should give you a rough idea of how wide his market is.
“With [this edition], we’re addressing the lower-end market: students, technology enthusiasts, educators, and small companies,” explains Carignano. “They may not have the money to invest in a technology that’s nice to have but not a must-have solution.”
Seac02’s LinceoVR, affordable rendering software for creating augmented-reality scenes, displays a digital vehicle model on a city street (left). It can also project a digital prototype into live footage of real environments (right).
Seac02’s larger customers, like Samsung and Prada, use the Professional and Enterprise Editions of LinceoVR, which come with a number of options better suited for commercial projects. For example, the base version, LinceoVR for All, limits the augmented-reality viewport to 640 x 480 resolution, whereas the Professional and Enterprise Editions let users work in much higher resolutions. Similarly, the base version lets customers render and export images only at lower-resolution settings. Nevertheless, it is a fully functional augmented-reality setup.
LinceoVR can import common computer-aided design and 3D file formats, such as FBX, OBJ, 3DS, IGES, and STEP. Once the file is imported, the software lets a person associate the digital model with a marker, a small placard containing a computer-recognizable pattern (the user can print it on a standard printer). Essentially, this enables the software to replace the marker with the digital model in the live video feed, so when a person holds the marker in front of the Webcam, the client on the other end sees the user holding the digital object.
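This marker-driven swap is not unique to LinceoVR; it is the general recipe behind most marker-based augmented reality. As a rough illustration only (not Seac02’s code), the sketch below uses Python and the older cv2.aruco interface from opencv-contrib-python to detect a printed marker in a webcam feed and recover its position and orientation, which is the information a renderer needs to draw a digital model in the marker’s place; the camera intrinsics and marker size are assumed values.

```python
# Minimal marker-based AR sketch with OpenCV's ArUco module (illustrative only).
import cv2
import numpy as np

# Assumed intrinsics for a 640 x 480 webcam; a real setup would calibrate.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)            # assume negligible lens distortion
MARKER_SIDE = 0.05            # the printed placard is 5 cm wide

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

# The marker's four corners in its own coordinate frame (meters),
# in the order ArUco reports them: top-left, top-right, bottom-right, bottom-left.
obj_pts = np.array([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]],
                   dtype=np.float32) * (MARKER_SIDE / 2)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
    if ids is not None:
        # Pose of the marker relative to the webcam. A renderer would draw
        # the digital model at this pose; here we just draw the pose axes.
        _, rvec, tvec = cv2.solvePnP(obj_pts, corners[0][0], K, dist)
        cv2.drawFrameAxes(frame, K, dist, rvec, tvec, MARKER_SIDE)
    cv2.imshow("augmented view", frame)
    if cv2.waitKey(1) == 27:  # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```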
For hobbyists, the gimmick itself might be rewarding enough. But for small design shops and consultants, the augmented-reality solution offers a way to let the client see the concept in the physical environment where the end product might be deployed. E-mailing the client a rendered image is now the standard practice, hardly worth mentioning. But being able to put a lipstick-red cell phone on a mahogany desk or place a chrome-plated laptop under a lamp light when the cell phone and the laptop in question exist only as digital models—that elevates the art of presentation to a whole new level.
Components of an augmented-reality setup (left to right): a small gadget representing the camera, placards printed with textures of scanned materials, a stand-in for the camera (here in the shape of a kid figurine) positioned inside a building’s blueprint, and a small model home associated with its digital counterpart.
Beyond aesthetics, there also may be benefits to the technology when it comes to decision-making. LinceoVR user Morone recalls, “Once, we made a desktop lamp for a client. They weren’t sure about the right dimensions. So after we made one mock-up, they wanted us to make another mock-up, but larger. That would have cost them more money, so I told them, ‘I can show you the lamp in virtual reality.’ ”
In LinceoVR’s augmented reality, Morone created a larger digital mock-up simply by scaling the existing digital model. He showed the client two digital mock-ups side by side—an illusion achieved by positioning two copies of the same digital model at different scales. With this effortless pixel push, the indecisive client swiftly came to a decision on the dimensions for the lamp.
Augmented-reality footage created in LinceoVR can be exported as standard movie files. But it may also be exported in a format viewable in LinceoVR Viewer, a free download from Seac02. Unlike a movie file, a scene preserved for LinceoVR Viewer remains interactive, so the recipient may move, tumble, and rotate the digital model embedded in it.
“You can’t edit a scene in the free viewer,” explains Carignano, “so you can’t apply different materials or select a different HDR background, but you can do everything else. It’s just like an extended LinceoVR license.”
While LinceoVR can be used to render photorealistic scenes, its output may not be as mathematically accurate as that of higher-end rendering programs. For example, LinceoVR doesn’t employ global illumination, and its raytracing engine is, by the founder’s own admission, not on par with what’s in, say, Luxion’s KeyShot or Autodesk 3ds Max.
In April, Seac02 struck an agreement with Think3 to add features from the basic edition of LinceoVR to Think3’s ThinkDesign 3D software. This makes ThinkDesign one of the first CAD software packages to include an augmented-reality function out of the box. Seac02 is also preparing to release a PowerPoint plug-in for LinceoVR to let users embed mixed-reality scenes in standard PowerPoint presentations.
Pixel-Perfect Craftsmanship
In Henry Ford’s hometown of Dearborn, Michigan, Elizabeth Baron prepares to shepherd the red-blooded American motor company into virtual reality. Baron oversees Ford’s Immersive Vehicle Environment (iVE) lab, a state-of-the-art facility she helped spawn. Here, designers and engineers shuttle between physical space and the digital realm using sophisticated technologies. This hybrid workflow allows them to understand, address, and refine a future Ford owner’s comfort before he or she ever sets foot in the car.
“With virtual-reality technology,” says Baron, “we can marry a bare-bones vehicle structure with a beautiful digital model that has all the details, including climate-control devices, then we can inspect it for craftsmanship: Are there exposed fasteners? Are the wire routings in the way? What does the headlamp housing look like? It’s hard to gauge what the customer would see [without a physical prototype]. That’s what’s so powerful about virtual reality and augmented reality. It can represent the customer’s point of view.”
In Autodesk’s mixed-reality setup for exploring building exteriors, the tilt, rotation, and angle of the gadget representing the camera determine the point of view of the building exterior on the monitor. The small-scale building represents the detailed digital model to be visualized.
One of the virtual-reality systems Baron is evaluating is Realtime Technology’s (RTT’s) DeltaGen and RealView software.
Photorealistic digital prototypes built in RTT DeltaGen can be used with the RTT Immersive module, which allows users to examine a 3D model through a virtual-reality head-mounted display. With the RTT RealView module, a person can superimpose digital mock-ups onto live Webcam feeds and camera views. Those who wish to deploy the immersive system in panoramic CAVE environments may distribute the visualization tasks over computer clusters using the RTT Scale module.
One of the challenges of using a virtual-reality setup, acknowledges Baron, is “convincing people that the output we see is reliable, that it really does represent the real world.” Now that she and her colleagues have overcome this hurdle, they confront another challenge in validating mixed-reality systems.
Ford evaluated the use of RTT RealView to replace key vehicle features by overlaying several virtual design alternatives on a physical model. Baron says she can see the potential in this approach. “It is compelling when the virtual and physical align as expected,” she says. “The virtual data is validated by its correct placement on the physical data.”
What often frustrates Baron is the disparity between hardware and software technologies. For instance, even though the latest generations of CPUs and GPUs are designed for multithreaded computing, many professional engineering software developers have not refined their code to take advantage of it. “I had a 64-bit hardware system for at least two years before I began to see 64-bit apps,” she observes with a chuckle.
Constructing a physical vehicle mock-up in clay or foam on structural frames usually takes about six weeks, with a price tag of $250,000 to $1 million, estimates Baron. (The greater the level of detail, the higher the cost.) Ford used to build four prototypes for each new vehicle model in development. Today, the number has been reduced to one. “I’m a huge advocate of doing immersive design reviews,” says Baron. “I think that’s an area untapped by some companies that could benefit from it. It gives us the power to represent you, the customer.”
Seeing What’s Not There
Design software maker Autodesk often gives the public sneak peeks of its technologies under development by making them available through Autodesk Labs at http://labs.autodesk.com (see “Lab Report,” April 2010). Project Newport, described as “a real-time 3D story-building technology designed specifically for architectural visualization and presentation,” is one such technology. It represents an amalgam of the company’s game visualization and CAD modeling software. When hooked up to a physical interface, it becomes a mixed-reality system that lets people see what’s not really there.
A critical component of Autodesk’s mixed-reality systems (currently in the prototype stage and not yet commercially available) is the Mixed Reality Interface hardware developed by technology partner Komme-Z (www.kommerz.at), which functions as the navigation space. A series of hidden cameras mounted inside track the movement of a mouse across the interface’s top surface, allowing the system to translate the movement it detects into the hypothetical 3D space projected onto the display monitor. This setup lets a person navigate, for instance, inside a digital building model constructed in Autodesk Revit software simply by moving a mouse along a 2D floor plan.
The same setup allows the mouse to be used as a virtual camera, aimed at another physical object associated with 3D data. Therefore, by moving or tilting the mouse, users can inspect the digital data as though they were standing before the full-scale structure. The mouse, in fact, could be any portable object (for example, a toy figurine), as long as its bottom surface is equipped with a pattern recognizable by the cameras and the computer.
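Conceptually, turning a tracked tabletop object into a walkthrough camera is a small coordinate transformation. The snippet below is a hypothetical sketch, not Autodesk’s implementation: given the 4 x 4 pose of the tracked pattern in the floor plan’s coordinate frame, it scales the translation from tabletop units up to full-size building units and inverts the result to obtain the view matrix a renderer would use.

```python
# Hypothetical sketch: map a tracked tabletop pose to a virtual-camera view matrix.
import numpy as np

def view_matrix_from_tracked_pose(T_marker, plan_scale):
    """T_marker: 4x4 pose of the tracked object in the floor-plan frame.
    plan_scale: e.g. 100.0 for a 1:100 printed floor plan."""
    T_cam = T_marker.copy()
    T_cam[:3, 3] *= plan_scale          # tabletop millimeters -> building millimeters
    return np.linalg.inv(T_cam)         # world-to-camera (view) matrix for the renderer

# Example: the "mouse" sits 150 mm along the plan's x axis, rotated 90 degrees
# about the vertical axis, so the virtual camera stands 15 m into the building.
theta = np.radians(90)
T = np.eye(4)
T[:3, :3] = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                      [np.sin(theta),  np.cos(theta), 0.0],
                      [0.0, 0.0, 1.0]])
T[:3, 3] = [150.0, 0.0, 0.0]
view = view_matrix_from_tracked_pose(T, plan_scale=100.0)
```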
In augmented reality, a marker (often a small placard with a unique printed pattern) can represent any digital data, allowing the software to project that data (a 3D building model or an automobile, for instance) into the camera’s view. This allows a person to swap materials in a digital building, or superimpose scanned textures on a real building, simply by swapping markers.
Brian Pene, an Autodesk technical evangelist, observes, “With the pervasiveness of tablet PCs and portable computing devices, the future is in mobile computing. Imagine if the computer vision can track my hand, then my hand itself can become an interface to manipulate objects in augmented reality.”
In another form of augmented reality, Autodesk’s MatchMover camera-tracking software may be used to superimpose 3D digital buildings onto video footage by associating key points in the footage with corresponding vertices in the digital building. “The technology has been available for some time, but it used to require batch processing,” explains Brian Mathews, vice president of Autodesk Labs. “With the increased processing power available, we can now do this kind of visualization in real time.” He did his most recent demonstration on a standard laptop. It went off without a hitch.
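The underlying idea of tying footage to a digital model can be sketched in a few lines. The following is a generic illustration of the technique, not MatchMover’s code: a handful of user-picked image points (hypothetical coordinates, and a hypothetical clip named site_footage.mp4) are followed from frame to frame with optical flow, and because their 3D positions on the digital building are assumed known, the camera pose can be solved for each frame and handed to a renderer that draws the building over the video.

```python
# Generic match-moving sketch with OpenCV: track 2D points, solve camera pose per frame.
import cv2
import numpy as np

K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])        # assumed camera intrinsics
dist = np.zeros(5)

# Points the user picked in the first frame, and the matching vertices of the
# digital building (hypothetical values; pixels and meters respectively).
pts_2d = np.array([[320, 200], [900, 210], [880, 600], [340, 590]],
                  dtype=np.float32).reshape(-1, 1, 2)
pts_3d = np.array([[0, 0, 0], [10, 0, 0], [10, 0, -6], [0, 0, -6]],
                  dtype=np.float32)

cap = cv2.VideoCapture("site_footage.mp4")   # hypothetical clip of the building site
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Follow the chosen key points into the new frame.
    pts_2d, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts_2d, None)
    # Camera pose for this frame; a renderer would draw the 3D building with it.
    _, rvec, tvec = cv2.solvePnP(pts_3d, pts_2d, K, dist)
    prev_gray = gray
cap.release()
```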
At the moment, software-based pixel-tracking algorithms seem to produce more accurate results than the sensors embedded in cell phones and portable devices, but as sensors become more sophisticated, deploying augmented reality may become as easy as pointing an iPhone at a marker.
Last Christmas, Home Depot launched a new line of Interactive Gift Cards. The difference between these and other gift cards is that, when recipients hold up the card before a Webcam-equipped computer, they see themselves holding a product—say, a power drill or a bucket of paint—corresponding to the value of that particular card. (For a demonstration of Home Depot’s gift card, visit http://3d.homedepot.com.)
At home or at work, before long, augmented reality may become an everyday reality.
Kenneth Wong is a freelance writer who focuses on the computer game and CAD industries, innovative usage of technology, and its implications. He can be reached at Kennethwongsf@earthlink.net.