Nvidia Inventions Promise to Make AR More Comfortable
David Luebke for Nvidia
July 27, 2017

Few moments are more magical than slipping on a headset and being instantly transported to an immersive virtual world.

To help bring such experiences to more people, we’re showing some of Nvidia Research’s latest work to heighten the magic of VR and AR at next week’s SIGGRAPH computer graphics conference in Los Angeles.

We’ll present work in two areas: what researchers call “varifocal displays,” which give users the ability to focus more naturally while enjoying VR and AR experiences; and haptics, which enhances VR and AR with touch and feel. This represents the latest in a growing body of research we’ve shared over the past decade at industry events such as SIGGRAPH, as well as academic venues.

Enhancing Focus in AR and VR

We’re demonstrating a pair of techniques that address vergence-accommodation conflict. That conflict arises when our eyes, accustomed to focusing on objects in 3D space, are presented with stereo images that carry parallax depth cues but are displayed on a flat screen at a constant optical distance. Both techniques vary the focus of virtual images in front of a user depending on where they’re looking, each solving the problem in a different way.
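To make the conflict concrete, here is a small illustrative calculation (our own sketch, with assumed numbers, not anything from the demos themselves): stereo parallax drives the eyes to converge at the virtual object’s depth, while the eyes must focus at the screen’s fixed optical distance, and the mismatch can be expressed in diopters.

```python
import math

# Illustrative sketch of the vergence-accommodation conflict.
# IPD_M and the distances below are assumed, typical values.

IPD_M = 0.063  # typical interpupillary distance (meters)

def vergence_angle_deg(distance_m):
    """Angle between the two eyes' lines of sight for a target at distance_m."""
    return math.degrees(2.0 * math.atan((IPD_M / 2.0) / distance_m))

def conflict_diopters(virtual_distance_m, screen_distance_m):
    """Mismatch between vergence demand and accommodation demand, in diopters."""
    return abs(1.0 / virtual_distance_m - 1.0 / screen_distance_m)

# A virtual object rendered 0.5 m away on a headset focused at 2 m:
print(round(conflict_diopters(0.5, 2.0), 2))  # → 1.5 (diopters of conflict)
```

The eyes converge as if the object were half a meter away, yet must focus two meters out; that sustained mismatch is a common source of discomfort in stereo displays.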

The first, Varifocal Virtuality, is a new optical layout for near-eye display. It uses a new transparent holographic back-projection screen to display virtual images that blend seamlessly with the real world. This use of holograms could lead to VR and AR displays that are radically thinner and lighter than today’s headsets.

This demonstration draws on new research from UC Berkeley’s Banks lab, led by Martin Banks, which offers evidence that our brains use what a photographer would call chromatic aberration, the colored fringes that appear on the edges of an object, to help judge where an image sits in space.

Our demonstration shows how to take advantage of this effect to better orient a user. Virtual objects at distances that should be out of focus are rendered with a sophisticated simulated defocus blur that accounts for the internal optics of the eye.

So when a user looks at a distant object, it appears in focus, while a nearby object they are not looking at appears blurry, just as in the real world. When the user looks at the nearby object, the situation is reversed.
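The general idea behind gaze-contingent defocus can be sketched as follows. This is our own simplified model, not Nvidia’s actual renderer: the blur applied to an object grows with the dioptric difference between its depth and the depth the user is fixating, roughly following a thin-lens circle-of-confusion model.

```python
# Sketch of gaze-contingent defocus blur (a simplified assumption of the
# general technique, not the demo's actual rendering pipeline).

PUPIL_DIAMETER_M = 0.004  # assumed pupil size

def blur_radius_px(object_depth_m, fixation_depth_m, px_per_rad=10000.0):
    """Approximate retinal blur-circle radius for an object, in screen pixels."""
    dioptric_diff = abs(1.0 / object_depth_m - 1.0 / fixation_depth_m)
    blur_rad = PUPIL_DIAMETER_M * dioptric_diff / 2.0  # radians, thin-lens model
    return blur_rad * px_per_rad

# Gaze on a distant object (4 m): a nearby object (0.5 m) gets blurred.
print(round(blur_radius_px(0.5, 4.0), 1))  # → 35.0
# Gaze shifts to the nearby object: the distant one gets the same blur instead.
print(round(blur_radius_px(4.0, 0.5), 1))  # → 35.0
```

A renderer would use such a radius to size a per-object blur kernel each frame, driven by live gaze-tracker input.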

The second demonstration, Membrane VR, a collaboration between the University of North Carolina, Nvidia, Saarland University, and the Max Planck Institutes, uses a deformable membrane mirror for each eye that, in a commercial system, could be adjusted based on where a gaze tracker detects a user is looking.

The effort, led by David Dunn, a doctoral student at UNC and an Nvidia intern, allows a user to focus on real-world objects that are nearby or far away while still being able to see virtual objects clearly.

For example, a label displaying a person’s name might appear to a user to actually float above that person’s head, creating an experience that blends the virtual and real worlds more seamlessly. (To learn more, read the award-winning paper Dunn co-authored on this technique.)
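The control idea described above can be sketched as a simple per-frame loop. Everything here is our own illustrative assumption (the real Membrane VR hardware and its interfaces are not a public API): the gaze tracker reports the depth the user is looking at, and the membrane mirror’s optical power is nudged toward the matching focal demand each frame.

```python
# Illustrative gaze-driven varifocal control loop (assumed names and
# numbers; not the actual Membrane VR software).

def target_focus_diopters(gaze_depth_m):
    """Focal power needed so virtual imagery appears at the gazed depth."""
    return 1.0 / max(gaze_depth_m, 0.1)  # clamp to avoid divide-by-zero

def step_membrane(current_d, target_d, gain=0.5):
    """Move the membrane's optical power a fraction of the way per frame."""
    return current_d + gain * (target_d - current_d)

# Simulate the membrane settling as gaze jumps from 2 m away to 0.5 m.
power = target_focus_diopters(2.0)  # 0.5 D
for _ in range(6):
    power = step_membrane(power, target_focus_diopters(0.5))
print(round(power, 2))  # → 1.98, approaching the 2.0 D target
```

In a real system the gain and settling behavior would be set by the membrane’s physical response time, so the display refocuses within a saccade rather than over many visible frames.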

New Ideas in Haptics

We’re also showing off two new techniques that use fluid elastomer actuators (small air chambers) to provide haptic feedback that enhances VR and AR by connecting what you see on your display to what you feel in your hand. Both were created by Cornell University in collaboration with Nvidia.

One is a prototype VR controller that lets VR users experience tactile feedback while they play, relaying a sense of texture and changing geometry. Its soft skin can safely provide force feedback, as well as simulate different textures and materials.

The second is a controller that changes its shape and feel as you use it. So, a foam sword — the kind you might wave around at a sporting event — feels soft and squishy, yet can transform, in a moment, into a katana that feels longer and firmer in your grip.

We’ve integrated these novel input devices with our VR Funhouse experience. You’ll feel a knock when you whack-a-mole with a mallet in the game, or a kick when you fire at plates in a shooting gallery with antique revolvers.

Learn More

There is a lot of work left to be done to take ideas like these to market and make VR and AR more comfortable for users. But from optics to haptics, Nvidia is committed to solving the industry’s hardest technology problems in order to drive mass adoption of VR and AR.

Come see our latest ideas on display at SIGGRAPH’s Emerging Technologies exhibit. And don’t forget to stop by our booth for demonstrations of how you can put technologies such as AI and VR to work.