
AR Research from NVIDIA


For many years fxguide has enjoyed the insane and wonderful demos and presentations at SIGGRAPH’s Emerging Technologies. This year, the booth that most ‘caught our eye’ was NVIDIA’s AR Research.

At SIGGRAPH’s Emerging Technologies area in Los Angeles last week, NVIDIA showed a pair of experimental wearable augmented reality display technologies, or glasses. It is thought that AR could grow far more popular than VR because the user is not shut off from the world. However, for wearable AR to work, it needs to address the field of view and cater for the roughly 20% of the world’s population who are myopic (a percentage that is increasing). Microsoft’s HoloLens 1, for example, was an incredible breakthrough, but its image was confined to a very narrow field of view. By comparison, the Magic Leap One fits on a user’s head in a way that does not allow the user to wear their regular glasses. The HoloLens, while having a limited field of view, does allow people to wear their glasses.

Note: Some of you may notice fxguide’s Mike Seymour CG head used below. The team used Mike’s head, but he was not involved in the AR research. He does, however, wear glasses!

Foveated AR

The issues to be addressed are complex and are not limited to display technology alone. Even if a wide field-of-view system were available, it would still be impractical to transmit video data at the required bandwidth over today’s display interface standards. One popular and promising idea that can both improve overall visual quality and reduce required bandwidth is to take advantage of the human foveal architecture, in which only a small portion of the visual field is seen in high resolution. This is referred to as a foveated display.

“Foveated AR” addresses the problem of a narrow field of view with a headset that adapts to your gaze in real time using deep learning. It adjusts the resolution of the images it displays and their focal depth to match wherever you are looking, giving both sharper images and a wider field of view than any previous AR display.

To do this, it combines two different displays per eye: a high-resolution display covering a small field of view, presenting images to the portion of the human retina where visual acuity is highest; and a low-resolution display for peripheral vision. The result is a high-quality visual experience with reduced power and computation.
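To make the two-display idea concrete, here is a minimal sketch in Python of how a high-resolution foveal inset might be composited over a low-resolution peripheral image at the gaze point. The array sizes, gaze coordinate, inset size and nearest-neighbour upsampling are all illustrative assumptions, not the prototype’s actual rendering pipeline.

```python
import numpy as np

def composite_foveated(scene_hi, gaze_xy, inset_px=256, periph_scale=8):
    """Toy foveated composite: a small high-resolution inset around the
    gaze point, with a heavily downsampled image everywhere else.

    scene_hi    : full-resolution scene, shape (H, W, 3)
    gaze_xy     : (x, y) gaze position in pixels
    inset_px    : side length of the square high-res foveal inset
    periph_scale: downsampling factor for the peripheral image
    """
    h, w, _ = scene_hi.shape

    # Peripheral path: downsample, then upsample back with nearest
    # neighbour (stands in for a genuinely low-resolution display).
    periph = scene_hi[::periph_scale, ::periph_scale]
    periph = periph.repeat(periph_scale, axis=0).repeat(periph_scale, axis=1)
    out = periph[:h, :w].copy()

    # Foveal path: paste the full-resolution inset at the gaze point.
    x, y = gaze_xy
    x0, y0 = max(0, x - inset_px // 2), max(0, y - inset_px // 2)
    x1, y1 = min(w, x0 + inset_px), min(h, y0 + inset_px)
    out[y0:y1, x0:x1] = scene_hi[y0:y1, x0:x1]
    return out

# Example: a random "scene" with the gaze near the centre of the frame.
scene = np.random.rand(720, 960, 3).astype(np.float32)
frame = composite_foveated(scene, gaze_xy=(480, 360))
```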

The display combines a traveling microdisplay relayed off a concave half-mirror magnifier for the high-resolution foveal region, with a wide field-of-view peripheral display using a projector-based Maxwellian-view display, whose nodal point is translated to follow the viewer’s pupil during eye movements using a traveling holographic optical element.

The same optics relay an image of the eye to an infrared camera used for gaze tracking, which in turn drives the foveal display location and peripheral nodal point. The display supports accommodation cues by varying the focal depth of the microdisplay in the foveal region, and by rendering simulated defocus on the “always in focus” scanning laser projector used for peripheral display.
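The update loop this implies can be sketched roughly as follows. Every function below is a hypothetical placeholder standing in for the hardware NVIDIA describes (the IR gaze camera, the travelling microdisplay, the holographic nodal-point translation and the varifocal/defocus rendering); none of this is the team’s actual firmware or API.

```python
import time
from dataclasses import dataclass

@dataclass
class GazeSample:
    yaw_deg: float    # horizontal gaze angle
    pitch_deg: float  # vertical gaze angle
    depth_m: float    # estimated fixation depth (e.g. from vergence)

# Hypothetical hardware interfaces, placeholders only.
def read_gaze_from_ir_camera() -> GazeSample:
    return GazeSample(yaw_deg=0.0, pitch_deg=0.0, depth_m=1.0)

def move_foveal_display(yaw_deg: float, pitch_deg: float) -> None:
    pass  # translate the travelling microdisplay toward the gaze direction

def move_peripheral_nodal_point(yaw_deg: float, pitch_deg: float) -> None:
    pass  # shift the holographic element so the nodal point follows the pupil

def set_foveal_focus(depth_m: float) -> None:
    pass  # vary the microdisplay focal depth to match the fixation depth

def render_frame(gaze: GazeSample, defocus_depth_m: float) -> None:
    pass  # render the foveal inset sharp; simulate defocus on the periphery

def control_loop(hz: float = 120.0) -> None:
    """Toy version of the gaze-driven update described above."""
    period = 1.0 / hz
    while True:
        gaze = read_gaze_from_ir_camera()
        move_foveal_display(gaze.yaw_deg, gaze.pitch_deg)
        move_peripheral_nodal_point(gaze.yaw_deg, gaze.pitch_deg)
        set_foveal_focus(gaze.depth_m)
        render_frame(gaze, defocus_depth_m=gaze.depth_m)
        time.sleep(period)

# control_loop()  # would run the tracker-driven update at ~120 Hz
```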

The resulting family of displays significantly improves on the field-of-view, resolution, and form factor tradeoffs of previous augmented reality designs. At SIGGRAPH, the team was showing prototypes supporting 30, 40 and 60 cycles per degree (cpd) of foveal resolution at a net 85° × 78° field of view per eye.
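Some back-of-the-envelope arithmetic shows why the foveated split matters. Using the quoted 85° × 78° field of view and 60 cpd foveal target, and assuming (our numbers, not NVIDIA’s) a roughly 20° high-resolution inset and a 5 cpd periphery:

```python
# Figures from the article: 85 x 78 degree field of view, 60 cpd fovea.
FOV_H_DEG = 85.0
FOV_V_DEG = 78.0
FOVEAL_RES_CPD = 60.0
# Assumed, illustrative values (not NVIDIA's specifications):
FOVEAL_FOV_DEG = 20.0     # angular size of the high-res foveal inset
PERIPH_RES_CPD = 5.0      # peripheral resolution target

def pixels(fov_h, fov_v, cpd):
    """Pixel count for a display covering fov_h x fov_v degrees at a
    given resolution in cycles per degree (2 pixels per cycle)."""
    return (fov_h * cpd * 2) * (fov_v * cpd * 2)

# Uniform display: the whole field rendered at foveal resolution.
uniform = pixels(FOV_H_DEG, FOV_V_DEG, FOVEAL_RES_CPD)

# Foveated display: small high-res inset plus low-res periphery.
foveated = pixels(FOVEAL_FOV_DEG, FOVEAL_FOV_DEG, FOVEAL_RES_CPD) \
         + pixels(FOV_H_DEG, FOV_V_DEG, PERIPH_RES_CPD)

print(f"uniform:   {uniform / 1e6:6.1f} Mpixels per eye")
print(f"foveated:  {foveated / 1e6:6.1f} Mpixels per eye")
print(f"reduction: {uniform / foveated:.0f}x")
```

Under those assumptions, a uniform display would need on the order of 95 megapixels per eye, versus roughly 6.4 megapixels for the foveated pair, about a 15× reduction in pixels to drive per frame.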

The physical adjustments focus the resolution of the image on the right point of the user’s eye, and the demo showed this retargeting happening dynamically.

The wearable prototype consists of a modular, 3D-printed frame that houses and aligns all of the optical/mechanical components, including a compact laser projector (MEGA1, MEGA1-F1), a beam-shaping lens (Edmund Optics, 84-281), a right-angle prism (Thorlabs, PS908), a micro OLED (Sony, ECX335R), an optical front-end from ILLUCO (combiners and half mirrors for the foveal and tracking optical paths), and the motion unit used to translate the foveal and peripheral displays in relation to each other.

Mock-up

Prescription AR

The other demo was a pair of “Prescription AR” glasses: a prescription-embedded AR display. It is much thinner and lighter and has a wider field of view than current-generation AR devices. Virtual objects appear throughout the natural field of view instead of being clustered in the center, and the display is designed to have the wearer’s prescription built right into it. If you wear corrective optics, you will know that AR is not a natural fit with glasses. The Magic Leap One, for example, is designed to take a special optical element, effectively a special pair of glasses that you buy and that magnetically clips into the headset. However, that insert, called an ‘InSpatialRx Prescription Insert’, costs hundreds of dollars and is only available for nearsighted prescriptions with a Sphere (SPH) in the range of -7.5 to 0. If you have a + (farsighted) prescription, you are out of luck.

With the NVIDIA prototypes, people with prescriptions are a step closer to being able to get comfortable, practical and perhaps even socially-acceptable AR wearable displays.


A prescription lens corrects the viewer’s vision, while a half-mirror-coated free-form image combiner delivers an augmented image at a fixed focal depth (1 m).

The prototype delivers an augmented image through a prescription lens. A beam-shaping lens and an in-coupling prism are attached to the top surface of the prescription lens, so the light rays from a micro OLED (Sony, ECX339A) enter the 5 mm-thick prescription lens and propagate inside it. After two total internal reflections inside the lens, the rays meet the half-mirror-coated free-form surface and are reflected to the eye.
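For intuition on why the rays stay inside the lens, total internal reflection occurs when a ray strikes the glass-air boundary beyond the critical angle, arcsin(n_air / n_lens). The quick check below assumes a typical ophthalmic lens index of about 1.5; the actual lens material is not specified in the article.

```python
import math

# Assumed refractive index for a typical ophthalmic lens material.
N_LENS = 1.50
N_AIR = 1.00

# Critical angle: rays hitting the inner surface at a steeper angle
# (measured from the surface normal) are totally internally reflected.
critical_deg = math.degrees(math.asin(N_AIR / N_LENS))
print(f"critical angle: {critical_deg:.1f} degrees")  # about 41.8 degrees

def totally_internally_reflected(angle_of_incidence_deg: float) -> bool:
    """True if a ray inside the lens reflects rather than refracting out."""
    return angle_of_incidence_deg > critical_deg

print(totally_internally_reflected(60.0))  # a steeply in-coupled ray: True
print(totally_internally_reflected(30.0))  # a shallow ray would escape: False
```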

The team demonstrated both a static demo and a dynamic demo. The dynamic version included real-time generation of binocular images, with the generated images transferred through a cable. The system was set up with an iPhone so the results could be seen on a screen; the phone was not an active part of the demo.

The glasses are very lightweight, just a shade over 46 grams. Above is the static prototype, which was shown in a non-tethered demo and displays printed images on light valve technology (LVT) films. The backlight module and the battery were included in the wearable prototype.


