
Ride the Wave: Augmented Reality Devices Rely on Waveguides | Radiant Vision Systems


Augmented and mixed reality (AR/MR) devices share a key characteristic that sets them apart from virtual reality (VR) devices: they are transparent. The hallmark of VR devices is how they completely encompass the wearer’s field of view inside a headset to create an immersive virtual environment. By contrast, AR/MR smartglasses and headsets project images onto a clear display surface that enables the wearer to see through to the real world.

Waveguides in AR/MR Devices

The principal technology that has enabled augmented visualization in AR/MR is the waveguide: a thin piece of clear glass or plastic with specific light-transmitting properties. Already a well-established concept, waveguides have been used in many technology applications, such as fiber optics, LED backlights, and holograms. In all of these applications, waveguides are used to “guide” electromagnetic waves in specific directions, shapes, or patterns.

In near-eye devices (NEDs), optical waveguides help bend and combine light to direct it into the eye and create the virtual images seen by the wearer overlaid onto the environment. They propagate a light field via the mechanism of total internal reflection (TIR), bouncing light between the inner and outer edges of the waveguide layer with very little light leakage. 
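TIR occurs only when light strikes the waveguide surface beyond the material's critical angle, which follows directly from Snell's law. A minimal sketch (the refractive index below is a typical, hypothetical value, not a figure from this article):

```python
import math

def critical_angle_deg(n_inside: float, n_outside: float = 1.0) -> float:
    """Angle of incidence (from the surface normal) beyond which light is
    totally internally reflected instead of escaping the waveguide."""
    if n_outside >= n_inside:
        raise ValueError("TIR requires the denser medium inside the waveguide")
    return math.degrees(math.asin(n_outside / n_inside))

# Hypothetical high-index waveguide glass (n ~ 1.8) against air (n = 1.0):
print(round(critical_angle_deg(1.8), 1))  # ~33.7 degrees
```

Any ray bouncing at a steeper angle than this (measured from the normal) stays trapped in the layer, which is what lets the waveguide carry the image with very little leakage.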

Schematic illustration of the mechanism of total internal reflection (TIR) for an AR device: a planar glass waveguide layer receives light via an input coupler (entrance pupil) and bounces it between the edges of the layer without losing (leaking) light until it reaches an output coupler (exit pupil). (Image source)

VR headsets can display images from a projector or imaging system placed directly in front of the wearer, but AR/MR devices need “see-through” functionality. “The imaging system cannot block the front view, which therefore requires one or several additional optical elements to form an ‘optical combiner.’ The optical combiner reflects virtual images while transmitting external light to human eye, overlaying the virtual content on top of the real scene, for them to complement and ‘augment’ each other.”1  

In a NED, waveguide technology uses “an image projector tucked away out of the line of vision [that] projects the image into a small peripheral area of the display lens, then propagates it along the lens to an extraction point in front of the eye.”2 Essentially, the waveguide acts “as a transparent periscope with a single entrance pupil and often many exit pupils.”3

Comparing the configuration of a VR device (left) with display and optical modules (lens, projector, opaque display surface) directly in the user’s field of view, with an AR device (right) where a transparent optical combiner (waveguide) receives light input from the display and the real-world simultaneously, combining them to present the user with an integrated scene.

Waveguide Adoption

To gain traction in the consumer marketplace, AR/MR devices need to provide a high-quality image with a large field of view (FOV) in a compact form factor with minimal weight or bulk. There are many types of optical combiners, but to date the only type that has achieved the desired visual quality for AR in such a small package is the waveguide.4 Waveguides have become a key component in many of the AR devices released to market thus far, including HoloLens, Magic Leap, and others.

Adoption of AR/VR devices has been on a growth trajectory that is expected to accelerate, from industry revenues of US$ 17.67 billion in 2020 to more than $26 billion by 2028, a CAGR of 4.3%.5  Head-mounted devices (HMDs) such as smartglasses represent roughly 65% of the market.6 AR/MR applications in healthcare, construction and architecture, education, and navigation are helping to drive adoption, and spurring continuing development of optical waveguides. 
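As a quick illustration of how a compound annual growth rate (CAGR) figure like the one cited relates start and end revenues, the formula can be sketched in a few lines of Python (the numbers below are round, hypothetical values, not the cited market figures):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate that grows
    `start` into `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# Hypothetical round numbers: revenue growing from 100 to 140 over 8 years
# corresponds to roughly 4.3% per year.
print(round(100 * cagr(100.0, 140.0, 8), 1))  # ~4.3
```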

Waveguide Structures

Waveguides used in AR/MR devices are typically a glass substrate that can range from a fraction of a millimeter to a few millimeters thick. Variations in the waveguide and coupler architecture, along with the application of surface gratings or coatings, enable developers to create virtually limitless waveguide structures to serve different purposes. “The core of a waveguide combiner consists of the input and output couplers. These can be either simple prisms, micro-prism arrays, embedded mirror arrays, surface relief gratings, thin or thick analog holographic gratings, metasurfaces, or resonant waveguide gratings,”7 along with beam splitters, curved combiners, or free-form optics, all of which offer specific advantages and limitations. Stacking multiple waveguide combiners can propagate enhanced color images with a wider field of view.

Schematic illustration of a multi-layer waveguide: each waveguide combiner layer transmits one portion (red, green, blue) of the light wavelength spectrum. Air gaps between each layer produce the desired TIR condition and allow for potential additional spectral or polarization filtering. (Image source: AR VR Journey)

Four main types of waveguides are commonly used in modern HMD systems for AR/MR applications:

Reflective – uses a molded plastic substrate to guide the light waves and a semi-reflective mirror in front of the eye. Images are generated on a micro-display, magnified by a collimating lens, and “the collimated light waves are transmitted to the semi-reflective mirror through the waveguide. Finally, the human eye sees the images reflected”8 by the mirror and the real world at the same time. Google Glass and Epson’s Moverio devices use reflective-type waveguide structures.

Polarized – also known as transflective, polarized waveguides require multiple layers of coating and polarized reflectors, which are aligned in parallel and polished to guide the light waves effectively. Lumus used this type of waveguide in their see-through AR products.

Diffractive – the most widely used waveguide structure for AR displays. Incident light waves enter the waveguide at an angle via a slanted grating, called the in-coupler. The light passes through the waveguide and is extracted at the exit pupil through a second slanted grating, the out-coupler. The in- and out-couplers are typically produced with a diffractive optical element (DOE) fabricated with slanted gratings. Vuzix Blade smartglasses, Microsoft HoloLens, and Magic Leap One are a few examples of devices that use a diffractive waveguide structure.

Holographic – these are structured like diffractive waveguides, with holographic optical elements (HOEs) used as the in- and out-couplers in place of DOEs. The HOEs can reflect monochromatic or polychromatic (RGB) light waves. The HOE is fabricated in a hologram recording process, using laser illumination at the intended incidence angle.

Schematic illustration of some common waveguide structures for AR/MR devices with corresponding images of grating structures: (a) polarized (transflective), (b) diffractive with surface gratings, and (c) holographic (diffractive with volumetric holographic gratings). (Image source: AR VR Journey)

The core of any diffractive waveguide is the grating, defined as a “periodic optical structure, whose periodicity can either be represented by the embossed peaks and valleys on the surface of the material, or by the ‘bright/dark’ fringes formed by laser interference in the holographic technology.”9
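The grating's period sets the angle at which light is coupled into (or out of) the waveguide, via the grating equation. A minimal sketch, assuming hypothetical wavelength, period, and refractive-index values:

```python
import math

def diffraction_angle_deg(wavelength_nm: float, period_nm: float,
                          n_waveguide: float = 1.8, order: int = 1,
                          incidence_deg: float = 0.0) -> float:
    """Grating equation for an in-coupler with light arriving from air:
    n_wg * sin(theta_m) = sin(theta_i) + m * wavelength / period.
    Returns the diffracted angle inside the waveguide (degrees from normal)."""
    s = (math.sin(math.radians(incidence_deg))
         + order * wavelength_nm / period_nm) / n_waveguide
    if abs(s) > 1:
        raise ValueError("no propagating diffracted order for these values")
    return math.degrees(math.asin(s))

# Green light (532 nm) at normal incidence on a hypothetical 400 nm-period
# grating coupling into an n = 1.8 substrate:
print(round(diffraction_angle_deg(532, 400), 1))  # ~47.6 degrees
```

A designer picks the period so the diffracted angle lands beyond the substrate's critical angle, ensuring the coupled light propagates by TIR.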

Measuring Waveguide Performance for AR/MR Devices

When light is guided through an optical component such as a lens or a waveguide, it is always affected in some way by the physical properties of the component. By the time light from an AR waveguide reaches the eye, it has bounced many times through the optical structure, as determined by the angles of incidence and reflection, diffraction effects of the grating, and many other factors. This process reduces optical efficiency: light energy is lost as it travels from the source display to the eye.
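To get a feel for how bounce count and per-bounce loss compound, the propagation described above can be sketched with simple geometry (all values below are hypothetical illustration numbers, not measured figures):

```python
import math

def tir_bounces(path_mm: float, thickness_mm: float, angle_deg: float) -> int:
    """Approximate bounce count for light propagating along a planar
    waveguide; each bounce advances the ray by thickness * tan(angle)."""
    advance = thickness_mm * math.tan(math.radians(angle_deg))
    return math.ceil(path_mm / advance)

def residual_efficiency(reflectance_per_bounce: float, bounces: int) -> float:
    """Fraction of light remaining when every bounce loses a little energy."""
    return reflectance_per_bounce ** bounces

# Hypothetical values: 25 mm from in- to out-coupler, 0.5 mm substrate,
# propagation at 50 degrees from the surface normal, 0.5% loss per bounce.
n = tir_bounces(25, 0.5, 50)
print(n, round(residual_efficiency(0.995, n), 2))  # ~42 bounces, ~0.81 remaining
```

Even a half-percent loss per bounce compounds to a noticeable drop in delivered light over a few dozen bounces, which is why small surface and coating imperfections matter so much.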

This effect can result in poor brightness, contrast, and clarity of the virtual image as perceived by a human viewer. It is especially important to control image quality in see-through NED designs, where superimposed images must be visible and clear to a user in a range of ambient lighting conditions. Diffractive waveguides are prone to issues with color tones in grayscale images, causing uneven brightness (luminance) and color (chromaticity) in the images.

Projection from a diffractive waveguide exhibiting non-uniformity of both brightness and color tones on mid-level gray pixels. (Image source).

A recent study10 demonstrated how, depending on the incident angle, even slight roughness of the optical waveguide surface can degrade the image quality of a display, as measured by an MTF (modulation transfer function) analysis.

Images from a study of effects of incident angles on the waveguide surface. (a) and (b) are 65° and 75°, respectively. Grayscale images (c) and (d) correspond to images (a) and (b). MTF calculations are made in areas (e) and (f). (Image source)
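MTF at a given spatial frequency is commonly computed as the ratio of image contrast to object contrast. A minimal sketch of that calculation, with hypothetical intensity values (this is the generic definition, not the specific analysis pipeline of the cited study):

```python
def michelson_contrast(i_max: float, i_min: float) -> float:
    """Contrast of a periodic test pattern from its peak and trough intensities."""
    return (i_max - i_min) / (i_max + i_min)

def mtf(image_contrast: float, object_contrast: float) -> float:
    """MTF at one spatial frequency: ratio of image to object contrast."""
    return image_contrast / object_contrast

c_in = michelson_contrast(1.0, 0.0)   # ideal input pattern: contrast 1.0
c_out = michelson_contrast(0.8, 0.2)  # blurred pattern measured at the output
print(round(mtf(c_out, c_in), 2))  # 0.6
```

Surface roughness that smears the pattern lowers the measured output contrast, and the MTF drops accordingly, matching the angle-dependent degradation the study reports.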

One method to test waveguide performance during development is to project light through the waveguide from a light source or picture generating unit (PGU). The resulting image can then be evaluated for parameters such as brightness (luminance), color (chromaticity), uniformity, and sharpness. Colorimetric imaging systems are an effective tool for capturing the absolute values that help guide optical design. Radiant’s ProMetric® I-Series Imaging Colorimeters can measure values including luminance and chromaticity and evaluate them across the entire image to determine uniformity. Because the measurement data is quantifiable, numerical values can be compared across design iterations based on different structural changes to the waveguide.

A picture generating unit projects an image through the waveguide; output is measured by ProMetric Imaging Colorimeter and analyzed using TT-ARVR™ Software (upper right, example analysis image shown in false-color scale).
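A simple uniformity figure of merit derived from such colorimeter measurements is the ratio of minimum to maximum luminance across sample points. A sketch with hypothetical readings (this is a generic display metric, not necessarily the specific computation TT-ARVR performs):

```python
def uniformity_percent(luminance_samples) -> float:
    """Min/max luminance across measurement points, as a percentage;
    100% means a perfectly uniform field."""
    return 100.0 * min(luminance_samples) / max(luminance_samples)

# Hypothetical luminance readings (cd/m^2) at nine points across the image:
readings = [118, 121, 124, 119, 125, 122, 110, 117, 120]
print(round(uniformity_percent(readings), 1))  # 88.0
```

Because the metric is a single number per captured image, it is easy to track across waveguide design iterations.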

For more information about quality testing the latest devices and components of AR, VR, and MR systems (collectively, XR), watch the webinar: “Novel Solutions for XR Optical Testing: Displays, Waveguides, Near-IR, and Beyond.” Presented by Radiant in cooperation with Photonics Media, the webinar provides guidance on optical performance testing for AR/VR/MR devices and components such as waveguides. In it, we demonstrate novel technologies that emulate the human eye and are optimized for accuracy, speed, and ease of use at each stage of component development and production.
