CN112558307B - Improved manufacturing of virtual and augmented reality systems and components - Google Patents


Info

Publication number
CN112558307B
Authority
CN
China
Prior art keywords
diffractive optical
optical element
substrate
layer
pattern
Prior art date
Legal status
Active
Application number
CN202011551607.0A
Other languages
Chinese (zh)
Other versions
CN112558307A (en)
Inventor
R. D. TeKolste
M. A. Klug
P. M. Greco
B. T. Schowengerdt
Current Assignee
Magic Leap Inc
Original Assignee
Magic Leap Inc
Priority date
Filing date
Publication date
Priority claimed from US 15/007,117 (US9915826B2)
Application filed by Magic Leap Inc
Priority claimed from PCT/US2016/021093 (WO2016141372A1)
Publication of CN112558307A
Application granted
Publication of CN112558307B

Classifications

    • G02B 5/1852 Diffraction gratings; manufacturing methods using mechanical means, e.g. ruling with diamond tool, moulding
    • G02B 5/1857 Diffraction gratings; manufacturing methods using exposure or etching means, e.g. holography, photolithography, exposure to electron or ion beams
    • G02B 5/1823 Diffraction gratings; plural gratings positioned on the same surface in an overlapping or superposed manner
    • G02B 5/1842 Diffraction gratings; gratings for image generation
    • G02B 27/4205 Diffraction optics having a diffractive optical element [DOE] contributing to image formation, e.g. whereby modulation transfer function MTF or optical aberrations are relevant
    • G02B 27/017 Head-up displays; head mounted
    • G02B 27/0172 Head-up displays; head mounted, characterised by optical features
    • G02B 2027/0112 Head-up displays characterised by optical features, comprising a device for generating a colour display
    • G02B 2027/0178 Head-up displays; head mounted; eyeglass type
    • B29D 11/00355 Production of simple or compound lenses with a refractive index gradient
    • B29D 11/00432 Production of simple or compound lenses; auxiliary operations, e.g. machines for filling the moulds
    • B29D 11/00682 Production of light guides with a refractive index gradient

Abstract

The present invention relates to improved manufacturing of virtual and augmented reality systems and components. An improved diffractive structure for a 3D display system is disclosed. The improved diffractive structure includes an intermediate layer located between the waveguide substrate and the top grating surface. The top grating surface comprises a first material corresponding to a first refractive index value, the underlayer comprises a second material corresponding to a second refractive index value, and the substrate comprises a third material corresponding to a third refractive index value. According to additional embodiments, improved methods are provided to achieve deposition of imprint material on a substrate, which allows for very precise distribution and deposition of different imprint patterns on any number of substrate surfaces.

Description

Improved manufacturing of virtual and augmented reality systems and components
This application is a divisional application of the PCT application with an international filing date of March 5, 2016, international application number PCT/US2016/021093, Chinese national phase application number 201680013598.5, and the title "Improved manufacturing of virtual and augmented reality systems and components".
Technical Field
The present disclosure relates to virtual reality and augmented reality imaging and visualization systems.
Background
Modern computing and display technologies have facilitated the development of systems for so-called "virtual reality" or "augmented reality" experiences, in which a digitally reproduced image, or a portion thereof, is presented to a user in a manner that seems to be, or can be perceived as, real. Virtual reality or "VR" scenarios generally involve presenting digital or virtual image information without transparency to actual real-world visual input; augmented reality or "AR" scenarios generally involve presenting digital or virtual image information as a visual augmentation of the real world surrounding the user. For example, referring to fig. 1, an augmented reality scene (4) is depicted in which a user of AR technology sees a park-like real-world setting (6) featuring people, trees, and buildings in the background, and a concrete platform (1120). In addition to these items, the user of the AR technology also perceives that he "sees" a robot figure (1110) standing on the real-world platform (1120) and a cartoon-like avatar character (2), appearing to be a personification of a bumblebee, dancing through the air, even though these elements (2, 1110) are not present in the real world. As it turns out, the human visual perception system is extremely complex, and it is very challenging to produce VR or AR technology that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements among other virtual or real-world image elements.
Presenting 3D virtual content to users of AR systems presents a number of challenges. A central premise in presenting 3D content to a user is to create multi-depth perception. In other words, it is desirable to have some virtual content appear closer to the user, while other virtual content appears to come from a farther location. Thus, to achieve 3D perception, the AR system should be configured to deliver virtual content at different focal planes relative to the user.
In order for a 3D display to produce a realistic sense of depth, and more specifically a simulated sense of surface depth, each point in the display's field should produce an accommodative response corresponding to its virtual depth. If the accommodative response to a display point does not correspond to the virtual depth of that point, as determined by the binocular depth cues of convergence and stereopsis, the human visual system may experience an accommodation conflict, resulting in unstable imaging, harmful eye strain, headaches and, in the absence of accommodation information, an almost complete lack of surface depth.
Accordingly, there is a need for improved techniques to enable 3D displays that address these and other problems of conventional approaches. The systems and techniques described herein are configured to cooperate with the visual configuration of a normal person to address these challenges.
Disclosure of Invention
Embodiments of the present invention relate to devices, systems, and methods for facilitating virtual reality and/or augmented reality interactions for one or more users.
An Augmented Reality (AR) display system for delivering augmented reality content to a user according to some embodiments includes: an image generation source for providing one or more frames of image data; a light modulator for transmitting light associated with the one or more frames of image data; a Diffractive Optical Element (DOE) for receiving light associated with the one or more frames of image data and directing the light to an eye of a user, the DOE comprising a diffractive structure having a waveguide substrate corresponding to a waveguide refractive index, a surface grating, and an intermediate layer (also referred to herein as an "underlayer") disposed between the waveguide substrate and the surface grating, wherein the underlayer corresponds to an underlayer diffractive index that is different from the waveguide refractive index.
According to some embodiments of the invention, a diffractive structure is used for a DOE comprising an underlayer between a waveguide substrate and a top grating surface. The top grating surface comprises a first material corresponding to a first refractive index value, the underlayer comprises a second material corresponding to a second refractive index value, and the substrate comprises a third material corresponding to a third refractive index value.
Each of these features may be implemented with any combination of the same or different materials, for example, where all three materials are different (and all three materials correspond to different refractive index values), or where two layers share the same material (e.g., two of the three materials are the same, thus sharing a common refractive index value that is different from the refractive index value of the third material). Any suitable set of materials may be used to achieve any layer of the improved diffractive structure.
Thus, various combinations may be used in which an underlayer having one index of refraction is combined with a top grating having another index of refraction and a substrate having a third index of refraction, and in which adjusting these relative values provides a large amount of variation in diffraction efficiency as a function of the angle of incidence. A layered waveguide having layers of different refractive indices is provided. Various combinations and permutations and associated performance data are provided to illustrate the functionality. Advantages include an increased range of angles, which provides increased grating output angles and thus an increased eyepiece field of view. Furthermore, the ability to counteract the normal reduction in diffraction efficiency with angle is functionally beneficial.
According to additional embodiments, improved methods are provided for depositing an imprinting material on a substrate and imprinting the imprinting material to achieve a diffraction pattern. These methods allow very precise distribution, deposition and/or formation of different imprinting materials/patterns on any number of substrate surfaces. According to some embodiments, a patterned distribution of imprinting material (e.g., a patterned inkjet distribution) is performed to enable deposition of the imprinting material on the substrate. This method of using patterned inkjet distribution allows very precise volume control of the material to be deposited. Furthermore, the method can be used to provide a smaller, more uniform base layer below the grating surface.
In some embodiments, a template is provided having a first set of deeper depth structures and a second set of shallower depth structures. When the imprinting material is deposited on the imprint receptor, a relatively large volume of the imprinting material is deposited in alignment with the deeper structures of the template. In addition, a relatively small volume of imprinting material is deposited in alignment with the shallower structures of the template. This method allows for the simultaneous deposition of different thicknesses of material for different features formed on the imprint receptor. This approach can be employed to create a distribution that is purposefully non-uniform for structures having different depths and/or feature parameters (e.g., where features are located on the same substrate and have different thicknesses). This can be used, for example, to create a volume distribution of imprinting material that enables the simultaneous imprinting of a spatial distribution of structures having the same underlying layer thickness but varying depths.
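As a rough illustration of the volume bookkeeping this implies, the sketch below estimates the dispensed volume for a deep-feature region versus a shallow-feature region. This is a minimal sketch only; all dimensions and the fill fraction are invented placeholders, not values from the patent:

```python
# Toy volume bookkeeping for the dual-depth template described above: a region
# with deeper grating features needs a proportionally larger dispensed volume
# than a shallow-feature region to end up with the same residual underlayer
# thickness. All dimensions are invented placeholders.

def dispense_volume_nL(area_mm2: float, depth_nm: float,
                       fill_fraction: float, underlayer_nm: float) -> float:
    """Volume = area x (filled grating depth + residual underlayer); 1 mm^2*nm = 1e-3 nL."""
    return area_mm2 * (depth_nm * fill_fraction + underlayer_nm) / 1_000.0

deep_nL = dispense_volume_nL(area_mm2=100.0, depth_nm=300.0,
                             fill_fraction=0.5, underlayer_nm=50.0)
shallow_nL = dispense_volume_nL(area_mm2=100.0, depth_nm=100.0,
                                fill_fraction=0.5, underlayer_nm=50.0)
print(deep_nL, shallow_nL)  # 20.0 nL vs 10.0 nL: deeper features get more material
```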
Some embodiments relate to methods of enabling simultaneous deposition of multiple types of imprinting material on a substrate. This allows materials having different optical properties to be deposited simultaneously across multiple portions of the substrate. This approach also provides the ability to tailor local regions associated with specific functions, for example regions used as an in-coupling grating, an Orthogonal Pupil Expander (OPE) grating, or an Exit Pupil Expander (EPE) grating. The different types of materials may include the same material with different optical properties (e.g., two variations of the same material with different refractive indices) or two completely different materials. When using this technique, any optical property of the material may be considered and selected, such as refractive index, opacity, and/or absorption.
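One way to picture the per-region material selection described above is as a simple mapping from eyepiece grating function to the material dispensed there. The material names and index values below are illustrative placeholders, not values from the patent:

```python
# Illustrative mapping of eyepiece grating regions to the imprint material
# dispensed in each region; names and refractive indices are placeholders.

REGION_MATERIALS = {
    "ICG": {"material": "resin_A", "n": 1.70},  # in-coupling grating
    "OPE": {"material": "resin_B", "n": 1.53},  # orthogonal pupil expander
    "EPE": {"material": "resin_B", "n": 1.53},  # exit pupil expander
}

for region, props in REGION_MATERIALS.items():
    print(f"{region}: dispense {props['material']} (n = {props['n']})")
```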
According to another embodiment, multi-faceted imprinting may be employed to imprint multiple sides of the optical structure. This allows imprinting on different sides of the optical element to achieve multiplexing of functions through the base layer volume. In this way, different eyepiece functions can be realized without adversely affecting the grating structure function. A first template may be used to create an imprint on side "a" of the substrate/imprint recipient to form a first pattern of a first material on side a of the structure. Another template may be used to create a second imprint on side "B" of the same substrate, which forms a second pattern of a second material on side B of the substrate. The sides a and B may have the same or different patterns and/or may have the same or different types of materials.
Additional embodiments relate to multi-layer overlay imprinting and/or multi-layer separation/offset substrate integration. In either or both of these methods, a previously imprinted pattern may be jetted over and imprinted again. An adhesive may be jetted onto a first layer, a second substrate may be bonded to it (possibly with air gaps), and a subsequent jetted deposition may be applied to the second substrate and imprinted. A series of imprinted patterns may be sequentially joined to one another by a roll-to-roll process. It is noted that the method of implementing multi-layer overlay imprinting may be used in conjunction with or in place of the multi-layer separation/offset substrate integration method. For multi-layer overlay imprinting, a first imprint material may be deposited and imprinted onto a substrate, followed by deposition of a second imprint material, resulting in a composite multilayer structure having both the first imprint material and the second imprint material. For multi-layer separation/offset substrate integration, both a first substrate 1 and a second substrate 2 may be imprinted using an imprinting material. The substrate 1 and the substrate 2 may then be clamped together and bonded, possibly with an offset feature (also imprinted) that, in one embodiment, provides an air gap between the active structure of the substrate 2 and the backside of the substrate 1. An imprinted spacer may be used to create the air gap.
According to yet another embodiment, a method of achieving variable-volume material deposition distributed across a substrate is disclosed that may rely on a priori knowledge of surface non-uniformities. Such correction addresses substrate surface non-uniformities that would otherwise result in non-ideal parallelism and degrade optical performance. Variable-volume imprinting material deposition may be used to provide a level distribution of deposited imprinting material independent of the underlying topography or set of physical characteristics. For example, the substrate may be held flat by a vacuum chuck and in-situ metrology performed to assess surface height (e.g., by a low-coherence or laser-based measurement probe). The dispensed amount of imprinting material may be varied based on the measurement data to produce a more uniform layer upon replication. Any type of non-uniformity, such as thickness variability and/or the presence of pits, spikes, or other anomalies or features at local locations on the substrate, can also be addressed by this embodiment of the invention.
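A sketch of the compensation logic this implies is shown below; the nominal layer thickness and the measured height deviations are invented placeholder numbers, not values from the patent:

```python
# Sketch of metrology-driven variable-volume dispensing: where in-situ metrology
# finds the substrate locally high, dispense less material; where it is locally
# low, dispense more, so the replicated layer ends up uniform. All numbers are
# invented placeholders.

NOMINAL_LAYER_NM = 250.0          # layer thickness targeted on a flat substrate

measured_deviation_nm = {         # per-site surface-height deviation from metrology
    (0, 0):   0.0,                # nominal
    (0, 1): +20.0,                # local bump
    (1, 0): -35.0,                # local dip
}

for site, dev_nm in measured_deviation_nm.items():
    local_thickness_nm = NOMINAL_LAYER_NM - dev_nm
    volume_scale = local_thickness_nm / NOMINAL_LAYER_NM
    print(f"site {site}: dispense {volume_scale:.2f} x nominal drop volume "
          f"({local_thickness_nm:.0f} nm of material)")
```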
It is noted that any of the above embodiments may be combined together. Furthermore, additional and other objects, features and advantages of the present invention will be described in the detailed description, drawings and claims.
Drawings
FIG. 1 illustrates Augmented Reality (AR) as seen by a user through a wearable AR user device in one exemplary embodiment;
FIG. 2 illustrates a conventional stereoscopic 3-D simulation display system;
FIG. 3 illustrates an improved method for implementing a stereoscopic 3-D simulation display system according to some embodiments of the invention;
FIGS. 4A-4D illustrate various systems, subsystems, and components used to achieve the goal of providing a high quality and perceived comfort display system for human VR and/or AR;
FIG. 5 shows a plan view of an example configuration of a system utilizing a modified diffractive structure;
FIG. 6 shows a stacked waveguide assembly;
FIG. 7 shows a DOE;
FIGS. 8 and 9 illustrate example diffraction patterns;
FIGS. 10 and 11 show two waveguides into which light is injected;
FIG. 12 shows a waveguide stack;
FIG. 13A illustrates an example method of implementing a diffractive structure having a waveguide substrate and a top grating surface without an underlayer;
FIG. 13B shows a graph of example simulation results;
FIG. 13C shows the annotated version of FIG. 13A;
FIG. 14A illustrates an example method for implementing a diffractive structure having a waveguide substrate, an underlayer, and a top grating surface;
FIG. 14B illustrates an example method for implementing a diffractive structure having a waveguide substrate, an underlayer, a grating surface, and a top surface;
FIG. 14C illustrates an example method for implementing a stack of diffractive structures having a waveguide substrate, an underlayer, a grating surface, and a top surface;
FIG. 15A illustrates an example method for implementing a diffractive structure having a high index waveguide substrate, a low index underlayer, and a low index top grating surface;
FIG. 15B shows a graph of example simulation results;
FIG. 16A illustrates an example method for implementing a diffractive structure having a low index waveguide substrate, a high index underlayer, and a low index top grating surface;
FIG. 16B shows a graph of example simulation results;
FIG. 17A illustrates an example method for implementing a diffractive structure having a low index waveguide substrate, a medium index underlayer, and a high index top grating surface;
FIG. 17B shows a graph of example simulation results;
FIGS. 18A-D illustrate a modification of the underlayer characteristics;
FIG. 19 illustrates a method for achieving precise variable volume deposition of imprinting material on a single substrate;
FIG. 20 illustrates a method and imprinting step for achieving directional simultaneous deposition of multiple different imprinting materials in the same layer, according to some embodiments;
FIGS. 21A-B illustrate an example method for implementing double-sided imprinting in the context of a total internal reflection diffractive optical element;
FIG. 22 illustrates a structure formed using the method illustrated in FIGS. 21A-B;
FIG. 23 illustrates a method for implementing multi-layer overlay imprinting;
FIG. 24 illustrates a method for implementing multi-layer separation/offset substrate integration;
FIG. 25 illustrates a method for achieving variable volume deposition of material distributed across a substrate to account for surface non-uniformities.
Detailed Description
According to some embodiments of the present invention, a diffractive structure is employed that includes an underlayer/intermediate layer located between a waveguide substrate and a top grating surface. The top grating surface comprises a first material corresponding to a first refractive index value, the underlayer comprises a second material corresponding to a second refractive index value, and the substrate comprises a third material corresponding to a third refractive index value.
One advantage of this approach is that appropriate selection of the relative refractive indices for the three layers allows the structure to obtain a larger field of view for a larger range of incident light, since the lowest total internal reflection angle decreases with increasing refractive index. The diffraction efficiency can be increased, allowing "brighter" light to be output to the display of the image viewing device.
Various combinations may be used, where an underlayer with one index of refraction is combined with a top grating with another index of refraction and a substrate with a third index of refraction, and where adjusting these relative values provides a large amount of diffraction efficiency variation dependent on the angle of incidence. A layered waveguide having layers of different refractive indices is provided. Various combinations and permutations and associated performance data are provided to illustrate the functionality. Advantages include increased angles, which provide increased grating output angles, and thus increased eyepiece field of view. Furthermore, the ability to counteract the normal reduction in angular diffraction efficiency is functionally beneficial.
Display system according to some embodiments
This section of the present disclosure describes example display systems that may be used in conjunction with the improved diffractive structures of the present invention.
Fig. 2 shows a conventional stereoscopic 3-D simulation display system, which typically has a separate display 74 and 76 for each eye 4 and 6, respectively, at a fixed radial focal distance 10 from the eyes. This conventional approach fails to take into account many of the valuable cues the human eye and brain use to detect and interpret three-dimensional depth, including the accommodation cue.
In fact, the typical human eye can interpret a number of depth layers based on radial distance, for example about 12 depth layers. A near-field limit of about 0.25 meters is about the closest depth of focus; a far-field limit of about 3 meters means that any item more than about 3 meters from the human eye is effectively at infinite focus. The focal layers get thinner as one gets closer to the eye; in other words, the eye can perceive very small differences in focal distance relatively close to the eye, and this effect dissipates as objects fall farther away from the eye. At an infinite object location, the depth-of-focus/dioptric spacing value is about 1/3 diopter.
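To make the diopter arithmetic above concrete, the following minimal sketch (not part of the patent; it simply restates the stated limits numerically) converts viewing distances to vergence in diopters:

```python
# Minimal sketch of the diopter arithmetic described above: vergence in
# diopters is the reciprocal of the viewing distance in meters.

def vergence_diopters(distance_m: float) -> float:
    """Optical vergence (diopters) of an object at distance_m meters."""
    return 0.0 if distance_m == float("inf") else 1.0 / distance_m

print(vergence_diopters(0.25))          # 4.0 D   -> approximate near-field limit
print(vergence_diopters(3.0))           # ~0.33 D -> the ~1/3 diopter figure above
print(vergence_diopters(float("inf")))  # 0.0 D   -> optical infinity
```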
Fig. 3 illustrates an improved method for implementing a stereoscopic 3-D simulation display system according to some embodiments of the invention, in which two complex images are displayed, one for each eye 4 and 6, and multiple radial focal depths (12) for each image's respective aspect (14) may be used to provide perception of three-dimensional depth layering within the perceived image for each eye. Since there are multiple focal planes (e.g., 12 focal planes) between the user's eye and infinity, the data in these focal planes and the depicted relationships can be used to position virtual elements in the augmented reality scene for viewing by the user, as the human eye constantly scans around to perceive depth using the focal planes. Although the figure shows a particular number of focal planes at various depths, it is noted that implementations of the present invention may use any number of focal planes for the particular application desired, and thus the present invention is not limited to devices having only the particular number of focal planes shown in any of the figures of the present disclosure.
Referring to fig. 4A-4D, certain common component part options are shown, according to some embodiments of the present invention. In some of the detailed descriptions that follow the introduction of fig. 4A-4D, various systems, subsystems, and components are presented to achieve the goal of providing a high quality and perceived comfort display system for human VR and/or AR.
As shown in fig. 4A, an AR system user (60) is shown wearing a frame (64) structure that is connected to a display system (62) located in front of the user's eyes. In the illustrated configuration, a speaker (66) is connected to the frame (64) and is positioned near the ear canal of the user (in one embodiment, another speaker (not shown) is positioned near the other ear canal of the user to provide stereo/shapeable sound control). The display (62) is operatively connected (e.g., by wire or wireless connectivity) with the local processing and data module (70), and the local processing and data module (70) may be mounted in a variety of configurations, e.g., fixedly attached to the frame (64); fixedly attached to a helmet or hat (80), as shown in the embodiment of fig. 4B; embedded in headphones; removably attached to the torso (82) of the user (60) in a backpack-style configuration, as shown in the embodiment of fig. 4C; or removably attached to the hip (84) of the user (60) in a belt-coupling style arrangement, as shown in the embodiment of fig. 4D.
The local processing and data module (70) may include a power-efficient processor or controller, and digital memory, such as flash memory, which may be used to help process, cache, and store the following data: that is, a) data captured from sensors that are operatively connectable to a frame (64), such as an image capture device (e.g., a camera), a microphone, an inertial measurement unit, an accelerometer, a compass, a GPS unit, a radio, and/or a gyroscope; and/or b) data acquired and/or processed using a remote processing module (72) and/or a remote data store (74), which may be transmitted to the display (62) after such processing or retrieval. The local processing and data module (70) may be operatively connected (76, 78) (e.g., via a wired or wireless communication link) with the remote processing module (72) and the remote data store (74) such that the remote modules (72) are operatively connected to each other and available as resources to the local processing and data module (70).
In an embodiment, the remote processing module (72) may include one or more relatively powerful processors or controllers configured to analyze and process data and/or image information. In one embodiment, the remote data repository (74) may comprise a relatively large digital data storage facility that may be used through the internet or other networked configurations in a "cloud" resource configuration. In one embodiment, all data is stored and all computations are performed in the local processing and data module, allowing for fully autonomous use from any remote module.
The perception of Z-axis difference (i.e., the linear distance from the eye along the optical axis) can be facilitated by using a waveguide in conjunction with a variable focus optical element configuration. Image information from the display can be collimated and injected into the waveguide and this image information distributed in a large exit pupil using any suitable substrate guided optical method known to those skilled in the art, and then variable focus optical element power can be used to alter the wavefront focus of light exiting the waveguide and make the eye perceive that the light from the waveguide is from a particular focal length. In other words, since the inbound light has been collimated to avoid difficulties in the total internal reflection waveguide configuration, the light exits in a collimated manner, requiring the viewer's eye to adapt to the far point to bring the light into focus on the retina, and is naturally interpreted as coming from optical infinity-unless some other intervention causes the light to be refocused and perceived as coming from a different viewing distance; one suitable such intervention is a variable focus lens.
In some embodiments, the collimated image information is injected at an angle into a lens or other material to be totally internally reflected and transferred into an adjacent waveguide. The waveguide may be configured such that collimated light from the display is distributed so as to exit somewhat uniformly across the distribution of the reflector or diffractive features along the length of the waveguide. When exiting toward the eye, the emitted light passes through the variable focus lens element, where the light exiting the variable focus lens element and entering the eye will have various degrees of focus (collimated flat wavefront, representing optical infinity, more beam divergence/wavefront curvature representing closer viewing distance relative to the eye 58) depending on the controlled focus of the variable focus lens element.
In a "frame sequential" configuration, a stack of serialized two-dimensional images can be fed sequentially into a display to produce a three-dimensional perception over time; this approach is similar to the way computed tomography systems use stacked image slices to represent three-dimensional structures. A series of two-dimensional image segments may be presented to the eye, each segment located at a different focal distance from the eye, and the eye/brain assembles the stacks into a coherent three-dimensional volumetric perception. Depending on the display type, progressive, even pixel-by-pixel serialization can be performed to produce three-dimensional visual perception. For example, for a scanning light display (such as a scanning fiber display or a scanning mirror display), the display sequentially provides one row or one pixel at a time for the waveguide.
Referring to fig. 6, the stacked waveguide assembly (178) may be used to provide three-dimensional perception to the eye/brain by: a plurality of waveguides (182, 184, 186, 188, 190) and a plurality of weak lenses (198, 196, 194, 192) are configured together to transmit image information to the eye at a plurality of wavefront curvature levels for each waveguide level (indicative of a perceived focal length for that waveguide level). Multiple displays (200, 202, 204, 206, 208) or a single multiplexed display (in another embodiment) may be used to inject collimated image information into the waveguides (182, 184, 186, 188, 190), each of which may be configured according to the description above to distribute inbound light substantially equally across the length of each waveguide, thereby projecting the light downward toward the eye.
The waveguide (182) closest to the eye is configured to transmit collimated light injected into the waveguide (182) to the eye, which may represent the optical infinity focal plane. The next waveguide up (184) is configured to emit collimated light that passes through a first weak lens (192; e.g., a weak diverging lens) before reaching the eye (58); the first weak lens (192) may be configured to produce a slightly convex wavefront curvature so that the eye/brain perceives light from the next waveguide up (184) as coming from a first focal plane closer inward toward the person from optical infinity. Similarly, the third waveguide up (186) passes its output light through both the first lens (192) and the second lens (194) before reaching the eye (58); the combined optical power of the first lens (192) and the second lens (194) may be configured to produce another incremental amount of wavefront divergence, so that the eye/brain perceives light from the third waveguide up (186) as coming from a second focal plane even closer inward toward the person from optical infinity than light from the next waveguide up (184).
The other waveguide layers (188, 190) and weak lenses (196, 198) are similarly configured, with the highest waveguide (190) in the stack sending its output through all of the weak lenses located between itself and the eye to achieve an aggregate power representing the focal plane closest to the person. To compensate for the lens stack (198, 196, 194, 192) when viewing/sensing light coming from the world (144) on the other side of the stacked waveguide assembly (178), a compensating lens layer (180) is placed on top of the stack to compensate for the aggregate power of the underlying lens stack (198, 196, 194, 192). Such a configuration provides a number of perceived focal planes equal to the number of available waveguide/lens pairs, again with a larger exit pupil configuration as described above. Both the reflective aspects of the waveguides and the focusing aspects of the lenses may be static (i.e., not dynamic or electro-active). In an alternative embodiment, they may be dynamic using the electro-active features described above, enabling a small number of waveguides to be multiplexed in a time-sequential manner to produce a larger number of effective focal planes.
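The aggregate-power bookkeeping described for the stack of fig. 6 can be sketched as follows; the individual weak-lens powers are invented placeholder values, not values from the patent:

```python
# Sketch of the aggregate optical power seen through each waveguide level of a
# stacked waveguide assembly: light from a given waveguide passes through every
# weak lens between that waveguide and the eye, and the compensating lens layer
# cancels the total stack power for real-world light. Power values are
# illustrative placeholders.

weak_lens_powers_D = [-0.5, -0.5, -0.5, -0.5]  # hypothetical weak diverging lenses (diopters)

aggregate_D = 0.0
for level, lens_D in enumerate([0.0] + weak_lens_powers_D):
    aggregate_D += lens_D
    # A collimated beam given negative vergence appears to come from 1/|P| meters.
    perceived_m = float("inf") if aggregate_D == 0.0 else -1.0 / aggregate_D
    print(f"waveguide {level}: aggregate power {aggregate_D:+.1f} D, "
          f"perceived focal distance {perceived_m} m")

compensating_lens_D = -sum(weak_lens_powers_D)  # cancels the stack for world light
print(f"compensating lens layer: {compensating_lens_D:+.1f} D")
```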
Various diffractive configurations for focusing and/or redirecting collimated light beams may be employed. For example, passing a collimated beam through a linear diffraction pattern (such as a Bragg grating) will deflect or "steer" the beam. Passing a collimated beam through a radially symmetric diffraction pattern or "Fresnel zone plate" will alter the focus of the beam. A combined diffraction pattern having both linear and radial elements may be employed to produce both deflection and focusing of a collimated input beam. These deflection and focusing effects can be produced in reflective as well as transmissive modes.
These principles can be applied with a waveguide configuration to achieve additional optical system control. As shown in fig. 7, a diffraction pattern (220) or "diffractive optical element" (or "DOE") has been embedded within the planar waveguide (216) such that, as a collimated beam is totally internally reflected along the planar waveguide (216), it intersects the diffraction pattern (220) at a plurality of locations. The structure may also include another waveguide (218), wherein a light beam may be injected (e.g., by a light projector or display) into the other waveguide (218) and the DOE (221) embedded in the other waveguide (218).
Preferably, the DOE (220) has a relatively low diffraction efficiency such that, by each intersection with the DOE (220), only a portion of the light beam is deflected towards the eye (58) while the remainder continues to pass through the planar waveguide (216) via total internal reflection; the light carrying the image information is thus split into a plurality of related beams of light which emerge from the waveguide at a plurality of locations, resulting in a very uniform pattern of emitted beams of light towards the eye (58) for this particular collimated beam of light reflected around within the planar waveguide (216), as shown in fig. 8. The outgoing beam towards the eye (58) is shown in fig. 8 as being substantially parallel, since in this case the DOE (220) has only a linear diffraction pattern. However, the variation in the pitch of the linear diffraction pattern can be used to controllably deflect the emerging parallel beams to produce a scanning or tiling function.
Referring to fig. 9, as the radially symmetric diffraction pattern component of the embedded DOE (220) changes, the outgoing beam pattern diverges more, which requires the eye to accommodate closer distances to focus on the retina, and the brain perceives the light as coming from a viewing distance closer to the eye than optical infinity.
Referring to fig. 10, after adding another waveguide (218) into which a light beam may be injected (e.g., through a projector or display), depending on the particular DOE configuration in operation, a DOE (221), such as a linear diffraction pattern, embedded in the other waveguide (218) may be used to propagate light across the entire large planar waveguide (216), which may provide the eye (58) with a very large inbound area for inbound light emitted from the larger planar waveguide (216) (e.g., the large eyebox).
The DOEs (220, 221) are shown as bisecting the associated waveguide (216, 218), but need not necessarily do so; they may be placed near either of the waveguides (216, 218), or on either side of either to have the same function. Thus, as shown in fig. 11, with the injection of a single collimated beam, the entire replicated collimated beam field can be directed to the eye (58). Furthermore, by a combined linear/radially symmetric diffraction pattern scenario such as described above, a beam distribution waveguide optical element with Z-axis focusing functionality is provided (for functions such as exit pupil functional expansion; with a configuration such as that of FIG. 11, the exit pupil can be as large as the optical element itself, which is a very significant advantage for user comfort and ergonomics), where the divergence angle of the replicated beams and the wavefront curvature of each beam represent light from a point closer than optical infinity.
In an embodiment, one or more DOEs may be switched between an "on" state in which they actively diffract and an "off" state in which they do not significantly diffract. For example, a switchable DOE may comprise a layer of polymer dispersed liquid crystal in which droplets form a diffraction pattern in a host medium, and the refractive index of the droplets may be switched to substantially match that of the host material (in which case the pattern does not significantly diffract incident light), or the droplets may be switched to a refractive index that does not match that of the host medium (in which case the pattern actively diffracts incident light). Further, with dynamic variation of the diffraction terms, a beam scanning or tiling function can be realized. As noted above, a relatively low diffraction grating efficiency in each DOE (220, 221) is desirable, because it facilitates the distribution of light, and also because light that is ideally transmitted through the waveguide (e.g., light from the world (144) traveling toward the eye (58) in an augmented reality configuration) is less affected when the diffraction efficiency of the DOE (220) it passes through is low, thereby enabling better real-world viewing with such a configuration.
Configurations such as those shown herein are preferably driven by image information injected in a time-sequential approach, with frame-sequential driving being the easiest to implement. For example, an image of the sky at optical infinity may be injected at time t1, using a diffraction grating that preserves optical collimation. An image of a closer tree branch may then be injected at time t2, while the DOE controllably imparts a focal change of, say, one diopter or about 1 meter closer, so that the eye/brain perceives the branch light information as coming from the closer focal range. Such a paradigm may be repeated in a fast time-sequential manner so that the eye/brain perceives the input as all parts of the same image. This is just a two-focal-plane example; preferably, the system will include more focal planes to provide smoother transitions between objects and their focal distances. Such configurations generally assume that the DOE is switched at a relatively slow speed (i.e., synchronized with the frame rate of the display injecting the images, in the range of tens to hundreds of cycles per second).
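The frame-sequential paradigm in the preceding paragraph can be sketched as a simple driving loop. This is an illustrative sketch only; the function names, plane count, and timing are hypothetical, not taken from the patent:

```python
# Illustrative sketch of frame-sequential multi-focal driving: each sub-frame
# injects the image content assigned to one focal plane while the switchable
# DOE / focus element is set to that plane. All names and numbers are hypothetical.
import time

FOCAL_PLANES_D = [0.0, 1.0, 2.0]      # e.g. optical infinity, 1 m, 0.5 m (illustrative)
SUBFRAME_PERIOD_S = 1.0 / 180.0       # 3 planes interleaved within a 60 Hz frame (illustrative)

def set_doe_focus(diopters: float) -> None:
    """Placeholder for switching the DOE / varifocal element state."""

def inject_subimage(diopters: float) -> None:
    """Placeholder for presenting only the content assigned to this plane."""

for frame in range(60):               # one second of display, for illustration
    for plane_D in FOCAL_PLANES_D:
        set_doe_focus(plane_D)        # e.g. one diopter closer for the tree branch
        inject_subimage(plane_D)
        time.sleep(SUBFRAME_PERIOD_S) # hold for one sub-frame
```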
The opposite extreme may be a configuration: wherein the DOE element is capable of changing focus at tens to hundreds of MHz or greater, which facilitates switching the focus state of the DOE element on a pixel-by-pixel basis when scanning pixels into an eye (58) using scanning light display technology. This is desirable because it means that the overall display frame rate can be kept fairly low; low enough to ensure that "flicker" is not a problem (in the range of about 60-120 frames/second).
Between these ranges, the focus on each scan line can be adjusted line by line if the DOE can be switched at the KHz rate, which gives the user a visual advantage in terms of time artifacts, for example, during movement of the eye relative to the display. For example, in this way, different focal planes in the scene are interleaved to minimize visual artifacts in response to head motion (as discussed in more detail below in this disclosure). The progressive focus modulator may be operatively coupled to a line scan display (such as a grating light valve display) in which a linear array of pixels is scanned to form an image; the focus modulator may also be operatively coupled with a scanning optical display, such as a fiber optic scanning display and a mirror scanning optical display.
A stacked configuration similar to that of fig. 6 may use dynamic DOEs to provide multi-plane focusing simultaneously. For example, with three simultaneous focal planes, the main focal plane (e.g., based on measured ocular accommodation) may be presented to the user, and the plus and minus margins (i.e., one focal plane closer, one farther out) may be used to provide a large focal range within which the user can accommodate before the planes need to be updated. This increased focal range can provide a temporal advantage if the user switches to a closer or farther focus (i.e., as determined by accommodation measurements); the new plane of focus can then be made the middle depth of focus, with the plus and minus margins again ready for a fast switchover to either one while the system continues to execute.
Referring to fig. 12, there is shown a stack (222) of planar waveguides (244, 246, 248, 250, 252), each waveguide having a reflector (254, 256, 258, 260, 262) at its end and configured such that collimated image information injected into one end by a display (224, 226, 228, 230, 232) bounces by total internal reflection down to the reflector, at which point some or all of the light is reflected out toward the eye or another target. Each reflector may have a slightly different angle to reflect the outgoing light toward a common target such as the pupil. Lenses (234, 236, 238, 240, 242) may be interposed between the displays and the waveguides to achieve beam steering and/or focusing.
As described above, an object located at optical infinity produces a substantially planar wavefront; while closer objects, such as 1m from the eye, produce an arc shaped wavefront (with a convex radius of curvature of about 1 m). The optical system of the eye needs to have sufficient optical power to bend the incoming light rays to eventually focus on the retina (the convex wavefront changes to a concave wavefront and then continues to a focal point on the retina). These are the basic functions of the eye.
In many of the embodiments described above, the light directed to the eye has been considered to be part of a continuous wavefront, some subset of which strikes the pupil of a particular eye. In another approach, the light directed toward the eye may be effectively discretized or split into multiple beamlets or individual rays, each beamlet or ray having a diameter of less than about 0.5 mm and having a unique propagation path as part of a larger aggregate wavefront that may be functionally created by the aggregate of beamlets or rays. For example, a curved wavefront can be approximated by aggregating a plurality of discrete adjacent collimated beams, where each collimated beam approaches the eye from an appropriate angle to represent a point of origin that matches the center of the radius of curvature of the desired aggregate wavefront.
Beamlets with a diameter of about 0.5 mm or less behave as though passing through a pinhole lens configuration, which means that each individual beamlet remains relatively in focus on the retina at all times, regardless of the accommodation state of the eye, although the trajectory of each beamlet is affected by that accommodation state. For example, if the beamlets approach the eye in parallel, representing a discretized collimated aggregate wavefront, an eye correctly accommodated to infinity will deflect the beamlets so that they all converge on the same shared spot on the retina and will appear in focus. If the eye accommodates to, say, 1 m, the beamlets will converge to a point in front of the retina, cross paths, and fall on multiple adjacent or partially overlapping spots on the retina (i.e., appear blurred).
If the beamlets approach the eye in a diverging configuration, with a shared point of origin 1 meter from the viewer, an accommodation of 1 m will steer the beamlets to a single spot on the retina and they will appear in focus; if the viewer accommodates to infinity, the beamlets will converge toward a point behind the retina and produce multiple adjacent or partially overlapping spots on the retina, creating a blurred image. More generally, the accommodation of the eye determines the degree of overlap of the spots on the retina: a given pixel is "sharp" when all of its spots are directed to the same point on the retina, and "blurred" when the spots are offset from one another. Because all beamlets of diameter 0.5 mm or less remain individually in focus, yet may be aggregated and perceived by the eye/brain substantially as a coherent wavefront, such a configuration can be used to create comfortable three-dimensional virtual or augmented reality perception.
In other words, a multiplicity of beamlets can be used to emulate what would happen with a larger-diameter variable-focus beam: if the beamlet diameters are kept to a maximum of about 0.5 mm, they maintain a relatively static level of focus, and when a perception of defocus is desired, the beamlet angular trajectories can be selected to produce an effect that closely resembles a larger defocused beam (this defocus treatment may differ from a Gaussian blur treatment of the larger beam, but produces a multimodal point spread function that is interpreted in a manner similar to a Gaussian blur).
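The sharp/blurred reasoning in the preceding paragraphs reduces to simple vergence arithmetic; the sketch below (illustrative only, not from the patent) compares the eye's accommodation with the vergence of the beamlets' shared point of origin:

```python
# Sketch of the vergence argument above: a beamlet bundle appears sharp when the
# eye's accommodation matches the vergence of the beamlets' shared point of
# origin; any mismatch (in diopters) spreads the bundle over the retina (blur).

def vergence_D(distance_m: float) -> float:
    return 0.0 if distance_m == float("inf") else 1.0 / distance_m

def accommodation_mismatch_D(origin_m: float, accommodated_to_m: float) -> float:
    return vergence_D(accommodated_to_m) - vergence_D(origin_m)

inf = float("inf")
print(accommodation_mismatch_D(inf, inf))  #  0.0 -> collimated beamlets, eye at infinity: sharp
print(accommodation_mismatch_D(inf, 1.0))  #  1.0 -> collimated beamlets, eye at 1 m: blurred
print(accommodation_mismatch_D(1.0, 1.0))  #  0.0 -> 1 m origin, eye at 1 m: sharp
print(accommodation_mismatch_D(1.0, inf))  # -1.0 -> 1 m origin, eye at infinity: blurred
```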
In some embodiments, the beamlets are not mechanically deflected to create the aggregate focusing effect, but rather the eye receives a superset of a large number of beamlets, the superset comprising both the plurality of incident angles and the intersection locations of the plurality of beamlets with the pupil; to indicate that a given pixel is from a particular viewing distance, a subset of beamlets from the superset comprising appropriate angles of incidence and intersection with the pupil (as if they emanated from the same shared origin in space) become of matching color and intensity to represent an aggregate wavefront, while beamlets in the superset that are not coincident with the shared origin do not become of the above-described color and intensity (although some of them may become of some other color and intensity level to represent, for example, a different pixel).
Referring now to FIG. 5, one exemplary embodiment 800 of an AR system using an improved diffractive structure will now be described. The AR system generally includes an image generation processor 812, at least one FSD 808 (fiber scanning device), FSD circuitry 810, coupling optics 832, and at least one optical assembly (DOE assembly 802) comprising a stacked waveguide with an improved diffractive structure as described below. The system may also include an eye tracking subsystem 806. As shown in fig. 5, the FSD circuitry may include circuitry 810 in communication with an image generation processor 812, the image generation processor 812 having a Maxim chip CPU 818, a temperature sensor 820, a piezoelectric driver/transducer 822, a red laser 826, a blue laser 828, and a green laser 830, and a fiber combiner combining all three lasers 826, 828, and 830. It is noted that other types of imaging techniques may be used instead of the FSD device. For example, in some embodiments of the present invention, high resolution liquid crystal display ("LCD") systems, backlit ferroelectric panel displays, and/or high frequency DLP systems may all be used.
The image generation processor is responsible for generating virtual content that is ultimately displayed to the user. The image generation processor may convert images or video associated with the virtual content into a format that can be projected to a user in a 3D manner. For example, when generating 3D content, it may be desirable to format the virtual content such that a portion of a particular image is displayed on a particular depth plane and other portions are displayed on other depth planes. Alternatively, all images may be generated at a particular depth plane. Alternatively, the image generation processor may be programmed to feed slightly different images to the left and right eyes so that, when viewed together, the virtual content is presented to the eyes of the user in a coherent, comfortable manner. In one or more embodiments, the image generation processor 812 delivers the virtual content to the optical assembly in a time-sequential manner. A first portion of the virtual scene may be transmitted first, such that the optical assembly projects the first portion at a first depth plane. The image generation processor 812 may then transmit another portion of the same virtual scene, causing the optical assembly to project the second portion onto a second depth plane, and so on. Here, an Alvarez lens assembly may be laterally translated fast enough to produce multiple lateral translations (corresponding to multiple depth planes) on a per-frame basis.
The image generation processor 812 may further include a memory 814, a CPU 818, a GPU 816, and other circuitry for image generation and processing. The image generation processor may be programmed to present the desired virtual content to a user of the AR system. It should be understood that in some embodiments, the image generation processor may be housed in the wearable AR system. In other embodiments, the image generation processor and other circuitry may be housed in a belt pack that is connected to the wearable optics.
The AR system also includes coupling optics 832 to direct light from the FSD to the optical assembly 802. Coupling optics 832 may refer to one or more conventional lenses for directing light into the DOE assembly. The AR system also includes an eye tracking subsystem 806 configured to track the user's eyes and determine the user's focus.
In one or more embodiments, software blurring may be used to induce blurring as part of a virtual scene. In one or more embodiments, a blurring module may be part of the processing circuitry. The blurring module may blur portions of one or more frames of image data fed into the DOE. In such embodiments, the blurring module may blur out portions of a frame that are not intended to be rendered at a particular depth plane.
An example method that may be used to implement the image display system and its components described above is described in U.S. utility patent application Ser. No. 14/555,585.
Improved diffractive structure
As described above, the diffraction pattern may be formed on the planar waveguide such that, when a collimated beam is totally internally reflected along the planar waveguide, the beam intersects the diffraction pattern at multiple locations. According to some embodiments of the invention, such arrangements may be stacked to provide image objects at multiple focal planes within a stereoscopic 3-D simulation display system.
Fig. 13A illustrates one possible approach that may be used to implement a structure 1300 of a waveguide 1302 (also referred to herein as a "light guide," "substrate," or "waveguide substrate"), in which an out-coupling grating 1304 is formed directly on the top surface of the waveguide 1302, e.g., as a combined monolithic structure and/or with both formed of the same material (even if not composed of the same monolithic structure). In this approach, the refractive index of the grating material is the same as the refractive index of the waveguide 1302. The refractive index n (or "index") of a material describes how light propagates through the medium and is defined as n = c/v, where c is the speed of light in vacuum and v is the phase velocity of light in the medium. The refractive index determines the degree of bending, or the amount of refraction, of light as it enters the material.
Fig. 13B shows a graph 1320 of example simulation results for a single polarization of the efficiency of light exiting the structure 1300 as a function of the angle at which the light propagates within the waveguide. The figure shows that the diffraction efficiency of the out-coupled light of structure 1300 decreases at higher angles of incidence. It can be seen that at an angle of about 43 degrees, the efficiency drops off relatively quickly on the graph shown due to the change in the total internal reflection ratio based on the angle of incidence in a medium with a uniform refractive index.
Thus, the usable range of the configuration 1300 may be somewhat limited and therefore undesirable, as the spacing between reflections increases at higher angles of incidence, which further reduces the brightness seen by the viewer at these angles. The diffraction efficiency is also lower at the most glancing incidence angles, which is undesirable because the reflections at the top surface (see fig. 13C) are far apart there, leaving few opportunities for light to couple out. The result is a dim image with fewer out-coupled samples from this arrangement, exacerbated by a grating whose diffraction efficiency is lower at these high angles for this polarization orientation. Note that as used herein and in the drawings, "1T" refers to the first transmitted diffraction order.
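The bounce-spacing effect mentioned above is simple geometry: the distance between successive interactions of the guided beam with the top (grating) surface grows with the propagation angle. A minimal sketch follows; the waveguide thickness is an assumed placeholder, not a value from the patent:

```python
# Sketch of TIR bounce spacing in a planar waveguide: between successive hits on
# the same surface, the beam advances 2 * t * tan(theta), where t is the
# waveguide thickness and theta is measured from the surface normal. The
# thickness used here is an illustrative placeholder.
import math

t_um = 500.0  # hypothetical waveguide thickness in micrometers

for theta_deg in (43, 55, 70, 83):
    spacing_um = 2.0 * t_um * math.tan(math.radians(theta_deg))
    print(f"{theta_deg:2d} deg -> bounce spacing ~{spacing_um:6.0f} um")
# Wider spacing at grazing angles means fewer grating interactions per unit
# length, so less light is out-coupled toward the eye at those angles (dimmer).
```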
In some embodiments of a waveguide-based or substrate-guided optical system, such as those described above, different pixels in the substrate-guided image are represented by light beams propagating at different angles within the waveguide, where the light propagates along the waveguide by total internal reflection (TIR). The range of beam angles that are retained in the waveguide by TIR depends on the refractive index difference between the waveguide and the medium (e.g., air) outside the waveguide; the larger the refractive index difference, the larger the range of beam angles. In certain embodiments, the range of beam angles propagating along the waveguide is related to the field of view of the image coupled out of the waveguide plane by the diffractive element, and to the image resolution supported by the optical system. In addition, the range of angles over which total internal reflection occurs is determined by the refractive index of the waveguide: in some embodiments a minimum of about 43 degrees and a practical maximum of about 83 degrees, giving a usable range of roughly 40 degrees.
FIG. 14A illustrates a method of addressing this problem according to some embodiments of the invention, in which structure 1400 includes an intermediate layer 1406 (referred to herein as "underlayer 1406") between substrate 1302 and top grating surface 1304. The top surface 1304 comprises a first material corresponding to a first refractive index value, the underlayer 1406 comprises a second material corresponding to a second refractive index value, and the substrate 1302 comprises a third material corresponding to a third refractive index value. It is noted that each of these portions of structure 1400 may be implemented with any combination of the same or different materials, e.g., where all three materials are different (and all three materials correspond to different refractive index values), or where two layers share the same material (e.g., two of the three materials are the same, thus sharing a common refractive index value that is different from the refractive index value of the third material). Any combination of refractive index values may be used. For example, one embodiment includes a low refractive index for the underlayer and higher refractive index values for the surface grating and the substrate. Other example configurations having other combinations of refractive index values are described below. Structure 1400 may be implemented using any suitable set of materials; for example, polymers, glass, and sapphire are materials from which any layer of structure 1400 may be formed.
As shown in fig. 15A, in some embodiments, it is desirable to implement a structure 1500 using a relatively high index substrate as the waveguide substrate 1302, with a relatively low index underlayer 1406 and a relatively low index top grating surface 1304. This is because, from the relationship n1·sin(θ1) = n2·sin(90°), the minimum total internal reflection angle decreases as the refractive index increases, and therefore a larger field of view can be obtained. For a substrate with a refractive index of 1.5, the critical angle is 41.8 degrees; for a substrate index of 1.7, the critical angle is 36 degrees.
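The critical-angle figures quoted above can be checked with a short calculation. The sketch below is a minimal illustration, assuming the medium outside the waveguide is air (n = 1) and that usable propagation angles extend to a practical maximum of about 83 degrees, as noted earlier:

```python
import math

def critical_angle_deg(n_waveguide: float, n_outside: float = 1.0) -> float:
    """Smallest propagation angle (from the surface normal, in degrees) that
    still undergoes total internal reflection at the waveguide surface."""
    return math.degrees(math.asin(n_outside / n_waveguide))

for n in (1.5, 1.7):
    theta_c = critical_angle_deg(n)
    usable = 83.0 - theta_c  # assumed practical upper bound of ~83 degrees
    print(f"n = {n}: critical angle = {theta_c:.1f} deg, usable TIR range ~ {usable:.1f} deg")
```

For a refractive index of 1.5 this reproduces the 41.8-degree figure, and for 1.7 the 36-degree figure, showing how a higher-index substrate widens the usable angular range.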
As long as the layer of grating material between the grating features and the substrate is not too thick, gratings formed on high refractive index substrates can be used to couple light out even if the grating material itself has a low refractive index. This is related to the fact that a wider range of angles undergoes total internal reflection ("TIR") in this configuration; in other words, with such a configuration, the minimum TIR angle is reduced to a lower value. Additionally, it is noted that many current etching processes may not extend well to high index glasses. In some embodiments, it is desirable to replicate the outcoupling layer reliably and at low cost.
The configuration of underlayer 1406 can be adjusted, for example by changing its thickness, to change the performance characteristics of structure 1500. The configuration of fig. 15A (which includes a grating structure 1304 of relatively low index material on top, an associated low index underlayer 1406, and an associated high index light guiding substrate 1302) can be modeled to produce data such as that shown in fig. 15B. Referring to that figure, the left graph 1502a corresponds to a configuration with a zero-thickness underlayer 1406. The middle graph 1502b shows data for a 0.05 micron thick underlayer 1406. The right graph 1502c shows data for a 0.1 micron thick underlayer 1406.
As the data in these figures show, as the underlayer thickness increases, the angle-dependent diffraction efficiency becomes more non-linear and is suppressed at high angles, which may be undesirable. Thus, in this case, control of the underlayer is an important functional input. It should be noted, however, that for a zero-thickness underlayer in which only the grating features have a low refractive index, the angular range supported by the structure is determined by the TIR condition in the high index base material rather than in the low index grating feature material.
Referring to FIG. 16A, there is shown an embodiment of a structure 1600 characterized by a relatively high index underlayer 1406 on a low index substrate 1302, where the top surface diffraction grating 1304 has a lower index of refraction than the underlayer 1406 and is comparable to, but not necessarily equal to, the index of refraction of the substrate 1302. For example, the top surface grating may correspond to a refractive index of 1.5, the underlayer may correspond to a refractive index of 1.84, and the substrate may correspond to a refractive index of 1.5. Assume for this example that the period is 0.43 microns and λ corresponds to 0.532 microns.
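For context, the angle at which the first transmitted order (1T) would leave the waveguide can be estimated from the standard grating equation, n_out·sin(θ_out) = n_in·sin(θ_in) + m·λ/Λ. The sketch below uses the 0.43 micron period and 0.532 micron wavelength quoted above; the assumption that the beam propagates in an n = 1.5 medium and is coupled out into air, and the sign convention chosen, are illustrative only and are not taken from the simulations of fig. 16B:

```python
import math

def outcoupled_angle_deg(n_in: float, theta_in_deg: float, wavelength_um: float,
                         period_um: float, order: int = -1, n_out: float = 1.0) -> float:
    """First-order out-coupled angle (degrees) from the grating equation:
    n_out*sin(theta_out) = n_in*sin(theta_in) + order*wavelength/period."""
    s = (n_in * math.sin(math.radians(theta_in_deg))
         + order * wavelength_um / period_um) / n_out
    if abs(s) > 1.0:
        raise ValueError("this diffraction order is evanescent for these parameters")
    return math.degrees(math.asin(s))

# Beam propagating at 45 degrees inside an n=1.5 medium, 0.43 um period, 532 nm light
print(outcoupled_angle_deg(1.5, 45.0, 0.532, 0.43))  # roughly -10 degrees into air
```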
A simulation related to this configuration is shown in fig. 16B. As shown in graph 1602a of this figure, for a 0.3 micron thick underlayer 1406, the diffraction efficiency drops as in the above-described arrangement, but begins to rise at the higher end of the angular range. The same is true for the 0.5 micron thick underlayer 1406 configuration, as shown in graph 1602b. In each of these (0.3 micron, 0.5 micron) configurations, it is beneficial that the efficiency is relatively high at the higher limit of the angular range. Such behavior may tend to counteract the sparse reflection spacing problem discussed above. Graph 1602c of this figure shows the case with the polarization rotated 90 degrees; the diffraction efficiency decreases, as might be expected, but still exhibits the desired behavior in that higher efficiency is provided at steeper angles than at shallower angles.
Indeed, in some embodiments, the diffraction efficiency may increase with angle at high angles. This may be a desirable feature, as it helps compensate for the sparse reflection spacing that occurs at high propagation angles. Thus, in embodiments where compensation for sparse reflection spacing (which occurs at high propagation angles) is desired, the structural configuration of fig. 16A may be preferred, because it can promote diffraction efficiency that increases at high angles, which is desirable relative to the monolithic configuration described above.
Referring to FIG. 17A, another structure 1700 is shown in which the refractive index of the underlying layer 1406 is substantially higher than the refractive index of the substrate 1302. The grating structure 1304 is on top and also has a higher index of refraction than the underlayer 1406. For example, the top surface grating may correspond to a refractive index of 1.86, the underlayer may correspond to a refractive index of 1.79, and the substrate may correspond to a refractive index of 1.5. As before, assume for this example that the period is 0.43 microns and λ corresponds to 0.532 microns.
Referring to FIG. 17B, a graph 1702 shows simulation data for the structure 1700 of FIG. 17A. The resulting diffraction efficiency versus angle of incidence curve, as shown in graph 1702, exhibits desirable general behavior, helping to compensate for the sparse reflection spacing described above at relatively high angles of incidence, with generally reasonable diffraction efficiency across a larger range of angles.
Note that underlayer 1406 need not be uniform across the entire substrate. Any feature of the underlayer 1406 may vary at different locations of the substrate, such as variations in the thickness, composition, and/or refractive index of the underlayer 1406. One possible reason for changing the characteristics of the underlying layer 1406 is to promote uniform display characteristics when there is a known variation in the displayed image and/or the uneven transmission of light within the display system.
For example, as shown in FIG. 18A, consider a waveguide structure that receives incident light at a single incoupling location 1802 on the waveguide. As the injected light propagates along the length of the waveguide 1302, less and less light remains in the waveguide. This means that the output light near the incoupling location 1802 may end up "brighter" than output light further along the length of the waveguide 1302. If the underlayer 1406 is uniform along the entire length of the waveguide 1302, the optical effect of the underlayer 1406 may exacerbate this non-uniform brightness across the substrate.
The characteristics of the underlayer 1406 may be adjusted across the substrate 1302 to make the output light more uniform. Fig. 18B illustrates one approach, in which the thickness of underlayer 1406 varies across the length of waveguide substrate 1302: underlayer 1406 is thinner near the incoupling location 1802 and thicker at distances further from location 1802. In this manner, the effect of underlayer 1406 in promoting greater diffraction efficiency can at least partially offset the optical loss along the length of waveguide substrate 1302, thereby promoting more uniform light output across the entire structure.
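One way to see why the out-coupling efficiency should increase with distance from the incoupling location is to track the power remaining after each interaction with the grating. The sketch below is a simplified, lossless illustration (it is not a model of any particular embodiment and ignores absorption and scatter): if each of N interactions is to out-couple the same absolute amount of light, the k-th interaction must out-couple a fraction 1/(N - k + 1) of the light still in the waveguide, i.e., the local efficiency must rise toward the far end of the waveguide.

```python
# Per-bounce out-coupling efficiency required for a uniform output,
# assuming N grating interactions and no absorption or scatter loss.
N = 10
remaining = 1.0
for k in range(1, N + 1):
    efficiency = 1.0 / (N - k + 1)   # fraction out-coupled at interaction k
    out = remaining * efficiency     # equals 1/N at every interaction
    remaining -= out
    print(f"interaction {k:2d}: efficiency = {efficiency:.3f}, out-coupled = {out:.3f}")
```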
FIG. 18C illustrates an alternative approach in which the thickness of underlayer 1406 is constant, but the refractive index of underlayer 1406 varies across substrate 1302. For example, to address the problem that output light near location 1802 tends to be brighter than locations further from location 1802, the refractive index of underlayer 1406 can be configured to be the same or similar to the refractive index of substrate 1302 near location 1802, but at locations further from location 1802, the difference in these refractive index values increases. The composition of the underlayer 1406 material may be varied at different locations to achieve different values of refractive index. FIG. 18D illustrates a hybrid approach whereby both the thickness and refractive index of underlayer 1406 are varied across substrate 1302. It is noted that this same method can be used to change the thickness and/or refractive index of top grating surface 1304 and/or substrate 1302 in conjunction with or instead of changing underlayer 1406.
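As a purely hypothetical illustration of the graded-index approach of fig. 18C, the sketch below interpolates the underlayer index from a value matched to the substrate near the incoupling location to a larger value at the far end; the specific index values, the waveguide length, and the linear profile are invented for the example and are not taken from the disclosure:

```python
# Hypothetical underlayer index profile along the waveguide (fig. 18C idea):
# index-matched to the substrate near the incoupling location, with the
# mismatch growing with distance so that more light is out-coupled downstream.
n_substrate = 1.5
n_far = 1.8          # assumed underlayer index at the far end of the waveguide
length_mm = 30.0     # assumed waveguide length

def underlayer_index(x_mm: float) -> float:
    """Linearly interpolated underlayer index at distance x from location 1802."""
    t = min(max(x_mm / length_mm, 0.0), 1.0)
    return n_substrate + t * (n_far - n_substrate)

for x in (0.0, 10.0, 20.0, 30.0):
    print(f"x = {x:4.1f} mm: underlayer index ~ {underlayer_index(x):.2f}")
```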
Thus, various combinations may be used, in which an underlayer 1406 with one refractive index is combined with a top grating 1304 with another refractive index and a substrate 1302 with a third refractive index, and adjusting these relative values provides a wide range of angle-of-incidence-dependent diffraction efficiency behavior. A layered waveguide having layers of different refractive indices is thereby provided. Various combinations and permutations, with associated performance data, are provided above to illustrate the functionality. Advantages include an increased angular range, which provides an increased range of output angles from grating 1304 and thus an increased eyepiece field of view. Furthermore, the ability to counteract the normal reduction in diffraction efficiency with angle is functionally beneficial.
Fig. 14B shows one embodiment in which another layer of material 1409 (a top surface) is placed over the grating layer 1304. The layer 1409 can be configured to address different design goals. For example, the layer 1409 can form a gap layer between multiple stacked diffractive structures 1401a and 1401b, e.g., as shown in fig. 14C. As shown in fig. 14C, the gap layer 1409 can be used to eliminate air gaps and to provide a support structure for the stacked diffraction elements. In such an application, layer 1409 may be formed of a material having a relatively low refractive index (e.g., about 1.1 or 1.2). Although not shown in this figure, other layers (e.g., weak lenses) may be disposed between the diffractive structures 1401a and 1401b.
In addition, the layer 1409 may be formed of a material having a relatively high refractive index. In this case, the grating on layer 1409 (rather than grating surface 1304) will provide a diffractive effect for all or most of the incident light.
It will be apparent that different relative combinations of refractive index values may be selected for different layers, including layer 1409, to achieve desired optical effects and results.
Any suitable fabrication technique may be used to fabricate such a structure. Certain high index polymers (such as those known as "MR 174") can be directly embossed, printed, or etched to produce the desired patterned structure, although there may be difficulties associated with process shrinkage of these layers, and the like. Thus, in another embodiment, another material may be imprinted, embossed, or etched on the high index polymer layer (i.e., a layer such as MR 174) to produce a functionally similar result. Prior art printing, etching (i.e., which may include steps similar to resist removal and patterning steps used in conventional semiconductor processes) and embossing techniques may be utilized and/or combined to accomplish such printing, embossing, and/or etching steps. Molding techniques similar to those used in DVD manufacturing, for example, may also be used for certain replication steps. In addition, certain jetting or deposition techniques used in printing and other deposition processes may also be used to accurately deposit certain layers.
The following portion of the present disclosure will now describe an improved method for achieving patterning on a substrate for diffraction, wherein imprinting of deposited imprinting material is performed according to some embodiments of the present invention. These methods allow for very precise distribution of the imprinting material and very precise formation of different imprinting patterns on any number of substrate surfaces. It is noted that the following description may be used in conjunction with and for implementing the grating configuration described above. However, it is expressly noted that the deposition method of the present invention may also be used with other configurations.
According to some embodiments, a patterned distribution of the imprinting material (e.g., a patterned inkjet distribution) is performed to deposit the imprinting material on the substrate. This patterned inkjet distribution allows very precise control of the volume of material deposited. Furthermore, this approach can be used to provide a thinner, more uniform base layer under the grating surface, and as noted above, the thickness of this base layer can have a significant impact on the performance of the eyepiece/optic.
Fig. 19 illustrates a method for achieving precise variable volume deposition of imprinting material on a single substrate. As shown, a template 1902 is provided having a first set of deeper depth structures 1904 and a second set of shallower (e.g., standard) depth structures 1906. When depositing the imprinting material on imprint receptor 1908, a relatively large volume of imprinting material 1910 is deposited corresponding to the portions of template 1902 having the deeper depth structures 1904. In contrast, a relatively small volume of imprinting material 1912 is deposited corresponding to the shallower depth structures 1906 of template 1902. The template is then used to imprint the first and second sets of depth structures into the imprinting material, thereby forming corresponding structures having different depths and/or patterns in the imprinting material. Thus, this method allows different features to be formed simultaneously on the imprint receptor 1908.
This approach can be employed to create a distribution that is intentionally non-uniform for structures having different depths and/or feature parameters (e.g., where features are located on the same substrate and have different thicknesses). This can be used, for example, to create a volume of imprinting material that enables the simultaneous imprinting of a spatial distribution of structures of the same underlying layer thickness but varying depths.
The bottom of fig. 19 shows a structure 1920 formed by the deposition techniques/apparatus described above, wherein the underlayer 1922 has a uniform thickness despite differences in pattern depth and volume. It can be seen that the imprint material that has been deposited in structure 1920 has a non-uniform thickness. Here, top layer 1924 includes a first portion 1926 having a first set of layer thicknesses, and second portion 1928 has a second set of layer thicknesses. In this example, portion 1926 corresponds to a thicker layer than the standard/shallower thickness of portion 1928. It is noted, however, that any combination of thicknesses can be constructed using the concepts of the present invention, wherein a thickness thicker and/or thinner than the standard thickness is formed on the underlying layer.
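As a rough illustration of the volume bookkeeping involved in such a deposition, the sketch below estimates the drop volume needed in each region so that the underlayer thickness remains uniform while the pattern depth varies. The region areas, thicknesses, depths, and fill factor are hypothetical placeholders rather than values from the embodiments above:

```python
def dispense_volume_nl(area_mm2: float, underlayer_um: float,
                       pattern_depth_um: float, fill_factor: float) -> float:
    """Imprint-material volume (nL) for one region: a uniform underlayer plus
    the material filling the patterned features (fill_factor is the fraction
    of the area occupied by filled features). Note: 1 mm^2 x 1 um = 1 nL."""
    thickness_um = underlayer_um + pattern_depth_um * fill_factor
    return area_mm2 * thickness_um

# Hypothetical regions sharing a 0.05 um underlayer but with different feature depths
regions = {"shallow": 0.1, "deep": 0.4}   # pattern depth in microns
for name, depth in regions.items():
    v = dispense_volume_nl(area_mm2=25.0, underlayer_um=0.05,
                           pattern_depth_um=depth, fill_factor=0.5)
    print(f"{name} region: dispense ~{v:.2f} nL of imprinting material")
```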
This capability may also be used, for example, to deposit larger volumes of material to act as, for example, spacer elements, thereby facilitating the construction of multilayer diffractive optical elements.
Some embodiments relate to a method for depositing multiple types of imprinting material on a substrate simultaneously. This allows materials having different optical properties to be deposited simultaneously across multiple portions of the substrate. The method also provides the ability to tailor local regions to a particular function, for example to act as an in-coupling grating, an Orthogonal Pupil Expander (OPE) grating, or an Exit Pupil Expander (EPE) grating.
Fig. 20 illustrates a method and imprinting step for achieving directional simultaneous deposition of multiple different imprinting materials in the same layer, according to some embodiments. As shown, a template 2002 is provided to imprint patterns into different types of imprinting material 2010 and 2012 on an imprint recipient 2008. Materials 2010 and 2012 may include the same material having different optical properties (e.g., two variants of the same material having different refractive indices) or two completely different materials.
Any optical properties of the material may be considered and selected when employing this technique. For example, as shown in the embodiment of fig. 20, material 2010 corresponds to a high index material deposited in one cross section of imprint acceptor 2008, while material 2012 corresponds to a lower index material deposited in the area of the second cross section.
This forms a multifunctional diffractive optical element having high-index portions 2026 and low-index portions 2028, as shown by the resulting structure 2020. In this case, the high refractive index portion 2026 relating to the first function and the portion 2028 relating to the second function are imprinted at the same time.
While this example illustratively identifies the refractive index of the material as the optical property that is "tuned" when the materials are deposited simultaneously, it is noted that other optical properties may also be considered when identifying the types of material to be deposited in different portions of the structure. For example, opacity and absorption are other properties that can be used to select the materials deposited in different portions of the structure, to adjust the local characteristics of the final product.
Furthermore, one type of material may be deposited over/under another material prior to imprinting. For example, one refractive index material may be deposited directly under a second refractive index material prior to imprinting, thereby creating a gradient refractive index to form a diffractive optical element. This may be used, for example, to implement the structure shown in fig. 17A (or any other related structure described above or in the figures).
According to another embodiment, multi-faceted imprinting may be employed to imprint multiple sides of an optical structure. This allows imprinting on different sides of the optical element to achieve multiplexing of functions through the base layer volume. In this way, different eyepiece functions can be realized without adversely affecting the grating structure function.
Figs. 21A-B illustrate an example method of implementing double-sided imprinting in the context of a total internal reflection diffractive optical element. As shown in fig. 21A, a first template 2102a may be used to create an imprint on side "A" of a substrate/imprint recipient 2108. This forms a first pattern 2112 of a first material on side A of the structure.
As shown in fig. 21B, a second imprint can be created on side "B" of the same substrate using template 2102B. This forms a second pattern 2114 with the second material on side B of the substrate.
It is noted that sides A and B may have the same or different patterns and/or may have the same or different types of materials. Further, the pattern on each side may include varying layer thicknesses (e.g., using the method of fig. 19) and/or different material types on the same side (e.g., using the method of fig. 20).
As shown in fig. 22, a first pattern 2112 is imprinted on side A and a second pattern 2114 is imprinted on the opposite side B of the substrate 2108. The resulting double-sided imprinted element 2200 can now provide a composite function: when input light is applied to the double-sided imprinted element 2200, some light exits element 2200 to implement a first function 1, while other light exits to implement a second function 2.
Additional embodiments relate to multi-layer overlay imprinting and/or multi-layer separation/offset substrate integration. In either or both of these methods, a previously imprinted pattern may be jetted over and imprinted again. For example, an adhesive may be jetted onto the first layer, a second substrate bonded to the first layer (possibly with air gaps), and a subsequent jetted layer deposited on the second substrate and imprinted. A series of imprinted patterns may be sequentially joined to one another by a roll-to-roll process. It is noted that the multi-layer overlay imprinting method may be used in conjunction with, or in place of, the multi-layer separation/offset substrate integration method.
Fig. 23 illustrates a method for implementing multi-layer overlay imprinting. Here, a first imprint material 2301 can be deposited over a substrate 2308 and imprinted. Followed by deposition of a second imprinting material 2302 (and possibly imprinting of second imprinting material 2302). This results in a composite multilayer structure having both the first imprint material 2301 and the second imprint material 2302. In one embodiment, subsequent imprinting may be effected with respect to second imprinting material 2302. In an alternative embodiment, no subsequent imprinting is effected with respect to second imprinting material 2302.
Fig. 24 illustrates a method for implementing multi-layer separation/offset substrate integration. Here, the first substrate 1 and the second substrate 2 may be deposited with an imprinting material and then imprinted. Substrate 1 and substrate 2 may then be sandwiched together and bonded (possibly with offset features (also imprinted)), which in one embodiment provide air gaps 2402 between the active structures of substrate 2 and the backside of substrate 1. The air gaps 2402 may be created using imprinted spacers 2404.
According to yet another embodiment, a method of achieving variable volume material deposition distributed across a substrate is disclosed, which may rely on a priori knowledge of surface non-uniformities. For explanation, consider the substrate 2502 shown in fig. 25. As shown, surface non-uniformity of the substrate 2502 may result in poor parallelism, thereby reducing optical performance. In this case, the variability of the substrate 2502 (or of a previously imprinted layer) may be measured.
Variable volume deposition of imprinting material may be used to provide a level distribution of imprinting material independent of the underlying topography or other physical characteristics. For example, the substrate may be leveled by a vacuum chuck and in-situ metrology performed to assess surface height (e.g., by a low-coherence or laser-based measurement probe). The amount of imprinting material dispensed may then be varied based on the measurement data to produce a more uniform layer upon replication. In this example, portion 2504a of the substrate has the greatest variability, portion 2504b has moderate variability, and portion 2504c has the lowest variability. Thus, a high volume of imprinting material may be deposited in portion 2504a, a medium volume in portion 2504b, and a low/standard volume in portion 2504c. As shown in the resulting product 2506, this results in a more uniform total substrate/imprinting material/imprint pattern thickness, which in turn may benefit the performance of the imprinted device.
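The dispense-planning step can be thought of as a fill-to-level calculation: given a measured height map, each region receives the volume needed to bring it up to a common target height. The sketch below is a minimal illustration of that idea; the target height, region area, and measured heights are invented for the example and are not measurement data from the disclosure:

```python
# Fill-to-level dispense plan from an in-situ height measurement (all values hypothetical).
target_height_um = 1.00                                        # desired surface height before imprinting
measured_um = {"2504a": 0.70, "2504b": 0.85, "2504c": 0.95}    # assumed local substrate heights
region_area_mm2 = 20.0                                         # assumed area of each region

for region, height in measured_um.items():
    shortfall_um = max(target_height_um - height, 0.0)
    volume_nl = region_area_mm2 * shortfall_um   # 1 mm^2 x 1 um of material = 1 nL
    print(f"region {region}: dispense ~{volume_nl:.1f} nL of imprinting material")
```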
It is noted that while this example shows variability due to thickness non-uniformity, other types of non-uniformity may be addressed by this embodiment of the invention. In another embodiment, variability may be caused by the presence of pits, spikes, or other anomalies or features associated with local locations on the substrate.
In the foregoing specification, the invention has been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention. For example, the process flow described above is described with reference to a particular sequence of process actions. However, the order of many of the described process actions may be varied without affecting the scope or operation of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Various example embodiments of the present invention are described herein. Reference to these examples is not intended to be limiting in any way. These examples are provided to illustrate the broader aspects of the invention. Various modifications may be made to the described invention, and equivalents may be substituted, without departing from the true spirit and scope of the invention. In addition, many modifications may be made to adapt a particular situation, material, composition, process action(s), or step(s), to the objective(s), spirit or scope of the present invention. Further, those skilled in the art will appreciate that various modifications described and illustrated herein have discrete components and features which may be readily separated from or combined with the features of any of the other various embodiments without departing from the scope or spirit of the present invention. All such modifications are intended to be within the scope of the claims associated with this disclosure.
The invention includes methods that may be performed using the subject devices. The methods may include the act of providing such a suitable device. Such provision may be performed by the end user. In other words, the act of "providing" requires only that the end user obtain, access, approach, position, set up, activate, power up, or otherwise act to provide the requisite device in the subject method. The methods described herein may be performed in any order of the recited events that is logically possible, as well as in the recited order of events.
Various example aspects of the invention and details regarding material selection and fabrication have been described above. Additional details of the invention can be understood in conjunction with the above-referenced patents and publications and the general knowledge of one skilled in the art. Further acts that are commonly or logically employed with respect to method-based aspects of the invention are also understood in the foregoing manner.
Furthermore, although the invention has been described with reference to a number of examples, optionally incorporating various features, the invention is not limited to the descriptions or indications contemplated with respect to each variant of the invention. Various modifications may be made to the described invention, and equivalents (whether recited herein or not included for the sake of brevity) may be substituted, without departing from the true spirit and scope of the invention. Further, where a range of values is provided, it is understood that every intervening value between the upper and lower limit of that range, and any other stated or intervening value in that stated range, is encompassed within the invention.
In addition, it is contemplated that any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein. Reference to a singular item includes the possibility that plural of the same item are present. More specifically, as used herein and in the associated claims, the singular forms "a," "an," "said," and "the" include plural referents unless the context clearly dictates otherwise. In other words, use of the articles allows for "at least one" of the subject item in the description above as well as in the claims associated with this disclosure. It is further noted that such claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of exclusive terminology such as "solely," "only," and the like in connection with the recitation of claim elements, or use of a "negative" limitation.
Without the use of such exclusive terminology, the term "comprising" in claims associated with this disclosure shall allow for the inclusion of any additional element, irrespective of whether a given number of elements are enumerated in such claims, or whether the addition of a feature could be regarded as transforming the nature of an element set forth in such claims. Except as specifically defined herein, all technical and scientific terms used herein are to be given as broad a commonly understood meaning as possible while maintaining claim validity.
The scope of the present invention is not limited to the examples provided and/or the subject specification, but is only limited by the scope of the claim language associated with this disclosure.
The above description of illustrated embodiments is not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. While specific embodiments of, and examples for, the disclosure are described herein for illustrative purposes, various equivalent modifications are possible without departing from the spirit and scope of the disclosure, as those skilled in the relevant art will recognize. The teachings of the various embodiments provided herein may be applied to other devices that implement virtual or AR or hybrid systems, and/or that employ user interfaces (not necessarily the example AR systems generally described above).
For example, the foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, schematics, and examples. These block diagrams, schematics, and examples contain one or more functions and/or operations, and those skilled in the art will appreciate that each function and/or operation in these block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof.
In one embodiment, the present subject matter may be implemented via an Application Specific Integrated Circuit (ASIC). However, those skilled in the art will recognize that all or part of the embodiments disclosed herein can be equivalently implemented in standard integrated circuits, as one or more computer programs executed by one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs executed by one or more controllers (e.g., microcontrollers), as one or more programs executed by one or more processors (e.g., microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one of ordinary skill in the art in light of the teachings of this disclosure.
When logic is implemented as software and stored in memory, the logic or information may be stored on any computer-readable medium for use by or in connection with any processor-related system or method. In the context of this disclosure, a memory is a computer-readable medium that is an electronic, magnetic, optical, or other physical device or means that contains or stores a computer and/or processor program. Logic and/or information can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions associated with logic and/or information.
In the context of this specification, a "computer-readable medium" can be any means that can store the program associated with logic and/or information for use by or in connection with the instruction execution system, apparatus, and/or device. The computer readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: portable computer diskette (magnetic disk, compact flash, secure digital, etc.), Random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM, EEPROM, or flash memory), portable compact disc read-only memory (CDROM), digital magnetic tape, and other non-transitory media.
Many of the methods described herein can be performed variably. For example, many of the methods can include additional acts, omit some acts, and/or perform acts in a different order than shown or described.
The various embodiments described above can be combined to provide further embodiments. All of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications, and non-patent publications referred to in this specification and/or listed in the application data sheet are incorporated herein by reference in their entirety, to the extent not inconsistent with the specific teachings and definitions herein. Aspects of the embodiments can be modified, if necessary, to employ systems, circuits and concepts of the various patents, applications and publications to provide yet further embodiments.
These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which the claims are entitled. Accordingly, the claims are not limited by the disclosure.

Claims (15)

1. A method of manufacturing a diffractive optical element for an eyepiece, the method comprising:
depositing a first layer on a first substrate, wherein the first layer comprises a first portion and a second portion, the first portion is deposited to have a first depth over a first area on the first substrate, the first portion has a first optical index of refraction, the second portion is deposited to have a second depth over a second area on the first substrate, and the second portion has a second optical index of refraction that is different from the first optical index of refraction;
identifying a template having an imprinted pattern formed thereon, the template comprising a first set of depth structures corresponding to the first depth of the first portion and a second set of depth structures corresponding to the second depth of the second portion; and
imprinting the imprint pattern into the first portion and the second portion on the first substrate using the template, wherein
the diffractive optical element comprises a first diffractive optical element,
the imprint pattern comprises a diffraction pattern for the first diffractive optical element,
the first portion is deposited to have the first depth prior to imprinting and the first portion is associated with a first function,
the second portion is deposited to have the second depth prior to imprinting and the second portion is associated with a second function, and
the first depth is different from the second depth.
2. The method of claim 1, further comprising:
simultaneously imprinting the first and second portions deposited on the first and second areas, respectively, using the template to form first and second patterns on the first substrate, wherein
the imprint pattern includes the first pattern and the second pattern,
the template imprints the first pattern on the first portion of the first layer using the first set of depth structures, and
the template imprints the second pattern on the second portion of the first layer using the second set of depth structures.
3. The method of claim 2, wherein the first pattern corresponds to a first diffraction grating pattern and the second pattern corresponds to a second diffraction grating pattern.
4. The method of claim 2, wherein the first or second portion functions as an in-coupling grating, an orthogonal pupil expander grating, or an exit pupil expander grating.
5. The method of claim 1, further comprising:
depositing a second layer over the first layer for the first diffractive optical element of the eyepiece, wherein
the second layer comprises a first material having a first refractive index value in a first region of the second layer,
the second layer further comprises a second material having a second refractive index value in a second region of the second layer,
the first substrate comprises a third material having a third refractive index value, an
Adjusting the first, second, and third refractive index values to provide a change in diffraction efficiency as a function of incident angle.
6. The method of claim 1, wherein the diffractive optical element further comprises a second diffractive optical element, and wherein the method further comprises:
stacking the second diffractive optical element over a first weak lens further stacked over the first diffractive optical element, wherein
the first diffractive optical element being closer to an eye of a viewer than the second diffractive optical element and defining a first focal plane having a first focal length at optical infinity,
the second diffractive optical element is separated from the first diffractive optical element by the first weak lens, and
the second diffractive optical element includes a second substrate and the first weak lens defines a second focal plane having a second focal length that is less than the first focal length at optical infinity.
7. The method of claim 6, wherein the diffractive optical element further comprises a third diffractive optical element, and wherein the method further comprises:
stacking the third diffractive optical element over the second diffractive optical element, wherein
the third diffractive optical element includes a third substrate and is disposed farther from the eye of the viewer than the second diffractive optical element,
the third diffractive optical element is separated from the second diffractive optical element by a second lens, and
the combination of the first weak lens and the second lens defines a third focal plane having a third focal length that is less than the second focal length.
8. The method of claim 7, further comprising:
disposing a compensating lens layer over the third diffractive optical element to compensate for a total power of the first weak lens and the second lens.
9. The method of claim 7, further comprising: stacking one or more additional diffractive optical elements above the third diffractive optical element and away from an eye of a viewer, wherein the third diffractive optical element is separated from the one or more additional diffractive optical elements by one or more respective individual lenses, and a combination of the one or more respective individual lenses, the first weak lens, and the second lens each define a respective focal plane having one or more corresponding focal lengths that are less than the third focal length.
10. The method of claim 9, further comprising:
disposing a compensating lens layer over the one or more additional diffractive optical elements to compensate for the total power of the lenses separating the diffractive optical layers.
11. The method of claim 10, wherein at least two of the first diffractive optical element, the second diffractive optical element, the third diffractive optical element and one or more additional diffractive optical elements are multiplexed to produce at least one additional focal plane in addition to the first focal plane, the second focal plane and the third focal plane.
12. The method of claim 6, wherein the first portion of the first layer is deposited over the second portion of the first layer prior to imprinting the imprint pattern into the first portion and the second portion.
13. The method of claim 1, further comprising:
imprinting a first pattern of the imprinted pattern into the first portion; and
imprinting a second pattern into the second portion different from the first pattern, wherein the imprinted pattern includes the first pattern and the second pattern.
14. The method of claim 1, wherein the first substrate having a first imprint pattern is overlaid on a second substrate having a second imprint pattern.
15. The method of claim 1, further comprising:
depositing a second layer on the first substrate for the first diffractive optical element of the eyepiece, wherein
the second layer is deposited on a surface opposite to the surface on which the first layer is deposited,
the second layer comprising a first material having a refractive index value in a first region of the second layer,
the first substrate comprises a different material having a different refractive index value; and
adjusting the refractive index values of the first material and the different material of the first substrate to provide a change in diffraction efficiency as a function of angle of incidence.
CN202011551607.0A 2015-03-05 2016-03-05 Improved manufacturing of virtual and augmented reality systems and components Active CN112558307B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201562128925P 2015-03-05 2015-03-05
US62/128,925 2015-03-05
US15/007,117 2016-01-26
US15/007,117 US9915826B2 (en) 2013-11-27 2016-01-26 Virtual and augmented reality systems and methods having improved diffractive grating structures
PCT/US2016/021093 WO2016141372A1 (en) 2015-03-05 2016-03-05 Improved manufacturing for virtual and augmented reality systems and components
CN201680013598.5A CN107430217B (en) 2015-03-05 2016-03-05 Improved manufacturing of virtual and augmented reality systems and components

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201680013598.5A Division CN107430217B (en) 2015-03-05 2016-03-05 Improved manufacturing of virtual and augmented reality systems and components

Publications (2)

Publication Number Publication Date
CN112558307A CN112558307A (en) 2021-03-26
CN112558307B true CN112558307B (en) 2022-08-02

Family

ID=59714233

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011551607.0A Active CN112558307B (en) 2015-03-05 2016-03-05 Improved manufacturing of virtual and augmented reality systems and components
CN201680013598.5A Active CN107430217B (en) 2015-03-05 2016-03-05 Improved manufacturing of virtual and augmented reality systems and components

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201680013598.5A Active CN107430217B (en) 2015-03-05 2016-03-05 Improved manufacturing of virtual and augmented reality systems and components

Country Status (6)

Country Link
JP (3) JP6873041B2 (en)
KR (4) KR20230175351A (en)
CN (2) CN112558307B (en)
AU (2) AU2016225962B2 (en)
CA (2) CA2976955C (en)
IL (1) IL253996B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11726241B2 (en) 2015-01-26 2023-08-15 Magic Leap, Inc. Manufacturing for virtual and augmented reality systems and components
CA2976955C (en) * 2015-03-05 2022-04-26 Magic Leap, Inc. Improved manufacturing for virtual and augmented reality systems and components
CN107728319B (en) * 2017-10-18 2024-01-23 广东虚拟现实科技有限公司 Visual display system and method and head-mounted display device
WO2019195193A1 (en) * 2018-04-02 2019-10-10 Magic Leap, Inc. Waveguides having integrated spacers, waveguides having edge absorbers, and methods for making the same
EP3830631A4 (en) * 2018-08-03 2021-10-27 Magic Leap, Inc. Unfused pose-based drift correction of a fused pose of a totem in a user interaction system
EP3844554B1 (en) 2018-09-07 2024-03-20 Huawei Technologies Co., Ltd. High refractive index waveguide for augmented reality
WO2020263866A1 (en) 2019-06-24 2020-12-30 Magic Leap, Inc. Waveguides having integral spacers and related systems and methods
WO2021044123A1 (en) * 2019-09-06 2021-03-11 Bae Systems Plc Waveguide and method for fabricating a waveguide
EP3809038A1 (en) * 2019-10-17 2021-04-21 BAE SYSTEMS plc Waveguide and method for fabricating a waveguide
WO2021072111A1 (en) * 2019-10-08 2021-04-15 Magic Leap, Inc. Color-selective waveguides for augmented reality/mixed reality applications
CN111443486A (en) * 2020-03-25 2020-07-24 北京枭龙科技有限公司 Grating waveguide element and near-to-eye display device
CN114690297A (en) * 2020-12-29 2022-07-01 华为技术有限公司 Composite grating, method for manufacturing the same, diffraction optical waveguide, and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1170139A (en) * 1995-06-08 1998-01-14 松下电器产业株式会社 Phase grating its fabricating method, optical encoder, motor using optical encoder, and robot using the motor
CN1648611A (en) * 2004-01-26 2005-08-03 三丰株式会社 Photoelectric encoder and method of manufacturing scales
CN101151562A (en) * 2005-04-04 2008-03-26 米拉茨创新有限公司 Multi-plane optical apparatus
CN103675969A (en) * 2013-12-04 2014-03-26 中国科学院上海光学精密机械研究所 High-efficiency oblique double-layer optical grating
KR20140077813A (en) * 2012-12-14 2014-06-24 엘지디스플레이 주식회사 Thin Flat Type Controlled Viewing Window Display

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1545048A (en) * 1976-05-27 1979-05-02 Rca Corp Simplified diffractive colour filtering technique
US5268790A (en) * 1991-12-20 1993-12-07 Hughes Aircraft Company Zoom lens employing refractive and diffractive optical elements
US5258871A (en) * 1992-06-01 1993-11-02 Eastman Kodak Company Dual diffraction grating beam splitter
JPH08254604A (en) * 1995-03-15 1996-10-01 Omron Corp Optical element, picture display device and image pickup unit using it
IL118209A0 (en) * 1996-05-09 1998-02-08 Yeda Res & Dev Active electro-optical wavelength-selective mirrors and active electro-optic wavelength-selective filters
US5861113A (en) * 1996-08-01 1999-01-19 The United States Of America As Represented By The Secretary Of Commerce Fabrication of embossed diffractive optics with reusable release agent
JP2000241616A (en) * 1999-02-22 2000-09-08 Sharp Corp Diffraction grating, manufacture thereof and optical pickup
JP3618057B2 (en) * 1999-03-03 2005-02-09 シャープ株式会社 Optical element manufacturing equipment
JP4727034B2 (en) 2000-11-28 2011-07-20 オリンパス株式会社 Observation optical system and imaging optical system
US20030017424A1 (en) * 2001-07-18 2003-01-23 Miri Park Method and apparatus for fabricating complex grating structures
US6998196B2 (en) * 2001-12-28 2006-02-14 Wavefront Technology Diffractive optical element and method of manufacture
EP1443344A1 (en) * 2003-01-29 2004-08-04 Heptagon Oy Manufacturing micro-structured elements
DE102004009422A1 (en) * 2004-02-24 2005-09-08 Metronic Ag Method and device for applying diffractive elements to surfaces
DE102004020363A1 (en) * 2004-04-23 2005-11-17 Schott Ag Method for producing a master, master and method for producing optical elements and optical element
JP2007017521A (en) * 2005-07-05 2007-01-25 Lintec Corp Resin sheet for manufacturing planar optical waveguide
US20080043334A1 (en) * 2006-08-18 2008-02-21 Mirage Innovations Ltd. Diffractive optical relay and method for manufacturing the same
EP1942364A1 (en) * 2005-09-14 2008-07-09 Mirage Innovations Ltd. Diffractive optical relay and method for manufacturing the same
AU2007219683B2 (en) * 2006-03-03 2012-01-12 Universite Laval Method and apparatus for spatially modulated electric field generation and electro-optical tuning using liquid crystals
US8154803B2 (en) 2006-04-13 2012-04-10 Panasonic Corporation Diffractive optical element with improved light transmittance
DE102006037431A1 (en) * 2006-08-09 2008-04-17 Ovd Kinegram Ag Production of multi-layer bodies useful in element for security- and value document such as banknotes and credit cards, by forming a relief structure in an area of replication layer and applying a layer on carrier and/or replication layer
WO2008038058A1 (en) * 2006-09-28 2008-04-03 Nokia Corporation Beam expansion with three-dimensional diffractive elements
US8160411B2 (en) * 2006-12-28 2012-04-17 Nokia Corporation Device for expanding an exit pupil in two dimensions
JP2009025501A (en) * 2007-07-19 2009-02-05 Topcon Corp Wavelength plate with diffraction grating, and method of manufacturing wavelength plate with diffraction grating
JP2009025558A (en) * 2007-07-19 2009-02-05 Tohoku Univ Wavelength selection element and method for manufacturing the same
JP5205866B2 (en) * 2007-08-23 2013-06-05 住友電気工業株式会社 Mold forming method, diffraction grating forming method, and distributed feedback semiconductor laser manufacturing method
TWI342862B (en) * 2008-01-31 2011-06-01 Univ Nat Taiwan Method of micro/nano imprinting
JPWO2010140341A1 (en) * 2009-06-03 2012-11-15 パナソニック株式会社 Diffractive optical element
JP2011033935A (en) * 2009-08-04 2011-02-17 Toppan Printing Co Ltd Optical article and method of manufacturing the same
NL2005266A (en) * 2009-10-28 2011-05-02 Asml Netherlands Bv Imprint lithography.
KR101686024B1 (en) * 2011-04-27 2016-12-13 후지필름 가부시키가이샤 Curable composition for imprints, pattern-forming method and pattern
JP5696017B2 (en) * 2011-09-27 2015-04-08 富士フイルム株式会社 Curable composition for imprint, pattern forming method and pattern
GB2500631B (en) * 2012-03-27 2017-12-27 Bae Systems Plc Improvements in or relating to optical waveguides
US9671566B2 (en) * 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US9625637B2 (en) * 2012-08-13 2017-04-18 3M Innovative Properties Company Diffractive lighting devices with 3-dimensional appearance
CA2910125A1 (en) * 2013-04-26 2014-10-30 Jx Nippon Oil & Energy Corporation Substrate having rugged structure obtained from hydrophobic sol/gel material
CA2976955C (en) * 2015-03-05 2022-04-26 Magic Leap, Inc. Improved manufacturing for virtual and augmented reality systems and components

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1170139A (en) * 1995-06-08 1998-01-14 松下电器产业株式会社 Phase grating its fabricating method, optical encoder, motor using optical encoder, and robot using the motor
CN1648611A (en) * 2004-01-26 2005-08-03 三丰株式会社 Photoelectric encoder and method of manufacturing scales
CN101151562A (en) * 2005-04-04 2008-03-26 米拉茨创新有限公司 Multi-plane optical apparatus
KR20140077813A (en) * 2012-12-14 2014-06-24 엘지디스플레이 주식회사 Thin Flat Type Controlled Viewing Window Display
CN103675969A (en) * 2013-12-04 2014-03-26 中国科学院上海光学精密机械研究所 High-efficiency oblique double-layer optical grating

Also Published As

Publication number Publication date
CN107430217B (en) 2021-01-15
KR102617948B1 (en) 2023-12-22
IL253996A0 (en) 2017-10-31
CA2976955C (en) 2022-04-26
JP7204808B2 (en) 2023-01-16
IL253996B (en) 2021-08-31
AU2021203240B2 (en) 2023-02-02
KR102319390B1 (en) 2021-10-28
JP2018510377A (en) 2018-04-12
KR20230175351A (en) 2023-12-29
JP2021107940A (en) 2021-07-29
KR20170125937A (en) 2017-11-15
CA2976955A1 (en) 2016-09-09
CA3151575A1 (en) 2016-09-09
KR102500734B1 (en) 2023-02-16
AU2016225962A1 (en) 2017-10-05
AU2016225962B2 (en) 2021-02-25
AU2021203240A1 (en) 2021-06-10
NZ735537A (en) 2021-09-24
CN112558307A (en) 2021-03-26
JP6873041B2 (en) 2021-05-19
JP2023041678A (en) 2023-03-24
CN107430217A (en) 2017-12-01
KR20210131452A (en) 2021-11-02
KR20230023072A (en) 2023-02-16

Similar Documents

Publication Publication Date Title
AU2021250895B2 (en) Virtual and augmented reality systems and methods having improved diffractive grating structures
CN112558307B (en) Improved manufacturing of virtual and augmented reality systems and components
US11353641B2 (en) Manufacturing for virtual and augmented reality systems and components
EP3855221B1 (en) Improved manufacturing for virtual and augmented reality systems and components
US11726241B2 (en) Manufacturing for virtual and augmented reality systems and components
NZ735537B2 (en) Improved manufacturing for virtual and augmented reality systems and components
NZ762952B2 (en) Virtual and augmented reality systems and methods having improved diffractive grating structures
NZ734573B2 (en) Virtual and augmented reality systems and methods having improved diffractive grating structures

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant