CN115903235A - Display device for displaying hologram and method thereof - Google Patents

Publication number: CN115903235A
Application number: CN202211268047.7A
Authority: CN (China)
Language: Chinese (zh)
Inventor: 胡大文
Applicant/Assignee: Individual
Legal status: Pending
Priority claimed from: US 17/581,945 (US 11,391,955 B2)
Prior art keywords: image, display device, optical, pixels, alignment
Abstract

Techniques for displaying holograms in a wearable display device are provided. An image source sequentially projects a holographic image in three primary colors into a waveguide, where the image is amplitude and phase modulated in a Spatial Light Modulator (SLM). Depending on the implementation, the image source may be located next to one end of the waveguide, or an optical fiber may be used to convey the holographic image to the vicinity of the waveguide for projection into it.

Description

Display device for displaying hologram and method thereof
Technical Field
The present invention relates generally to the field of display devices, and more particularly to the architecture and design of display devices fabricated as a pair of glasses that can be used in a variety of applications including virtual reality and augmented reality. In particular, the present invention employs Amplitude Modulation (AM) and Phase Modulation (PM) to implement holographic images for display, for example, in wearable display devices.
Background
Virtual reality, or VR, is generally defined as a realistic and immersive simulation of a three-dimensional environment, created using interactive software and hardware and experienced or controlled by movement of the body. A person using a virtual reality device is typically able to look around the artificially generated three-dimensional environment, walk around, and interact with features or objects depicted on a screen or in goggles. Virtual reality artificially creates a sensory experience, which may include sight, touch, hearing, and, less commonly, smell.
Augmented Reality (AR) is a technology that layers computer-generated enhancements on top of existing reality to make it more meaningful through the ability to interact with it. AR is built into applications and used on mobile devices to blend digital components into the real world in such a way that they augment each other but can also be easily told apart. AR technology is rapidly becoming mainstream. It is used to display score overlays on televised sports games and to pop out 3D emails, photos, or text messages on mobile devices. Leaders of the technology industry are also using AR to do exciting and revolutionary things with holograms and motion-activated commands.
Separately, the delivery methods of virtual reality and augmented reality are different. As of 2016, most virtual reality is displayed on computer display screens, projector screens, or through virtual reality headsets (also known as head-mounted displays or HMDs). HMDs are typically head-mounted goggles with a screen in front of the eyes. Virtual reality actually brings the user into the digital world by cutting off external stimuli; in this way, the user is focused only on the digital content displayed in the HMD. Augmented reality is increasingly used on mobile devices, such as laptops, smartphones, and tablets, to change how the real world and digital images and graphics intersect and interact.
Indeed, VR and AR are not always at odds, because they do not always operate independently of one another; rather, they are often blended together to create a more immersive experience. For example, haptic feedback, the vibrations and sensations added to interaction with graphics, is considered an augmentation; however, it is commonly used within virtual reality settings to make the experience more lifelike through touch.
Virtual reality and augmented reality are prominent examples of experiences and interactions expected to become immersive within simulated platforms for entertainment and gaming, or to add new dimensions to the interaction between digital devices and the real world. Alone or blended together, they are without doubt opening up both real and virtual worlds.
Fig. 1A illustrates exemplary goggles for delivering or displaying VR or AR applications as is common on the market today. Regardless of the design of the goggles, they appear to be bulky and cumbersome and create inconvenience when worn by the user. Furthermore, most goggles are not see-through. In other words, when the user wears the goggles, he or she will not be able to see or do anything else. Therefore, there is a need for a device that can display VR and AR and also allow the user to perform other tasks when needed.
Various wearable devices are being developed for VR/AR and holographic applications. FIG. 1B shows a diagram of the HoloLens from Microsoft. At 579 g (1.2 lbs), it leaves the wearer feeling uncomfortable after a period of wear. In fact, the products available on the market are generally big and bulky compared to normal eyeglasses (25 g-100 g). There are reports that wearable devices based on Microsoft's HoloLens will be delivered to the United States Army. If soldiers are truly so equipped, the weight of the wearable device would likely greatly affect their movements, especially on a battlefield that demands rapid movement. Thus, there is a further need for a wearable AR/VR viewing or display device that looks similar to a pair of normal eyeglasses while also allowing a smaller footprint, enhanced impact performance, low-cost packaging, and an easier manufacturing process.
Wearable display devices provide a relatively ideal viewing environment to view holograms because they are often placed near the eye and can block a significant amount of ambient light. Thus, there remains a further need for a solution to generate holograms for projection onto media, such as transparent lenses, in wearable display devices.
Many eyeglass-type display devices use a common design that places an image-forming component (e.g., LCoS) in front of or near the lens frame, where it is desirable to reduce image transmission loss and use fewer components. However, such designs often unbalance the eyeglass-type display: the front portion becomes much heavier than the rear portion, adding pressure on the nose. Thus, there remains a need to distribute the weight of such display devices when they are worn by a user.
Regardless of how a wearable display device is designed, there are still many components, wires, and even batteries that must be used to make the display device functional and operable. Although much effort has been made to move as many parts as possible into an attachable apparatus or housing that drives the display device from the user's waist or pocket, necessary parts such as copper wires must still be used to transmit the various control signals and image data. These wires, typically in the form of cables, have weight that adds stress on the wearer of such display devices. Thus, there remains a need for a transmission medium that is as light as possible without sacrificing the desired functionality.
There are many other needs, not individually listed, which one of ordinary skill in the art would readily appreciate that one or more embodiments of the present invention detailed herein would clearly satisfy.
Disclosure of Invention
This section is for the purpose of summarizing some aspects of the invention and to briefly introduce some preferred embodiments. Simplifications or omissions may be made in this section as well as in the abstract and the title for the purpose of avoiding obscuring the section, the abstract and the title. Such simplifications or omissions are not intended to limit the scope of the present invention.
The present invention relates generally to the architecture and design of wearable devices that may be used for virtual reality and augmented reality applications. According to one aspect of the invention, a display device is made in the form of a pair of eyeglasses and includes a minimum number of parts to reduce its complexity and weight. A separate shell or housing is provided that is portable or attachable to a user (e.g., in a pocket or on a belt). The housing includes all necessary parts and circuitry to generate content for virtual reality and augmented reality applications, so that a minimum number of parts are required on the eyewear, making the eyewear smaller in footprint, enhanced in impact performance, lower in packaging cost, and easier to manufacture. The content is optically picked up by a fiber optic cable and delivered to the glasses through optical fibers in the fiber optic cable, where the content is projected individually onto a tailored lens for display in front of the wearer's eyes.
According to another aspect of the invention, the glasses do not comprise electronic components and are coupled to a housing by means of a transmission line comprising one or more optical fibers (single or multiple optical fibers may be used interchangeably hereinafter), wherein the optical fibers are responsible for transporting the content or optical image from one end of the optical fiber to the other end thereof by total internal reflection within the optical fiber. An optical image is picked up by the focusing lens from the microdisplay in the housing.
According to yet another aspect of the invention, the optical image is of lower resolution in the fiber but is refreshed at twice the normal refresh rate (e.g., 120 Hz versus 60 Hz). Two successive frames of the lower-resolution image are combined near the other end of the fiber to produce a higher-resolution image, and the combined image is refreshed at the generally normal refresh rate.
According to a further aspect of the invention, each lens comprises a prism formed such that it propagates an optical image projected onto one edge of the prism along an optical path where a user can see an image formed from the optical image. The prism is also integrated with or stacked on an optical corrective lens that is complementary or reciprocal to the surface of the prism to form an integrated lens of the eyewear. The optical corrective lens is provided to correct the optical path from the prism, allowing the user to view through the integrated lens without optical distortion.
According to yet another aspect of the invention, an exemplary prism is a waveguide. Each of the integrated optics includes an optical waveguide that propagates an optical image projected onto one end of the waveguide to the other end through an optical path where an image formed from the optical image is viewable by a user. The waveguide may also be integrated with or stacked on an optical corrective lens to form an integrated lens of the eyewear.
According to yet another aspect of the invention, the integrated lens may also be coated with a multilayer film having optical properties to enhance the optical image in front of the eye of the user.
According to yet another aspect of the invention, the glasses include several electronic devices (e.g., sensors or microphones) to enable various interactions between the wearer and the displayed content. The signals captured by the device (e.g., depth sensor) are transmitted to the housing by wireless means (e.g., RF wireless or bluetooth) to eliminate the wired connection between the glasses and the housing.
According to yet another aspect of the invention, an optical conduit is used to convey an optical image received from an image source (e.g., a microdisplay). The optical conduit is enclosed in or integrated with a temple of the display device. Depending on the embodiment, the optical conduit including the bundle or array of optical fibers may be twisted, thinned, or otherwise deformed to fit the fashion design of the temple while transporting the optical image from one end of the temple to the other.
According to yet another aspect of the invention, the optical image is a holographic image/video (hologram). The hologram is generated by phase modulating and amplitude modulating an optical image by a Spatial Light Modulation (SLM) device. Depending on the embodiment, the hologram may be generated near the integrated lens (the end of the temple) or transported from an external device via an optical fiber.
According to one aspect of the invention, light propagation (e.g., from an SLM) is controlled in two different directions (e.g., 45 degrees and 0 degrees) to simultaneously perform Amplitude Modulation (AM) and Phase Modulation (PM) in a liquid crystal. According to another aspect of the invention, a mask is used to form an array or pattern of stamped microstructures, wherein the pattern includes an array of alignment cells, a first set of the alignment cells being aligned in a first direction and a second set of the alignment cells being aligned in a second direction. Depending on the application, two cells from the first and second sets may correspond to a single pixel or to two adjacent pixels, resulting in amplitude modulation and phase modulation within a pixel or within the pixel array, as illustrated in the sketch below.
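For illustration only, the following minimal sketch (in Python) builds a toy map of alignment directions for the two layouts described above. The angles of 45 and 0 degrees follow the example in the text, but the cell geometry and the per-pixel split are assumptions, not the patented mask design.

```python
import numpy as np

def alignment_mask(rows, cols, per_pixel=True, angle_a=45.0, angle_b=0.0):
    """Toy map of alignment angles (degrees) for an SLM pixel array.

    per_pixel=True : each pixel is split into two sub-cells, one
                     aligned at angle_a (AM) and one at angle_b (PM).
    per_pixel=False: adjacent whole pixels alternate between the two
                     alignment directions in a checkerboard.
    """
    if per_pixel:
        mask = np.empty((rows, 2 * cols))
        mask[:, 0::2] = angle_a   # first set of alignment cells
        mask[:, 1::2] = angle_b   # second set of alignment cells
    else:
        r, c = np.indices((rows, cols))
        mask = np.where((r + c) % 2 == 0, angle_a, angle_b)
    return mask

print(alignment_mask(2, 4, per_pixel=False))
```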
According to yet another aspect of the invention, a portable device can be used to house the SLM to perform AM and PM and to provide the hologram for transport over the fiber. Depending on the implementation, the portable device may be implemented as a standalone device or a docking unit to receive a smartphone. The portable device is primarily a control box connected to a network, such as the internet, and generates control and command signals when controlled by a user. When the smartphone is received in the docking unit, many of the functions provided in the smartphone, such as the network interface and the touch screen, may be used to receive input from the user.
The present invention may be embodied as an apparatus, method, or system. Different embodiments may yield different benefits, objects, and advantages. In one embodiment, the present invention is a display device comprising: an eyeglass frame; at least one integrated lens, wherein the integrated lens is framed in the eyeglass frame; a spatial light modulation device to amplitude and phase modulate an optical image to produce a modulated image; and at least one holographic mirror that receives the modulated image and rotates it 90 degrees to project the modulated image into the integrated lens, wherein the holographic mirror is optically coated to selectively allow specific wavelengths to pass or reflect, a hologram produced by the modulated image being visible in the integrated lens to a user wearing the display device.
In another embodiment, the present invention is a method for a display device, the method comprising: providing an eyeglass frame including at least one integrated lens and a temple attached to the eyeglass frame; receiving an optical image; modulating the optical image in amplitude and phase in a spatial light modulation device; generating a hologram from the light intensity reflected by the spatial light modulation device when illuminated by a uniform laser sheet; and projecting the hologram into the integrated lens through a 90-degree rotation via a mirror, wherein the mirror is optically coated to selectively allow specific wavelengths to pass or reflect, the hologram produced by the modulated image being visible in the integrated lens to a user wearing the display device.
There are many other objects, together with the foregoing, attained in the exercise of the invention in the following description, resulting in the embodiments illustrated in the accompanying drawings.
Drawings
These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
FIG. 1A illustrates exemplary goggles for delivering or displaying VR/AR applications that are common on the market today;
FIG. 1B shows a simplified diagram of HoloLens from Microsoft;
FIG. 2A shows an exemplary pair of glasses that may be used for VR applications according to one embodiment of the present invention;
FIG. 2B illustrates the use of an optical fiber to transport light from one location to another along a curved path in a more efficient manner or by total internal reflection within the fiber;
FIG. 2C illustrates two exemplary ways of encapsulating an optical fiber or a plurality of optical fibers according to one embodiment of the invention;
FIG. 2D shows how an image is carried from the microdisplay to the imaging medium by a fiber optic cable;
FIG. 2E illustrates an exemplary set of Variable Focus Elements (VFE) to accommodate adjustment of the projection of an image onto an optical object (e.g., an imaging medium or prism);
FIG. 2F illustrates an exemplary lens that may be used with the eyewear shown in FIG. 2A, wherein the lens includes two portions, a prism and an optical corrective lens or corrector;
FIG. 2G shows internal reflections from multiple sources (e.g., a sensor, an imaging medium, and multiple light sources) in an irregular prism;
FIG. 2H shows such an integrated lens in comparison with a coin and a ruler;
FIG. 2I shows a shirt with a cable enclosed within or attached to the shirt;
FIG. 3A illustrates how three single color images are visually combined and perceived by human vision as a full color image;
FIG. 3B shows three different color images produced under three light sources at wavelengths λ1, λ2, and λ3, respectively, where the imaging medium includes three films, each coated with one type of phosphor;
FIG. 4 illustrates the use of a waveguide to transport an optical image from one end of the waveguide to its other end;
FIG. 5A illustrates an exemplary functional block diagram that may be used with a separate shell or housing to generate content for virtual reality and augmented reality for display on the exemplary eyewear of FIG. 2A;
FIG. 5B shows an embodiment according to which the exemplary circuit is used within a single housing device case (also referred to herein as an image engine);
FIG. 5C shows an exemplary embodiment of how a user may wear a pair of the designed display glasses, according to one embodiment of the invention;
FIG. 5D shows an example in which some portions of the electronic components of the device of FIG. 5B (i.e., an image engine) are located near one end of a temple of the glasses;
FIG. 5E shows an example of an image engine (source) located near one end of the temple of the glasses (i.e., the hinge region);
FIG. 5F shows a top view of a wearable display device configured to display a hologram according to one embodiment of the invention;
FIG. 5G shows an exemplary block circuit diagram employing the techniques disclosed in U.S. Pat. No. 10,147,350, the contents of which are hereby incorporated by reference, in accordance with one embodiment;
FIG. 6A illustrates an array of multi-pixel cells, each of which shows four sub-pixel cells, as an example;
FIG. 6B illustrates a concept to generate an expanded image from two generated frames;
FIG. 6C shows an exemplary image expanded into a double-sized image with sub-pixel units, where each pixel value is written to all (four) sub-pixel units in a group, and the expanded image is processed and separated to form two frames;
FIG. 6D illustrates separating an image by light intensity to produce two frames of the same size as the original image;
FIG. 6E shows another embodiment for expanding an input image into two substantially reduced and interleaved images;
FIG. 7A shows how an optical image is generated using an optical cube in one embodiment;
FIG. 7B shows that display glasses that do not include any other electrical and electronic components provide images or video to the integrated lens;
FIG. 8A shows an exemplary LCoS structure that produces a 2-dimensional optical image (i.e., 2D light of different intensities, or light modulated to render the gray scale of the image);
FIG. 8B.1 shows an exemplary cross-sectional view of an LC layer with an alignment layer, where the alignment (rubbing) angle determines the characteristics of the light passing through the LC molecules;
FIG. 8B.2 shows exemplary functional layers in an LCoS;
FIG. 8C shows an example of how an LCoS may be modified or redesigned to implement an embodiment of the present invention;
FIG. 8D shows an exemplary 8 × 8 array of aligned cells (each corresponding to a pixel);
FIG. 8E shows an array of alignment cells, each designed for simultaneous AM and PM; in operation, each pixel in the entire SLM device appears to be split into two alternating portions, allowing half of the pixels to perform AM while the other half simultaneously perform PM, at different AM-to-PM fractions;
FIG. 8F shows two separate graphical plots, one being the reflectivity plot AM and the other being the phase plot PM;
FIG. 8G shows an exemplary alignment cell of the pixel of FIG. 8F, each or all of which can be steered to produce simultaneous different reflectivity curves AM (or transmittance in the case of LCoS) and phase curves PM;
FIG. 8H shows the simulation results on a single pixel that does not involve adjacent pixels;
FIG. 8I shows an exemplary embodiment of a method using a photo-alignment mask; and
FIG. 9 shows a process or flow for creating an SLM device that performs both AM and PM within a cell or array, according to one embodiment.
Detailed Description
The detailed description of the invention is presented largely in terms of procedures, steps, logic blocks, processing, and other symbolic representations of operations that are directly or indirectly analogous to data processing devices coupled to a network. These process descriptions and representations are generally used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.
Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, the order of the blocks in a process flow diagram or illustration representing one or more embodiments of the invention is not intended to indicate any particular order nor imply any limitations in the invention per se.
Embodiments of the invention are discussed herein with reference to FIGS. 2A-9. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes, as the invention extends beyond these limited embodiments.
Referring now to the figures, in which like numerals refer to like parts throughout the several views, FIG. 2A shows an exemplary pair of glasses 200 for VR/AR applications according to one embodiment of the present invention. The eyewear 200 does not differ significantly in appearance from a normal pair of eyeglasses, but includes two flexible cables 202 and 204 extending from temples 206 and 208, respectively. According to one embodiment, each of the two flexible cables 202 and 204 is integrated with, or removably connected at one end to, the corresponding temple 206 or 208 and includes one or more optical fibers.
Both flexible cables 202 and 204 are coupled at their other ends to a portable computing device 210 (also referred to as an external box 210) that includes the necessary components to generate image data to drive a microdisplay on which an electronic image is displayed. The image is transported through the optical fiber in the flexible cable 202 by total internal reflection up to the other end of the optical fiber, where the image is projected onto a lens in the glasses 200. As will be further described, according to one embodiment of the invention, the optical image may be a hologram.
According to one embodiment, each of the two flexible cables 202 includes one or more optical fibers. Optical fibers are used to transmit light from one place to another along a curved path in an efficient manner, as shown in FIG. 2B. In one embodiment, the optical fiber is formed from thousands of strands of very fine glass or quartz having an index of refraction on the order of about 1.7. Each strand is thin, and the strands are coated with a layer of material of lower refractive index. The ends of the strands are polished and clamped firmly after careful alignment. When light is incident on one end at a small angle, it is refracted into the strand (or fiber) and strikes the interface between the fiber and its coating. At angles of incidence greater than the critical angle, the light undergoes total internal reflection, and the fiber essentially transports the light from one end to the other, even when bent. Depending on the embodiment of the present invention, a single optical fiber or a plurality of optical fibers arranged in parallel may be used to transport an optical image projected onto one end of the fiber to its other end. A high-resolution image will typically require more optical fibers to transmit. According to one embodiment, as will be described below, the image is transmitted at a first (low) resolution to minimize the number of fibers needed. After transmission, two such images (e.g., two consecutive images) are combined at double the refresh rate to produce a viewable second (high) resolution image.
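As a quick numerical illustration of the total internal reflection condition (a sketch only: the passage gives a core index of about 1.7, while the cladding index of 1.5 below is an assumed value):

```python
import math

n_core = 1.7   # refractive index of the glass/quartz strand (from the text)
n_clad = 1.5   # assumed lower-index coating; not specified in the text

# Light hitting the core/coating interface at an angle of incidence
# greater than the critical angle undergoes total internal reflection.
theta_c = math.degrees(math.asin(n_clad / n_core))
print(f"critical angle ~ {theta_c:.1f} degrees")   # ~61.9 degrees

def totally_reflected(theta_deg: float) -> bool:
    """True if a ray at this angle of incidence stays inside the fiber."""
    return theta_deg > theta_c

print(totally_reflected(70.0))   # True: the ray is guided along the fiber
```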
FIG. 2C shows two exemplary ways 230 or 232 of encapsulating an optical fiber or fibers according to one embodiment of the invention. The encapsulated optical fiber may be used as the cable 202 or 204 in FIG. 2A and extends through each of the non-flexible temples 206 and 208 to its end. According to one embodiment, the temples 206 and 208 are made of a material common to a pair of ordinary eyeglasses (e.g., plastic or metal); a portion of the cable 202 or 204 is embedded or integrated in the temple 206 or 208, creating a non-flexible portion, while the other portion of the cable 202 or 204 remains flexible. According to another embodiment, the non-flexible portion and the flexible portion of the cable 202 or 204 may be removably connected by an interface or connector.
Reference is now made to FIG. 2D, which illustrates how an image is conveyed from the microdisplay 240 to the imaging medium 244 through a fiber optic cable 242. As will be described further below, the imaging medium 244 can be a solid object (e.g., a film or lens) or a non-solid object (e.g., air). A microdisplay is a display with a very small screen (e.g., less than one inch). This type of tiny electronic display system was introduced commercially in the late 1990s. The most common applications for microdisplays include rear-projection TVs and head-mounted displays. A microdisplay may be reflective or transmissive, depending on the way light is allowed to pass through the display element. An image (not shown) displayed on the microdisplay 240 is picked up via a lens 246 by one end of the fiber optic cable 242, which carries the image to its other end. Another lens 248 is provided to collect the image from the fiber optic cable 242 and project it onto the imaging medium 244. Depending on the embodiment, there are different types of microdisplays and imaging media; some of them are described in detail below.
FIG. 2E illustrates an exemplary set of Variable Focus Elements (VFEs) 250 to accommodate adjustment of the projection of an optical image onto an optical object (e.g., an imaging medium, prism, or lens). To facilitate the description of the various embodiments of the present invention, it is assumed that an imaging medium is present. As shown in FIG. 2E, the image 252 conveyed through the fiber optic cable reaches an end surface 254 of the cable. The image 252 is focused onto an imaging medium 258 by a set of lenses 256, referred to herein as Variable Focus Elements (VFEs). The VFE 256 is adjusted to ensure that the image 252 is accurately focused onto the imaging medium 258. Depending on the implementation, adjustment of the VFE 256 may be done manually or automatically based on inputs (e.g., measurements obtained from sensors). According to one embodiment, the adjustment of the VFE 256 is performed automatically according to a feedback signal derived from a sensor directed at the eye (pupil) of the wearer of the eyeglasses 200 of FIG. 2A.
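The automatic adjustment described above amounts to a feedback loop. The following is a minimal sketch under assumed conditions; the error model, gain, and units are hypothetical stand-ins for the actual sensor-derived feedback signal:

```python
def focus_error(vfe_mm: float, best_focus_mm: float = 3.2) -> float:
    """Hypothetical sensor feedback: signed distance from best focus."""
    return vfe_mm - best_focus_mm

# Simple proportional correction applied each time a new sensor
# measurement arrives, nudging the VFE toward best focus.
vfe_mm, gain = 5.0, 0.5
for _ in range(10):
    vfe_mm -= gain * focus_error(vfe_mm)

print(round(vfe_mm, 3))   # converges toward the assumed 3.2 mm optimum
```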
Referring now to FIG. 2F, an exemplary lens 260 that may be used with the eyewear shown in FIG. 2A is shown. The lens 260 includes two portions: a prism 262 and an optical corrective lens or corrector 264. The prism 262 and the corrector 264 are stacked to form the lens 260. As the name implies, the optical corrector 264 is provided to correct the optical path from the prism 262 so that light passing through the prism 262 travels straight through the corrector 264. In other words, the refracted light from the prism 262 is corrected, or released from refraction, by the corrector 264. In optics, a prism is a transparent optical element with flat, polished surfaces that refract light, at least two of which must have an angle between them; the exact angle between the surfaces depends on the application. The traditional geometry is a triangular prism with a triangular base and rectangular sides, and in colloquial use, "prism" usually refers to this type. A prism may be made of any material that is transparent to the wavelengths for which it is designed; typical materials include glass, plastic, and fluorspar. According to one embodiment, the prism 262 is not actually in the shape of a geometric prism and is therefore referred to herein as an arbitrarily shaped prism, which requires the corrector 264 to take a shape complementary, reciprocal, or conjugate to the form of the prism 262 to form the lens 260 (i.e., an integrated lens). As described further below, in one embodiment, the prism 262 may simply be a waveguide.
On one edge of the lens 260, which is an edge of the prism 262, there are at least three items that utilize the prism 262. Labeled 267 is an imaging medium corresponding to the imaging medium 244 of FIG. 2D or the imaging medium 258 of FIG. 2E. Depending on the embodiment, the image conveyed by the fiber 242 of FIG. 2D may be projected directly onto the edge of the prism 262, or formed on the imaging medium 267 before it is projected onto the edge of the prism 262. In either case, depending on the shape of the prism 262, the projected image is refracted in the prism 262 and then seen by the eye 265. In other words, a user wearing a pair of glasses using the lens 260 can see an image displayed through or in the prism 262.
A sensor 266 is provided to image the position or movement of the pupil of the eye 265. Based on the refraction provided by the prism 262, the sensor 266 can find the location of the pupil. In operation, an image of the eye 265 is captured and analyzed to derive how the pupil views the image shown through or in the lens 260. In AR applications, the position of the pupil may be used to activate some action. Optionally, a light source 268 is provided to illuminate the eye 265 to facilitate image capture by the sensor 266. According to one embodiment, the light source 268 uses a near-infrared source, so that the user and his or her eye 265 are not disturbed when the light source 268 is on.
FIG. 2G shows internal reflections from multiple sources (e.g., the sensor 266, the imaging medium 267, and the light source 268). Since the prism is uniquely designed, especially in shape, or has specific edges, the rays from these sources are reflected several times within the prism 262 before impinging on the eye 265. For completeness, FIG. 2H shows a size comparison of such a lens with a coin and a ruler.
As described above, there are different types of microdisplays and, therefore, different imaging media. The following table summarizes some of the microdisplays that may be used to facilitate the generation of an optical image that can be transported from one end of an optical fiber to the other by total internal reflection within the fiber.
(Table of exemplary microdisplay and light source configurations; reproduced in the original only as an image. The cases discussed below include: a full-color microdisplay on silicon; an LCoS with sequential RGB light sources; an SLM with sequential RGB lasers; and an SLM with an invisible laser.)
LCoS = liquid crystal on silicon;
LCD = liquid crystal display;
OLED = organic light emitting diode;
RGB = red, green, and blue; and
SLM = spatial light modulator.
In the first case shown in the above table, a full color image is actually displayed on a silicon substrate. As shown in fig. 2D, the full color image can be picked up by a focusing lens or set of lenses that project the full image onto exactly one end of the optical fiber. The image is transported within the fiber and again picked up by another focusing lens at the other end of the fiber. The imaging medium 244 of fig. 2D may not be physically needed because the delivered image is visible and full color. The color image may be projected directly onto one edge of the prism 262 of fig. 2F.
In the second case shown in the table above, the LCoS is used with different light sources. Specifically, there are at least three color light sources (e.g., red, green, and blue) used sequentially. In other words, each light source produces a single color image. The image picked up by the optical fiber is only a single color image. A full-color image can be reproduced when all three different single-color images are combined. The imaging medium 244 of fig. 2D is provided to reproduce a full color image from three different single color images respectively conveyed by the optical fibers.
Fig. 2I shows a shirt 270 with a cable 272 enclosed within or attached to shirt 270. Shirt 270 is an example of a fabric material or a multi-layer piece. Such relatively thin cables may be embedded in the multilayer. When a user wears such a shirt made or designed according to one embodiment, the cable itself has less weight and the user is more free to move around.
FIG. 3A shows how three single-color images 302 are visually combined and perceived by human vision as a full-color image 304. According to one embodiment, three colored light sources are used, for example red, green, and blue light sources that are switched on in sequence. More specifically, when the red light source is turned on, only a red image is produced (e.g., from a microdisplay). The red image is then optically picked up and transported by the optical fiber and projected into the prism 262 of FIG. 2F. As the green and blue light sources are then turned on in sequence, green and blue images are generated, conveyed separately by the optical fibers, and projected into the prism 262 of FIG. 2F. It is well known that human vision possesses the ability to combine three single-color images and perceive them as a full-color image. With all three single-color images projected in sequence into the prism and perfectly aligned, the eye sees a full-color image.
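As a toy model of this field-sequential color scheme (illustrative only: the frame size and random pixel values below are arbitrary placeholders):

```python
import numpy as np

# Three single-color frames shown in rapid succession are perceived,
# integrated over time by the eye, as one full-color image.
h, w = 4, 4
red_frame   = np.random.rand(h, w)   # shown while the red source is on
green_frame = np.random.rand(h, w)   # shown while the green source is on
blue_frame  = np.random.rand(h, w)   # shown while the blue source is on

# Temporal integration by human vision is approximated by stacking the
# three aligned single-color frames into RGB channels.
perceived = np.stack([red_frame, green_frame, blue_frame], axis=-1)
print(perceived.shape)  # (4, 4, 3): a full-color image
```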
Also in the second case shown above, the light sources may be nearly invisible. According to one embodiment, three light sources generate light close to the UV band. Under such illumination, three different color images may still be generated and delivered, but they are not fully visible. Before the color images can be presented to the eye or projected into a prism, they are converted into three primary-color images, which can then be perceived as a full-color image. According to one embodiment, the imaging medium 244 of FIG. 2D is provided for this purpose. FIG. 3B shows three different color images 310 produced under three light sources at wavelengths λ1, λ2, and λ3, respectively, where the imaging medium 312 includes three film layers 314, each film layer 314 coated with one type of phosphor, i.e., a substance exhibiting luminescence. In one embodiment, three types of phosphors for wavelengths 405 nm, 435 nm, and 465 nm are used to convert the three different color images produced under the three near-UV light sources. In other words, when one such color image is projected onto the film layer coated with the 405 nm phosphor, the single-color image is converted into a red image that is subsequently focused and projected into the prism. The process is the same for the other two single-color images through the film layers coated with the 435 nm and 465 nm phosphors, producing green and blue images. When such red, green, and blue images are projected sequentially into the prism, together they are perceived by human vision as a full-color image.
In the third or fourth case shown in the table above, instead of using light in the spectrum visible to the human eye, or nearly invisible light, the light source is a laser; both visible and invisible lasers exist. Operating much like the first and second cases, the third and fourth cases use so-called Spatial Light Modulation (SLM) to form a full-color image. "Spatial light modulator" is a general term describing devices that modulate the amplitude, phase, or polarization of light waves in space and time. In other words, an SLM plus lasers (RGB sequential) can produce three separate color images; when the color images are combined, with or without an imaging medium, a full-color image can be reproduced. In the case of an SLM plus an invisible laser, an imaging medium is provided to convert the non-visible images into a full-color image, in which case appropriate film layers may be used, as shown in FIG. 3B.
Referring now to FIG. 4, a waveguide 400 is shown for transporting an optical image 402 from one end 404 to another end 406 of the waveguide 400, where the waveguide 400 may be stacked with one or more sheets of glass or lens (not shown) or coated with one or more film layers to form or be part of a suitable lens for application to a pair of glasses displaying an image from a computing device. As known to those skilled in the art, an optical waveguide is a spatially inhomogeneous structure for guiding light, i.e. a spatial region for limiting the light propagation, wherein the waveguide contains a region of increased refractive index compared to the surrounding medium (usually called cladding).
Waveguide 400 is transparent and shaped at end 404 in a suitable manner to allow the image 402 to propagate along the waveguide 400 to end 406, where a user 408 can look through the waveguide 400 to see the propagated image 410. According to one embodiment, one or more film layers are disposed on the waveguide 400 to magnify the propagated image 410 so that the eye 408 sees a significantly magnified image 412. One example of such a film is a metalens, which is essentially an array of thin titania nanosheets on a glass substrate.
Referring now to fig. 5A, an exemplary functional block diagram 500 is shown that may be used with a separate shell or housing to generate virtual reality and augmented reality related content for display on the exemplary eyewear of fig. 2A. As shown in fig. 5A, two micro-displays 502 and 504 are provided to supply content to two lenses in the glasses of fig. 2A, essentially a left image to the left lens and a right image to the right lens. Examples of such content are 2D or 3D images and video or holograms. Each of the microdisplays 502 and 504 is driven by a corresponding driver 506 or 508.
The entire circuit 500 is controlled and driven by a controller 510 programmed to produce the content. According to one embodiment, the circuit 500 is designed to communicate with the internet (not shown), receiving the content from other devices. Specifically, the circuit 500 includes an interface that receives the sensing signal wirelessly (e.g., RF or bluetooth) from a remote sensor (e.g., sensor 266 of fig. 2F). The controller 510 is programmed to analyze the sensing signals and provide feedback signals to control certain operations of the eyewear, such as a projection mechanism, which includes a focusing mechanism that automatically focuses and projects an optical image onto the edge of the prism 262 of fig. 2F or the waveguide 400 of fig. 4. Further, audio is provided to synchronize with the content, and the audio can be transmitted wirelessly to the headset (e.g., via bluetooth).
Fig. 5A illustrates an exemplary circuit 500 that generates content for display in a pair of glasses contemplated in one embodiment of the present invention. The circuit 500 shows that there are two micro-displays 502 and 504 for providing two corresponding image or video streams to the two lenses of the glasses in fig. 2A. According to one embodiment, only one microdisplay may be used to drive both lenses of the glasses in fig. 2A. Such circuitry is not provided herein, as those skilled in the art know how the circuitry can be designed or how to modify the circuitry 500 of fig. 5A.
FIG. 5B shows an embodiment according to which the exemplary circuit 500 is used within a single housing device box 516. The device 516 includes the necessary electronic components to receive an image source or video from the smartphone 518, while also acting as a controller providing the required interface so that the wearer or user can manipulate what is received and shown on the display glasses and how to interact with the display. FIG. 5C shows an exemplary embodiment of how a user may wear such display glasses. The display glasses 520 according to this embodiment do not include active (power-driven) electronic components other than a pair of optical fibers 522 to deliver images or video. The accompanying sound can be provided by the smartphone 518 directly to a headset (earbuds or a Bluetooth headset). As will be further described below, transmitting or transporting low-resolution images and video allows the thickness or number of the fibers 522 from the image engine 516 to the glasses 520 to be further reduced.
FIG. 5D shows an example in which some portions of the electronic components of the device 516 (i.e., the image engine 530) are located near one end of the temple 532, according to one embodiment of the invention. The image engine 530 includes a light source 534 to illuminate an SLM (e.g., LCoS) 536 and an optical component 538 implemented to perform AM and PM, providing a generated image (e.g., a hologram) to be transported to a lens (not shown) via optical fibers integrated in the temple 532. In operation, image data is provided to the image engine 530 via the electrical line 540 coupled to the device 516.
FIG. 5E shows an example of an image engine 530 located near the other end of a temple 532 (i.e., the hinge region in a pair of conventional eyeglasses), according to one embodiment of the invention. In this example, the image data is provided directly to the image engine 530 via a wire 540 coupled to the device 516.
FIG. 5F shows a top view 550 of a wearable display device configured to display a hologram according to one embodiment of the invention. At least one laser diode 552 is provided as a laser source for generating a laser sheet 554 via an optical or mirror arrangement 556, wherein the laser sheet 554 is uniform planar light. The laser sheet 554 impinges on an optical cube 558 (consisting of two halves in one embodiment) that directs the laser sheet 554 onto the SLM 560. The light 554 is modulated according to an image displayed on the SLM 560 and is further modulated in amplitude and phase, as will be described further below. The reflected light 562 strikes a mirror 564, which redirects or rotates the reflected light 562 by 90 degrees in the forward direction. A user wearing the wearable display device can view the hologram 566 via the mirror 564, where the mirror 564 is optically coated to selectively allow particular wavelengths to pass or reflect. To prevent the projected hologram 566 from being reflected by the mirror 568, the mirror 568 is optically coated or integrated with a wavelength-selective holographic mirror 570 to selectively allow certain wavelengths to pass through.
FIG. 5G illustrates an exemplary circuit 580, according to one embodiment, that employs the techniques disclosed in U.S. Patent No. 10,147,350, the contents of which are hereby incorporated by reference. As shown in FIG. 5G, the circuit 580 essentially produces two low-resolution images (e.g., 640x480) that are displaced diagonally by one pixel and have a refresh rate of 120 Hz (twice the 60 Hz "standard" refresh rate commonly used in the United States). A refresh rate of 60 Hz is typical for most TVs, PC monitors, and smartphones; it means the display is refreshed 60 times a second, in other words, the displayed image is updated (or refreshed) every 16.67 milliseconds (ms). When the two images are refreshed at twice the standard refresh rate, the image resolution perceived by the user on the integrated lenses is doubled, i.e., nominally up to 1280x960.
According to one embodiment, the native (first) resolution of an image displayed on the display glasses is, e.g., 640x480, or a resolution pre-set for efficient transmission over the optical fiber, with video transmitted at a first refresh rate. If the image is at a resolution higher than the first resolution, it may be reduced to the lower resolution. According to U.S. Patent No. 10,147,350, a duplicate image, diagonally displaced by half a pixel, is generated, resulting in a second image also at the first resolution; both images are projected into the optical fiber 522 in sequence at twice the refresh rate of the original image, that is to say, the second refresh rate is equal to twice (2X) the first refresh rate. When the images are output sequentially from the other end of the fiber, they are seen in the waveguide as an image at a second resolution that is twice the first resolution.
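A minimal sketch of this transmit-side step (illustrative only: the bilinear averaging below is one assumed way to realize the half-pixel diagonal displacement; the frame size and rates follow the example above):

```python
import numpy as np

def half_pixel_shift(frame: np.ndarray) -> np.ndarray:
    """Approximate a diagonal half-pixel displacement by averaging
    each 2x2 neighborhood (edge rows/columns are replicated)."""
    padded = np.pad(frame, ((0, 1), (0, 1)), mode="edge")
    return 0.25 * (padded[:-1, :-1] + padded[:-1, 1:] +
                   padded[1:, :-1] + padded[1:, 1:])

first = np.random.rand(480, 640)     # frame at the first (low) resolution
second = half_pixel_shift(first)     # duplicate, displaced diagonally

first_rate_hz = 60
second_rate_hz = 2 * first_rate_hz   # both frames sent in sequence at 120 Hz
print(second.shape, second_rate_hz)  # (480, 640) 120
```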
FIGS. 6A-6E duplicate FIGS. 16A-16E of U.S. Patent No. 10,147,350. As described above, the optical image output from the optical fiber in one embodiment of the present invention will be seen at twice the spatial resolution of the input image. Referring to FIG. 6A, an array 600 of pixel cells (forming an image or a data image) is shown, where each pixel cell has four sub-pixel cells 604A, 604B, 604C, and 604D. When an input image (e.g., 500x500) with a first resolution is received and displayed at the first resolution, each pixel value is stored in one pixel cell. In other words, the sub-pixel cells 604A, 604B, 604C, and 604D are all written or stored with the same value and are addressed simultaneously. As shown in FIG. 6A, a word line (e.g., WL0, WL1, or WL2) may address sub-pixels belonging to two columns of pixels 602 at the same time, and a bit line (e.g., BL0, BL1, or BL2) may address sub-pixels belonging to two rows of pixels 602 at the same time. At any instant, a pixel value is written into a pixel 602 whose sub-pixel cells 604A, 604B, 604C, and 604D are selected. As a result, the input image is displayed at the first resolution (e.g., 500x500), i.e., the input and displayed images are at the same resolution.
Now assume that an input (data) image of a first resolution (e.g., 500x500) is received and is to be displayed at a second resolution (e.g., 1000x1000), where the second resolution is twice the first resolution. According to one embodiment, the sub-pixel cells are used to achieve a viewable resolution. It is important to understand that the improvement is in the spatial resolution viewable by the human eye, not an actual doubling of the resolution of the input image. To facilitate the description of the invention, FIGS. 6B and 6C are used to illustrate how an input image is expanded to achieve the viewable resolution.
Now assume that the input image 610 is at a resolution of 500x500. The input image 610 is expanded to an image 614 of size 1000x1000 via data processing 612 (e.g., enlarging and sharpening). FIG. 6C shows an example in which an image 616 is expanded into an image 618 twice as large, with sub-pixel cells. In operation, each pixel of the image 616 is written to a group comprising all (four) sub-pixel cells (e.g., 2x2 in this example). Those skilled in the art will appreciate that the description herein can be readily applied to other sub-pixel structures (3x3, 4x4, 5x5, etc.), yielding even higher viewable resolutions. According to one embodiment, a sharpening process (e.g., part of the data processing 612 of FIG. 6B) is applied to the expanded image 618 (e.g., filtering, thinning, or sharpening image edges) in preparation for generating two image frames from the expanded image 618. In one embodiment, the values of each sub-pixel are computationally recalculated to achieve a better-defined edge, generating an image 620; in another embodiment, the values of adjacent pixels are referenced to obtain a sharp edge.
The processed image 620 is then separated into two images 622 and 624 via a separation process 625. Both images 622 and 624 have the same resolution as the input image (e.g., 500x500), where the sub-pixel cells of images 622 and 624 are written or stored with the same value. The pixel-cell boundaries in image 622 are intentionally different from the pixel-cell boundaries in image 624. In one embodiment, the boundary of a pixel cell is offset by half a pixel in the vertical direction (corresponding to one sub-pixel in a 2x2 sub-pixel array) and also by half a pixel in the horizontal direction (corresponding to one sub-pixel in a 2x2 sub-pixel array). The separation process 625 proceeds such that, when images 622 and 624 are overlaid, the combined image best approximates image 620, which has four times the pixels of the input image 616. In the example of FIG. 6C, to maintain the overall intensity of the input image 610, the separation process 625 also includes reducing the intensity of each of the two images 622 and 624 by 50%. In operation, the intensity of the first image is reduced by N percent, where N is an integer ranging from 1 to 100 but set to about 50 in practice; as a result, the intensity of the second image is reduced to (100-N) percent. Each of the two images 622 and 624 is displayed at twice the refresh rate of the input image 610. In other words, if the input image is displayed at 50 Hz, each of the two images 622 and 624 is displayed at 100 Hz. Due to the offset pixel boundaries and the data processing, the combined image perceived by the viewer approximates image 620. The offset pixel boundary between the two images 622 and 624 has the effect of "shifting" the pixel boundary. As shown by the two pixels 626 and 628, the example illustrated in FIG. 6C is similarly shifted by one (sub-)pixel in the southeast direction.
According to one embodiment, the separation process 625 may be performed by an image algorithm or by a pixel shift, where the pixel shift refers to a sub-pixel in the sub-pixel structure shown in FIG. 6A. There are many ways to separate one NxM image into two images by intensity such that each image is still NxM; displaying the two images in sequence at twice the refresh rate then yields the perceived display effect for optimal vision. For example, one exemplary approximation is to keep and modify the original image with reduced intensity as a first frame, while the remainder is used to generate a second frame, also with reduced intensity. In another embodiment, the approximation is to shift the first frame (obtained from the original or a modified version) by half (1/2) a pixel (e.g., horizontally and vertically, i.e., diagonally) to generate the second frame, as detailed later. FIG. 6C shows the two images 622 and 624 generated from the processed expanded image 620 in accordance with the image algorithm, where, as illustrated by the two pixels 626 and 628, the second frame is generated by diagonally shifting the pixels of the first frame. It should be noted that the separation process here means that two frames equal in size to the original image are generated by separating the image according to its intensities. FIG. 6D shows an image of two pixels, one at full intensity (shown as black) and the other at half of full intensity (shown as gray). When the two-pixel image is separated into two frames of the same size as the original, the first frame has two pixels, both at half of full intensity (shown as gray), and the second frame also has two pixels, one at half of full intensity (shown as gray) and the other at almost zero percent of full intensity (shown as white). There are now twice as many pixels as in the original image, arranged in a checkerboard pattern like a checkers board. Since each pixel is refreshed 60 times per second instead of 120 times, each pixel has half the brightness; but because there are twice as many pixels, the brightness of the image as a whole remains the same.
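The following sketch illustrates one plausible reading of the separation process 625 (assumptions: the expanded image is 2N x 2M, the two output grids are taken from diagonally offset sub-pixel positions, and N is set to 50 as in the example above):

```python
import numpy as np

def separate(expanded: np.ndarray, n_percent: int = 50):
    """Split a 2N x 2M expanded image into two N x M frames by intensity.

    The two frames sample diagonally offset sub-pixel grids, mirroring
    the half-pixel boundary offset described in the text."""
    frame1 = expanded[0::2, 0::2] * (n_percent / 100.0)
    frame2 = expanded[1::2, 1::2] * ((100 - n_percent) / 100.0)
    return frame1, frame2

expanded = np.random.rand(1000, 1000)   # e.g., the processed image 620
f1, f2 = separate(expanded)
print(f1.shape, f2.shape)               # (500, 500) (500, 500)
# Shown alternately at twice the input refresh rate, f1 and f2 are
# perceived together as an approximation of the expanded image.
```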
Referring now to FIG. 6E, another embodiment for expanding the input image 610 is shown. The input image 610 is again assumed to have a resolution of 500x500. Via the data processing 612, the input image 610 is expanded to a size of 1000x1000. In this embodiment, it should be appreciated that 1000x1000 is not the resolution of the expanded image; rather, the expanded image comprises two substantially reduced 500x500 images 630 and 632. The expanded view 634 of the substantially reduced images 630 and 632 shows that the pixels in one image are reduced in size, allowing the pixels of the other image to be generated between them. According to one embodiment of the invention, the first grid image is derived from the input image, and the second image is derived from the first image. As shown in the expanded view 634 of FIG. 6E, an exemplary pixel 636 in the second image 632 is derived from three pixels 638A, 638B, and 638C. The same approach, namely shifting by half (1/2) a pixel along a set direction, can be applied to generate all the pixels of the second image. At the end of the data processing 612, there is an interleaved image comprising the two images 630 and 632, each 500x500. The separation process 625 is then applied to the interleaved image to generate or store the two images 630 and 632.
Referring to FIG. 7A, an embodiment is shown of how an optical image is generated using an optical cube 702. Based on the light source 704, an image displayed on a microdisplay (e.g., LCoS or OLED, one of the spatial light modulation devices) 706 is projected as an optical image (light intensity) picked up by a lens 708. The optical image is then transmitted via the optical fiber 710 to its other end. The optical image is then projected into a waveguide or integrated lens 712 via another lens (e.g., a collimator) 714. The optical image is ultimately viewed by a person's eye 716 in the waveguide lens 712.
According to one embodiment, the light source 704 is a laser sheet generated from a laser spot. There are many optical ways to generate a uniform laser sheet (planar laser), the details of which are not described further herein to avoid obscuring important aspects of the invention. The laser sheet is used to illuminate the SLM (spatial light modulation device) 706 and is amplitude modulated and phase modulated by the SLM 706. Reflected light from the SLM 706 is captured and focused onto a medium (e.g., the waveguide 712 or one end of an optical fiber) via a lens (not shown). In operation, three laser sheets of three primary colors (e.g., red, green, and blue) sequentially impinge upon the SLM 706, assuming the SLM 706 is a reflective device. Each laser sheet is modulated in the SLM 706, the reflected (modulated) light intensity is coupled into the waveguide 712, and the user can view the reconstructed color hologram in the waveguide 712.
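Numerically, the per-color modulation can be pictured as multiplying a uniform incident field by the SLM's amplitude and phase patterns. The sketch below is a toy model only: the SLM resolution is assumed, the patterns are random placeholders rather than computed holograms, and no propagation or waveguide physics is simulated:

```python
import numpy as np

h, w = 256, 256                                  # assumed SLM resolution
for color in ("red", "green", "blue"):           # sequential laser sheets
    incident = np.ones((h, w), dtype=complex)    # uniform planar light
    amplitude = np.random.rand(h, w)             # AM pattern on the SLM
    phase = 2 * np.pi * np.random.rand(h, w)     # PM pattern on the SLM
    reflected = incident * amplitude * np.exp(1j * phase)
    # The complex reflected field of each color is coupled into the
    # waveguide; viewed in rapid sequence, the three fields reconstruct
    # the color hologram.
    print(color, abs(reflected).mean())
```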
Fig. 7B shows display glasses 720 that do not include any power-driven electronic components; an image engine 722 located externally (e.g., in the enclosure 516 of Fig. 5B) provides images or video to the integrated lens, making the display glasses 720 extremely lightweight while still allowing the wearer to see all types of images/video, including holograms.
Fig. 8A illustrates an exemplary structure 800 of an LCoS that may be used in the image engine 530 of Fig. 5D or 5E or the image engine 722 of Fig. 7B. From a practical perspective, the LCoS produces a 2-dimensional optical image (i.e., 2D light, or modulated light of varying intensities). It is well known that digital images can be transmitted over data cables, whereas optical images cannot. In general, depending on the application, an optical image may be conveyed via an optical medium (e.g., air, a waveguide, or an optical fiber). Unlike a micromirror device that turns mirrors on and off, an LCoS uses liquid crystals as light modulators, the amount of reflected light being controlled by changing their orientation.
Liquid crystals (LC) are substances in a mesomorphic state (neither completely liquid nor completely solid). Their molecules usually hold their positions like a solid, yet they can also move around like a liquid. For example, nematic liquid crystals arrange themselves in loosely parallel lines. The liquid crystal layer (or LC layer) is positioned, sandwiched, or coupled between a transparent electrode layer and a reflective electrode layer, where the reflective electrode layer comprises an array of pixel electrodes and is built on a silicon substrate. It should be noted that there may be other layers integrated with the LC layer between the transparent electrode layer (sometimes simply referred to as the transparent layer, as in Figs. 8A and 8C) and the reflective electrode layer (sometimes simply referred to as the reflective layer, as in Figs. 8A and 8C). As used herein, the terms "positioned," "sandwiched," or "coupled" between two layers do not mean that there is only one item between the two layers; other layers of material or components may be added on top of or around an item to alter, modify, or enhance its behavior, performance, or characteristics, all between the two layers. When placed between two polarizing layers, the twisted crystals direct the path of the light. When a voltage difference is applied between the transparent electrode layer and one of the pixel electrodes, the LC molecules between them are reoriented by the applied electric field. By redirecting the light, the crystals allow or prevent its passage.
The length of a liquid crystal molecule is generally much greater than its width. In rod-shaped liquid crystals, the molecules are locally aligned in the same direction, producing optical birefringence, i.e., the refractive index along the long axis of the molecules differs significantly from the refractive index perpendicular to that axis. In other words, birefringence is an optical property of a material whose refractive index depends on the polarization and propagation direction of light. Without elaborating further on the molecules and/or the liquid crystal and how they give rise to birefringence, it is well known how the polarization and propagation direction of light entering the liquid crystal determine the reflection or transmission of light through the LC layer.
As noted above, applying a voltage difference between the transparent electrode layer and a pixel electrode reorients the LC molecules between them. Because the LC is birefringent, this reorientation imposes a phase shift on the light, commonly referred to as phase retardation, which can be controlled by the voltage difference through the electrically controlled birefringence (ECB) effect.
When linearly polarized incident light enters the LC layer at an angle Φ to the director axis of the liquid crystal, it is split into two light beams of different polarizations: an extraordinary wave (E light), whose polarization direction is parallel to the director axis of the liquid crystal, and an ordinary wave (O light), whose polarization direction is perpendicular to the director axis. Because the E light and the O light travel through the liquid crystal at different speeds, their refractive indices differ. Therefore, when the two waves exit the liquid crystal, there is a phase difference δ between them, i.e.:
$\delta = \dfrac{2\pi d\,\Delta n}{\lambda}$; equation (1)

where $d$ is the cell gap (i.e., the thickness of the LC layer), $\lambda$ is the wavelength of the incident light, and $\Delta n = n_e - n_o$, also called the birefringence, depends on the applied voltage, the temperature, and the wavelength.
When the homogeneous cell is sandwiched between two polarizers, the normalized light transmission is governed by the following equation:
$T = \cos^2 X - \sin 2\beta \,\sin 2(\beta - X)\,\sin^2(\delta/2)$; equation (2)

where $X$ is the angle between the polarizer and the analyzer, $\beta$ is the angle between the polarizer and the LC director, and $\delta$ is the phase retardation of equation (1). For the simplest cases of $\beta = 45$ degrees with the two polarizers parallel ($X = 0$) or orthogonal ($X = 90$ degrees), the normalized light transmission reduces to:

$T_\parallel = \cos^2(\delta/2)$; equation (3)

$T_\perp = \sin^2(\delta/2)$; equation (4)
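As a quick numerical illustration of equations (1), (3), and (4), the following is a minimal Python sketch; the cell gap, birefringence, and wavelength values are illustrative assumptions, not parameters taken from this disclosure.

```python
import numpy as np

def phase_retardation(d_um: float, delta_n: float, wavelength_um: float) -> float:
    """Equation (1): delta = 2*pi*d*dn / lambda (all lengths in micrometers)."""
    return 2.0 * np.pi * d_um * delta_n / wavelength_um

# Illustrative values (assumed): 3 um cell gap, dn = 0.2, green light at 0.532 um.
delta = phase_retardation(d_um=3.0, delta_n=0.2, wavelength_um=0.532)

T_parallel = np.cos(delta / 2.0) ** 2   # equation (3), polarizers parallel
T_crossed  = np.sin(delta / 2.0) ** 2   # equation (4), polarizers crossed
print(f"delta = {delta:.3f} rad, T_parallel = {T_parallel:.3f}, T_crossed = {T_crossed:.3f}")
```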
As further shown in Fig. 8A, there is a basic component, the alignment layer, which induces a macroscopically uniform alignment of the liquid crystal molecules (mesogens) near its surface, essentially orienting the LC molecules at a specific pretilt angle, namely the angle between the director axis of the LC molecules and the alignment layer. Fig. 8B.1 shows an exemplary cross-sectional view of an LC layer with an alignment layer, where the pretilt alignment dictates the characteristics of the light passing through the LC molecules. Different pretilt alignment angles can produce very different modulated light, as can the thickness of the LC layer (i.e., the corresponding optical path through it). There are several ways to form the surface alignment layer. One example is unidirectional mechanical rubbing of a thin polyimide coating: the film is spin-coated and then cured at a temperature appropriate to the type of polyimide. Subsequently, the cured film is rubbed with velveteen, creating micro- or nano-grooves along the rubbing direction, to which the LC molecules align accordingly. Fig. 8B.2 shows the functional layers in an exemplary LCoS.
Referring now to Fig. 8C, an example is shown of how an LCoS 800 may be modified or redesigned to implement one embodiment of the present invention. An alignment layer 802 is disposed on top of the liquid crystal layer 803 (i.e., the LC layer) to configure the liquid crystal with a predetermined pretilt alignment angle across the entire pixel array. The incident light is transmitted through the LC layer with almost zero absorption. The integration of high-performance drive circuits allows the applied voltage on each pixel to be varied, thereby controlling the phase delay of the incident wavefront through the device. Currently, light modulation using LCoS devices is of two types: amplitude modulation (AM) and phase modulation (PM). In the AM case, the amplitude of the optical signal is modulated by changing the linear polarization direction of the incident light. In the PM case, the phase delay is achieved by electrically adjusting the optical refractive index along the optical path. No further details of how incident light is modulated by the liquid crystal (LC) or in the LC layer need be described, to avoid obscuring aspects of the invention. One of the objects, benefits, and advantages of the present invention is to control the pretilt alignment angle via a modified alignment layer or an array of alignment cells integrated with the alignment layer. For ease of describing the invention, for AM, all alignment cells are aligned diagonally (e.g., neither horizontally nor vertically, for example between 20 and 60 degrees); for PM, all alignment cells are aligned horizontally, supporting a phase shift from 0 degrees up to 360 degrees and beyond.
It is noted that throughout the description herein, the alignment layer serves as a base on which the alignment cells (or imprinted microstructures) are formed or held. Those skilled in the art will appreciate from the description herein that the described alignment cells may equally be incorporated into the alignment layer itself when designing or forming the alignment layer. For convenience of description, it is assumed herein that the alignment cells are formed on top of the alignment layer.
According to one embodiment of the invention, two differently aligned cells are arranged in a manner 806 such that their alignment alternates across the alignment layer, i.e., the alignment of each alignment cell differs from that of its neighboring alignment cells. In other words, the alignment of the cells alternates between AM and PM. In operation, AM and PM occur simultaneously when light passes through the cells while the LC layer is driven with an appropriate voltage or current. One of the advantages, benefits, and objects of the present invention is to have AM and PM occur simultaneously in an SLM device (e.g., an LCoS panel). Because all the light is phase- and amplitude-modulated simultaneously, the holographic image reconstructed with this implementation can be displayed efficiently at high resolution.
Fig. 8D shows an exemplary 8x8 array of alignment cells, each corresponding to one pixel. According to the embodiment illustrated in Fig. 8D, the alignment cells for AM and PM are arranged alternately across the entire SLM device, i.e., the pixels alternate along odd and even rows or columns within one SLM device. Viewed as a whole, half of the pixels perform AM while the other half simultaneously perform PM. In some modified embodiments, the alignment cells may be assigned randomly to AM and PM, or a desired pattern may be designed to dedicate a particular pixel or group of pixels to AM or PM.
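As a minimal illustration of the array-based arrangement of Fig. 8D, the sketch below builds a checkerboard map assigning each alignment cell (one per pixel) to AM or PM. The encoding (0 for AM, 1 for PM) and the function name are assumptions made for clarity; a random or custom assignment could be substituted, as the text suggests.

```python
import numpy as np

def checkerboard_modulation_map(rows: int = 8, cols: int = 8) -> np.ndarray:
    """Assign alternating modulation to alignment cells: 0 = AM, 1 = PM.
    Neighboring cells (up/down/left/right) always differ, as in Fig. 8D."""
    r, c = np.indices((rows, cols))
    return (r + c) % 2

print(checkerboard_modulation_map())  # 8x8 checkerboard of 0s and 1s
```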
According to another embodiment shown in Fig. 8E, each pixel is split into two parts, so that one half of the pixel performs AM while the other half simultaneously performs PM; in other words, alignment cells performing AM and PM are located within a single pixel. Depending on the implementation, the percentage of a pixel devoted to AM or PM may be 50% or may be predefined according to the desired performance, some examples of which are also shown in Fig. 8E.
In one prior art system, the light efficiency of amplitude-modulation-based holographic displays is estimated to be very low (e.g., only about 5%), while that of phase-modulation-based displays is much higher (e.g., up to 95%). With AM and PM integrated, light efficiency can be significantly increased without loss of resolution.
Fig. 8F shows two separate graphical curves 842 and 844, one the reflectivity curve 842 and the other the phase curve 844. At an appropriate electrode voltage (e.g., 0-2.5V, 0-5V, etc.), one half of pixel 846 performs AM by gradually transitioning the corresponding LC from black to white or from white to black, while the other half of pixel 846 performs PM by gradually transitioning the corresponding LC from 0 to 2π or from 2π to 0. As shown in Fig. 8E, the ratio of AM to PM within a single pixel is not fixed at 50:50. Fig. 8G shows the alignment cell of pixel 846 of Fig. 8F being rotated to produce different reflectivity and phase curves, achieving different desired results.
Fig. 8H shows simulation results on a single pixel 846, not involving adjacent pixels. The simulations show that when the applied voltage is varied from 0V to 5V, the liquid crystal 850 corresponding to PM (left part) is oriented differently from the liquid crystal corresponding to AM (right part). It should be noted that the thickness (or depth) of the LC layer 850 is preferably up to twice that of an LC layer used for a single modulation in one pixel. According to one embodiment, the depth of the LC layer 850 is assumed to be 2D, where D is the depth of an LC layer for a pixel or pixel array performing only AM or only PM; in practice, slightly more than twice the thickness is used to ensure that a full phase shift (0-2π) is achieved. The corresponding reflectivity curve 852 and phase curve 854 are also shown for the two voltages V1 = 0 and V2 = 5. Assuming the physical width of pixel 846 is 6.4 μm, a 50:50 ratio gives each of the AM and PM parts a width of 3.2 μm.
Fig. 8I shows an exemplary embodiment 860 of a method using a photo-alignment mask. A photomask 864 is added over the alignment layer 862. A predefined pattern is imprinted on the photomask 864; for example, each cell is split 50/50, so that the cell 862 is configured to cause AM and PM to occur simultaneously. The photomask 864 is etched with UV light or other means 868. As a result, each pixel is covered by an alignment cell 870 having two different alignments, one part 872 for PM and one part 874 for AM. All cells of the alignment layer in a single SLM device are etched with the same pattern, as similarly shown in Fig. 8E. In an alternative embodiment (not shown in Fig. 8I), an alignment cell with only a horizontal alignment or only a diagonal alignment covers one pixel, and all neighboring cells, each covering a single pixel, are aligned differently. In other words, the alignment cells alternate between horizontal alignment for PM and diagonal alignment for AM, as similarly shown in Fig. 8D.
FIG. 9 shows a flow or process 880 for creating an SLM device that performs both AM and PM within a cell or array, according to one embodiment. The process 880 may be better understood in conjunction with the above figures. The process 880 begins when a photomask is added on top of the alignment layer.
An SLM (spatial light modulation) device such as an LCoS includes an LC layer to control the passage of reflected (or transmitted) light. As described above, one embodiment is to modify or add an alignment layer on top of the LC layer. Depending on the resolution of the SLM, there are multiple alignment units, each responsible for one pixel. These units need to be uniquely controlled so that, given the characteristics of the underlying LC, the LC in the LC layer adjusts both the amplitude and the phase of the reflected light.
At 882, a photomask is placed over the alignment layer. As described above with reference to Figs. 8D and 8E, there are two ways to have AM and PM occur simultaneously in a single SLM device: one within an alignment cell and the other within an array of alignment cells. For ease of describing both embodiments, the term "cell-based simultaneous modulation" means that AM and PM are performed simultaneously within a cell, i.e., the alignment cell is split into two parts, one for AM and the other for PM, and likewise for each alignment cell in a single SLM device. The term "array-based simultaneous modulation" means that the alignment cells perform AM or PM alternately, i.e., adjacent alignment cells perform different modulations.
At 884, the process 880 determines how to design or configure the photomask via printing or photolithography. If it is determined that cell-based simultaneous modulation is to be implemented, the process 880 branches to 886, where a corresponding pattern is printed on the photomask. According to one embodiment, all cells in the array have the same pattern. According to another embodiment, as seen in Fig. 8E, all cells in a row have the same pattern, while adjacent rows have the pattern shifted by half a pixel, forming two alternating row patterns. If it is determined that array-based simultaneous modulation is to be implemented, the process 880 branches to 888, where a corresponding pattern is printed on the photomask. That pattern provides that some cells are designated to perform one modulation (e.g., AM) while the neighboring cells of each such cell are designated to perform the other modulation (e.g., PM). Fig. 8D shows an exemplary portion of such a pattern; a sketch of both branches follows.
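To make the cell-based branch concrete, here is a hedged sketch of how its photomask pattern might be generated as an array (0 marking an AM region, 1 a PM region); the rasterization into sub-columns per cell, the function name, and the parameters are illustrative assumptions rather than this disclosure's fabrication recipe. The array-based branch can reuse the checkerboard map sketched earlier after Fig. 8D.

```python
import numpy as np

def cell_based_mask(rows: int, cols: int, subcells: int = 8,
                    am_ratio: float = 0.5, shift_alternate_rows: bool = False) -> np.ndarray:
    """Pattern for cell-based simultaneous modulation: each alignment cell
    is split into an AM part (0) and a PM part (1) at the given ratio.
    With shift_alternate_rows=True, odd rows get the pattern rolled by half
    a cell, forming the two alternating row patterns of Fig. 8E."""
    cell = np.ones(subcells, dtype=int)
    cell[: int(round(am_ratio * subcells))] = 0   # leading fraction of each cell does AM
    row = np.tile(cell, cols)                     # one row of identical split cells
    mask = np.tile(row, (rows, 1))
    if shift_alternate_rows:
        mask[1::2] = np.roll(row, subcells // 2)  # half-cell shift on alternate rows
    return mask

print(cell_based_mask(rows=2, cols=2))  # two rows of 50/50 AM/PM cells
```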
The pattern may differ depending on the desired properties. Generally, the ratio of AM to PM within one cell is 50/50, but it can be adjusted to any value as desired. Once the pattern is determined, it can be imprinted onto a photomask. The details of patterning or imprinting a pattern onto a photomask are not described further herein, as they are well known in the art, e.g., in semiconductor manufacturing. The process 880 then moves to 890, where the photomask is etched. There are many ways in the prior art to etch a photomask; again, the details are not described further herein, as they are well known in the art (e.g., in semiconductor fabrication). With the alignment layer carrying the specified alignment cells, an SLM device performing both AM and PM within a cell is created at 892, or an SLM device performing both AM and PM within an array is created at 894.
The invention has been described with a certain degree of particularity. It will be understood by those of skill in the art that the present disclosure of the embodiments is by way of example only, and that various changes in the arrangement and combination of parts may be made without departing from the spirit and scope of the invention. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description of the embodiments.

Claims (18)

1. A display device, comprising:
a spectacle frame;
at least one integrated lens, wherein the integrated lens is framed in the eyeglass frame;
a spatial light modulation device to amplitude-modulate and phase-modulate an optical image to generate a modulated image; and
at least one holographic mirror that receives the modulated image and projects the modulated image into the integrated lens after rotating it 90 degrees, wherein the holographic mirror is optically coated to selectively allow specific wavelengths to pass or reflect, the hologram produced by the modulated image being visible in the integrated lens by a user wearing the display device.
2. The display device of claim 1, further comprising a light source that illuminates the spatial light modulation device, wherein the hologram is reflected light from the spatial light modulation device.
3. The display device according to claim 2, wherein the light source is a uniform laser sheet, and the spatial light modulation device includes a micro-display illuminated by the uniform laser sheet.
4. The display device according to claim 2, wherein the spatial light modulation device comprises a first group of pixels performing Amplitude Modulation (AM); and a second group of pixels performing Phase Modulation (PM), wherein the first group of pixels and the second group of pixels are within a single array, the AM and the PM are performed by a liquid crystal layer sandwiched between a transparent electrode layer and a reflective electrode layer, wherein the reflective electrode comprises an array of pixel electrodes, each pixel electrode controlling one pixel, the reflective electrode being built on a silicon substrate.
5. The display device of claim 4, wherein the first set of pixels is interleaved with the second set of pixels within the single array.
6. The display device according to claim 5, wherein the spatial light modulation device further comprises a photo mask on top of an alignment layer disposed over the liquid crystal layer, wherein the photo mask has a pattern comprising an array of alignment cells, each alignment cell corresponding to one pixel, wherein a first set of the alignment cells are aligned in a first direction and a second set of the alignment cells are aligned in a second direction.
7. The display device according to claim 6, wherein the alignment unit of the first group is staggered from the alignment unit of the second group in an alignment layer.
8. The display device of claim 6, wherein the first set of the alignment units causes phase modulation of light and the second set of the alignment units causes amplitude modulation of light.
9. A display device as claimed in claim 2, wherein the image source is located next to the temple and projects the hologram into the edge of the waveguide.
10. The display device of claim 9, wherein the image source is an end of a plurality of optical fibers, the optical fibers being encapsulated in or integrated with the temple.
11. A display device as claimed in claim 10, characterised in that the optical fibre is part of the temple, the other end of the optical fibre receiving a sequence of optical images projected by a lens positioned before the spatial light modulation device.
12. The display device of claim 1, wherein the data image that produces the two-dimensional optical image is at a first refresh rate and a first resolution, and two consecutive two-dimensional optical images are displayed in the integrated optic, resulting in a combined composite optical image at a second refresh rate and a second resolution.
13. The display device of claim 13, wherein the first refresh rate = 2 × the second refresh rate and the first resolution = 1/2 × the second resolution.
14. The display device of claim 13, wherein the two consecutive two-dimensional optical images from the optical fibers are used to produce a composite optical image viewed by a viewer of the display device.
15. A method for a display device, the method comprising:
providing an eyeglass frame comprising at least one integrated lens and a temple attached to the eyeglass frame;
receiving an optical image;
modulating the optical image in amplitude and phase in a spatial light modulation device;
generating a hologram using the light intensity reflected by the spatial light modulation device irradiated by the uniform laser sheet; and
projecting the hologram into the integrated lens through a 90 degree rotation via a mirror, wherein the mirror is optically coated to selectively allow specific wavelengths to pass or reflect, a hologram produced by the modulated image being visible in the integrated lens by a user wearing the display device.
16. The method of claim 15, wherein the spatial light modulation device comprises a microdisplay, the method further comprising:
irradiating the uniform laser sheet onto the micro-display; and
amplitude and phase modulating the optical image from the uniform laser sheet.
17. The method according to claim 16, wherein the spatial light modulation device comprises a first set of pixels performing Amplitude Modulation (AM); and a second group of pixels performing Phase Modulation (PM), wherein the first group of pixels and the second group of pixels perform the AM and the PM via a liquid crystal layer sandwiched between a transparent electrode layer and a reflective electrode layer within a single array, wherein the reflective electrode includes an array of pixel electrodes each controlling one pixel, the reflective electrode being built on a silicon substrate.
18. The method of claim 17, wherein the first set of pixels is interleaved with the second set of pixels within the single array.