CN108267859B - Display equipment for displaying 3D multimedia - Google Patents


Info

Publication number
CN108267859B
Authority
CN
China
Prior art keywords: optical, images, image, sequence, prism
Legal status: Active
Application number: CN201810022405.3A
Other languages: Chinese (zh)
Other versions: CN108267859A
Inventor: 胡大文
Current Assignee: Individual
Original Assignee: Individual
Priority claimed from US15/405,067 (external priority, US10353213B2)
Application filed by Individual
Publication of CN108267859A
Application granted
Publication of CN108267859B

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/0101: Head-up displays characterised by optical features
    • G02B27/017: Head mounted
    • G02B27/0172: Head mounted characterised by optical features
    • G02B30/00: Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20: by providing first and second parallax images to an observer's left and right eyes
    • G02B30/22: of the stereoscopic type
    • G02B30/25: of the stereoscopic type using polarisation techniques
    • G02B2027/011: comprising device for correcting geometrical aberrations, distortion
    • G02B2027/0132: comprising binocular systems
    • G02B2027/0134: comprising binocular systems of stereoscopic type
    • G02B2027/014: comprising information/image processing systems
    • G02B2027/0178: Eyeglass type


Abstract

The invention describes a display device for displaying 3D multimedia. The display device includes: an image polarizer that receives a sequence of optical images and generates a sequence of alternately polarized images; an optical cube provided to receive the sequence of alternately polarized images, wherein the optical cube is sandwiched between two orthogonal polarizers and separates the sequence of alternately polarized images into two sequences of orthogonally polarized images; and a pair of projection mechanisms, one for each of a viewer's two eyes, wherein each projection mechanism receives one of the two sequences of orthogonally polarized images. The invention also describes the architecture and design of a wearable device for viewing multimedia in 3D. A separate housing is provided to generate the content for display on the eyewear. The optical cube separates the sequence of alternately polarized content into two sequences of orthogonally polarized images that are projected onto the two lenses, respectively.

Description

Display equipment for displaying 3D multimedia
Technical Field
The present invention relates generally to the field of display devices, and more particularly to the architecture and design of display devices that are manufactured in the form of a pair of glasses and may be used in a variety of applications, including 3D-capable virtual reality and augmented reality.
Background
Virtual reality, or VR, is generally defined as a realistic and immersive simulation of a three-dimensional environment, created using interactive software and hardware and experienced or controlled by movement of the body. A person using a virtual reality device is typically able to look around the artificially generated three-dimensional environment, walk around in it, and interact with features or items depicted on a screen or in goggles. Virtual reality artificially creates a sensory experience that may include sight, touch, hearing and, less frequently, smell.
Augmented reality (AR) is a technology that layers computer-generated enhancements on top of an existing reality to make it more meaningful through the ability to interact with it. AR is developed into apps and used on mobile devices to blend digital components into the real world in such a way that they augment each other but can also be easily told apart. AR technology is quickly becoming mainstream. It is used to display score overlays on televised sports games and to pop out 3D emails, photos or text messages on mobile devices. Leaders of the industry are also using AR to do exciting and revolutionary things with holograms and motion-activated commands.
Virtual reality and augmented reality differ in their methods of delivery. As of 2016, most virtual reality was displayed on a computer monitor, a projector screen, or a virtual reality headset (also known as a head-mounted display or HMD). HMDs typically take the form of head-mounted goggles with a screen in front of the eyes. Virtual reality actually brings the user into the digital world by cutting off outside stimuli, so that the user is focused only on the digital content being displayed in the HMD. Augmented reality is increasingly used on mobile devices such as laptops, smartphones and tablets to change how the real world and digital images and graphics intersect and interact.
Indeed, it is not always VR versus AR, as the two do not always operate independently of each other; in fact, they are often blended together to generate a more immersive experience. For example, haptic feedback, that is, vibrations and sensations added to interaction with graphics, is considered an augmentation. However, it is commonly used within virtual reality settings in order to make the experience more lifelike through touch.
Virtual reality and augmented reality are prominent examples of experiences and interactions fueled by the desire to become immersed in a simulated environment for entertainment and play, or to add a new dimension to the interaction between digital devices and the real world. Alone or blended together, they are undoubtedly opening up both the real and virtual worlds alike.
FIG. 1 shows exemplary goggles commonly found on the market today for applications delivering or displaying VR or AR. Regardless of their designs, the goggles appear bulky and cumbersome, and are inconvenient to wear. Furthermore, most goggles are not see-through: when a user wears the goggles, he or she cannot see through them or do anything else. Therefore, there is a need for a device that can display VR and AR while still allowing the user to perform other tasks when needed.
Various wearable devices are being developed for VR/AR and holographic applications. FIG. 2 shows a schematic of the HoloLens from Microsoft. It weighs 579 g (1.2 lbs). At this weight, a wearer feels uncomfortable after a period of wearing it. Indeed, the products available on the market are generally heavy and bulky compared with normal eyeglasses. Thus, there is a further need for a wearable AR/VR viewing or display device that looks similar to a pair of ordinary eyeglasses yet can be made with a smaller footprint, enhanced impact performance, lower-cost packaging, and an easier manufacturing process.
Most wearable AR/VR viewing or display devices are capable of displaying 3D videos or images based on 3D content. There is thus also a need for a pair of see-through glasses capable of presenting a 3D display.
Disclosure of Invention
This section is intended to summarize some aspects of the present invention and to briefly introduce some preferred embodiments. Simplifications or omissions in this section as well as in the abstract and the title may be made to avoid obscuring the purpose of the section, the abstract and the title. Such simplifications or omissions are not intended to limit the scope of the present disclosure.
The present invention relates generally to the architecture and design of wearable devices for virtual reality and augmented reality applications. According to one aspect of the present invention, a display device is manufactured in the form of a pair of glasses and includes a minimum number of parts to reduce its complexity and weight. A separate case or housing is provided that is portable or attachable to a user (e.g., in a pocket or on a belt). The housing contains all the necessary parts and circuits to generate content for virtual reality and augmented reality applications, so that only a minimum number of parts are needed on the eyewear itself, resulting in a smaller footprint for the eyewear, enhanced impact performance, lower-cost packaging, and an easier manufacturing process. The content is optically picked up by a fiber optic cable and carried through the optical fibers in the cable to the glasses, where the content is respectively projected onto specially made lenses to display the content before the eyes of a wearer.
The main technical scheme of the invention is as follows:
A display device for displaying 3D multimedia, characterized in that the display device includes: an image polarizer that receives a sequence of optical images and generates a sequence of alternately polarized images; an optical cube provided to receive the sequence of alternately polarized images, wherein the optical cube is sandwiched between two orthogonal polarizers and separates the sequence of alternately polarized images into two sequences of orthogonally polarized images; and a pair of projection mechanisms, one for each of the two eyes of a human viewer, wherein each of the projection mechanisms receives one of the two sequences of orthogonally polarized images.
Preferably, the display device further includes: at least one lens, and a fiber optic cable comprising at least one optical fiber to carry the optical image from one end of the fiber optic cable to the other end of the fiber optic cable by total internal reflection in the optical fiber, wherein the optical image is projected by the at least one lens onto the image polarizer.
Preferably, the image polarizer is an active shutter.
Preferably, the image polarizer comprises a liquid crystal layer sandwiched between two transparent layers, wherein the liquid crystal layer is applied with an electric field.
Preferably, each of the projection mechanisms includes a prism that receives the one of the two sequences of orthogonally polarized images at a first edge of the prism, and the one of the two sequences of orthogonally polarized images is viewed by one of the eyes only from a second edge of the prism.
Preferably, each of the projection mechanisms further comprises an optical correction lens integrated with the prism to correct an optical path exiting from the prism.
Preferably, the prism and the optical corrector are stacked such that a user sees through the integrated lens without optical distortion.
Preferably, each of the projection mechanisms includes a set of variable focusing elements, at least one of which is adjustable to focus the optical image from the optical fiber onto the first edge of the prism.
Preferably, each of the projection mechanisms comprises an optical waveguide that receives the one of the two sequences of orthogonally polarized images projected onto one side of the optical waveguide and propagates the one of the two sequences of orthogonally polarized images to the other side of the optical waveguide where the one of the two sequences of orthogonally polarized images is seen.
Preferably, a portion of the optical cable is enclosed within or attached to a portion of an article of clothing.
Glasses for displaying 3D multimedia, characterized in that the glasses comprise: a pair of lenses; a bridge disposed between the lenses; an image polarizer that receives a sequence of optical images and generates a sequence of alternately polarized images, wherein the sequence of optical images is carried by an optical cable comprising optical fibers; and an optical cube disposed near or on the bridge to receive the sequence of alternately polarized images, wherein the optical cube, sandwiched between two orthogonal polarizers, splits the sequence of alternately polarized images into two sequences of orthogonally polarized images, each of which is projected into an edge of one of the two lenses.
Preferably, the optical fiber is provided to carry the optical image from one end thereof to the other end thereof by total internal reflection within the optical fiber, and wherein the optical image comes from a microdisplay.
Preferably, the two sequences of orthogonally polarized images are respectively projected onto the two lenses.
Preferably, the two sequences of orthogonally polarized images are respectively projected onto edges of the two lenses.
Preferably, each of the lenses includes a prism that receives the one of the two sequences of orthogonally polarized images from a first edge of the prism, the one of the two sequences of orthogonally polarized images being viewed by one of the eyes via the prism only.
Preferably, each of the lenses further comprises an optical correction lens integrated with the prism to correct the optical path exiting from the prism.
According to another aspect of the invention, the eyewear (i.e., the lenses therein) and the housing are coupled by an optical cable including at least one optical fiber, wherein the optical fiber carries the content or optical image from one end of the fiber to the other end thereof by total internal reflection within the fiber. The optical image is picked up by a focusing lens from a microdisplay in the housing.
According to yet another aspect of the invention, each of the lenses includes some form of prism that propagates an optical image projected onto one edge of the prism along an optical path so that the optical image is viewable by a user. The prism is also integrated with or stacked on an optical correction lens whose shape is complementary or reciprocal to that of the prism, to form an integrated lens of the eyewear. The optical correction lens is provided to correct the optical path coming out of the prism, allowing the user to see through the integrated lens without optical distortion.
According to yet another aspect of the invention, each of the lenses includes an optical waveguide that propagates an optical image being projected onto one end of the waveguide to the other end using an optical path such that the optical image is viewable by a user. The waveguide may also be integrated with or stacked on an optical correction lens to form an integrated lens of the eyewear.
According to yet another aspect of the invention, the integrated lens may be further coated with one or more films having optical properties that magnify the optical image in front of the user's eye.
According to yet another aspect of the invention, the glasses include a number of electronic devices (e.g., sensors or microphones) to enable various interactions between the wearer and the displayed content. Signals captured by a device (e.g., a depth sensor) are transmitted to the housing wirelessly (e.g., via RF or Bluetooth) to avoid a wired connection between the glasses and the housing.
According to yet another aspect of the invention, instead of using two optical cables to carry images from two microdisplays, a single optical cable is used to carry images from one microdisplay. The optical cable may go through either of the temples of the glasses. A splitting mechanism, placed near or right on the bridge of the glasses, is used to split an image into two versions, one for the left lens and one for the right lens. The two images are then respectively projected into a prism or waveguide that may be used in each of the two lenses.
According to yet another aspect of the invention, the fiber optic cable is enclosed within or attached to a functional multilayer structure forming part of an article of clothing. When a user wears a shirt made or designed according to one of the embodiments, the cable itself adds little weight and the user can engage in more activities.
According to yet another aspect of the invention, the eyewear includes a pair of two different (e.g., orthogonal) polarizers to display one polarized image on one of the two lenses and the other polarized image on the other of the two lenses, so that the wearer of the eyewear can view the multimedia in 3D.
The present invention may be embodied as devices, methods, and portions of systems. Different embodiments may yield different benefits, objects and advantages. In one embodiment, the present invention is a display device for displaying 3D multimedia, the display device comprising: an image polarizer that receives a sequence of optical images and generates a sequence of alternately polarized images; and an optical cube provided to receive the sequence of alternately polarized images, wherein the optical cube is sandwiched between two orthogonal polarizers and separates the sequence of alternately polarized images into two sequences of orthogonally polarized images. The display device further includes a pair of projection mechanisms, one for each of the two eyes of a human viewer, wherein each of the projection mechanisms receives one of the two sequences of orthogonally polarized images.
According to another embodiment, the present invention is a display apparatus for displaying 3D multimedia, the display apparatus comprising: a pair of lenses; a bridge disposed between the lenses; and an image polarizer that receives a sequence of optical images and generates a sequence of alternately polarized images, wherein the sequence of optical images is carried by an optical cable comprising optical fibers. The display apparatus further includes an optical cube disposed near or on the bridge to receive the sequence of alternately polarized images, wherein the optical cube, sandwiched between two orthogonal polarizers, splits the sequence of alternately polarized images into two sequences of orthogonally polarized images, each of which is projected into an edge of one of the two lenses.
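The time-sequential polarization scheme above, in which an image polarizer tags successive frames with alternating polarization states and two orthogonal polarizers route each state to one eye, can be sketched in a few lines of Python. This is a minimal logical model only; the frame labels and function names are illustrative and not part of the patent:

```python
def polarize_alternately(frames):
    """Tag successive optical images with alternating polarization states
    (here labeled "H" and "V"), as an active-shutter image polarizer would."""
    return [(f, "H" if i % 2 == 0 else "V") for i, f in enumerate(frames)]

def split_by_polarizer(tagged):
    """Two orthogonal polarizers each pass only one polarization state,
    yielding the two image sequences delivered to the left and right eyes."""
    left = [f for f, p in tagged if p == "H"]
    right = [f for f, p in tagged if p == "V"]
    return left, right

# Interleaved stereo content: left frame, right frame, left frame, ...
frames = ["L0", "R0", "L1", "R1"]
left, right = split_by_polarizer(polarize_alternately(frames))
print(left, right)  # → ['L0', 'L1'] ['R0', 'R1']
```

In the actual device the routing is done optically by the polarizers on the optical cube; the sketch only models the demultiplexing of an interleaved stereo sequence.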
Together with the foregoing, there are many other objects that are attained in the practice of the invention as described below and that result in the embodiments illustrated in the accompanying drawings.
Drawings
These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
FIG. 1 shows exemplary goggles as commonly found today in the market for applications delivering or displaying VR or AR;
FIG. 2 shows a schematic diagram of HoloLens from Microsoft;
FIG. 3 shows an exemplary pair of glasses that may be used for VR applications according to one embodiment of the present invention;
FIG. 4 illustrates the use of an optical fiber to transport light from one location to another along a curved path in a more efficient manner or by total internal reflection within the fiber;
FIGS. 5 and 6 show two exemplary ways of encapsulating an optical fiber or fibers according to one embodiment of the invention;
FIG. 7 shows how an image is conveyed from the microdisplay to an imaging medium via a fiber optic cable;
FIG. 8 shows an exemplary set of Variable Focusing Elements (VFEs) to accommodate adjustment of the projection of an image onto an optical object (e.g., an imaging medium or prism);
FIG. 9 shows an exemplary lens that may be used in the eyewear shown in FIG. 3, wherein the lens comprises two portions: an arbitrary shaped prism for VR and additional optical corrective lenses or an arbitrary shaped corrector when AR is needed;
FIG. 10 shows internal reflections from multiple sources (e.g., a sensor, an imaging medium, and multiple light sources) in an irregular prism;
FIG. 11 shows a comparison of this integrated lens with a coin and a ruler;
FIG. 12 shows a shirt with the fiber optic cable enclosed within or attached to the shirt;
FIG. 13 shows how three single color images are visually combined and perceived by human vision as a full-color image;
FIG. 14 shows the generation of three different color images under three lights at wavelengths λ1, λ2 and λ3, respectively, wherein the imaging medium comprises three films, each film coated with one type of phosphor;
FIG. 15 shows that there are three color laser sources being driven by one or more MEMS to scan a defined area (e.g., a screen);
FIG. 16 shows the use of a waveguide to carry an optical image from one end of the waveguide to its other end;
FIG. 17 shows an exemplary functional block diagram that may be used in a separate shell or housing to generate content for virtual reality and augmented reality for display on the exemplary eyewear of FIG. 3;
FIG. 18 shows a modified version of FIG. 3, in which an image propagated or carried by a fiber optic cable is split into two portions (e.g., left and right images) using a splitting (or separating) mechanism;
FIG. 19 shows an exemplary splitting (separating) mechanism according to one embodiment of the present invention;
FIG. 20 shows a functional block diagram for displaying multimedia (e.g., graphics, objects, images, or video) in 3D according to one embodiment of the invention;
FIG. 21 shows an example of an implementation for the electronics section in FIG. 20;
FIG. 22 shows polarizing a sequence of images using a liquid crystal panel acting as an active shutter; and
FIG. 23 shows an exemplary implementation of the optics block of FIG. 20, according to one embodiment of the invention.
Detailed Description
The detailed description of the present invention is presented largely in terms of procedures, steps, logic blocks, processing, and other symbolic representations that directly or indirectly resemble the operations of data processing devices coupled to networks. These process descriptions and representations are typically used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.
Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, the order of the blocks in a process flow diagram or drawing representing one or more embodiments of the invention is not inherently indicative of any particular order nor does it imply any limitations in the invention.
Embodiments of the present invention are discussed herein with reference to fig. 3-19. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.
Referring now to the drawings, in which like numerals refer to like parts throughout the several views. FIG. 3 shows an exemplary pair of glasses 200 that may be used for VR/AR applications according to one embodiment of the present invention. The glasses 200 appear no different from a normal pair of glasses, but include two flexible cables 202 and 204 extending respectively from the temples 206 and 208. According to one embodiment, each of the two flexible cables 202 and 204 is integrally or detachably connected at one end to the corresponding temple 206 or 208 and includes one or more optical fibers.
Both flexible cables 202 and 204 are coupled at their other ends to a portable computing device 210, where the computing device 210 generates images, displayed on a microdisplay, that are picked up by the cables. An image is carried via total internal reflection through the optical fibers in the flexible cables up to the other end of the fibers, where it is projected onto the lenses in the glasses 200.
According to one embodiment, each of the two flexible cables 202 and 204 includes one or more optical fibers. Optical fibers are used to transport light from one place to another along a curved path in a more efficient manner, namely by total internal reflection within the fiber, as shown in FIG. 4. In one embodiment, the optical fiber is formed from thousands of strands of very fine-quality glass or quartz with an index of refraction of about 1.7. The thickness of a strand is very small. The strands are coated with a layer of material of lower refractive index. The ends of the strands are polished and clamped firmly after being carefully aligned. When light is incident on one end at a small angle, it is refracted into the strands (or fibers) and falls on the interface between the fiber and the coating. For angles of incidence greater than the critical angle, the light ray undergoes total internal reflection, so the fiber essentially carries the light from one end to the other even when it is bent. Depending on the embodiment of the present invention, a single optical fiber or a plurality of optical fibers arranged in parallel may be used to carry an optical image projected onto one end of the fiber(s) to the other end thereof.
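The total-internal-reflection condition described above follows from Snell's law: light striking the core/cladding interface at an angle greater than the critical angle θc, where sin θc = n_clad / n_core, is reflected back into the core. A minimal sketch, assuming an illustrative cladding index of 1.5 for the roughly 1.7-index core mentioned above (the 1.5 value is an assumption, not from the patent):

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle in degrees at the core/cladding interface, from
    Snell's law: sin(theta_c) = n_clad / n_core. Total internal reflection
    occurs for angles of incidence greater than theta_c."""
    if n_clad >= n_core:
        raise ValueError("TIR requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))

# Core index ~1.7 as in the text; cladding index 1.5 is illustrative.
theta_c = critical_angle_deg(1.7, 1.5)
print(f"critical angle ~ {theta_c:.1f} degrees")  # rays steeper than this are guided
```

Any ray meeting the interface at more than this angle stays trapped in the core, which is why the image survives bends in the cable.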
FIGS. 5 and 6 show two exemplary ways of encapsulating one or more optical fibers according to one embodiment of the invention. The encapsulated optical fiber may be used as the cable 202 or 204 in FIG. 3 and extends through the non-flexible temple 206 or 208 to its end. According to one embodiment, the temples 206 and 208 are made of a material (e.g., plastic or metal) commonly used in ordinary eyeglasses, with a portion of the cable 202 or 204 embedded or integrated in the temple 206 or 208, thereby creating a non-flexible portion, while the other portion of the cable 202 or 204 remains flexible. According to another embodiment, the non-flexible portion and the flexible portion of the cable 202 or 204 may be detachably connected via some type of interface or connector.
Referring now to FIG. 7, it shows how an image is carried from a microdisplay 240 to an imaging medium 244 via a fiber optic cable 242. As will be further described below, the imaging medium 244 may be a solid object (e.g., a film) or a non-solid object (e.g., air). A microdisplay is a display with a very small screen (e.g., less than one inch). Tiny electronic display systems of this kind were introduced commercially in the late 1990s. The most common applications of microdisplays include rear-projection TVs and head-mounted displays. A microdisplay may be reflective or transmissive, depending on how light is allowed to pass through the display element. An image (not shown) displayed on the microdisplay 240 is picked up by one end of the fiber optic cable 242 via a lens 246, and the fiber optic cable 242 carries the image to its other end. Another lens 248 is provided to collect the image from the fiber optic cable 242 and project it onto the imaging medium 244. Depending on the embodiment, there are different types of microdisplays and imaging media; some embodiments thereof are described in detail below.
FIG. 8 shows an exemplary set of variable focusing elements (VFEs) 250 to accommodate the adjustment needed to project an image onto an optical object (e.g., an imaging medium or a prism). To facilitate the description of various embodiments of the present invention, an imaging medium is assumed to be present. As shown in FIG. 8, an image 252 carried by the fiber optic cable arrives at an end face 254 of the cable. The image 252 is focused onto an imaging medium 258 by a set of lenses 256, referred to herein as the variable focusing elements (VFE). The VFE 256 is provided to be adjusted to ensure that the image 252 is focused accurately onto the imaging medium 258. Depending on the implementation, the adjustment of the VFE 256 may be performed manually or automatically based on an input (e.g., a measurement obtained from a sensor). According to one embodiment, the adjustment of the VFE 256 is performed automatically from a feedback signal derived from a sensing signal generated by a sensor looking at an eye (pupil) of a wearer wearing the glasses 200 of FIG. 3.
Referring now to FIG. 9, it shows an exemplary lens 260 that may be used in the eyewear shown in FIG. 3. The lens 260 includes two parts: a prism 262 and an optical correction lens or corrector 264. The prism 262 and the corrector 264 are stacked to form the lens 260. As the name suggests, the corrector 264 is provided to correct the optical path coming out of the prism 262 so that light passing through the prism 262 goes through the corrector 264. In other words, light refracted by the prism 262 is corrected, or un-refracted, by the corrector 264. In optics, a prism is a transparent optical element with flat, polished surfaces that refract light, at least two of which must form an angle between them. The exact angle between the surfaces depends on the application. The traditional geometrical shape is a triangular prism with a triangular base and rectangular sides; in colloquial use, "prism" usually refers to this type. A prism can be made from any material that is transparent to the wavelengths for which it is designed; typical materials include glass, plastic and fluorite. According to one embodiment, the prism 262 is not actually in the shape of a geometric prism, so the prism 262 is referred to herein as a freeform or arbitrarily-shaped prism, which dictates that the corrector 264 take a shape complementary, reciprocal or conjugate to that of the prism 262 so as to form the lens 260.
On one edge of the lens 260, namely an edge of the prism 262, there are at least three items that make use of the prism 262. Referenced by 267 is an imaging medium corresponding to the imaging medium 244 of fig. 7 or the imaging medium 258 of fig. 8. Depending on the embodiment, the image carried by the fiber 242 of fig. 7 may be projected directly onto the edge of the prism 262, or formed on the imaging medium 267 before being projected onto the edge of the prism 262. In either case, the projected image is refracted within the prism 262 according to its shape and then seen by the eye 265. In other words, a user wearing a pair of eyeglasses employing the lens 260 sees an image displayed via or in the prism 262.
A sensor 266 is provided to image the position or movement of the pupil of the eye 265. Again, thanks to the refraction provided by the prism 262, the position of the pupil can be seen by the sensor 266. In operation, an image of the eye 265 is captured and analyzed to derive where the pupil is looking in the image shown via or in the lens 260. In AR applications, the position of the pupil may be used to activate an action. Optionally, a light source 268 is provided to illuminate the eye 265 to facilitate the image capture by the sensor 266. According to one embodiment, the light source 268 is a near-infrared source, so that the user's eye 265 is not disturbed by the light source 268 when it is on.
Fig. 10 shows the internal reflections from multiple sources (e.g., the sensor 266, the imaging medium 267, and the light source 268). Because the prism 262 is uniquely designed with a particular shape and particular edges, rays from a source reflect several times within the prism 262 before impinging on the eye 265. For completeness, fig. 11 shows the dimensions of such a lens compared with a coin and a straight edge.
As described above, there are different types of microdisplays and, therefore, different imaging media. The following table summarizes some of the microdisplays that may be used to facilitate the creation of an optical image that may be transported by one or more optical fibers from one end to the other by total internal reflection within the optical fibers.
[Table: exemplary microdisplay configurations, as enumerated in the following paragraphs]
Case 1: LCoS microdisplay producing a full-color image directly.
Case 2: LCoS with sequential color light sources (e.g., red, green, and blue).
Case 3: SLM + laser, RGB sequential.
Case 4: SLM + laser, non-visible.
Case 5: MEMS-scanned color lasers, turned on sequentially.
Case 6: MEMS-scanned color lasers, turned on simultaneously.
Case 7: MEMS-scanned lasers nearly invisible to the human eye.
LCoS = liquid crystal on silicon;
LCD = liquid crystal display;
OLED = organic light-emitting diode;
RGB = red, green, and blue;
SLM = spatial light modulation; and
MEMS = micro-electro-mechanical system (e.g., micro-mirror DLP).
In the first case shown in the above table, a full color image is actually displayed on silicon. As shown in fig. 7, a full color image can be picked up by a focusing lens or a set of lenses that project the full image onto just one end of the optical fiber. The image is carried within the fiber and picked up again by another focusing lens at the other end of the fiber. Because the conveyed image is visible and full color, the imaging medium 244 of fig. 7 may not be physically needed. The color image may be projected directly onto one edge of the prism 262 of fig. 9.
In the second case shown in the above table, the LCoS is used with different light sources. In particular, at least three color light sources (e.g., red, green, and blue) are used sequentially. In other words, each light source produces a single-color image, so the image picked up by the optical fiber is only a single-color image. A full-color image can be reproduced when the three different single-color images are combined. The imaging medium 244 of fig. 7 is provided to reproduce a full-color image from the three different single-color images respectively carried by the optical fibers.
Fig. 12 shows a shirt 270 with a fiber optic cable 272 enclosed within, or attached to, the shirt 270. The shirt 270 is merely one example: the relatively thin cable may be embedded in (e.g., between layers of) the garment material(s). When a user wears a shirt made or designed according to one of the embodiments, the cable adds little weight and the user remains free to move about.
Fig. 13 shows how three single-color images 302 are visually combined and perceived by human vision as a full-color image 304. According to one embodiment, three colored light sources are used, such as red, green, and blue light sources that are turned on sequentially. More specifically, when the red light source is turned on, only a red image is produced (e.g., from a microdisplay). The red image is then optically picked up, carried by the optical fiber, and projected into the prism 262 of fig. 9. As the green and blue light sources are in turn sequentially turned on, green and blue images are generated, carried separately by the optical fiber, and projected into the prism 262 of fig. 9. Human vision possesses the well-known ability to fuse three single-color images and perceive them as one full-color image. With all three single-color images sequentially projected into the prism and perfectly aligned, the eye sees a full-color image.
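As a toy model of this visual fusion, each sequentially projected frame can be treated as a 2-D array of intensities for one primary color; fusing the frames then amounts to zipping the three aligned planes into per-pixel (R, G, B) triples. The function name and data layout below are illustrative assumptions, not the patent's implementation.

```python
def fuse_frames(red, green, blue):
    """Combine three aligned single-color frames (lists of rows of
    intensities) into one full-color frame of (R, G, B) pixels,
    mimicking how human vision fuses the sequential images."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(red, green, blue)
    ]


# One-pixel frames: a red, a green and a blue intensity fuse into one color.
full = fuse_frames([[255]], [[127]], [[63]])  # [[(255, 127, 63)]]
```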
Also, in the second case shown above, the light sources may be nearly or entirely invisible. According to one embodiment, the three light sources generate light close to the UV band. Under this illumination, three different color images can still be generated and transported, but they are not fully visible. Before such an image can be presented to the eye or projected into the prism, it must be converted into one of the three primary-color images, which can then be perceived together as a full-color image. According to one embodiment, the imaging medium 244 of fig. 7 is provided for this conversion. Fig. 14 shows the generation of three different color images 310 under three light sources at wavelengths λ1, λ2, and λ3, respectively. The imaging medium 312 comprises three films 314, each coated with one type of phosphor, i.e., a substance that exhibits luminescence. In one embodiment, three types of phosphors at wavelengths 405 nm, 435 nm, and 465 nm are used to convert the three different color images produced under the three light sources near the UV band. In other words, when one such color image is projected onto the phosphor-coated film at a wavelength of 405 nm, the single-color image is converted into a red image that is then focused and projected into the prism. The same process holds for the other two single-color images, which pass through the phosphor-coated films at wavelengths 435 nm and 465 nm to produce the green and blue images. When the red, green, and blue images are sequentially projected into the prism, they are perceived together by human vision as a full-color image.
In the third or fourth case shown in the above table, instead of using light in the visible spectrum or light near-invisible to the human eye, the light source is a laser source; lasers are available in both visible and non-visible versions. Without much difference in operation from the first and second cases, the third or fourth case uses Spatial Light Modulation (SLM) to form a full-color image. "Spatial light modulator" is a general term describing devices that modulate the amplitude, phase, or polarization of light waves in space and time. In other words, an SLM with sequential RGB lasers can produce three separate color images, from which, with or without an imaging medium, a full-color image can be reproduced. In the case of an SLM with a non-visible laser, an imaging medium must be present to convert the non-visible images into a full-color image, in which case appropriate films may be used as shown in fig. 14.
In the fifth case shown in the above table, the optical image is produced by three color sources, such as red, green, and blue lasers. Depending on the display content, the three color sources are sequentially turned on to scan a predefined area, showing color pixels or images that are then captured and focused onto one end of the optical fiber. Fig. 15 shows three color laser sources 320, 322, and 324 driven by one or more MEMS 326 (micro-electro-mechanical systems) to scan a defined area (e.g., a screen), where the intensity of each of the laser beams from the sources 320, 322, and 324 corresponds to one of the three component colors in the image. For example, for a color pixel with the color values (R, G, B) = (255, 127, 63), the corresponding intensity ratios of the three color lasers are (3, 2, 1): in operation, the red laser emits a red beam of intensity I, the green laser emits a green beam of intensity 1/2 I, and the blue laser emits a blue beam of intensity 1/3 I. In one embodiment, the intensities of the laser beams may be adjusted in a combined manner to reproduce the hue.
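One simple reading of the intensity example above is that each laser is driven in proportion to its 8-bit color component. The sketch below uses that plain proportional rule; it is a simplified model, and the patent's own bookkeeping (I, 1/2 I, 1/3 I) may differ in detail. The function name is an assumption.

```python
def laser_drive(pixel, full_power=1.0):
    """Return (red, green, blue) beam intensities for one pixel,
    scaling each laser linearly with its 8-bit color component.
    Simplified proportional model, not the patent's exact ratios."""
    r, g, b = pixel
    return (full_power * r / 255,
            full_power * g / 255,
            full_power * b / 255)


drive = laser_drive((255, 127, 63))  # red at full power, green and blue dimmer
```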
When a beam is switched on, it is controlled by a driver to scan a defined area. In one embodiment, the driver is a MEMS-mounted or MEMS-actuated mirror, MEMS being miniaturized mechanical and electromechanical elements (i.e., devices and structures) fabricated using microfabrication techniques. As the MEMS is controlled, the beam is caused to scan across the defined area. With all three lasers scanning sequentially, an optical color image is formed and captured for transmission by the optical fiber to the glasses.
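The MEMS-steered sweep over the defined area can be modeled as a simple raster order. This generator is a minimal sketch; the actual scan pattern (e.g., bidirectional or Lissajous) and the names used here are assumptions.

```python
def raster_scan(width, height):
    """Yield (x, y) beam positions in raster order: left to right
    within each row, rows from top to bottom, as a MEMS-driven
    mirror might sweep the defined area."""
    for y in range(height):
        for x in range(width):
            yield x, y


path = list(raster_scan(2, 2))  # [(0, 0), (1, 0), (0, 1), (1, 1)]
```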
In the sixth case shown in the above table, instead of turning on the three color sources sequentially, the three color sources are turned on and scanned simultaneously, likewise producing optical color images.
In the seventh case shown in the above table, instead of visible laser light, the light source uses a laser source that is nearly invisible to the human eye. The operation differs little from that of the preceding cases, except that an imaging medium is required to convert the non-visible images into a full-color image, in which case appropriate films may be used as shown in fig. 14.
Referring now to fig. 16, a waveguide 400 is shown carrying an optical image 402 from one end 404 to the other end 406, where the waveguide 400 may be stacked with one or more pieces of glass or lenses (not shown), or coated with one or more films, to form a suitable lens for a pair of glasses for an application displaying images from a computing device. As known to those skilled in the art, an optical waveguide is a spatially inhomogeneous structure for guiding light, i.e., for confining the spatial region in which light can propagate; the waveguide contains a region of increased refractive index compared with the surrounding medium (often referred to as the cladding).
The waveguide 400 is transparent and appropriately shaped at the end 404 to allow the image 402 to propagate along the waveguide 400 to the end 406, where a user 408 can look through the waveguide 400 to see the propagated image 410. According to one embodiment, one or more films are disposed on the waveguide 400 to magnify the propagated image 410 so that the eye 408 sees a clearly magnified image 412. One example of such a film is known as a metamaterial, essentially an array of thin titanium dioxide nanofins on a glass substrate.
Referring now to fig. 17, an exemplary functional block diagram 500 is shown that may be used in a separate shell or housing to generate content for virtual reality and augmented reality for display on the exemplary eyewear of fig. 3. As shown in fig. 17, there are two micro-displays 502 and 504 provided to supply content to the two lenses in the glasses of fig. 3, essentially a left image going to the left lens and a right image going to the right lens. Examples of content are 2D or 3D images and video or holograms. Each of the micro-displays 502 and 504 is driven by a corresponding driver 506 or 508.
The entire circuit 500 is controlled and driven by a controller 510 programmed to produce content. According to one embodiment, the circuit 500 is designed to communicate with the Internet (not shown), receiving the content from other devices. In particular, the circuit 500 includes an interface to receive a sensing signal from a remote sensor (e.g., sensor 266 of fig. 9) via a wireless means (e.g., RF or bluetooth). The controller 510 is programmed to analyze the sensing signals and provide feedback signals to control specific operations of the eyewear, such as a projection mechanism that includes a focusing mechanism that automatically focuses and projects an optical image onto the edges of the prisms 262 of fig. 9. Further, audio is provided to synchronize with the content, and the audio may be wirelessly transmitted to the headset.
FIG. 17 shows an exemplary circuit 500 to generate content for display in a pair of glasses contemplated in one embodiment of the present invention. The circuit 500 shows that there are two micro-displays 502 and 504 for providing two respective image or video streams to the two lenses of the glasses in fig. 3. According to one embodiment, only one microdisplay may be used to drive both lenses of the glasses in fig. 3. Such circuitry is not provided herein because those skilled in the art know how to design the circuitry or how to modify the circuitry 500 of FIG. 17.
Given a video stream or an image, an advantage is that only one optical cable is required to carry the images. Fig. 18 shows a modified version 600 of fig. 3, in which a cable 602 couples the computing device 210 to the eyewear 208. Instead of using two optical cables to carry the images from two microdisplays as shown in fig. 3, a single optical cable carries the images from one microdisplay. The optical cable may pass through either of the temples of the eyeglasses and possibly further through a portion of one of the rims. A splitting mechanism, placed near or on the bridge (beam) of the eyeglasses, splits each image into two versions, one for the left lens and the other for the right lens. The two images are then projected into the respective prisms or waveguides used as the two lenses.
To split the image propagated or carried by the cable 602, the eyewear 600 is designed to include a splitting mechanism 604, preferably disposed near or at its bridge (beam). Fig. 19 shows an exemplary splitting mechanism 610 according to one embodiment of the present invention. A cube 612, also referred to as an X-cube beam splitter, for splitting incident light into two separate components, receives the images from the microdisplay via the cable 602. In other words, each image is projected onto one face of the X-cube 612. The X-cube 612 is internally coated with specific reflective materials to split the incident image into two parts, one going to the left and the other to the right, as shown in fig. 19. Each split image passes through a polarizer 614 or 616 to hit a reflector 618 or 620, which reflects the image onto a polarized mirror 626 or 628. The two polarizers 614 and 616 are polarized differently (e.g., horizontally and vertically, or circularly left and right), corresponding to images sequentially generated for the left or right eye. Coated with a particular reflective material, the polarized mirror 626 or 628 reflects the image to the corresponding eye. Depending on the implementation, the reflected image from the polarized mirror 626 or 628 may impinge on one edge of the prism 262 of fig. 9 or the waveguide 400 of fig. 16. Optionally, two waveplates 622 and 624 are disposed in front of the reflectors 618 and 620, respectively.
Referring now to FIG. 20, a functional block diagram 700 for displaying multimedia (e.g., graphics, objects, images, or video) in 3D according to one embodiment of the invention is shown. Referenced by 702 is the electrical/mechanical portion used to generate the multimedia. Fig. 21 shows an exemplary implementation of the electronics portion 702. In comparison with fig. 17, fig. 21 uses a single microdisplay 712 driven by a driver 714. According to one embodiment, the electronics portion 702 is enclosed in a housing carried or worn by the user. Because only one image source is generated in the electronics portion 702, only one fiber optic cable is required in block 704, the fiber optic and lens system.
Referenced by 706 is the polarization of the single-source images. Fig. 22 shows a liquid crystal panel 720 for polarizing an image. Liquid crystals are substances that flow like liquids, but whose molecules have a degree of ordering that can change the polarization of light waves passing through the liquid; the degree of change depends on the strength of the applied electric field. According to one embodiment, the liquid crystal panel 720 is sandwiched between two sheets of glass coated with a uniformly sputtered resistive substance, typically a metal compound called Indium Tin Oxide (ITO). When an image 722 carried by the optical fiber impinges on the liquid crystal panel 720, the image is polarized by the controlled liquid crystals in the panel. With the applied electric field switched via the power supply 724, the image 722 is polarized alternately into two forms: p-polarized images and s-polarized images. In principle, p-polarized light has its electric field parallel to the plane of incidence on the liquid crystal panel 720, while s-polarized light has its electric field perpendicular to the plane of incidence.
Depending on the video standard being used, the incoming images 722 arrive at an image frame rate F. By alternating the applied electric field, the polarized images leave at an image frame rate of 2F. In other words, when video comes in at 60 Hz, for example, the output stream 726 is a 120 Hz sequence of images with alternating polarizations, PSPSPSPS....
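The frame-rate doubling can be sketched as follows: each incoming frame is emitted twice, once per polarization state, so an input at rate F yields an alternating P/S output at 2F. This is a toy model of the switched panel's behavior, not its electronics; the tagged-tuple representation is an assumption for illustration.

```python
def polarize_stream(frames):
    """Double the frame rate by tagging each source frame with the two
    alternating polarization states the switched liquid-crystal panel
    produces: P, then S."""
    out = []
    for frame in frames:
        out.append(("P", frame))  # p-polarized copy of the frame
        out.append(("S", frame))  # s-polarized copy of the same frame
    return out


stream = polarize_stream(["f0", "f1"])  # PSPS... at twice the input rate
```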
FIG. 23 shows an implementation of block 708 of FIG. 20, according to one embodiment of the invention. An optical cube 732 (also referred to as an X-cube) is disposed near or on the bridge (beam) of eyeglasses implemented in accordance with one embodiment shown in figs. 3 and 9 or 16. The X-cube 732 is sandwiched between two polarizers 734 and 736. As the sequence of alternately polarized images (PSPSPS...) comes in and impinges on the X-cube 732, it is redirected in two directions by the two internal reflectors 733 and 755 of the X-cube 732. In operation, the sequence hits both internal reflectors 733 and 755; the P sequence is filtered out to the left, and the S sequence is filtered out to the right. Thus, one eye sees the P sequence and the other eye sees the S sequence. From the combination of the perceived P and S sequences, the human viewer sees a 3D effect.
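Functionally, the X-cube with its two orthogonal polarizers demultiplexes the alternating stream: P-polarized frames reach one eye and S-polarized frames the other. A minimal sketch, representing each frame as a (polarization, image) pair; the representation and names are assumptions for illustration.

```python
def split_by_polarization(ps_stream):
    """Model the X-cube plus orthogonal polarizers: frames tagged 'P'
    pass to the left eye, frames tagged 'S' to the right eye."""
    left = [frame for pol, frame in ps_stream if pol == "P"]
    right = [frame for pol, frame in ps_stream if pol == "S"]
    return left, right


left, right = split_by_polarization(
    [("P", "f0"), ("S", "f0"), ("P", "f1"), ("S", "f1")]
)
```

Each eye thus receives its own full sequence of frames at rate F, and the viewer's visual system fuses the two into a stereoscopic image.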
The invention has been described with a certain degree of particularity. It will be understood by those of skill in the art that the present disclosure of the embodiments is by way of example only, and that various changes in the arrangement and combination of parts may be made without departing from the spirit and scope of the invention. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description of the embodiments.

Claims (15)

1. A display device for displaying 3D multimedia, characterized by: the display device includes:
an image polarizer that receives a sequence of optical images at a first frame rate and produces a sequence of alternating polarized images at a second frame rate, the second frame rate being twice the first frame rate, the image polarizer comprising a liquid crystal layer sandwiched between two transparent layers, an electric field being applied to the liquid crystal layer, wherein each optical image of the sequence of optical images produces two orthogonally polarized images of the sequence of alternating polarized images by switching of the applied electric field;
an optical cube provided for receiving the sequence of alternating polarized images, wherein the optical cube is sandwiched between two orthogonal polarizers that separate the sequence of alternating polarized images into two sequences of orthogonally polarized images;
a pair of projection mechanisms for each of two eyes of a human, wherein each of the projection mechanisms receives one of the two orthogonal polarization image sequences.
2. The display device according to claim 1, wherein: it further comprises:
at least one lens, and
a fiber optic cable comprising at least one optical fiber to carry the optical image from one end of the fiber optic cable to the other end of the fiber optic cable by total internal reflection in the optical fiber, wherein the optical image is projected by the at least one lens onto the image polarizer.
3. The display device according to claim 2, wherein: the image polarizer is an active shutter.
4. The display device according to claim 1, wherein: each of the projection mechanisms includes a prism that receives the one of the two orthogonally polarized image sequences from a first edge of the prism, the one of the two orthogonally polarized image sequences being viewed by one of the eyes from only a second edge of the prism.
5. The display device according to claim 4, wherein: each of the projection mechanisms further includes an optical correction lens integrated with the prism to correct an optical path exiting the prism.
6. The display device according to claim 5, wherein: the stack of the prism and the optical correction lens allows a user to see through the integrated lens without optical distortion.
7. The display device according to claim 2, wherein: each of the projection mechanisms includes a set of variable focusing elements, at least one of which is adjustable to focus the optical image from the optical fiber onto a first edge of a prism.
8. The display device according to claim 1, wherein: each of the projection mechanisms includes an optical waveguide that receives the one of the two sequences of orthogonally polarized images projected onto one side of the optical waveguide and propagates the one of the two sequences of orthogonally polarized images to the other side of the optical waveguide where the one of the two sequences of orthogonally polarized images is seen.
9. The display device according to claim 2, wherein: a portion of the fiber optic cable is enclosed within or attached to a portion of an article of clothing.
10. A display device for displaying 3D multimedia, characterized in that the display device comprises:
a pair of lenses;
a beam disposed between the lenses;
an image polarizer that receives a sequence of optical images at a first frame rate and produces a sequence of alternating polarized images at a second frame rate, wherein the sequence of optical images is carried by an optical cable comprising optical fibers, the second frame rate is twice the first frame rate, the image polarizer comprising a liquid crystal layer sandwiched between two transparent layers, an electric field being applied to the liquid crystal layer, wherein each frame of the sequence of optical images produces two orthogonally polarized images in the sequence of alternating polarized images by switching of the applied electric field;
an optical cube disposed near or on the beam, receiving the sequence of alternating polarization images, wherein the optical cube sandwiched between two orthogonal polarizers splits the sequence of alternating polarization images into two sequences of orthogonal polarization images, each sequence of orthogonal polarization images projected into an edge of one of the two lenses.
11. The display device according to claim 10, wherein: the optical fiber is provided to carry the optical image from one end thereof to the other end thereof by total internal reflection in the optical fiber, and wherein the optical image is from a microdisplay.
12. The display device according to claim 11, wherein: the two orthogonal polarization image sequences are projected onto the lens respectively.
13. The display device according to claim 11, wherein: the two orthogonal polarization image sequences are projected onto the edge of the lens, respectively.
14. The display device according to claim 10, wherein: each of the lenses includes a prism that receives the one of the two sequences of orthogonally polarized images from a first edge of the prism, the one of the two sequences of orthogonally polarized images being viewed by one of the eyes via only the prism.
15. The display device according to claim 14, wherein: each of the lenses further includes an optical correction lens integrated with the prism to correct an optical path exiting the prism.
CN201810022405.3A 2017-01-12 2018-01-10 Display equipment for displaying 3D multimedia Active CN108267859B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/405,067 2017-01-12
US15/405,067 US10353213B2 (en) 2016-12-08 2017-01-12 See-through display glasses for viewing 3D multimedia

Publications (2)

Publication Number Publication Date
CN108267859A CN108267859A (en) 2018-07-10
CN108267859B true CN108267859B (en) 2021-08-20

Family

ID=62773399

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810022405.3A Active CN108267859B (en) 2017-01-12 2018-01-10 Display equipment for displaying 3D multimedia
CN201810022370.3A Active CN108732752B (en) 2017-01-12 2018-01-10 Display equipment for virtual reality and augmented reality

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810022370.3A Active CN108732752B (en) 2017-01-12 2018-01-10 Display equipment for virtual reality and augmented reality

Country Status (1)

Country Link
CN (2) CN108267859B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI728605B (en) * 2018-12-20 2021-05-21 中央研究院 Metalens for light field imaging
CN111240414B (en) * 2020-01-23 2021-03-09 福州贝园网络科技有限公司 Glasses waistband type computer device
US11994682B2 (en) 2020-10-26 2024-05-28 Megaforce Company Limited Projection glasses, projection temple structure, and modularize optical engine
CN113163191B (en) * 2021-03-30 2023-07-04 杭州小派智能科技有限公司 Split type short-focus VR equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5198928A (en) * 1991-06-26 1993-03-30 Hughes Aircraft Company Light weight binocular helmet visor display
CN103096112A (en) * 2012-10-30 2013-05-08 青岛海信电器股份有限公司 Two-dimension (2D)/three-dimension (3D) polarized light display method, polarized light display device and television
WO2013126454A1 (en) * 2012-02-23 2013-08-29 Lc-Tec Displays Ab Optical polarization state modulator assembly for use in stereoscopic three-dimensional image projection system
CN103688208A (en) * 2010-12-24 2014-03-26 奇跃公司 An ergonomic head mounted display device and optical system
CN104702938A (en) * 2010-10-01 2015-06-10 株式会社日本显示器西 3d image display device
CN105157576A (en) * 2015-05-27 2015-12-16 合肥工业大学 Laser measuring device and method capable of achieving three-dimensional displacement measurement

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09213101A (en) * 1995-11-27 1997-08-15 Matsushita Electric Works Ltd Portable light irradiation device
CN101661163A (en) * 2009-09-27 2010-03-03 合肥工业大学 Three-dimensional helmet display of augmented reality system
EP2817667B1 (en) * 2012-02-21 2019-03-20 Corning Optical Communications LLC Structures and method for thermal management in active optical cable (aoc) assemblies
JP6079268B2 (en) * 2013-01-29 2017-02-15 セイコーエプソン株式会社 Image display device
EP3296797B1 (en) * 2013-03-25 2019-11-06 North Inc. Method for displaying an image projected from a head-worn display with multiple exit pupils
BR112015029884A2 (en) * 2013-05-29 2017-07-25 Volfoni R&D optical polarization device for a stereoscopic imaging projector
JP6209456B2 (en) * 2013-05-31 2017-10-04 株式会社Qdレーザ Image projection apparatus and projection apparatus
US9664905B2 (en) * 2013-06-28 2017-05-30 Microsoft Technology Licensing, Llc Display efficiency optimization by color filtering
US10533850B2 (en) * 2013-07-12 2020-01-14 Magic Leap, Inc. Method and system for inserting recognized object data into a virtual world
JP2015148782A (en) * 2014-02-10 2015-08-20 ソニー株式会社 Image display device and display device
US20160196693A1 (en) * 2015-01-06 2016-07-07 Seiko Epson Corporation Display system, control method for display device, and computer program
NZ773814A (en) * 2015-03-16 2023-03-31 Magic Leap Inc Methods and systems for diagnosing and treating health ailments
CN105278108A (en) * 2015-09-10 2016-01-27 上海理鑫光学科技有限公司 Double-screen stereo imaging augmented reality system


Also Published As

Publication number Publication date
CN108267859A (en) 2018-07-10
CN108732752B (en) 2022-04-05
CN108732752A (en) 2018-11-02

Similar Documents

Publication Publication Date Title
US10353213B2 (en) See-through display glasses for viewing 3D multimedia
US8867139B2 (en) Dual axis internal optical beam tilt for eyepiece of an HMD
CN108267859B (en) Display equipment for displaying 3D multimedia
CN107870438B (en) Device, light engine component and the method for augmented reality
CN110196494B (en) Wearable display system and method for delivering optical images
US9946075B1 (en) See-through display glasses for virtual reality and augmented reality applications
TW202011080A (en) Augmented/virtual reality near eye display with edge imaging spectacle lens
WO2013142086A1 (en) Optical beam tilt for offset head mounted display
KR20140036351A (en) Compact see-through display system
US10823966B2 (en) Light weight display glasses
US20210294107A1 (en) Optical image generators using miniature display panels
US10725301B2 (en) Method and apparatus for transporting optical images
US11163177B2 (en) See-through display glasses with single imaging source
CN110967828A (en) Display system and head-mounted display device
CN110196495B (en) Light display device
CN110297327A (en) Using the lightweight display device of active optical cable
CN110286486B (en) Method for conveying optical images
US20190162967A1 (en) Light weight display glasses using an active optical cable
US20200018961A1 (en) Optical image generators using miniature display panels
US11002967B2 (en) Method and system for communication between a wearable display device and a portable device
US11231589B2 (en) Ultralight wearable display device
CN109963145A (en) Vision display system and method and head-wearing display device
CN114675418A (en) Ultra lightweight wearable display device and method for display device
CN115903235A (en) Display device for displaying hologram and method thereof
US20230107434A1 (en) Geometrical waveguide illuminator and display based thereon

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant