CN110196495B - Light display device - Google Patents

Light display device

Info

Publication number
CN110196495B
CN110196495B
Authority
CN
China
Prior art keywords
optical
image
conduit
display device
optical image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910408581.5A
Other languages
Chinese (zh)
Other versions
CN110196495A (en)
Inventor
胡大文 (Hu Dawen)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 15/996,499 (US10823966B2)
Application filed by Individual
Publication of CN110196495A
Application granted
Publication of CN110196495B

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01B CABLES; CONDUCTORS; INSULATORS; SELECTION OF MATERIALS FOR THEIR CONDUCTIVE, INSULATING OR DIELECTRIC PROPERTIES
    • H01B9/00 Power cables
    • H01B9/005 Power cables including optical transmission elements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

The invention describes the architecture and design of a wearable display device. According to one aspect of the invention, at least one optical conduit is embedded in or integrated with a temple of the wearable display device. The optical conduit is used to transport an optical image from one end to the other, where the optical image is generated by an image source (e.g., a microdisplay) according to image data. The microdisplay is powered through an active optical cable, over which it also receives the image data and control signals.

Description

Light display device
Technical Field
The present invention relates generally to the field of display devices, and more particularly to the architecture and design of display devices manufactured in the form of a pair of glasses, which may be used in a variety of applications including virtual reality and augmented reality.
Background
Virtual reality, or VR, is generally defined as a realistic and immersive simulation of a three-dimensional environment, created using interactive software and hardware and experienced or controlled by movement of the body. A person using a virtual reality device is typically able to look around the artificially generated three-dimensional environment, walk around in it, and interact with features or objects depicted on a screen or in goggles. Virtual reality artificially creates a sensory experience, which may include sight, touch, hearing, and, less commonly, smell.
Augmented Reality (AR) is a technology that layers computer-generated enhancements on top of existing reality to make it more meaningful through the ability to interact with it. AR is developed into applications and used on mobile devices to blend digital components into the real world in such a way that they augment each other but can still be easily told apart. AR technology is quickly becoming mainstream. It is used to display score overlays on televised sports games and to pop out 3D emails, photos, or text messages on mobile devices. Leaders of the technology industry are also using AR to do exciting and revolutionary things with holograms and motion-activated commands.
Separately, virtual reality and augmented reality differ in their delivery methods. As of 2016, most virtual reality is displayed on a computer monitor, a projector screen, or through a virtual reality headset (also known as a head-mounted display or HMD). HMDs typically take the form of head-mounted goggles with a screen in front of the eyes. Virtual reality actually brings the user into the digital world by cutting off external stimuli; in this way, the user is focused only on the digital content displayed in the HMD. Augmented reality is increasingly used on mobile devices, such as laptops, smartphones, and tablets, to change how the real world and digital images and graphics intersect and interact.
Indeed, VR and AR are not always at odds, as they do not always operate independently of one another; they are often blended together to create a more immersive experience. For example, haptic feedback, i.e., vibrations and sensations added to interaction with graphics, is considered an augmentation. Yet it is often used within virtual reality scenes to make the experience more realistic through touch.
Virtual reality and augmented reality are prominent examples of experiences and interactions driven by the desire to become immersed in a simulated platform for entertainment and gaming, or to add a new dimension to the interaction between digital devices and the real world. Alone or blended together, they are undoubtedly opening up worlds, both real and virtual.
Fig. 1A shows exemplary goggles common on the market today for delivering or displaying VR or AR applications. Regardless of their design, the goggles appear bulky and cumbersome, and are inconvenient for a user to wear. Furthermore, most goggles are not see-through; in other words, when the user wears the goggles, he or she cannot see or do anything else. Therefore, there is a need for a device that can display VR and AR and also allow the user to perform other tasks when needed.
Various wearable devices are being developed for VR/AR and holographic applications. Fig. 1B shows a simplified diagram of the HoloLens from Microsoft. At 579 g (1.2 lbs), it makes the wearer uncomfortable after a period of wear. In fact, the products available on the market are generally big and bulky compared with normal spectacles. Thus, there is a further need for a wearable AR/VR viewing or display device that looks similar to a pair of ordinary eyeglasses but also allows for a smaller footprint, enhanced impact performance, low-cost packaging, and an easier manufacturing process.
Many eyeglass-type display devices use a common design that places the image-forming component (e.g., an LCoS panel) in or near the front frame holding the lenses, in the hope of reducing image transmission loss and using fewer components. However, such designs often leave the eyeglass-type display unbalanced, with the front portion much heavier than the rear portion, adding pressure on the nose. Thus, there remains a need to distribute the weight of such display devices when they are worn by a user.
Regardless of how a wearable display device is designed, many components, wires, and even batteries must still be used to make the display device functional and operable. Although much effort has been made to move as many parts as possible into an attachable apparatus or housing that drives the display device from the user's waist or pocket, necessary parts such as copper wires must still be used to transmit various control signals and image data. These wires, typically in the form of cables, have weight that adds stress on the wearer of such a display device. Thus, there remains a need for a transmission medium that is as light as possible without sacrificing the desired functionality.
There are many other needs, not individually listed, which one of ordinary skill in the art would readily appreciate that one or more embodiments of the present invention detailed herein would clearly satisfy.
Disclosure of Invention
This section summarizes some aspects of the invention and briefly introduces some preferred embodiments. Simplifications or omissions may be made in this section as well as in the abstract and the title to avoid obscuring their purpose. Such simplifications or omissions are not intended to limit the scope of the present invention.
The present invention relates generally to the architecture and design of wearable devices that may be used for virtual reality and augmented reality applications. According to one aspect of the invention, a display device is made in the form of a pair of eyeglasses and includes a minimum number of parts to reduce its complexity and weight. A separate housing or enclosure is provided that is portable and may be attached to or carried by a user (e.g., in a pocket or on a belt). The housing contains all the necessary parts and circuitry to generate content for virtual reality and augmented reality applications, so that a minimum number of parts is required on the eyewear, making the eyewear smaller in footprint, enhanced in impact performance, lower in packaging cost, and easier to manufacture. The content is optically picked up by a fiber optic cable and delivered to the glasses through optical fibers in the cable, where the content is respectively projected onto specially made lenses to display the content in front of the eyes of the wearer.
According to another aspect of the invention, the glasses (i.e., the lenses therein) and the housing are coupled by an optical cable comprising at least one optical fiber, wherein the optical fiber is responsible for transporting the content or optical image from one end of the optical fiber to the other by total internal reflection within the fiber. The optical image is picked up by a focusing lens from a microdisplay in the housing.
According to yet another aspect of the invention, each lens comprises a prism formed such that it propagates an optical image, projected onto one edge of the prism, along an optical path where a user can see an image formed from the optical image. The prism is also integrated with, or stacked on, an optical correction lens that is complementary or reciprocal to the prism so as to form an integrated lens of the eyewear. The optical correction lens is provided to correct the optical path from the prism, allowing the user to view through the integrated lens without optical distortion.
According to yet another aspect of the invention, an exemplary prism is a waveguide. Each of the integrated lenses includes an optical waveguide that propagates an optical image projected onto one end of the waveguide to the other end through an optical path where an image formed from the optical image is visible to a user. The waveguide may also be integrated with or stacked on an optical corrective lens to form an integrated lens of the eyewear.
According to yet another aspect of the invention, the integrated lens may also be coated with one or more films having optical properties that magnify the optical image in front of the eyes of the user.
According to yet another aspect of the invention, the glasses contain several electronic devices (e.g., sensors or microphones) to enable various interactions between the wearer and the displayed content. The signals captured by the device (e.g., depth sensor) are transmitted to the housing by wireless means (e.g., RF or bluetooth) to eliminate the wired connection between the glasses and the housing.
According to yet another aspect of the invention, instead of using two optical cables to convey images from two microdisplays, a single optical cable is used to convey images from one microdisplay. The fiber optic cable may go through either of the temples of the eyeglasses. A splitting mechanism, placed near or right above the nose bridge of the eyeglasses, is used to split an image into two versions, one for the left lens and the other for the right lens. The two images are then respectively projected into the prisms or waveguides that may be used in the two lenses.
According to yet another aspect of the invention, the fiber optic cable is enclosed within or attached to a functional multilayer structure forming part of an article of clothing. When a user wears a shirt made or designed according to one embodiment, the cable adds little perceptible weight and the user can engage in more activities.
According to yet another aspect of the invention, an optical conduit is used to convey an optical image received from an image source (e.g., a microdisplay). The optical conduit is enclosed in or integrated with a temple of the display device. Depending on the embodiment, the optical conduit including the bundle or array of optical fibers may be twisted, thinned, or otherwise deformed to fit the fashion design of the temple while transporting the optical image from one end of the temple to the other.
To further reduce the weight of the display device, according to yet another aspect of the present invention, an active optical cable is used as the communication medium between the display device and a portable device, wherein the portable device is wearable by or attachable to a user. The active optical cable comprises two ends, at least one optical fiber, and two conductors, wherein the two ends are coupled by the optical fiber and the two conductors. The two conductors carry power and ground to power the two ends and the operation of the display device, while the at least one optical fiber carries all data, control, and command signals.
According to yet another aspect of the present invention, the portable device may be implemented as a stand-alone device or a docking unit to receive a smartphone. The portable device is primarily a control box connected to a network, such as the internet, and generates control and command signals when controlled by a user. When the smartphone is received in the docking unit, many of the functions provided in the smartphone, such as the network interface and the touch screen, may be used to receive input from the user.
The present invention may be embodied as an apparatus, method, or system, and different embodiments may yield different benefits, objects, and advantages. In one embodiment, the present invention is a display device comprising: at least one lens; an optical conduit comprising a bundle of optical fibers; two temples, at least one of which is integrated with the optical conduit, wherein the optical conduit is coupled to an image source located near one end of the optical conduit to transport an optical image from the image source to the other end of the optical conduit; and an active optical cable coupling the image source to a portable device carried by a user.
In another embodiment, the present invention is a display device comprising: at least one lens; a temple; an optical block that receives an optical image from the microdisplay; at least one optical conduit having a first end and a second end, the optical conduit integrated within the temple, the first end coupled to the optics block and receiving the optical image, the optical image transported to the second end by total internal reflection within the optical conduit; and an integrated lens coupled to the second end, receiving the optical image from the optical conduit, and presenting the optical image for viewing by a user of the display device.
Compared with the prior art, the display device of the present invention is lighter and its weight is more evenly distributed.
In addition to the objects above, there are many other objects, which are attained in the practice of the invention described below and illustrated by the embodiments shown in the accompanying drawings.
Drawings
These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
FIG. 1A illustrates exemplary goggles for delivering or displaying VR or AR applications that are common on the market today;
FIG. 1B shows a simplified diagram of HoloLens from Microsoft;
FIG. 2A shows an exemplary pair of glasses that may be used for VR applications according to one embodiment of the present invention;
FIG. 2B illustrates the use of an optical fiber to transport light from one location to another along a curved path in an efficient manner by total internal reflection within the fiber;
FIG. 2C illustrates two exemplary ways of encapsulating an optical fiber or a plurality of optical fibers according to one embodiment of the invention;
FIG. 2D shows how an image is carried from the microdisplay to the imaging medium by a fiber optic cable;
FIG. 2E illustrates an exemplary set of Variable Focus Elements (VFE) to accommodate adjustment of the projection of an image onto an optical object (e.g., an imaging medium or prism);
FIG. 2F illustrates an exemplary lens that may be used in the eyewear shown in FIG. 2A, wherein the lens comprises two portions, a prism and an optical corrective lens or corrector;
FIG. 2G shows internal reflections from multiple sources (e.g., a sensor, an imaging medium, and multiple light sources) in an irregular prism;
FIG. 2H shows such an integrated lens in comparison with a coin and a ruler;
FIG. 2I shows a shirt with a cable enclosed within or attached to the shirt;
FIG. 3A illustrates how three single color images are visually combined and perceived by human vision as a full color image;
FIG. 3B shows three different single color images produced under three lights at wavelengths λ1, λ2, and λ3, respectively, where the imaging medium comprises three films, each coated with one type of phosphor;
FIG. 4 illustrates the use of a waveguide to transport an optical image from one end of the waveguide to its other end;
FIG. 5 illustrates an exemplary functional block diagram that may be used with a separate shell or housing to generate content for virtual reality and augmented reality for display on the exemplary eyewear of FIG. 2A;
FIG. 6A shows a modified version of FIG. 2A, wherein a splitting mechanism is used to split an image propagated or conveyed by the fiber optic cable into two portions (e.g., a left image and a right image);
FIG. 6B illustrates an exemplary splitting mechanism according to one embodiment of the present invention;
FIG. 7A illustrates a plurality of individual optical fibers integrated and shaped to form an optical conduit;
FIG. 7B shows a conduit shaped as part of a temple of a pair of eyeglasses;
FIG. 7C illustrates a light source embodiment that can be used as the light source of FIG. 7B;
FIG. 7D illustrates one embodiment in which the optical conduit is not rotated but receives the optical image in a standard orientation;
FIG. 7E illustrates an example of a temple useful in the display eyewear described in this disclosure, wherein the temple contains an optical conduit;
FIG. 8A illustrates an active optical cable, referred to herein, comprising two ends and a plurality of optical fibers coupled between the two ends;
FIGS. 8B and 8C each show an example of an active optical cable containing four optical fibers for carrying four channels of signals and three conductors for power, ground, and a data bus;
FIG. 9A shows a frame of a pair of glasses worn by a human;
FIG. 9B shows an exploded view of the vicinity of the temple end of the eyeglasses;
FIG. 9C illustrates another embodiment in which the display glasses are implemented as a set of clip-on glasses on ordinary reading glasses;
FIG. 9D shows an embodiment in which the optical conduit is not used directly in a temple;
FIG. 9E illustrates one embodiment of integrating an optical block in an eyeglass or lens frame, wherein the optical block includes a cube, a micro-display, and a light source;
FIG. 9F illustrates a cover or clip-on that is used when the display glasses are used for VR applications;
FIG. 10A shows a block diagram of the use of a pair of display glasses (i.e., a display device) in conjunction with a smartphone (e.g., an iPhone), according to one embodiment of the invention; and
FIG. 10B shows an internal functional block diagram of an exemplary docking unit that may be used in FIG. 10A or as a stand-alone portable device operable by a wearer to control the display device.
Detailed Description
The detailed description of the invention is presented largely in terms of procedures, steps, logic blocks, processing, and other symbolic representations of operations that directly or indirectly resemble the operations of data processing devices coupled to a network. These process descriptions and representations are generally used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.
Reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, the order of the blocks in a process flow diagram or illustration representing one or more embodiments of the invention is not intended to indicate any particular order nor imply any limitations in the invention per se.
Embodiments of the invention are discussed herein with reference to fig. 2A-10B. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.
Referring now to the drawings, in which like numerals refer to like parts throughout the several views. FIG. 2A shows an exemplary pair of glasses 200 for VR/AR applications according to one embodiment of the present invention. The glasses 200 do not differ significantly in appearance from a normal pair of eyeglasses, but include two flexible cables 202 and 204 extending from temples 206 and 208, respectively. According to one embodiment, each of the two flexible cables 202 and 204 is integrated with, or removably connected to, one of the temples 206 and 208 at one end thereof and includes one or more optical fibers. In this context, a temple, which may also be referred to as a temple arm, is to be understood as the side piece of the frame that supports the glasses at the edge.
Both flexible cables 202 and 204 are coupled at their other ends to a portable computing device 210, where the computing device 210 generates, on a microdisplay, images that are picked up by the cables. An image is transported through an optical fiber in the flexible cable 202 by total internal reflection therein up to the other end of the optical fiber, where the image is projected onto a lens in the glasses 200.
According to one embodiment, each of the two flexible cables 202 and 204 contains one or more optical fibers. As shown in FIG. 2B, optical fibers are used to transport light from one place to another along a curved path in an efficient manner. In one embodiment, an optical fiber is formed from thousands of strands of very fine glass or quartz having a refractive index on the order of about 1.7. Each strand is very thin. The strands are coated with a layer of material of lower refractive index. The ends of the strands are polished and clamped firmly after careful alignment. When light is incident on one end at a small angle, it is refracted into the strand (or fiber) and strikes the interface between the fiber and the coating. At angles of incidence greater than the critical angle, the light ray undergoes total internal reflection, so that the fiber essentially transports the light from one end to the other even when the fiber is bent. Depending on the embodiment of the present invention, a single optical fiber or a plurality of optical fibers arranged in parallel may be used to transport an optical image projected onto one end thereof to the other end.
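The total internal reflection condition described above can be sketched numerically. The snippet below computes the critical angle from Snell's law, sin(θc) = n_clad / n_core, using the core index of about 1.7 mentioned in the text; the cladding index of 1.5 is an illustrative assumption, not a value taken from this patent.

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle (degrees, measured from the surface normal) at the
    core/cladding interface: sin(theta_c) = n_clad / n_core."""
    if n_clad >= n_core:
        raise ValueError("total internal reflection requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))

def is_totally_reflected(incidence_deg: float, n_core: float, n_clad: float) -> bool:
    """A ray striking the interface beyond the critical angle stays inside
    the core, as in the fiber described above."""
    return incidence_deg > critical_angle_deg(n_core, n_clad)

# Core index ~1.7 (from the text); cladding index 1.5 is assumed.
theta_c = critical_angle_deg(1.7, 1.5)
```

Any ray that meets the coating at more than this angle from the normal is trapped and guided along the fiber, even around bends, which is why the image survives the curved path through the temple.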
FIG. 2C shows two exemplary ways of encapsulating one or more optical fibers according to one embodiment of the invention. The encapsulated optical fiber may be used as the cable 202 or 204 in FIG. 2A and extends through each of the temples 206 and 208 to its end. According to one embodiment, the temples 206 and 208 are made of a type of material common in ordinary eyeglasses (e.g., plastic or metal), and a portion of the cable 202 or 204 is embedded or integrated in the temple 206 or 208, thereby creating a non-flexible portion, while another portion of the cable 202 or 204 remains flexible. According to another embodiment, the non-flexible portion and the flexible portion of the cable 202 or 204 may be removably connected by an interface or connector.
Reference is now made to FIG. 2D, which illustrates how an image is transported from a microdisplay 240 to an imaging medium 244 through a fiber optic cable 242. As will be further described below, the imaging medium 244 can be a solid object (e.g., a film) or a non-solid object (e.g., air). A microdisplay is a display with a very small screen (e.g., less than one inch). This type of tiny electronic display system was introduced commercially in the late 1990s; the most common applications of microdisplays include rear-projection TVs and head-mounted displays. A microdisplay may be reflective or transmissive, depending on the way light is allowed to pass through the display element. An image (not shown) displayed on the microdisplay 240 is picked up via a lens 246 by one end of the fiber optic cable 242, which transports the image to its other end. Another lens 248 is provided to collect the image from the fiber optic cable 242 and project it onto the imaging medium 244. Depending on the embodiment, there are different types of microdisplays and imaging media, some of which are described in detail below.
FIG. 2E illustrates an exemplary set of variable focus elements (VFE) 250 to accommodate adjustment of the projection of an image onto an optical object (e.g., an imaging medium or a prism). To facilitate the description of various embodiments of the present invention, it is assumed that an imaging medium is present. As shown in FIG. 2E, an image 252 conveyed through the fiber optic cable reaches an end surface 254 of the cable. The image 252 is focused onto an imaging medium 258 by a set of lenses 256, referred to herein as a variable focus element (VFE). The VFE 256 is adjustable to ensure that the image 252 is focused accurately onto the imaging medium 258. Depending on the implementation, the adjustment of the VFE 256 may be done manually or automatically based on an input (e.g., measurements obtained from a sensor). According to one embodiment, the adjustment of the VFE 256 is performed automatically according to a feedback signal derived from a signal sensed by a sensor facing the eye (pupil) of the wearer of the glasses 200 of FIG. 2A.
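The automatic VFE adjustment described above amounts to a feedback loop: read an error signal derived from the eye-facing sensor, nudge the focus setting, and repeat. The sketch below is a minimal proportional controller; `read_focus_error` and `set_focus` are hypothetical stand-ins for the sensor and actuator interfaces, which the text does not specify.

```python
def adjust_vfe(read_focus_error, set_focus, focus=0.0,
               gain=0.5, tol=1e-3, max_iter=50):
    """Drive the VFE focus setting until the sensor-derived error signal
    falls below a tolerance. A simple proportional feedback loop; all
    names and constants are illustrative."""
    for _ in range(max_iter):
        err = read_focus_error(focus)
        if abs(err) < tol:
            break
        focus -= gain * err      # step against the error signal
        set_focus(focus)         # command the actuator (no-op in the toy model)
    return focus

# Toy model: the error is simply the distance from an ideal focus of 2.0.
final = adjust_vfe(lambda f: f - 2.0, lambda f: None)
```

With a well-behaved error signal the loop converges geometrically; in practice the error would come from analyzing the sharpness of the pupil image captured by the sensor.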
Referring now to FIG. 2F, an exemplary lens 260 that may be used in the eyewear shown in FIG. 2A is shown. The lens 260 includes two portions: a prism 262 and an optical correction lens or corrector 264. The prism 262 and the corrector 264 are stacked to form the lens 260. As its name suggests, the optical corrector 264 is provided to correct the optical path from the prism 262 so that light passing through the prism 262 travels straight through the corrector 264. In other words, light refracted by the prism 262 is corrected, or freed from refraction, by the corrector 264. In optics, a prism is a transparent optical element with flat, polished surfaces that refract light; at least two of the surfaces must have an angle between them, the exact angle depending on the application. The traditional geometry is a triangular prism with a triangular base and rectangular sides, and in colloquial use "prism" usually refers to this type. A prism may be made of any material transparent to the wavelengths for which it is designed; typical materials include glass, plastic, and fluorite. According to one embodiment, the prism 262 is not actually in the shape of a geometric prism and is therefore referred to herein as an arbitrarily shaped prism, which dictates that the corrector 264 take a form complementary, reciprocal, or conjugate to that of the prism 262 so as to form the lens 260.
On one edge of the lens 260, i.e., the edge of the prism 262, there are at least three items that utilize the prism 262. Designated by 267 is an imaging medium corresponding to the imaging medium 244 of FIG. 2D or the imaging medium 258 of FIG. 2E. Depending on the embodiment, the image conveyed by the fiber optic cable 242 of FIG. 2D may be projected directly onto the edge of the prism 262, or may first be formed on the imaging medium 267 before being projected onto the edge of the prism 262. In either case, depending on the shape of the prism 262, the projected image is refracted within the prism 262 and subsequently seen by an eye 265. In other words, a user wearing a pair of glasses employing the lens 260 can see an image displayed through or in the prism 262.
A sensor 266 is provided to image the position or movement of the pupil of the eye 265. Likewise, thanks to the refraction provided by the prism 262, the sensor 266 can locate the pupil. In operation, an image of the eye 265 is captured and analyzed to derive how the pupil is viewing the image shown through or in the lens 260. In AR applications, the position of the pupil may be used to activate an action. Optionally, a light source 268 is provided to illuminate the eye 265 to facilitate the image capture by the sensor 266. According to one embodiment, the light source 268 is a near-infrared source, so that the user, or his eye 265, is not disturbed when the light source 268 is turned on.
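The analysis step, locating the pupil in the frame captured by the sensor 266, can be approximated very simply: under near-infrared illumination the pupil is usually the darkest region of the eye image, so its position can be estimated as the centroid of pixels below a darkness threshold. This is a minimal illustrative sketch under that assumption, not the specific method described in the text.

```python
def pupil_centroid(frame, dark_threshold=40):
    """Estimate the pupil position as the centroid of pixels darker than a
    threshold. `frame` is a 2-D list of 8-bit grayscale values; under
    near-infrared illumination the pupil is typically the darkest region."""
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v < dark_threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None              # no pupil candidate found in this frame
    return (xs / n, ys / n)

# Synthetic 5x5 frame: bright background with a dark 2x2 "pupil" at lower right.
frame = [[200] * 5 for _ in range(5)]
for y in (3, 4):
    for x in (3, 4):
        frame[y][x] = 10
center = pupil_centroid(frame)
```

A real implementation would add blink rejection and sub-pixel refinement, but the centroid of dark pixels is the core of many practical gaze trackers.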
FIG. 2G shows internal reflections from multiple sources (e.g., the sensor 266, the imaging medium 267, and the light source 268) in an irregular prism. Because the prism is uniquely designed, in particular with specially shaped edges, the light rays from these sources are reflected several times within the prism 262 before reaching the eye 265. For completeness, FIG. 2H shows a size comparison of such a lens with a coin and a ruler.
As described above, there are different types of microdisplays and, accordingly, different imaging media. The following table summarizes some of the microdisplays that may be used to facilitate the generation of an optical image that can be transported from one end of one or more optical fibers to the other by total internal reflection within the fibers.
[Table: microdisplay types with their light sources and imaging media; rendered only as an image in the original document.]
LCoS = liquid crystal on silicon;
LCD = liquid crystal display;
OLED = organic light emitting diode;
RGB = red, green, and blue; and
SLM = spatial light modulation.
In the first case shown in the above table, a full color image is actually displayed on silicon. As shown in fig. 2D, the full color image can be picked up by a focusing lens or set of lenses that project the full image onto exactly one end of the optical fiber. The image is transported within the fiber and again picked up by another focusing lens at the other end of the fiber. The imaging medium 244 of fig. 2D may not be physically needed because the delivered image is visible and full color. The color image may be projected directly onto one edge of the prism 262 of fig. 2F.
In the second case shown in the table above, the LCoS is used with different light sources. Specifically, there are at least three color light sources (e.g., red, green, and blue) used sequentially. In other words, each light source produces a single color image. The image picked up by the optical fiber is only a single color image. A full color image can be reproduced when all three different single color images are combined. The imaging medium 244 of fig. 2D is provided to reproduce a full color image from three different single color images respectively conveyed by the optical fibers.
Fig. 2I shows a shirt 270 with a cable 272 enclosed within or attached to the shirt 270. The shirt 270 is an example of a fabric material or a multi-layer piece; such a relatively thin cable may be embedded between the layers. When a user wears such a shirt made or designed according to one embodiment, the cable adds little weight and the user is freer to move around.
Fig. 3A shows how three single color images 302 are visually combined and perceived by human vision as a full color image 304. According to one embodiment, three colored light sources are used, for example red, green, and blue light sources that are sequentially switched on. More specifically, when the red light source is turned on, only a red image is produced (e.g., from a microdisplay). The red image is then optically picked up, transported by the optical fiber, and projected into the prism 262 of fig. 2F. As the green and blue light sources are then sequentially turned on, green and blue images are generated, separately conveyed by the optical fibers, and projected into the prism 262 of fig. 2F. It is well known that human vision possesses the ability to combine three single color images and perceive them as a full color image. With all three single color images sequentially projected into the prism in perfect alignment, the eye sees a full color image.
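As a numerical illustration of this field-sequential recombination (a minimal sketch, not part of the patent; the frame values and the `combine` helper are assumptions for illustration), three aligned single-color frames can be modeled as one perceived full-color image:

```python
# Three single-color frames as delivered sequentially by the fiber
# (a hypothetical 2x2 test pattern of normalized intensities).
red_frame   = [[1.0, 0.0], [0.0, 0.5]]
green_frame = [[0.0, 1.0], [0.0, 0.5]]
blue_frame  = [[0.0, 0.0], [1.0, 0.5]]

def combine(r, g, b):
    """Model the eye integrating aligned single-color frames over
    time into one full-color (R, G, B) image."""
    return [[(r[y][x], g[y][x], b[y][x])
             for x in range(len(r[0]))]
            for y in range(len(r))]

full_color = combine(red_frame, green_frame, blue_frame)
print(full_color[1][1])  # (0.5, 0.5, 0.5) -> perceived as gray
```

Misalignment between the three frames would produce color fringes, which is why the text stresses that the sequentially projected images must be perfectly aligned.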
Also in the second case shown above, the light source may be nearly invisible. According to one embodiment, three light sources generate light close to the UV band. Under such illumination, three different color images may still be generated and delivered, but are not fully visible. Before the color images can be presented to the eye or projected into a prism, they are converted into three primary color images, which can then be perceived as a full color image. According to one embodiment, the imaging medium 244 of fig. 2D is provided for this purpose. Fig. 3B shows that three different color images 310 are produced under three light sources at wavelengths λ1, λ2, and λ3, respectively, and the imaging medium 312 includes three film layers 314, each coated with one type of phosphor, i.e., a substance that exhibits luminescence. In one embodiment, three types of phosphors at wavelengths 405 nm, 435 nm, and 465 nm are used to convert the three different color images produced under the three light sources in the near-UV band. In other words, when one such color image is projected onto the film layer coated with the phosphor at 405 nm, the single color image is converted to a red image that is then focused and projected into the prism. The process is the same for the other two single color images through the film layers coated with phosphors at 435 nm and 465 nm, producing green and blue images. When the red, green, and blue images are sequentially projected into the prism, they are perceived together by human vision as a full color image.
In the third and fourth cases shown in the table above, instead of using light in the visible spectrum or light that is nearly invisible, the light source is a laser source. Both visible and invisible lasers exist. Operating much like the first and second cases, the third and fourth cases use so-called spatial light modulation (SLM) to form a full color image. Spatial light modulator is a general term describing devices that modulate the amplitude, phase, or polarization of light waves in space and time. In other words, an SLM plus lasers (RGB sequential) can produce three separate single color images; when these color images are combined, with or without an imaging medium, a full color image can be reproduced. In the case of an SLM plus invisible lasers, an imaging medium is present to convert the invisible images into a full color image, in which case appropriate film layers may be used as shown in fig. 3B.
Referring now to fig. 4, a waveguide 400 is shown for transporting an optical image 402 from one end 404 to the other end 406 of the waveguide 400, where the waveguide 400 may be stacked with one or more sheets of glass or lenses (not shown) or coated with one or more film layers to form a lens suitable for a pair of glasses that display images from a computing device. As known to those skilled in the art, an optical waveguide is a spatially inhomogeneous structure for guiding light, i.e., for restricting the spatial region in which light can propagate, in which a region of increased refractive index is surrounded by a medium of lower refractive index (usually called the cladding).
The waveguide 400 is transparent and shaped at the end 404 in a suitable manner to allow the image 402 to propagate along the waveguide 400 to the end 406, where a user 408 can look through the waveguide 400 to see the propagated image 410. According to one embodiment, one or more film layers are disposed on the waveguide 400 to magnify the propagated image 410 such that the eye 408 sees a significantly magnified image 412. One example of such a film is known as a metalens, essentially an array of thin titanium dioxide nanostructures on a glass substrate.
Referring now to fig. 5, an exemplary functional block diagram 500 is shown that may be used with a separate shell or housing to generate virtual reality and augmented reality related content for display on the exemplary eyewear of fig. 2A. As shown in fig. 5, two micro-displays 502 and 504 are provided to supply content to two lenses in the glasses of fig. 2A, essentially a left image going to the left lens and a right image going to the right lens. Examples of such content are 2D or 3D images and video or holograms. Each of the micro-displays 502 and 504 is driven by a corresponding driver 506 or 508.
The entire circuit 500 is controlled and driven by a controller 510 programmed to produce the content. According to one embodiment, the circuit 500 is designed to communicate with the internet (not shown), receiving the content from other devices. Specifically, the circuit 500 includes an interface that receives the sensing signal wirelessly (e.g., RF or bluetooth) from a remote sensor (e.g., sensor 266 of fig. 2F). Controller 510 is programmed to analyze the sense signals and provide feedback signals to control certain operations of the glasses, such as the projection mechanism, which includes a focusing mechanism that auto-focuses and projects an optical image onto the edge of prism 262 of fig. 2F. Further, audio is provided to synchronize with the content, and the audio may be wirelessly transmitted to headphones.
Fig. 5 illustrates an exemplary circuit 500 that generates content for display in a pair of glasses as contemplated in one embodiment of the present invention. The circuit 500 shows that there are two micro-displays 502 and 504 for providing two corresponding image or video streams to the two lenses of the glasses in fig. 2A. According to one embodiment, only one microdisplay may be used to drive both lenses of the glasses in fig. 2A. Such circuitry is not provided herein, as those skilled in the art know how the circuitry can be designed or how to modify the circuitry 500 of fig. 5.
Given a video stream or an image, the advantage is that only one optical cable is required to transport the image. Fig. 6A illustrates a modified version 600 of fig. 2A showing a cable 602 for coupling the housing 210 to the eyewear 208. Instead of using two optical cables to convey images from two micro-displays as shown in fig. 2A, a single optical cable is used to convey images from one microdisplay. The optical cable may pass through either temple of the eyewear and possibly also through a portion of one of the upper frames. A splitting mechanism, placed near or just above the bridge of the eyeglasses, is used to split the image into two versions, one for the left lens and the other for the right lens. These two images are then projected into respective prisms or waveguides, which may be used in the two lenses.
To split the image propagated or conveyed by the cable 602, the eyewear 600 is designed to include a splitting mechanism 604, preferably positioned near or at its nosepiece. Fig. 6B illustrates an exemplary splitting mechanism 610 according to one embodiment of the present invention. A cube 612, also referred to as an X-cube beam splitter, for splitting incident light into two separate components, is provided to receive images from the microdisplay through the cable 602. In other words, the image is projected onto one side of the X-cube 612. The interior of the X-cube 612 is coated with a reflective material to split the incident image into two portions, one to the left and the other to the right, as shown in fig. 6B. Each split image passes through a polarizer 614 or 616 to strike a reflector 618 or 620, which reflects the image back to a polarizing mirror 626 or 628. The two polarizers 614 and 616 are polarized in different ways (e.g., horizontally and vertically, or left- and right-handed circularly) corresponding to the images sequentially generated for the left or right eye. Coated with a reflective material, the polarizing mirror 626 or 628 reflects the image to the corresponding eye. Depending on the implementation, the reflected image from the polarizing mirror 626 or 628 may impinge on one edge of the prism 262 of fig. 2F or on the waveguide 400 of fig. 4. Optionally, two waveplates 622 and 624 are positioned in front of the reflectors 618 and 620, respectively.
Fig. 2B and fig. 2D show that a fiber optic cable 220 or 242 is used to transport images from one end to the other. The use of optical fibers, typically encased in a flexible material such as plastic, can significantly reduce the weight of the eyewear. According to one embodiment, the fiber optic cable is made with a plurality of optical fibers integrated in parallel to form a fiber optic conduit. Fig. 7A illustrates an exemplary integration of a fiber optic conduit 700. A plurality of individual optical fibers are integrated and shaped to form the optical fiber conduit 700, whose cross-section has a predefined shape (e.g., rectangular or square). When an optical image is projected onto one end of the conduit 700, the light beams of the image travel by total internal reflection within the individual fibers and respectively reach the other end of the conduit 700.
Referring now to fig. 7B, a conduit 710 is shown shaped as part of a temple of a pair of glasses. In general, the image projected onto one end of the conduit 710 has an aspect ratio of 4:3 or 16:9. Regardless of the exact value of the ratio (the attribute describing the relationship between image width and height), the horizontal dimension of an image is typically longer than the vertical dimension. If the conduit 710 were shaped to have an aspect ratio similar to that of the image throughout, the temple would appear thick horizontally. According to one embodiment, the conduit 710 is therefore twisted 90 degrees in certain portions. In other words, the conduit 710 starts with a ratio that is the transpose of the image aspect ratio and ends with a ratio similar to the image aspect ratio. For an image with an aspect ratio of 16:9 (i.e., horizontal:vertical), a first portion of the conduit 710 is made to have a ratio of 9:16 and a second portion of the conduit 710 is made to have a ratio of 16:9. An important advantage, benefit, and objective of this embodiment is to make both temples of the glasses appear less bulky (i.e., normal-looking or fashionable) even when they enclose a conduit for delivering images or video.
Fig. 7B shows that the conduit 710 is twisted 90 degrees near one end. An optical image is projected from an image source onto the beginning portion 714 of the conduit 710, where the image source can easily be rotated to accommodate the shape of the beginning portion 714. Assume that the image from the image source has an aspect ratio of 9:16. Thus, the first portion 716 of the conduit 710 may be made thinner horizontally than vertically. The conduit 710 is then twisted 90 degrees in the second portion 718, rotating the image 90 degrees as well. Thus, the image emerging from the end portion 720 of the conduit 710 has an aspect ratio of 16:9 and may be projected onto an integrated lens (e.g., 260 of fig. 2F) or a waveguide (e.g., 400 of fig. 4) for normal viewing.
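The effect of the 90-degree twist on the image can be sketched as a rotation that transposes the aspect ratio (a toy model with assumed pixel dimensions; the `rot90` helper is an illustration, not part of the patent):

```python
def rot90(image):
    """Rotate a row-major image 90 degrees counterclockwise,
    transposing its aspect ratio (e.g., 9:16 -> 16:9)."""
    h, w = len(image), len(image[0])
    # New row r takes the original column w-1-r, top to bottom.
    return [[image[y][w - 1 - x] for y in range(h)] for x in range(w)]

tall = [[1, 2],
        [3, 4],
        [5, 6]]          # 3 rows x 2 cols: a tall (9:16-like) image
wide = rot90(tall)       # 2 rows x 3 cols: a wide (16:9-like) image
print(wide)              # [[2, 4, 6], [1, 3, 5]]
```

Projecting a pre-rotated 9:16 image into the conduit and twisting the conduit 90 degrees is equivalent to applying this rotation, so the image emerges at the end portion 720 in its normal 16:9 orientation.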
Depending on the implementation, the image source may simply be a projection from the fiber optic cable 220 or 242 of fig. 2B or 2D, an optical image produced by the microdisplay 240, or an optical cube that provides an optical image. According to one embodiment, a microdisplay 240 (e.g., an LCoS imager 724) is provided to produce an optical image that is projected into an optical cube 712. Also shown in fig. 7B are two enlarged views of the optical cube 712. In one embodiment, the optical cube 712 includes two triangular optical sheets or blocks 717 and 718. A special optical material or film layer 720 is provided between the two blocks 717 and 718. A light source 722 projects light into the block 717. The light then passes through the film layer 720 to the microdisplay 240 (e.g., the LCoS imager 724) to illuminate it. The microdisplay 240 produces an optical image using the light from the light source 722. The image is then reflected into the block 718 and through the film layer 720, and projected onto the beginning portion 714 of the conduit 710 for transmission within the conduit 710 to the second end 720 thereof. An important advantage, benefit, and objective of this embodiment is that using optical fibers to transmit images from one end to the other avoids the weight of the metal wires that would otherwise be present when using a cable with an array of wires. According to one embodiment, a waveguide 726 is provided to convey the projected optical image to an appropriate location and form an image based on it.
Fig. 7C illustrates a light source 730 that may be used as the light source 722 of fig. 7B, according to one embodiment. The light source 730 includes a light guide 732, a shield 734, and a number of lamps 736 (two of which are shown). Light from the lamps 736 is projected into the guide 732. In one embodiment, the shield 734 is reflective on one side and opaque on the other. The shield 734 is provided to reflect the light onto the block 717 and also to prevent any light from escaping the guide 732. In other words, the shield 734 can be made of a film, one side of which is reflective and the other opaque.
The description of fig. 7B is based on the assumption that the optical image received at the first end 714 of the conduit 710 has been rotated 90 degrees; the conduit is therefore twisted back 90 degrees to normalize the image orientation. Those skilled in the art will appreciate that the above description applies equally to images received rotated by any number of degrees, in which case the conduit 710 may be twisted back by the same amount to normalize the image orientation. Fig. 7D illustrates an embodiment in which the optical conduit 750 is not twisted but receives the optical image in a standard orientation (e.g., maintaining an aspect ratio of 16:9 or 4:3). The optical image from the image source 752 is passed through an optical lens 754, which shrinks the image in the vertical direction, the horizontal direction, or both. To facilitate the description of the present invention, assume that the lens 754 shrinks the received image only horizontally, by a predefined amount (e.g., to 70%). In this way, the width or thickness of the conduit 750 can be made thinner. At the other end of the conduit 750 there is a second lens 756. Optically, the lens 756 acts in reverse of the lens 754, i.e., it expands the image horizontally by the inverse amount (e.g., 1/0.70) to restore the size of the original image from the cube 752.
In operation, an optical image with an aspect ratio of X:Y (e.g., 16:9) from the image source 752 is projected through the (horizontally demagnifying) lens 754. The aspect ratio is now Y:Y (e.g., 9:9). The optically distorted image is conveyed through the conduit 750 and then projected through the lens 756. As described above, the lens 756 expands the light beam horizontally, restoring the optically distorted image to a normal image with an aspect ratio of X:Y (16:9). One advantage, benefit, and object of this embodiment is that the temple can be designed or styled normally even when it conveys optical images or video. In other words, the conduit 750 can be designed in any size or shape as long as the pair of lenses 754 and 756 are conjugated, meaning that they operate optically exactly opposite to each other.
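The conjugate-lens arithmetic above can be sketched as follows (a minimal model; the shrink factor s = 9/16 is an assumption chosen so that a 16:9 image becomes 9:9, matching the example in the text):

```python
# Conjugate lens pair: lens 754 shrinks the image horizontally by a
# factor s; lens 756 expands it by 1/s, restoring the original
# aspect ratio. Dimensions are in arbitrary units.
def shrink(width, height, s):
    return (width * s, height)   # lens 754: horizontal demagnification

def expand(width, height, s):
    return (width / s, height)   # lens 756: the exact optical inverse

s = 9 / 16                       # assumed factor: 16:9 -> 9:9
w, h = shrink(16, 9, s)          # distorted image inside conduit 750
w2, h2 = expand(w, h, s)         # restored image at the far end
print((w, h), (w2, h2))          # (9.0, 9) (16.0, 9)
```

The key design constraint is only that the two factors multiply to 1; any shrink amount works as long as the expansion lens applies its exact inverse.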
Fig. 7E illustrates an example of a temple 760 that may be used in the display eyewear described in this disclosure. Regardless of the material that may be used for the temple 760, it encloses an optical conduit 762 (e.g., conduit 710 or 750) and an image source 764. Since the optical conduit 762 is made of an array of optical fibers, it can be structured according to a predefined shape, even bent if desired. In general, optical conduit 762 is fabricated as part of temple 760. The image source 764 is preferably located near one of the ends of the optical conduit 762 and may also be enclosed in the temple 760 according to one embodiment.
Regardless of how the image source 764 is configured, at least some wires must be present to couple the image source 764 to a portable device to receive image data, various signals, and instructions. According to one embodiment, the microdisplay in the image source 712 or 752 requires power to operate and receives electronic signals to produce the images/video as desired. When the microdisplay is moved into or near the temple, power and signals must be brought to it, so various copper wires may have to be used. In prior-art systems, cables containing one or more conductors or wires are typically used. However, such wire cables weigh significantly more than fiber optic cables and may add pressure to the eyewear when the two temples are connected or attached to them. In general, the more wires in the cable, the heavier the temple.
According to one embodiment, most of these wires are replaced with optical fibers. Fig. 8A shows what is referred to herein as an active optical cable 800, which includes two ends 802 and 804 and at least one optical fiber 806 coupled between them. Additionally, at least two wires (not visible in fig. 8A) are embedded alongside the fiber 806, one for power and the other for ground; these two wires supply power from one end to the other. The number of optical fibers 806 may vary depending on how many signals need to pass through the cable 800. The two ends 802 and 804 may be implemented as pluggable connectors (e.g., USB-C type) depending on the actual needs. Each of the two ends 802 and 804 includes a converter (e.g., a photodiode or laser diode) to convert an electronic signal to light or vice versa, and also contains the integrated circuitry necessary to perform encoding or decoding, i.e., encoding a received data set or electronic signal into light, or decoding received light to recover an electronic signal. The details of the ends 802 and 804 are not provided herein so as not to obscure other aspects of the invention. Assume that the cable 800 is used to carry a set of signals from the end 802 to the end 804. When the end 802 receives the signals, the converter in the end 802 converts them into a light beam comprising a set of optical signals, where each optical signal is encoded according to a signal. Alternatively, a set of beams is generated by the converter, each beam corresponding to one signal. The beam is then passed within the fiber from the first end 802 to the second end 804, where the converter in the second end 804 converts the light beam back into one or more electronic signals.
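The round trip through the two end converters can be sketched as follows (a toy model only; real converters use photodiodes, laser drivers, and line coding, and the function names here are assumptions for illustration):

```python
def encode_to_light(data):
    # Converter at end 802: map each electronic byte to a
    # normalized optical intensity level in [0.0, 1.0].
    return [b / 255.0 for b in data]

def decode_from_light(levels):
    # Converter at end 804: recover the electronic bytes from the
    # received intensity levels.
    return bytes(round(v * 255) for v in levels)

signal = b"frame-sync"                  # an electronic signal at end 802
light = encode_to_light(signal)         # carried by the fiber 806
recovered = decode_from_light(light)    # reconstructed at end 804
print(recovered == signal)              # True
```

The point of the model is that the signal path through the cable is purely optical; only the thin power and ground wires remain electrical, which is what keeps the cable light.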
Those skilled in the art will appreciate that the cable 800 is much lighter than the wire-based cables that would otherwise be used to carry these signals. It will also be readily appreciated that an active optical cable requires only one or more optical fibers to transmit the data, control signals, or various instructions needed to present the appropriate images/video to the viewer.
Fig. 8A also lists specifications based on which such a cable 800 may be implemented. Depending on the embodiment, the number of optical fibers may be specifically specified. In one example, image data in red, green, and blue are carried in three different optical fibers, respectively, and control signals are carried in a fourth, whereby a 4-channel fiber configuration can be established for the active optical cable. Fig. 8A also illustrates the flexibility of such fiber-based cables, which can be folded or extended without loss of signal. Figs. 8B and 8C each show an example of the cable 800 containing four optical fibers for carrying image data and control signals, plus three wires for power, ground, and an I2C data bus, but with different interfaces (LVDS and DisplayPort). Since power consumption is small in such applications, the wires for power and ground can be made extremely thin to reduce the weight of the cable 800.
Referring now to fig. 9A, a frame of a pair of glasses 900 worn by a person is shown. Fig. 9B shows an exploded view of the vicinity of the temple end of the glasses 900. The temple includes an optical conduit 902, one end of which is connected to an optical image source 904 to receive optical images therefrom. The optical image source 904 includes a microdisplay 906 and an optical cube 908, and receives control signals and image or video data over the active optical cable 910 to produce an optical image or video. The optical image is projected into the optical conduit 902 and transported through it to the other end.
Fig. 9C illustrates another embodiment in which the display glasses are implemented as a set of clip-on glasses 920 worn over ordinary glasses. Somewhat unlike conventional clip-on sunglasses, the eyewear 920 includes at least one temple 922, wherein the temple 922 encloses an optical conduit to transmit an optical image from one end to the other. Note that the temple 922 is truncated; it does not necessarily extend all the way to the wearer's ear. Depending on the embodiment, the truncated temple 922 may be about one inch long or may extend to the ear. One purpose of such a truncated temple 922 is to distribute the weight or pressure of the clip-on glasses 920 over the nose, which is largely responsible for holding the eyeglasses 924 as well as the clip-on glasses 920. An active optical cable (not shown) is provided to couple the truncated temple 922 to a portable device (not shown).
Alternatively or by comparison, fig. 9D shows an embodiment in which an optical conduit is not used directly in the temple. Instead, an image source 930 is provided adjacent to one integrated lens (e.g., 260 of fig. 2F). The image source 930 is implemented as a block or optical block that includes an optical cube, and is shown positioned adjacent to a display lens (e.g., the integrated lens 260 of fig. 2F). Fig. 9E shows one embodiment where the block 930 is integrated into an eyeglass or lens frame 932. Instead of using an optical conduit, an active optical cable 934 is used to deliver image data all the way to the vicinity of the integrated lens (not shown), from which data the block 930, comprising a microdisplay and a light source, generates an optical image. The active optical cable 934 is embedded in or integrated with the temple 936. The optical image is then projected into an integrated lens as shown in fig. 2F. Alternatively, fig. 9F shows an embodiment where the display device may be covered with a shield. In some applications (e.g., VR or viewing long videos), the see-through feature of the display glasses may cause some distraction when ambient light or movement is relatively intense. Accordingly, a shield 940 is provided and may be mounted to the display glasses 942. Specifically, the shield 940 is intended to disable the see-through feature of the display glasses 942 so a viewer can focus on the video displayed in the lenses 944 and 946. According to one embodiment, the shield 940 is made opaque to light from the surroundings (e.g., ambient light). For convenience, the shield 940 may be made in the form of clip-on sunglasses so as to be easily attached or removed. In one embodiment, the shield 940 may also be made as a visor to block substantially all ambient light from the surroundings.
Referring now to fig. 10A, a block diagram 1000 of the use of a pair of display glasses (i.e., a display device) 1002 in conjunction with a smartphone (e.g., an iPhone) is shown, in accordance with one embodiment of the present invention. The glasses 200 of fig. 2A or the glasses 900 of fig. 9A may be used as the display device 1002. A cable 1004 (e.g., the active optical cable 800 of fig. 8A) couples the glasses 1002 to a docking unit 1006, which is provided to receive the smartphone. The docking unit 1006 allows a user (i.e., the wearer of the display device 1002) to control the display device 1002, for example to select media for display, interact with the display, or enable or disable applications (e.g., email, browser, and mobile payment).
According to one embodiment, the docking unit 1006 comprises a set of batteries that can be charged by a power cord and used to charge the smartphone when needed. One advantage, benefit, and object of embodiments providing the docking unit is to use many of the functions already in the smartphone. For example, a network interface need not be provided in the docking unit 1006, as the smartphone already has the network interface. In operation, a user may control the smartphone to achieve a desired purpose, and by connecting the docking unit 1006 to the display device 1002 via the cable 1004, its contents may be easily displayed or rendered on the display device 1002.
As shown in fig. 10A, the docking unit 1006 includes two portions, either or both of which may be used in an implementation. The first portion includes a receiving unit to receive the smartphone, and may or may not include a rechargeable battery pack that charges the smartphone when one is received. The second portion includes various interfaces to communicate with the smartphone and receive data and instructions from it so that the display device 1002 can display images/video for viewing by the wearer. One feature, benefit, or advantage of the present invention is the use of fiber optic cables to couple a portable device to the display device 1002. Generally, the portable device is worn by the wearer (e.g., hung on a belt or placed in a pocket). In one embodiment, the garment 270 of fig. 2I may be used to hide the cable and give the wearer more freedom to move around.
Referring now to fig. 10B, there is shown an internal functional block diagram 1100 of an exemplary docking unit that may be used in fig. 10A or as a stand-alone portable device operable by a wearer to control the display device 1002. The device as shown in fig. 10B includes a microprocessor or microcontroller 1022, a memory space 1024 in which an application module 1026 resides, an input interface 1028, an image buffer 1030 to drive a display device through a display interface 1032, and a network interface 1034. The application module 1026 is a software implementation of one embodiment of the present invention and is downloadable over a network from a library (e.g., the Apple App Store) or a designated server. One exemplary function provided by the application module 1026 is to allow a user (or wearer of the display device) to effect certain interactions with the display through predefined movements of the eye sensed by the sensor 266 of fig. 2F.
The input interface 1028 includes one or more input mechanisms. A user may interact with the display device by entering commands into the microcontroller 1022 using an input mechanism. Examples of such input mechanisms include a microphone to receive audio commands and a keyboard (such as a displayed soft keyboard) or touch screen to receive commands. Another example of such an input mechanism is a camera provided to take pictures or capture video, with the picture or video data stored in the device for immediate or subsequent use by the application module 1026. The image buffer 1030, coupled to the microcontroller 1022, is provided to buffer image/video data for generating optical images/video for display on the display device. The display interface 1032 is provided to drive the active optical cable and feed data from the image buffer 1030 onto it. In one embodiment, the display interface 1032 also encodes and sends along the active optical cable certain instructions received on the input interface 1028. The network interface 1034 is provided to allow the device 1100 to communicate with other devices over a designated medium, such as a data network. Those skilled in the art will appreciate that certain functions or blocks shown in fig. 10B are readily provided by a smartphone and need not be duplicated when a smartphone is used with the docking unit.
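The image-buffer path of fig. 10B can be sketched as follows (class and method names are assumptions, not from the patent; a real implementation would move pixel data rather than strings):

```python
from collections import deque

class ImageBuffer:
    """Toy model of the image buffer 1030: holds a few frames,
    dropping the oldest when full."""
    def __init__(self, capacity=3):
        self.frames = deque(maxlen=capacity)
    def push(self, frame):
        self.frames.append(frame)
    def pop(self):
        return self.frames.popleft() if self.frames else None

class DisplayInterface:
    """Toy model of the display interface 1032: drains the buffer
    and feeds frames onto the active optical cable."""
    def __init__(self, buffer):
        self.buffer = buffer
        self.sent = []          # stands in for the optical cable
    def drive(self):
        frame = self.buffer.pop()
        if frame is not None:
            self.sent.append(frame)

buf = ImageBuffer()
iface = DisplayInterface(buf)
buf.push("frame-0")
buf.push("frame-1")
iface.drive()
print(iface.sent)  # ['frame-0']
```

The buffer decouples the microcontroller's frame generation rate from the cable's transmission rate, which is the usual reason for placing a buffer between a producer (1022) and a display link (1032).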
The present invention has been described in sufficient detail with a certain degree of particularity. It is understood by those skilled in the art that the present disclosure of the embodiments is made only by way of example, and that numerous changes in the arrangement and combination of parts may be made without departing from the spirit and scope of the invention as claimed. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description of the embodiments.

Claims (20)

1. A display device, comprising:
an image source for providing a two-dimensional optical image, comprising a microdisplay that produces a two-dimensional optical image;
an optical conduit comprising an optical fiber array, the optical fiber array consisting of a plurality of optical fibers;
two temples, at least one of the temples being integrated with the optical conduit, wherein the optical conduit is coupled to an image source located near one end of the optical conduit to transmit a two-dimensional optical image from the image source to the other end of the optical conduit by total internal reflection within the optical conduit;
an integrated lens coupled with the optical conduit to receive the two-dimensional optical image from the optical conduit and present it for viewing by a user of the display device.
2. The display device of claim 1, wherein the optical conduit twists a predefined number of degrees to rotate the optical image by the predefined number of degrees.
3. The display device of claim 2, wherein the optical image is rotated the predefined number of degrees as it emerges from the image source.
4. The display device of claim 3, wherein the optical image is rotated back after the optical image passes through the optical conduit.
5. The display device of claim 1, wherein an aspect ratio of the optical image is reversed when the optical image is provided from the image source.
6. The display device of claim 5, wherein the optical conduit is twisted 90 degrees and the aspect ratio of the optical image is restored to normal after the optical image passes through the optical conduit.
7. The display device of claim 1, further comprising:
an optical shrink lens positioned between the image source and the optical conduit to shrink the optical image from the image source before the optical image is projected into the optical conduit, wherein the shrunk optical image passes through the optical conduit; and
an optical expansion lens positioned beyond the other end of the optical conduit to expand the shrunk optical image back to the optical image after the shrunk optical image emerges from the optical conduit.
8. The display device of claim 7, wherein the optical shrink lens and the optical expansion lens are optically equal and opposite.
9. The display device of claim 1, wherein the image source further comprises:
an optical cube comprising two triangular halves with a film layer integrated therebetween;
a light source,
wherein the light source and the microdisplay are mounted on two different surfaces of the optical cube.
10. The display device of claim 1, further comprising:
an active optical cable coupling the image source to a portable device carried by a user,
the active optical cable includes a first end and a second end, at least one optical fiber coupling the first end with the second end, and two wires.
11. A display device, comprising:
a temple;
an optical block that receives a two-dimensional optical image from a microdisplay;
at least one optical conduit having a first end and a second end, the optical conduit being integrated within the temple, the first end being coupled to the optical block to receive the two-dimensional optical image, the optical image being transported to the second end by total internal reflection within the optical conduit; and
an integrated lens coupled to the second end, receiving a two-dimensional optical image from the optical conduit, and presenting the optical image for viewing by a user of the display device.
12. The display device of claim 11, further comprising:
an active optical cable comprising at least one optical fiber and two wires, wherein the optical fiber transmits at least control signals to control the microdisplay, and the two wires, one for power and the other for ground, provide energy to the microdisplay.
13. The display device of claim 12, wherein the optical conduit includes a plurality of optical fibers integrated to form a single piece having a predefined shape to be used as part of or enclosed in the temple.
14. The display device of claim 13, wherein the optical image received at the first end of the optical conduit is transported to the second end of the optical conduit by total internal reflection within the optical fiber.
15. The display device of claim 11, wherein the optical conduit is twisted a predefined number of degrees to rotate the optical image the predefined number of degrees.
16. The display device of claim 15, wherein the received optical image has an aspect ratio of V:H, the predefined number of degrees is 90 degrees, and the optical image coming out of the second end has an aspect ratio of H:V.
17. The display device of claim 11, further comprising:
a first optic positioned between the optical block and the optical conduit, wherein the optical image is optically contracted before being delivered through the optical conduit; and
a second optic positioned near the second end of the conduit, wherein the contracted optical image emerging from the conduit is expanded by the second optic, the first and second optics being conjugate.
18. The display device of claim 11, wherein the optical conduit is encapsulated in a material to form part of the temple.
19. The display device of claim 11, wherein the lens comprises:
a prism that receives the optical image projected onto a first edge of the prism, the optical image being viewed by a wearer from a second edge of the prism; and
an optical correction lens integrated with the prism to correct an optical path out of the prism.
20. The display device of claim 19, wherein the prism and the optical correction lens are stacked such that a wearer views through the integrated lens without optical distortion.
CN201910408581.5A 2018-06-03 2019-05-16 Light display device Active CN110196495B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/996,499 US10823966B2 (en) 2016-12-08 2018-06-03 Light weight display glasses
US15/996,499 2018-06-03

Publications (2)

Publication Number Publication Date
CN110196495A (en) 2019-09-03
CN110196495B (en) 2022-02-15

Family

ID=67752841

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910408581.5A Active CN110196495B (en) 2018-06-03 2019-05-16 Light display device

Country Status (1)

Country Link
CN (1) CN110196495B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102375188A (en) * 2010-07-08 2012-03-14 泰科电子荷兰公司 Method and apparatus for routing optical fibers in flexible circuits
CN103969828A (en) * 2013-01-29 2014-08-06 精工爱普生株式会社 Image display device
CN105988219A (en) * 2015-03-17 2016-10-05 精工爱普生株式会社 Head-mounted display device, and control method for head-mounted display device
CN107870428A (en) * 2016-09-28 2018-04-03 精工爱普生株式会社 Image display device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110057862A1 (en) * 2009-09-07 2011-03-10 Hsin-Liang Chen Image display device
JP2015511334A (en) * 2012-02-21 2015-04-16 コーニング オプティカル コミュニケーションズ リミテッド ライアビリティ カンパニー Structure and method for thermal management in an active optical cable (AOC) assembly
WO2014155288A2 (en) * 2013-03-25 2014-10-02 Ecole Polytechnique Federale De Lausanne (Epfl) Method and apparatus for head worn display with multiple exit pupils
US10133532B2 (en) * 2015-09-25 2018-11-20 Seiko Epson Corporation Display system, display device, information display method, and program

Also Published As

Publication number Publication date
CN110196495A (en) 2019-09-03

Similar Documents

Publication Publication Date Title
CN110196494B (en) Wearable display system and method for delivering optical images
US11391951B2 (en) Dual depth exit pupil expander
US20190129182A1 (en) Integrated lenses in wearable display devices
US10353213B2 (en) See-through display glasses for viewing 3D multimedia
US10823966B2 (en) Light weight display glasses
US9946075B1 (en) See-through display glasses for virtual reality and augmented reality applications
US10725301B2 (en) Method and apparatus for transporting optical images
WO2014085102A1 (en) Dual axis internal optical beam tilt for eyepiece of an hmd
US20190129184A1 (en) Laser-based display engine in wearable display devices
US11662575B2 (en) Multi-depth exit pupil expander
CN108267859B (en) Display equipment for displaying 3D multimedia
US20210294107A1 (en) Optical image generators using miniature display panels
US11526014B2 (en) Near eye display projector
US20190129183A1 (en) Integrated frames in wearable display devices
KR20220134774A (en) Polarization-multiplexed optics for head-worn display systems
CN110196495B (en) Light display device
CN110286486B (en) Method for conveying optical images
US20190162967A1 (en) Light weight display glasses using an active optical cable
US11163177B2 (en) See-through display glasses with single imaging source
CN110297327A (en) Using the lightweight display device of active optical cable
US11002967B2 (en) Method and system for communication between a wearable display device and a portable device
US20200018961A1 (en) Optical image generators using miniature display panels
US11231589B2 (en) Ultralight wearable display device
CN114675418A (en) Ultra lightweight wearable display device and method for display device
CN115903235A (en) Display device for displaying hologram and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant