WO2015103638A1 - System, method, and apparatus for displaying an image with reduced color breakup - Google Patents

System, method, and apparatus for displaying an image with reduced color breakup

Info

Publication number
WO2015103638A1
Authority
WO
WIPO (PCT)
Prior art keywords
subframe
illumination sequence
subframe illumination
image
sequence
Prior art date
Application number
PCT/US2015/010377
Other languages
English (en)
French (fr)
Inventor
Allan Thomas EVANS
Original Assignee
Avegant Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Avegant Corporation filed Critical Avegant Corporation
Publication of WO2015103638A1 publication Critical patent/WO2015103638A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/48Laser speckle optics
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/20Lamp housings
    • G03B21/2006Lamp housings characterised by the light source
    • G03B21/2033LED or laser light sources
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/20Lamp housings
    • G03B21/208Homogenising, shaping of the illumination light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/315Modulator illumination systems
    • H04N9/3161Modulator illumination systems using laser light sources
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/007Optical devices or arrangements for the control of light using movable or deformable optical elements the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light
    • G02B26/008Optical devices or arrangements for the control of light using movable or deformable optical elements the movable or deformable optical element controlling the colour, i.e. a spectral characteristic, of the light in the form of devices for effecting sequential colour changes, e.g. colour wheels
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/1006Beam splitting or combining systems for splitting or combining different wavelengths
    • G02B27/102Beam splitting or combining systems for splitting or combining different wavelengths for generating a colour image from monochromatic image signal sources
    • G02B27/1026Beam splitting or combining systems for splitting or combining different wavelengths for generating a colour image from monochromatic image signal sources for use with reflective spatial light modulators
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/14Beam splitting or combining systems operating by reflection only
    • G02B27/149Beam splitting or combining systems operating by reflection only using crossed beamsplitting surfaces, e.g. cross-dichroic cubes or X-cubes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/02Diffusing elements; Afocal elements

Definitions

  • the invention is a system, method, and apparatus (collectively the "system") for displaying images. More specifically, the invention is a system that reduces the color breakup or rainbow effect in the display of video images.
  • the "rainbow effect” is a well-known anomaly with respect to digital light processing (DLP)
  • DLP digital light processing
  • the phenomenon is even described in Wikipedia and illustrated in a video posted on YouTube.
  • the color in a DLP-produced image is traditionally produced by a spinning filter commonly referred to as a color wheel.
  • DLP projectors that no longer use a mechanical color wheel still produce a "rainbow effect" in the displayed images.
  • the "rainbow effect” has been described as a brief flash of colors when the viewer rapidly looks from side to side on the screen or looks rapidly from the screen to side of the room. These flash of colors look like small flickering rainbows.
  • the "rainbow effect” is not a desirable anomaly for viewers. It would be desirable to eliminate or at least further reduce instances of the "rainbow effect”.
  • the system uses subframe illumination sequences that are not identical to each other in order to eliminate or at least substantially reduce the "color breakup" or "rainbow effect" of conventional DLP projectors.
  • the differences between the subframe illumination sequences can be relatively significant.
  • Figure 1a is a block diagram illustrating an example of a subframe illumination sequence in the prior art. Pulses of red, green, and blue light are used to formulate the resulting image, but the subframe illumination sequences are all identical to each other.
  • Figure 1b is a composition diagram illustrating an example of a prior art video that is displayed using identical subframe illumination sequences.
  • the video is comprised of numerous individual frames. Each frame is produced by the processing of one or more subframe illumination sequences.
  • Figure 1c is a block diagram illustrating an example of various subframe illumination sequence attributes in the prior art.
  • a sequence is defined by the order of colors, the intensity of the pulses, the length of the gap between pulses, the duration of the pulse, and the pulsed pixels (i.e. the color map).
  • Figure 1d is a composition diagram similar to the prior art diagram of Figure 1b, except that the subframe illumination sequences are not identical.
  • Figure 1e is a flow chart diagram illustrating an example of pulsing a series of subframes with colored light.
  • Figure 2a is a block diagram illustrating an example of different assemblies, components, and light that can be present in the operation of the system.
  • Figure 2b is a block diagram similar to Figure 2a, except that the disclosed system also includes a projection assembly.
  • Figure 2c is a hierarchy diagram illustrating an example of different components that can be included in an illumination assembly.
  • the subframe illumination sequence is something that is implemented by the light source.
  • Figure 2d is a hierarchy diagram illustrating an example of different components that can be included in an imaging assembly.
  • Figure 2e is a hierarchy diagram illustrating an example of different components that can be included in a projection assembly.
  • Figure 2f is a block diagram illustrating examples of different types of supporting components that can be included in the structure and function of the system.
  • Figure 2g is a flow chart diagram illustrating an example of a method for displaying an image.
  • Figure 3a is a block diagram illustrating an example of a DLP system that has implemented the use of non-identical subframe illumination sequences 854.
  • Figure 3b is a block diagram illustrating a more detailed example of a DLP system.
  • Figure 4a is a diagram of a perspective view of a VRD apparatus embodiment of the system.
  • Figure 4b is an environmental diagram illustrating an example of a side view of a user wearing a VRD apparatus embodying the system.
  • Figure 4c is a configuration diagram illustrating an example of the components that can be used in a VRD apparatus implementing the use of non-identical subframe illumination sequences.
  • Figure 5a is a hierarchy diagram illustrating an example of the different categories of display systems that the innovative system can potentially be implemented in, ranging from giant systems such as stadium scoreboards to VRD visor systems that project visual images directly on the retina of an individual user.
  • Figure 5b is a hierarchy diagram illustrating an example of different categories of display apparatuses that closely mirrors the systems of Figure 5a.
  • Figure 5c is a perspective view diagram illustrating an example of a user wearing a VRD visor apparatus.
  • Figure 5d is a hierarchy diagram illustrating an example of different display/projection technologies that can be incorporated into the system, such as DLP-based applications.
  • Figure 5e is a hierarchy diagram illustrating an example of different operating modes of the system pertaining to immersion and augmentation.
  • Figure 5f is a hierarchy diagram illustrating an example of different operating modes of the system pertaining to the use of sensors to detect attributes of the user and/or the user's use of the system.
  • Figure 5g is a hierarchy diagram illustrating an example of different categories of system implementation based on whether or not the device(s) are integrated with media player components.
  • Figure 5h is a hierarchy diagram illustrating an example of two roles or types of users, a viewer of an image and an operator of the system.
  • Figure 5i is a hierarchy diagram illustrating an example of different attributes that can be associated with media content.
  • Figure 5j is a hierarchy diagram illustrating examples of different contexts of images.
  • the invention is a system, method, and apparatus (collectively the "system") for displaying images. More specifically, the invention is a system that reduces the color breakup or "rainbow effect" in the display of video images.
  • the system utilizes subframe illumination sequences that are not identical to each other. Doing this can eliminate or at least substantially reduce the "rainbow effect" complained of by some viewers.
  • the prior art utilizes identical subframe illumination sequences. This practice originates from the dependence on color wheels, but the practice continues today even though there are alternative mechanisms for imbuing color into a projected image.
  • the subframe illumination sequence is a sequence of pulsed light (a pulse) used to create a partial image (a subframe).
  • FIG. 1a is a block diagram illustrating an example of a subframe illumination sequence 854 in the prior art.
  • An image 880 is created by transmitting a subframe 852 of the various colors in a preordained sequence that can be referred to as a subframe illumination sequence 854.
  • the image 880 seen by a viewer 96 is the result of three subimages or subframes.
  • the first subframe 852 consists of the red pixels required to construct the image 880.
  • the second subframe 852 consists of the green pixels required to construct the image 880.
  • the third subframe 852 consists of the blue pixels required to construct the image 880.
  • This sequence of the three subframes 852 is used to convey to the viewer 96 a single image 880 such as a frame 882 in a video 890.
  • the subframe illumination sequence 854 can be implemented in a variety of different ways, such as through the use of a color wheel 240, the use of multiple light sources 210, each generating a differently colored light, and other techniques known in the prior art. Different prior art approaches may involve 6 colors instead of 3, and other variations of the process.
  • One common denominator shared by the prior art is the use of subframe illumination sequences 854 that are identical to each other.
  • the identical replication of subframe illumination sequences 854 affirmatively contributes to the "rainbow" effect perceived by many viewers 96 in watching a video 890, particularly when viewed through a DLP projector.
  • Figure 1b is a composition diagram illustrating an example of a prior art video 890 that is displayed using identical subframe illumination sequences 854.
  • the video 890 is comprised of numerous individual frames 882.
  • Each frame 882 is produced by the processing of one or more subframe illumination sequences 854, which involve pulses 860 of light that become subframes 852 of the image 880 when the pulse 860 of light reaches the imaging assembly 300.
  • the subframe illumination sequences 854 are identical with respect to the order of the colors, the intensity of the pulses, the length of a gap (if any), the duration of the pulses, and the specific pulsed pixels with respect to each color (i.e. a color map).
  • Each frame 882 is formulated by one or more subframe illumination sequences 854. The figure illustrates such sequences 854 for only one frame 882 (frame 3) due to space limitations. However, the process applies to each individual frame 882 in the video 890.
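The prior art arrangement described above can be sketched in a few lines of Python (a hypothetical illustration, not code from the patent): every frame 882 in the video 890 is driven by the same subframe illumination sequence 854.

```python
# Prior-art behavior: every frame of the video is produced by an
# identical subframe illumination sequence (e.g. red, green, blue).
PRIOR_ART_SEQUENCE = ("red", "green", "blue")

def prior_art_video(num_frames):
    """Return the subframe illumination sequence driving each frame."""
    return [PRIOR_ART_SEQUENCE for _ in range(num_frames)]

video = prior_art_video(4)
# Every frame's sequence is identical to every other frame's sequence.
assert all(seq == video[0] for seq in video)
```

It is exactly this frame-to-frame identity that the disclosed system replaces with non-identical sequences.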
  • Figure 1c is a block diagram illustrating an example of various subframe illumination sequence attributes 870 in the prior art.
  • a sequence is defined by a color order 871, a pulse intensity 872, a gap length 873, a pulse duration 874, and a pulsed pixel set, i.e. color map 875.
  • historically, these attributes 870 were precisely identical because a color wheel 240 was the means of implementing the different pulses of colored light.
  • the creation of alternatives to the color wheel 240 has not resulted in the use of differing subframe illumination sequences 854.
  • the core innovation of the system 100 is the use of subframe illumination sequences 854 that are not identical to each other.
  • the differences between sequences 854 can be substantial or relatively minor while still advancing the cause of eliminating or at least reducing the "rainbow effect".
  • Figure 1d is a composition diagram similar to the prior art diagram of Figure 1b, except that the subframe illumination sequences 854 are not identical. That fact is manifested by each sequence 854 possessing a unique number (1-N) and each subframe 852 possessing a unique number (subframes 1-6) with respect to the particular frame 882. No numbers are repeated.
  • pulse 1 (860) could be identical to pulse 4 (860) and pulse 2 (860) could be identical to pulse 5 (860), but if pulse 3 (860) and pulse 6 (860) differ with respect to at least one subframe illumination sequence attribute (870), then the two sequences 854 are not identical.
  • Non-identical subframe sequences 854 mean that there is at least one difference between the collective attributes 870. The difference could be in the color order 871.
  • sequence 1 could have color order 871 of red-green-blue but sequence 2 could have a color order 871 of blue-green-red, red-blue-green, or some other different color order 871 with all other attributes 870 remaining identical.
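A minimal sketch of that comparison (field names are illustrative assumptions, not the patent's notation): two sequences are non-identical whenever at least one of the five attributes 870 of at least one pulse differs.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Pulse:
    color: str          # contributes to the color order 871
    intensity: float    # pulse intensity 872
    gap_ms: float       # gap length 873 following the pulse
    duration_ms: float  # pulse duration 874
    color_map_id: int   # identifies the pulsed pixel set 875

# Sequence 1: red-green-blue color order.
seq1 = (Pulse("red", 1.0, 0.0, 2.0, 0),
        Pulse("green", 1.0, 0.0, 2.0, 1),
        Pulse("blue", 1.0, 0.0, 2.0, 2))

# Sequence 2: identical attributes except the color order is reversed.
seq2 = (Pulse("blue", 1.0, 0.0, 2.0, 2),
        Pulse("green", 1.0, 0.0, 2.0, 1),
        Pulse("red", 1.0, 0.0, 2.0, 0))

def non_identical(a, b):
    """True if any attribute of any pulse differs between the sequences."""
    return a != b

assert non_identical(seq1, seq2)      # color order differs
assert not non_identical(seq1, seq1)  # a sequence is identical to itself
```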
  • the difference in sequences 854 could pertain to pulse intensity 872.
  • the pulses 860 used to create the subframes 852 can vary between pulses, or even during the duration of a pulse 860.
  • Gap length 873 (which can also be referred to as gap duration 873) is another potentially useful attribute for variation.
  • Traditional color wheels 240 do not utilize gaps between colors; the gap lengths are zero. In some prior art approaches, there may be pulses of white light or of no light whatsoever. Such periods are "gaps" and the durations of those periods are gap lengths 873. In some embodiments of the system 100, altering the gap lengths 873 between sequences 854 can be a highly effective tool.
  • Duration 874 (which can also be referred to as pulse duration 874) refers to the duration of a pulse 860.
  • the variables of pulse intensity 872, gap duration 873, and pulse duration 874 can involve substantial interplay between them.
  • the attribute 870 of pulsed pixels 875 (which can also be referred to as a color map) refers to the pixels being pulsed. For example, in a first red pulse 860 there may be additional pixels, or conversely fewer pixels, being pulsed with light.
  • FIG. 1e is a process flow diagram illustrating an example of the core process.
  • a first sequence of light pulses 860 is implemented in accordance with a first subframe illumination sequence 854.
  • a second sequence of light pulses 860 is implemented in accordance with a second subframe illumination sequence 854.
  • the differences between the two sequences 854 can pertain to even a single attribute 870 in a single pulse 860.
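This two-step process might be sketched as follows (illustrative names and timings, not from the patent): a first sequence of pulses is emitted, then a second that differs in a single attribute of a single pulse.

```python
# Emit two subframe illumination sequences for one frame; the second
# differs from the first only in the duration of its final pulse.
def emit(sequence, log):
    """'Pulse' each (color, duration_ms) subframe by recording it."""
    for color, duration_ms in sequence:
        log.append((color, duration_ms))

sequence_1 = [("red", 2.0), ("green", 2.0), ("blue", 2.0)]
sequence_2 = [("red", 2.0), ("green", 2.0), ("blue", 1.5)]  # one attribute differs

log = []
emit(sequence_1, log)  # first sequence of light pulses
emit(sequence_2, log)  # second, non-identical sequence

assert sequence_1 != sequence_2  # the two sequences are not identical
assert len(log) == 6             # six subframes pulsed for the frame
```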
  • FIG. 2a is a block diagram of a system 100 comprised of an illumination assembly 200 that supplies light 800 to an imaging assembly 300.
  • a modulator 320 of the imaging assembly 300 uses the light 800 from the illumination assembly 200 to create the image 880 that is displayed by the system 100.
  • the system 100 can also include a projection assembly 400 that directs the image 880 from the imaging assembly 300 to a location where it can be accessed by one or more users 90.
  • the image 880 generated by the imaging assembly 300 will often be modified in certain ways before it is displayed by the system 100 to users 90, and thus the image generated by the imaging assembly 300 can also be referred to as an interim image 850 or a work-in-process image 850.
  • An illumination assembly 200 performs the function of supplying light 800 to the system 100 so that an image 880 can be displayed.
  • the illumination assembly 200 can include a light source 210 for generating light 800. It is the light source 210 that ultimately implements the subframe illumination sequence 854 because it is the light source 210 that supplies light 800 to the system 100.
  • Figure 2c is a hierarchy diagram illustrating an example of different components that can be included in the illumination assembly 200.
  • Those components can include but are not limited to a wide range of light sources 210, a diffuser assembly 280, and a variety of supporting components 150.
  • Examples of light sources 210 can include but are not limited to a multi-bulb light source 211, an LED lamp 212, a 3 LED lamp 213, a laser 214, an OLED 215, a CFL 216, an incandescent lamp 218, and a non-angular dependent lamp 219.
  • the light source 210 is where light 800 is generated before moving throughout the rest of the system 100. Thus, each light source 210 is a location 230 for the origination of light 800.
  • Some embodiments use a 3 LED lamp as the light source, with one LED designated for each primary color of red, green, and blue.
  • An imaging assembly 300 performs the function of creating the image 880 from the light 800 supplied by the illumination assembly 200.
  • a modulator 320 can transform the light 800 supplied by the illumination assembly 200 into the image 880 that is displayed by the system 100.
  • the image 880 generated by the imaging assembly 300 can sometimes be referred to as an interim image 850 because the image 850 may be focused or otherwise modified to some degree before it is directed to the location where it can be experienced by one or more users 90.
  • Imaging assemblies 300 can vary significantly based on the type of technology used to create the image. Display technologies such as DLP (digital light processing), LCD (liquid-crystal display), LCOS (liquid crystal on silicon), and other methodologies can involve substantially different components in the imaging assembly 300.
  • Figure 2d is a hierarchy diagram illustrating an example of different components that can be utilized in the imaging assembly 300 for the system 100.
  • a prism 310 can be a very useful component in directing light to and/or from the modulator 320.
  • DLP applications will typically use an array of TIR prisms 311 or RTIR prisms 312 to direct light to and from a DMD 314.
  • a modulator 320 (sometimes referred to as a light modulator 320) is the device that modifies or alters the light 800, creating the image 880 that is to be displayed. Modulators 320 can operate using a variety of different attributes of the modulator 320.
  • a reflection-based modulator 322 uses the reflective attributes of the modulator 320 to fashion an image 880 from the supplied light 800. Examples of reflection-based modulators 322 include but are not limited to the DMD 314 of a DLP display and some LCOS (liquid crystal on silicon) panels 340.
  • a transmissive-based modulator 321 uses the transmissive-attributes of the modulator 320 to fashion an image 880 from the supplied light 800.
  • transmissive-based modulators 321 include but are not limited to the LCD (liquid crystal display) 330 of an LCD display and some LCOS panels 340.
  • the imaging assembly 300 for an LCOS or LCD system 100 will typically have a combiner cube or some similar device for integrating the different one-color images into a single image 880.
  • the imaging assembly 300 can also include a wide variety of supporting components 150.
  • a projection assembly 400 can perform the task of directing the image 880 to its final destination in the system 100 where it can be accessed by users 90.
  • the image 880 created by the imaging assembly 300 will be modified in at least some minor ways between the creation of the image 880 by the modulator 320 and the display of the image 880 to the user 90.
  • the image 880 generated by the modulator 320 of the imaging assembly 300 may only be an interim image 850, not the final version of the image 880 that is actually displayed to the user 90.
  • FIG. 2e is a hierarchy diagram illustrating an example of different components that can be part of the projection assembly 400.
  • a display 410 is the final destination of the image 880, i.e. the location and form of the image 880 where it can be accessed by users 90.
  • Examples of displays 410 can include an active screen 412, a passive screen 414, an eyepiece 416, and a VRD eyepiece 418.
  • the projection assembly 400 can also include a variety of supporting components 150 as discussed below.
  • Light 800 can be a challenging resource to manage. Light 800 moves quickly and cannot be constrained in the same way that most inputs or raw materials can be.
  • Figure 2f is a hierarchy diagram illustrating an example of some supporting components 150, many of which are conventional optical components. Any display technology application will involve conventional optical components such as mirrors 151 (including dichroic mirrors 152), lenses 160, collimators 170, and plates 180. Similarly, any powered device requires a power source 191, and a device capable of displaying an image 880 is likely to have a processor 190.
  • the system 100 can be described as the interconnected functionality of an illumination assembly 200, an imaging assembly 300, and a projection assembly 400.
  • the system 100 can also be described in terms of a method 900 that includes an illumination process 910, an imaging process 920, and a projection process 930.
  • the system 100 can be implemented with respect to a wide variety of different display technologies, including but not limited to DLP.
  • DLP Embodiments
  • FIG. 3a illustrates an example of a DLP system 141, i.e. an embodiment of the system 100 that utilizes DLP optical elements.
  • DLP systems 141 utilize a DMD 314 (digital micromirror device) comprised of millions of tiny mirrors as the modulator 320. Each micromirror in the DMD 314 can pertain to a particular pixel in the image 880.
  • the illumination assembly 200 includes a light source 210 and multiple diffusers 282.
  • the light 800 then passes to the imaging assembly 300.
  • Two TIR prisms 311 direct the light 800 to the DMD 314, the DMD 314 creates an image 880 with that light 800, and the TIR prisms 311 then direct the light 800 embodying the image 880 to the display 410 where it can be enjoyed by one or more users 90.
  • Figure 3b is a more detailed example of a DLP system 141.
  • the illumination assembly 200 includes one or more lenses 160; typically a condensing lens 160 and then a shaping lens 160 (not illustrated) are used to direct the light 800 to the array of TIR prisms 311.
  • a lens 160 is positioned before the display 410 to modify/focus the image 880 before providing the image 880 to the users 90.
  • Figure 3b also includes a more specific term for the light 800 at various stages in the process.
  • the system 100 can be implemented in a wide variety of different configurations and scales of operation. However, the original inspiration for the conception of using non-identical subframe illumination sequences 854 occurred in the context of a VRD visor system 106 embodied as a VRD visor apparatus 116.
  • a VRD visor apparatus 116 projects the image 880 directly onto the eyes of the user 90.
  • the VRD visor apparatus 116 is a device that can be worn on the head of the user 90.
  • the VRD visor apparatus 116 can include sound as well as visual capabilities. Such embodiments can include multiple modes of operation, such as visual only, audio only, and audio-visual modes. When used in a non-visual mode, the VRD apparatus 116 can be configured to look like ordinary headphones.
  • Figure 4a is a perspective diagram illustrating an example of a VRD visor apparatus 116.
  • Two VRD eyepieces 418 provide for directly projecting the image 880 onto the eyes of the user 90.
  • Figure 4b is a side view diagram illustrating an example of a VRD visor apparatus 116 being worn on the head 94 of a user 90. The eyes 92 of the user 90 are blocked by the apparatus 116 itself, with the apparatus 116 in a position to project the image 880 on the eyes 92 of the user 90.
  • Figure 4c is a component diagram illustrating an example of a VRD visor apparatus 116 for the left eye 92. A mirror image of Figure 4c would pertain to the right eye 92.
  • a 3 LED light source 213 generates the light 800, which passes through a condensing lens 160 that directs the light 800 to a mirror 151, which reflects the light 800 to a shaping lens 160 prior to the entry of the light 800 into an imaging assembly 300 comprised of two TIR prisms 311 and a DMD 314.
  • the interim image 850 from the imaging assembly 300 passes through another lens 160 that focuses the interim image 850 into a final image 880 that is viewable to the user 90 through the eyepiece 416.
  • the system 100 represents a substantial improvement over prior art display technologies. Just as there are a wide range of prior art display technologies, the system 100 can be similarly implemented in a wide range of different ways.
  • the innovation of altering the subframe illumination sequence 854 within a particular frame 882 can be implemented at a variety of different scales, utilizing a variety of different display technologies, in both immersive and augmenting contexts, and in both one-way (no sensor feedback from the user 90) and two-way (sensor feedback from the user 90) embodiments.
  • Display devices can be implemented in a wide variety of different scales.
  • the monster scoreboard at EverBank Field (home of the Jacksonville Jaguars) is a display system that is 60 feet high, 362 feet long, and comprised of 35.5 million LED bulbs. The scoreboard is intended to be viewed simultaneously by tens of thousands of people.
  • the GLYPH™ visor by Avegant Corporation is a device that is worn on the head of a user and projects visual images directly into the eyes of a single viewer. Between those edges of the continuum are a wide variety of different display systems.
  • the system 100 displays visual images 880 to users 90 with enhanced light having reduced coherence.
  • the system 100 can be potentially implemented in a wide variety of different scales.
  • Figure 5a is a hierarchy diagram illustrating various categories and subcategories pertaining to the scale of implementation for display systems generally, and the system 100 specifically. As illustrated in Figure 5a, the system 100 can be implemented as a large system 101 or a personal system 103.
  • a large system 101 is intended for use by more than one simultaneous user 90.
  • Examples of large systems 101 include movie theater projectors, large screen TVs in a bar, restaurant, or household, and other similar displays.
  • Large systems 101 include a subcategory of giant systems 102, such as stadium scoreboards 102a, the Times Square displays 102b, or other large outdoor displays such as billboards off the expressway.
  • a personal system 103 is an embodiment of the system 100 that is designed for viewing by a single user 90.
  • Examples of personal systems 103 include desktop monitors 103a, portable TVs 103b, laptop monitors 103c, and other similar devices.
  • the category of personal systems 103 also includes the subcategory of near-eye systems 104.
  • a near-eye system 104 is a subcategory of personal systems 103 where the eyes of the user 90 are within about 12 inches of the display.
  • Near-eye systems 104 include tablet computers 104a, smart phones 104b, and eye-piece applications 104c such as cameras, microscopes, and other similar devices.
  • the subcategory of near-eye systems 104 includes a subcategory of visor systems 105.
  • a visor system 105 is a subcategory of near-eye systems 104 where the portion of the system 100 that displays the visual image 880 is actually worn on the head 94 of the user 90. Examples of such systems 105 include virtual reality visors, Google Glass, and other conventional head-mounted displays 105a.
  • the category of visor systems 105 includes the subcategory of VRD visor systems 106.
  • a VRD visor system 106 is an implementation of a visor system 105 where visual images 880 are projected directly on the eyes of the user.
  • the technology of projecting images directly on the eyes of the viewer is disclosed in a published patent application titled "IMAGE GENERATION SYSTEMS AND IMAGE GENERATING METHODS" (U.S. Serial Number 13/367,261) that was filed on February 6, 2012, the contents of which are hereby incorporated by reference. It is anticipated that a VRD visor system 106 is particularly well suited for the implementation of non-identical subframe illumination sequences 854.
  • FIG. 5b is a hierarchy diagram illustrating an example of different categories and subcategories of apparatuses 110.
  • Figure 5b closely mirrors Figure 5a.
  • the universe of potential apparatuses 110 includes the categories of large apparatuses 111 and personal apparatuses 113.
  • Large apparatuses 111 include the subcategory of giant apparatuses 112.
  • the category of personal apparatuses 113 includes the subcategory of near-eye apparatuses 114, which includes the subcategory of visor apparatuses 115.
  • VRD visor apparatuses 116 comprise a category of visor apparatuses 115 that implement virtual retinal displays, i.e. they project visual images 200 directly into the eyes of the user 90.
  • Figure 5c is a diagram illustrating an example of a perspective view of a VRD visor system 106 embodied in the form of an integrated VRD visor apparatus 116 that is worn on the head 94 of the user 90. Dotted lines are used with respect to element 92 because the eyes 92 of the user 90 are blocked by the apparatus 116 itself in the illustration.
  • FIG. 5d is a hierarchy diagram illustrating different categories of the system 100 based on the underlying display technology in which the system 100 can be implemented.
  • the system 100 is intended for use as a DLP system 141, but could potentially be implemented as an LCOS system 143 or even an LCD system 142, although the means of implementation would differ and the reasons for implementation may not exist.
  • the system 100 can also be implemented in other categories and subcategories of display technologies.
  • Figure 5e is a hierarchy diagram illustrating a hierarchy of systems 100 organized into categories based on the distinction between immersion and augmentation.
  • Some embodiments of the system 100 can have a variety of different operating modes 120.
  • An immersion mode 121 has the function of blocking out the outside world so that the user 90 is focused exclusively on what the system 100 displays to the user 90.
  • an augmentation mode 122 is intended to display visual images 200 that are superimposed over the physical environment of the user 90.
  • the distinction between immersion and augmentation modes of the system 100 is particularly relevant in the context of near-eye systems 104 and visor systems 105.
  • system 100 can be configured to operate either in immersion mode or augmentation mode, at the discretion of the user 90, while other embodiments of the system 100 may possess only a single operating mode 120.
  • Figure 5f is a hierarchy diagram that reflects the categories of a one-way system 124 (a non-sensing operating mode 124) and a two-way system 123 (a sensing operating mode 123).
  • a two-way system 123 can include functionality such as retina scanning and monitoring. Users 90 can be identified, the focal point of the eyes 92 of the user 90 can potentially be tracked, and other similar functionality can be provided.
  • In a one-way system 124, there is no sensor or array of sensors capturing information about or from the user 90.
  • Display devices are sometimes integrated with a media player.
  • a media player is totally separate from the display device.
  • a laptop computer can include, in a single integrated device, a screen for displaying a movie, speakers for projecting the accompanying sound, and a DVD or BLU-RAY player for playing the source media off a disc.
  • Such a device is also capable of streaming media content.
  • Figure 5g is a hierarchy diagram illustrating a variety of different categories of systems 100 based on whether the system 100 is integrated with a media player.
  • An integrated media player system 107 includes the capability of actually playing media content as well as displaying the image 880.
  • a non-integrated media player system 108 must communicate with a media player in order to play media content.
  • Figure 5h is a hierarchy diagram illustrating an example of different roles that a user 90 can have.
  • a viewer 96 can access the image 880 but is not otherwise able to control the functionality of the system 100.
  • An operator 98 can control the operations of the system 100, but does not necessarily access the image 880.
  • In a movie theater, the viewers 96 are the patrons and the operator 98 is the employee of the theater.
  • media content 840 can include a wide variety of different types of attributes.
  • a system 100 for displaying an image 880 is a system 100 that plays media content 840 with a visual attribute 841 .
  • many instances of media content 840 will also include an acoustic attribute 842 or even a tactile attribute.
  • an image 880 can be part of a larger video 890 context.
  • an image 880 can also be a stand-alone still frame 882.
  • Table 1 below sets forth a list of element numbers, names, and descriptions/definitions.
  • a user 90 is a viewer 96 and/or operator 98 of the system 100.
  • the user 90 is typically a human being.
  • users 90 can be different organisms such as dogs or cats, or even automated technologies such as expert systems, artificial intelligence applications, and other similar "entities".
  • the eye consists of different portions including but not limited to the sclera, iris, cornea, pupil, and retina.
  • Some embodiments of the system 100 involve a VRD visor apparatus 116 that can project the desired image 880 directly onto the eye 92 of the user 90.
  • Head: The portion of the body of the user 90 that includes the eye 92.
  • Some embodiments of the system 100 can involve a visor apparatus 115 that is worn on the head 94 of the user 90.
  • Viewer: A user 90 of the system 100 who views the image 880 provided by the system 100. All viewers 96 are users 90, but not all users 90 are viewers 96. The viewer 96 does not necessarily control or operate the system 100. The viewer 96 can be a passive beneficiary of the system 100, such as a patron at a movie theater who is not responsible for the operation of the projector, or someone wearing a visor apparatus 115 that is controlled by someone else.
  • Operator: A user 90 of the system 100 who exerts control over the processing of the system 100. All operators 98 are users 90, but not all users 90 are operators 98. The operator 98 does not necessarily view the images 880.
  • the operator 98 may be someone operating the system 100 for the benefit of others who are viewers 96.
  • the operator 98 of the system 100 may be someone such as a projectionist at a movie theater or the individual controlling the system 100.
  • System: A collective configuration of assemblies, subassemblies, components, processes, and/or data that provides a user 90 with the functionality of engaging in a media experience such as viewing an image 880.
  • Some embodiments of the system 100 can involve a single integrated apparatus 110 hosting all components of the system 100, while other embodiments of the system 100 can involve different non-integrated device configurations.
  • Some embodiments of the system 100 can be large systems 102 or even giant systems 101, while other embodiments of the system 100 can be personal systems 103, such as near-eye systems 104, visor systems 105, and VRD visor systems 106.
  • Systems 100 can also be referred to as media systems 100 or display systems 100.
  • Giant System: An embodiment of the system 100 intended to be viewed simultaneously by a thousand or more people.
  • Examples of giant systems 101 include scoreboards at large stadiums, electronic billboards such as the displays in Times Square in New York City, and other similar displays.
  • a giant system 101 is a subcategory of large systems 102.
  • Large System: A large system 102 is not a personal system 103.
  • the media experience provided by a large system 102 is intended to be shared by a roomful of viewers 96 using the same illumination assembly 200, imaging assembly 300, and projection assembly 400.
  • Examples of large systems 102 include but are not limited to a projector/screen configuration in a movie theater, classroom, or conference room; television sets in sports bar, airport, or residence; and scoreboard displays at a stadium.
  • Large systems 101 can also be referred to as large media systems 101 .
  • Examples of personal media systems 103 include desktop computers (often referred to as personal computers), laptop computers, portable televisions, and near-eye systems 104.
  • personal systems 103 can also be referred to as personal media systems 103.
  • Near-eye systems 104 are a subcategory of personal systems 103.
  • Near-Eye System: A category of personal systems 103 where the media experience is communicated to the viewer 96 at a distance that is less than or equal to about 12 inches (30.48 cm) away.
  • Examples of near-eye systems 104 include but are not limited to tablet computers, smart phones, and visor media systems 105.
  • Near-eye systems 104 can also be referred to as near-eye media systems 104.
  • Near-eye systems 104 include devices with eye pieces such as cameras, telescopes, microscopes, etc.
  • Visor systems 105 can also be referred to as visor media systems 105.
  • VRD Visor System: VRD stands for virtual retinal display. VRDs can also be referred to as retinal scan displays ("RSD") or retinal projectors ("RP"). A VRD projects the image 880 directly onto the retina of the eye 92 of the viewer 96.
  • a VRD Visor System 106 is a visor system 105 that utilizes a VRD to display the image 880 on the eyes 92 of the user 90.
  • a VRD visor system 106 can also be referred to as a VRD visor media system 106.
  • the apparatus 110 can include the illumination assembly 200, the imaging assembly 300, and the projection assembly 400. Some embodiments of the apparatus 110 can include a media player 848 while other embodiments of the apparatus 110 are configured to connect and communicate with an external media player 848. Different configurations and connection technologies can provide varying degrees of "plug and play" connectivity that can be easily installed and removed by users 90.
  • Common examples of a giant apparatus 112 include the scoreboards at a professional sports stadium or arena.
  • Large Apparatus: An apparatus 110 implementing an embodiment of a large system 102.
  • large apparatuses 111 include movie theater projectors and large screen television sets.
  • a large apparatus 111 is typically positioned on a floor or some other support structure.
  • a large apparatus 111 such as a flat screen TV can also be mounted on a wall.
  • Personal Media Apparatus: An apparatus 110 implementing an embodiment of a personal system 103. Many personal apparatuses 113 are highly portable and are supported by the user 90. Other embodiments of personal media apparatuses 113 are positioned on a desk, table, or similar surface. Common examples of personal apparatuses 113 include desktop computers, laptop computers, and portable televisions.
  • many near-eye apparatuses 114 are held in the hand of the user 90.
  • Examples of near-eye apparatuses 114 include smart phones, tablet computers, camera eye-pieces and displays, microscope eye-pieces and displays, gun scopes, and other similar devices.
  • Visor Apparatus: An apparatus 110 implementing an embodiment of a visor system 105. The visor apparatus 115 is worn on the head 94 of the user 90. The visor apparatus 115 can also be referred to simply as a visor 115.
  • VRD Visor Apparatus: An apparatus 110 in a VRD visor system 106. Unlike other visor apparatuses 115, the VRD visor apparatus 116 includes a virtual retinal display that projects the visual image 200 directly on the eyes 92 of the user 90.
  • Operating Modes: the system 100 can be implemented in such a way as to support distinct manners of operation.
  • the user 90 can explicitly or implicitly select which operating mode 120 controls.
  • the system 100 can determine the applicable operating mode 120 in accordance with the processing rules of the system 100.
  • the system 100 is implemented in such a manner that supports only one operating mode 120 with respect to a potential feature. For example, some systems 100 can provide users 90 with a choice between an immersion mode 121 and an augmentation mode 122, while other embodiments of the system 100 may only support one mode 120 or the other.
  • Immersion Mode: An operating mode 120 of the system 100 in which the outside world is at least substantially blocked off visually from the user 90, such that the images 880 displayed to the user 90 are not superimposed over the actual physical environment of the user 90.
  • the act of watching a movie is intended to be an immersive experience.
  • Augmentation Mode: An operating mode 120 of the system 100 in which the image 880 displayed by the system 100 is added to a view of the physical environment of the user 90, i.e. the image 880 augments the real world.
  • Google Glass is an example of an electronic display that can function in an augmentation mode.
  • Sensing Mode: An operating mode 120 of the system 100 in which the system 100 captures information about the user 90 through one or more sensors. Examples of different categories of sensing can include eye tracking pertaining to the user's interaction with the displayed image 880, biometric scanning such as retina scans to determine the identity of the user 90, and other types of sensor readings/measurements.
  • Display Technology: the system 100 can be implemented using a wide variety of different display technologies.
  • DLP System: An embodiment of the system 100 that utilizes digital light processing ("DLP") to compose an image 880 from light 800.
  • LCD System: An embodiment of the system 100 that utilizes a liquid crystal display ("LCD") to compose an image 880 from light 800.
  • LCOS System: An embodiment of the system 100 that utilizes liquid crystal on silicon ("LCOS") to compose an image 880 from light 800.
  • the text and drawings of a patent are not intended to serve as product blueprints.
  • One of ordinary skill in the art can devise multiple variations of supplementary components 150 that can be used in conjunction with the innovative elements listed in the claims, illustrated in the drawings, and described in the text.
  • Mirror 151: An object that possesses at least a non-trivial magnitude of reflectivity with respect to light. Depending on the context, a particular mirror could be virtually 100% reflective, while in other cases merely 50% reflective. Mirrors 151 can be comprised of a wide variety of different materials.
  • Lens 160: An object that possesses at least a non-trivial magnitude of transmissivity. Depending on the context, a particular lens could be virtually 100% transmissive, while in other cases merely about 50% transmissive. A lens 160 is often used to focus light 800.
  • Processor 190: A central processing unit (CPU) that is capable of carrying out the instructions of a computer program.
  • the system 100 can use one or more processors 190 to communicate with and control the various components of the system 100.
  • Power Source A source of electricity for the system 100.
  • Examples of power sources include various batteries as well as power adaptors that use a cable to supply power to the system 100.
  • Illumination Assembly: A collection of components used to supply light 800 to the imaging assembly 300. Common examples of components in the illumination assembly 200 include light sources 210 and diffusers 282. The illumination assembly 200 can also be referred to as an illumination subsystem 200.
  • Light Modulator: reflection-based light modulators 322 include DMDs 324 and LCOS panels, while LCDs 330 are typically transmission-based.
  • a DMD 324 is typically comprised of several thousand microscopic mirrors arranged in an array on a processor 190, with the individual microscopic mirrors corresponding to the individual pixels in the image 880.
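Because each DMD micromirror is binary (tilted toward or away from the projection optics), intermediate pixel brightness is conventionally produced by pulse-width modulation across bit-planes: the fraction of time a mirror spends "on" sets the perceived intensity of its pixel. The following toy sketch illustrates that general DLP technique only; the function name and timing units are assumptions, not the patent's implementation.

```python
def mirror_schedule(pixel_value: int, bits: int = 8) -> list:
    """Return (relative_duration, mirror_on) pairs, one per bit-plane,
    most significant bit first."""
    assert 0 <= pixel_value < 2 ** bits
    schedule = []
    for bit in range(bits - 1, -1, -1):
        duration = 2 ** bit          # each bit-plane lasts 2^bit time units
        on = bool(pixel_value & (1 << bit))
        schedule.append((duration, on))
    return schedule

# A mid-gray pixel (value 128 of 255) keeps its mirror "on" only during
# the longest bit-plane: 128 of the 255 total time units.
sched = mirror_schedule(128)
on_time = sum(d for d, on in sched if on)
total = sum(d for d, _ in sched)
print(on_time, total)  # 128 255
```

The ratio of "on" time to total time (here 128/255) approximates the normalized pixel value, which is why rapid binary switching can render a full grayscale ramp.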
  • an LCD is a display that uses the light modulating properties of liquid crystals.
  • Each pixel of an LCD typically consists of a layer of molecules aligned between two transparent electrodes and two polarizing filters, the axes of transmission of which are (in most cases) perpendicular to each other. Without the liquid crystal between the polarizing filters, light passing through the first filter would be blocked by the second (crossed) polarizer.
  • Some LCDs are transmissive while other LCDs are transflective.
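The crossed-polarizer behavior described above follows Malus's law: the fraction of light passing the second polarizer is the squared cosine of the angle between the light's polarization and that polarizer's axis. A minimal sketch, assuming a simplified twisted-nematic-style model in which the liquid crystal rotates the polarization by a controllable angle (the function and parameter names are illustrative, not from the patent):

```python
import math

def transmission(twist_deg: float, analyzer_deg: float = 90.0) -> float:
    """Fraction of light passing the second (crossed) polarizer,
    ignoring absorption losses, per Malus's law."""
    angle = math.radians(analyzer_deg - twist_deg)
    return math.cos(angle) ** 2

# No twist: the crossed polarizers block the pixel (transmission ~ 0).
print(transmission(0.0))
# A 90-degree twist realigns the light with the analyzer: full transmission.
print(transmission(90.0))
```

Intermediate twist angles give partial transmission (a 45-degree twist passes half the light), which is how an LCD pixel renders gray levels.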
  • LCOS Panel or LCOS: A light modulator 320 in an LCOS (liquid crystal on silicon) display.
  • An LCOS 244 can be transmissive or reflective.
  • Combiner: A component that combines different colors of light 800 to formulate an image 880.
  • Projection Assembly 400: A collection of components used to make the image 880 accessible to the user 90.
  • the projection assembly 400 includes a display 410.
  • the projection assembly 400 can also include various supporting components 150 that focus the image 880 or otherwise modify the interim image 850 transforming it into the image 880 that is displayed to one or more users 90.
  • the projection assembly 400 can also be referred to as a projection subsystem 400.
  • Display or Screen 410: An assembly, subassembly, mechanism, or device by which the visual image 200 is made accessible to the user 90.
  • the display component 120 can be in the form of a panel 122 that is viewed by the user 90 or a screen 126 onto which the visual image 200 is projected by a projector 124.
  • the display component 120 is a retinal projector 128 that projects the visual image 200 directly onto the eyes 92 of the user 90.
  • Passive Screen A non-powered surface on which the image 880 is projected.
  • a conventional movie theater screen is a common example of a passive screen 412.
  • Eyepiece: A display 410 positioned directly in front of the eye 92 of an individual user 90.
  • VRD Eyepiece or VRD Display: An eyepiece 416 that provides for directly projecting the image 880 on the eyes 92 of the user 90.
  • a VRD eyepiece 418 can also be referred to as a VRD display 418.
  • Light: Light 800 is the medium through which an image is conveyed, and light 800 is what enables the sense of sight.
  • Light is electromagnetic radiation that is propagated in the form of photons.
  • Light can be coherent light 802, partially coherent light 803, or non-coherent light 804.
  • the image 880 displayed to the user 90 by the system 100 can, in many instances, be but part of a broader media experience.
  • a unit of media content 840 will typically include visual attributes 841 and acoustic attributes 842.
  • Tactile attributes 843 are not uncommon in certain contexts. It is anticipated that olfactory attributes 844 and gustatory attributes 845 may be added to media content 840 in the future.
  • Visual Attributes: Attributes pertaining to the sense of sight. The core function of the system 100 is to enable users 90 to experience visual content such as images 880 or video 890.
  • visual content will be accompanied by other types of content, most commonly sound or touch.
  • smell or taste content may also be included as part of the media content 840.
  • Acoustic Attributes: Attributes pertaining to the sense of sound. While the core function of the system 100 is to enable users 90 to experience visual content such as images 880 or video 890, media content 840 will often also involve other types of senses, such as the sense of sound.
  • Tactile Attributes: the system 100 and apparatuses 110 embodying the system 100 can include the ability to enable users 90 to experience tactile attributes 843 included with other types of media content 840.
  • Olfactory Attributes: Attributes pertaining to the sense of smell. It is anticipated that future versions of media content 840 may include some capacity to engage users 90 with respect to their sense of smell. Such a capacity can be utilized in conjunction with the system 100, and potentially integrated with the system 100.
  • the iPhone app called oSnap is a current example of olfactory attributes 844 being transmitted electronically.
  • Gustatory Attributes: Attributes pertaining to the sense of taste. It is anticipated that future versions of media content 840 may include some capacity to engage users 90 with respect to their sense of taste. Such a capacity can be utilized in conjunction with the system 100, and potentially integrated with the system 100.
  • Media Player: The system 100 for displaying the image 880 to one or more users 90 may itself belong to a broader configuration of applications and systems. A media player 848 is a device or configuration of devices that provides for the playing of media content 840 for users 90. Examples of media players 848 include disc players such as DVD players and BLU-RAY players, cable boxes, tablet computers, smart phones, desktop computers, laptop computers, television sets, and other similar devices. Some embodiments of the system 100 can include some or all of the aspects of a media player 848, while other embodiments of the system 100 will require that the system 100 be connected to a media player 848.
  • users 90 may connect a VRD apparatus 116 to a BLU-RAY player in order to access the media content 840 on a BLU-RAY disc.
  • the VRD apparatus 116 may include stored media content 840 in the form of a disc or computer memory component.
  • Non-integrated versions of the system 100 can involve media players 848 connected to the system 100 through wired and/or wireless means.
  • the image 880 displayed to the user 90 is created by the modulation of light 800 generated by one or more light sources 210 in the illumination assembly 200.
  • the image 880 will typically be modified in certain ways before it is made accessible to the user 90. Such earlier versions of the image 880 can be referred to as an interim image 850.
  • Subframe: A portion of an image 880 or interim image 850.
  • a DLP projector will illuminate different pixels at different times based on color.
  • a subframe 852 is created by a pulse 860 of light. The particular pixels being illuminated in a subframe 852 can be referred to as a color map 875.
  • Subframe Illumination Sequence: The sequence in which different subframes 852 are illuminated with different colors of light 800. A DLP projector has traditionally used a color wheel 240 to implement the subframe illumination sequence 854.
  • Pulse 860: An emission of light generated by the light source 210.
  • a pulse 860 can be defined with respect to color/wavelength, intensity, duration, the applicable pulse pixels (a color map), and an order in a sequence 854.
  • Subframe Illumination Sequence Attributes: Characteristics of a subframe illumination sequence 854 and its pulses 860 include color order 871, pulse intensity 872, gap length 873, pulse duration 874, and pulsed pixels 875 (i.e. color map).
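The pulse and sequence attributes enumerated above (color order 871, pulse intensity 872, gap length 873, pulse duration 874, and pulsed pixels 875) can be sketched as a simple data structure. This is an illustrative model only; the class names, fields, and microsecond units are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Pulse:
    color: str        # color/wavelength of the pulse (color order 871)
    intensity: float  # relative pulse intensity 872, 0.0-1.0
    duration_us: int  # pulse duration 874, in microseconds (assumed unit)
    gap_us: int       # gap length 873 before the next pulse
    color_map: set = field(default_factory=set)  # pulsed pixels 875

@dataclass
class SubframeIlluminationSequence:
    pulses: list      # ordered pulses 860 making up the sequence 854

    def cycle_time_us(self) -> int:
        """Total time to play every pulse plus its trailing gap."""
        return sum(p.duration_us + p.gap_us for p in self.pulses)

# A traditional color-wheel-style ordering: one pulse per primary color.
seq = SubframeIlluminationSequence(pulses=[
    Pulse("red",   1.0, 3000, 500, {0, 1, 2}),
    Pulse("green", 1.0, 3000, 500, {1, 2, 3}),
    Pulse("blue",  1.0, 3000, 500, {0, 3}),
])
print(seq.cycle_time_us())  # 3 * (3000 + 500) = 10500
```

Reordering the `pulses` list, or varying intensity and gap length per pulse, is exactly the kind of sequence manipulation the attributes above parameterize.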
  • the system 100 performs the function of displaying images 880 to one or more users 90.
  • light 800 is modulated into an interim image 850, and subsequent processing by the system 100 can modify that interim image 850 in various ways.
  • the final version of the interim image 850 is no longer a work in process; it is the image 880 that is displayed to the user 90.
  • each image 880 can be referred to as a frame 882.
  • Video: In some instances, the image 880 displayed to the user 90 is part of a sequence of images 880 that can be referred to collectively as a video 890.
  • Video 890 is comprised of a sequence of static images 880 representing snapshots displayed in rapid succession to each other. Persistence of vision in the user 90 can be relied upon to create an illusion of continuity, allowing a sequence of still images 880 to give the impression of motion.
  • the entertainment industry currently relies primarily on frame rates between 24 FPS and 30 FPS, but the system 100 can be implemented at faster as well as slower frame rates.
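The frame rates above imply a fixed time budget per frame that any subframe illumination sequence must fit within. A sketch of the arithmetic (illustrative only, not from the patent):

```python
def frame_budget_ms(fps: float) -> float:
    """Time available to display one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
# 24 FPS -> 41.67 ms; 30 FPS -> 33.33 ms; 60 FPS -> 16.67 ms
```

At faster frame rates the budget shrinks, so the pulses and gaps of the subframe illumination sequence must be proportionally shorter to fit.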
  • Illumination Method: A process for generating light 800 for use by the system 100. The illumination method 910 is a process performed by the illumination assembly 200.
  • Imaging Method: A process for generating an interim image 850 from the light 800 supplied by the illumination assembly 200.
  • the imaging method 920 can also involve making subsequent modifications to the interim image 850.
  • Display Method: A process for making the image 880 available to users 90 using the interim image 850 resulting from the imaging method 920.
  • the display method 930 can also include making modifications to the interim image 850.

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Lenses (AREA)
  • Liquid Crystal Display Device Control (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)
PCT/US2015/010377 2014-01-06 2015-01-06 System, method, and apparatus for displaying an image with reduced color breakup WO2015103638A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461924209P 2014-01-06 2014-01-06
US61/924,209 2014-01-06

Publications (1)

Publication Number Publication Date
WO2015103638A1 true WO2015103638A1 (en) 2015-07-09

Family

ID=53494154

Family Applications (3)

Application Number Title Priority Date Filing Date
PCT/US2015/010377 WO2015103638A1 (en) 2014-01-06 2015-01-06 System, method, and apparatus for displaying an image with reduced color breakup
PCT/US2015/010372 WO2015103633A1 (en) 2014-01-06 2015-01-06 System, method, and apparatus for displaying an image using multiple diffusers
PCT/US2015/010380 WO2015103640A1 (en) 2014-01-06 2015-01-06 Imaging a curved mirror and partially transparent plate

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/US2015/010372 WO2015103633A1 (en) 2014-01-06 2015-01-06 System, method, and apparatus for displaying an image using multiple diffusers
PCT/US2015/010380 WO2015103640A1 (en) 2014-01-06 2015-01-06 Imaging a curved mirror and partially transparent plate

Country Status (4)

Country Link
EP (1) EP3092791A4 (zh)
JP (1) JP2017511496A (zh)
CN (1) CN106464818A (zh)
WO (3) WO2015103638A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106595859B (zh) * 2016-11-01 2019-10-29 清华大学 鬼成像方法和应用其的鬼成像装置
CN110603513B (zh) * 2018-08-27 2023-08-29 深圳市汇顶科技股份有限公司 一种眼睛跟踪设备及利用光学成像跟踪眼睛的方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060238717A1 (en) * 2005-04-22 2006-10-26 Bart Maximus Method and systems for projecting images
US20120069131A1 (en) * 2010-05-28 2012-03-22 Abelow Daniel H Reality alternate
US20130307842A1 (en) * 2012-05-15 2013-11-21 Imagine Mobile Augmented Reality Ltd System worn by a moving user for fully augmenting reality by anchoring virtual objects

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6097543A (en) * 1992-02-07 2000-08-01 I-O Display Systems Llc Personal visual display
US6008781A (en) * 1992-10-22 1999-12-28 Board Of Regents Of The University Of Washington Virtual retinal display
US5886822A (en) * 1996-10-08 1999-03-23 The Microoptical Corporation Image combining system for eyeglasses and face masks
JP3943680B2 (ja) * 1997-01-06 2007-07-11 オリンパス株式会社 映像表示装置
US6097353A (en) * 1998-01-20 2000-08-01 University Of Washington Augmented retinal display with view tracking and data positioning
US7084841B2 (en) * 2000-04-07 2006-08-01 Tibor Balogh Method and apparatus for the presentation of three-dimensional images
US6932090B1 (en) * 2003-02-06 2005-08-23 The United States Of America As Represented By The United States National Aeronautics And Space Administration Motion sickness treatment apparatus and method
US7724210B2 (en) * 2004-05-07 2010-05-25 Microvision, Inc. Scanned light display system using large numerical aperture light source, method of using same, and method of making scanning mirror assemblies
JP2008508621A (ja) * 2004-08-03 2008-03-21 シルバーブルック リサーチ ピーティワイ リミテッド ウォークアップ印刷
US7275826B2 (en) * 2005-08-03 2007-10-02 Carestream Health, Inc. Fundus camera having curved mirror objective
US8376585B2 (en) * 2008-10-28 2013-02-19 Raymond A. Noeth Energy efficient illumination apparatus and method for illuminating surfaces
US20120086917A1 (en) * 2009-09-29 2012-04-12 Sanyo Electric Co., Ltd. Optical unit, projection display apparatus, and optical diffuser
US9134534B2 (en) * 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9529191B2 (en) * 2010-11-03 2016-12-27 Trex Enterprises Corporation Dynamic foveal vision display

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060238717A1 (en) * 2005-04-22 2006-10-26 Bart Maximus Method and systems for projecting images
US20120069131A1 (en) * 2010-05-28 2012-03-22 Abelow Daniel H Reality alternate
US20130307842A1 (en) * 2012-05-15 2013-11-21 Imagine Mobile Augmented Reality Ltd System worn by a moving user for fully augmenting reality by anchoring virtual objects

Also Published As

Publication number Publication date
JP2017511496A (ja) 2017-04-20
EP3092791A1 (en) 2016-11-16
CN106464818A (zh) 2017-02-22
WO2015103640A1 (en) 2015-07-09
WO2015103633A1 (en) 2015-07-09
EP3092791A4 (en) 2017-09-06

Similar Documents

Publication Publication Date Title
US9995857B2 (en) System, apparatus, and method for displaying an image using focal modulation
US9823474B2 (en) System, apparatus, and method for displaying an image with a wider field of view
US20160195718A1 (en) System, method, and apparatus for displaying an image using multiple diffusers
US20170139209A9 (en) System, method, and apparatus for displaying an image using a curved mirror and partially transparent plate
US10409079B2 (en) Apparatus, system, and method for displaying an image using a plate
US20160292921A1 (en) System, apparatus, and method for displaying an image using light of varying intensities
US20170068311A1 (en) System, apparatus, and method for selectively varying the immersion of a media experience
US20160198133A1 (en) System, method, and apparatus for displaying an image with reduced color breakup
US6847489B1 (en) Head-mounted display and optical engine thereof
CN107894666B (zh) 一种头戴式多深度立体图像显示系统及显示方法
US20110057862A1 (en) Image display device
JP7096371B2 (ja) オフアングル分離が強化されたスーパーステレオスコピックディスプレイ
US10353213B2 (en) See-through display glasses for viewing 3D multimedia
Rolland et al. The past, present, and future of head-mounted display designs
US11656466B2 (en) Spatio-temporal multiplexed single panel based mutual occlusion capable head mounted display system and method
Liu et al. Full‐color multi‐plane optical see‐through head‐mounted display for augmented reality applications
CN108732752A (zh) 一种用于虚拟现实、扩增现实的显示设备
WO2015179455A2 (en) Apparatus, system, and method for displaying an image using a plate
WO2015103638A1 (en) System, method, and apparatus for displaying an image with reduced color breakup
CN109154737B (zh) 动态全三维显示
KR101406793B1 (ko) 컬러휠, 이를 채용한 조명유닛, 및 컬러휠을 채용한2d/3d 겸용 영상표시장치
WO2023143505A1 (zh) 一种图像生成装置、显示设备和图像生成方法
Sun et al. Simulation design of a wearable see-through retinal projector
JP2015046848A (ja) 画像表示装置
Bernacki et al. Virtual reality 3D headset based on DMD light modulators

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15733187

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase
122 Ep: pct application non-entry in european phase

Ref document number: 15733187

Country of ref document: EP

Kind code of ref document: A1