EP3146389A2 - Apparatus, system and method for displaying an image using a plate - Google Patents

Apparatus, system and method for displaying an image using a plate

Info

Publication number
EP3146389A2
Authority
EP
European Patent Office
Prior art keywords
light
plate
image
display
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15795835.6A
Other languages
German (de)
English (en)
Other versions
EP3146389A4 (fr)
Inventor
Scott D. DEWALD
Allan Thomas EVANS
Chris WESTRA
Warren Cornelius WELCH, III
Andrew Gross
Geoffrey Michael Hill
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avegant Corp
Original Assignee
Avegant Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/590,953 (published as US 2017/0139209 A9)
Priority claimed from US14/678,974 (published as US 2017/0068311 A1)
Application filed by Avegant Corp filed Critical Avegant Corp
Publication of EP3146389A2
Publication of EP3146389A4

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/0816 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
    • G02B26/0833 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements the reflecting element being a micromechanical device, e.g. a MEMS mirror, DMD
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/005 Projectors using an electronic spatial light modulator but not peculiar thereto
    • G03B21/008 Projectors using an electronic spatial light modulator but not peculiar thereto using micromirror devices
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/14 Details
    • G03B21/20 Lamp housings
    • G03B21/2066 Reflectors in illumination beam
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/14 Details
    • G03B21/28 Reflectors in projection beam
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3102 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] using two-dimensional electronic spatial light modulators
    • H04N9/3111 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] using two-dimensional electronic spatial light modulators for displaying the colours sequentially, e.g. by using sequentially activated light sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H04N9/315 Modulator illumination systems

Definitions

  • The invention is an apparatus, system, and method (collectively the "system") that can display an image to a viewer. More specifically, the system can utilize a plate that is partially transmissive and partially reflective in lieu of expensive prisms, such as TIR or RTIR prisms, to direct light to and from a modulator.
  • Light is an important raw material in any image display device. Light is generated by a light source, modulated into an image, and then focused and directed so that the resulting image is made accessible to a viewer. Within these different steps, light must be directed from place to place. Light can be a challenging resource to manage because it is comprised of very small units that are capable of moving independently of each other. Light moves incredibly fast, and it readily changes direction upon hitting different objects. Human vision itself is based on light bouncing off different objects and reaching the human eye.
  • the plate serves as a "traffic cop" for light reaching the modulator (such as a DMD) to form an image, as well as for light leaving the DMD (or other type of modulator) that has been modulated to form the desired image.
  • This functionality is typically performed by prisms such as TIR prisms, RTIR prisms, and other prisms known in the art (collectively "prisms").
  • Prisms are highly expensive, and the system can be implemented without such prisms while still providing viewers with high quality images.
  • the plate of the system can be implemented in a wide variety of different ways using a wide variety of different materials and configurations. Different embodiments of the system can provide specific advantages and functions over mere replacement of the applicable prisms.
  • Figure 1a is a block diagram illustrating an example of a prior art image display that uses prisms to direct light to and from a DMD.
  • Figure 1b is a block diagram illustrating an example of a system that utilizes a plate in lieu of a configuration of prisms.
  • Figure 1c is a block diagram illustrating an example of a system that utilizes a plate in lieu of a configuration of prisms. Figure 1c also illustrates some of the instances where light 800 is lost in the process.
  • Figure 1d is a flow chart diagram illustrating an example of a method for displaying an image that utilizes a plate.
  • Figure 1e is a diagram illustrating an example of the different light pathways resulting when light travels from an illumination assembly to the plate. About 50% of the light is reflected towards the DMD and about 50% of the light is lost by passing through the plate.
  • Figure 1f is a diagram illustrating an example of the different light pathways resulting when light travels from the DMD towards the plate. About 50% of the light is transmitted through the plate and about 50% of the light is lost by reflection back from the plate.
  • Figure 1g is a block diagram illustrating an example of a system actively using a plate to display an image.
  • Figure 1h is a block diagram illustrating an example of a system in a compressed operating mode to reduce the space taken up by the plate.
  • Figure 1l is a block diagram illustrating an example of the position of a plate with respect to two lenses while the system is displaying an image.
  • Figure 1m is a block diagram illustrating an example of the position of a plate with respect to two lenses while the system is in a compressed operating mode.
  • Figure 1n is a block diagram illustrating an example of how a plate can function as a traffic cop in directing the flow of light to various assemblies and components of the system.
  • Figure 2a is a block diagram illustrating an example of different assemblies, components, and light that can be present in the operation of the system.
  • Figure 2b is a block diagram similar to Figure 2a, except that the disclosed system also includes a tracking assembly (which can also be referred to as a sensor assembly) and an augmentation assembly.
  • Figure 2c is a hierarchy diagram illustrating an example of different components that can be included in an illumination assembly.
  • Figure 2d is a hierarchy diagram illustrating an example of different components that can be included in an imaging assembly.
  • Figure 2e is a hierarchy diagram illustrating an example of different components that can be included in a projection assembly.
  • Figure 2f is a hierarchy diagram illustrating an example of different components that can be included in the sensor assembly (which can also be referred to as a tracking assembly).
  • Figure 2g is a hierarchy diagram illustrating examples of different types of supporting components that can be included in the structure and function of the system.
  • Figure 3a is a diagram of a perspective view of a VRD apparatus embodiment of the system.
  • Figure 3b is an environmental diagram illustrating an example of a side view of a user wearing a VRD apparatus embodying the system.
  • Figure 3c is a configuration diagram illustrating an example of the components that can be used in a VRD apparatus.
  • Figure 4a is a hierarchy diagram illustrating an example of the different categories of display systems in which the innovative system can potentially be implemented, ranging from giant systems such as stadium scoreboards to VRD visor systems that project visual images directly on the retina of an individual user.
  • Figure 4b is a hierarchy diagram illustrating an example of different categories of display apparatuses.
  • Figure 4c is a perspective view diagram illustrating an example of a user wearing a VRD visor apparatus.
  • Figure 4d is a hierarchy diagram illustrating an example of different display/projection technologies that can be incorporated into the system, such as DLP-based applications.
  • Figure 4e is a hierarchy diagram illustrating an example of different operating modes of the system pertaining to immersion and augmentation.
  • Figure 4f is a hierarchy diagram illustrating an example of different operating modes of the system pertaining to the use of sensors to detect attributes of the user and/or the user's use of the system.
  • Figure 4g is a hierarchy diagram illustrating an example of different categories of system implementation based on whether or not the device(s) are integrated with media player components.
  • Figure 4h is a hierarchy diagram illustrating an example of two roles or types of users: a viewer of an image and an operator of the system.
  • Figure 4i is a hierarchy diagram illustrating an example of different attributes that can be associated with media content.
  • Figure 4j is a hierarchy diagram illustrating examples of different contexts of images.
  • The invention is an apparatus, system, and method (collectively the "system") that can display an image to a viewer. More specifically, the system can utilize a plate that is partially transmissive and partially reflective in lieu of expensive prisms such as TIR or RTIR prisms to direct light to and from a DMD. All element numbers referenced in the text below are listed in Table 1 provided further below.
  • Any image display system or device can be divided into at least three primary components: (1) an illumination assembly that provides the light from which an image is formed; (2) an imaging assembly that modulates that light into what will become the displayed image; and (3) a projection assembly that projects the modulated light to an intended destination where it can be accessed by one or more viewers. A minimal conceptual sketch of this three-stage pipeline is given below.
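  • The following sketch (an editorial illustration, not part of the patent) models the three primary assemblies as stages in a simple light-processing pipeline; the element numbers in the comments come from Table 1, and the numeric values are arbitrary.

```python
# Conceptual sketch of the three primary assemblies as pipeline stages.
# Element numbers in comments refer to Table 1; values are illustrative only.

def illumination_assembly() -> float:
    """Illumination assembly (200): generate unmodulated light (arbitrary flux units)."""
    return 100.0

def imaging_assembly(light: float, frame: list[list[float]]) -> list[list[float]]:
    """Imaging assembly (300): modulate the supplied light into an interim image (850)."""
    return [[light * level for level in row] for row in frame]

def projection_assembly(interim_image: list[list[float]]) -> list[list[float]]:
    """Projection assembly (400): focus/direct the interim image to the display (410)."""
    return interim_image  # focusing and magnification are omitted in this sketch

frame = [[0.0, 0.5], [1.0, 0.25]]  # per-pixel modulation levels for a tiny 2x2 "image"
displayed = projection_assembly(imaging_assembly(illumination_assembly(), frame))
print(displayed)
```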
  • the third step of projecting the modulated light typically involves focusing the light and other processes which modify the light in certain respects.
  • the image generated by the imaging assembly is actually only an interim image, since the light comprising the image will be modified in certain ways between the time it leaves the imaging assembly and the time it reaches the eyes of a viewer.
  • the heart of any image display device is the imaging assembly. That is where a modulator transforms light generated by a light source into something a viewer will want to see.
  • modulators include DMDs, LCOS panels, and LCD panels.
  • a DMD is a reflection-based light modulator. DMD stands for digital micromirror device.
  • Figure 1a is a block diagram illustrating an example of a prior art approach to the display of an image.
  • the illumination assembly 200 generates light 800. That light encounters a configuration of two prisms 310 which collectively direct the unmodulated light 800 from the illumination assembly 200 towards the DMD 324 and the modulated light 800 from the DMD towards the projection assembly 400 so that the image 880 can be accessed by one or more viewers 96.
  • Figure 1a shows the pathway of light 800 that makes it into the image 880, not the light that is lost during the process.
  • Figure 1b is a block diagram illustrating an alternative to the prior art approach of Figure 1a.
  • a plate 340 with both reflective 372 and transmissive 374 properties is used to direct unmodulated light 800 to the DMD 324.
  • the optical chain 870 (which can also be referred to as an optical pathway 870) of light 800 that actually reaches the image 880 is illustrated in unbroken lines.
  • Figure 1c is a somewhat less simplified version of Figure 1b in that some of the lost light 800 is illustrated in the figure.
  • the dotted horizontal line pointing to the right represents light 800 that was transmitted through the plate 340 rather than being deflected by it. That light 800 is lost to the process of forming an image.
  • the dotted line from the plate 340 directed downwards at an angle towards the DMD 324 represents modulated light 800 from the DMD 324 that was reflected back rather than transmitted through the plate 340.
  • Figure 1d is a flow chart of a method 900 for displaying an image 880 that utilizes a plate 340.
  • the system 100 generates light 800 utilizing an illumination assembly 200. That light 800 reaches the plate 340. Some of the light 800 from 910 is lost through the transmissive 374 aspects of the plate 340, while other rays of light 800 from 910 are reflected at 922 towards the modulator 320.
  • the modulator 320 modulates the light 800, forming an interim image 850 that is directed back to the plate 340. Some of that light 800 is lost through the reflective 372 characteristics of the plate 340, while other rays of light 800 are transmitted at 926 for inclusion in the image 880 that is displayed to viewers 96. A small numerical sketch of this light budget is given below.
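  • The following sketch (an editorial illustration, not part of the patent) puts rough numbers on the light budget implied by the method 900; the step numerals 910, 922, and 926 are the patent's, while the 50/50 split is only an example consistent with Figures 1e and 1f.

```python
# Two passes at the plate 340: illumination is reflected toward the modulator,
# and modulated light is transmitted toward the display. The fractions are examples.

R = 0.5   # fraction of illumination the plate reflects toward the modulator (step 922)
T = 0.5   # fraction of modulated light the plate transmits toward the display (step 926)

source_light = 1.0                # light generated by the illumination assembly (step 910)
to_modulator = source_light * R   # reflected toward the DMD; the remainder passes through and is lost
to_viewer = to_modulator * T      # transmitted for inclusion in the displayed image 880
print(f"Fraction of source light reaching the image: {to_viewer:.2f}")  # -> 0.25
```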
  • the plate 340 can be comprised of glass 342, plastic film 344, or combinations of both glass 342 and plastic 344. Some embodiments of the plate 340 can involve multiple layers 346 as well as various coatings 348.
  • the plate 340 can be implemented as a dynamic plate 341. Plastic film 344 embodiments of the plate 340 can be implemented as modulated film 345 embodiments.
  • the plate 340 can be implemented with an aperture 350 and even dynamic apertures 352 that are changed on an image to image basis.
  • Plates 340 can involve a variety of different gradients 360, including adjustable gradients 362 such as adjustable diffractive gradients 364. Different plates 340 can have different magnitudes of reflectiveness 372 and transmissiveness 374. Some plates 340 can impact the polarization 373 of light 800 that reaches the plate 340. Adjustable gradients 362 can be used to implement desirable optical effects 380.
  • the plate 340 can include holographic elements 382, and can be embodied as a micro lens array 384.
  • the plate 340 can also be embodied as a collapsible plate 340 so that the plate 340 takes up less room when the system 100 is not displaying images 880.
  • the plate 340 can involve different magnitudes of reflectiveness 372, transmissiveness 374, and polarization 373, but such characteristics can also vary with respect to where the light 800 falls on the spectrum 802 of light wavelengths. Some embodiments can involve uniform attributes across a full spectrum 803 of light 800. Other embodiments may differentiate between infrared 806, ultraviolet 807, and visible light 804, or even within a partial spectrum of visible light 804.
  • Figures 1e and 1f illustrate examples of a plate 340 that is approximately 50% reflective 372 and 50% transmissive 374. Many embodiments will involve ranges between about 60/40 and 40/60. However, the system 100 can be implemented far outside those ranges; a worked example of the resulting light budget is given below.
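  • As a worked example (the arithmetic below is an editorial illustration that follows from the two passes at the plate 340 shown in Figures 1e and 1f, not language from the patent): if the plate reflects a fraction R of the illumination toward the modulator and transmits a fraction T of the modulated light, with T approximately equal to 1 - R when absorption is neglected, the overall plate throughput is

```latex
\[
  \eta = R \cdot T \approx R\,(1 - R),
  \qquad
  \eta_{\max} = 0.5 \times 0.5 = 0.25 \quad \text{at } R = T = 0.5 .
\]
```

so a 50/50 plate passes at most about 25% of the source light into the image 880, and splits far from 50/50 (for example 90/10, giving roughly 9%) sacrifice additional brightness.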
  • Figures 1g and 1l illustrate examples of the system 100 using a plate 340 to display an image 880.
  • Figures 1h and 1m illustrate corresponding examples of such a plate 340 in compressed mode 128, where the plate 340 is collapsed to save space while the system 100 is not being used to display images 880.
  • Figure 1n is an example of the different assemblies and components that can utilize the plate 340 to perform the function of a "traffic cop" with respect to the flow of light 800.
  • FIG. 1 is a block diagram of a system 100 comprised of an illumination assembly 200 that supplies light 800 to an imaging assembly 300.
  • a modulator 320 of the imaging assembly 300 uses the light 800 from the illumination assembly 200 to create the image 880 that is displayed by the system 100.
  • the diagram is from the point of view of a pathway of light 800 that forms the image 880, so the plate 340 appears twice within the imaging assembly 300 because light 800 touches the plate 340 before reaching the modulator 320 and after leaving the modulator 320.
  • the system 100 can also include a projection assembly 400 that directs the image 880 from the imaging assembly 300 to a location where it can be accessed by one or more users 90, a display 410.
  • the image 880 generated by the imaging assembly 300 will often be modified in certain ways before it is displayed by the system 100 to users 90, and thus the image generated by the imaging assembly 300 can also be referred to as an interim image 850 or a work-in-process image 850.
  • An illumination assembly 200 performs the function of supplying light 800 to the system 100 so that an image 880 can be displayed.
  • the illumination assembly 200 can include a light source 210 for generating light 800.
  • the illumination assembly 200 generates the light 800 that is used and processed by other assemblies of the system 100.
  • Figure 2c is a hierarchy diagram illustrating an example of different components that can be included in the illumination assembly 200.
  • Those components can include but are not limited to a wide range of light sources 210, a diffuser assembly 280, and a variety of supporting components 150.
  • Examples of light sources 210 can include but are not limited to a multi-bulb light source 211, an LED lamp 212, a 3 LED lamp 213, a laser 214, an OLED 215, a CFL 216, an incandescent lamp 218, and a non-angular dependent lamp 219.
  • the light source 210 is where light 800 is generated before it moves throughout the rest of the system 100.
  • each light source 210 is a location 230 for the origination of light 800.
  • An imaging assembly 300 performs the function of creating the image 880 from the light 800 supplied by the illumination assembly 200.
  • a modulator 320 can transform the light 800 supplied by the illumination assembly 200 into the image 880 that is displayed by the system 100.
  • the image 880 generated by the imaging assembly 300 can sometimes be referred to as an interim image 850 because the image 850 may be focused or otherwise modified to some degree before it is directed to the location where it can be experienced by one or more users 90.
  • Imaging assemblies 300 can vary significantly based on the type of technology used to create the image. Display technologies such as DLP (digital light processing), LCD (liquid-crystal display), LCOS (liquid crystal on silicon), and other methodologies can involve substantially different components in the imaging assembly 300.
  • Figure 2d is a hierarchy diagram illustrating an example of some of the different components that can be utilized in the imaging assembly 300 for the system 100.
  • a prism 310 can be a very useful component in directing light to and/or from the modulator 320.
  • DLP applications will typically use an array of TIR prisms 311 or RTIR prisms 312 to direct light to and from a DMD 324.
  • the plate 340 can eliminate the need for the prisms 310 used in the system 100.
  • a modulator 320 (sometimes referred to as a light modulator 320) is the device that modifies or alters the light 800, creating the image 880 that is to be displayed. Modulators 320 can operate using a variety of different attributes of the modulator 320.
  • a reflection-based modulator 322 uses the reflective-attributes of the modulator 320 to fashion an image 880 from the supplied light 800. Examples of reflection-based modulators 322 include but are not limited to the DMD 324 of a DLP display and some LCOS (liquid crystal on silicon) panels 340.
  • a transmissive-based modulator 321 uses the transmissive-attributes of the modulator 320 to fashion an image 880 from the supplied light 800.
  • transmissive-based modulators 321 include but are not limited to the LCD (liquid crystal display) 330 of an LCD display and some LCOS panels 340.
  • the imaging assembly 300 for an LCOS or LCD system 100 will typically have a combiner cube or some similar device for integrating the different one-color images into a single image 880.
  • the imaging assembly 300 can also include a wide variety of supporting components 150.
  • a projection assembly 400 can perform the task of directing the image 880 to its final destination in the system 100 where it can be accessed by users 90.
  • the image 880 created by the imaging assembly 300 will be modified in at least some minor ways between the creation of the image 880 by the modulator 320 and the display of the image 880 to the user 90.
  • the image 880 generated by the modulator 320 of the imaging assembly 300 may only be an interim image 850, not the final version of the image 880 that is actually displayed to the user 90.
  • FIG. 2e is a hierarchy diagram illustrating an example of different components that can be part of the projection assembly 400.
  • a display 410 is the final destination of the image 880, i.e. the location and form of the image 880 where it can be accessed by users 90.
  • Examples of displays 410 can include an active screen 412, a passive screen 414, an eyepiece 416, and a VRD eyepiece 418.
  • the projection assembly 400 can also include a variety of supporting components 150 as discussed below.
  • a plate 340 can also serve as a component within the projection assembly 400 because the plate 340 is an excellent tool for managing the flow of light 800 between different system 100 components, as illustrated in Figure 2b.
  • FIG. 2b illustrates an example of the system 100 that includes a tracking assembly 500 (which is also referred to as a sensor assembly 500).
  • the sensor assembly 500 can be used to capture information about the user 90, the user's interaction with the image 880, and/or the exterior environment in which the user 90 and system 100 are physically present.
  • the sensor assembly 500 can include a sensor 510, typically a camera such as an infrared camera for capturing an eye-tracking attribute 530 pertaining to eye movements of the viewer 96; a lamp 520, such as an infrared light source, to support the functionality of the infrared camera; and a variety of different supporting components 150.
  • the tracking assembly 500 will utilize components of the projection assembly 400 such as the configuration of a curved mirror 420 operating in tandem with a partially transparent plate 340. Such a configuration can be used to capture infrared images of the eye 92 of the viewer 96 while simultaneously delivering images 880 to the eye 92 of the viewer 96.
  • the sensor assembly 500 can also include sensors 510 intended to capture visual images, video, sounds, motion, position, and other information from the operating environment 80.
  • An augmentation assembly 600 can allow natural light from the exterior environment 80 in through a window component 620 in the system 100 (the window component 620 can include a shutter component 610 that is capable of being opened or closed).
  • Light 800 can be a challenging resource to manage. Light 800 moves quickly and cannot be constrained in the same way that most inputs or raw materials can be.
  • Figure 2g is a hierarchy diagram illustrating an example of some supporting components 150, many of which are conventional optical components. Any display technology application will involve conventional optical components such as mirrors 151 (including dichroic mirrors 152), lenses 160, collimators 170, and plates 180. Similarly, any powered device requires a power source 191, and a device capable of displaying an image 880 is likely to have a processor 190.
  • the system 100 can be implemented with respect to a wide variety of different display technologies 140, including DLP systems 141, LCD systems 142, and LCOS systems 143.
  • the various drawings focus on DLP systems 141 because it is believed that the plate 340 is particularly useful as a substitute for TIR prisms 311 and RTIR prisms 312.
  • Figure 3a is a perspective diagram illustrating an example of a VRD visor apparatus 116.
  • Two VRD eyepieces 418 provide for directly projecting the image 880 onto the eyes of the user 90.
  • Figure 3b is a side view diagram illustrating an example of a VRD visor apparatus 116 being worn on the head 94 of a user 90.
  • the eyes 92 of the user 90 are blocked by the apparatus 116 itself, with the apparatus 116 in a position to project the image 880 on the eyes 92 of the user 90.
  • Figure 3c is a component diagram illustrating an example of a VRD visor apparatus 116 for the left eye 92. A mirror image of Figure 3c would pertain to the right eye 92.
  • a 3 LED light source 213 generates the light 800, which passes through a condensing lens 160 that directs the light 800 to a mirror 151, which reflects the light 800 to a shaping lens 160 prior to the entry of the light 800 into an imaging assembly 300 comprised of a plate 340 and a DMD 324.
  • the interim image 850 from the imaging assembly 300 passes through another lens 160 that focuses the interim image 850 into a final image 880 that is viewable to the user 90 through the eyepiece 416.
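  • Restated as data (element numbers from Table 1, in the order described above; this listing is an editorial aid rather than patent text), the left-eye optical chain is:

```python
# Left-eye optical chain of the VRD visor apparatus, in the order light traverses it.
optical_chain = [
    ("3 LED light source", 213, "generates the light"),
    ("condensing lens", 160, "directs the light toward the mirror"),
    ("mirror", 151, "folds the light toward the shaping lens"),
    ("shaping lens", 160, "shapes the beam before the imaging assembly"),
    ("plate", 340, "reflects the illumination toward the DMD"),
    ("DMD", 324, "modulates the light into the interim image (850)"),
    ("plate", 340, "transmits the modulated light toward the projection optics"),
    ("focusing lens", 160, "focuses the interim image into the final image (880)"),
    ("eyepiece", 416, "delivers the image to the eye (92) of the user"),
]
for name, ref, role in optical_chain:
    print(f"{name} ({ref}): {role}")
```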
  • the system 100 represents a substantial improvement over prior art display technologies. Just as there are a wide range of prior art display technologies, the system 100 can be similarly implemented in a wide range of different ways.
  • the innovation of using a plate 340 in lieu of prisms 310 to direct light 800 can be implemented at a variety of different scales, utilizing a variety of different display technologies, in both immersive and augmenting contexts, and in both one-way (no sensor feedback from the user 90) and two-way (sensor feedback from the user 90) embodiments.
  • Display devices can be implemented in a wide variety of different scales.
  • the monster scoreboard at EverBank Field (home of the Jacksonville Jaguars) is a display system that is 60 feet high, 362 feet long, and comprised of 35.5 million LED bulbs. The scoreboard is intended to be viewed simultaneously by tens of thousands of people.
  • the GLYPH™ visor by Avegant Corporation is a device that is worn on the head of a user and projects visual images directly into the eyes of a single viewer. Between those edges of the continuum are a wide variety of different display systems.
  • the system 100 displays visual images 880 to users 90 with enhanced light having reduced coherence.
  • the system 100 can be potentially implemented in a wide variety of different scales.
  • Figure 4a is a hierarchy diagram illustrating various categories and subcategories pertaining to the scale of implementation for display systems generally, and the system 100 specifically. As illustrated in Figure 4a, the system 100 can be implemented as a large system 101 or a personal system 103.
  • a large system 101 is intended for use by more than one simultaneous user 90.
  • Examples of large systems 101 include movie theater projectors, large screen TVs in a bar, restaurant, or household, and other similar displays.
  • Large systems 101 include a subcategory of giant systems 102, such as stadium scoreboards 102a, the Times Square displays 102b, or other large outdoor displays such as billboards off the expressway.
  • a personal system 103 is an embodiment of the system 100 that is designed for viewing by a single user 90.
  • Examples of personal systems 103 include desktop monitors 103a, portable TVs 103b, laptop monitors 103c, and other similar devices.
  • the category of personal systems 103 also includes the subcategory of near-eye systems 104.
  • a near-eye system 104 is a subcategory of personal systems 103 where the eyes of the user 90 are within about 12 inches of the display.
  • Near-eye systems 104 include tablet computers 104a, smart phones 104b, and eye-piece applications 104c such as cameras, microscopes, and other similar devices.
  • the subcategory of near-eye systems 104 includes a subcategory of visor systems 105.
  • a visor system 105 is a subcategory of near-eye systems 104 where the portion of the system 100 that displays the visual image 200 is actually worn on the head 94 of the user 90. Examples of such systems 105 include virtual reality visors, Google Glass, and other conventional head-mounted displays 105a.
  • the category of visor systems 105 includes the subcategory of VRD visor systems 106.
  • a VRD visor system 106 is an implementation of a visor system 105 where visual images 200 are projected directly on the eyes of the user.
  • the technology of projecting images directly on the eyes of the viewer is disclosed in a published patent application titled "IMAGE GENERATION SYSTEMS AND IMAGE GENERATING METHODS" (U.S. Serial Number 13/367,261) that was filed on February 6, 2012, the contents of which are hereby incorporated by reference.
  • FIG. 4b is a hierarchy diagram illustrating an example of different categories and subcategories of apparatuses 110.
  • Figure 4b closely mirrors Figure 4a.
  • the universe of potential apparatuses 110 includes the categories of large apparatuses 111 and personal apparatuses 113.
  • Large apparatuses 111 include the subcategory of giant apparatuses 112.
  • the category of personal apparatuses 113 includes the subcategory of near-eye apparatuses 114, which includes the subcategory of visor apparatuses 115.
  • VRD visor apparatuses 116 comprise a category of visor apparatuses 115 that implement virtual retinal displays, i.e. they project visual images 200 directly into the eyes of the user 90.
  • FIG. 4c is a diagram illustrating an example of a perspective view of a VRD visor system 106 embodied in the form of an integrated VRD visor apparatus 116 that is worn on the head 94 of the user 90. Dotted lines are used with respect to element 92 because the eyes 92 of the user 90 are blocked by the apparatus 116 itself in the illustration.
  • the prior art includes a variety of different display technologies, including but not limited to DLP (digital light processing), LCD (liquid crystal displays), and LCOS (liquid crystal on silicon).
  • Figure 4d is a hierarchy diagram illustrating different categories of the system 100 based on the underlying display technology in which the system 100 can be implemented.
  • the system 100 is intended for use as a DLP system 141, but could potentially be used as an LCOS system 143 or even an LCD system 142, although the means of implementation would obviously differ and the reasons for implementation may not exist.
  • the system 100 can also be implemented in other categories and subcategories of display technologies.
  • Figure 4e is a hierarchy diagram illustrating a hierarchy of systems 100 organized into categories based on the distinction between immersion and augmentation.
  • Some embodiments of the system 100 can have a variety of different operating modes 120.
  • An immersion mode 121 has the function of blocking out the outside world so that the user 90 is focused exclusively on what the system 100 displays to the user 90.
  • an augmentation mode 122 is intended to display visual images 200 that are superimposed over the physical environment of the user 90.
  • the distinction between immersion and augmentation modes of the system 100 is particularly relevant in the context of near-eye systems 104 and visor systems 105.
  • Some embodiments of the system 100 can be configured to operate either in immersion mode or augmentation mode, at the discretion of the user 90, while other embodiments of the system 100 may possess only a single operating mode 120.
  • Some embodiments of the system 100 will be configured only for a one-way transmission of optical information. Other embodiments can provide for capturing information from the user 90 as visual images 880 and potentially other aspects of a media experience are made accessible to the user 90.
  • Figure 4f is a hierarchy diagram that reflects the categories of a one-way system 124 (a non-sensing operating mode 124) and a two-way system 123 (a sensing operating mode 123).
  • a two-way system 123 can include functionality such as retina scanning and monitoring. Users 90 can be identified, the focal point of the eyes 92 of the user 90 can potentially be tracked, and other similar functionality can be provided.
  • in a one-way system 124, there is no sensor or array of sensors capturing information about or from the user 90.
  • Display devices are sometimes integrated with a media player.
  • a media player is totally separate from the display device.
  • a laptop computer can include, in a single integrated device, a screen for displaying a movie, speakers for projecting the sound that accompanies the video images, and a DVD or BLU-RAY player for playing the source media off a disk.
  • Such a device is also capable of streaming media content.
  • Figure 4g is a hierarchy diagram illustrating a variety of different categories of systems 100 based on whether the system 100 is integrated with a media player or not.
  • An integrated media player system 107 includes the capability of actually playing media content as well as displaying the image 880.
  • a non-integrated media player system 108 must communicate with a media player in order to play media content.
  • Figure 4h is a hierarchy diagram illustrating an example of different roles that a user 90 can have.
  • a viewer 96 can access the image 880 but is not otherwise able to control the functionality of the system 100.
  • An operator 98 can control the operations of the system 100, but cannot access the image 880.
  • in the example of a movie theater, the viewers 96 are the patrons and the operator 98 is an employee of the theater.
  • media content 840 can include a wide variety of different types of attributes.
  • a system 100 for displaying an image 880 is a system 100 that plays media content 840 with a visual attribute 841.
  • many instances of media content 840 will also include an acoustic attribute 842 or even a tactile attribute.
  • images 880 are parts of a larger video 890 context.
  • an image 880 can be a stand-alone still frame 882.
  • Table 1 sets forth a chart that correlates element numbers, element names, and element definitions/descriptions.
  • the system 100 can be implemented in outdoor environments 80 as well as indoor environments 80.
  • Examples of operating environments 80 can include but are not limited to the inside of a vehicle, such as a car, boat, or plane; large public places, such as an airport, park, shopping mall, auditorium, sports stadium, grocery store, or church; domestic environments such as a house, apartment, or hotel room; and work environments such as an office or factory.
  • a user 90 is a viewer 96 and/or operator 98 of the system 100.
  • the user 90 is typically a human being.
  • users 90 can be different organisms such as dogs or cats, or even automated technologies such as expert systems, artificial intelligence applications, and other similar "entities".
  • the eye consists of different portions including but not limited to the sclera, iris, cornea, pupil, and retina.
  • Some embodiments of the system 100 involve a VRD visor apparatus 116 that can project the desired image 880 directly onto the eye 92 of the user 90.
  • Head The portion of the body of the user 90 that includes the eye 92.
  • Some embodiments of the system 100 can involve a visor apparatus 115 that is worn on the head 94 of the user 90.
  • the operator 98 does not necessarily view the images 880 displayed by the system 100 because the operator 98 may be someone operating the system 100 for the benefit of others who are viewers 96.
  • the operator 98 of the system 100 may be someone such as a projectionist at a movie theater or the individual controlling the system 100.
  • System A collective configuration of assemblies, subassemblies, components, processes, and/or data that provide a user 90 with the functionality of engaging in a media experience by accessing a media content unit 840.
  • Some embodiments of the system 100 can involve a single integrated apparatus 110 hosting all components of the system 100, while other embodiments of the system 100 can involve different non-integrated device configurations.
  • Some embodiments of the system 100 can be large systems 102 or even giant systems 101, while other embodiments of the system 100 can be personal systems 103, such as near-eye systems 104, visor systems 105, and VRD visor systems 106.
  • Systems 100 can also be referred to as display systems 100. The system 100 is believed to be particularly useful in the context of personal systems 103.
  • Giant System An embodiment of the system 100 intended to be viewed simultaneously by a thousand or more people.
  • Examples of giant systems 101 include scoreboards at large stadiums, electronic billboards such as the displays in Times Square in New York City, and other similar displays.
  • a giant system 101 is a subcategory of large systems 102.
  • a large system 102 is not a personal system 103.
  • the media experience provided by a large system 102 is intended to be shared by a roomful of viewers 96 using the same illumination assembly 200, imaging assembly 300, and projection assembly 400.
  • Examples of large systems 102 include but are not limited to a projector/screen configuration in a movie theater, classroom, or conference room; television sets in sports bar, airport, or residence; and scoreboard displays at a stadium.
  • Large systems 102 can also be referred to as large display systems 102.
  • Personal System A category of embodiments of the system 100 where the media experience is personal to an individual viewer 96.
  • personal media systems include desktop computers (often referred to as personal computers), laptop computers, portable televisions, and near-eye systems 104.
  • personal systems 103 can also be referred to as personal media systems 103.
  • Near- eye systems 104 are a subcategory of personal systems 103.
  • Near-Eye System A category of personal systems 103 where the media experience is communicated to the viewer 96 at a distance that is less than or equal to about 12 inches (30.48 cm) away.
  • Examples of near-eye systems 104 include but are not limited to tablet computers, smart phones, systems 100 involving eyepieces (such as cameras, telescopes, and microscopes), and visor media systems 105.
  • Near-eye systems 104 can also be referred to as near-eye media systems 104.
  • Visor systems 105 can also be referred to as visor display systems 105.
  • VRD Visor System VRD stands for virtual retinal display. VRDs can also be referred to as retinal scan displays ("RSD") and as retinal projectors ("RP"). A VRD projects the image 880 directly onto the retina of the eye 92 of the viewer 96.
  • a VRD Visor System 106 is a visor system 105 that utilizes a VRD to display the image 880 on the eyes 92 of the user 90.
  • a VRD visor system 106 can also be referred to as a VRD visor display system 106.
  • Apparatus A device that provides a user 90 with the ability to engage in a media experience 840, i.e. interact with a media content unit 840.
  • the apparatus 110 can be partially or even fully integrated with a media player 848. Many embodiments of the apparatus 110 will have a capability to communicate both acoustic attributes 842 and visual attributes 841 of the media experience 840 to the user 90.
  • the apparatus 110 can include the illumination assembly 200, the imaging assembly 300, and the projection assembly 400.
  • in some embodiments, the apparatus 110 includes the media player 848 that plays the media content 840. In other embodiments, the apparatus 110 does not include the media player 848 that plays the media content 840.
  • Different configurations and connection technologies can provide varying degrees of "plug and play" connectivity that can be easily installed and removed by users 90.
  • Giant Apparatus An apparatus 110 implementing an embodiment of a giant system 101.
  • Examples of giant apparatuses 112 include the scoreboards at a professional sports stadium or arena.
  • Large Apparatus An apparatus 110 implementing an embodiment of a large system 102.
  • large apparatuses 111 include movie theater projectors and large screen television sets.
  • a large apparatus 111 is typically positioned on a floor or some other support structure.
  • a large apparatus 111 such as a flat screen TV can also be mounted on a wall.
  • Personal Media Apparatus An apparatus 110 implementing an embodiment of a personal system 103.
  • Many personal apparatuses 113 are highly portable and are supported by the user 90.
  • Other embodiments of personal media apparatuses 113 are positioned on a desk, table, or similar surface.
  • Common examples of personal apparatuses 113 include desktop computers, laptop computers, and portable televisions.
  • Near-Eye Apparatus An apparatus 110 implementing an embodiment of a near-eye system 104.
  • Many near-eye apparatuses 114 are either worn on the head (these are visor apparatuses 115) or are held in the hand of the user 90.
  • Examples of near-eye apparatuses 114 include smart phones, tablet computers, camera eye-pieces and displays, microscope eye-pieces and displays, gun scopes, and other similar devices.
  • Visor Apparatus An apparatus 110 implementing an embodiment of a visor system 105.
  • the visor apparatus 115 is worn on the head 94 of the user 90.
  • the visor apparatus 115 can also be referred to simply as a visor 115.
  • VRD Visor Apparatus An apparatus 110 in a VRD visor system 106. Unlike a visor apparatus 115, the VRD visor apparatus 116 includes a virtual retinal display that projects the visual image 200 directly on the eyes 92 of the user 90.
  • a VRD visor apparatus 116 is disclosed in U.S. Patent Number 8,982,014, the contents of which are incorporated by reference in their entirety.
  • Some embodiments of the system 100 can be implemented in such a way as to support distinct manners of operation.
  • the user 90 can explicitly or implicitly select which operating mode 120 controls.
  • the system 100 can determine the applicable operating mode 120 in accordance with the processing rules of the system 100.
  • the system 100 is implemented in such a manner that supports only one operating mode 120 with respect to a potential feature. For example, some systems 100 can provide users 90 with a choice between an immersion mode 121 and an augmentation mode 122, while other embodiments of the system 100 may only support one mode 120 or the other.
  • Immersion An operating mode 120 of the system 100 in which the outside world is at least substantially blocked off visually from the user 90, such that the images 880 displayed to the user 90 are not superimposed over the actual physical environment of the user 90.
  • the act of watching a movie is intended to be an immersive experience.
  • Augmentation An operating mode 120 of the system 100 in which the image 880 displayed by the system 100 is added to a view of the physical environment of the user 90, i.e. the image 880 augments the real world.
  • Google Glass is an example of an electronic display that can function in an augmentation mode.
  • Sensing An operating mode 120 of the system 100 in which the system 100 captures information about the user 90 through one or more sensors. Examples of different categories of sensing can include eye tracking pertaining to the user's interaction with the displayed image 880, biometric scanning such as retina scans to determine the identity of the user 90, and other types of sensor readings/measurements.
  • the plate 340 can be transitioned to a "compacted" or "collapsed" state in order to conserve space. This can be particularly desirable in the context of a visor apparatus 115 or a VRD visor apparatus 116.
  • Display Technology The system 100 can be implemented using a wide variety of different display technologies 140.
  • Examples of display technologies 140 include digital light processing (DLP), liquid crystal display (LCD), and liquid crystal on silicon (LCOS). Each of these different technologies can be implemented in a variety of different ways.
  • DLP System An embodiment of the system 100 that utilizes digital light processing (DLP) to compose an image 880 from light 800.
  • LCD System An embodiment of the system 100 that utilizes a liquid crystal display (LCD) to compose an image 880 from light 800.
  • LCOS System An embodiment of the system 100 that utilizes liquid crystal on silicon (LCOS) to compose an image 880 from light 800.
  • Supporting Components A system 100, like any electronic display, is a complex combination of components and processes.
  • Light 800 moves quickly and continuously through the system 100.
  • Various supporting components 150 are used in different embodiments of the system 100. A significant percentage of the components of the system 100 can fall into the category of supporting components 150 and many such components 150 can be collectively referred to as "conventional optics".
  • Supporting components 150 can be necessary in any implementation of the system 100 in that light 800 is an important resource that must be controlled, constrained, directed, and focused to be properly harnessed in the process of transforming light 800 into an image 880 that is displayed to the user 90.
  • the text and drawings of a patent are not intended to serve as product blueprints.
  • One of ordinary skill in the art can devise multiple variations of supplementary components 150 that can be used in conjunction with the innovative elements listed in the claims, illustrated in the drawings, and described in the text.
  • Mirrors 151 can be comprised of a wide variety of different materials, and configured in a wide variety of shapes and sizes.
  • Lens An object that possesses at least a non-trivial magnitude of transmissivity. Depending on the context, a particular lens could be virtually 100% transmissive while in other cases merely about 50% transmissive. A lens 160 is often used to focus and/or direct light 800.
  • Processor A central processing unit (CPU) that is capable of carrying out the instructions of a computer program.
  • the system 100 can use one or more processors 190 to communicate with and control the various components of the system 100.
  • Examples of power sources 191 include various batteries as well as power adaptors that provide power to the system 100 through a cable.
  • Different embodiments of the system 100 can utilize a wide variety of different internal and external power sources 191. Some embodiments can include multiple power sources 191.
  • Illumination Assembly A collection of components used to supply light 800 to the imaging assembly 300.
  • Common examples of components in the illumination assembly 200 include light sources 210 and diffusers.
  • the illumination assembly 200 can also be referred to as an illumination subsystem 200.
  • Light Source A component that generates light 800. There are a wide variety of different light sources 210 that can be utilized by the system 100.
  • Multi-Prong Light Source A light source 210 that includes more than one illumination element.
  • a 3-colored LED lamp 213 is a common example of a multi-prong light source 212.
  • LED Lamp A light source 210 comprised of a light emitting diode (LED).
  • Transmissive-Based Light Modulator A modulator 320 that fashions an image 880 from light 800 utilizing a transmissive property of the modulator 320.
  • LCDs are a common example of a transmissive-based light modulator 321.
  • Reflection-Based Light Modulator A modulator 320 that fashions an image 880 from light 800 utilizing a reflective property of the modulator 320.
  • Common examples of reflection-based light modulators 322 include DMDs 324 and LCOSs 340.
  • DMD A reflection-based light modulator 322 commonly referred to as a digital micro mirror device.
  • a DMD 324 is typically comprised of several thousand microscopic mirrors arranged in an array on a processor 190, with the individual microscopic mirrors corresponding to the individual pixels in the image 880.
  • LCD Panel or LCD A light modulator 320 in an LCD (liquid crystal display).
  • a liquid crystal display that uses the light modulating properties of liquid crystals.
  • Each pixel of an LCD typically consists of a layer of molecules aligned between two transparent electrodes, and two polarizing filters (parallel and perpendicular), the axes of transmission of which are (in most of the cases) perpendicular to each other. Without the liquid crystal between the polarizing filters, light passing through the first filter would be blocked by the second (crossed) polarizer.
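  • The crossed-polarizer behavior described above can be summarized with Malus's law (a standard optics relation, not language from this patent): an analyzer oriented at an angle theta to the incoming polarization transmits an intensity

```latex
\[
  I = I_0 \cos^2\theta ,
\]
```

so with the two filters crossed (theta = 90 degrees) essentially no light passes unless the liquid crystal layer in between rotates the polarization.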
  • Some LCDs are transmissive while other LCDs are transflective.
  • LCOS Panel or LCOS A light modulator 320 in an LCOS (liquid crystal on silicon) display. Similar to a DMD 324, except that the LCOS 326 uses a liquid crystal layer on top of a silicon backplane instead of individual mirrors.
  • An LCOS 326 can be transmissive or reflective.
  • Dichroic Combiner Cube A device used in an LCOS or LCD display that combines the different colors of light 800 to formulate an image 880 or interim image 850.
  • the dichroic combiner cube 330 can be an equivalent to a prism 310 in the context of an LCOS system 143 or an LCD system 142.
  • the plate 340 can be implemented using a wide variety of materials such as glass 342 or plastic film 344.
  • the plate 340 can change its characteristics on an image by image or even subframe by subframe basis.
  • Glass A substantially hard and brittle substance, typically transparent or translucent, made by fusing sand with soda, lime, and other ingredients and then rapidly cooling the melt.
  • Many embodiments of the plate 340 include a glass 342 component. Some embodiments of the plate 340 are comprised substantially or even entirely of glass 342.
  • Plastic Film A synthetic material made from a wide range of polymers. Plastic film 344 can also be referred to simply as plastic 344. Many embodiments of the plate 340 can include a plastic 344 component.
  • Modulated Film A plastic film 344 that modulates the light 800 that comes into contact with the film 345.
  • modulated films 345 include electrochromic, photochromic, and other types. Such films can be used to create a dynamic aperture 352 with desirable optical effects 860.
  • Layer The plate 340 can be comprised of one or more layers 346.
  • Coatings can be comprised of glass 342, plastic film 344, and/or other components with desirable reflectiveness 372, polarization 373, and/or transmissiveness 374 attributes.
  • Aperture A hole or opening.
  • the plate 340 can include one or more apertures 350 to facilitate the transmission of light 800 through the aperture 350.
  • Dynamic Aperture An aperture 350 that can provide for being dynamically opened, closed, broadened, narrowed, and/or changed in shape. This can be achieved in a variety of different ways, including means analogous to the shutter on a camera lens.
  • Gradient An increase or decrease in the magnitude of one or more optical properties, such as reflectiveness 372, polarization 373, and/or transmissiveness 374, at different locations on an object such as a plate 340.
  • Adjustable Gradient A gradient 360 that provides for being dynamically modified while the system 100 is generating images 880.
  • Adjustable Diffractive Gradient An adjustable gradient 362 where the function and purpose of the adjustable gradient 362 is to address the diffraction of light 800.
  • Reflectiveness or Reflectivity The extent to which an object such as a plate 340 causes light 800 to reflect back.
  • the plate 340 will possess a level of reflectiveness 372 such that between about 40% and 60% of the light 800 striking the plate 340 is reflected back.
  • a plate 340 possessing a reflectivity of about 50% is desirable in many embodiments of the system 100.
  • the system 100 can be implemented with a plate 340 possessing a wide variety of different magnitudes of reflectiveness 372 ranging from as little as about 0.5% up to about 99.5%.
  • the reflectivity 372 of the plate 340 or other component of the system 100 can differentiate light 800 on the basis of the wavelength of the applicable light 800 (i.e. where the light 800 falls in the spectrum 802).
  • the plate 340 can be less reflective 372 in the infrared spectrum 806 than in the visual spectrum 804 to facilitate eye-tracking functionality performed by the system 100.
  • Polarization or Polarity Polarized light 800 is light 800 traveling in a substantially uniform orientation in which the vibrations in the light waves occur in a single plane.
  • Light 800 can be polarized through transmission 374, through reflection 372, through refraction, or by scattering.
  • the plate 340 can impact the polarity 373 of the light 800 that the plate comes into contact with.
  • Transmissiveness or Transmissivity The extent to which an object such as a plate 340 allows light 800 to pass through the object.
  • the plate 340 will possess a level of transmissivity 374 such that between about 40%-60% of light 800 striking the plate 340 can pass through.
  • a plate 340 possessing a transmissiveness of about 50% is desirable in many embodiments of the system 100.
  • the system 100 can be implemented with a plate 340 possessing a wide variety of different magnitudes of transmissivity 374 ranging from as little as about 0.5% up to about 99.5%.
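The reflectiveness 372 and transmissiveness 374 figures above can be pictured as a simple energy split at the plate 340. The Python sketch below is illustrative only: it assumes an idealized, lossless plate and invented reflectance values (roughly 50/50 in the visible spectrum 804, lower reflectance in the infrared spectrum 806 so that infrared light used for eye tracking can pass through, consistent with the behavior noted above).

    def split_at_plate(incident, wavelength_nm):
        # Hypothetical reflectance values: ~50% in the visible band, lower in
        # the infrared so an infrared eye-tracking camera can see through the
        # plate. Absorption is ignored (idealized, lossless plate).
        reflectance = 0.5 if wavelength_nm <= 750 else 0.1
        reflected = incident * reflectance
        transmitted = incident - reflected
        return reflected, transmitted

    print(split_at_plate(1.0, 550))   # visible light  -> (0.5, 0.5)
    print(split_at_plate(1.0, 850))   # near-infrared  -> (0.1, 0.9)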
  • Optical Effect A modification to the displayed image 880 that is desirable based on the context of the displayed image 880.
  • a desired optical effect 380 may be shading to create the color black in the image 880.
  • Holographic Element The plate 340 can include or be comprised of one or more holographic elements 382.
  • a holographic element 382 is an optical component, such as a lens, filter, beam splitter, or diffraction grating.
  • a holographic element 382 can be produced using holographic imaging processes or principles. Dichromated gelatin and photoresists are among the holographic recording materials used in forming holographic elements 382.
  • Micro Lens Array The plate 340 can include or be comprised of an array of very small lenses.
  • a micro lens array 384 can also be referred to as a textured plate 384.
  • Collapsible Plate A plate 340 that provides for entering into a collapsed or compacted mode 128 when the system 100 is not being used to display an image 880.
  • the projection assembly 400 includes a display 410.
  • the projection assembly 400 can also include various supporting components 150 that focus the image 880 or otherwise modify the interim image 850 transforming it into the image 880 that is displayed to one or more users 90.
  • the projection assembly 400 can also be referred to as a projection subsystem 400.
  • Display or Screen An assembly, subassembly, mechanism, or device by which the image 880 is made accessible to the user 90. Examples of displays 410 include active screens 412, passive screens 414, eyepieces 416, and VRD eyepieces 418.
  • Passive Screen A non-powered surface on which the image 880 is projected.
  • a conventional movie theater screen is a common example of a passive screen 414.
  • Eyepiece A display 410 positioned directly in front of the eye 92 of an individual user 90.
  • VRD Eyepiece or VRD Display An eyepiece 416 that provides for directly projecting the image 880 on the eyes 92 of the user 90.
  • a VRD eyepiece 418 can also be referred to as a VRD display 418.
  • Curved Mirror An at least partially reflective surface that, in conjunction with the splitting plate, a plate 340, or other similar component, projects the image 880 onto the eye 92 of the viewer 96.
  • the curved mirror 420 can perform additional functions in embodiments of the system 100 that include a sensing mode 126 and/or an augmentation mode 122.
  • the sensor assembly 500 can also be referred to as a tracking assembly 500.
  • the sensor assembly 500 is a collection of components that can track the eye 92 of the viewer 96 while the viewer 96 is viewing an image 880.
  • the tracking assembly 500 can include an infrared camera 510, an infrared lamp 520, and a variety of supporting components 150.
  • the assembly 500 can also include a quad photodiode array or CCD.
  • the sensor 510 is typically a camera, such as an infrared camera.
  • Microphone A sensor 510 that captures sounds of the exterior operating environment 80.
  • Motion Sensor A sensor 510 that detects motion in the operating environment 80.
  • Lamp A light source for the sensor 510. For embodiments of the sensor 510, a light source is typically very helpful.
  • the lamp 520 is an infrared lamp and the camera is an infrared camera. This prevents the viewer 96 from being impacted by the operation of the sensor assembly 500.
  • Eye-Tracking Attribute An attribute pertaining to the movement and/or position of the eye 92 of the viewer 96.
  • Some embodiments of the system 100 can be configured to selectively influence the focal point 870 of light 800 in an area of the image 880 based on one or more eye-tracking attributes 530 measured or captured by the sensor assembly 500.
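As a hedged sketch of how an eye-tracking attribute 530 might drive such selective focal modulation, the Python fragment below maps gaze coordinates reported by the sensor assembly 500 to a weight that could scale the focal adjustment applied in a region of the image 880. The function name, coordinate convention, and linear falloff are all invented for illustration and are not taken from the specification.

    def focus_weight(px, py, gaze_x, gaze_y, radius=0.15):
        # Weight a pixel's focal adjustment by its distance from the gaze point
        # reported by the sensor assembly. Coordinates are normalized to [0, 1];
        # the weight falls off linearly to zero outside `radius`.
        dist = ((px - gaze_x) ** 2 + (py - gaze_y) ** 2) ** 0.5
        return max(0.0, 1.0 - dist / radius)

    print(focus_weight(0.45, 0.50, gaze_x=0.40, gaze_y=0.50))  # near the gaze point -> ~0.67
    print(focus_weight(0.90, 0.90, gaze_x=0.40, gaze_y=0.50))  # far from the gaze point -> 0.0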
  • Output Devices A device or component that communicates some aspect of the media experience 840 to the user 90.
  • the system 100 can utilize a wide variety of output devices 550, many of which may be standalone, non-integrated, plug-and-play types of components. Common examples of output devices 550 include speakers 560 and displays 410. Any mechanism for providing output or feedback to a user 90 in the prior art can be incorporated into the system 100.
  • speakers 560 include headphones and earphones.
  • Haptic Feedback Component A device or component that can provide haptic feedback to the user 90.
  • Augmentation Assembly A collection of components that provide for allowing or precluding an exterior environment image 650 from reaching the eye 92 of the viewer 96.
  • Shutter Component A device that provides for either allowing or disallowing exterior light from reaching the eyes 92 of the viewer 96 while the apparatus 110 is in use.
  • Window A passageway for light from the exterior environment in an embodiment that is not fully immersive.
  • Exterior Environment The surroundings of the system 100 or apparatus 110. Some embodiments of the system 100 can factor in lighting conditions of the exterior environment 650 in supplying light 800 for the display of images 880.
  • Parameters An at least substantially comprehensive compilation of different ways in which the apparatus 110 can operate.
  • the particular configuration 705 of parameters 700 that will be operable at any particular time will depend on the defining of one or more triggers 750.
  • categories of parameters 700 include but are not limited to a sound parameter 710, a display parameter 720, a progression parameter 730, and a haptic parameter 740.
  • Configuration A subset of operating parameters 700 from the universe of potential operating parameters 700. Different triggers 750 can result in different configurations 705.
  • the system 100 can be implemented to facilitate automatic changes from one configuration 705 of parameters 700 to another configuration 705 of parameters 700 based on one or more triggers 750, as in the illustrative sketch below.
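To make the relationship between triggers 750, configurations 705, and parameters 700 concrete, here is a minimal Python sketch. The trigger names and parameter values are invented for illustration and merely mirror the categories defined below (sound, display, progression, haptic); they are not the specification's own identifiers.

    # Each configuration 705 is a subset of parameters 700 keyed by category.
    CONFIGURATIONS = {
        "external_sound_detected": {     # hypothetical environmental-stimulus trigger
            "sound": "temporarily_reduced_volume",
            "display": "on_augmented_view",
            "progression": "timed_pause",
            "haptic": "haptic_alert",
        },
        "pre_defined_schedule": {        # hypothetical user-action trigger (alarm)
            "sound": "alert",
            "display": "flash",
            "progression": "pause",
            "haptic": "haptic_alert",
        },
    }

    def apply_trigger(trigger, current_config):
        # Return the configuration that results from a trigger firing; unknown
        # triggers leave the current configuration unchanged.
        return {**current_config, **CONFIGURATIONS.get(trigger, {})}

    state = {"sound": "on", "display": "on", "progression": "play", "haptic": "muted"}
    state = apply_trigger("external_sound_detected", state)
    print(state)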
  • Sound Parameters A parameter 700 pertaining to the communication of acoustic attributes 842 in the media experience 840 by the system 100 to the user 90.
  • Examples of sound parameters 710 can include but are not limited to an off/mute 711, a temporarily reduced volume 712, an alert 713, an external sound amplification 714, a message 715, and an ongoing volume change 716.
  • Off/Mute The sound parameter 710 where sound ceases to be communicated by the system 100 to the user 90.
  • Temporarily Reduced Volume The sound parameter 710 where sound is temporarily reduced in volume for a predefined period of time. This can serve as a notification to the user 90 as well as provide the user 90 with time to react to the applicable trigger 750.
  • An audible notification can be communicated to the user 90.
  • the system 100 can import sounds from the environment 80 that are captured via a microphone or other similar sensor and then play that sound through the speakers 560 of the system 100.
  • Ongoing Volume Change The sound parameter 710 where the volume is changed on a non-temporary (i.e. ongoing) basis.
  • Examples of display parameters 720 can include but are not limited to an off 721, a dimmed display 722, an off/external view 723, an on/augmented view 724, a flash 725, a verbal alert 726, and an increased brightness 727.
  • Display parameters 720 can be temporary (for a pre-defined period of time) or ongoing.
  • images 880 are displayed with light of reduced intensity.
  • Off/External View A display parameter 720 where the media content 840 is shut off, but a view of the operating environment 80 is displayed through a window or through the display 410.
  • On/Augmented View A display parameter 720 where media content 840 continues to play, but in an augmentation mode 122.
  • progression parameters 730 can include but are not limited to a stop 731, a pause 732, and a timed-pause 733.
  • Pause A progression parameter 730 where the media experience 840 is paused.
  • Timed-Pause A progression parameter 730 where the media experience 840 is paused for a specified period of time, before the media experience 840 automatically starts playing again.
  • Haptic communication typically involves vibration of a device.
  • it might include a chair or other devices.
  • Haptic Alert The invocation of vibration to alert the user 90 to something.
  • Haptic alerts 741 can be an effective way to get the attention of a user 90 engaged in primarily visual and/or acoustic content.
  • Muted Haptic For a media experience 840 that involves haptic feedback, the ability to mute that feedback can be a desirable parameter 700.
  • Trigger An event defined with respect to one or more inputs that is linked to one or more configurations 705. Examples of different categories of triggers 750 include but are not limited to user actions 760 and environmental stimuli 780.
  • User Action An activity by a user 90 that is linked or can be linked to a change in the configuration 705 of the system 100.
  • Examples of user actions 760 can include but are not limited to use or manipulation of a user control 761 , an eye-movement gesture 762, a kinetic gesture 763, a pre-defined user gesture 764, an input from peripheral device 765, a pre-defined voice command 766, and a pre-defined schedule 767.
  • User Control A user action 760 that involves the use or manipulation of a user control, such as a button, joystick, keypad, etc.
  • Eye-Movement Gesture A user action 760 that involves the movement of the eye 92 of the user 90.
  • Kinetic Gesture A user action 760 that involves the motion of the user 90.
  • Pre-Defined User Gesture A user action 760 that involves a gesture pre-defined by the user 90.
  • Peripheral Device Input A user action 760 that is in the form of an input received through a peripheral device.
  • Pre-Defined Voice Command A user action 760 that is in the form of a voice command captured through a microphone or similar sensor.
  • a user action 760 in the form of a scheduled date/time can be used as an alarm clock in some contexts.
  • a user 90 can set alarms, for example when playing video games, to avoid losing track of the time and being late for a dinner date.
  • Examples of environmental stimuli 780 can include but are not limited to an external sound 781, an external light 782, a detected location 783, a detected proximity 784, a detected motion 785, and an external communication 785.
  • External Sound A sound from the operating environment 80 that is captured by a microphone.
  • External Light A temporary pulse of light or a continuous source of light in the operating environment 80.
  • Detected Location A GPS location. This can be a highly useful trigger 750 for a user 90.
  • Detected Proximity The detection of an object in close proximity to the user 90 and/or apparatus 110.
  • Detected Motion The detection of a moving object in the operating environment 80.
  • External Communication A phone call, e-mail, text message, or other form of communication that can be routed by the user 90 through the system 100.
  • important communications can be differentiated based on the type of communication and the other person involved in the communication. It is anticipated that users 90 may route e-mail, phone calls, and other communications through the apparatus 110.
  • Light Light 800 is the medium through which an image is conveyed, and light 800 is what enables the sense of sight.
  • Light is electromagnetic radiation that is propagated in the form of photons.
  • Spectrum Light 800 can be differentiated and categorized on the basis of wavelength.
  • the spectrum 802 of light 800 is a range of light 800 that includes very long wavelength light 800 (the infrared spectrum 806) through very short wavelength light 800 (the ultraviolet spectrum 807), including light 800 in the visible spectrum 804.
  • Light 800 at different parts of the spectrum 802 will be of different colors.
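The spectral bands named above can be summarized with a small illustrative helper. The function and the ~380-750 nm bounds for visible light are generic, commonly cited values, not figures from the specification.

    def spectrum_band(wavelength_nm):
        # Coarse classification using the commonly cited ~380-750 nm visible range.
        if wavelength_nm > 750:
            return "infrared"      # longer wavelength than visible light
        if wavelength_nm >= 380:
            return "visible"
        return "ultraviolet"       # shorter wavelength than visible light

    print(spectrum_band(850))  # infrared (e.g., light from an eye-tracking lamp)
    print(spectrum_band(550))  # visible (green)
    print(spectrum_band(300))  # ultraviolet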
  • Full Spectrum Light 800 for which certain portions of the spectrum 802 are not blocked or differentiated.
  • many embodiments of the plate 340 will be full spectrum 803 processors of light even though only the visual spectrum 804 is used to comprise the image 880.
  • Visible Spectrum The visual spectrum 804 is comprised of light that is red, orange, yellow, green, blue, indigo, and violet.
  • Partial Visual Spectrum A subset of the visual spectrum 804.
  • Different embodiments of the plate 340 can possess light-impacting attributes, such as different reflectiveness 432, transmissiveness 434, and/or polarization 433, for different subsets of the visible spectrum 804.
  • Infrared Spectrum The portion of the spectrum 802 that is not visible to the human eye and has a longer wavelength than light 800 in the visible spectrum 804.
  • Ultraviolet Spectrum The portion of the spectrum 802 that is not visible to the human eye and has a shorter wavelength than light 800 in the visible spectrum 804.
  • Pulse An emission of light 800.
  • a pulse 810 of light 800 can be defined with respect to duration, wavelength, and intensity.
  • the image 880 displayed to the user 90 by the system 100 can, in many instances, be but part of a broader media experience.
  • a unit of media content 840 will typically include visual attributes 841 and acoustic attributes 842.
  • Tactile attributes 843 are not uncommon in certain contexts. It is anticipated that olfactory attributes 844 and gustatory attributes 845 may be added to media content 840 in the future.
  • Visual Attributes Attributes pertaining to the sense of sight.
  • the core function of the system 100 is to enable users 90 to experience visual content such as images 880 or video 890.
  • visual content will be accompanied by other types of content, most commonly sound or touch.
  • smell or taste content may also be included as part of the media content 840.
  • media content 840 will also involve other types of senses, such as the sense of sound.
  • Tactile Attributes Attributes pertaining to the sense of touch. Vibrations are a common example of media content 840 that is not in the form of sight or sound.
  • the system 100 and apparatuses 110 embodying the system 100 can include the ability to enable users 90 to experience tactile attributes 843 included with other types of media content 840.
  • Olfactory Attributes Attributes pertaining to the sense of smell. It is anticipated that future versions of media content 840 may include some capacity to engage users 90 with respect to their sense of smell. Such a capacity can be utilized in conjunction with the system 100, and potentially integrated with the system 100.
  • the iPhone app called oSnap is a current example of gustatory attributes 845 being transmitted electronically.
  • Gustatory Attributes Attributes pertaining to the sense of taste. It is anticipated that future versions of media content 840 may include some capacity to engage users 90 with respect to their sense of taste. Such a capacity can be utilized in conjunction with the system 100, and potentially integrated with the system 100.
  • a media player 848 is a device or configuration of devices that provides for the playing of media content 840 for users.
  • Examples of media players 848 include disc players such as DVD players and BLU-RAY players, cable boxes, tablet computers, smart phones, desktop computers, laptop computers, television sets, and other similar devices.
  • Some embodiments of the system 100 can include some or all of the aspects of a media player 848 while other embodiments of the system 100 will require that the system 100 be connected to a media player 848.
  • users 90 may connect a VRD apparatus 116 to a BLU-RAY player in order to access the media content 840 on a BLU-RAY disc.
  • the VRD apparatus 116 may include stored media content 840 in the form of a disc or computer memory component.
  • Non-integrated versions of the system 100 can involve media players 848 connected to the system 100 through wired and/or wireless means.
  • the image 880 displayed to the user 90 is created by the modulation of light 800 generated by one or more light sources 210 in the illumination assembly 200.
  • the image 880 will typically be modified in certain ways before it is made accessible to the user 90. Such earlier versions of the image 880 can be referred to as an interim image 850.
  • Optical Chain or Optical Pathway The travel path of light 800 within the system 100, beginning with one or more light sources 210 in the illumination assembly 200 and ending with the image 880 displayed in a location that is accessible to the viewer 96.
  • each image 880 can be referred to as a frame 882.
  • Video A sequence of images 880 displayed to the user 90 can be referred to collectively as a video 890.
  • Video 890 is comprised of a sequence of static images 880 representing snapshots displayed in rapid succession. Persistence of vision in the user 90 can be relied upon to create an illusion of continuity, allowing a sequence of still images 880 to give the impression of motion.
  • the entertainment industry currently relies primarily on frame rates between 24 FPS and 30 FPS, but the system 100 can be implemented at faster as well as slower frame rates.
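As a quick worked example of the frame rates mentioned above (simple arithmetic for illustration, not a prescription of the system's timing), the per-frame display period follows directly from the frame rate:

    def frame_period_ms(frames_per_second):
        # Time each still image (frame) remains on screen, in milliseconds.
        return 1000.0 / frames_per_second

    for fps in (24, 30, 60):
        print(f"{fps} FPS -> {frame_period_ms(fps):.1f} ms per frame")
    # 24 FPS -> 41.7 ms, 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms; persistence of
    # vision blends these rapidly successive frames into apparent motion.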
  • Stereoscopic Video A video 890 comprised of stereoscopic images 881.
  • Method A process for displaying an image 880 to a user 90 is described.
  • Illumination Method A process for generating light 800 for use by the system 100. The illumination method 910 is a process performed by the illumination assembly 200.
  • the imaging method 920 can also involve making subsequent modifications to the interim image 850.
  • the display method 930 can also include making modifications to the interim image 850.

Abstract

An apparatus (110), a system (100), and a method (900) for displaying an image (880) are disclosed. Instead of using a costly configuration of prisms (310), such as TIR prisms (311) or RTIR prisms (312), to direct light (800) to and from a micromirror array (324), a plate (340) possessing transmissive (374), reflective (372), and/or polarizing (373) characteristics is used. The plate (340) can be implemented in a wide range of different embodiments using a wide range of different components and configurations.
EP15795835.6A 2014-05-19 2015-05-19 Appareil, système et procédé permettant d'afficher une image à l'aide d'une plaque Withdrawn EP3146389A4 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201461994997P 2014-05-19 2014-05-19
US14/590,953 US20170139209A9 (en) 2014-01-06 2015-01-06 System, method, and apparatus for displaying an image using a curved mirror and partially transparent plate
US14/678,974 US20170068311A1 (en) 2015-04-04 2015-04-04 System, apparatus, and method for selectively varying the immersion of a media experience
PCT/US2015/031649 WO2015179455A2 (fr) 2014-05-19 2015-05-19 Appareil, système et procédé permettant d'afficher une image à l'aide d'une plaque

Publications (2)

Publication Number Publication Date
EP3146389A2 true EP3146389A2 (fr) 2017-03-29
EP3146389A4 EP3146389A4 (fr) 2018-09-19

Family

ID=54554960

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15795835.6A Withdrawn EP3146389A4 (fr) 2014-05-19 2015-05-19 Appareil, système et procédé permettant d'afficher une image à l'aide d'une plaque

Country Status (4)

Country Link
EP (1) EP3146389A4 (fr)
JP (1) JP2017523480A (fr)
CN (1) CN106605171A (fr)
WO (1) WO2015179455A2 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107589546A (zh) * 2017-10-23 2018-01-16 北京小米移动软件有限公司 光学系统及增强现实眼镜
CN110133859B (zh) 2018-02-09 2021-09-03 中强光电股份有限公司 显示装置
CN110133860B (zh) * 2018-02-09 2022-01-25 中强光电股份有限公司 显示装置
US11693248B1 (en) * 2022-01-20 2023-07-04 Microsoft Technology Licensing, Llc TIR prisms and use of backlight for LCoS microdisplay illumination

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6097543A (en) * 1992-02-07 2000-08-01 I-O Display Systems Llc Personal visual display
US6652105B1 (en) * 2001-11-30 2003-11-25 Infocus Corporation Reflective light valve-based multimedia projector employing a patterned-silvered mirror
US7283112B2 (en) * 2002-03-01 2007-10-16 Microsoft Corporation Reflective microelectrical mechanical structure (MEMS) optical modulator and optical display system
US7320826B2 (en) * 2003-03-20 2008-01-22 Ppg Industries Ohio, Inc. Photochromic articles with reduced temperature dependency and methods for preparation
US7220006B2 (en) * 2003-08-08 2007-05-22 Allen Eddie E Method and apparatus for increasing effective contrast ratio and brightness yields for digital light valve image projectors
JP2006023441A (ja) * 2004-06-07 2006-01-26 Kazuji Yoshida 画像表示装置
US20070081248A1 (en) * 2005-10-11 2007-04-12 Kuohua Wu Reflector
US7486341B2 (en) * 2005-11-03 2009-02-03 University Of Central Florida Research Foundation, Inc. Head mounted display with eye accommodation having 3-D image producing system consisting of, for each eye, one single planar display screen, one single planar tunable focus LC micro-lens array, one single planar black mask and bias lens
US7483200B1 (en) * 2008-01-14 2009-01-27 Spatial Photonics, Inc. Multiple stop micro-mirror array display
JP5201580B2 (ja) * 2008-06-06 2013-06-05 新オプトウエア株式会社 ホログラム作成装置及びホログラムプリンタ
US7926951B2 (en) * 2008-07-11 2011-04-19 Eastman Kodak Company Laser illuminated micro-mirror projector
US20110044046A1 (en) * 2009-04-21 2011-02-24 Abu-Ageel Nayef M High brightness light source and illumination system using same
US20130314303A1 (en) * 2010-02-28 2013-11-28 Osterhout Group, Inc. Ar glasses with user action control of and between internal and external applications with feedback
ES2748116T3 (es) * 2011-03-14 2020-03-13 Dolby Laboratories Licensing Corp Sistema de proyección 3D
JP5811491B2 (ja) * 2011-04-12 2015-11-11 株式会社ニコン 顕微鏡及びそのプログラム
WO2013036789A1 (fr) * 2011-09-09 2013-03-14 University Of Connecticut Dispositifs électrochromiques obtenus par formation in situ de polymères conjugués
US8982014B2 (en) * 2012-02-06 2015-03-17 Battelle Memorial Institute Image generation systems and image generation methods

Also Published As

Publication number Publication date
JP2017523480A (ja) 2017-08-17
EP3146389A4 (fr) 2018-09-19
CN106605171A (zh) 2017-04-26
WO2015179455A2 (fr) 2015-11-26
WO2015179455A3 (fr) 2016-01-21

Similar Documents

Publication Publication Date Title
US10409079B2 (en) Apparatus, system, and method for displaying an image using a plate
US9995857B2 (en) System, apparatus, and method for displaying an image using focal modulation
US20170068311A1 (en) System, apparatus, and method for selectively varying the immersion of a media experience
US11567328B2 (en) See-through computer display systems with adjustable zoom cameras
US9823474B2 (en) System, apparatus, and method for displaying an image with a wider field of view
US11500207B2 (en) Image expansion optic for head-worn computer
US20210165225A1 (en) See-through computer display systems
US20160195721A1 (en) System, method, and apparatus for displaying an image using a curved mirror and partially transparent plate
US9971156B2 (en) See-through computer display systems
US20160195718A1 (en) System, method, and apparatus for displaying an image using multiple diffusers
US20160292921A1 (en) System, apparatus, and method for displaying an image using light of varying intensities
US10598949B2 (en) Method and apparatus for forming a visible image in space
US20160198133A1 (en) System, method, and apparatus for displaying an image with reduced color breakup
EP3146389A2 (fr) Appareil, système et procédé permettant d'afficher une image à l'aide d'une plaque
EP3092791A1 (fr) Imagerie utilisant un miroir incurvé et une plaque partiellement transparente

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20161219

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 9/31 20060101AFI20180511BHEP

Ipc: G02B 27/01 20060101ALI20180511BHEP

Ipc: G03B 21/28 20060101ALI20180511BHEP

A4 Supplementary search report drawn up and despatched

Effective date: 20180821

RIC1 Information provided on ipc code assigned before grant

Ipc: G02B 27/01 20060101ALI20180814BHEP

Ipc: H04N 9/31 20060101AFI20180814BHEP

Ipc: G03B 21/28 20060101ALI20180814BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200915

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210326