US20180182272A1 - Microelectromechanical system over-scanning for pupil distance compensation - Google Patents
Microelectromechanical system over-scanning for pupil distance compensation Download PDFInfo
- Publication number
- US20180182272A1 (application US 15/390,346)
- Authority
- US
- United States
- Prior art keywords
- point
- oscillation
- pixels
- delay
- lens
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/002—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/08—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
- G02B26/0816—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
- G02B26/0833—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements the reflecting element being a micromechanical device, e.g. a MEMS mirror, DMD
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/08—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
- G02B26/10—Scanning systems
- G02B26/105—Scanning systems with one or more pivoting mirrors or galvano-mirrors
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/007—Use of pixel shift techniques, e.g. by mechanical shift of the physical pixels or by optical shift of the perceived pixels
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3129—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3129—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
- H04N9/3132—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen using one-dimensional electronic spatial light modulators
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0127—Head-up displays characterised by optical features comprising devices increasing the depth of field
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0149—Head-up displays characterised by mechanical features
- G02B2027/0161—Head-up displays characterised by mechanical features characterised by the relative positioning of the constitutive elements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
- G02B2027/0174—Head mounted characterised by optical features holographic
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Definitions
- Embodiments herein generally relate to head worn displays and heads up displays; and in particular to a wearable display to accommodate a range of pupil distances.
- Modern display technology may be implemented to provide head worn displays (HWD) through which a user can see the real world while also perceiving information (e.g., images, text, or the like) presented in conjunction with that see-through view.
- Such displays can be implemented in a variety of contexts, for example, defense, transportation, industrial, entertainment, wearable devices, or the like.
- an image may be reflected off a transparent projection surface to a user's eye to present an image in conjunction with a real world view.
- HWD systems involve difficult tradeoffs between various design and utility considerations, such as, for example, bulk, form-factor, see-through quality, field of view, etc. For example, a normal eyewear form factor without bulk has not been achieved in a commercial head mounted display.
- FIG. 1 illustrates a first example optical display system.
- FIG. 2 illustrates the example optical display system of FIG. 1 in conjunction with a user's eye.
- FIG. 3 illustrates a second example optical display system.
- FIG. 4 illustrates the example optical display system of FIG. 3 in alternate detail.
- FIGS. 5A-5C illustrate the example optical system of FIG. 3 in conjunction with a first user.
- FIGS. 6A-6C illustrate the example optical system of FIG. 3 in conjunction with a second user.
- FIGS. 7A-7D illustrate examples of shifted images.
- FIGS. 8A-8D illustrate examples of pixel timing versus a MEMS mirror oscillation.
- FIG. 9 illustrates an example tunable projector.
- FIG. 10 illustrates an example logic flow.
- FIG. 11 illustrates an example computer readable medium.
- FIG. 12 illustrates another example system.
- a projector configured to project images corresponding to a location on a lens
- HWDs provide a projection system and a lens that includes a holographic optical element (HOE).
- the projection system and the lens can be mounted to a frame to be worn by a user, for example, glasses, a helmet, or the like.
- the projection system projects an image onto an inside (e.g., proximate to the user) surface of the lens.
- the HOE reflects the image to an exit pupil (or viewpoint).
- the exit pupil is proximate to one of the user's eyes, and specifically, to the pupil of the user's eye. As such, the user may perceive the reflected image.
- two users with a different IPD may each wear a HWD (or similarly configured HWDs).
- the HWD may project an image to a lens with an HOE.
- the HOE may reflect the image to an exit pupil.
- the exit pupil may be proximate to the first user's eye pupil.
- the exit pupil may not be proximate to the second user's eye pupil as the second user has a different IPD than the first user (e.g., the second user's eyes are closer together than the first user's eyes, or the like). As such, the first user may correctly perceive the projected image but the second user may not.
- the present disclosure provides a HWD adapted to accept lenses with different HOEs.
- the present disclosure can provide a HWD configured to receive removable lenses.
- These removable lenses include an HOE.
- the HOE in one removable lens may be different (e.g., in a different location, with different optical characteristics, or the like) than the HOE in another removable lens to provide a HWD that can be provisioned for different IPDs.
- the HWD comprises a projector arranged to project an image into a location corresponding to the HOE of the lens mounted in the HWD.
- the projector includes a light source and a microelectromechanical system (MEMS) mirror arranged to oscillate about a number of oscillation axes to receive light emitted from the light source and reflect the light towards the lens to project an image onto the lens, and specifically onto the HOE.
- the projector can include an image location controller to control a position on the lens to which the image is projected. For example, the controller can shift the image from left to right (or from top to bottom) across the surface of the lens to project the image onto a location corresponding to a lens and HOE mounted to the HWD.
- the controller can delay a start of light corresponding to pixels of the image being emitted from the light source to correspond to a particular time in the periodic oscillation of the MEMS mirror.
- the controller can control the number of pixels in the image or can control the time period over which the light is emitted.
- the controller can control the light emitted from the light source to have different times depending upon the portion of the periodic oscillation, for example, the projector can implement different timing for left to right versus right to left MEMS mirror sweep.
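The sweep-dependent timing described in the bullets above can be pictured with a short sketch. This is an illustration only, not the patent's implementation; the function name `sweep_pixel_times` and the uniform-spread assumption (pixels distributed evenly over the sweep time remaining after a programmable start delay, one lens traversal per half period) are the editor's.

```python
def sweep_pixel_times(n_pixels, half_period_s, start_delay_s, reverse=False):
    """Emission times (seconds into one half-period sweep) for a line of
    n_pixels.  Delaying the first pixel shifts the projected line along the
    sweep direction; passing a different delay for the return sweep lets the
    left-to-right and right-to-left sweeps be timed independently.

    The pixel order is flipped on the return sweep (reverse=True) so both
    sweep directions paint the line in the same left-to-right screen order.
    """
    usable = half_period_s - start_delay_s
    if usable <= 0:
        raise ValueError("delay leaves no time to emit pixels")
    step = usable / n_pixels
    # Center each pixel's emission within its time slot.
    times = [start_delay_s + (k + 0.5) * step for k in range(n_pixels)]
    return times[::-1] if reverse else times
```

Increasing `start_delay_s` pushes every pixel later into the sweep, which moves the projected image across the lens without changing the image data itself.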
- a HWD adapted to accept a number of lenses each having an HOE can be provided.
- Each of the HOEs in the lenses may be configured to reflect an image projected onto the HOE to an exit pupil in a particular location, where the exit pupil for one HOE may differ from the exit pupil of another HOE.
- a first lens with a first HOE may be provided to reflect an image to a first exit pupil.
- a second lens with a second HOE may be provided to reflect the image to a second exit pupil.
- the first and second exit pupils may be displaced from each other, for example, in a horizontal direction.
- the HWD may be provisioned with either the first lens or the second lens to provide an exit pupil in either the first or second exit pupil location.
- the HWD may be configured to provide an exit pupil (e.g., reflected image) in a first location for a first user with a first IPD (e.g., distance between pupils) or in a second location for a second user with a second IPD.
- the projection system projects an image onto the location of the lens corresponding to the HOE, providing a HWD that accounts for different IPDs (e.g., an IPD of the intended user, or the like).
- The figures may reference variables, such as “a”, “b”, “c”, which are used to denote components where more than one component may be implemented. It is important to note that there need not necessarily be multiple components and, further, where multiple components are implemented, they need not be identical. Instead, use of variables to reference components in the figures is done for convenience and clarity of presentation.
- FIG. 1 illustrates an embodiment of a system 100 .
- System 100 may include a tunable projector 110 and an optical imaging display 120 .
- the components of system 100 operate to provide a user with a computer-mediated reality.
- system 100 may overlay computer generated graphics onto a user's view of the world.
- system 100 may provide a virtual reality view.
- Optical imaging display 120 includes a projection surface 122 and holographic optical element (HOE) 124 (also referred to as a holographic optical combiner).
- the projection system 110 projects light 101 onto lens 122 .
- the projected light 101 can correspond to virtual images.
- the lens 122 and specifically the HOE 124 , reflects (or redirects) the light towards a viewpoint 103 (or exit pupil). More particularly the HOE 124 reflects the projected light 101 .
- the tunable projector 110 can also project light 101 ′ within projection range 111 (e.g., to project images onto lens 122 ). Said differently, the tunable projector 110 can project light 101 ′ across range 111 to project an image onto a location of the lens 122 , for example, corresponding to the location of the HOE 124 on lens 122 . This projected image is reflected by the HOE to be perceived within viewpoint 103 ′. This is explained in greater detail below (e.g., refer to FIGS. 7A-7D and FIGS. 8A-8D ).
- the system 100 is adapted to receive lens 122 , or another lens like lens 122 having HOEs 124 in different locations (e.g., refer to FIG. 4 , FIGS. 5A-5C and FIGS. 6A-6C ).
- the tunable projector 110 projects an image onto a location of the lens corresponding to the HOE 124 to project an image at viewpoint 103 .
- the location of the viewpoint 103 relative to the lens or system 100 can change, for example, to accommodate different IPDs.
- the lens 122 and the HOE 124 redirect the projected images and also transmit light from the external environment to the viewpoint 103 .
- a virtual image and a real world image may be presented at the viewpoint 103 .
- the device 100 may include multiple projection systems 110 and optical imaging displays 120 (e.g., lenses 122 and HOEs 124 ) to provide multiple viewpoints 103 (e.g., for a multiple eye display, or the like).
- projection surface 122 is referred to as lens 122 interchangeably.
- lens 122 may not be a lens as traditionally used.
- lens 122 can be a helmet visor, or other projection surface in which a computer-mediated reality is desired or in which the system 100 can be implemented. As such, embodiments are not limited in this context.
- FIG. 2 illustrates an embodiment of system 100 in conjunction with an eye 200 .
- optical imaging display 120 may reflect an image projected by tunable projector 110 towards an eye 200 of a user.
- when eye 200 is located within viewpoint 103 , one or more portions of the reflected image may be visible to the eye 200 .
- tunable projector 110 can include a light source 112 to emit a light beam 113 of at least one wavelength.
- the light beam 113 is incident on (or received by) a scanning mirror 114 .
- the scanning mirror 114 rotates about a number of axes 115 to scan the light beam 113 as projected light 101 across lens 122 and particularly across HOE 124 .
- scanning mirror 114 scans the received light beam 113 onto (or across) the lens 122 while the light source 112 modulates or modifies the intensity of the light beam 113 to correspond to a digital image.
- a virtual or mediated reality display can be presented as the viewpoint 103 and may be perceived by a user via eye 200 .
- the tunable projector 110 further includes an image location controller 116 to control light beam 113 to change a location of the projected image within range 111 .
- the image location controller 116 can modify light beam 113 to change a location of projected light 101 (e.g., corresponding to an image) within range 111 to accommodate different positions of HOE 124 on the surface of lens 122 .
- FIG. 3 illustrates an embodiment of wearable device 300 .
- Wearable device 300 can include a wearable frame 302 , which can couple with tunable projector 110 and optical imaging display 120 .
- wearable frame 302 may hold tunable projector 110 in a certain position with respect to display 120 .
- wearable frame 302 may hold tunable projector 110 at a spacing and angle with respect to display 120 such that images are appropriately reflected by HOE 124 to be viewed by the eye (e.g., eye 200 ) of a user.
- wearable frame 302 may position the eye 200 (refer to FIG. 2 ) at a spacing with respect to display 120 such that the eye 200 of a user is appropriately located in viewpoint 103 (refer to FIGS. 1 and 2 ).
- Embodiments are not limited in this context.
- Wearable frame 302 may include stems 312 A, 312 B, rims 314 A, 314 B, and bridge 316 .
- Stem 312 A may couple to tunable projector 110 and rim 314 A.
- Rim 314 A may couple to display 120 .
- display 120 may include lens 122 held by rim 314 A.
- the lens 122 may be plastic.
- HOE 124 can be affixed to lens 122 as described herein.
- Rim 314 A may be connected to rim 314 B by bridge 316 .
- wearable frame 302 may include any device able to properly position tunable projector 110 with respect to display 120 to enable the desired reflection of a projected image by the optical imaging display 120 .
- wearable frame 302 may include one or more of eyeglass frames, a headband, a hat, a mask, a helmet, sunglasses, or similar head worn devices.
- the number and position of tunable projector 110 and display 120 may be altered without departing from the scope of this disclosure.
- wearable frame 302 may include two projectors and two displays to enable computer-augmented reality for both eyes of a user.
- tunable projector 110 may be embedded in stem 312 A of a pair of glasses.
- tunable projector 110 may be embedded in rim 314 A or bridge 316 of the wearable frame 302 .
- the tunable projector can be coupled (e.g., attached, embedded, or the like) to stem 312 B, rim 314 A, rim 314 B, or the like.
- display 120 can be a removable display, mounted in frame 302 .
- different lenses 122 having an HOE 124 in a different location from each other may be provisioned with frame 302 to provide a HWD 300 adaptable to different viewpoint locations.
- HWD 300 can provide a mediated-reality experience for users having different IPDs.
- wearable frame 302 may include control circuitry and a power source.
- the power source may include a battery or similar power storage device and provide operational power to wearable frame 302 .
- Control circuitry may include logic and/or hardware to implement one or more functional aspects of system 100 .
- control circuitry may enable wearable frame 302 to wirelessly communicate with one or more networks.
- lens 122 is an at least partially transparent surface with the HOE 124 affixed onto an inner (e.g., user facing) surface of lens 122 .
- the HOE 124 can be affixed to an external (e.g. not user facing) surface of lens 122 .
- the HOE 124 can be embedded (e.g., entirely or partially) within lens 122 , can form an integral part of lens 122 , or can form the entirety of lens 122 . Examples are not limited in these contexts.
- the lens 122 and the HOE 124 may transmit light incident on a real world side of the lens 122 to provide a real world view.
- the lens 122 is opaque and the lens 122 does not transmit light incident on a real world side of the lens 122 .
- the lens 122 may be sunglass lenses to reduce an amount or type of light transmitted through the lenses, for example, by polarization or absorption.
- the lenses 122 may be prescription lenses to correct or augment light perceived from the real world and/or the virtual image.
- Examples herein refer to a lens, and particularly to a pair of eye glasses having a lens 122 and HOE 124 as described. However, the present disclosure can be applied to other viewing apparatus, such as, for example, helmets, or the like.
- FIG. 4 is a block diagram of a top view of frame 302 of HWD 300 .
- frame 302 is adapted to receive a number of lenses 122 .
- each lens 122 has an HOE 124 , which can be in a different location.
- frame 302 is depicted including lens 122 - a , where “a” is a positive integer.
- lens 122 - a depicted in this figure is shown with HOE 124 - 1 and HOE 124 - 2 in different horizontal locations within the lens 122 - a . This is done for clarity in describing the reflection of light from the HOEs 124 - a based on their position relative to the tunable projector 110 .
- lens 122 - a can have a single HOE 124 - a positioned in a location within lens 122 for a specific IPD (e.g., refer to FIGS. 5A-5C and FIGS. 6A-6C ).
- the tunable projector 110 projects light 101 onto the lens 122 - a .
- the controller 116 of projector 110 can control light source 112 to cause light source 112 to emit light at times and/or durations in relation to oscillation of MEMS mirror 114 to cause light 101 to be projected onto lens 122 - a in an area corresponding to the location of HOE 124 - a within lens 122 - a .
- controller 116 can cause projector 110 to project light 101 - 1 onto HOE 124 - 1 .
- controller 116 can cause projector 110 to project light 101 - 2 onto HOE 124 - 2 .
- projector 110 can project an image onto HOEs 124 - 1 or 124 - 2 to cause the image to be viewable at viewpoint 103 - 1 or 103 - 2 , respectively.
- viewpoints 103 - 1 and 103 - 2 are offset from each other in a horizontal direction. Accordingly, a lens (e.g., the lens 122 - 1 , 122 - 2 , or the like) may be provided and the tunable projector 110 configured to provide an exit pupil (e.g., viewpoint 103 - 1 , viewpoint 103 - 2 , or the like) for a particular IPD.
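One way to picture the provisioning step is as a mapping from the HOE's horizontal position within the projector's range to a fraction of the sweep at which emission should begin. The sketch below is the editor's illustration, with a hypothetical function name and a deliberately simplified linear-sweep assumption (a real resonant MEMS sweep is closer to sinusoidal):

```python
def start_delay_for_hoe(offset_mm, sweep_width_mm, half_period_s):
    """Return the emission start delay that places the projected image at an
    HOE whose left edge sits offset_mm from the start of the projector's
    sweep range.  Assumes, for illustration only, that the mirror sweeps
    linearly across sweep_width_mm of the lens during one half period."""
    fraction = offset_mm / sweep_width_mm
    if not 0.0 <= fraction < 1.0:
        raise ValueError("HOE offset outside the projector's range")
    return fraction * half_period_s
```

Two lenses whose HOEs sit at different horizontal offsets would then simply yield two different start delays for the same projector hardware.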
- FIGS. 5A-5C and 6A-6C depict example implementations of the HWD 300 for two different users, respectively, each having different IPDs. It is noted that these example implementations, the hypothetical users, and their hypothetical IPDs are provided for convenience and clarity in discussing the examples of the present disclosure. Furthermore, these figures are not drawn to scale. Examples are not limited in any of these respects.
- FIGS. 5A-5C depict the example implementation of HWD 300 - 1 provided to a user 500 .
- the user 500 is depicted including eyes 540 - 1 and 540 - 2 , and a corresponding IPD 501 . More specifically, the distance between the input pupils 541 - 1 and 541 - 2 of the user's eyes 540 - 1 and 540 - 2 is the IPD 501 .
- the user 500 is depicted wearing the device 300 - 1 , which has the removable lens 122 - 1 operably coupled therein.
- the lens 122 - 1 is depicted with the HOE 124 - 1 in a particular location. More specifically, the HOE 124 - 1 is depicted disposed a horizontal distance 511 away from the tunable projector 110 , occupying area 503 .
- In FIG. 5C , a top view of the user 500 wearing the device 300 - 1 is depicted.
- the tunable projector 110 is depicted projecting light 101 - 1 onto a portion of lens 122 - 1 to project an image onto area 503 , and thus, HOE 124 - 1 .
- viewpoint 103 - 1 is proximate to the input pupil 541 - 1 of the user's eye 540 - 1 . Accordingly, by providing the lens 122 - 1 with HOE 124 - 1 in the set location and configuring tunable projector 110 to project light 101 - 1 into area 503 , viewpoint 103 - 1 is provided for user 500 having IPD 501 .
- FIGS. 6A-6C depict the example implementation of HWD 300 - 2 provided to a user 600 .
- the user 600 is depicted including eyes 640 - 1 and 640 - 2 , and a corresponding IPD 601 . More specifically, the distance between the input pupils 641 - 1 and 641 - 2 of the user's eyes 640 - 1 and 640 - 2 is the IPD 601 .
- the user 600 is depicted wearing the device 300 - 2 , which has the removable lens 122 - 2 operably coupled therein.
- the lens 122 - 2 is depicted with the HOE 124 - 2 in a particular location. More specifically, the HOE 124 - 2 is depicted disposed a horizontal distance 611 away from the tunable projector 110 , occupying area 603 .
- In FIG. 6C , a top view of the user 600 wearing the device 300 - 2 is depicted.
- the tunable projector 110 is depicted projecting light 101 - 2 onto a portion of lens 122 - 2 to project an image onto area 603 , and thus, HOE 124 - 2 .
- viewpoint 103 - 2 is proximate to the input pupil 641 - 1 of the user's eye 640 - 1 . Accordingly, by providing the lens 122 - 2 with HOE 124 - 2 in the set location and configuring tunable projector 110 to project light 101 - 2 into area 603 , viewpoint 103 - 2 is provided for user 600 having IPD 601 .
- a HWD configured to receive a removable lens (e.g., the lens 122 - 1 , the lens 122 - 2 , or the like) may be provisioned to provide a HWD with an eyebox (e.g., viewpoint 103 - 1 , viewpoint 103 - 2 , or the like) for different IPDs.
- the device 300 may be configured for a particular user by, for example, measuring the user's IPD (e.g., in an optometrist office, using digital tools, or the like), fixing the appropriate lens 122 - a into the frame 302 , and configuring tunable projector 110 to project an image onto an area of the lens 122 - a corresponding to the location of the HOE 124 - a.
- the tunable projector 110 can be configured to project images onto portions of lens 122 corresponding to a location of the HOE 124 within lens 122 .
- tunable projector 110 is depicted projecting images onto area 503 and area 603 in FIGS. 5C and 6C , respectively.
- tunable projector 110 projects pixels from right to left and left to right as the MEMS mirror 114 oscillates back and forth along oscillation axes 115 .
- Controller 116 can be configured to send a control signal to light source 112 to cause light source 112 to emit light 113 corresponding to pixels (e.g., lines, or the like) of an image to be projected at times corresponding to the oscillation of the MEMS mirror 114 .
- controller 116 configures light source 112 to pulse light 113 at times coincident with a location of the MEMS mirror 114 about the axes 115 .
- the location of the MEMS mirror 114 about axes 115 respective to pulses of light 113 can depend on the desired location of the projected image on lens 122 .
- the location of the projected image can be shifted. It is important to note that the location of the image can be shifted by an entire pixel or by portions of a pixel. That is, the present disclosure provides image shift granularity at less than a pixel resolution.
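Because the emission delay is a continuous time value rather than a count of whole pixels, shifts finer than one pixel fall out naturally. A minimal editor's sketch (the function name is hypothetical):

```python
def shifted_line_times(n_pixels, pixel_time_s, shift_pixels):
    """Emission times for one line of n_pixels, shifted along the sweep by
    shift_pixels.  shift_pixels may be fractional (e.g., 0.5), so the image
    can be displaced by portions of a pixel as well as whole pixels."""
    delay_s = shift_pixels * pixel_time_s
    return [delay_s + k * pixel_time_s for k in range(n_pixels)]
```

A shift of 0.5 pixels simply delays every pixel by half a pixel's dwell time, which no fixed pixel grid could express.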
- FIGS. 7A-7D depict an image projected at different locations on a projection surface 700 .
- surface 700 can be lens 122 , including an HOE in a particular location (e.g., dependent upon a desired viewpoint location, dependent upon a user's IPD, or the like).
- FIG. 7A depicts an image 710 projected onto surface 700 . As depicted, the image 710 is projected onto the entire projection surface 700 and centered over the surface 700 .
- In FIG. 7B , image 710 Centered is depicted projected onto a portion 742 of the projection surface 700 , with portions 743 not including image data. Furthermore, as depicted, image 710 Centered is centered across surface 700 .
- image 710 Right is depicted projected onto a portion 744 of the projection surface 700 , with portions 745 not including image data. Furthermore, as depicted, image 710 Right is shifted, relative to the image 710 Centered, to the right in a horizontal plane across surface 700 .
- image 710 Left is depicted projected onto a portion 746 of the projection surface 700 , with portions 747 not including image data. Furthermore, as depicted, image 710 Left is shifted, relative to the image 710 Centered, to the left in a horizontal plane across surface 700 .
- FIGS. 7A-7D are given for example only in describing the present disclosure. Shifts can be accomplished in any direction corresponding to an axes of oscillation of MEMS mirror 114 . Furthermore, the number of shifted images and the actual images projected can depend upon the implementation.
- images 710 Centered, 710 Right and 710 Left can be projected with a border. More specifically, portions of surface 700 where the images are not projected (e.g., portions 743 , 745 or 747 ) can have black pixels, white pixels, or another color of pixel projected to simulate a border.
- the controller 116 can send a control signal to the light source 112 to cause the light source 112 to not emit light 113 during periods where pixels are not displayed (described in greater detail below).
- the hardware required to project images (e.g., frame buffers, processing logic, etc.) need not be expanded, as the image shifting described herein is accomplished by timing the emission of light rather than by additional image processing.
- FIGS. 8A-8D depict example pixel timing graphs in relation to oscillations of a MEMS mirror about an axis. It is noted that the depicted graphs are given by way of example only. Furthermore, the graphs are described with respect to the images depicted in FIGS. 7A-7D for purposes of clarity of presentation. Additionally, the pixel line spacing and slope of oscillation are exaggerated in these figures for clarity of presentation. Furthermore, the oscillation axis is described with respect to a left-to-right and right-to-left oscillation across a horizontal plane of the display surface. Examples, however, are not limited in this context.
- In FIG. 8A, timing diagram 810 corresponding to projection of image 710 is depicted.
- Diagram 810 depicts oscillation 811 of MEMS mirror (e.g., MEMS mirror 114 , or the like) in conjunction with pixel lines 812 .
- the timing of emission (e.g., by light source 112 , or the like) of light 113 corresponding to pixel lines 812 is depicted.
- MEMS mirror oscillates about axes 115 from left to right and right to left.
- light 113 corresponding to pixel lines 812 is emitted by light source 112 at times coincident with the location of the MEMS mirror in the scan cycle or period of oscillation.
- the depicted timing diagram corresponds to projection of image 710 .
- Image 710 is centered across the entire surface 700 .
- pixel lines 812 are spaced apart more at the edges of each period than in the center.
- the MEMS mirror 114 oscillates slower at the edges (e.g., when changing direction of oscillation, or the like) than in the center of the oscillation axis.
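The relationship between a sinusoidal mirror oscillation and the pixel-line spacing in FIG. 8A can be sketched numerically. The snippet below is a hypothetical model (not taken from the disclosure): it assumes the mirror angle follows θ(t) = A·sin(2πt/T) and computes emission times for pixels spaced uniformly across the scan. Because the mirror moves slowest near the turnaround points, the computed times are spaced furthest apart at the edges of each period.

```python
import math

def pixel_emission_times(n_pixels, period, amplitude=1.0):
    """Emission times, within one left-to-right half-cycle, for n_pixels
    spaced uniformly across the scan range [-amplitude, amplitude],
    assuming theta(t) = amplitude * sin(2*pi*t/period). Hypothetical model."""
    times = []
    for k in range(n_pixels):
        # uniformly spaced target positions across the scan
        x = amplitude * (2 * k / (n_pixels - 1) - 1)
        # invert the rising sweep of the sine; t = 0 at the left turnaround
        t = (period / (2 * math.pi)) * math.asin(x / amplitude) + period / 4
        times.append(t)
    return times

times = pixel_emission_times(9, period=1.0)
gaps = [t2 - t1 for t1, t2 in zip(times, times[1:])]
# gaps near the edges come out larger than gaps near the center,
# matching the wider pixel-line spacing at the edges of each period
```

The same inversion also shows why equal time steps would bunch pixels at the edges: uniform spatial coverage requires non-uniform emission timing.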
- In FIG. 8B, timing diagram 820 corresponding to projection of image 710 Centered is depicted.
- Diagram 820 depicts oscillation 821 of MEMS mirror (e.g., MEMS mirror 114 , or the like) in conjunction with pixel lines 822 .
- the timing of emission (e.g., by light source 112 , or the like) of light 113 corresponding to pixel lines 822 is depicted.
- MEMS mirror oscillates about axes 115 from left to right and right to left.
- light 113 corresponding to pixel lines 822 is emitted by light source 112 at times coincident with the location of the MEMS mirror in the scan cycle or period of oscillation.
- the depicted timing diagram corresponds to projection of image 710 Centered.
- Image 710 Centered is centered and projected on portion 742 of surface 700 .
- pixel lines 822 are delayed from the beginning and end of each pass across the axis (e.g., left to right and right to left), as well as spaced apart more at the edges of each period than in the center.
- In FIG. 8C, timing diagram 830 corresponding to projection of image 710 Right is depicted.
- Diagram 830 depicts oscillation 831 of MEMS mirror (e.g., MEMS mirror 114 , or the like) in conjunction with pixel lines 832 .
- the timing of emission (e.g., by light source 112 , or the like) of light 113 corresponding to pixel lines 832 is depicted.
- MEMS mirror oscillates about axes 115 from left to right and right to left.
- light 113 corresponding to pixel lines 832 is emitted by light source 112 at times coincident with the location of the MEMS mirror in the scan cycle or period of oscillation.
- the depicted timing diagram corresponds to projection of image 710 Right.
- Image 710 Right is shifted to the right and projected on portion 744 of surface 700 .
- pixel lines 832 are shifted (e.g., delayed from starting in one direction and starting earlier in the other direction) relative to the centered pixel lines depicted in FIG. 8B .
- the duration of light pulses 113 corresponding to pixel lines 832 can vary based on the location of the MEMS mirror within period 833 .
- pixel lines 832 can be longer at the start of one cycle relative to the other cycle. More specifically, the duration of pixel lines can be different in the left-to-right oscillation than in the right-to-left oscillation, in addition to the timing of the start of the pixel lines and the spacing between pixel lines.
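Under the same sinusoidal-oscillation assumption, the start delay for a shifted image can be derived by inverting the mirror trajectory at the edges of the target portion of the surface. This is an illustrative sketch only; the function and the sub-region values are hypothetical, not the disclosure's implementation.

```python
import math

def emission_window(x0, x1, period, amplitude=1.0):
    """Start and stop emission times, within one left-to-right pass, for
    projecting only onto the sub-region [x0, x1] of the full scan range
    [-amplitude, amplitude], assuming a sinusoidal oscillation
    theta(t) = amplitude * sin(2*pi*t/period). Hypothetical model."""
    def t_of(x):
        # invert the rising sweep; t = 0 at the left turnaround point
        return (period / (2 * math.pi)) * math.asin(x / amplitude) + period / 4

    return t_of(x0), t_of(x1)

# centered image (as in FIG. 8B): symmetric delay at both ends of the pass
centered = emission_window(-0.5, 0.5, period=1.0)
# right-shifted image (as in FIG. 8C): a longer initial delay on the
# left-to-right pass, with emission running to the right edge of the scan
shifted_right = emission_window(0.0, 1.0, period=1.0)
```

On the return (right-to-left) pass the window is mirrored, which is why the delays differ between the two sweep directions.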
- In FIG. 8D, timing diagram 840 corresponding to projection of image 710 Left is depicted.
- Diagram 840 depicts oscillation 841 of MEMS mirror (e.g., MEMS mirror 114 , or the like) in conjunction with pixel lines 842 .
- the timing of emission (e.g., by light source 112 , or the like) of light 113 corresponding to pixel lines 842 is depicted.
- MEMS mirror oscillates about axes 115 from left to right and right to left.
- light 113 corresponding to pixel lines 842 is emitted by light source 112 at times coincident with the location of the MEMS mirror in the scan cycle or period of oscillation.
- the depicted timing diagram corresponds to projection of image 710 Left.
- Image 710 Left is shifted to the left and projected on portion 746 of surface 700 .
- pixel lines 842 are shifted (e.g., delayed from starting in one direction and starting earlier in the other direction) relative to the centered pixel lines depicted in FIG. 8B .
- the duration of light pulses 113 corresponding to pixel lines 842 can vary based on the location of the MEMS mirror within period 843.
- pixel lines 842 can be longer at the start of one cycle relative to the other cycle. More specifically, the duration of pixel lines can be different in the left-to-right oscillation than in the right-to-left oscillation, in addition to the timing of the start of the pixel lines and the spacing between pixel lines.
- projected images can be shifted across a display surface by a fraction of a pixel.
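Because the shift is realized in the timing domain rather than by re-rasterizing the image, its granularity is set by the timing resolution of the light source, not by the pixel grid. A minimal, purely hypothetical illustration:

```python
def subpixel_shift_delay(shift_in_pixels, pixel_duration_ns):
    """Extra emission-start delay producing a shift of a (possibly
    fractional) number of pixels. Hypothetical helper; assumes a locally
    uniform scan speed over the shifted span."""
    return shift_in_pixels * pixel_duration_ns

# shifting by a quarter pixel with 40 ns pixel lines -> 10 ns delay
delay_ns = subpixel_shift_delay(0.25, 40.0)
```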
- FIG. 9 depicts a block diagram of an example of tunable projector 110 .
- the tunable projector 110 can include the light source 112 (e.g., a laser, an LED, or the like) and movable mirror 114 configured to oscillate about an axis (e.g., axes 115, or the like).
- the mirror 114 may be a MEMS based mirror configured to rotate about a number of axes to scan light emitted from the light source 112 across a projection surface (e.g., the lens 122 , the area of a lens corresponding to HOE 124 , or the like).
- Tunable projector 110 can also include a controller 116 .
- the controller 116 may comprise hardware and/or software and may be configured to send one or more control signals to light source 112 and/or the mirror 114 to cause the light source 112 to emit light and the mirror 114 to rotate about a number of axes to project the light over a particular area corresponding to the HOE of a lens removably fixed in a frame of a device to which the tunable projector 110 is disposed.
- the controller 116 can include an IPD detector 902 .
- the IPD detector 902 may receive an information element including an indication of an IPD (e.g., the IPD 501, 601, or the like) or an indication of a location of an HOE (e.g., the horizontal displacement from the projector, such as the displacement 511, the displacement 611, or the like).
- the IPD detector 902 may receive an information element from a smart phone (or the like) including an indication of the location of the HOE 124 in the lens 122 removably coupled to the frame 302.
- Controller 116 can also include a pixel timer 904 .
- pixel timer 904 can send control signals to light source 112 to modify the duration and/or start times of light 113 corresponding to pixels in an image to be projected.
- pixel timer 904 can determine a desired image location based on an IPD.
- Pixel timer 904 can determine pixel durations and pixel start times based on the image location and the movable mirror 114 scanning speed.
- controller 116 can include pixel timing table 906 .
- Pixel timer 904 can determine durations and/or start times for pixels based on the pixel timing table 906.
- pixel timing table 906 is pre-programmed within controller 116.
- pixel timing table 906 is generated during operation, such as, for example, by a processing unit coupled to controller 116, processing logic included in controller 116, or the like.
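The pixel timing table 906 might be organized, for example, as a lookup from a measured IPD to pre-computed timing parameters. The structure, field names, and values below are illustrative assumptions only, not taken from the disclosure.

```python
# (ipd_min_mm, ipd_max_mm) -> (start_delay_us, duration_scale)
# Ranges and values are hypothetical placeholders.
PIXEL_TIMING_TABLE = [
    ((54.0, 58.0), (12.0, 0.95)),  # image shifted left (cf. 710 Left)
    ((58.0, 62.0), (8.0, 1.00)),   # image centered (cf. 710 Centered)
    ((62.0, 66.0), (4.0, 1.05)),   # image shifted right (cf. 710 Right)
]

def lookup_timing(ipd_mm):
    """Return (start_delay_us, duration_scale) for a measured IPD."""
    for (lo, hi), timing in PIXEL_TIMING_TABLE:
        if lo <= ipd_mm < hi:
            return timing
    raise ValueError("IPD outside supported range")
```

A pre-programmed table like this trades flash storage for avoiding the trigonometric inversion at runtime; a table generated during operation would compute the entries from the mirror's oscillation parameters instead.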
- FIG. 10 depicts a logic flow 1000 for projecting a virtual image.
- the logic flow 1000 may begin at block 1010 .
- scanning mirror 114 can oscillate about an oscillation axis (e.g., axes 115, or the like).
- light corresponding to a plurality of pixels of an image (e.g., light 113 from light source 112) can be directed to scanning mirror 114.
- light 113 can be directed at mirror 114 to be incident on mirror 114 during oscillation of mirror 114 about axes 115 .
- controller 116 can send a control signal to light source 112 to cause light source 112 to output light 113 corresponding to pixels of an image (e.g., image 700 , or the like) at times corresponding to the period of oscillation of the mirror 114 .
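Logic flow 1000 can be sketched with stand-in objects: oscillate the mirror, then gate the light source so each pixel line is emitted at its time within the oscillation period. All classes and method names here are hypothetical stubs, not the hardware interface of controller 116.

```python
class MirrorStub:
    """Stand-in for MEMS mirror 114."""
    def __init__(self):
        self.oscillating = False

    def start_oscillation(self):
        self.oscillating = True


class LightSourceStub:
    """Stand-in for light source 112; records emission commands."""
    def __init__(self):
        self.emitted = []

    def emit(self, line, start, duration):
        self.emitted.append((line, start, duration))


def project_frame(mirror, light_source, pixel_lines, timing):
    """Sketch of logic flow 1000: oscillate the mirror, then emit light
    for each pixel line at a time tied to the oscillation period."""
    mirror.start_oscillation()  # oscillate about the oscillation axis
    for line, (start, duration) in zip(pixel_lines, timing):
        light_source.emit(line, start, duration)  # direct light at the mirror


mirror = MirrorStub()
source = LightSourceStub()
project_frame(mirror, source, ["line0", "line1"], [(0.0, 40.0), (50.0, 40.0)])
```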
- FIG. 11 illustrates an embodiment of a storage medium 2000 .
- the storage medium 2000 may comprise an article of manufacture.
- the storage medium 2000 may include any non-transitory computer readable medium or machine readable medium, such as an optical, magnetic or semiconductor storage.
- the storage medium 2000 may store various types of computer executable instructions (e.g., 2002).
- the storage medium 2000 may store various types of computer executable instructions to implement technique 1000 .
- Examples of a computer readable or machine readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
- Examples of computer executable instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like. The examples are not limited in this context.
- FIG. 12 is a diagram of an exemplary system embodiment and in particular, depicts a platform 3000 , which may include various elements.
- platform (system) 3000 may include a processor/graphics core 3002, a chipset 3004, an input/output (I/O) device 3006, a random access memory (RAM) (such as dynamic RAM (DRAM)) 3008, a read only memory (ROM) 3010, HWD 3020 (e.g., HWD 300, or the like), and various other platform components 3014 (e.g., a fan, a cross flow blower, a heat sink, DTM system, cooling system, housing, vents, and so forth).
- System 3000 may also include wireless communications chip 3016 and graphics device 3018 . The embodiments, however, are not limited to these elements.
- I/O device 3006 , RAM 3008 , and ROM 3010 are coupled to processor 3002 by way of chipset 3004 .
- Chipset 3004 may be coupled to processor 3002 by a bus 3012 .
- bus 3012 may include multiple lines.
- Processor 3002 may be a central processing unit comprising one or more processor cores and may include any number of processors having any number of processor cores.
- the processor 3002 may include any type of processing unit, such as, for example, CPU, multi-processing unit, a reduced instruction set computer (RISC), a processor that has a pipeline, a complex instruction set computer (CISC), digital signal processor (DSP), and so forth.
- processor 3002 may be multiple separate processors located on separate integrated circuit chips.
- processor 3002 may be a processor having integrated graphics, while in other embodiments processor 3002 may be a graphics core or cores.
- Some embodiments may be described using the expression “one embodiment” or “an embodiment” along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Further, some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. Furthermore, aspects or elements from different embodiments may be combined.
Description
- Embodiments herein generally relate to head worn displays and heads up displays; and in particular to a wearable display to accommodate a range of pupil distances.
- Modern display technology may be implemented to provide head worn displays (HWDs) and heads up displays through which a user can see the real world while also seeing information (e.g., images, text, or the like) presented in conjunction with the see-through display. Such displays can be implemented in a variety of contexts, for example, defense, transportation, industrial, entertainment, wearable devices, or the like.
- In particular, an image may be reflected off a transparent projection surface to a user's eye to present an image in conjunction with a real world view. Conventionally, HWD systems have extremely difficult tradeoffs between various design and utility considerations, such as, for example, bulk, form-factor, see-through quality, field of view, etc. For example, achieving a normal eyewear form factor without bulk has not been achieved in a commercial head mounted display.
- Adding to the difficulty in designing and manufacturing commercial HWDs is the range over which different users' physiology (e.g., interpupillary distance (IPD), or the like) can vary.
- FIG. 1 illustrates a first example optical display system.
- FIG. 2 illustrates the example optical display system of FIG. 1 in conjunction with a user's eye.
- FIG. 3 illustrates a second example optical display system.
- FIG. 4 illustrates the example optical display system of FIG. 3 in alternate detail.
- FIGS. 5A-5C illustrate the example optical system of FIG. 3 in conjunction with a first user.
- FIGS. 6A-6C illustrate the example optical system of FIG. 3 in conjunction with a second user.
- FIGS. 7A-7D illustrate examples of shifted images.
- FIGS. 8A-8D illustrate examples of pixel timing versus a MEMS mirror oscillation.
- FIG. 9 illustrates an example tunable projector.
- FIG. 10 illustrates an example logic flow.
- FIG. 11 illustrates an example computer readable medium.
- FIG. 12 illustrates another example system.
- Various embodiments may be generally directed to head worn displays (HWDs) and specifically to a HWD with a projector configured to project images corresponding to a location on a lens. In general, HWDs provide a projection system and a lens that includes a holographic optical element (HOE). The projection system and the lens can be mounted to a frame to be worn by a user, for example, glasses, a helmet, or the like. During operation, the projection system projects an image onto an inside (e.g., proximate to the user) surface of the lens. The HOE reflects the image to an exit pupil (or viewpoint). Ideally, the exit pupil is proximate to one of the user's eyes, and specifically, to the pupil of the user's eye. As such, the user may perceive the reflected image.
- It is to be appreciated that different users may have different physiology, for example, a different interpupillary distance (IPD). More specifically, the distance between the eye pupils of one user may differ from that of another user. For example, two users with a different IPD may each wear a HWD (or similarly configured HWDs). The HWD may project an image to a lens with an HOE. The HOE may reflect the image to an exit pupil. When the HWD is worn by a first user, the exit pupil may be proximate to the first user's eye pupil. However, when the HWD is worn by a second user, the exit pupil may not be proximate to the second user's eye pupil as the second user has a different IPD than the first user (e.g., the second user's eyes are closer together than the first user's eyes, or the like). As such, the first user may correctly perceive the projected image but the second user may not.
- The present disclosure provides a HWD adapted to accept lenses with different HOEs. For example, the present disclosure can provide a HWD configured to receive removable lenses. These removable lenses include an HOE. The HOE in one removable lens may be different (e.g., in a different location, with different optical characteristics, or the like) than the HOE in another removable lens to provide a HWD that can be provisioned for different IPDs.
- Furthermore, the HWD comprises a projector arranged to project an image into a location corresponding to the HOE of the lens mounted in the HWD. The projector includes a light source and a microelectromechanical system (MEMS) mirror arranged to oscillate about a number of oscillation axes to receive light emitted from the light source and reflect the light towards the lens to project an image onto the lens, and specifically onto the HOE. The projector can include an image location controller to control a position on the lens to which the image is projected. For example, the controller can shift the image from left to right (or from top to bottom) across the surface of the lens to project the image onto a location corresponding to a lens and HOE mounted to the HWD.
- In general, the controller can delay a start of light corresponding to pixels of the image being emitted from the light source to correspond to a particular time in the periodic oscillation of the MEMS mirror. The controller can control the number of pixels in the image or can control the time period over which the light is emitted. The controller can control the light emitted from the light source to have different timing depending upon the portion of the periodic oscillation; for example, the projector can implement different timing for left to right versus right to left MEMS mirror sweeps.
- Accordingly, a HWD adapted to accept a number of lenses each having an HOE can be provided. Each of the HOEs in the lenses may be configured to reflect an image projected onto the HOE to an exit pupil in a particular location, where the exit pupil for one HOE may differ from the exit pupil of another HOE. For example, a first lens with a first HOE may be provided to reflect an image to a first exit pupil. A second lens with a second HOE may be provided to reflect the image to a second exit pupil. The first and second exit pupils may be displaced from each other, for example, in a horizontal direction. Accordingly, the HWD may be provisioned with either the first lens or the second lens to provide an exit pupil in either the first or second exit pupil location. As such, the HWD may be configured to provide an exit pupil (e.g., reflected image) in a first location for a first user with a first IPD (e.g., distance between pupils) or in a second location for a second user with a second IPD.
- During operation, the projection system projects an image onto the location of the lens corresponding to the HOE. Thus, a HWD accounting for different IPDs (e.g., an IPD of the intended user, or the like) is provided.
- Reference is now made to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding thereof. It may be evident, however, that the novel embodiments can be practiced without these specific details. In other instances, known structures and devices are shown in block diagram form in order to facilitate a description thereof. The intention is to provide a thorough description such that all modifications, equivalents, and alternatives within the scope of the claims are sufficiently described.
- Additionally, reference may be made to variables, such as, “a”, “b”, “c”, which are used to denote components where more than one component may be implemented. It is important to note, that there need not necessarily be multiple components and further, where multiple components are implemented, they need not be identical. Instead, use of variables to reference components in the figures is done for convenience and clarity of presentation.
-
FIG. 1 illustrates an embodiment of asystem 100.System 100 may include atunable projector 110 and anoptical imaging display 120. In general, the components ofsystem 100 operate to provide a user with a computer-mediated reality. For example,system 100 may overlay computer generated graphics onto a user's view of the world. In some examples,system 100 may provide a virtual reality view. -
Optical imaging display 120 includes aprojection surface 122 and holographic optical element (HOE) 124 (also referred to as a holographic optical combiner). During operation, theprojection system 110 projects light 101 ontolens 122. The projected light 101 can correspond to virtual images. Thelens 122, and specifically theHOE 124, reflects (or redirects) the light towards a viewpoint 103 (or exit pupil). More particularly theHOE 124 reflects the projectedlight 101. - The
tunable projector 110 can also project light 101′ within projection range 111 (e.g., to project images onto lens 122). Said differently, thetunable projector 110 can project light 101′ acrossrange 111 to project an image onto a location of thelens 122, for example, corresponding to the location of theHOE 124 onlens 122. This projected image is reflected by the HOE to be perceived withinviewpoint 103′. This is explained in greater detail below (e.g., refer toFIGS. 7A-7D andFIGS. 8A-8D ). - However, in general, the
system 100 is adapted to receivelens 122, or another lens likelens 122 havingHOEs 124 in different locations (e.g., refer toFIG. 4 ,FIGS. 5A-5C andFIGS. 6A-6C ). Thetunable projector 110 projects an image onto a location of the lens corresponding to theHOE 124 to project an image atviewpoint 103. Thus, the location of theviewpoint 103 relative to the lens orsystem 100 can change, for example, to accommodate different IPDs. - With some examples, the
lens 122 and theHOE 124 redirect the projected images and also transmit light from the external environment to theviewpoint 103. As such, a virtual image and a real world image may be presented at theviewpoint 103. It is noted, that although thedevice 100 is depicted with asingle projection system 110 andoptical imaging display 120, thedevice 100 may includemultiple projection systems 110 and optical imaging displays 120 (e.g.,lenses 122 and HOEs 124) to provide multiple viewpoints 103 (e.g., for a multiple eye display, or the like). - As used herein,
projection surface 122 is referred to aslens 122 interchangeably. However,lens 122 may not be a lens as traditionally used. For example,lens 122 can be a helmet visor, or other projection surface in which a computer-mediated reality is desired or in which thesystem 100 can be implemented. As such, embodiments are not limited in this context. -
FIG. 2 illustrates an embodiment ofsystem 100 in conjunction with aneye 200. In various embodiments,optical imaging display 120 may reflect an image projected bytunable projector 110 towards aneye 200 of a user. In various such embodiments, wheneye 200 is located withinviewpoint 103, one or more portions of the reflected image may be visible to theeye 200. - In some examples,
tunable projector 110 can include alight source 112 to emit alight beam 113 of at least one wavelength. Thelight beam 113 is incident on (or received by) ascanning mirror 114. Thescanning mirror 114 rotates about a number ofaxes 115 to scan thelight beam 113 as projected light 101 acrosslens 122 and particularly acrossHOE 124. In general,scanning mirror 114 scans the receivedlight beam 113 onto (or across) thelens 122 while thelight source 112 modulates or modifies the intensity of thelight beam 113 to correspond to a digital image. Thus, a virtual or mediated reality display can be presented as theviewpoint 103 and may be perceived by a user viaeye 200. - The
tunable projector 110 further includes animage location controller 116 to controllight beam 113 to change a location of projected image withinrange 111. For example, theimage location controller 116 can modifylight beam 113 to change a location of projected light 101 (e.g., corresponding to an image) withinrange 111 to accommodate different positions ofHOE 124 on the surface oflens 122. -
FIG. 3 illustrates an embodiment ofwearable device 300.Wearable device 300 can include awearable frame 302, which can couple withtunable projector 110 andoptical imaging display 120. In various embodiments,wearable frame 302 may holdtunable projector 110 in a certain position with respect to display 120. For example,wearable frame 302 may holdtunable projector 110 at a spacing and angle with respect to display 120 such that images are appropriately reflected byHOE 124 to be viewed by the eye (e.g., eye 200) of a user. In some embodiments,wearable frame 302 may position the eye 200 (refer toFIG. 2 ) at a spacing with respect to display 120 such that theeye 200 of a user is appropriately located in viewpoint 103 (refer toFIGS. 1 and 2 ). Embodiments are not limited in this context. -
Wearable frame 302 may include stems 312A, 312B, rims 314A, 314B, andbridge 316.Stem 312A may couple totunable projector 110 andrim 314A.Rim 314A may couple to display 120. For example,display 120 may includelens 122 held byrim 314A. In some embodiments thelens 122 may be plastic.HOE 124 can be affixed tolens 122 as described herein.Rim 314A may be connected torim 314B bybridge 316. In various embodiments,wearable frame 302 may include any device able to properly positiontunable projector 110 with respect to display 120 to enable the desired reflection of a projected image by thefield imaging display 120. For instance,wearable frame 302 may include one or more of eyeglass frames, a headband, a hat, a mask, a helmet, sunglasses, or similar head worn devices. Further, the number and position oftunable projector 110 anddisplay 120 may be altered without departing from the scope of this disclosure. For example,wearable frame 302 may include two projectors and two displays to enable computer-augmented reality for both eyes of a user. As depicted, in some embodiments,tunable projector 110 may be embedded instem 312A of a pair of glasses. In other embodiments,tunable projector 110 may be embedded inrim 314A or bridge 316 of thewearable frame 302. In some examples, the tunable projector can be coupled (e.g., attached, embedded, or the like) to stem 312B,rim 314A,rim 314B, or the like. - Furthermore, display 120 can be a removable display, mounted in
frame 302. For example,different lenses 122 having anHOE 124 in a different location from each other may be provisioned withframe 302 to provide aHWD 300 adaptable to different viewpoint locations. Thus,HWD 300 can provide a mediated-reality experience for users having different IPDs. - It will be appreciated that the components of wearable frame 102 and their arrangement illustrated in
FIG. 3 is exemplary and other components and arrangements may be used without departing from the scope of this disclosure. For example,wearable frame 302 may include control circuitry and a power source. In some embodiments, the power source may include a battery or similar power storage device and provide operational power towearable frame 302. Control circuitry may include logic and/or hardware to implement one or more functional aspects ofsystem 100. For instance, control circuitry may enablewearable frame 302 to wirelessly communicate with one or more networks. - In some examples,
lens 122 is an at least partially transparent surface with theHOE 124 affixed onto an inner (e.g., user facing) surface oflens 122. In some examples, theHOE 124 can be affixed to an external (e.g. not user facing) surface oflens 122. In some examples, theHOE 124 can be embedded (e.g., entirely or partially) withinlens 122, can form an integral part oflens 122, or can form the entirety oflens 122. Example are not limited in these contexts. During operation, thelens 122 and theHOE 124 may transmit light incident on a real world side of thelens 122 to provide a real world view. In some examples, thelens 122 is opaque and thelens 122 does not transmit light incident on a real world side of thelens 122. With some examples, thelens 122 may be sunglass lenses to reduce an amount or type of light transmitted through the lenses, for example, by polarization or absorption. With some examples, thelenses 122 may be prescription lenses to correct or augment light perceived from the real world and/or the virtual image. - Furthermore, as noted, although reference herein is made to lens and particularly to a pair of eye glasses having a
lens 122 andHOE 124 as described. The present disclosure can be applied to other viewing apparatus, such as, for example, helmets, or the like. -
FIG. 4 is a block diagram of a top view offrame 302 ofHWD 300. As noted above,frame 302 is adapted to receive a number oflenses 122. Where eachlens 122 has anHOE 124, which can be in a different location. For example,frame 302 is depicted including lens 122-a, where “a” is positive integer. It is noted, that lens 122-a depicted in this figure is shown with HOE 124-1 and HOE 124-2 in horizontal locations within the lens 122-a. This is done for clarity in describing the reflection of light from the HOEs 124-a based on their position relative to thetunable projector 110. However, during practice, lens 122-a can have a single HOE 124-a positioned in a location withinlens 122 for a specific IPD (e.g., refer toFIGS. 5A-5C andFIGS. 6A-6C ). - During operation, the
tunable projector 110 projects light 101 onto the lens 122-a. For example, thecontroller 116 ofprojector 110 can controllight source 112 to cause light source to emit light at times and/or durations in relation to oscillation ofMEMS mirror 114 to cause light 101 to be projected onto lens 122-a in an area corresponding to the location of HOE 124-a within lens 122-a. For example,controller 116 can causeprojector 110 to project light 101-1 onto HOE 124-1. Likewise,controller 116 can causeprojector 110 to project light 101-2 onto HOE 124-2. Thus,projector 110 can project an image onto HOEs 124-1 or 124-2 to cause the image to be viewable at viewpoint 103-1 or 103-2, respectively. - It is noted, viewpoints 103-1 and 103-2 are offset from each other in a horizontal direction. Accordingly, a lens (e.g., the lens 121-1, 121-2, or the like) may be provided and the
tunable projector 110 configured to provide an exit pupil (e.g., viewpoint 103-1, viewpoint 103-2, or the like) for a particular IPD. -
FIGS. 5A-5C and 6A-6C depict example implementations of theHWD 300 for two different users, respectively, each having different IPDs. It is noted, that these example implementations, the hypothetical users and their hypothetical IPDs are provided for convenience and clarity in discussing the examples of the present disclosure. Furthermore, these figures are not drawn to scale. Examples are not limited in any of these respects. - Turning more particularly to
FIGS. 5A-5C , these figures depict the example implementation of HWD 300-1 provided to auser 500. InFIG. 5A , theuser 500 is depicted including eyes 540-1 and 540-2, and acorresponding IPD 501. More specifically, the distance between the input pupils 541-1 and 541-2 of the user's eyes 540-1 and 540-2 is theIPD 501. - The
user 500 is depicted wearing the device 300-1, which has the removable lens 122-1 operably coupled therein. InFIG. 5B , the lens 122-1 is depicted with the HOE 124-1 in a particular location. More specifically, the HOE 124-1 is depicted disposed ahorizontal distance 511 away from thetunable projector 110, occupyingarea 503. InFIG. 5C , a top view of theuser 500 wearing the device 300-1 is depicted. Thetunable projector 110 is depicted projecting light 101-1 onto a portion of lens 122-1 to project an image ontoarea 503, and thus, HOE 124-1. The image is reflected by the HOE 124-1 to viewpoint 103-1. As depicted, viewpoint 103-1 is proximate to the input pupil 541-1 of the users eye 540-1. Accordingly, by providing the lens 122-1 with HOE 124-1 in the set location and configuringtunable projector 110 to project light 101-1 intoarea 503, viewpoint 103-1 is provided foruser 500 havingIPD 501. - Turning more particularly to
FIGS. 6A-6C, these figures depict the example implementation of HWD 300-2 provided to a user 600. In FIG. 6A, the user 600 is depicted including eyes 640-1 and 640-2, and a corresponding IPD 601. More specifically, the distance between the input pupils 641-1 and 641-2 of the user's eyes 640-1 and 640-2 is the IPD 601. - The
user 600 is depicted wearing the device 300-2, which has the removable lens 122-2 operably coupled therein. In FIG. 6B, the lens 122-2 is depicted with the HOE 124-2 in a particular location. More specifically, the HOE 124-2 is depicted disposed a horizontal distance 611 away from the tunable projector 110, occupying area 603. In FIG. 6C, a top view of the user 600 wearing the device 300-2 is depicted. The tunable projector 110 is depicted projecting light 101-2 onto a portion of the lens 122-2 to project an image onto area 603, and thus onto the HOE 124-2. The image is reflected by the HOE 124-2 to viewpoint 103-2. As depicted, viewpoint 103-2 is proximate to the input pupil 641-1 of the user's eye 640-1. Accordingly, by providing the lens 122-2 with the HOE 124-2 in the set location and configuring the tunable projector 110 to project light 101-2 into area 603, viewpoint 103-2 is provided for the user 600 having the IPD 601. - Accordingly, as depicted in
FIGS. 5A-5C and FIGS. 6A-6C, a HWD configured to receive a removable lens (e.g., the lens 122-1, the lens 122-2, or the like) may be provisioned to provide an eyebox (e.g., viewpoint 103-1, viewpoint 103-2, or the like) for different IPDs. Accordingly, the device 300 may be configured for a particular user by, for example, measuring the user's IPD (e.g., in an optometrist's office, using digital tools, or the like), fixing the appropriate lens 122-a into the frame 302, and configuring the tunable projector 110 to project an image onto an area of the lens 122-a corresponding to the location of the HOE 124-a. - As noted, the
tunable projector 110 can be configured to project images onto portions of the lens 122 corresponding to a location of the HOE 124 within the lens 122. For example, the tunable projector 110 is depicted projecting images onto area 503 and area 603 in FIGS. 5C and 6C, respectively. In general, the tunable projector 110 projects pixels from right to left and left to right as the MEMS mirror 114 oscillates back and forth about the oscillation axis 115. The controller 116 can be configured to send a control signal to the light source 112 to cause the light source 112 to emit light 113 corresponding to pixels (e.g., lines, or the like) of an image to be projected at times corresponding to the oscillation of the MEMS mirror 114. - More specifically, as the
MEMS mirror 114 oscillates about the axis 115, the controller 116 configures the light source 112 to pulse light 113 at times coincident with a location of the MEMS mirror 114 about the axis 115. The location of the MEMS mirror 114 about the axis 115 relative to pulses of light 113 can depend on the desired location of the projected image on the lens 122. As such, by configuring the light source 112 to emit light 113 corresponding to pixels of the image at particular times within the oscillation cycle of the MEMS mirror 114, the location of the projected image can be shifted. It is important to note that the location of the image can be shifted by an entire pixel or by portions of a pixel. That is, the present disclosure provides image shift granularity at less than a pixel resolution. -
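The timing-to-position relationship described above can be sketched numerically. The sketch below assumes, purely for illustration, a sinusoidal mirror trajectory and a normalized horizontal scan coordinate in [-1, 1]; neither the trajectory model nor the function name `emission_time` comes from the disclosure:

```python
import math

def emission_time(x_norm, period, left_to_right=True):
    """Time offset within one oscillation period at which to pulse the
    light source so the beam lands at normalized horizontal position
    x_norm in [-1, 1] (-1 = left edge of the scan range, +1 = right edge).

    Assumes the mirror angle follows a sinusoid; illustrative model only.
    """
    phase = math.asin(x_norm)                  # in [-pi/2, +pi/2]
    t = period / 4.0 + phase * period / (2.0 * math.pi)
    if left_to_right:
        return t                               # forward sweep: [0, T/2]
    return period - t                          # return sweep: [T/2, T]
```

Because `x_norm` is continuous, nudging it by a fraction of the pixel pitch nudges the emission time by a corresponding sub-pixel amount, consistent with the sub-pixel shift granularity noted above.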
FIGS. 7A-7D depict an image projected at different locations on a projection surface 700. In some examples, surface 700 can be the lens 122, including an HOE in a particular location (e.g., dependent upon a desired viewpoint location, dependent upon a user's IPD, or the like). FIG. 7A depicts an image 710 projected onto surface 700. As depicted, the image 710 is projected onto the entire projection surface 700 and centered over the surface 700. Turning more specifically to FIG. 7B, image 710Centered is depicted projected onto a portion 742 of the projection surface 700, with portions 743 not including image data. Furthermore, as depicted, image 710Centered is centered across surface 700. Turning more specifically to FIG. 7C, image 710Right is depicted projected onto a portion 744 of the projection surface 700, with portions 745 not including image data. Furthermore, as depicted, image 710Right is shifted, relative to the image 710Centered, to the right in a horizontal plane across surface 700. Turning more specifically to FIG. 7D, image 710Left is depicted projected onto a portion 746 of the projection surface 700, with portions 747 not including image data. Furthermore, as depicted, image 710Left is shifted, relative to the image 710Centered, to the left in a horizontal plane across surface 700. - It is noted that the shifted images depicted in
FIGS. 7A-7D are given by way of example only in describing the present disclosure. Shifts can be accomplished in any direction corresponding to an axis of oscillation of the MEMS mirror 114. Furthermore, the number of shifted images and the actual images projected can depend upon the implementation. - In some examples, images 710Centered, 710Right, and 710Left can be projected with a border. More specifically, portions of
surface 700 where the images are not projected (e.g., portions 743, 745, and 747, or the like) can form a blank border around the image. For example, the controller 116 can send a control signal to the light source 112 to cause the light source 112 not to emit light 113 during periods where pixels are not displayed (described in greater detail below). As such, the hardware required to project images (e.g., frame buffers, processing logic, etc.) can be reduced as the size of the projected images is reduced. -
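The border behavior above amounts to gating the light source on the scan position. The sketch below illustrates this; the function name and normalized coordinates are assumptions of this sketch, not terms from the disclosure:

```python
def should_emit(x_norm, img_left, img_right):
    """Return True when the scan position x_norm (normalized to [-1, 1])
    falls inside the projected image span [img_left, img_right]; outside
    that span the light source is kept off, producing the blank border.
    Illustrative sketch only.
    """
    return img_left <= x_norm <= img_right
```

Because pixels outside the image span are never emitted, a smaller projected image needs proportionally less frame-buffer and timing state, matching the hardware-reduction point above.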
FIGS. 8A-8D depict example pixel timing graphs shown in relation to oscillations of a MEMS mirror about an axis. It is noted that the depicted graphs are given by way of example only. Furthermore, the graphs are described with respect to the images depicted in FIGS. 7A-7D for purposes of clarity of presentation. Additionally, the pixel line spacing and slope of oscillation are exaggerated in these figures for clarity of presentation. Furthermore, the oscillation axis is described with respect to a left-to-right and right-to-left oscillation across a horizontal plane of the display surface. Examples, however, are not limited in this context. - Turning more specifically to
FIG. 8A, timing diagram 810 corresponding to projection of image 710 is depicted. Diagram 810 depicts oscillation 811 of the MEMS mirror (e.g., MEMS mirror 114, or the like) in conjunction with pixel lines 812. The timing of emission (e.g., by light source 112, or the like) of light 113 corresponding to pixel lines 812 is depicted. As noted, the MEMS mirror oscillates about the axis 115 from left to right and right to left. As such, during each period 813, light 113 corresponding to pixel lines 812 is emitted by light source 112 at times coincident with the location of the MEMS mirror in the scan cycle or period of oscillation. As noted, the depicted timing diagram corresponds to projection of image 710. Image 710 is centered across the entire surface 700. However, as depicted in FIG. 8A, pixel lines 812 are spaced apart more at the edges of each period than in the center. With some examples, the MEMS mirror 114 oscillates more slowly at the edges (e.g., when changing direction of oscillation, or the like) than in the center of the oscillation axis. - Turning more specifically to
FIG. 8B, timing diagram 820 corresponding to projection of image 710Centered is depicted. Diagram 820 depicts oscillation 821 of the MEMS mirror (e.g., MEMS mirror 114, or the like) in conjunction with pixel lines 822. The timing of emission (e.g., by light source 112, or the like) of light 113 corresponding to pixel lines 822 is depicted. As noted, the MEMS mirror oscillates about the axis 115 from left to right and right to left. As such, during each period 823, light 113 corresponding to pixel lines 822 is emitted by light source 112 at times coincident with the location of the MEMS mirror in the scan cycle or period of oscillation. As noted, the depicted timing diagram corresponds to projection of image 710Centered. Image 710Centered is centered and projected on portion 742 of surface 700. Accordingly, as depicted in FIG. 8B, pixel lines 822 are delayed from the beginning and end of each pass across the axis (e.g., left to right and right to left), as well as spaced apart more at the edges of each period than in the center. - Turning more specifically to
FIG. 8C, timing diagram 830 corresponding to projection of image 710Right is depicted. Diagram 830 depicts oscillation 831 of the MEMS mirror (e.g., MEMS mirror 114, or the like) in conjunction with pixel lines 832. The timing of emission (e.g., by light source 112, or the like) of light 113 corresponding to pixel lines 832 is depicted. As noted, the MEMS mirror oscillates about the axis 115 from left to right and right to left. As such, during each period 833, light 113 corresponding to pixel lines 832 is emitted by light source 112 at times coincident with the location of the MEMS mirror in the scan cycle or period of oscillation. As noted, the depicted timing diagram corresponds to projection of image 710Right. Image 710Right is shifted to the right and projected on portion 744 of surface 700. Accordingly, as depicted in FIG. 8C, pixel lines 832 are shifted (e.g., delayed from starting in one direction and starting earlier in the other direction) relative to the centered pixel lines depicted in FIG. 8B. In some examples, the duration of light pulses 113 corresponding to pixel lines 832 can vary based on the location of the MEMS mirror within period 833. For example, pixel lines 832 can be longer at the start of one cycle relative to the other cycle. More specifically, the duration of the pixel lines can differ between the left-to-right and right-to-left oscillations, in addition to the timing of the start of the pixel lines and the spacing between pixel lines. - Turning more specifically to
FIG. 8D, timing diagram 840 corresponding to projection of image 710Left is depicted. Diagram 840 depicts oscillation 841 of the MEMS mirror (e.g., MEMS mirror 114, or the like) in conjunction with pixel lines 842. The timing of emission (e.g., by light source 112, or the like) of light 113 corresponding to pixel lines 842 is depicted. As noted, the MEMS mirror oscillates about the axis 115 from left to right and right to left. As such, during each period 843, light 113 corresponding to pixel lines 842 is emitted by light source 112 at times coincident with the location of the MEMS mirror in the scan cycle or period of oscillation. As noted, the depicted timing diagram corresponds to projection of image 710Left. Image 710Left is shifted to the left and projected on portion 746 of surface 700. Accordingly, as depicted in FIG. 8D, pixel lines 842 are shifted (e.g., delayed from starting in one direction and starting earlier in the other direction) relative to the centered pixel lines depicted in FIG. 8B. In some examples, the duration of light pulses 113 corresponding to pixel lines 842 can vary based on the location of the MEMS mirror within period 843. For example, pixel lines 842 can be longer at the start of one cycle relative to the other cycle. More specifically, the duration of the pixel lines can differ between the left-to-right and right-to-left oscillations, in addition to the timing of the start of the pixel lines and the spacing between pixel lines. - Accordingly, as depicted in
FIGS. 7A-7D and FIGS. 8A-8D, projected images can be shifted across a display surface by a fraction of a pixel. -
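The direction-dependent start times behind these shifts can be sketched numerically. Assuming, for illustration only, a sinusoidal mirror trajectory (the disclosure does not specify one), shifting the projected span to the right delays the start of the pixel burst on the left-to-right sweep and advances it on the right-to-left sweep, as in FIGS. 8C and 8D; the function name and coordinate convention are assumptions of this sketch:

```python
import math

def sweep_starts(x_left, x_right, period):
    """Start times of the pixel burst in each sweep direction for an
    image spanning [x_left, x_right] of the normalized scan range
    (illustrative sinusoidal-mirror model)."""
    def t_fwd(x):
        # Time on the forward (left-to-right) sweep at which the
        # sinusoidal mirror points at normalized position x.
        return period / 4.0 + math.asin(x) * period / (2.0 * math.pi)
    start_ltr = t_fwd(x_left)              # forward sweep begins at the left edge
    start_rtl = period - t_fwd(x_right)    # return sweep begins at the right edge
    return start_ltr, start_rtl

centered = sweep_starts(-0.5, 0.5, 1.0)       # image 710Centered analogue
shifted_right = sweep_starts(-0.3, 0.7, 1.0)  # image 710Right analogue
```

Comparing `shifted_right` with `centered` shows the forward-sweep start is later and the return-sweep start is earlier, mirroring the opposite-direction shifts described for the two sweep directions.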
FIG. 9 depicts a block diagram of an example of the tunable projector 110. In some examples, the tunable projector 110 can include the light source 112 (e.g., a laser, an LED, or the like) and the movable mirror 114 configured to oscillate about an axis (e.g., axis 115, or the like). The mirror 114 may be a MEMS-based mirror configured to rotate about a number of axes to scan light emitted from the light source 112 across a projection surface (e.g., the lens 122, the area of a lens corresponding to the HOE 124, or the like). -
Tunable projector 110 can also include a controller 116. In general, the controller 116 may comprise hardware and/or software and may be configured to send one or more control signals to the light source 112 and/or the mirror 114 to cause the light source 112 to emit light and the mirror 114 to rotate about a number of axes to project the light over a particular area corresponding to the HOE of a lens removably fixed in a frame of the device in which the tunable projector 110 is disposed. - The
controller 116 can include an IPD detector 902. With some examples, the IPD detector 902 may receive an information element including an indication of an IPD (e.g., the displacement 511, the displacement 611, or the like). For example, the IPD detector 902 may receive an information element from a smart phone (or the like) including an indication of the location of the HOE 124 in the lens 122 removably coupled to the frame 302. -
Controller 116 can also include a pixel timer 904. In general, the pixel timer 904 can send control signals to the light source 112 to modify the duration and/or start times of light 113 corresponding to pixels in an image to be projected. In some examples, the pixel timer 904 can determine a desired image location based on an IPD. The pixel timer 904 can determine pixel durations and pixel start times based on the image location and the scanning speed of the movable mirror 114. With some examples, the controller 116 can include a pixel timing table 906. The pixel timer 904 can determine durations and/or start times for pixels based on the pixel timing table 906. With some examples, the pixel timing table 906 is pre-programmed within the controller 116. In some examples, the pixel timing table 906 is generated during operation, such as, for example, by a processing unit coupled to the controller 116, processing logic included in the controller 116, or the like. -
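One plausible realization of such a table is a pre-computed list of per-line start times and durations. The sketch below is an assumption of this editing pass, since the disclosure does not define the format of pixel timing table 906; it also reuses an illustrative sinusoidal-trajectory model, under which edge entries naturally get longer durations (where the mirror moves slowest):

```python
import math

def build_timing_table(n_lines, x_left, x_right, period):
    """Pre-compute per-line (start_time, duration) pairs for one
    left-to-right sweep over the image span [x_left, x_right] of the
    normalized scan range (hypothetical layout; illustrative
    sinusoidal-mirror model)."""
    def t_of(x):
        return period / 4.0 + math.asin(x) * period / (2.0 * math.pi)
    table = []
    for i in range(n_lines):
        xa = x_left + (x_right - x_left) * i / n_lines
        xb = x_left + (x_right - x_left) * (i + 1) / n_lines
        start = t_of(xa)
        # Duration varies with mirror speed: longer near the turnaround.
        table.append((start, t_of(xb) - start))
    return table
```

A pixel timer along the lines of pixel timer 904 could then look up entry i and pulse the light source for that line accordingly; regenerating the table with a shifted [x_left, x_right] span moves the whole image.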
FIG. 10 depicts a logic flow 1000 for projecting a virtual image. The logic flow 1000 may begin at block 1010. At block 1010, "oscillate a microelectromechanical system (MEMS) mirror about an oscillation axes," the scanning mirror 114 can oscillate about an oscillation axis (e.g., axis 115, or the like). - Continuing to block 1020, "direct light from a light source at the MEMS mirror, the light corresponding to a plurality of pixels of an image," light from
the light source 112 can be directed to the scanning mirror 114. For example, light 113 can be directed at the mirror 114 to be incident on the mirror 114 during oscillation of the mirror 114 about the axis 115. - Continuing to block 1030, "send a control signal to the light source, the control signal to include an indication of a first start time of a first one of the plurality of pixels in relation to a period of oscillation of the MEMS mirror,"
the controller 116 can send a control signal to the light source 112 to cause the light source 112 to output light 113 corresponding to pixels of an image (e.g., image 710, or the like) at times corresponding to the period of oscillation of the mirror 114. -
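The three blocks of logic flow 1000 can be summarized in a short sketch. The `emit` callback below is a hypothetical stand-in for the control-signal interface to the light source, not an interface defined in the disclosure:

```python
def run_logic_flow(lines, timing_table, emit):
    """Sketch of logic flow 1000: with the mirror oscillating (block 1010)
    and the light source aimed at it (block 1020), send, for each pixel
    line of the image, a control signal carrying its start time and
    duration relative to the mirror's oscillation period (block 1030)."""
    sent = 0
    for line, (start, duration) in zip(lines, timing_table):
        emit(start, duration, line)   # e.g., pulse light 113 for this line
        sent += 1
    return sent
```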
FIG. 11 illustrates an embodiment of a storage medium 2000. The storage medium 2000 may comprise an article of manufacture. In some examples, the storage medium 2000 may include any non-transitory computer readable medium or machine readable medium, such as an optical, magnetic, or semiconductor storage. The storage medium 2000 may store various types of computer executable instructions (e.g., 2002). For example, the storage medium 2000 may store various types of computer executable instructions to implement technique 1000. - Examples of a computer readable or machine readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of computer executable instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like. The examples are not limited in this context.
-
FIG. 12 is a diagram of an exemplary system embodiment and, in particular, depicts a platform 3000, which may include various elements. For instance, this figure depicts that the platform (system) 3000 may include a processor/graphics core 3002, a chipset 3004, an input/output (I/O) device 3006, a random access memory (RAM) (such as dynamic RAM (DRAM)) 3008, a read only memory (ROM) 3010, an HWD 3020 (e.g., HWD 300, or the like), and various other platform components 3014 (e.g., a fan, a cross-flow blower, a heat sink, a DTM system, a cooling system, a housing, vents, and so forth). System 3000 may also include a wireless communications chip 3016 and a graphics device 3018. The embodiments, however, are not limited to these elements. - As depicted, I/
O device 3006, RAM 3008, and ROM 3010 are coupled to the processor 3002 by way of the chipset 3004. The chipset 3004 may be coupled to the processor 3002 by a bus 3012. Accordingly, the bus 3012 may include multiple lines. -
Processor 3002 may be a central processing unit comprising one or more processor cores and may include any number of processors having any number of processor cores. The processor 3002 may include any type of processing unit, such as, for example, a CPU, a multi-processing unit, a reduced instruction set computer (RISC), a processor that has a pipeline, a complex instruction set computer (CISC), a digital signal processor (DSP), and so forth. In some embodiments, processor 3002 may be multiple separate processors located on separate integrated circuit chips. In some embodiments, processor 3002 may be a processor having integrated graphics, while in other embodiments processor 3002 may be a graphics core or cores. - Some embodiments may be described using the expression "one embodiment" or "an embodiment" along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment. Further, some embodiments may be described using the expression "coupled" and "connected" along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms "connected" and/or "coupled" to indicate that two or more elements are in direct physical or electrical contact with each other. The term "coupled," however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. Furthermore, aspects or elements from different embodiments may be combined.
- It is emphasized that the Abstract of the Disclosure is provided to allow a reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. In the appended claims, the terms “including” and “in which” are used as the Plain-English equivalents of the respective terms “comprising” and “wherein,” respectively. Moreover, the terms “first,” “second,” “third,” and so forth, are used merely as labels, and are not intended to impose numerical requirements on their objects.
- What has been described above includes examples of the disclosed architecture. It is, of course, not possible to describe every conceivable combination of components and/or methodologies, but one of ordinary skill in the art may recognize that many further combinations and permutations are possible. Accordingly, the novel architecture is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. The detailed disclosure now turns to providing examples that pertain to further embodiments. The examples provided below are not intended to be limiting.
Claims (25)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/390,346 US20180182272A1 (en) | 2016-12-23 | 2016-12-23 | Microelectromechanical system over-scanning for pupil distance compensation |
US16/245,753 US10672310B2 (en) | 2016-12-23 | 2019-01-11 | Microelectromechanical system over-scanning for pupil distance compensation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/390,346 US20180182272A1 (en) | 2016-12-23 | 2016-12-23 | Microelectromechanical system over-scanning for pupil distance compensation |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/245,753 Continuation US10672310B2 (en) | 2016-12-23 | 2019-01-11 | Microelectromechanical system over-scanning for pupil distance compensation |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180182272A1 true US20180182272A1 (en) | 2018-06-28 |
Family
ID=62625095
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/390,346 Abandoned US20180182272A1 (en) | 2016-12-23 | 2016-12-23 | Microelectromechanical system over-scanning for pupil distance compensation |
US16/245,753 Active US10672310B2 (en) | 2016-12-23 | 2019-01-11 | Microelectromechanical system over-scanning for pupil distance compensation |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/245,753 Active US10672310B2 (en) | 2016-12-23 | 2019-01-11 | Microelectromechanical system over-scanning for pupil distance compensation |
Country Status (1)
Country | Link |
---|---|
US (2) | US20180182272A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10613332B1 (en) * | 2018-02-15 | 2020-04-07 | Facebook Technologies, Llc | Near-eye display assembly with enhanced display resolution |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060192094A1 (en) * | 2005-02-25 | 2006-08-31 | Naosato Taniguchi | Scanning type image display apparatus |
US20100103077A1 (en) * | 2007-11-20 | 2010-04-29 | Keiji Sugiyama | Image display apparatus, display method thereof, program, integrated circuit, goggle-type head-mounted display, vehicle, binoculars, and desktop display |
US9560328B1 (en) * | 2015-10-06 | 2017-01-31 | Microvision, Inc. | Scanned beam projector pulsed laser control |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005221888A (en) | 2004-02-06 | 2005-08-18 | Olympus Corp | Head-mounted type display device and head-mounted type camera |
JP4411547B2 (en) | 2006-03-20 | 2010-02-10 | ソニー株式会社 | Image display device |
US20080007809A1 (en) | 2006-07-10 | 2008-01-10 | Moss Gaylord E | Auto-stereoscopic diffraction optics imaging system providing multiple viewing pupil pairs |
JP5534009B2 (en) | 2010-06-07 | 2014-06-25 | コニカミノルタ株式会社 | Video display device, head-mounted display, and head-up display |
JP6449236B2 (en) * | 2013-03-25 | 2019-01-09 | インテル コーポレイション | Method and apparatus for a multiple exit pupil head mounted display |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170102548A1 (en) * | 2015-10-12 | 2017-04-13 | Eric Tremblay | Adjustable pupil distance wearable display |
US10409076B2 (en) * | 2015-10-12 | 2019-09-10 | North Inc. | Adjustable pupil distance wearable display |
US10437071B2 (en) * | 2015-10-12 | 2019-10-08 | North Inc. | Adjustable pupil distance wearable display |
US20190019448A1 (en) * | 2017-07-12 | 2019-01-17 | Oculus Vr, Llc | Redundant microleds of multiple rows for compensation of defective microled |
US20190043392A1 (en) * | 2018-01-05 | 2019-02-07 | Intel Corporation | Augmented reality eyebox fitting optimization for individuals |
US10762810B2 (en) * | 2018-01-05 | 2020-09-01 | North Inc. | Augmented reality eyebox fitting optimization for individuals |
US11156896B2 (en) * | 2019-10-31 | 2021-10-26 | Samsung Electronics Co., Ltd. | Augmented reality device |
US20220019122A1 (en) * | 2019-10-31 | 2022-01-20 | Samsung Electronics Co., Ltd. | Augmented reality device |
US11586091B2 (en) * | 2019-10-31 | 2023-02-21 | Samsung Electronics Co., Ltd. | Augmented reality device |
CN115039014A (en) * | 2020-01-30 | 2022-09-09 | 维德酷有限公司 | Compact optical assembly |
DE102020205910A1 (en) | 2020-05-12 | 2021-11-18 | Robert Bosch Gesellschaft mit beschränkter Haftung | Data glasses for virtual retinal display and method for operating the same |
WO2022146430A1 (en) * | 2020-12-30 | 2022-07-07 | Google Llc | Scanning projector pixel placement |
Also Published As
Publication number | Publication date |
---|---|
US20190189037A1 (en) | 2019-06-20 |
US10672310B2 (en) | 2020-06-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INTEL CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MCGLEW, PATRICK GERARD;ABELE, NICOLAS;FOTINOS, ALEXANDRE;REEL/FRAME:041921/0842 Effective date: 20170206 |
|
AS | Assignment |
Owner name: NORTH INC., CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTEL CORPORATION;REEL/FRAME:047945/0704 Effective date: 20181105 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NORTH INC.;REEL/FRAME:054113/0814 Effective date: 20200916 |