WO2014017348A1 - 画像表示装置および画像表示方法 (Image display apparatus and image display method) - Google Patents
Image display apparatus and image display method
- Publication number
- WO2014017348A1 (PCT/JP2013/069378)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- observer
- optical system
- display
- positional relationship
- Prior art date
Classifications
- G02B27/017 — Head-up displays; head mounted
- G02B27/0172 — Head mounted, characterised by optical features
- G02B27/0093 — Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
- G02B27/0179 — Display position adjusting means not related to the information to be displayed
- G02B27/0068 — Optical correction having means for controlling the degree of correction, e.g. using phase modulators, movable elements
- G02B2027/0178 — Head mounted; eyeglass type
- G02B2027/0187 — Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
- G09G3/003 — Control arrangements using specific devices to produce spatial visual effects
- G09G5/38 — Display of a graphic pattern with means for controlling the display position
- G09G2340/0464 — Changes in size, position or resolution of an image; positioning
- G09G2354/00 — Aspects of interface with display user
- H04N13/344 — Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
- H04N13/366 — Image reproducers using viewer tracking
- H04N13/378 — Viewer tracking for tracking rotational head movements around an axis perpendicular to the screen
- H04N5/64 — Constructional details of receivers, e.g. cabinets or dust covers
Definitions
- the present technology relates to an image display apparatus and an image display method, and more particularly to an image display apparatus such as a head mounted display configured to guide an image displayed on a display element to an observer's eye by a magnifying optical system.
- a head-mounted display mounted on the head of a user.
- this head mounted display is configured to magnify an image displayed on a small display element with a magnifying optical system and to guide the image to the observer's eye. That is, the head mounted display is configured to optically magnify the image displayed on the display element and to allow the user to observe the image as a magnified virtual image.
- as shown in FIG. 14A, when a shift occurs between the center of the optical system and the position of the observer's eye, comfortable image observation becomes difficult. In this case, for example, depending on the amount of deviation, color shift may occur, contrast may be reduced, or the image may be distorted.
- FIG. 14B shows the case where there is no deviation between the center of the optical system and the position of the observer's eye.
- Patent Document 1 proposes a head-mounted display in which the optical system is configured to be movable, and the observer manually moves and adjusts the optical system to correct the positional deviation between the observer's eye and the optical system.
- An object of the present technology is to enable appropriate correction of misalignment between an optical system and an observer's eye.
- an image display apparatus includes a positional relationship correction unit that corrects the positional relationship between the optical system and the observer's eyes based on the positional relationship detected by the positional relationship detection unit.
- the positional relationship detection unit detects the positional relationship between the optical system that guides the image displayed on the display element to the observer's eye and the observer's eye.
- the image displayed on the display element is guided to the observer's eye by the optical system.
- the image displayed on the display element is optically magnified and observed by the observer as a magnified virtual image.
- the optical system may include a first optical system for guiding the left-eye image displayed on the display element to the observer's left eye, and a second optical system for guiding the right-eye image displayed on the display element to the observer's right eye.
- it may be further provided with an optical system that guides the image displayed on the display element to the observer's eye.
- an eye position estimation unit for estimating the eye position of the observer's eye may be further provided, and the positional relationship detection unit may detect the positional relationship between the optical system and the observer's eye based on the eye position estimated by the eye position estimation unit.
- the estimation of the eyeball position may be performed, for example, by applying the scleral reflection method, the EOG (Electro-Oculogram) method, a face recognition technique, or the like.
- the positional relationship detection unit detects the positional relationship between the optical system and the observer's eye based on the estimated eye position.
- the positional relationship correction unit corrects the positional relationship between the optical system and the observer's eye based on the positional relationship detected by the positional relationship detection unit. For example, the positional relationship correction unit may shift the display position of the image displayed on the display element based on the positional relationship detected by the positional relationship detection unit. Also, for example, the positional relationship correction unit may move the optical system based on the positional relationship detected by the positional relationship detection unit.
- the positional relationship correction unit may have a positional relationship presentation unit that presents the positional relationship detected by the positional relationship detection unit to the observer, and a control unit that controls the positional relationship between the optical system and the observer's eye according to the operation of the observer.
- for example, in this case, the control unit may perform shift control of the display position of the image displayed on the display element according to the operation of the observer. Also, the control unit may perform movement control of the optical system according to the operation of the observer.
- the positional relationship presentation unit may be configured to display the positional relationship detected by the positional relationship detection unit on the display element.
- the gap between the optical system and the observer's eye may be displayed by a gauge, characters, or the like.
- for example, an operation method of the remote control device for reducing the deviation may be displayed.
- the display of this positional relationship may be displayed on a display element separate from the display element described above.
- the presentation of this positional relationship may be performed by voice, vibration, and further by the intensity of light emission.
- the positional deviation between the optical system and the observer's eye is corrected based on the detection result of the positional relationship between the optical system and the observer's eye. Therefore, it becomes possible to appropriately correct the positional deviation between the optical system and the observer's eye.
- the present technology may further include, for example, a tilt estimation unit that estimates the tilt of the optical system, and the positional relationship detection unit may detect the positional relationship between the optical system and the observer's eye based on the estimated tilt as well as the estimated eye position. In this case, even when the optical system is tilted, the positional relationship between the optical system and the observer's eye can be accurately detected, and the positional deviation between the optical system and the observer's eye can be appropriately corrected.
- FIG. 1 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100 according to a first embodiment.
- This configuration example is a binocular HMD.
- the HMD 100 has a left spectacle lens unit 101L and a right spectacle lens unit 101R.
- the spectacle lens unit 101L and the spectacle lens unit 101R are integrally connected by the connection member 102.
- Each of the spectacle lens parts 101L and 101R is a combination of a spectacle lens and a HOE (holographic optical element) sheet.
- this HOE sheet has a function like a half mirror that combines external light with display light, and a concave free-form-surface function that magnifies the display image.
- Infrared sensors 103L and 103R are attached to the spectacle lens portions 101L and 101R, respectively.
- the infrared sensor 103L is attached to the horizontal center position of the spectacle lens unit 101L (the center position in the horizontal direction of the left-eye optical system).
- the infrared sensor 103R is attached to the horizontal center position of the spectacle lens unit 101R (the center position in the horizontal direction of the right-eye optical system).
- a gyro sensor 104 for detecting an inclination of an optical system including the eyeglass lens portions 101L and 101R is attached to the eyeglass lens portion 101L.
- the sensor outputs of the infrared sensors 103L and 103R are used for eye position estimation by the scleral reflection method.
- the scleral reflection method utilizes the fact that the reflectances of the cornea (black eye) and the sclera (white eye) differ.
- the infrared sensor horizontally scans weak infrared light emitted toward the observer's eye and detects the reflected light. Since the intensities of the light reflected by the cornea (black eye) and the sclera (white eye) differ greatly, the observer's eyeball position can be estimated from the sensor output.
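The scan-and-threshold idea above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the 0.5 reflectance threshold, and the toy scan data are all assumptions standing in for real infrared-sensor output.

```python
# Sketch of eye-position estimation by the scleral reflection method.
# Hypothetical throughout: threshold, names, and toy data are assumptions.

def estimate_cornea_angle(scan_angles_deg, reflectance, threshold=0.5):
    """Return the center angle of the low-reflectance (cornea) region.

    The cornea (black eye) reflects far less infrared light than the
    sclera (white eye), so samples below `threshold` are taken as cornea.
    """
    cornea = [a for a, r in zip(scan_angles_deg, reflectance) if r < threshold]
    if not cornea:
        raise ValueError("no cornea region found in scan")
    return sum(cornea) / len(cornea)

# Toy horizontal sweep: high-reflectance sclera on both sides, with a
# cornea dip centered at 0 degrees.
angles = [-10, -8, -6, -4, -2, 0, 2, 4, 6, 8, 10]
refl = [0.9, 0.9, 0.9, 0.2, 0.2, 0.2, 0.2, 0.2, 0.9, 0.9, 0.9]
theta_L = estimate_cornea_angle(angles, refl)  # 0.0 for this centered dip
```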
- the HMD 100 includes an image correction unit 111, display drivers 112L and 112R, displays 113L and 113R, an eye position estimation unit 114, a tilt estimation unit 115, and a display control unit 116.
- the display 113L constitutes a display element and displays a left eye image.
- the display 113L is configured of, for example, an LCD (Liquid Crystal Display), and is driven by the display driver 112L to display a left eye image forming a stereoscopic image.
- the display 113R configures a display element and displays a right eye image.
- the display 113R is configured of, for example, an LCD (Liquid Crystal Display), and is driven by the display driver 112R to display a right eye image forming a stereoscopic image.
- the eye position estimation unit 114 estimates the eye positions of the observer's left and right eyes based on the sensor outputs from the infrared sensors 103L and 103R. In this case, based on the level of the sensor output of the infrared sensor 103L, the eyeball position estimation unit 114 takes the angle θL at the time of the cornea (black eye) scan as the estimation result of the eyeball position of the left eye. Similarly, based on the level of the sensor output of the infrared sensor 103R, it takes the angle θR at the time of the cornea (black eye) scan as the estimation result of the eyeball position of the right eye.
- the tilt estimation unit 115 estimates the tilt φ of the optical system including the spectacle lens units 101L and 101R with respect to the horizontal direction based on the sensor output of the gyro sensor 104.
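As a rough illustration of how such a tilt estimate could be derived from a rate gyro, the following sketch integrates angular-rate samples over time. The sampling interval, the units, and the assumption that the gyro reports angular rate (rather than an absolute inclination) are all hypothetical; the source only states that the tilt is estimated from the sensor output of the gyro sensor 104.

```python
# Sketch: deriving a tilt angle phi from rate-gyro samples by integration.
# Assumptions (not in the source): the gyro reports angular rate in deg/s
# about the roll axis, sampled at a fixed interval dt_s.

def estimate_tilt(rates_deg_per_s, dt_s, initial_tilt_deg=0.0):
    """Integrate angular-rate samples to track the tilt angle."""
    tilt = initial_tilt_deg
    for rate in rates_deg_per_s:
        tilt += rate * dt_s
    return tilt

# Half a second of a constant 10 deg/s roll sampled at 100 Hz,
# which accumulates to roughly 5 degrees of tilt.
phi = estimate_tilt([10.0] * 50, dt_s=0.01)
```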
- the display control unit 116 detects the positional relationship between the left-eye optical system and the left eye, and between the right-eye optical system and the right eye, based on the eye position estimation results for the left and right eyes and the tilt estimation result of the optical system.
- the display control unit 116 determines the positional deviation dL of the left-eye optical system with respect to the observer's left eye based on the angle θL, which is the estimation result of the left eye's eyeball position, and the positional deviation dR of the right-eye optical system with respect to the observer's right eye based on the angle θR. Then, the display control unit 116 calculates the inter-eye distance de from the positional deviations dL and dR thus obtained and the known inter-sensor distance ds of the infrared sensors 103L and 103R.
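A hedged sketch of this computation for the zero-tilt case: the eye-relief distance (sensor to eye) used to turn a scan angle into a deviation, and the sign convention (deviation positive when the eye lies outboard of its sensor), are illustrative assumptions, since the source does not reproduce the concrete formula.

```python
import math

# Sketch of the zero-tilt geometry (FIG. 2 case). `eye_relief_mm` and the
# sign convention are assumptions for illustration.

def deviation_mm(theta_deg, eye_relief_mm):
    """Deviation d of an eye from its sensor's optical center, from the
    scan angle theta at which the cornea was detected."""
    return eye_relief_mm * math.tan(math.radians(theta_deg))

def inter_eye_distance_mm(d_left, d_right, ds_mm):
    """Inter-eye distance de from both deviations and the known
    inter-sensor distance ds (tilt assumed zero here)."""
    return ds_mm + d_left + d_right

dL = deviation_mm(theta_deg=2.0, eye_relief_mm=20.0)   # left eye outboard
dR = deviation_mm(theta_deg=-1.0, eye_relief_mm=20.0)  # right eye inboard
de = inter_eye_distance_mm(dL, dR, ds_mm=62.0)
```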
- FIG. 2 shows an example of the positional relationship between the optical system and the eye.
- FIG. 2 shows the case where the tilt φ of the optical system is zero.
- in practice, however, the tilt φ of the optical system may not be zero.
- in that case, the display control unit 116 calculates the inter-eye distance de based on the positional deviations dL and dR thus obtained, the known inter-sensor distance ds of the infrared sensors 103L and 103R, and the tilt φ.
- FIG. 3 shows an example of the positional relationship between the optical system and the eye in that case.
- the inter-eye distance de in that case is expressed as shown in equation (2).
- the display control unit 116 shift-controls the display positions of the left-eye image and the right-eye image in mutually opposite directions in the horizontal direction based on the inter-eye distance de. That is, the display control unit 116 automatically corrects the positional deviation between the optical system and the observer's eyeball position by manipulating the image, in effect moving the optical system electronically.
- the display control unit 116 shifts the display positions of the left-eye image and the right-eye image according to the difference s (= de − ds·cosφ) between the inter-eye distance de and the inter-sensor distance ds·cosφ, which takes the tilt φ into consideration.
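The shift control above can be sketched as follows. The source states only that the two images are shifted in mutually opposite horizontal directions until the difference s becomes zero; the pixels-per-mm conversion and the equal split of s between the two images are assumptions for illustration.

```python
import math

# Sketch of the display-position shift control driven by the difference
# s = de - ds*cos(phi). `pixels_per_mm` is a hypothetical display scale.

def shift_amounts_px(de_mm, ds_mm, tilt_deg, pixels_per_mm):
    """Return (left_shift_px, right_shift_px) that cancel the difference s."""
    s = de_mm - ds_mm * math.cos(math.radians(tilt_deg))
    half = (s / 2.0) * pixels_per_mm
    # Opposite directions: move the image pair apart (or together) by s.
    return -half, +half

left_px, right_px = shift_amounts_px(de_mm=64.0, ds_mm=62.0,
                                     tilt_deg=0.0, pixels_per_mm=10.0)
# Here s = 2.0 mm, so each image shifts 10 px, in opposite directions.
```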
- FIG. 4 schematically shows the shift control of the display position of the left eye image and the right eye image.
- when the difference s is 0, as shown in FIG. 4A, the horizontal shift amount of the left-eye image and the right-eye image is set to zero.
- the image correction unit 111 performs, under the control of the display control unit 116, image correction processing that shifts the display positions of the left-eye image data and the right-eye image data for displaying a stereoscopic (3D) image. The image correction unit 111 supplies the corrected left-eye image data to the display driver 112L, and supplies the corrected right-eye image data to the display driver 112R.
- the left eye image data is supplied to the display driver 112L via the image correction unit 111.
- the display driver 112L drives the display 113L, and the left eye image is displayed on the display 113L.
- right-eye image data is supplied to the display driver 112R via the image correction unit 111.
- the display 113R is driven by the display driver 112R, and a right eye image is displayed on the display 113R.
- the light from the left eye image displayed on the display 113L is reflected by the spectacle lens unit 101L and reaches the left eye of the observer.
- the left eye image displayed on the display 113L is optically enlarged by the spectacle lens unit 101L, and is observed by the observer's left eye as an enlarged virtual image.
- the light from the right eye image displayed on the display 113R is reflected by the spectacle lens unit 101R and reaches the right eye of the observer.
- the right-eye image displayed on the display 113R is optically enlarged by the spectacle lens unit 101R, and is observed by the observer's right eye as an enlarged virtual image.
- in this image display state, weak infrared light is emitted toward the observer's eye from the infrared sensor 103L attached to the horizontal center position of the spectacle lens unit 101L (the center position in the horizontal direction of the left-eye optical system) and is scanned in the horizontal direction. Likewise, weak infrared light is emitted toward the observer's eye from the infrared sensor 103R attached to the horizontal center position of the spectacle lens unit 101R (the center position in the horizontal direction of the right-eye optical system) and is scanned in the horizontal direction.
- the sensor outputs of the infrared sensors 103L and 103R are respectively supplied to the eyeball position estimation unit 114.
- the eyeball position estimation unit 114 estimates the eyeball positions of the left eye and the right eye of the observer based on the sensor outputs from the infrared sensors 103L and 103R.
- the eye position estimation unit 114 outputs the angles θL and θR at the time of the cornea (black eye) scan as the estimation results of the eye positions of the left eye and the right eye (see FIGS. 2 and 3).
- the sensor output of the gyro sensor 104 attached to the spectacle lens unit 101L is supplied to the tilt estimation unit 115.
- the tilt estimation unit 115 estimates the tilt φ of the optical system including the spectacle lens units 101L and 101R with respect to the horizontal direction based on the sensor output of the gyro sensor 104 (see FIG. 3).
- the angles θL and θR that are the estimation results of the eyeball position estimation unit 114 are supplied to the display control unit 116. Further, the tilt φ that is the estimation result of the tilt estimation unit 115 is supplied to the display control unit 116.
- the display control unit 116 obtains, based on the eye position estimation results for the left and right eyes and the tilt estimation result of the optical system, the positional deviation dL of the left-eye optical system with respect to the observer's left eye and the positional deviation dR of the right-eye optical system with respect to the observer's right eye (see FIGS. 2 and 3).
- the display control unit 116 calculates the inter-eye distance de based on the positional deviations dL and dR, the known inter-sensor distance ds of the infrared sensors 103L and 103R, and the tilt φ (see equation (2)). In this case, by taking the tilt φ into consideration, the inter-eye distance de can be obtained with high accuracy.
- the image correction unit 111 is controlled by the display control unit 116, and the display positions of the left eye image and the right eye image displayed on the displays 113L and 113R are shift-controlled.
- the shift is controlled so that the difference s (= de − ds·cosφ) between the inter-eye distance de and the inter-sensor distance ds·cosφ, which takes the tilt φ into consideration, becomes 0.
- as a result, the horizontal center of the left-eye image is made to coincide with the observer's left eye and the horizontal center of the right-eye image is made to coincide with the observer's right eye, so that the positional deviation between the optical system and the observer's eyes is corrected.
- the display positions of the left-eye image and the right-eye image are shift-controlled based on the detection result of the positional relationship between the optical system and the observer's eye, so the positional deviation between the optical system and the observer's eye is corrected electronically. Therefore, the positional deviation between the optical system and the observer's eye can be corrected appropriately.
- FIG. 5 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100A as a second embodiment.
- This configuration example is a binocular HMD.
- parts corresponding to FIG. 1 are given the same reference numerals, and the detailed description thereof will be omitted as appropriate.
- the HMD 100A has a left spectacle lens unit 101L and a right spectacle lens unit 101R.
- the spectacle lens unit 101L and the spectacle lens unit 101R are connected by the optical system adjustment mechanism 122.
- the optical system adjustment mechanism 122 expands and contracts the distance between the left and right optical systems, that is, the distance between the eyeglass lens unit 101L forming the left eye optical system and the eyeglass lens unit 101R forming the right eye optical system.
- Infrared sensors 103L and 103R are attached to the spectacle lens portions 101L and 101R, respectively.
- a gyro sensor 104 for detecting an inclination of an optical system including the eyeglass lens portions 101L and 101R is attached to the eyeglass lens portion 101L.
- the sensor outputs of the infrared sensors 103L and 103R are used for eye position estimation by the scleral reflection method.
- the HMD 100A further includes display drivers 112L and 112R, displays 113L and 113R, an eye position estimation unit 114, a tilt estimation unit 115, a display control unit 116A, and an optical system position correction unit 117.
- the display 113L is driven by the display driver 112L to display a left-eye image forming a stereoscopic image.
- the display 113R is driven by the display driver 112R to display a right-eye image forming a stereoscopic image.
- the eye position estimation unit 114 estimates the eye positions of the observer's left and right eyes based on the sensor outputs from the infrared sensors 103L and 103R, and outputs the angles θL and θR at the time of the cornea (black eye) scan as the estimation results of the eye positions (see FIGS. 2 and 3).
- the tilt estimation unit 115 outputs, as an estimation result, the tilt φ of the optical system including the spectacle lens units 101L and 101R with respect to the horizontal direction based on the sensor output of the gyro sensor 104 (see FIG. 3).
- the display control unit 116A determines the positional deviation dL of the left-eye optical system with respect to the observer's left eye based on the angle θL, and the positional deviation dR of the right-eye optical system with respect to the observer's right eye based on the angle θR.
- the display control unit 116A calculates the inter-eye distance de based on the positional deviations dL and dR, the known inter-sensor distance ds of the infrared sensors 103L and 103R, and the tilt φ (see equation (2)).
- the display control unit 116A controls the positions of the left-eye optical system and the right-eye optical system to shift in mutually opposite directions in the horizontal direction according to the difference s (= de − ds·cosφ) between the inter-eye distance de and the inter-sensor distance ds·cosφ, which takes the tilt φ into consideration. That is, the display control unit 116A automatically corrects the positional deviation between the optical system and the observer's eyeball position by mechanically moving the optical system.
- This shift control is similar to the shift control of the display position of the left eye image and the right eye image described above (see FIG. 4).
- the optical system position correction unit 117 corrects the optical system position under the control of the display control unit 116A. That is, the optical system position correction unit 117 expands and contracts the optical system adjustment mechanism 122 to adjust the horizontal positions of the spectacle lens unit 101L and the spectacle lens unit 101R. Along with this, the optical system position correction unit 117 also adjusts the positions of the displays 113L and 113R.
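The mechanical correction of this second embodiment can be sketched as a simple feedback loop: widen or narrow the distance between the left and right optical systems until it matches the measured inter-eye distance. The step size, tolerance, and iterative loop are illustrative assumptions about the adjustment mechanism, not taken from the source.

```python
import math

# Sketch of the second embodiment's mechanical correction: adjust the
# inter-optical-system distance until the difference s is within tolerance.
# `step_mm` and `tol_mm` are hypothetical mechanism parameters.

def correct_optical_distance(ds_mm, de_mm, tilt_deg, step_mm=0.1, tol_mm=0.05):
    """Adjust the inter-optical-system distance ds until the difference
    s = de - ds*cos(tilt) falls within the tolerance."""
    cos_t = math.cos(math.radians(tilt_deg))
    while abs(de_mm - ds_mm * cos_t) > tol_mm:
        s = de_mm - ds_mm * cos_t
        ds_mm += step_mm if s > 0 else -step_mm  # expand or contract
    return ds_mm

# Eyes 64 mm apart but optical systems only 62 mm apart: widen to match.
adjusted = correct_optical_distance(ds_mm=62.0, de_mm=64.0, tilt_deg=0.0)
```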
- the left eye image data is supplied to the display driver 112L.
- the display driver 112L drives the display 113L, and the left eye image is displayed on the display 113L.
- right eye image data is supplied to the display driver 112R.
- the display 113R is driven by the display driver 112R, and a right eye image is displayed on the display 113R.
- the light from the left eye image displayed on the display 113L is reflected by the spectacle lens unit 101L and reaches the left eye of the observer.
- the left eye image displayed on the display 113L is optically enlarged by the spectacle lens unit 101L, and is observed by the observer's left eye as an enlarged virtual image.
- the light from the right eye image displayed on the display 113R is reflected by the spectacle lens unit 101R and reaches the right eye of the observer.
- the right-eye image displayed on the display 113R is optically enlarged by the spectacle lens unit 101R, and is observed by the observer's right eye as an enlarged virtual image.
- in this image display state, weak infrared light is emitted toward the observer's eye from the infrared sensor 103L attached to the horizontal center position of the spectacle lens unit 101L (the center position in the horizontal direction of the left-eye optical system) and is scanned in the horizontal direction. Likewise, weak infrared light is emitted toward the observer's eye from the infrared sensor 103R attached to the horizontal center position of the spectacle lens unit 101R (the center position in the horizontal direction of the right-eye optical system) and is scanned in the horizontal direction.
- the sensor outputs of the infrared sensors 103L and 103R are respectively supplied to the eyeball position estimation unit 114.
- the eyeball position estimation unit 114 estimates the eyeball positions of the left eye and the right eye of the observer based on the sensor outputs from the infrared sensors 103L and 103R.
- the eye position estimation unit 114 outputs the angles φL and φR at the time of the cornea (black eye) scan as estimation results of the eye positions of the left eye and the right eye (see FIGS. 2 and 3).
- the sensor output of the gyro sensor 104 attached to the spectacle lens unit 101L is supplied to the inclination estimation unit 115.
- the tilt estimation unit 115 estimates, based on the sensor output of the gyro sensor 104, the tilt θ of the optical system including the spectacle lens units 101L and 101R with respect to the horizontal direction (see FIG. 3).
- the angles φL and φR that are the estimation results of the eye position estimation unit 114 are supplied to the display control unit 116A. Further, the inclination θ that is the estimation result of the inclination estimation unit 115 is supplied to the display control unit 116A.
- based on the eye position estimation results of the left eye and the right eye and the inclination estimation result of the optical system, the display control unit 116A obtains the positional deviation dL of the left eye optical system with respect to the observer's left eye and the positional deviation dR of the right eye optical system with respect to the observer's right eye (see FIGS. 2 and 3).
- the inter-eye distance de is calculated based on the positional deviations dL and dR, the known inter-sensor distance ds of the infrared sensors 103L and 103R, and the inclination θ (see equation (2)). In this case, by taking the inclination θ into consideration, the inter-eye distance de can be obtained with high accuracy.
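The calculation above can be sketched as follows. Since equation (2) itself is not reproduced in this excerpt, the form used for de below (the inclination-corrected inter-sensor distance plus the two per-eye deviations) is an illustrative assumption, not the patent's exact formula:

```python
import math

def inter_eye_distance(d_l, d_r, ds, theta):
    # Illustrative stand-in for equation (2): the inter-eye distance de is
    # taken as the tilt-corrected inter-sensor distance plus the per-eye
    # positional deviations dL and dR (an assumption for illustration).
    return ds * math.cos(theta) + d_l + d_r

def difference_s(de, ds, theta):
    # s = de - ds*cos(theta); s == 0 means the optical-system centers
    # line up with the observer's eyes.
    return de - ds * math.cos(theta)

theta = math.radians(5.0)                        # optical-system tilt
de = inter_eye_distance(2.0, 2.0, 62.0, theta)   # 2 mm deviation per eye
s = difference_s(de, 62.0, theta)                # residual misalignment
```

Under this assumed form, s reduces to dL + dR, which is the quantity the shift control drives to zero.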
- the optical system position correction unit 117 is controlled by the display control unit 116A, and the positions of the left eye optical system and the right eye optical system, that is, the positions of the eyeglass lens units 101L and 101R and the positions of the displays 113L and 113R, are shift-controlled.
- according to the difference s (= de − ds × cos θ) between the inter-eye distance de and the inter-sensor distance ds × cos θ in which the inclination θ is taken into consideration, the positions are shift-controlled so that the difference s becomes 0.
- as a result, the horizontal center of the left eye optical system coincides with the observer's left eye, the horizontal center of the right eye optical system coincides with the observer's right eye, and the positional deviation between the optical system and the observer's eyes is corrected.
- as described above, the positions of the left eye optical system and the right eye optical system are shift-controlled based on the detection result of the positional relationship between the optical system and the observer's eye, and the positional deviation between the optical system and the observer's eye is mechanically corrected. Therefore, the positional deviation between the optical system and the observer's eye can be corrected appropriately.
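As a sketch of how such a mechanical shift control might converge, the loop below nudges the lens-unit separation toward the estimated inter-eye distance until the residual falls within a tolerance. The gain and tolerance values are illustrative assumptions, not values from the text:

```python
def shift_control(separation_mm, de_mm, tol=0.01, gain=0.8):
    # Repeatedly command the adjustment mechanism to close the gap between
    # the current lens separation and the inter-eye distance de (all in mm;
    # gain and tol are illustrative assumptions).
    while abs(de_mm - separation_mm) > tol:
        separation_mm += gain * (de_mm - separation_mm)
    return separation_mm

adjusted = shift_control(62.0, 66.0)  # ends within 0.01 mm of de
```

With a gain between 0 and 1 the residual shrinks geometrically, so the loop always terminates.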
- FIG. 6 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100B according to the third embodiment.
- This configuration example is a binocular HMD.
- the HMD 100B has a left eyeglass lens unit 101L and a right eyeglass lens unit 101R.
- the spectacle lens unit 101L and the spectacle lens unit 101R are integrally connected by the connection member 102.
- Infrared sensors 103L and 103R are attached to the spectacle lens portions 101L and 101R, respectively.
- a gyro sensor 104 for detecting an inclination of an optical system including the eyeglass lens portions 101L and 101R is attached to the eyeglass lens portion 101L.
- the sensor outputs of the infrared sensors 103L and 103R are used for eye position estimation by the scleral reflection method.
- the HMD 100B further includes display drivers 112L and 112R, displays 113L and 113R, an eye position estimation unit 114, an inclination estimation unit 115, a display control unit 116B, an image correction unit 111, an OSD superimposing unit 118, and a user operation acquisition unit 119.
- the display 113L is driven by the display driver 112L to display a left eye image forming a stereoscopic image.
- the display 113R is driven by the display driver 112R to display a right-eye image forming a stereoscopic image.
- the eye position estimation unit 114 estimates the eye positions of the left eye and the right eye of the observer based on the sensor outputs from the infrared sensors 103L and 103R, and outputs the angles φL and φR at the time of the cornea (black eye) scan as the eye position estimation results (see FIGS. 2 and 3).
- the inclination estimation unit 115 outputs, as an estimation result, the inclination θ of the optical system including the spectacle lens units 101L and 101R with respect to the horizontal direction, based on the sensor output of the gyro sensor 104 (see FIG. 3).
- the display control unit 116B determines the positional deviation dL of the left eye optical system with respect to the observer's left eye based on the angle φL, and determines the positional deviation dR of the right eye optical system with respect to the observer's right eye based on the angle φR.
- the display control unit 116B calculates the inter-eye distance de based on the positional deviations dL and dR, the known inter-sensor distance ds of the infrared sensors 103L and 103R, and the inclination θ (see equation (2)).
- according to the difference s (= de − ds × cos θ) between the inter-eye distance de and the inter-sensor distance ds × cos θ in which the inclination θ is taken into consideration, the display control unit 116B performs an OSD display of the difference s or an OSD display that prompts the user to perform an operation for correcting the difference s.
- FIG. 7 shows an example of the OSD display.
- FIG. 7A is an example of displaying a gauge indicating the difference s (the amount of positional deviation), in which the gauge is displayed on the left side of the screen.
- FIG. 7B is also an example of displaying a gauge indicating the difference s (the amount of positional deviation), in which the gauge is displayed on the upper end side of the screen.
- FIG. 7C is also an example of displaying a gauge indicating the difference s (the amount of positional deviation), in which the gauge is displayed at the center of the screen.
- FIGS. 7D and 7E show examples of displaying characters indicating the difference s (the amount of positional deviation).
- FIG. 7F shows an example of an OSD display for prompting the user to perform the operation of correcting the difference s; as indicated by the arrow P, the operation button (operation portion) of the remote control device to be operated by the user is shown. It is also conceivable to combine the gauge display, the character display, the display of the operation button of the remote control device, and the like as appropriate.
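A minimal text-mode sketch of a FIG. 7-style gauge follows; the scale, width, and formatting are assumptions for illustration only:

```python
def osd_gauge(s_mm, full_scale=10.0, width=21):
    # Render the difference s as a one-line gauge: '|' marks zero
    # misalignment, '*' marks the current value (clamped to the ends).
    centre = width // 2
    pos = centre + round((s_mm / full_scale) * centre)
    pos = max(0, min(width - 1, pos))
    cells = ["-"] * width
    cells[centre] = "|"
    cells[pos] = "*"            # overwrites '|' when s is 0
    return "".join(cells) + f"  s={s_mm:+.1f}mm"

print(osd_gauge(0.0))
print(osd_gauge(3.5))
```

A character readout like FIGS. 7D and 7E would simply print the numeric part of this string without the gauge.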
- the display control unit 116B shifts the display positions of the left eye image and the right eye image based on the user operation acquired by the user operation acquisition unit 119.
- based on the OSD display of the difference s or the OSD display prompting the correction operation, the user performs an operation so that the difference s becomes 0.
- the image correction unit 111 performs an image correction process for shifting the display positions of the left eye image data and the right eye image data for displaying a stereoscopic (3D) image, under the control of the display control unit 116B.
- the image correction unit 111 supplies the corrected left eye image data to the display driver 112L through the OSD superimposing unit 118, and supplies the corrected right eye image data to the display driver 112R.
- the OSD superimposing unit 118 superimposes the display signal for OSD display output from the display control unit 116B on the left eye image data and the right eye image data.
- the left eye image data is supplied to the display driver 112L via the image correction unit 111.
- the display driver 112L drives the display 113L, and the left eye image is displayed on the display 113L.
- right-eye image data is supplied to the display driver 112R via the image correction unit 111.
- the display 113R is driven by the display driver 112R, and a right eye image is displayed on the display 113R.
- the light from the left eye image displayed on the display 113L is reflected by the spectacle lens unit 101L and reaches the left eye of the observer.
- the left eye image displayed on the display 113L is optically enlarged by the spectacle lens unit 101L, and is observed by the observer's left eye as an enlarged virtual image.
- the light from the right eye image displayed on the display 113R is reflected by the spectacle lens unit 101R and reaches the right eye of the observer.
- the right-eye image displayed on the display 113R is optically enlarged by the spectacle lens unit 101R, and is observed by the observer's right eye as an enlarged virtual image.
- the observer's eye side is irradiated with weak infrared rays from the infrared sensor 103L attached to the horizontal center position of the spectacle lens unit 101L (the center position in the horizontal direction of the left eye optical system), and the eye is scanned in the horizontal direction. Also, in this image display state, the observer's eye side is irradiated with weak infrared rays from the infrared sensor 103R attached to the horizontal center position of the spectacle lens unit 101R (the center position in the horizontal direction of the right eye optical system), and the eye is scanned in the horizontal direction.
- the sensor outputs of the infrared sensors 103L and 103R are respectively supplied to the eyeball position estimation unit 114.
- the eyeball position estimation unit 114 estimates the eyeball positions of the left eye and the right eye of the observer based on the sensor outputs from the infrared sensors 103L and 103R.
- the eye position estimation unit 114 outputs the angles φL and φR at the time of the cornea (black eye) scan as estimation results of the eye positions of the left eye and the right eye (see FIGS. 2 and 3).
- the sensor output of the gyro sensor 104 attached to the spectacle lens unit 101L is supplied to the inclination estimation unit 115.
- the tilt estimation unit 115 estimates, based on the sensor output of the gyro sensor 104, the tilt θ of the optical system including the spectacle lens units 101L and 101R with respect to the horizontal direction (see FIG. 3).
- the angles φL and φR, which are the estimation results of the eye position estimation unit 114, are supplied to the display control unit 116B.
- the inclination θ, which is the estimation result of the inclination estimation unit 115, is supplied to the display control unit 116B.
- the display control unit 116B obtains the positional deviation dL of the left eye optical system with respect to the observer's left eye and the positional deviation dR of the right eye optical system with respect to the observer's right eye (see FIGS. 2 and 3).
- the inter-eye distance de is calculated based on the positional deviations dL and dR, the known inter-sensor distance ds of the infrared sensors 103L and 103R, and the inclination θ (see equation (2)). In this case, by taking the inclination θ into consideration, the inter-eye distance de can be obtained with high accuracy.
- the display control unit 116B obtains the difference s (= de − ds × cos θ) between the inter-eye distance de and the inter-sensor distance ds × cos θ in which the inclination θ is taken into consideration, and outputs a display signal for an OSD display of the difference s or an OSD display for prompting the user to correct the difference s.
- This display signal is supplied to the OSD superimposing unit 118 and superimposed on the left eye image data and the right eye image data. As a result, the OSD is displayed on the displays 113L and 113R.
- a correction operation is performed by the observer based on the OSD display so that the difference s becomes zero.
- the user operation acquisition unit 119 acquires this user operation and sends it to the display control unit 116B.
- the display control unit 116B controls the image correction unit 111 to shift-control the display positions of the left eye image and the right eye image displayed on the displays 113L and 113R.
- as a result, the horizontal center of the left eye image coincides with the observer's left eye, the horizontal center of the right eye image coincides with the observer's right eye, and the positional deviation between the optical system and the observer's eyes is corrected.
- the positional deviation between the optical system and the observer's eye is OSD-displayed based on the detection result of the positional relationship between the optical system and the observer's eye.
- based on this OSD display, the observer performs a correction operation so as to eliminate the positional deviation. Then, in response to the correction operation, the display positions of the left eye image and the right eye image are shift-controlled, and the positional deviation between the optical system and the observer's eye is electronically corrected. Therefore, the positional deviation between the optical system and the observer's eye can be corrected appropriately.
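One way to picture the electronic correction is to convert the residual difference s into opposite horizontal pixel shifts for the left and right images. Splitting s equally between the two images and the pixel pitch used are illustrative assumptions, not details given in the text:

```python
def image_shift_px(s_mm, pixel_pitch_mm=0.05):
    # Split the residual difference s equally between the two images and
    # convert each half into a whole-pixel horizontal shift (assumption:
    # positive shifts the left image right, negative shifts the right
    # image left, in opposite directions).
    half_mm = s_mm / 2.0
    px = round(half_mm / pixel_pitch_mm)
    return px, -px

left_px, right_px = image_shift_px(2.0)  # 2 mm residual misalignment
```

The image correction unit would then offset the left and right eye image data by these amounts before they reach the display drivers.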
- FIG. 8 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100C according to a fourth embodiment.
- This configuration example is a binocular HMD.
- portions corresponding to FIGS. 2 and 6 are denoted by the same reference numerals, and the detailed description thereof is appropriately omitted.
- the HMD 100C has a left eyeglass lens unit 101L and a right eyeglass lens unit 101R.
- the spectacle lens unit 101L and the spectacle lens unit 101R are connected by the optical system adjustment mechanism 122.
- Infrared sensors 103L and 103R are attached to the spectacle lens portions 101L and 101R, respectively.
- a gyro sensor 104 for detecting an inclination of an optical system including the eyeglass lens portions 101L and 101R is attached to the eyeglass lens portion 101L.
- the sensor outputs of the infrared sensors 103L and 103R are used for eye position estimation by the scleral reflection method.
- the HMD 100C includes the display drivers 112L and 112R, the displays 113L and 113R, the eye position estimation unit 114, the tilt estimation unit 115, the display control unit 116C, the optical system position correction unit 117, the OSD superimposing unit 118, and the user operation acquisition unit 119.
- the display 113L is driven by the display driver 112L to display a left eye image forming a stereoscopic image.
- the display 113R is driven by the display driver 112R to display a right-eye image forming a stereoscopic image.
- the eye position estimation unit 114 estimates the eye positions of the left eye and the right eye of the observer based on the sensor outputs from the infrared sensors 103L and 103R, and outputs the angles φL and φR at the time of the cornea (black eye) scan as the eye position estimation results (see FIGS. 2 and 3).
- the inclination estimation unit 115 outputs, as an estimation result, the inclination θ of the optical system including the spectacle lens units 101L and 101R with respect to the horizontal direction, based on the sensor output of the gyro sensor 104 (see FIG. 3).
- the display control unit 116C determines the positional deviation dL of the left eye optical system with respect to the observer's left eye based on the angle φL, and determines the positional deviation dR of the right eye optical system with respect to the observer's right eye based on the angle φR.
- the display control unit 116C calculates the inter-eye distance de based on the positional deviations dL and dR, the known inter-sensor distance ds of the infrared sensors 103L and 103R, and the inclination θ (see equation (2)).
- according to the difference s (= de − ds × cos θ) between the inter-eye distance de and the inter-sensor distance ds × cos θ in which the inclination θ is taken into consideration, the display control unit 116C performs an OSD display of the difference s or an OSD display that prompts the user to perform an operation for correcting the difference s.
- the display control unit 116C shifts the positions of the left eye optical system and the right eye optical system based on the user operation acquired by the user operation acquisition unit 119.
- based on the OSD display of the difference s or the OSD display prompting the correction operation, the user performs an operation so that the difference s becomes 0.
- the optical system position correction unit 117 corrects the optical system position under the control of the display control unit 116C. That is, the optical system position correction unit 117 extends or contracts the optical system adjustment mechanism 122 to adjust the horizontal positions of the spectacle lens unit 101L and the spectacle lens unit 101R. Along with this, the optical system position correction unit 117 also adjusts the positions of the displays 113L and 113R.
- the OSD superimposing unit 118 superimposes the display signal for OSD display output from the display control unit 116C on the left eye image data and the right eye image data.
- the left eye image data is supplied to the display driver 112L.
- the display driver 112L drives the display 113L, and the left eye image is displayed on the display 113L.
- right eye image data is supplied to the display driver 112R.
- the display 113R is driven by the display driver 112R, and a right eye image is displayed on the display 113R.
- the light from the left eye image displayed on the display 113L is reflected by the spectacle lens unit 101L and reaches the left eye of the observer.
- the left eye image displayed on the display 113L is optically enlarged by the spectacle lens unit 101L, and is observed by the observer's left eye as an enlarged virtual image.
- the light from the right eye image displayed on the display 113R is reflected by the spectacle lens unit 101R and reaches the right eye of the observer.
- the right-eye image displayed on the display 113R is optically enlarged by the spectacle lens unit 101R, and is observed by the observer's right eye as an enlarged virtual image.
- the observer's eye side is irradiated with weak infrared rays from the infrared sensor 103L attached to the horizontal center position of the spectacle lens unit 101L (the center position in the horizontal direction of the left eye optical system), and the eye is scanned in the horizontal direction. Also, in this image display state, the observer's eye side is irradiated with weak infrared rays from the infrared sensor 103R attached to the horizontal center position of the spectacle lens unit 101R (the center position in the horizontal direction of the right eye optical system), and the eye is scanned in the horizontal direction.
- the sensor outputs of the infrared sensors 103L and 103R are respectively supplied to the eyeball position estimation unit 114.
- the eyeball position estimation unit 114 estimates the eyeball positions of the left eye and the right eye of the observer based on the sensor outputs from the infrared sensors 103L and 103R.
- the eye position estimation unit 114 outputs the angles φL and φR at the time of the cornea (black eye) scan as estimation results of the eye positions of the left eye and the right eye (see FIGS. 2 and 3).
- the sensor output of the gyro sensor 104 attached to the spectacle lens unit 101L is supplied to the inclination estimation unit 115.
- the tilt estimation unit 115 estimates, based on the sensor output of the gyro sensor 104, the tilt θ of the optical system including the spectacle lens units 101L and 101R with respect to the horizontal direction (see FIG. 3).
- the angles φL and φR, which are the estimation results of the eyeball position estimation unit 114, are supplied to the display control unit 116C.
- the inclination θ, which is the estimation result of the inclination estimation unit 115, is supplied to the display control unit 116C.
- the display control unit 116C obtains the positional deviation dL of the left eye optical system with respect to the observer's left eye and the positional deviation dR of the right eye optical system with respect to the observer's right eye (see FIGS. 2 and 3).
- the inter-eye distance de is calculated based on the positional deviations dL and dR, the known inter-sensor distance ds of the infrared sensors 103L and 103R, and the inclination θ (see equation (2)). In this case, by taking the inclination θ into consideration, the inter-eye distance de can be obtained with high accuracy.
- the display control unit 116C obtains the difference s (= de − ds × cos θ) between the inter-eye distance de and the inter-sensor distance ds × cos θ in which the inclination θ is taken into consideration, and outputs a display signal for an OSD display of the difference s or an OSD display for prompting the user to correct the difference s.
- This display signal is supplied to the OSD superimposing unit 118 and superimposed on the left eye image data and the right eye image data. As a result, the OSD is displayed on the displays 113L and 113R.
- a correction operation is performed by the observer based on the OSD display so that the difference s becomes zero.
- the user operation acquisition unit 119 acquires this user operation and sends it to the display control unit 116C.
- the optical system position correction unit 117 is controlled by the display control unit 116C, and the positions of the left eye optical system and the right eye optical system, that is, the positions of the eyeglass lens units 101L and 101R and the positions of the displays 113L and 113R, are shift-controlled.
- as a result, the horizontal center of the left eye optical system coincides with the observer's left eye, the horizontal center of the right eye optical system coincides with the observer's right eye, and the positional deviation between the optical system and the observer's eyes is corrected.
- the positional deviation between the optical system and the observer's eye is OSD-displayed based on the detection result of the positional relationship between the optical system and the observer's eye.
- based on this OSD display, the observer performs a correction operation so as to eliminate the positional deviation. Then, in response to this correction operation, the positions of the left eye optical system and the right eye optical system are shift-controlled, and the positional deviation between the optical system and the observer's eye is mechanically corrected. Therefore, the positional deviation between the optical system and the observer's eye can be corrected appropriately.
- FIG. 9 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100D according to the fifth embodiment.
- This configuration example is a monocular HMD.
- parts corresponding to FIG. 1 are given the same reference numerals, and the detailed description thereof will be omitted as appropriate.
- since this HMD 100D is a monocular HMD, it has one eyeglass lens portion 101, while the HMD 100 shown in FIG. 1 has two eyeglass lens portions 101L and 101R.
- An infrared sensor 103 is attached to the spectacle lens unit 101.
- the infrared sensor 103 functions in the same manner as the infrared sensors 103L and 103R in the HMD 100 shown in FIG. 1 and sends the sensor output to the eye position estimation unit 114.
- a gyro sensor 104 for detecting the inclination of the optical system including the eyeglass lens unit 101 is attached to the eyeglass lens unit 101.
- the gyro sensor 104 sends the sensor output to the tilt estimation unit 115.
- since the HMD 100D is a monocular HMD, it has one display driver 112 and one display 113.
- the display 113 is driven by the display driver 112 to display an image.
- the light from the image displayed on the display 113 is reflected by the spectacle lens unit 101 and reaches the observer's eye (left eye or right eye).
- the image displayed on the display 113 is optically magnified by the spectacle lens unit 101, and is observed by the observer's eye as a magnified virtual image.
- the eye position estimation unit 114 estimates the eye position of the observer's eye based on the sensor output from the infrared sensor 103, like the eye position estimation unit 114 in the HMD 100 of FIG. 1, and outputs the angle φ at the time of the cornea (black eye) scan (see FIGS. 2 and 3).
- the inclination estimation unit 115 estimates the inclination θ of the optical system including the eyeglass lens unit 101 with respect to the horizontal direction based on the sensor output of the gyro sensor 104, like the inclination estimation unit 115 of the HMD 100 in FIG. 1 (see FIG. 3).
- the display control unit 116D obtains the positional deviation d of the optical system with respect to the observer's eye based on the eye position estimation result and the inclination estimation result of the optical system.
- This positional deviation d corresponds to dL/cos θ or dR/cos θ, as shown in FIG. 3.
- the display control unit 116D controls the image correction unit 111 to shift the display position of the image displayed on the display 113 in the horizontal direction so that the positional deviation d becomes 0.
- the horizontal center of the image is made to coincide with the observer's eye, and the positional deviation between the optical system and the observer's eye is corrected.
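The monocular deviation described above can be sketched directly; the code assumes d = dL/cos θ (equivalently dR/cos θ), as stated in the text:

```python
import math

def monocular_deviation(d_projected_mm, theta):
    # d = dL / cos(theta): the deviation measured along the tilted optical
    # system, corrected back to the horizontal direction.
    return d_projected_mm / math.cos(theta)

d = monocular_deviation(1.0, math.radians(60.0))  # cos 60 deg = 0.5
```

The display control unit would then shift the image horizontally until this value reaches 0.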
- the other parts of the HMD 100D shown in FIG. 9 are configured in the same manner as the HMD 100 shown in FIG. 1. As described above, in the HMD 100D shown in FIG. 9, the display position of the image is shift-controlled based on the detection result of the positional relationship between the optical system and the observer's eye, and the positional deviation between the optical system and the observer's eye is electronically corrected. Therefore, the positional deviation between the optical system and the observer's eye can be corrected appropriately.
- FIG. 10 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100E according to a sixth embodiment.
- This configuration example is a monocular HMD.
- parts corresponding to FIGS. 5 and 9 are assigned the same reference numerals, and the detailed description thereof will be omitted as appropriate.
- since the HMD 100E is a monocular HMD, it has one spectacle lens part 101, while the HMD 100A shown in FIG. 5 has two spectacle lens parts 101L and 101R.
- An infrared sensor 103 is attached to the spectacle lens unit 101.
- the infrared sensor 103 functions in the same manner as the infrared sensors 103L and 103R in the HMD 100A shown in FIG. 5, and sends the sensor output to the eye position estimation unit 114.
- a gyro sensor 104 for detecting the inclination of the optical system including the eyeglass lens unit 101 is attached to the eyeglass lens unit 101.
- the gyro sensor 104 sends the sensor output to the tilt estimation unit 115.
- since the HMD 100E is a monocular HMD, it has one display driver 112 and one display 113.
- the display 113 is driven by the display driver 112 to display an image.
- the light from the image displayed on the display 113 is reflected by the spectacle lens unit 101 and reaches the observer's eye (left eye or right eye).
- the image displayed on the display 113 is optically magnified by the spectacle lens unit 101, and is observed by the observer's eye as a magnified virtual image.
- the eye position estimation unit 114 estimates the eye position of the observer's eye based on the sensor output from the infrared sensor 103, like the eye position estimation unit 114 in the HMD 100A of FIG. 5, and outputs the angle φ at the time of the cornea (black eye) scan (see FIGS. 2 and 3).
- the inclination estimation unit 115 estimates the inclination θ of the optical system including the eyeglass lens unit 101 with respect to the horizontal direction based on the sensor output of the gyro sensor 104, like the inclination estimation unit 115 in the HMD 100A of FIG. 5 (see FIG. 3).
- the display control unit 116E obtains the positional deviation d of the optical system with respect to the observer's eye based on the eye position estimation result and the inclination estimation result of the optical system.
- This positional deviation d corresponds to dL/cos θ or dR/cos θ, as shown in FIG. 3.
- the display control unit 116E controls the optical system position correction unit 117 to shift the position of the spectacle lens unit 101, that is, the optical system, in the horizontal direction so that the positional deviation d becomes 0.
- the display control unit 116E also controls the position of the display 113 in accordance with the shift control of the spectacle lens unit 101. As a result, the horizontal center of the image is made to coincide with the observer's eye, and the positional deviation between the optical system and the observer's eye is corrected.
- the rest of the HMD 100E shown in FIG. 10 is configured in the same manner as the HMD 100A shown in FIG.
- the optical system is shift-controlled based on the detection result of the positional relationship between the optical system and the observer's eye, and the positional deviation between the optical system and the observer's eye is mechanically corrected. Therefore, the positional deviation between the optical system and the observer's eye can be corrected appropriately.
- FIG. 11 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100F according to a seventh embodiment.
- This configuration example is a monocular HMD.
- parts corresponding to FIGS. 6 and 9 are assigned the same reference numerals, and the detailed description thereof will be omitted as appropriate.
- since the HMD 100F is a monocular HMD, it has one spectacle lens part 101, while the HMD 100B shown in FIG. 6 has two spectacle lens parts 101L and 101R.
- An infrared sensor 103 is attached to the spectacle lens unit 101.
- the infrared sensor 103 functions in the same manner as the infrared sensors 103L and 103R in the HMD 100B shown in FIG. 6, and sends the sensor output to the eye position estimation unit 114.
- a gyro sensor 104 for detecting the inclination of the optical system including the eyeglass lens unit 101 is attached to the eyeglass lens unit 101.
- the gyro sensor 104 sends the sensor output to the tilt estimation unit 115.
- since the HMD 100F is a monocular HMD, it has one display driver 112 and one display 113.
- the display 113 is driven by the display driver 112 to display an image.
- the light from the image displayed on the display 113 is reflected by the spectacle lens unit 101 and reaches the observer's eye (left eye or right eye).
- the image displayed on the display 113 is optically magnified by the spectacle lens unit 101, and is observed by the observer's eye as a magnified virtual image.
- the eye position estimation unit 114 estimates the eye position of the observer's eye based on the sensor output from the infrared sensor 103, like the eye position estimation unit 114 in the HMD 100B of FIG. 6, and outputs the angle φ at the time of the cornea (black eye) scan (see FIGS. 2 and 3).
- the inclination estimation unit 115 estimates the inclination θ of the optical system including the eyeglass lens unit 101 with respect to the horizontal direction based on the sensor output of the gyro sensor 104, like the inclination estimation unit 115 in the HMD 100B of FIG. 6 (see FIG. 3).
- the display control unit 116F determines the positional deviation d of the optical system with respect to the observer's eye based on the eye position estimation result and the inclination estimation result of the optical system. This positional deviation d corresponds to dL/cos θ or dR/cos θ, as shown in FIG. 3.
- The display control unit 116F generates a display signal for an OSD display of the positional deviation d, or for an OSD display prompting the observer to correct the positional deviation d, and supplies it to the OSD superimposing unit 118.
- The OSD display is performed on the display 113.
- Based on the OSD display, the observer performs a correction operation so that the positional deviation d becomes zero.
- The user operation acquisition unit 119 acquires this user operation and sends it to the display control unit 116F.
- Based on this correction operation, the display control unit 116F performs shift control of the display position of the image displayed on the display 113. As a result, the horizontal center of the image is made to coincide with the observer's eye, and the positional deviation between the optical system and the observer's eye is corrected.
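A minimal sketch of this electronic correction step (illustrative only; the patent does not specify pixel pitch or a clamping range): convert the deviation d into a pixel shift and move each scanline of the displayed image accordingly.

```python
def shift_pixels(deviation_mm: float, pixels_per_mm: float, max_shift_px: int) -> int:
    """Convert the horizontal deviation d (mm) into a display-position shift
    in pixels, clamped to the panel's adjustable range."""
    px = round(deviation_mm * pixels_per_mm)
    return max(-max_shift_px, min(max_shift_px, px))

def shift_scanline(row, shift_px, fill=0):
    """Shift one scanline of the displayed image; vacated pixels get `fill`."""
    if shift_px == 0:
        return list(row)
    if shift_px > 0:
        return [fill] * shift_px + list(row[:-shift_px])
    return list(row[-shift_px:]) + [fill] * (-shift_px)
```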
- In this way, the positional deviation between the optical system and the observer's eye is displayed as an OSD based on the detection result of the positional relationship between the optical system and the observer's eye.
- Based on this OSD display, the observer performs a correction operation so as to eliminate the positional deviation. Then, according to this correction operation, the display position of the image is shift-controlled, and the positional deviation between the optical system and the observer's eye is electronically corrected. Therefore, the positional deviation between the optical system and the observer's eye can be corrected appropriately.
- FIG. 12 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100G according to the eighth embodiment.
- This configuration example is a monocular HMD.
- In FIG. 12, portions corresponding to those in FIGS. 8 and 9 are assigned the same reference numerals, and detailed description thereof is omitted as appropriate.
- The HMD 100G is a monocular HMD. Whereas the HMD 100C shown in FIG. 8 has two spectacle lens units 101L and 101R, the HMD 100G has only one spectacle lens unit 101.
- An infrared sensor 103 is attached to the spectacle lens unit 101.
- The infrared sensor 103 functions in the same manner as the infrared sensors 103L and 103R in the HMD 100C shown in FIG. 8, and sends its sensor output to the eye position estimation unit 114.
- A gyro sensor 104 for detecting the inclination of the optical system, including the spectacle lens unit 101, is attached to the spectacle lens unit 101.
- The gyro sensor 104 sends its sensor output to the tilt estimation unit 115.
- Since the HMD 100G is a monocular HMD, it has one display driver 112 and one display 113.
- The display 113 is driven by the display driver 112 to display an image.
- Light from the image displayed on the display 113 is reflected by the spectacle lens unit 101 and reaches the observer's eye (left eye or right eye).
- The image displayed on the display 113 is optically magnified by the spectacle lens unit 101 and is observed by the observer's eye as a magnified virtual image.
- As in the HMD 100C of FIG. 8, the eye position estimation unit 114 estimates the position of the observer's eye (the iris) based on the sensor output from the infrared sensor 103, and outputs the angle obtained at the time of scanning (see FIGS. 2 and 3).
- As in the HMD 100C of FIG. 8, the inclination estimation unit 115 estimates the inclination θ of the optical system, including the spectacle lens unit 101, with respect to the horizontal direction based on the sensor output of the gyro sensor 104 (see FIG. 3).
- The display control unit 116G determines the positional deviation d of the optical system with respect to the observer's eye based on the eye position estimation result and the inclination estimation result of the optical system. As shown in FIG. 3, this positional deviation d corresponds to dL/cosθ or dR/cosθ.
- The display control unit 116G generates a display signal for an OSD display of the positional deviation d, or for an OSD display prompting the observer to correct the positional deviation d, and supplies it to the OSD superimposing unit 118.
- The OSD display is performed on the display 113.
- Based on the OSD display, the observer performs a correction operation so that the positional deviation d becomes zero.
- The user operation acquisition unit 119 acquires this user operation and sends it to the display control unit 116G.
- Based on this correction operation, the display control unit 116G controls the optical system position correction unit 117 to shift the position of the optical system, that is, the position of the spectacle lens unit 101 and the position of the display 113.
- As a result, the horizontal center of the optical system is made to coincide with the observer's eye, and the positional deviation between the optical system and the observer's eye is corrected.
- In this way, the positional deviation between the optical system and the observer's eye is displayed as an OSD based on the detection result of the positional relationship between the optical system and the observer's eye.
- Based on this OSD display, the observer performs a correction operation so as to eliminate the positional deviation. Then, in accordance with this correction operation, the position of the optical system is shift-controlled, and the positional deviation between the optical system and the observer's eye is mechanically corrected. Therefore, the positional deviation between the optical system and the observer's eye can be corrected appropriately.
- In the embodiments described above, the inclination θ of the optical system as a whole is estimated based on the sensor output of the gyro sensor 104 attached to the spectacle lens unit 101L.
- However, a configuration is also conceivable in which the inclinations of the optical system on the left-eye side and the right-eye side are detected separately.
- In that case, a gyro sensor is also attached to the spectacle lens unit 101R.
- In the embodiments described above, one infrared sensor is attached to each spectacle lens unit, and the weak infrared light irradiated toward the observer is scanned.
- However, a configuration is also conceivable in which a plurality of infrared sensors IRS are attached to the spectacle lens unit GL and the position of the observer's eye is judged comprehensively based on the sensor output of each infrared sensor IRS. In this case, when the number of infrared sensors IRS is large, it is not necessary to scan the weak infrared light emitted from the infrared sensors IRS toward the observer.
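One way such a comprehensive judgment could be sketched (an assumption for illustration; the patent does not specify the fusion rule or how reflectance maps to a weight) is a weighted centroid of the fixed sensor positions:

```python
def estimate_eye_x(sensor_xs, weights):
    """Comprehensive judgment from several fixed infrared sensors: a weighted
    centroid of the sensor x-positions."""
    total = sum(weights)
    if total == 0:
        raise ValueError("all weights are zero")
    return sum(x * w for x, w in zip(sensor_xs, weights)) / total

# The sclera reflects more IR than the iris, so low readings mark the iris;
# weighting by the dip below the maximum output emphasizes that region.
outputs = [0.9, 0.4, 0.2, 0.5, 0.9]            # raw sensor outputs
xs = [-10.0, -5.0, 0.0, 5.0, 10.0]             # sensor x-positions (mm)
weights = [max(outputs) - o for o in outputs]
eye_x = estimate_eye_x(xs, weights)            # lands near the reflectance dip
```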
- In the embodiments described above, an infrared sensor is attached to each spectacle lens unit.
- However, as shown in FIG. 13B, it is also possible to install the infrared sensor IRS at a location somewhat separated from the spectacle lens unit GL by using a reflection sheet RS.
- Further, as shown in FIG. 13C, a configuration is also conceivable in which a plurality of infrared sensors IRS are arranged and the position of the observer's eye is judged comprehensively based on the sensor output of each infrared sensor IRS.
- In the embodiments described above, the image correction unit 111 shifts the display position of the image in the horizontal direction. Strictly speaking, alignment by vertical shift, rotation, enlargement, reduction, or the like is also conceivable.
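These additional corrections can all be folded into one affine transform. The following is an illustrative sketch (the matrix form and function names are not from the patent): scale and rotate about the origin, then translate.

```python
import math

def alignment_matrix(dx=0.0, dy=0.0, angle_rad=0.0, scale=1.0):
    """2x3 affine matrix combining the corrections listed above: scale and
    rotate about the origin, then translate by (dx, dy)."""
    c = math.cos(angle_rad) * scale
    s = math.sin(angle_rad) * scale
    return [[c, -s, dx],
            [s,  c, dy]]

def warp_point(m, x, y):
    """Apply the affine matrix to a single image coordinate."""
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])
```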
- In the embodiments described above, the eye position estimation is performed by applying the sclera reflection method.
- However, other methods, for example one based on an EOG (Electro-Oculogram), are also conceivable.
- In the embodiments described above, the positional relationship (the difference s, the positional deviation d, and the like) detected by the positional relationship detection unit is displayed as an OSD on the display used for image display.
- However, this positional relationship may be displayed on a display element separate from the display used for image display.
- Presentation by means other than display is also conceivable. For example, when vibration is used, a vibrator is provided and the vibration is made stronger as the amount of positional deviation increases. When voice is used, the amount of deviation or the operation method is reproduced as audio, or the volume is increased as the amount of deviation increases.
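A sketch of such a feedback mapping (the full-scale constant and the linear ramp are assumptions for illustration, not values from the patent):

```python
def vibration_amplitude(deviation_mm, full_scale_mm=5.0, max_amp=1.0):
    """Drive amplitude for the vibrator (or audio volume): grows with |d|
    and saturates at max_amp once |d| reaches full_scale_mm."""
    return min(abs(deviation_mm) / full_scale_mm, 1.0) * max_amp
```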
- In the embodiments described above, the present technology is applied to an HMD.
- However, the present technology is not limited to this and can be widely applied to other image display apparatuses, such as a camera viewfinder and electronic binoculars.
- the present technology can also be configured as follows.
- (1) An image display apparatus comprising: a positional relationship detection unit that detects, based on the eyeball position of an observer's eye, the positional relationship between the observer's eye and an optical system that guides an image displayed on a display element to the observer's eye; and a positional relationship correction unit that corrects the positional relationship between the optical system and the observer's eye based on the positional relationship detected by the positional relationship detection unit.
- (2) The image display apparatus according to (1), wherein the positional relationship correction unit shifts the display position of the image displayed on the display element based on the positional relationship detected by the positional relationship detection unit.
- (3) The image display apparatus according to (1), wherein the positional relationship correction unit moves the optical system based on the positional relationship detected by the positional relationship detection unit.
- (4) The image display apparatus according to (1), wherein the positional relationship correction unit includes: a positional relationship presentation unit that presents the positional relationship detected by the positional relationship detection unit to the observer; and a control unit that controls the positional relationship between the optical system and the observer's eye in accordance with an operation by the observer.
- (5) The image display apparatus according to (4), wherein the control unit performs shift control of the display position of the image displayed on the display element in accordance with the operation by the observer.
- (6) The image display apparatus according to (4), wherein the control unit performs movement control of the optical system in accordance with the operation by the observer.
- (7) The image display apparatus according to any one of (4) to (6), wherein the positional relationship presentation unit displays, on the display element, the positional relationship detected by the positional relationship detection unit.
- (8) The image display apparatus according to any one of (1) to (7), further comprising a tilt estimation unit that estimates the tilt of the optical system, wherein the positional relationship detection unit detects the positional relationship between the optical system and the observer's eye based on the estimated tilt together with the estimated eyeball position.
- (9) The image display apparatus according to any one of (1) to (8), further comprising the optical system that guides the image displayed on the display element to the observer's eye.
- (10) The image display apparatus according to (9), wherein, as the optical system, there are a first optical system that guides a left-eye image displayed on a display element to the observer's left eye and a second optical system that guides a right-eye image displayed on a display element to the observer's right eye.
- (11) The image display apparatus according to (9) or (10), further comprising an eyeball position estimation unit that estimates the eyeball position of the observer's eye, wherein the positional relationship detection unit detects the positional relationship between the optical system and the observer's eye based on the eyeball position estimated by the eyeball position estimation unit.
- (12) An image display method comprising: a positional relationship detection step of detecting the positional relationship between an observer's eye and an optical system that guides an image displayed on a display element to the observer's eye; and a positional relationship correction step of correcting the positional relationship between the optical system and the observer's eye based on the detected positional relationship.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
Abstract
Description
The present technology resides in an image display apparatus including: a positional relationship detection unit that detects, based on the eyeball position of an observer's eye, the positional relationship between the observer's eye and an optical system that guides an image displayed on a display element to the observer's eye; and a positional relationship correction unit that corrects the positional relationship between the optical system and the observer's eye based on the positional relationship detected by the positional relationship detection unit.
1. First Embodiment
2. Second Embodiment
3. Third Embodiment
4. Fourth Embodiment
5. Fifth Embodiment
6. Sixth Embodiment
7. Seventh Embodiment
8. Eighth Embodiment
9. Modifications
[Configuration Example of a Head Mounted Display (Binocular)]
FIG. 1 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100 according to the first embodiment. This configuration example is a binocular HMD. The HMD 100 has a left spectacle lens unit 101L and a right spectacle lens unit 101R. The spectacle lens units 101L and 101R are integrally connected by a connection member 102.
de = ds - dL - dR ... (1)
de = ds/cosθ - dL/cosθ + dR/cosθ = (ds - dL + dR)/cosθ ... (2)
The operation at the time of correcting the positional deviation between the optical system and the observer's eye in the HMD 100 shown in FIG. 1 will now be described. For this positional deviation correction, for example, left-eye image data and right-eye image data for testing are used.
[Configuration Example of a Head Mounted Display (Binocular)]
FIG. 5 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100A according to the second embodiment. This configuration example is a binocular HMD. In FIG. 5, portions corresponding to those in FIG. 1 are assigned the same reference numerals, and detailed description thereof is omitted as appropriate.
The operation at the time of correcting the positional deviation between the optical system and the observer's eye in the HMD 100A shown in FIG. 5 will now be described. For this positional deviation correction, for example, left-eye image data and right-eye image data for testing are used.
[Configuration Example of a Head Mounted Display (Binocular)]
FIG. 6 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100B according to the third embodiment. This configuration example is a binocular HMD. In FIG. 6, portions corresponding to those in FIG. 1 are assigned the same reference numerals, and detailed description thereof is omitted as appropriate. The HMD 100B has a left spectacle lens unit 101L and a right spectacle lens unit 101R. The spectacle lens units 101L and 101R are integrally connected by a connection member 102.
The operation at the time of correcting the positional deviation between the optical system and the observer's eye in the HMD 100B shown in FIG. 6 will now be described. For this positional deviation correction, for example, left-eye image data and right-eye image data for testing are used.
[Configuration Example of a Head Mounted Display (Binocular)]
FIG. 8 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100C according to the fourth embodiment. This configuration example is a binocular HMD. In FIG. 8, portions corresponding to those in FIGS. 2 and 6 are assigned the same reference numerals, and detailed description thereof is omitted as appropriate. The HMD 100C has a left spectacle lens unit 101L and a right spectacle lens unit 101R. The spectacle lens units 101L and 101R are connected by an optical system adjustment mechanism 122.
The operation at the time of correcting the positional deviation between the optical system and the observer's eye in the HMD 100C shown in FIG. 8 will now be described. For this positional deviation correction, for example, left-eye image data and right-eye image data for testing are used.
[Configuration Example of a Head Mounted Display (Monocular)]
FIG. 9 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100D according to the fifth embodiment. This configuration example is a monocular HMD. In FIG. 9, portions corresponding to those in FIG. 1 are assigned the same reference numerals, and detailed description thereof is omitted as appropriate.
[Configuration Example of a Head Mounted Display (Monocular)]
FIG. 10 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100E according to the sixth embodiment. This configuration example is a monocular HMD. In FIG. 10, portions corresponding to those in FIGS. 5 and 9 are assigned the same reference numerals, and detailed description thereof is omitted as appropriate.
[Configuration Example of a Head Mounted Display (Monocular)]
FIG. 11 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100F according to the seventh embodiment. This configuration example is a monocular HMD. In FIG. 11, portions corresponding to those in FIGS. 6 and 9 are assigned the same reference numerals, and detailed description thereof is omitted as appropriate.
[Configuration Example of a Head Mounted Display (Monocular)]
FIG. 12 shows a schematic configuration example of an optical transmission head mounted display (HMD) 100G according to the eighth embodiment. This configuration example is a monocular HMD. In FIG. 12, portions corresponding to those in FIGS. 8 and 9 are assigned the same reference numerals, and detailed description thereof is omitted as appropriate.
In the embodiments described above, the correction accuracy is improved by taking the tilt θ of the optical system into account. However, a simplified configuration that does not take this tilt θ into account is also conceivable. In that case, components such as the gyro sensor 104 and the tilt estimation unit 115 become unnecessary.
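To put a number on this simplification (an illustrative sketch, not from the patent): approximating d ≈ dL instead of d = dL/cosθ introduces a relative error of 1/cosθ − 1, which stays small for modest tilts.

```python
import math

def tilt_neglect_error(theta_rad):
    """Relative error in the deviation d if the tilt term is dropped,
    i.e. d is approximated by dL instead of dL / cos(theta)."""
    return 1.0 / math.cos(theta_rad) - 1.0

e5 = tilt_neglect_error(math.radians(5.0))    # under 0.5 %
e20 = tilt_neglect_error(math.radians(20.0))  # roughly 6 %
```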
(1) An image display apparatus comprising:
a positional relationship detection unit that detects, based on the eyeball position of an observer's eye, the positional relationship between the observer's eye and an optical system that guides an image displayed on a display element to the observer's eye; and
a positional relationship correction unit that corrects the positional relationship between the optical system and the observer's eye based on the positional relationship detected by the positional relationship detection unit.
(2) The image display apparatus according to (1), wherein the positional relationship correction unit shifts the display position of the image displayed on the display element based on the positional relationship detected by the positional relationship detection unit.
(3) The image display apparatus according to (1), wherein the positional relationship correction unit moves the optical system based on the positional relationship detected by the positional relationship detection unit.
(4) The image display apparatus according to (1), wherein the positional relationship correction unit includes:
a positional relationship presentation unit that presents the positional relationship detected by the positional relationship detection unit to the observer; and
a control unit that controls the positional relationship between the optical system and the observer's eye in accordance with an operation by the observer.
(5) The image display apparatus according to (4), wherein the control unit performs shift control of the display position of the image displayed on the display element in accordance with the operation by the observer.
(6) The image display apparatus according to (4), wherein the control unit performs movement control of the optical system in accordance with the operation by the observer.
(7) The image display apparatus according to any one of (4) to (6), wherein the positional relationship presentation unit displays, on the display element, the positional relationship detected by the positional relationship detection unit.
(8) The image display apparatus according to any one of (1) to (7), further comprising a tilt estimation unit that estimates the tilt of the optical system, wherein the positional relationship detection unit detects the positional relationship between the optical system and the observer's eye based on the estimated tilt together with the estimated eyeball position.
(9) The image display apparatus according to any one of (1) to (8), further comprising the optical system that guides the image displayed on the display element to the observer's eye.
(10) The image display apparatus according to (9), wherein, as the optical system, there are a first optical system that guides a left-eye image displayed on a display element to the observer's left eye and a second optical system that guides a right-eye image displayed on a display element to the observer's right eye.
(11) The image display apparatus according to (9) or (10), further comprising an eyeball position estimation unit that estimates the eyeball position of the observer's eye, wherein the positional relationship detection unit detects the positional relationship between the optical system and the observer's eye based on the eyeball position estimated by the eyeball position estimation unit.
(12) An image display method comprising:
a positional relationship detection step of detecting the positional relationship between an observer's eye and an optical system that guides an image displayed on a display element to the observer's eye; and
a positional relationship correction step of correcting the positional relationship between the optical system and the observer's eye based on the detected positional relationship.
101, 101L, 101R ... Spectacle lens unit
102 ... Connection member
103, 103L, 103R ... Infrared sensor
104 ... Gyro sensor
111 ... Image correction unit
112, 112L, 112R ... Display driver
113, 113L, 113R ... Display
114 ... Eye position estimation unit
115 ... Tilt estimation unit
116, 116A to 116G ... Display control unit
117 ... Optical system position correction unit
118 ... OSD superimposing unit
119 ... User operation acquisition unit
122 ... Optical system adjustment mechanism
Claims (12)
- 1. An image display apparatus comprising: a positional relationship detection unit that detects, based on the eyeball position of an observer's eye, the positional relationship between the observer's eye and an optical system that guides an image displayed on a display element to the observer's eye; and a positional relationship correction unit that corrects the positional relationship between the optical system and the observer's eye based on the positional relationship detected by the positional relationship detection unit.
- 2. The image display apparatus according to claim 1, wherein the positional relationship correction unit shifts the display position of the image displayed on the display element based on the positional relationship detected by the positional relationship detection unit.
- 3. The image display apparatus according to claim 1, wherein the positional relationship correction unit moves the optical system based on the positional relationship detected by the positional relationship detection unit.
- 4. The image display apparatus according to claim 1, wherein the positional relationship correction unit includes: a positional relationship presentation unit that presents the positional relationship detected by the positional relationship detection unit to the observer; and a control unit that controls the positional relationship between the optical system and the observer's eye in accordance with an operation by the observer.
- 5. The image display apparatus according to claim 4, wherein the control unit performs shift control of the display position of the image displayed on the display element in accordance with the operation by the observer.
- 6. The image display apparatus according to claim 4, wherein the control unit performs movement control of the optical system in accordance with the operation by the observer.
- 7. The image display apparatus according to claim 4, wherein the positional relationship presentation unit displays, on the display element, the positional relationship detected by the positional relationship detection unit.
- 8. The image display apparatus according to claim 1, further comprising a tilt estimation unit that estimates the tilt of the optical system, wherein the positional relationship detection unit detects the positional relationship between the optical system and the observer's eye based on the estimated tilt together with the estimated eyeball position.
- 9. The image display apparatus according to claim 1, further comprising the optical system that guides the image displayed on the display element to the observer's eye.
- 10. The image display apparatus according to claim 9, wherein, as the optical system, there are a first optical system that guides a left-eye image displayed on a display element to the observer's left eye and a second optical system that guides a right-eye image displayed on a display element to the observer's right eye.
- 11. The image display apparatus according to claim 9, further comprising an eyeball position estimation unit that estimates the eyeball position of the observer's eye, wherein the positional relationship detection unit detects the positional relationship between the optical system and the observer's eye based on the eyeball position estimated by the eyeball position estimation unit.
- 12. An image display method comprising: a positional relationship detection step of detecting the positional relationship between an observer's eye and an optical system that guides an image displayed on a display element to the observer's eye; and a positional relationship correction step of correcting the positional relationship between the optical system and the observer's eye based on the detected positional relationship.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014526871A JP6248931B2 (ja) | 2012-07-24 | 2013-07-17 | 画像表示装置および画像表示方法 |
CN201380037860.6A CN104508538B (zh) | 2012-07-24 | 2013-07-17 | 图像显示装置及图像显示方法 |
US14/415,316 US9835864B2 (en) | 2012-07-24 | 2013-07-17 | Image display apparatus and method for displaying image |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012163324 | 2012-07-24 | ||
JP2012-163324 | 2012-07-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014017348A1 true WO2014017348A1 (ja) | 2014-01-30 |
Family
ID=49997162
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/069378 WO2014017348A1 (ja) | 2012-07-24 | 2013-07-17 | 画像表示装置および画像表示方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9835864B2 (ja) |
JP (1) | JP6248931B2 (ja) |
CN (1) | CN104508538B (ja) |
WO (1) | WO2014017348A1 (ja) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10345903B2 (en) * | 2013-07-30 | 2019-07-09 | Microsoft Technology Licensing, Llc | Feedback for optic positioning in display devices |
US9746675B2 (en) * | 2015-05-28 | 2017-08-29 | Microsoft Technology Licensing, Llc | Alignment based view matrix tuning |
US11252399B2 (en) | 2015-05-28 | 2022-02-15 | Microsoft Technology Licensing, Llc | Determining inter-pupillary distance |
US10515482B2 (en) | 2015-08-24 | 2019-12-24 | Pcms Holdings, Inc. | Systems and methods for enhancing augmented reality experience with dynamic output mapping |
WO2017062289A1 (en) | 2015-10-08 | 2017-04-13 | Pcms Holdings, Inc. | Methods and systems of automatic calibration for dynamic display configurations |
US20170115489A1 (en) * | 2015-10-26 | 2017-04-27 | Xinda Hu | Head mounted display device with multiple segment display and optics |
CN105391997B (zh) * | 2015-11-05 | 2017-12-29 | 广东未来科技有限公司 | 立体显示装置的三维视点校正方法 |
CN105898275A (zh) * | 2016-04-28 | 2016-08-24 | 乐视控股(北京)有限公司 | 虚拟现实图像校准方法及装置 |
CN108153417B (zh) * | 2017-12-25 | 2022-03-04 | 北京凌宇智控科技有限公司 | 画面补偿方法及采用该方法的头戴式显示装置 |
CN111103971A (zh) | 2018-10-29 | 2020-05-05 | 托比股份公司 | 头戴式装置在用户上的位置的确定 |
US11741673B2 (en) | 2018-11-30 | 2023-08-29 | Interdigital Madison Patent Holdings, Sas | Method for mirroring 3D objects to light field displays |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
CN112255800B (zh) * | 2020-11-09 | 2022-12-20 | Oppo广东移动通信有限公司 | 图像显示的控制方法及增强现实设备、装置、计算机可读存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07311361A (ja) * | 1994-05-16 | 1995-11-28 | Olympus Optical Co Ltd | 眼球投影型映像表示装置 |
JP2002328330A (ja) * | 2001-04-27 | 2002-11-15 | Sony Corp | 映像表示装置 |
JP2003279882A (ja) * | 2002-03-22 | 2003-10-02 | Victor Co Of Japan Ltd | ヘッドマウントディスプレイ装置 |
JP2005311754A (ja) * | 2004-04-22 | 2005-11-04 | Canon Inc | 撮像カメラと瞳位置検知機能を備えた頭部装着型映像表示装置 |
WO2009066475A1 (ja) * | 2007-11-21 | 2009-05-28 | Panasonic Corporation | 表示装置 |
JP2012042654A (ja) * | 2010-08-18 | 2012-03-01 | Sony Corp | 表示装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5714967A (en) | 1994-05-16 | 1998-02-03 | Olympus Optical Co., Ltd. | Head-mounted or face-mounted image display apparatus with an increased exit pupil |
JPH09304728A (ja) * | 1996-05-15 | 1997-11-28 | Sony Corp | 光学視覚装置 |
JPH09322199A (ja) * | 1996-05-29 | 1997-12-12 | Olympus Optical Co Ltd | 立体映像ディスプレイ装置 |
JP2009116196A (ja) | 2007-11-08 | 2009-05-28 | Canon Inc | 画像表示装置 |
US8941559B2 (en) * | 2010-09-21 | 2015-01-27 | Microsoft Corporation | Opacity filter for display device |
- 2013-07-17 WO PCT/JP2013/069378 patent/WO2014017348A1/ja active Application Filing
- 2013-07-17 JP JP2014526871A patent/JP6248931B2/ja active Active
- 2013-07-17 CN CN201380037860.6A patent/CN104508538B/zh active Active
- 2013-07-17 US US14/415,316 patent/US9835864B2/en active Active
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014071230A (ja) * | 2012-09-28 | 2014-04-21 | Pioneer Electronic Corp | 画像表示装置、制御方法、プログラム及び記憶媒体 |
JP2016085350A (ja) * | 2014-10-27 | 2016-05-19 | セイコーエプソン株式会社 | 表示装置、及び、表示装置の制御方法 |
CN104581126A (zh) * | 2014-12-16 | 2015-04-29 | 青岛歌尔声学科技有限公司 | 一种头戴显示设备的画面显示处理方法和处理装置 |
JP2017026943A (ja) * | 2015-07-27 | 2017-02-02 | 株式会社東芝 | 画像表示装置及び画像処理装置 |
JP2021141612A (ja) * | 2015-11-04 | 2021-09-16 | マジック リープ, インコーポレイテッドMagic Leap, Inc. | 眼追跡に基づく動的ディスプレイ較正 |
US11536559B2 (en) | 2015-11-04 | 2022-12-27 | Magic Leap, Inc. | Light field display metrology |
JP7218398B2 (ja) | 2015-11-04 | 2023-02-06 | マジック リープ, インコーポレイテッド | 眼追跡に基づく動的ディスプレイ較正 |
WO2018186168A1 (ja) * | 2017-04-05 | 2018-10-11 | シャープ株式会社 | 映像生成装置、映像生成方法および映像生成プログラム |
JP2020533721A (ja) * | 2017-09-06 | 2020-11-19 | エクス・ワイ・ジィー リアリティ リミテッドXyz Reality Limited | 建物情報モデルの仮想画像の表示 |
JP7456995B2 (ja) | 2018-07-24 | 2024-03-27 | マジック リープ, インコーポレイテッド | 左および右ディスプレイとユーザの眼との間の垂直整合を決定するためのディスプレイシステムおよび方法 |
JP2021532464A (ja) * | 2018-07-24 | 2021-11-25 | マジック リープ, インコーポレイテッドMagic Leap, Inc. | 左および右ディスプレイとユーザの眼との間の垂直整合を決定するためのディスプレイシステムおよび方法 |
US12105875B2 (en) | 2018-07-24 | 2024-10-01 | Magic Leap, Inc. | Display systems and methods for determining vertical alignment between left and right displays and a user's eyes |
JP7529672B2 (ja) | 2019-01-11 | 2024-08-06 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | ウェアラブル視覚化システム及び方法 |
CN113260427A (zh) * | 2019-01-11 | 2021-08-13 | 环球城市电影有限责任公司 | 掉落检测系统和方法 |
US11675196B2 (en) | 2019-01-15 | 2023-06-13 | Blue Optech Co., Ltd. | Wearable device with image display module |
JP2020112713A (ja) * | 2019-01-15 | 2020-07-27 | ブルーオプテック株式会社 | ウエアラブル画像表示装置 |
JP7409497B2 (ja) | 2020-05-15 | 2024-01-09 | 日本電信電話株式会社 | 位置合わせ支援装置、位置合わせ支援方法およびプログラム |
JPWO2021229821A1 (ja) * | 2020-05-15 | 2021-11-18 | ||
WO2021229821A1 (ja) * | 2020-05-15 | 2021-11-18 | 日本電信電話株式会社 | 位置合わせ支援装置、位置合わせ支援方法およびプログラム |
WO2021261100A1 (ja) * | 2020-06-22 | 2021-12-30 | ソニーグループ株式会社 | 画像表示装置および画像表示方法 |
US12072497B2 (en) | 2020-06-22 | 2024-08-27 | Sony Group Corporation | Image display apparatus and image display method |
Also Published As
Publication number | Publication date |
---|---|
US20150198808A1 (en) | 2015-07-16 |
JP6248931B2 (ja) | 2017-12-20 |
CN104508538B (zh) | 2018-06-22 |
US9835864B2 (en) | 2017-12-05 |
CN104508538A (zh) | 2015-04-08 |
JPWO2014017348A1 (ja) | 2016-07-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6248931B2 (ja) | 画像表示装置および画像表示方法 | |
JP4686586B2 (ja) | 車載用表示装置及び表示方法 | |
JP6520119B2 (ja) | 画像処理装置および画像処理方法 | |
JP3717653B2 (ja) | 頭部搭載型画像表示装置 | |
JP6516234B2 (ja) | 立体画像表示装置 | |
JP4857885B2 (ja) | 表示装置 | |
US20110080536A1 (en) | Stereoscopic image display apparatus | |
JP5173395B2 (ja) | 視機能検査装置 | |
JP2010070066A (ja) | ヘッドアップディスプレイ | |
US7229174B2 (en) | Method to detect misalignment and distortion in near-eye displays | |
JP2011133508A (ja) | 走査型表示装置光学系、立体表示装置及びヘッドアップディスプレイ装置 | |
WO2013145147A1 (ja) | ヘッドマウントディスプレイ及び表示方法 | |
JP2008176096A (ja) | 画像表示装置 | |
JP2006208407A (ja) | 立体画像観察用顕微鏡システム | |
JP2011085830A (ja) | 映像表示システム | |
JP7118650B2 (ja) | 表示装置 | |
JP2018203245A (ja) | 表示システム、電子ミラーシステム及び移動体 | |
US20100259820A1 (en) | Stereoscopic image display | |
US20140139916A1 (en) | Electronic image display apparatus | |
JPWO2018199244A1 (ja) | 表示システム | |
JP4353001B2 (ja) | 3次元画像撮像アダプター | |
JP2007267261A (ja) | 立体表示装置 | |
JP6932501B2 (ja) | 画像表示装置 | |
JPWO2019151314A1 (ja) | 表示装置 | |
JP2018129826A (ja) | 立体画像表示装置及び立体画像表示方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13822355 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014526871 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14415316 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13822355 Country of ref document: EP Kind code of ref document: A1 |