WO2016103525A1 - Head Mounted Display - Google Patents
- Publication number
- WO2016103525A1 (PCT/JP2014/084723)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- user
- light source
- mounted display
- head mounted
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/08—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
- G02B26/0816—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
- G02B26/0833—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements the reflecting element being a micromechanical device, e.g. a MEMS mirror, DMD
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
- H04N5/7475—Constructional details of television projection apparatus
- H04N5/7491—Constructional details of television projection apparatus of head mounted projectors
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0112—Head-up displays characterised by optical features comprising device for generating colour display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/64—Constructional details of receivers, e.g. cabinets or dust covers
Definitions
- This invention relates to a head mounted display.
- A technique is known in which a user's gaze direction is detected by irradiating the user's eyes with invisible light, such as near-infrared light, and analyzing an image of the eyes that includes the reflected light.
- The detected information on the user's gaze direction is reflected on a monitor of, for example, a PC (Personal Computer) or a game machine, and is used as a pointing device.
- A head mounted display is a video display device for presenting three-dimensional video to the user who wears it.
- A head mounted display is generally mounted so as to cover the user's field of view. For this reason, a user wearing a head mounted display is shielded from images of the outside world.
- When the head mounted display is used as a display device for images such as movies and games, it is difficult for the user to visually recognize an input device such as a controller.
- The present invention has been made in view of such problems, and an object thereof is to provide a technique for detecting the gaze direction of a user wearing a head mounted display.
- An aspect of the present invention is a head mounted display that is used while mounted on a user's head.
- This head mounted display includes a light source capable of emitting invisible light, a camera that images the invisible light emitted from the light source and reflected by the user's eyes, an image output unit that outputs the image captured by the camera to a gaze detection unit that detects the user's gaze direction, and a housing that houses the light source, the camera, and the image output unit.
- According to the present invention, it is possible to provide a technique for detecting the gaze direction of a user wearing a head mounted display.
- FIG. 1 is a diagram schematically showing an overview of a video system according to an embodiment. FIG. 2 is a diagram schematically showing the optical configuration of the image display system housed in the housing according to the embodiment.
- FIGS. 3A and 3B are diagrams for explaining a gaze detection technique that uses bright spots of invisible light incident on the user's eye as reference positions. FIG. 4 is a timing chart schematically showing the micromirror control pattern executed by the image control unit according to the embodiment.
- FIGS. 5A and 5B are diagrams showing examples of bright spots that appear in the user's eye. FIG. 6 is a diagram schematically showing the optical configuration of the image display system housed in the housing according to a first modification.
- FIG. 1 is a diagram schematically showing an overview of a video system 1 according to the embodiment.
- The video system 1 according to the embodiment includes a head mounted display 100 and a video playback device 200. As shown in FIG. 1, the head mounted display 100 is used while mounted on the head of a user 300.
- The video playback device 200 generates the video to be displayed by the head mounted display 100.
- The video playback device 200 is a device capable of playing back video, such as a stationary game machine, a portable game machine, a PC, a tablet, a smartphone, a phablet, a video player, or a television.
- The video playback device 200 is connected to the head mounted display 100 wirelessly or by wire.
- In one example, the video playback device 200 is connected to the head mounted display 100 wirelessly.
- The wireless connection between the video playback device 200 and the head mounted display 100 can be realized using a known wireless communication technology such as Wi-Fi (registered trademark) or Bluetooth (registered trademark).
- Transmission of video between the head mounted display 100 and the video playback device 200 is performed according to a standard such as Miracast (trademark), WiGig (trademark), or WHDI (trademark).
- FIG. 1 shows an example in which the head mounted display 100 and the video playback device 200 are separate devices.
- However, the video playback device 200 may be built into the head mounted display 100.
- The head mounted display 100 includes a housing 150, a wearing tool 160, and headphones 170.
- The housing 150 houses an image display system for presenting video to the user 300, including the light source and image display element described later, and a wireless transmission module (not shown) such as a Wi-Fi module or a Bluetooth module.
- The wearing tool 160 mounts the head mounted display 100 on the head of the user 300.
- The wearing tool 160 can be realized by, for example, a belt or a stretchable band.
- The housing 150 is arranged at a position that covers the eyes of the user 300. For this reason, when the user 300 wears the head mounted display 100, the field of view of the user 300 is blocked by the housing 150.
- The headphones 170 output the audio of the video that the video playback device 200 plays.
- The headphones 170 need not be fixed to the head mounted display 100.
- The user 300 can freely attach and detach the headphones 170 even while the head mounted display 100 is worn using the wearing tool 160.
- FIG. 2 is a diagram schematically illustrating an optical configuration of the image display system 130 accommodated in the housing 150 according to the embodiment.
- The image display system 130 includes a white light source 102, a filter group 104, a filter switching unit 106, an image display element 108, an image control unit 110, a half mirror 112, a convex lens 114, a camera 116, and an image output unit 118.
- The white light source 102, the filter group 104, and the filter switching unit 106 constitute the light source of the image display system 130.
- The white light source 102 is a light source capable of emitting light that includes both a visible wavelength band and an invisible wavelength band.
- Invisible light is light in a wavelength band that cannot be observed with the naked eye of the user 300, for example light in the near-infrared wavelength band (about 800 nm to 2500 nm).
- The filter group 104 includes a red filter R, a green filter G, a blue filter B, and a near-infrared filter IR.
- The red filter R transmits the red component of the light emitted from the white light source 102.
- The green filter G transmits the green component of the light emitted from the white light source 102.
- The blue filter B transmits the blue component of the light emitted from the white light source 102.
- The near-infrared filter IR transmits the near-infrared component of the light emitted from the white light source 102.
- The filter switching unit 106 switches which of the filters in the filter group 104 transmits the light emitted from the white light source 102.
- The filter group 104 is realized using a known color wheel, and the filter switching unit 106 is realized by a motor that rotationally drives the color wheel.
- The color wheel is divided into four regions, in which the red filter R, green filter G, blue filter B, and near-infrared filter IR described above are respectively arranged.
- The white light source 102 is arranged so that its light passes through exactly one filter when the color wheel is stopped. Consequently, when the motor serving as the filter switching unit 106 rotates the color wheel, the filter through which the light from the white light source 102 passes is switched cyclically. The light that passes through the color wheel thus alternates among red light, green light, blue light, and near-infrared light in a time-division manner. As a result, the light source of the image display system 130 emits red, green, and blue light, which are visible light, and near-infrared light, which is invisible light, in a time-division manner.
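The time-division switching described above can be sketched as follows. This is an illustrative model only, not an implementation from the patent: it assumes four equal filter regions, and all names are invented here.

```python
# Illustrative model of the color wheel: one rotation = one period D,
# split into four equal filter windows (equal areas assumed).
FILTERS = ["R", "G", "B", "IR"]  # red, green, blue, near-infrared

def active_filter(t, period):
    """Return which filter the white light passes through at time t."""
    phase = (t % period) / period          # position within one rotation, in [0, 1)
    return FILTERS[int(phase * len(FILTERS))]

# Over one full period D, the transmitted light cycles R -> G -> B -> IR.
D = 4.0
sequence = [active_filter(t, D) for t in (0.5, 1.5, 2.5, 3.5)]
```

Sampling mid-window as above yields the cyclic sequence the text describes; in the next rotation the cycle simply repeats.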
- The light transmitted through the filter group 104 is incident on the image display element 108.
- The image display element 108 generates image display light 120 using the visible light emitted from the light source.
- The image display element 108 is realized using a known DMD (Digital Mirror Device).
- The DMD is an optical element comprising a plurality of micromirrors, one micromirror corresponding to one pixel of the image.
- The image control unit 110 receives the video signal to be played back from the video playback device 200.
- Based on the received video signal, the image control unit 110 independently controls each of the plurality of micromirrors included in the DMD, in synchronization with the switching timing of the light emitted from the light source.
- The image control unit 110 can switch each micromirror between its on state and off state at high speed.
- When a micromirror is in the on state, the light it reflects is incident on the eye 302 of the user 300. Conversely, when a micromirror is in the off state, the light it reflects does not travel toward the eye 302 of the user 300.
- The image control unit 110 forms the image display light 120 from the light reflected by the DMD by controlling the micromirrors so that the reflected light forms an image. The luminance value of each pixel can be controlled by controlling the on-state period per unit time of the corresponding micromirror.
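The text above notes that per-pixel luminance follows from the micromirror's on-state period per unit time. A minimal sketch of that relationship, assuming an 8-bit luminance scale; the function name and scale are illustrative, not from the patent:

```python
# Hypothetical sketch: map a pixel's luminance value to the micromirror's
# on-state duration within one frame (linear model, 8-bit scale assumed).
def on_time_for_luminance(luma, frame_time, levels=256):
    """On-state duration producing a luminance value in [0, levels - 1]."""
    if not 0 <= luma < levels:
        raise ValueError("luminance out of range")
    return frame_time * luma / (levels - 1)

frame = 1.0 / 60
full_white = on_time_for_luminance(255, frame)  # mirror on for the whole frame
mid_gray = on_time_for_luminance(128, frame)    # on for about half the frame
```

Real DMD drivers use bit-plane pulse-width schemes rather than a single contiguous on interval, but the time-averaged idea is the same.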
- When the user 300 wears the head mounted display 100, the half mirror 112 is disposed between the image display element 108 and the eye 302 of the user 300.
- The half mirror 112 transmits only part of the incident near-infrared light and reflects the rest.
- The half mirror 112 transmits incident visible light without reflecting it.
- The image display light 120 generated by the image display element 108 therefore passes through the half mirror 112 and travels toward the eye 302 of the user 300.
- The convex lens 114 is disposed on the opposite side of the half mirror 112 from the image display element 108. In other words, when the user 300 wears the head mounted display 100, the convex lens 114 is disposed between the half mirror 112 and the eye 302 of the user 300.
- The convex lens 114 condenses the image display light 120 that passes through the half mirror 112. The convex lens 114 thus functions as an image enlargement unit that enlarges the image generated by the image display element 108 and presents it to the user 300.
- The convex lens 114 may be a lens group configured by combining various lenses.
- The image display system 130 of the head mounted display 100 includes two image display elements 108 and can independently generate the image to be presented to the right eye of the user 300 and the image to be presented to the left eye. The head mounted display 100 according to the embodiment can therefore present a right-eye parallax image and a left-eye parallax image to the right eye and the left eye of the user 300, respectively. The head mounted display 100 according to the embodiment can thereby present stereoscopic video with a sense of depth to the user 300.
- As described above, the light source of the image display system 130 emits red, green, and blue light, which are visible light, and near-infrared light, which is invisible light, in a time-division manner. When the near-infrared light emitted from the light source is reflected by the micromirrors of the DMD, part of it passes through the half mirror 112 and is incident on the eye 302 of the user 300. Part of the near-infrared light incident on the eye 302 of the user 300 is reflected by the cornea and reaches the half mirror 112 again.
- The camera 116 includes a filter that blocks visible light, and images the near-infrared light reflected by the half mirror 112. That is, the camera 116 is a near-infrared camera that captures the near-infrared light emitted from the light source of the image display system 130 and reflected by the eye of the user 300.
- The image output unit 118 outputs the image captured by the camera 116 to a gaze detection unit (not shown) that detects the gaze direction of the user 300. Specifically, the image output unit 118 transmits the image captured by the camera 116 to the video playback device 200.
- The gaze detection unit is realized by a gaze detection program executed by the CPU (Central Processing Unit) of the video playback device 200.
- Alternatively, the CPU of the head mounted display 100 may execute the program that realizes the gaze detection unit.
- The image captured by the camera 116 contains both the bright spots formed by the near-infrared light reflected by the eye 302 of the user 300 and an image of the eye 302 of the user 300 as observed in the near-infrared wavelength band.
- The gaze detection unit can detect the gaze direction of the user 300 using a known algorithm that takes the bright spots of near-infrared light in the image captured by the camera 116 as reference positions and uses the position of the pupil of the user 300 relative to those reference positions.
- Irradiation of near-infrared light in the image display system 130 according to the embodiment will now be described in more detail.
- FIGS. 3A and 3B are diagrams illustrating a gaze detection technique that uses bright spots of invisible light in the eye 302 of the user 300 as reference positions. More specifically, FIG. 3A shows an image captured by the camera 116 while the line of sight of the user 300 faces the front, and FIG. 3B shows an image captured by the camera 116 while the line of sight of the user 300 is directed away from the front.
- As shown in FIG. 3A, a first bright spot 124a, a second bright spot 124b, a third bright spot 124c, and a fourth bright spot 124d appear in the eye 302 of the user 300.
- The first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d are each bright spots formed by near-infrared light that is emitted from the light source and reflected by the eye 302 of the user 300.
- When the first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d need not be distinguished, they are referred to simply as "bright spots 124".
- The first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d are corneal reflections that appear near the boundary of the iris when the user 300 faces the front.
- The bright spots 124 are formed at the same positions as long as the housing 150 is fixed to the head of the user 300 and the relative position between the eye 302 of the user 300 and the image display element 108 does not change. The bright spots 124 can therefore be regarded as fixed points, or landmarks, in the eye 302 of the user 300.
- As shown in FIG. 3A, when the user 300 faces the front, the pupil center 304 of the user 300 coincides with the center of the first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d.
- The gaze detection unit can acquire the position of the pupil center 304 relative to the bright spots 124 by analyzing the image acquired from the camera 116. The gaze detection unit can thereby detect the gaze direction of the user 300.
- Detection of the bright spots 124 and the pupil center 304 in the image acquired from the camera 116 can be realized using known image processing techniques such as edge extraction and the Hough transform.
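The relative-position idea can be sketched as follows. This is a hypothetical illustration only: the centroid of the bright spots stands in for the reference position, and all coordinates are invented; the patent itself defers to known algorithms for the actual detection.

```python
# Hypothetical sketch: read the gaze offset as the displacement of the pupil
# center from the centroid of the detected bright spots (the fixed landmarks).
def gaze_offset(bright_spots, pupil_center):
    """Pupil displacement (dx, dy) from the bright-spot centroid, in pixels."""
    cx = sum(x for x, _ in bright_spots) / len(bright_spots)
    cy = sum(y for _, y in bright_spots) / len(bright_spots)
    return pupil_center[0] - cx, pupil_center[1] - cy

# Four corneal reflections around the iris (illustrative coordinates).
spots = [(90, 90), (110, 90), (90, 110), (110, 110)]
front = gaze_offset(spots, (100, 100))   # pupil at the centroid: facing front
shifted = gaze_offset(spots, (92, 100))  # pupil displaced: gaze off-center
```

Mapping the pixel offset to an angular gaze direction would require a per-user calibration, as the description later mentions.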
- The number of bright spots 124 is not limited to four; at least one is sufficient. The greater the number of bright spots 124, however, the more robust the detection of the gaze direction of the user 300 becomes. This is because even if near-infrared light traveling toward the eye 302 of the user 300 from one position on the image display element 108 is blocked by some obstruction such as the eyelids or eyelashes of the user 300, near-infrared light incident from another position may still reach the eye 302 of the user 300.
- Accordingly, the image control unit 110 controls the plurality of micromirrors of the DMD so that near-infrared light corresponding to at least one pixel enters the eye 302 of the user 300 while the light source is emitting near-infrared light.
- FIGS. 3A and 3B show an example in which the bright spots 124 are circular.
- In FIG. 3, the first bright spot 124a, the third bright spot 124c, and the fourth bright spot 124d lie outside the iris.
- The shape of a bright spot 124 formed at a location outside the iris may actually be distorted.
- For convenience, however, the bright spots 124 are illustrated as circular.
- FIG. 4 is a timing chart schematically showing a micromirror control pattern executed by the image control unit 110 according to the embodiment.
- As described above, the filter switching unit 106 cyclically switches among the red filter R, the green filter G, the blue filter B, and the near-infrared filter IR. As shown in FIG. 4, the light incident on the DMD serving as the image display element 108 therefore also cycles among red light, green light, blue light, and near-infrared light.
- In FIG. 4, red light is incident on the image display element 108 from time T1 to time T2, and green light from time T2 to time T3.
- Blue light is incident on the image display element 108 from time T3 to time T4, and near-infrared light from time T4 to time T5.
- With the period D from time T1 to time T5 as one cycle, red light, green light, blue light, and near-infrared light are incident on the image display element 108 in sequence, switching over time.
- In synchronization with the timing at which near-infrared light is incident on the image display element 108, the image control unit 110 turns on the micromirrors corresponding to the bright spots 124 that serve as reference points for gaze detection.
- That is, the micromirrors corresponding to the bright spots 124 are turned on from time T4 to time T5, while near-infrared light is incident on the image display element 108. This synchronization can be realized, for example, by having the image control unit 110 acquire the drive signal of the filter switching unit 106.
- Here, the "micromirrors corresponding to the bright spots 124" are, for example, the micromirrors corresponding to the first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d shown in FIG. 3. Which of the plurality of micromirrors included in the DMD are set as the micromirrors corresponding to the bright spots 124 may be determined, for example, by having the user 300 perform calibration before using the head mounted display 100.
- The micromirrors corresponding to the bright spots 124 are also used to form the image display light. Therefore, during the period from time T1 to time T4, when visible light is incident on the image display element 108, the image control unit 110 controls the on/off state of the micromirrors corresponding to the bright spots 124 based on the video signal. In this way, the image control unit 110 generates image display light based on the video signal while visible light is incident on the image display element 108, and forms the bright spots 124 on the eye 302 of the user 300 while near-infrared light is incident.
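The two-mode mirror control described above — video-driven during the visible-light windows, bright-spot-driven during the IR window — can be sketched as a single decision function. The mirror coordinates and the calibration set below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the synchronized mirror control: during R/G/B windows a
# mirror follows the video signal; during the IR window only the mirrors chosen
# for the bright spots 124 are turned on. Coordinates are invented examples.
BRIGHT_SPOT_MIRRORS = {(10, 10), (10, 50), (50, 10), (50, 50)}  # set by calibration

def mirror_state(mirror, active_filter, video_on):
    """Return True if the micromirror should be in the on state."""
    if active_filter == "IR":
        # IR window (T4..T5): only bright-spot mirrors reflect toward the eye.
        return mirror in BRIGHT_SPOT_MIRRORS
    # Visible windows (T1..T4): every mirror is driven by the video signal.
    return video_on
```

The same mirror can thus serve both roles, which is why calibration only has to choose which mirrors form the bright spots, not reserve them.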
- In this way, the head mounted display 100 can irradiate the near-infrared light that forms the bright spots 124 from in front of the eye 302 of the user 300. This suppresses blocking of the near-infrared light entering the eye 302 of the user 300 by, for example, the eyelashes or eyelids of the user 300. As a result, the robustness of the gaze detection of the user 300 performed by the gaze detection unit can be improved.
- As described above, the image control unit 110 controls the micromirrors of the DMD so that four bright spots 124, caused by corneal reflection of near-infrared light, appear on the eye 302 of the user 300. It is therefore convenient for the gaze detection performed by the gaze detection unit if each of the plurality of bright spots 124 can be identified in the image captured by the camera 116, because it becomes easy to determine which of the bright spots 124 appearing on the eye 302 of the user 300 is which.
- For this purpose, the image control unit 110 may control the micromirrors corresponding to the plurality of bright spots 124 so that they are turned on sequentially.
- The image control unit 110 may also control the micromirrors corresponding to the plurality of bright spots 124 so that the plurality of bright spots 124 appearing on the eye 302 of the user 300 have different shapes.
- FIGS. 5A and 5B are diagrams showing examples of the bright spots 124 that appear in the eye 302 of the user 300.
- FIG. 5A is a diagram illustrating an example in which the plurality of bright spots 124 appear on the eye 302 of the user 300 sequentially, in a time-division manner.
- FIG. 5B is a diagram showing an example in which the plurality of bright spots 124 have different shapes.
- The example shown in FIG. 5A corresponds to assigning each of the plurality of bright spots 124 a unique identifier using temporal variation.
- The example shown in FIG. 5B corresponds to assigning each of the plurality of bright spots 124 a unique identifier using spatial variation.
- In the example shown in FIG. 5A, the image control unit 110 turns on the micromirror corresponding to the first bright spot 124a at one timing at which near-infrared light is incident on the image display element 108, while keeping the micromirrors corresponding to the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d turned off. As shown in FIG. 4, near-infrared light is incident on the image display element 108 periodically, with incidence start times at T4, T8, T12, and T16, once per period D described above.
- That is, the image control unit 110 controls the micromirror corresponding to the first bright spot 124a so that the first bright spot 124a appears on the eye 302 of the user 300 at time T4.
- At time T8, the micromirror corresponding to the second bright spot 124b is controlled so that the second bright spot 124b appears on the eye 302 of the user 300.
- Similarly, the image control unit 110 controls the micromirrors so that the third bright spot 124c and the fourth bright spot 124d appear on the eye 302 of the user 300 at time T12 and time T16, respectively.
- In this way, the gaze detection unit can uniquely identify each of the plurality of bright spots 124 by its appearance timing.
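Under the time-division scheme above, identifying a spot reduces to knowing which IR window it appeared in. A minimal sketch, assuming one spot per window in the fixed order 124a through 124d; the identifier strings are invented labels:

```python
# Hypothetical sketch of the temporal identifiers: with one bright spot lit per
# IR window (starting at T4, T8, T12, T16, ...), the window index names the spot.
SPOT_IDS = ["124a", "124b", "124c", "124d"]

def spot_for_window(window_index):
    """Bright spot lit in the given IR window (0-based), cycling every four."""
    return SPOT_IDS[window_index % len(SPOT_IDS)]
```

A detector that timestamps each captured frame against the filter drive signal can therefore label every observed reflection without examining its shape.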
- In the example shown in FIG. 5B, the first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d appear simultaneously on the eye 302 of the user 300.
- However, the shapes of the first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d differ from one another.
- In the example shown in FIG. 5B, the first bright spot 124a is circular, the second bright spot 124b is X-shaped, the third bright spot 124c is triangular, and the fourth bright spot 124d is quadrangular.
- The image control unit 110 controls the micromirrors so that these shapes appear on the eye 302 of the user 300.
- In this case, a plurality of micromirrors, not just one, correspond to each bright spot 124.
- The shapes of the bright spots 124 given here are examples; any shapes may be used as long as they differ from one another.
- In this way, the gaze detection unit can uniquely identify each of the plurality of bright spots 124 by its shape. This is advantageous in that the temporal resolution of identifying the bright spots 124 can be improved compared with identifying them by their appearance timing.
- As described above, the light emitted from the white light source 102 passes through each filter included in the filter group 104, so that visible light and invisible light are emitted in a time-division manner with a fixed period.
- The time during which visible light is emitted from the light source of the image display system 130 within one period of the time division affects the brightness of the video presented to the user 300. That is, the longer the time during which visible light is emitted from the light source of the image display system 130 within one period, the brighter the video presented to the user 300 becomes.
- Conversely, the shorter the time during which visible light is emitted within one period, that is, the longer the time during which invisible light is emitted, the clearer the bright spots 124 in the image captured by the camera 116 become.
- The time during which visible light is emitted from the light source of the image display system 130 within one period is determined by the area of the filters that transmit visible light in the color wheel that realizes the filter group 104.
- Of these, the red filter R, the green filter G, and the blue filter B are filters that transmit visible light.
- The near-infrared filter IR is a filter that transmits invisible light.
- The area of the near-infrared filter IR may therefore differ from the areas of the other filters.
- For example, a bright image can be presented to the user 300 by making the area of the near-infrared filter IR smaller than the areas of the other filters.
- Conversely, by making the area of the near-infrared filter IR larger, the bright spots 124 can be made clearer, and as a result the robustness of the gaze detection of the user 300 can be improved.
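The area-to-time relationship described above can be sketched directly: each filter's angular share of the wheel equals its share of one rotation period. The angles below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the color-wheel trade-off: a filter's angular extent
# determines how long its light is transmitted within one rotation.
def transmission_times(angles_deg, period):
    """Per-filter transmission time within one rotation, from angular extents."""
    total = sum(angles_deg.values())
    return {name: period * a / total for name, a in angles_deg.items()}

# Shrinking the IR sector lengthens the visible windows, brightening the image;
# enlarging it would instead sharpen the bright spots seen by the camera.
times = transmission_times({"R": 110, "G": 110, "B": 110, "IR": 30}, period=1.0)
visible = times["R"] + times["G"] + times["B"]
```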
- The operation of the video system 1 configured as described above is as follows. First, the user 300 mounts the head mounted display 100 using the wearing tool 160 so that the housing 150 of the head mounted display 100 is positioned in front of the eye 302 of the user 300. The user 300 then executes calibration for determining the bright spots 124 so that the bright spots 124 appear near the iris of the eye 302 or at the boundary of the iris.
- The image control unit 110 turns on the micromirrors corresponding to the bright spots 124 in synchronization with the timing at which the light source of the image display system 130 emits near-infrared light.
- The camera 116 acquires a near-infrared image of the eye 302 of the user 300, including the near-infrared light reflected by the cornea of the eye 302 of the user 300.
- The bright spots derived from the near-infrared light reflected by the cornea of the eye 302 of the user 300 are the bright spots 124 described above.
- The image output unit 118 outputs the image acquired by the camera 116 to the video playback device 200.
- The gaze detection unit in the video playback device 200 detects the direction of the line of sight of the user 300 by analyzing the image acquired by the camera 116.
- The gaze direction of the user 300 detected by the gaze detection unit is sent, for example, to an operating system that comprehensively controls the various functions of the video playback device 200, and is used as an input interface for operating the head mounted display 100.
- As described above, the gaze direction of the user 300 wearing the head mounted display 100 can be detected.
- In particular, the head mounted display 100 can make near-infrared light incident from in front of the eye 302 of the user 300 wearing the head mounted display 100.
- Near-infrared light can therefore be stably incident on the eye 302 of the user 300. This makes it possible to improve the stability of the gaze detection of the user 300, which takes the bright spots 124 derived from the near-infrared light reflected by the eye 302 of the user 300 as reference points.
- In addition, the head mounted display 100 incorporates the light source that emits invisible light, such as near-infrared light, inside the housing 150 of the head mounted display 100. Therefore, even though the housing 150 of the head mounted display 100 covers the eyes 302 of the user 300, the eyes 302 of the user 300 can be irradiated with invisible light.
- in the above description, the case where the light source of the image display system 130 is the white light source 102 has been described.
- the light source of the image display system 130 may be formed as an aggregate of a plurality of light sources having different wavelengths.
- FIG. 6 is a diagram schematically illustrating an optical configuration of the image display system 131 accommodated in the housing 150 according to the first modification.
- the description overlapping with the description of the image display system 130 according to the embodiment will be omitted or simplified as appropriate.
- the image display system 131 according to the first modification includes an image display element 108, an image control unit 110, a half mirror 112, a convex lens 114, a camera 116, and an image output unit 118.
- the image display system 131 according to the first modification does not include the white light source 102, the filter group 104, and the filter switching unit 106. Instead, the image display system 131 according to the first modification includes a light source group 103.
- the light source group 103 is an aggregate of a plurality of light sources having different wavelengths. More specifically, the light source group 103 includes a red light source 103a, a green light source 103b, a blue light source 103c, and a near-infrared light source 103d.
- the red light source 103a, the green light source 103b, the blue light source 103c, and the near-infrared light source 103d can be realized using, for example, an LED light source.
- the red light source 103a is a red LED that emits red light, the green light source 103b is a green LED that emits green light, the blue light source 103c is a blue LED that emits blue light, and the near-infrared light source 103d can be realized using an infrared LED that emits near-infrared light.
- since the light source group 103 can irradiate light of each wavelength independently, the filter group 104 and the filter switching unit 106 are not required.
- the light source group 103 may cause each LED to emit light according to the timing chart shown in FIG.
- alternatively, the light source group 103 may be configured so that the period for irradiating visible light and the period for irradiating invisible light differ in length within one time-division cycle.
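- as an illustration of such a time-division schedule, the sketch below cycles the LEDs of the light source group 103 with visible slots longer than the invisible slot. The slot durations and function names are hypothetical assumptions; the patent only states that the visible and invisible periods may differ within one cycle.

```python
# Illustrative time-division schedule for the light source group 103.
# Durations (ms) are hypothetical; the invisible near-infrared slot is
# deliberately shorter than the visible slots.
CYCLE = [
    ("red",           5.0),
    ("green",         5.0),
    ("blue",          5.0),
    ("near_infrared", 1.0),
]

def active_source(t_ms):
    """Return which LED is lit at time t_ms within the repeating cycle."""
    period = sum(duration for _, duration in CYCLE)
    t = t_ms % period
    for name, duration in CYCLE:
        if t < duration:
            return name
        t -= duration
    return CYCLE[-1][0]  # defensive fallback, normally unreachable
```

A shorter invisible slot keeps most of each cycle available for the visible image while still producing the bright spot 124 once per cycle for line-of-sight detection.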
- the head mounted display 100 including the image display system 131 according to the first modification has the same effect as the head mounted display 100 including the image display system 130 according to the above-described embodiment.
- since the image display system 131 according to the first modification does not have the filter switching unit 105 realized by a motor or the like, it has fewer potential causes of failure than the image display system 130 according to the embodiment.
- the head mounted display 100 can be reduced in noise, reduced in vibration, and reduced in weight.
- FIG. 7 is a diagram schematically illustrating an optical configuration of the image display system 132 accommodated in the housing 150 according to the second modification.
- the description overlapping with the description of the image display system 130 or the image display system 131 described above will be omitted or simplified as appropriate.
- the image display system 132 includes a light source group 103, an image display element 108, an image control unit 110, a convex lens 114, a camera 116, and an image output unit 118 in a casing 150.
- the timing at which the light source group 103 causes the red light source 103a, the green light source 103b, the blue light source 103c, and the near-infrared light source 103d to emit light is the same as in the timing chart shown in FIG.
- in the embodiment described above, the camera 116 captures the bright spot 124 appearing in the eye 302 of the user 300 via the half mirror 112.
- in contrast, in the second modification, the camera 116 faces the eye 302 of the user 300 and directly captures the near-infrared light reflected by the eye 302 of the user 300.
- the head mounted display 100 including the image display system 132 according to the second modification has the same effect as the head mounted display 100 including the image display system 130 according to the above-described embodiment.
- the camera 116 can capture the bright spot 124 caused by near-infrared light without using the half mirror 112. Since it is not necessary to house the half mirror 112 in the housing 150, the head mounted display 100 can be reduced in weight and size. It also contributes to cost reduction of the head mounted display 100. Furthermore, since the number of parts of the optical circuit constituting the image display system is reduced, it is possible to reduce problems such as optical axis misalignment.
- FIG. 8 is a diagram schematically illustrating an optical configuration of the image display system 133 accommodated in the housing 150 according to the third modification.
- descriptions overlapping with those of the image display system 130, the image display system 131, or the image display system 133 described above will be omitted or simplified as appropriate.
- the image display system 133 according to the third modification includes a light source group 103, a near infrared light source 103d, an image display element 108, an image control unit 110, a convex lens 114, a camera 116, and an image output unit 118.
- the light source group 103 included in the image display system 133 according to the third modification includes a red light source 103a, a green light source 103b, and a blue light source 103c, but does not include a near infrared light source 103d.
- the near-infrared light source 103d is disposed in the vicinity of the side surface of the convex lens 114.
- the near-infrared light source 103d is arranged so as to irradiate near-infrared light in a direction toward the eye 302 of the user 300.
- FIG. 8 shows an example in which two near-infrared light sources 103d are arranged above and below the convex lens 114 in the drawing.
- the number of near-infrared light sources 103d is not limited to two and may be at least one.
- when a plurality of near-infrared light sources 103d are provided, they irradiate toward different positions so that bright spots 124 are generated at different positions on the eye 302 of the user 300.
- the irradiation timing of the red light source 103a, the green light source 103b, and the blue light source 103c included in the light source group 103 and the irradiation timing of the near-infrared light source 103d are the same as the timing chart shown in FIG.
- a plurality of near-infrared light sources 103d emit light simultaneously from time T4 to time T5.
- in the third modification, the DMD micromirrors are controlled so that the near-infrared light emitted from the near-infrared light source 103d and reflected by the eye 302 of the user 300 is guided toward the camera 116.
- specifically, while the near-infrared light source 103d is emitting near-infrared light, the image control unit 110 controls the DMD micromirrors to face the optical path of the near-infrared light reflected by the eye 302 of the user 300.
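- the phase-dependent mirror control described above can be sketched as follows. This is a minimal Python illustration under assumed names; the patent specifies the behavior of the image control unit 110, not this API, and the frame/steering data are hypothetical.

```python
# Sketch of the image control unit (110) selecting DMD mirror states
# depending on the light-source phase: during visible phases the
# mirrors encode the image frame; during the near-infrared phase they
# are tilted so NIR reflected by the eye (302) reaches the camera (116).
def mirror_states(phase, image_frame, camera_steering):
    """Pick per-pixel micromirror states for the current phase.

    phase           -- "visible" or "near_infrared"
    image_frame     -- 2D grid of booleans: pixel on/off for the image
    camera_steering -- 2D grid of booleans: tilt pattern that guides
                       reflected NIR toward the camera
    """
    if phase == "near_infrared":
        return camera_steering
    return image_frame

frame = [[True, False], [False, True]]
steer = [[True, True], [True, True]]
vis = mirror_states("visible", frame, steer)
nir = mirror_states("near_infrared", frame, steer)
```

Synchronizing the mirror pattern with the light-source phase is what lets a single DMD serve both image display and eye illumination capture.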
- the camera 116 can photograph the bright spot 124 caused by the near infrared light without using the half mirror 112.
- the head mounted display 100 can be reduced in weight and size. It also contributes to cost reduction of the head mounted display 100. Furthermore, since the number of parts of the optical circuit constituting the image display system is reduced, it is possible to reduce problems such as optical axis misalignment.
- in the image display system 130 according to the embodiment described above and the image display system 131 according to the first modification, the case where the image display element 108 is realized by a DMD has been described.
- the image display element 108 is not limited to the DMD, and can be realized using other reflective devices.
- the image display element 108 may be realized by LCOS (Liquid crystal on silicon) (trademark) instead of DMD.
- as another example of temporal variation, each bright spot 124 may be identified using a frequency variation.
- for example, the near-infrared light source 103d controls the lighting frequency of the near-infrared light so that the first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d in the example shown in FIG. 5 blink. The blinking frequency is varied for each bright spot 124.
- the line-of-sight detection unit can identify each bright spot 124 by analyzing the frequency of the bright spot reflected in the video captured by the camera 116.
- alternatively, the line-of-sight detection unit may identify each bright spot by acquiring the on/off ratio of lighting in the PWM control of the near-infrared light source 103d.
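- the frequency-based identification above can be sketched in Python as follows: the per-frame intensity of one bright spot is analyzed with an FFT and matched to the blink frequency assigned to each source. The frame rate, assigned frequencies, and function names are assumptions for illustration, not part of the patent.

```python
import numpy as np

def dominant_blink_hz(intensity, fps):
    """Estimate the blink frequency of one bright spot from its
    per-frame intensity samples via the largest FFT peak."""
    x = np.asarray(intensity, dtype=float)
    x = x - x.mean()                      # drop the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

def identify_spot(intensity, fps, assigned_hz, tol=1.0):
    """Match a spot to the light source (e.g. 124a-124d) whose assigned
    blink frequency is closest, within tol Hz; None if nothing matches."""
    f = dominant_blink_hz(intensity, fps)
    name, hz = min(assigned_hz.items(), key=lambda kv: abs(kv[1] - f))
    return name if abs(hz - f) <= tol else None

# Example: a 10 Hz square-wave blink sampled at a hypothetical 120 fps.
fps = 120
t = np.arange(fps)                        # one second of frames
blink = (np.sin(2 * np.pi * 10 * t / fps) > 0).astype(float)
spot = identify_spot(blink, fps, {"124a": 5, "124b": 10, "124c": 20})
```

Note the camera frame rate bounds the usable blink frequencies (they must stay below fps/2 to be resolvable), which is one reason a PWM duty-ratio scheme is offered as an alternative.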
- the video system 1 has been described above based on the embodiment and several modifications. A new modification obtained by combining the embodiment with any configurations of the respective modifications is also included in the modifications of the present invention.
- the position of the near-infrared light source 103d in the image display system 133 according to the third modification may be set as the position of the near-infrared light source 103d in the image display system 132 according to the second modification.
- for example, a reflective region 115 may be provided on the convex lens 114, and the near-infrared light source 103d may be arranged so that near-infrared light is incident toward the reflective region 115.
- 1 video system, 100 head mounted display, 102 white light source, 103 light source group, 103a red light source, 103b green light source, 103c blue light source, 103d near-infrared light source, 104 filter group, 105, 106 filter switching unit, 108 image display element, 110 image control unit, 112 half mirror, 114 convex lens, 115 reflective region, 116 camera, 118 image output unit, 120 image display light, 124 bright spot, 130, 131, 132, 133 image display system, 150 housing, 160 wearing tool, 170 headphones, 200 video reproduction device, 300 user, 302 eye, 304 pupil center.
- This invention can be used for a head mounted display.
Description
In the above description, the case where the light source of the image display system 130 is the white light source 102 has been described. Alternatively, the light source of the image display system 130 may be formed as an aggregate of a plurality of light sources having different wavelengths.
In the above description, the case where the half mirror 112 is used to capture the invisible light reflected by the eye 302 of the user 300 has been described. However, the half mirror 112 is not an essential component and may be omitted.
FIG. 8 is a diagram schematically illustrating the optical configuration of the image display system 133 accommodated in the housing 150 according to the third modification. In the following, descriptions overlapping with those of the image display system 130, the image display system 131, or the image display system 133 described above will be omitted or simplified as appropriate.
In the image display system 130 according to the embodiment described above and the image display system 131 according to the first modification, the case where the image display element 108 is realized by a DMD has been described. The image display element 108 is not limited to a DMD and can also be realized using other reflective devices. For example, the image display element 108 may be realized by LCOS (Liquid crystal on silicon) (trademark) instead of a DMD.
In the description above with reference to FIGS. 5(a)-(b), the case where a temporal variation or a spatial variation is set for each bright spot 124 in order to identify the plurality of bright spots 124 has been described. As another example of temporal variation, each bright spot 124 may be identified using a frequency variation. For example, the near-infrared light source 103d controls the lighting frequency of the near-infrared light so that the first bright spot 124a, the second bright spot 124b, the third bright spot 124c, and the fourth bright spot 124d in the example shown in FIG. 5 blink. The blinking frequency is varied for each bright spot 124. The line-of-sight detection unit can identify each bright spot 124 by analyzing the frequency of the bright spots appearing in the video captured by the camera 116. Alternatively, the line-of-sight detection unit may identify each bright spot by acquiring the on/off ratio of lighting in the PWM control of the near-infrared light source 103d.
Claims (7)
- 1. A head mounted display used by being worn on a user's head, comprising:
a light source capable of irradiating invisible light;
a camera that images the invisible light irradiated from the light source and reflected by the user's eye;
an image output unit that outputs the image captured by the camera to a line-of-sight detection unit that detects the user's line-of-sight direction; and
a housing that accommodates the light source, the camera, and the image output unit.
- 2. The head mounted display according to claim 1, wherein the light source is also capable of irradiating visible light,
the head mounted display further comprises an image display element that generates image display light using the visible light irradiated from the light source, and
the camera images the invisible light irradiated from the light source and reflected by the user's eye via the image display element.
- 3. The head mounted display according to claim 2, further comprising a half mirror that is disposed between the image display element and the user's eye when the user wears the head mounted display and that reflects invisible light,
wherein the camera images the invisible light reflected by the user's eye and the half mirror.
- 4. The head mounted display according to claim 2 or 3, wherein the light source irradiates visible light and invisible light in a time-division manner.
- 5. The head mounted display according to claim 4, wherein the image display element is a DMD (Digital Mirror Device) comprising a plurality of micromirrors in which one micromirror corresponds to one pixel,
the head mounted display further comprises an image control unit that independently controls each of the plurality of micromirrors, and
the image control unit controls the plurality of micromirrors so that invisible light corresponding to at least one pixel is incident on the user's eye while the light source is irradiating invisible light.
- 6. The head mounted display according to any one of claims 2 to 5, wherein the light source comprises:
a white light source that irradiates light including light in the near-infrared wavelength region;
a filter group comprising a red filter that transmits red light, a blue filter that transmits blue light, a green filter that transmits green light, and a near-infrared filter that transmits near-infrared light, out of the light irradiated by the white light source; and
a filter switching unit that switches the filter group.
- 7. The head mounted display according to claim 6, wherein, in the filter group, the area of the near-infrared filter differs from the areas of the other filters.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/084723 WO2016103525A1 (ja) | 2014-12-27 | 2014-12-27 | ヘッドマウントディスプレイ |
US14/897,883 US20160370591A1 (en) | 2014-12-27 | 2014-12-27 | Head mounted display |
JP2015530205A JP5824697B1 (ja) | 2014-12-27 | 2014-12-27 | ヘッドマウントディスプレイ |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/084723 WO2016103525A1 (ja) | 2014-12-27 | 2014-12-27 | ヘッドマウントディスプレイ |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016103525A1 true WO2016103525A1 (ja) | 2016-06-30 |
Family
ID=54696334
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/084723 WO2016103525A1 (ja) | 2014-12-27 | 2014-12-27 | ヘッドマウントディスプレイ |
Country Status (3)
Country | Link |
---|---|
US (1) | US20160370591A1 (ja) |
JP (1) | JP5824697B1 (ja) |
WO (1) | WO2016103525A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2017026371A1 (ja) * | 2015-08-11 | 2018-03-15 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイ |
US10209773B2 (en) | 2016-04-08 | 2019-02-19 | Vizzario, Inc. | Methods and systems for obtaining, aggregating, and analyzing vision data to assess a person's vision performance |
US10299673B2 (en) | 2008-01-14 | 2019-05-28 | Vizzario, Inc. | Method and system of enhancing ganglion cell function to improve physical performance |
WO2019124864A1 (ko) * | 2017-12-19 | 2019-06-27 | 삼성전자 주식회사 | 외부 전자 장치가 경사지게 결합될 수 있는 마운트 장치 |
EP3520681A4 (en) * | 2016-11-04 | 2019-10-09 | Samsung Electronics Co., Ltd. | METHOD AND DEVICE FOR RECORDING INFORMATION BY EYE DETECTION |
WO2020166256A1 (ja) * | 2019-02-14 | 2020-08-20 | 株式会社資生堂 | 情報処理端末、プログラム、情報処理システム及び色補正方法 |
JP2020536309A (ja) * | 2017-09-28 | 2020-12-10 | アップル インコーポレイテッドApple Inc. | イベントカメラデータを使用するアイトラッキングの方法及び装置 |
JP2021507301A (ja) * | 2017-12-21 | 2021-02-22 | ビ−エイイ− システムズ パブリック リミテッド カンパニ−BAE SYSTEMS plc | 頭部装着型ディスプレイのためのアイトラッキング |
JP2022022421A (ja) * | 2017-02-27 | 2022-02-03 | アドバンスド ニュー テクノロジーズ カンパニー リミテッド | バーチャル・リアリティー・ヘッドマウント装置 |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016157486A1 (ja) * | 2015-04-01 | 2016-10-06 | フォーブ インコーポレーテッド | ヘッドマウントディスプレイ |
FR3053235A1 (fr) * | 2016-06-29 | 2018-01-05 | Institut Mines Telecom | Dispositif de mesure oculaire equipe d'un systeme optiquement aligne sur l'axe de vision de l'utilisateur |
CN106932904A (zh) | 2017-02-27 | 2017-07-07 | 阿里巴巴集团控股有限公司 | 虚拟现实头戴设备 |
CN106908951A (zh) * | 2017-02-27 | 2017-06-30 | 阿里巴巴集团控股有限公司 | 虚拟现实头戴设备 |
KR102413218B1 (ko) | 2017-03-22 | 2022-06-24 | 삼성디스플레이 주식회사 | 헤드 마운티드 디스플레이 장치 |
JP6953247B2 (ja) | 2017-09-08 | 2021-10-27 | ラピスセミコンダクタ株式会社 | ゴーグル型表示装置、視線検出方法及び視線検出システム |
GB2569600B (en) * | 2017-12-21 | 2023-02-08 | Bae Systems Plc | Eye tracking for head-worn display |
US10948729B2 (en) | 2019-04-16 | 2021-03-16 | Facebook Technologies, Llc | Keep-out zone for in-field light sources of a head mounted display |
US10877268B2 (en) * | 2019-04-16 | 2020-12-29 | Facebook Technologies, Llc | Active control of in-field light sources of a head mounted display |
CN110381244B (zh) * | 2019-08-26 | 2021-02-02 | 浙江大华技术股份有限公司 | 一种摄像头及低照度下提升图像质量的方法 |
FR3115118B1 (fr) * | 2020-10-09 | 2022-10-28 | Commissariat Energie Atomique | Système de vision en réalité virtuelle ou augmentée avec capteur d’image de l’œil |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001264685A (ja) * | 2000-03-23 | 2001-09-26 | Canon Inc | 画像表示装置および画像表示システム |
JP2002328330A (ja) * | 2001-04-27 | 2002-11-15 | Sony Corp | 映像表示装置 |
JP2009122550A (ja) * | 2007-11-16 | 2009-06-04 | Panasonic Electric Works Co Ltd | 網膜投影ディスプレイ装置 |
US20110109880A1 (en) * | 2006-01-26 | 2011-05-12 | Ville Nummela | Eye Tracker Device |
JP2014132305A (ja) * | 2013-01-07 | 2014-07-17 | Seiko Epson Corp | 表示装置、および、表示装置の制御方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6544193B2 (en) * | 1996-09-04 | 2003-04-08 | Marcio Marc Abreu | Noninvasive measurement of chemical substances |
US20090134332A1 (en) * | 2007-11-27 | 2009-05-28 | Thompson Jason R | Infrared Encoded Objects and Controls for Display Systems |
KR101931406B1 (ko) * | 2012-01-24 | 2018-12-20 | 더 아리조나 보드 오브 리전츠 온 비핼프 오브 더 유니버시티 오브 아리조나 | 컴팩트한 시선추적 기능의 헤드 탑재형 디스플레이 |
2014
- 2014-12-27 JP JP2015530205A patent/JP5824697B1/ja not_active Expired - Fee Related
- 2014-12-27 WO PCT/JP2014/084723 patent/WO2016103525A1/ja active Application Filing
- 2014-12-27 US US14/897,883 patent/US20160370591A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001264685A (ja) * | 2000-03-23 | 2001-09-26 | Canon Inc | 画像表示装置および画像表示システム |
JP2002328330A (ja) * | 2001-04-27 | 2002-11-15 | Sony Corp | 映像表示装置 |
US20110109880A1 (en) * | 2006-01-26 | 2011-05-12 | Ville Nummela | Eye Tracker Device |
JP2009122550A (ja) * | 2007-11-16 | 2009-06-04 | Panasonic Electric Works Co Ltd | 網膜投影ディスプレイ装置 |
JP2014132305A (ja) * | 2013-01-07 | 2014-07-17 | Seiko Epson Corp | 表示装置、および、表示装置の制御方法 |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11096570B2 (en) | 2008-01-14 | 2021-08-24 | Vizzario, Inc. | Method and system of enhancing ganglion cell function to improve physical performance |
US10299673B2 (en) | 2008-01-14 | 2019-05-28 | Vizzario, Inc. | Method and system of enhancing ganglion cell function to improve physical performance |
EP3337164A4 (en) * | 2015-08-11 | 2019-04-10 | Sony Interactive Entertainment Inc. | HEAD-MOUNTED DISPLAY |
JPWO2017026371A1 (ja) * | 2015-08-11 | 2018-03-15 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイ |
US10635901B2 (en) | 2015-08-11 | 2020-04-28 | Sony Interactive Entertainment Inc. | Head-mounted display |
US11126840B2 (en) | 2015-08-11 | 2021-09-21 | Sony Interactive Entertainment Inc. | Head-mounted display |
US10209773B2 (en) | 2016-04-08 | 2019-02-19 | Vizzario, Inc. | Methods and systems for obtaining, aggregating, and analyzing vision data to assess a person's vision performance |
US11561614B2 (en) | 2016-04-08 | 2023-01-24 | Sphairos, Inc. | Methods and systems for obtaining, aggregating, and analyzing vision data to assess a person's vision performance |
EP3520681A4 (en) * | 2016-11-04 | 2019-10-09 | Samsung Electronics Co., Ltd. | METHOD AND DEVICE FOR RECORDING INFORMATION BY EYE DETECTION |
US11307646B2 (en) | 2016-11-04 | 2022-04-19 | Samsung Electronics Co., Ltd. | Method and apparatus for acquiring information by capturing eye |
JP2022022421A (ja) * | 2017-02-27 | 2022-02-03 | アドバンスド ニュー テクノロジーズ カンパニー リミテッド | バーチャル・リアリティー・ヘッドマウント装置 |
JP7369173B2 (ja) | 2017-02-27 | 2023-10-25 | アドバンスド ニュー テクノロジーズ カンパニー リミテッド | バーチャル・リアリティー・ヘッドマウント装置 |
US11150469B2 (en) | 2017-09-28 | 2021-10-19 | Apple Inc. | Method and device for eye tracking using event camera data |
JP7008131B2 (ja) | 2017-09-28 | 2022-01-25 | アップル インコーポレイテッド | イベントカメラデータを使用するアイトラッキングの方法及び装置 |
US11474348B2 (en) | 2017-09-28 | 2022-10-18 | Apple Inc. | Method and device for eye tracking using event camera data |
JP2020536309A (ja) * | 2017-09-28 | 2020-12-10 | アップル インコーポレイテッドApple Inc. | イベントカメラデータを使用するアイトラッキングの方法及び装置 |
WO2019124864A1 (ko) * | 2017-12-19 | 2019-06-27 | 삼성전자 주식회사 | 외부 전자 장치가 경사지게 결합될 수 있는 마운트 장치 |
US11561404B2 (en) | 2017-12-19 | 2023-01-24 | Samsung Electronics Co., Ltd. | Mount device to which an external electronic device can be coupled so as to slope |
JP2021507301A (ja) * | 2017-12-21 | 2021-02-22 | ビ−エイイ− システムズ パブリック リミテッド カンパニ−BAE SYSTEMS plc | 頭部装着型ディスプレイのためのアイトラッキング |
JP7187561B2 (ja) | 2017-12-21 | 2022-12-12 | ビ-エイイ- システムズ パブリック リミテッド カンパニ- | 頭部装着型ディスプレイのためのアイトラッキング |
WO2020166256A1 (ja) * | 2019-02-14 | 2020-08-20 | 株式会社資生堂 | 情報処理端末、プログラム、情報処理システム及び色補正方法 |
JP7335283B2 (ja) | 2019-02-14 | 2023-08-29 | 株式会社 資生堂 | 情報処理端末、プログラム、情報処理システム及び色補正方法 |
Also Published As
Publication number | Publication date |
---|---|
JP5824697B1 (ja) | 2015-11-25 |
JPWO2016103525A1 (ja) | 2017-04-27 |
US20160370591A1 (en) | 2016-12-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5824697B1 (ja) | ヘッドマウントディスプレイ | |
US9967487B2 (en) | Preparation of image capture device in response to pre-image-capture signal | |
US9411160B2 (en) | Head mounted display, control method for head mounted display, and image display system | |
JP6112583B2 (ja) | ヘッドマウントディスプレイ | |
US9906781B2 (en) | Head mounted display device and control method for head mounted display device | |
US9360672B2 (en) | Head mounted display device and control method for head mounted display device | |
US9696801B2 (en) | Eye-controlled user interface to control an electronic device | |
US20150261293A1 (en) | Remote device control via gaze detection | |
US20180007258A1 (en) | External imaging system, external imaging method, external imaging program | |
JP6241086B2 (ja) | 表示装置、投影装置、表示補助装置及びシステム | |
US20160165220A1 (en) | Display apparatus and method of controlling display apparatus | |
US20210109363A1 (en) | Apparatus and methods to render 3d digital content having multiple views | |
US20160216778A1 (en) | Interactive projector and operation method thereof for determining depth information of object | |
JP7218376B2 (ja) | 視線追跡方法および装置 | |
JP6075083B2 (ja) | 頭部装着型表示装置および頭部装着型表示装置の制御方法 | |
JP2016127587A (ja) | ヘッドマウントディスプレイ | |
JP6883256B2 (ja) | 投影装置 | |
US20160077340A1 (en) | Head mounted display device, control method for head mounted display device, and computer program | |
JP2014011654A (ja) | 画像解析装置、画像解析方法、および画像解析システム | |
WO2022079584A1 (ja) | 視線検出装置、視線検出プログラム及びヘッドマウントディスプレイ | |
JP6304415B2 (ja) | 頭部装着型表示装置、および、頭部装着型表示装置の制御方法 | |
WO2018158921A1 (ja) | 画像表示装置、及び、眼の挙動検出方法 | |
JP2013121068A (ja) | 投写型映像表示装置 | |
KR101142176B1 (ko) | 입체 영상 제공 장치 및 그 방법 | |
JP2021068296A (ja) | 情報処理装置、ヘッドマウントディスプレイ、およびユーザ操作処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2015530205 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14897883 Country of ref document: US |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14909126 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 14909126 Country of ref document: EP Kind code of ref document: A1 |