US20180067306A1 - Head mounted display - Google Patents
- Publication number: US20180067306A1 (application US15/563,444)
- Authority: US (United States)
- Prior art keywords: user, image, pixel, sub, head
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G02B27/02 — Viewing or reading apparatus
- G02B27/0093 — Optical systems with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
- G02B17/086 — Catadioptric systems made of a single block of optical material, e.g. solid Mangin mirrors
- G02B19/0028 — Condensers with refractive and reflective surfaces, e.g. non-imaging catadioptric systems
- G02B19/009 — Condensers for use with infrared radiation
- G02B27/017 — Head-up displays, head mounted
- G02B27/0172 — Head mounted, characterised by optical features
- G02B27/0955 — Beam shaping using lenses
- G02B27/34 — Fiducial marks and measuring scales within the optical system, illuminated
- G02B2027/0138 — Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014 — Head-up displays comprising information/image processing systems
- G06F3/013 — Eye tracking input arrangements
- G06F3/0346 — Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers
- G06T7/215 — Motion-based segmentation
- G06T2207/30196 — Human being; person
- G06V10/145 — Illumination specially adapted for pattern recognition, e.g. using gratings
- G06V40/19 — Eye characteristics, sensors therefor
- G06V40/193 — Eye characteristics, preprocessing; feature extraction
- H04N13/044
- H04N13/20 — Image signal generators
- H04N13/332 — Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344 — Displays with head-mounted left-right displays
- H04N13/383 — Image reproducers using viewer tracking with gaze detection
- H04N17/002 — Diagnosis, testing or measuring for television cameras
- H04N23/55 — Optical parts specially adapted for electronic image sensors; mounting thereof
- H04N5/33 — Transforming infrared radiation
Abstract
A head mounted display is used in a state of being mounted on a user's head and includes a convex lens disposed at a position facing the user's cornea when the head mounted display is mounted. An infrared light source emits infrared light toward the convex lens. A camera captures an image that includes the user's cornea as a subject. A housing houses the convex lens, the infrared light source, and the camera. The convex lens is provided with a plurality of reflection regions that reflect infrared light inside the convex lens. The infrared light source causes a pattern of infrared light to appear on the user's cornea by emitting infrared light to each of the plurality of reflection regions provided in the convex lens.
Description
- This disclosure relates to a head mounted display.
- A technique is known in which the eyesight direction of a user is detected by emitting non-visible light, such as near-infrared light, to the user's eyes and analyzing an image of the user's eyes that includes the reflected light. Because the detected eyesight direction can be reflected on the monitor of, for example, a PC (Personal Computer) or a game console, use of the eyes as a pointing device has been realized.
- Japanese Unexamined Patent Application, First Publication No. H2-264632
- A head mounted display is an image display device that presents a three-dimensional image to a user wearing the device. Generally, the head mounted display is mounted so as to cover the visual range of the user, so the external scene is shielded from a user wearing it. When the head mounted display is used to display moving pictures, games, or the like, it is therefore difficult for the user to visually recognize an input device such as a controller.
- It is therefore convenient to use a head mounted display as a substitute for a pointing device by detecting the eyesight direction of the user wearing it. In particular, acquiring geometric information (spatial coordinates or shape) of the user's cornea while the user wears the head mounted display is useful for estimating the user's eyesight direction.
- It could therefore be helpful to provide a technique of detecting geometric information of the cornea of a user wearing a head mounted display.
- Provided is a line-of-sight detection system including a head-mounted display and a line-of-sight detection device. The head-mounted display includes an image display element that includes a plurality of pixels, each pixel including sub-pixels that emit red, green, blue, and invisible light, and that displays an image to be viewed by a user; an imaging unit that images an eye of the user wearing the head-mounted display on the basis of the invisible light emitted from the sub-pixels that emit the invisible light; and a transmission unit that transmits a captured image captured by the imaging unit. The line-of-sight detection device includes a reception unit that receives the captured image, and a line-of-sight detection unit that detects a line of sight of the eye of the user based on the captured image.
- Further, in order to resolve the problem, a head-mounted display according to an aspect of the present invention is mounted on the head of a user and used, and includes a convex lens disposed in a position facing a cornea of the user when the head-mounted display is mounted; an image display element that includes a plurality of pixels, each pixel including sub-pixels that emit red, green, blue, and invisible light, and that displays an image to be viewed by the user; a camera that captures a video including the cornea of the user as a subject; and a housing that houses the convex lens, the image display element, and the camera.
- Further, the head-mounted display may further include a control unit that selects the pixels that are to emit invisible light from among the plurality of pixels constituting the image display element and causes the selected pixels to emit light.
- Further, the control unit may change the pixels that emit the invisible light each time a predetermined time elapses.
- Further, the control unit may switch and control the light emission timing of the sub-pixels that emit the invisible light and that of the other sub-pixels.
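As a rough illustration of this time-multiplexed control, the sketch below alternates which pixels drive their invisible sub-pixel as frames elapse. The checkerboard layout, the function name, and the period parameter are illustrative assumptions, not details from this disclosure.

```python
# Hypothetical sketch of the control unit described above: every `period`
# frames, a different set of pixels is selected to drive its invisible
# (infrared) sub-pixel, while the remaining pixels show the visible image.
# The checkerboard layout and all names here are illustrative assumptions.

def select_invisible_pixels(frame_index, grid_w, grid_h, period=2):
    """Return the (x, y) pixels whose invisible sub-pixel emits in this
    frame; the active set flips each time `period` frames elapse."""
    phase = (frame_index // period) % 2
    return {(x, y)
            for y in range(grid_h)
            for x in range(grid_w)
            if (x + y) % 2 == phase}
```

With a period of two, frames 0 and 1 light one checkerboard phase and frames 2 and 3 light the complementary phase, so no pixel is permanently dedicated to infrared emission.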
- Meanwhile, any combination of the aforementioned components, and implementation of our displays in the form of methods, devices, systems, computer programs, data structures, recording media, and the like, may be considered part of this disclosure.
- It is thus possible to provide a technique of detecting geometric information of the cornea of a user wearing a head mounted display.
- FIG. 1 is a diagram schematically illustrating a general view of an example of our image system.
- FIG. 2 is a diagram schematically illustrating an optical configuration of an image display system housed by a housing.
- FIGS. 3(a) and 3(b) are schematic diagrams illustrating a reflection region.
- FIGS. 4(a) and 4(b) are diagrams schematically illustrating an example of dot patterns generated by a reflection region of a convex lens.
- FIG. 5 is a schematic diagram illustrating a relationship between a captured dot pattern and a structure of a subject.
- FIG. 6 is a diagram schematically illustrating a functional configuration of an image reproducing device.
- FIG. 7 is a schematic diagram illustrating an eyesight direction of a user.
- FIG. 8 is a schematic diagram illustrating calibration in an eyesight direction which is executed by a head mounted display.
- FIG. 9 is a diagram illustrating an example of a pixel configuration of an image display element.
- FIG. 10 is a diagram illustrating an example in which an eye of a user is directly irradiated with image light including infrared light.
- FIG. 11A and FIG. 11B illustrate an example in which line-of-sight detection is performed on the basis of an image reflected in the eye of the user.
- FIG. 1 is a diagram schematically illustrating a general view of an image system 1 according to an example. The image system 1 includes a head mounted display 100 and an image reproducing device 200. As shown in FIG. 1, the head mounted display 100 is used in a state of being mounted on the head of a user 300.
- The image reproducing device 200 generates an image displayed by the head mounted display 100. Although not limited, as an example, the image reproducing device 200 is a device capable of reproducing an image, such as a stationary game console, a portable game console, a PC, a tablet, a smartphone, a phablet, a video player, a television or the like. The image reproducing device 200 connects to the head mounted display 100 in a wireless or wired manner. In the example shown in FIG. 1, the image reproducing device 200 is wirelessly connected to the head mounted display 100. The wireless connection of the image reproducing device 200 to the head mounted display 100 can be realized using, for example, a known wireless communication technique such as WI-FI (Registered Trademark) or BLUETOOTH (Registered Trademark). Although not limited, as an example, transmission of an image between the head mounted display 100 and the image reproducing device 200 is executed according to a standard such as MIRACAST (Trademark), WIGIG (Trademark), or WHDI (Trademark).
- Meanwhile, FIG. 1 illustrates an example in which the head mounted display 100 and the image reproducing device 200 are different devices. However, the image reproducing device 200 may be built into the head mounted display 100.
- The head mounted display 100 includes a housing 150, a mounting fixture 160, and a headphone 170. The housing 150 houses an image display system, such as an image display element that presents an image to the user 300, and a wireless transmission module (not shown) such as a WI-FI module or a BLUETOOTH (Registered Trademark) module. The mounting fixture 160 mounts the head mounted display 100 on the head of the user 300 and can be realized by, for example, a belt, an elastic band or the like. When the user 300 mounts the head mounted display 100 using the mounting fixture 160, the housing 150 is disposed at a position where the eyes of the user 300 are covered. For this reason, when the user 300 mounts the head mounted display 100, the visual range of the user 300 is shielded by the housing 150.
- The headphone 170 outputs the audio of an image reproduced by the image reproducing device 200. The headphone 170 may be fixed to the head mounted display 100. Even in a state where the user 300 mounts the head mounted display 100 using the mounting fixture 160, the user can freely attach and detach the headphone 170.
- FIG. 2 is a diagram schematically illustrating an optical configuration of an image display system 130 housed by the housing 150. The image display system 130 includes a near-infrared light source 103, an image display element 108, a hot mirror 112, a convex lens 114, a camera 116, and an image output unit 118.
- The near-infrared light source 103 is a light source capable of emitting light in the near-infrared wavelength band (approximately 700 nm to 2,500 nm). Near-infrared light is non-visible light that generally cannot be observed with the naked eye of the user 300.
- The image display element 108 displays an image for presentation to the user 300. The image displayed by the image display element 108 is generated by a GPU (Graphics Processing Unit), not shown, within the image reproducing device 200. The image display element 108 can be realized using, for example, a known LCD (Liquid Crystal Display), an organic EL display (Organic Electro-Luminescent Display) or the like.
- When the user 300 mounts the head mounted display 100, the hot mirror 112 is disposed between the image display element 108 and the cornea 302 of the user 300. The hot mirror 112 has the property of transmitting visible light generated by the image display element 108 while reflecting near-infrared light.
- The convex lens 114 is disposed on the opposite side of the hot mirror 112 from the image display element 108. In other words, when the user 300 mounts the head mounted display 100, the convex lens 114 is disposed between the hot mirror 112 and the cornea 302 of the user 300. That is, when the head mounted display 100 is mounted on the user 300, the convex lens 114 is disposed at a position facing the cornea 302 of the user 300.
- The convex lens 114 condenses the image display light that passes through the hot mirror 112. For this reason, the convex lens 114 functions as an image enlargement unit that enlarges an image generated by the image display element 108 and presents the enlarged image to the user 300. Meanwhile, for convenience of description, only one convex lens 114 is shown in FIG. 2, but the convex lens 114 may be a lens group configured by combining various lenses, and may be configured such that one lens has a curvature and the other lens is a planar one-sided convex lens.
- The near-infrared light source 103 is disposed at the lateral side of the convex lens 114 and emits infrared light toward the inside of the convex lens 114. The convex lens 114 is provided with a plurality of reflection regions that reflect the infrared light inside the lens. These reflection regions can be realized by providing fine regions having different refractive indexes inside the convex lens 114, which can be done using a known laser machining technique. The reflection regions are provided at a plurality of places inside the convex lens 114.
- Near-infrared light emitted toward the inside of the convex lens 114 by the near-infrared light source 103 is reflected from the reflection regions inside the convex lens 114 and directed to the cornea 302 of the user 300. Meanwhile, since the near-infrared light is non-visible light, the user 300 is almost unable to perceive the near-infrared light reflected from the reflection regions. In addition, each reflection region is as large as, or finer than, a pixel constituting the image display element 108. For this reason, the user 300 is almost unable to perceive the reflection regions, and is able to observe the image light emitted by the image display element 108. The details of the reflection regions will be described later.
- Although not shown, the image display system 130 of the head mounted display 100 includes two image display elements 108, and can generate an image for presentation to the right eye of the user 300 and an image for presentation to the left eye independently of each other. For this reason, the head mounted display 100 can present a parallax image for the right eye and a parallax image for the left eye to the right eye and the left eye of the user 300, respectively. Thereby, the head mounted display 100 can present a stereoscopic image having a sense of depth to the user 300.
- As described above, the hot mirror 112 transmits visible light and reflects near-infrared light. Therefore, image light emitted by the image display element 108 passes through the hot mirror 112 and reaches the cornea 302 of the user 300. In addition, infrared light emitted from the near-infrared light source 103 and reflected from the reflection regions inside the convex lens 114 reaches the cornea 302 of the user 300.
- The infrared light reaching the cornea 302 of the user 300 is reflected from the cornea 302 of the user 300 and directed toward the convex lens 114 again. This infrared light passes through the convex lens 114 and is reflected from the hot mirror 112. The camera 116 includes a filter that shields visible light and captures the near-infrared light reflected from the hot mirror 112. That is, the camera 116 is a near-infrared camera that captures near-infrared light emitted from the near-infrared light source 103 and reflected from the cornea of the user 300.
- The image output unit 118 outputs the image captured by the camera 116 to an eyesight detection unit that detects the eyesight direction of the user 300, and to a cornea coordinate acquisition unit that acquires the spatial coordinates of the user's cornea. Specifically, the image output unit 118 transmits the image captured by the camera 116 to the image reproducing device 200. The eyesight detection unit and the cornea coordinate acquisition unit will be described later; they can be realized by an eyesight detecting program and a cornea coordinate acquiring program executed by a CPU (Central Processing Unit) of the image reproducing device 200. Meanwhile, when the head mounted display 100 has computing resources such as a CPU and a memory, the CPU of the head mounted display 100 may execute the program to operate the eyesight detection unit.
- Although a detailed description will be given later, the image captured by the camera 116 contains bright points of the near-infrared light reflected by the cornea 302 of the user 300, together with an image of the eye including the cornea 302 of the user 300 observed in the near-infrared wavelength band.
- In the convex lens 114, the plurality of reflection regions are formed so that the pattern of infrared light appearing on the cornea 302 of the user 300 forms structured light. "Structured light" refers to light used in a method of three-dimensional measurement of an object called the structured light method. More specifically, structured light is light emitted to cause a light pattern having a special structure to appear on the surface of an object to be measured. Various patterns can be caused to appear through structured light, including, as examples, a plurality of dot patterns arrayed in a lattice shape, stripe patterns disposed at equal intervals, and a lattice pattern. In addition, structured light may include not only single-color light but also multi-color (for example, red, green and blue) light.
- The structured light method is a known technique, and thus a detailed description thereof will not be given; the structured light formed by the reflection regions provided inside the convex lens 114 causes a pattern formed by a plurality of infrared light dots to appear in a region including the cornea 302 of the user 300.
FIGS. 3(a) and 3(b) are schematic diagrams illustratingreflection regions 120.FIG. 3(a) is a diagram illustrating when theconvex lens 114 is seen from the lateral side (outer canthus of the user 300), andFIG. 3(b) is a diagram illustrating when theconvex lens 114 is seen from the upper side (top of the head of the user 300). - As shown in
FIG. 3(a) , the near-infraredlight source 103 includes a plurality ofLEDs 104. To avoid becoming complicated, inFIG. 3(a) , only onereference numeral 104 is shown, but the rectangles of broken lines indicate theLEDs 104. TheLED 104 emits infrared light toward the inside of theconvex lens 114. - As shown in
FIG. 3(b) , a plurality ofreflection regions 120 are provided inside theconvex lens 114. To avoid becoming complicated, inFIG. 3(b) , only onereference numeral 120 is shown, but regions shown by diagonal segments in the drawing indicate thereflection regions 120. - As described above, the
reflection region 120 is a region having a different refractive index as compared to other regions in theconvex lens 114. For this reason, the infrared light incident from theLED 104 is totally reflected from thereflection region 120 and directed to thecornea 302 of theuser 300. Since thereflection region 120 is provided in a plurality of places in theconvex lens 114, as much infrared light as thereflection region 120 is directed to thecornea 302 of theuser 300. Thereby, dot patterns according to an installation shape of thereflection region 120 can be formed on thecornea 302 of theuser 300. Meanwhile, providing a region having a refractive index in theconvex lens 114 can be realized using a known laser machining technique. - As described above, the infrared light reaching the
cornea 302 of theuser 300 is reflected from thecornea 302 of theuser 300, and directed to the direction of theconvex lens 114 again. In this case, when the infrared light reaches thereflection region 120, the infrared light is reflected by thereflection region 120 and is not able to pass through theconvex lens 114. However, each of thereflection regions 120 is a narrow region, and a relative position between thereflection region 120 and thecornea 302 of theuser 300 continually changes with a change in the eyesight of theuser 300. For this reason, the probability of the infrared light reflected from thecornea 302 of theuser 300 and directed to theconvex lens 114 being reflected by thereflection region 120 is small, which does not lead to a problem. - Even when it is assumed that the infrared light reflected from the
cornea 302 of the user 300 and directed to the convex lens 114 is reflected by a reflection region 120 at a certain timing, the relative position between the reflection region 120 and the cornea 302 of the user 300 changes at another timing, and thus the infrared light is not reflected then. Therefore, by capturing the infrared light with the camera 116 over time, even when the reflected light from the cornea 302 of the user 300 is blocked by a reflection region 120 at a certain moment, the camera 116 can capture an image at another moment, which does not lead to a problem. -
FIG. 3(b) illustrates the reflection regions 120 present in a certain horizontal cross section of the convex lens 114. The reflection regions 120 are also present in other horizontal cross sections of the convex lens 114. Therefore, the infrared light emitted from the near-infrared light source 103 and reflected by the reflection regions 120 is distributed two-dimensionally on the cornea 302 of the user 300 and forms dot patterns. -
FIGS. 4(a) and 4(b) are diagrams schematically illustrating an example of dot patterns generated by the reflection regions 120 of the convex lens 114. More specifically, FIG. 4(a) is a schematic diagram illustrating dot patterns emitted from the convex lens 114. Meanwhile, in FIG. 4(a), the symbols A to H are shown for convenience to describe the arrangement of the dots, and are not dot patterns that appear in reality. - In
FIG. 4(a), black circles and white circles are lined up in the longitudinal direction at equal intervals. Meanwhile, the longitudinal direction in FIG. 4 is the direction linking the top of the user's head to the chin, and is the vertical direction when the user stands upright. Hereinafter, the dot row lined up longitudinally under the symbol A may be described as row A. The same is true of the symbols B to H. - In
FIG. 4(a), a black circle indicates that a dot (bright point) caused by infrared light appears, and a white circle indicates a position where a dot does not appear in reality. In the convex lens 114, the reflection regions are provided so that a different dot pattern appears at each different position of the cornea 302 of the user 300. In the example shown in FIG. 4(a), all the dot rows, from row A to row H, are formed as different dot patterns. Therefore, the cornea coordinate acquisition unit can uniquely identify each dot row by analyzing the dot patterns in an image captured by the camera 116. -
FIG. 4(b) is a schematic diagram illustrating dot patterns reflected from a region including the cornea 302 of the user 300, and schematically shows an image captured by the camera 116. As shown in FIG. 4(b), the dot patterns captured by the camera 116 are mainly the dot patterns reaching the ocular surface of the user 300, and the dot patterns reaching the skin of the user 300 tend not to be captured. This is because the dot patterns reaching the skin of the user 300 are diffusely reflected from the skin, and the amount of that light reaching the camera 116 is reduced. On the other hand, the dot patterns reaching the ocular surface of the user 300 are subject to reflection close to specular reflection due to the influence of tears or the like. For this reason, the amount of light from these dot patterns reaching the camera 116 is large. -
FIG. 5 is a schematic diagram illustrating the relationship between a captured dot pattern and the structure of the subject. The example shown in FIG. 5 shows a state where row D in FIG. 4(a) is captured. - As shown in
FIG. 5, the camera 116 captures the cornea 302 of the user 300 from below (that is, from the direction of the user's mouth). Generally, the human cornea has a shape protruding in the eyesight direction. For this reason, even when equally spaced dot patterns appear on the cornea 302 of the user 300, the interval between the dots in the dot pattern captured by the camera 116 changes depending on the shape of the cornea 302 of the user 300. In other words, the intervals between the dots appearing on the cornea 302 of the user 300 reflect information in the depth direction (that is, the direction in which the infrared light is emitted toward the cornea 302 of the user 300). By analyzing these intervals, the cornea coordinate acquisition unit can acquire the shape of the cornea 302 of the user 300. Meanwhile, the above is not limited to the case where the camera 116 captures the cornea 302 of the user 300 from below. The light paths of the infrared light incident on the cornea 302 of the user 300 and the infrared light reflected from it may be shifted from each other, and the camera 116 may capture, for example, the cornea 302 of the user 300 from a transverse or an upward direction. -
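The way the dot spacing encodes depth can be pictured with a toy computation: the intervals between adjacent captured dots are compared with the interval expected for a flat surface, and the deviation is the raw depth cue the cornea coordinate acquisition unit works from. A simplified sketch (the function name and the flat-surface reference are assumptions, not the patent's algorithm):

```python
def interval_deviation(dot_positions, reference_interval):
    """Per-gap deviation of captured dot spacing from the flat-surface spacing.

    dot_positions: image coordinates of one captured dot row (e.g. row D),
    in capture order. A nonzero deviation means the surface bulges toward
    or away from the camera 116 at that gap.
    """
    intervals = [b - a for a, b in zip(dot_positions, dot_positions[1:])]
    return [i - reference_interval for i in intervals]
```

For example, `interval_deviation([0, 10, 22, 30], 10)` yields `[0, 2, -2]`, flagging the two gaps where the corneal surface departs from the reference plane.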
FIG. 6 is a diagram schematically illustrating a functional configuration of the image reproducing device 200. The image reproducing device 200 includes a reception and transmission unit 210, an image generation unit 220, an eyesight detection unit 230, and a cornea coordinate acquisition unit 240. -
FIG. 6 illustrates the functional configuration for the image generation process, the eyesight detection process, and the cornea coordinate detection process performed by the image reproducing device 200; other configurations are omitted. In FIG. 6, each component described as a functional block performing various processes can be constituted, in hardware, by a CPU (Central Processing Unit), a main memory, and other LSIs (Large Scale Integrations), and is realized, in software, by programs or the like loaded into the main memory. Meanwhile, the programs may be stored in a computer-readable recording medium, or may be downloaded from a network through a communication line. It is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and are not limited to any particular one. - The reception and
transmission unit 210 executes the transmission of information between the image reproducing device 200 and the head mounted display 100. The reception and transmission unit 210 can be realized by a wireless communication module according to a standard such as MIRACAST (trademark), WIGIG (trademark), or WHDI (trademark) described above. - The
image generation unit 220 generates the image displayed on the image display element 108 of the head mounted display 100. The image generation unit 220 can be realized using, for example, the GPU or the CPU described above. - The cornea coordinate
acquisition unit 240 analyzes the intervals between the dots appearing on the cornea 302 of the user 300, and thus acquires the three-dimensional shape of the cornea 302 of the user 300. Thereby, the cornea coordinate acquisition unit 240 can also estimate the position coordinates of the cornea 302 of the user 300 in a three-dimensional coordinate system using the camera 116 as the origin. - Meanwhile, the
camera 116 may be a monocular camera, or may be a stereo camera including two or more imaging units. In the latter case, the cornea coordinate acquisition unit 240 analyzes the parallax images of the cornea 302 of the user 300 captured by the camera 116, and thus can more accurately estimate the position coordinates of the cornea 302 of the user 300 in the three-dimensional coordinate system using the camera 116 as the origin. -
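As an illustration of how a stereo pair sharpens the position estimate, the depth of a corneal point can be recovered from the horizontal disparity between the two imaging units. A minimal sketch under the standard rectified-stereo model (the focal length and baseline values are hypothetical, not from the patent):

```python
def triangulate_depth(x_left, x_right, focal_px=500.0, baseline_mm=20.0):
    """Depth (mm) of a point seen by both imaging units of a stereo camera.

    x_left / x_right: horizontal pixel coordinates of the same corneal
    feature in the left and right images. Depth = f * b / disparity.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("left x-coordinate must exceed right x-coordinate")
    return focal_px * baseline_mm / disparity
```

With the assumed values, a 10-pixel disparity corresponds to a depth of 1000 mm; real parameters would come from calibrating the stereo rig.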
FIG. 7 is a schematic diagram illustrating the eyesight direction of the user 300. As described above, by analyzing the dot patterns appearing on the cornea 302, the cornea coordinate acquisition unit 240 can acquire the shape of the cornea 302 of the user 300. Thereby, as shown in FIG. 7, the eyesight detection unit 230 can detect the peak P of the approximately hemispherical cornea 302 of the user 300. Subsequently, the eyesight detection unit 230 sets a plane 304 that is tangent to the cornea 302 at the peak P. The direction of the normal line 306 of the plane 304 at the peak P is then taken as the eyesight direction of the user 300. - Meanwhile, the
cornea 302 of the user 300 is generally aspherical rather than spherical. For this reason, with the above method, in which the cornea 302 of the user 300 is assumed to be spherical, an estimation error may occur in the eyesight direction of the user 300. Consequently, the eyesight detection unit 230 may perform calibration of the eyesight direction estimation before the user 300 starts to use the head mounted display 100. -
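The geometric method above (peak P, tangent plane 304, normal 306) can be sketched numerically under the spherical-cornea assumption that the text itself flags as approximate: fit a sphere to the reconstructed corneal points by least squares, then take the outward radius direction at the peak P as the normal. This is a minimal illustration; the function names and NumPy usage are assumptions:

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit: returns (center, radius)."""
    P = np.asarray(points, dtype=float)
    A = np.c_[2 * P, np.ones(len(P))]        # unknowns: center (a,b,c) and d
    f = (P ** 2).sum(axis=1)                 # |p|^2 = 2 c.p + (r^2 - |c|^2)
    sol, *_ = np.linalg.lstsq(A, f, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

def gaze_normal(points, peak):
    """Outward unit normal of the fitted sphere at the corneal peak P."""
    center, _ = fit_sphere(points)
    n = np.asarray(peak, dtype=float) - center
    return n / np.linalg.norm(n)
```

For a perfectly spherical surface this recovers the tangent-plane normal exactly; on a real, aspheric cornea it yields the approximate direction that the calibration described next then corrects.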
FIG. 8 is a schematic diagram illustrating the calibration of the eyesight direction executed by the eyesight detection unit 230. The eyesight detection unit 230 causes the image generation unit 220 to generate nine points Q_1 to Q_9 as shown in FIG. 8, and displays these points on the image display element 108. The eyesight detection unit 230 causes the user 300 to gaze at these points in order from the point Q_1 to the point Q_9, and detects the aforementioned normal line 306 for each. In addition, when the user 300 gazes at, for example, the point Q_1, the central coordinates of the cornea 302 of the user 300 (that is, the coordinates of the peak P described above with reference to FIG. 7) are denoted P_1. In this case, the eyesight direction of the user 300 is the direction P_1-Q_1 linking the point P_1 to the point Q_1 in FIG. 8. The eyesight detection unit 230 compares the acquired direction of the normal line 306 with the direction P_1-Q_1, and stores the error between them. - Hereinafter, similarly, the
user 300 is presented with the points Q_1 to Q_9, errors are stored for the nine directions P_1-Q_1, P_2-Q_2, . . . , P_9-Q_9, and the eyesight detection unit 230 can thus acquire a correction table for the direction of the normal line 306 obtained by calculation. By acquiring the correction table in advance through calibration and correcting the direction of the normal line 306 obtained with the aforementioned method, the eyesight detection unit 230 can realize higher-accuracy eyesight direction detection. - It is also considered that, after the
user 300 mounts the head mounted display 100 on the head and the eyesight detection unit 230 performs calibration, the relative positional relationship between the head of the user 300 and the head mounted display 100 changes. However, when the eyesight direction of the user 300 is detected from the shape of the cornea 302 of the user 300 as described above, the relative positional relationship between the head of the user 300 and the head mounted display 100 does not influence the accuracy of detection of the eyesight direction. Therefore, it is possible to realize eyesight direction detection that is robust with respect to changes in the relative positional relationship between the head of the user 300 and the head mounted display 100. - Regarding the above, a method in which the
eyesight detection unit 230 detects the eyesight direction of the user 300 using a geometric method has been described. Instead of the geometric method, the eyesight detection unit 230 may execute eyesight direction detection based on an algebraic method using the coordinate transformation described below. - In
FIG. 8, the coordinates of the points Q_1 to Q_9 in the two-dimensional coordinate system set in the moving image displayed by the image display element 108 are denoted Q_1(x_1, y_1)^T, Q_2(x_2, y_2)^T, . . . , Q_9(x_9, y_9)^T, respectively. In addition, the position coordinates P_1 to P_9 of the cornea 302 of the user 300 when the user 300 gazes at the points Q_1 to Q_9 are denoted P_1(X_1, Y_1, Z_1)^T, P_2(X_2, Y_2, Z_2)^T, . . . , P_9(X_9, Y_9, Z_9)^T, respectively. Here, T indicates the transposition of a vector or a matrix. - A matrix M having a size of 2×3 is defined as Expression (1).
- M = \begin{pmatrix} m_{11} & m_{12} & m_{13} \\ m_{21} & m_{22} & m_{23} \end{pmatrix} (1)
- In this case, when the matrix M satisfies Expression (2), the matrix M becomes the matrix that projects the eyesight direction of the user 300 onto the moving image surface displayed by the image display element 108. -
Q_N = M P_N (N = 1, . . . , 9) (2) - When Expression (2) is written out specifically, Expression (3) is established.
- \begin{pmatrix} x_1 & \cdots & x_9 \\ y_1 & \cdots & y_9 \end{pmatrix} = \begin{pmatrix} m_{11} & m_{12} & m_{13} \\ m_{21} & m_{22} & m_{23} \end{pmatrix} \begin{pmatrix} X_1 & \cdots & X_9 \\ Y_1 & \cdots & Y_9 \\ Z_1 & \cdots & Z_9 \end{pmatrix} (3)
- When Expression (3) is rearranged, Expression (4) is obtained.
- \begin{pmatrix} x_1 \\ y_1 \\ \vdots \\ x_9 \\ y_9 \end{pmatrix} = \begin{pmatrix} X_1 & Y_1 & Z_1 & 0 & 0 & 0 \\ 0 & 0 & 0 & X_1 & Y_1 & Z_1 \\ & & \vdots & & & \\ X_9 & Y_9 & Z_9 & 0 & 0 & 0 \\ 0 & 0 & 0 & X_9 & Y_9 & Z_9 \end{pmatrix} \begin{pmatrix} m_{11} \\ m_{12} \\ m_{13} \\ m_{21} \\ m_{22} \\ m_{23} \end{pmatrix} (4)
- When the left-side vector of Expression (4) is denoted y, the 18×6 matrix of corneal coordinates A, and the vector of the six unknown elements of M x, Expression (5) is obtained.
- y = (x_1, y_1, . . . , x_9, y_9)^T, x = (m_{11}, m_{12}, m_{13}, m_{21}, m_{22}, m_{23})^T
y = Ax (5) - In Expression (5), the elements of the vector y are the coordinates of the points Q_1 to Q_9 displayed on the
image display element 108 by the eyesight detection unit 230, and thus they are known. In addition, the elements of the matrix A are the coordinates of the peak P of the cornea 302 of the user 300 acquired by the cornea coordinate acquisition unit 240. Therefore, the eyesight detection unit 230 can acquire the vector y and the matrix A. Meanwhile, the vector x, obtained by arranging the elements of the transformation matrix M, is unknown. Therefore, when the vector y and the matrix A are known, the problem of estimating the matrix M becomes the problem of obtaining the unknown vector x. - In Expression (5), when the number of equations (that is, the number of points Q presented to the
user 300 when the eyesight detection unit 230 performs calibration) is larger than the number of unknowns (that is, the six elements of the vector x), the system is overdetermined. In the example shown in Expression (5), nine points are presented, so the system is overdetermined. -
- x.sub.opt=(A.sup.TA).sup.−1A.sup.Ty (6) wherein “−1” indicates an inverse matrix.
- The
eyesight detection unit 230 constitutes the matrix M of Expression (1) using the elements of the obtained vector x_opt. Thereby, using the matrix M and the coordinates of the peak P of the cornea 302 of the user 300 acquired by the cornea coordinate acquisition unit 240, the eyesight detection unit 230 can estimate, according to Expression (2), where on the moving image surface displayed by the image display element 108 the user 300 is gazing. - It is also considered that, after the
user 300 mounts the head mounted display 100 on the head and the eyesight detection unit 230 performs calibration, the relative positional relationship between the head of the user 300 and the head mounted display 100 changes. However, the position coordinates of the peak P of the cornea 302 constituting the matrix A described above are values estimated by the cornea coordinate acquisition unit 240 as position coordinates in the three-dimensional coordinate system using the camera 116 as the origin. Even when it is assumed that the relative positional relationship between the head of the user 300 and the head mounted display 100 changes, the coordinate system on which the position coordinates of the peak P of the cornea 302 are based does not change. Therefore, even when the relative positional relationship between the head of the user 300 and the head mounted display 100 changes slightly, the coordinate transformation according to Expression (2) is considered to remain effective. Consequently, the eyesight detection executed by the eyesight detection unit 230 is robust with respect to a shift of the head mounted display 100 while it is worn. - As described above, according to the head mounted
display 100, it is possible to detect geometric information of the cornea 302 of the user 300 wearing the head mounted display 100. - Particularly, since the head mounted
display 100 can acquire the three-dimensional shape and the position coordinates of the cornea 302 of the user 300, it is possible to estimate the eyesight direction of the user 300 with good accuracy. -
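Read algebraically, the calibration of Expressions (1) to (6) is an ordinary linear least-squares fit. The following sketch builds the stacked system y = Ax and solves Expression (6); it is a minimal illustration under the reading that M maps three-dimensional peak coordinates P_N to two-dimensional screen coordinates Q_N, and the function name and use of NumPy are assumptions:

```python
import numpy as np

def solve_projection(P, Q):
    """Estimate the 2x3 matrix M of Expression (1) from calibration pairs.

    P: (9, 3) corneal-peak coordinates P_N; Q: (9, 2) screen targets Q_N,
    related by Q_N = M P_N (Expression (2)). The stacked system y = A x
    of Expression (5) is solved as x_opt = (A^T A)^-1 A^T y (Expression (6)).
    """
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    A = np.zeros((2 * len(P), 6))
    A[0::2, 0:3] = P              # rows yielding the screen x-coordinates
    A[1::2, 3:6] = P              # rows yielding the screen y-coordinates
    y = Q.reshape(-1)             # (x_1, y_1, ..., x_9, y_9)^T
    x_opt = np.linalg.solve(A.T @ A, A.T @ y)
    return x_opt.reshape(2, 3)    # rearrange x_opt back into M
```

Given the fitted M, `M @ p` projects a newly acquired corneal peak p onto screen coordinates per Expression (2).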
- In the above, a description has been given of an example when the
convex lens 114 is provided with the reflection regions 120 so that different dot patterns appear at different positions of the cornea 302 of the user 300. Instead of this, or in addition to it, dots having different blinking patterns may be caused to appear at different positions of the cornea 302 of the user 300. This can be realized by, for example, forming the near-infrared light source 103 from a plurality of different light sources and changing the blinking pattern of each light source. - Further, although the near-infrared light is radiated from the near-infrared
light source 103 in the above-described embodiment, a light source that radiates near-infrared light may instead be included in each pixel constituting the image display element 108. That is, while one pixel is generally constituted by RGB, a light emitting element that emits near-infrared light can be provided in addition to the light emitting elements that emit red light, green light, and blue light. When a sub-pixel that emits near-infrared light is included in the image display element, the near-infrared light source 103 need not be provided in the head-mounted display 100. -
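The blinking-pattern variant mentioned above amounts to temporal coding: if each infrared source is driven with a distinct on/off sequence, a dot observed on the cornea can be attributed to its source by matching the sequence it shows over consecutive frames. A minimal sketch (the source names and the 4-frame codes are invented for illustration):

```python
# Hypothetical temporal codes: each light source blinks with a unique
# binary pattern over four consecutive frames.
CODES = {
    "source_A": (1, 0, 1, 1),
    "source_B": (1, 1, 0, 0),
    "source_C": (0, 1, 1, 0),
}

def identify_source(observed):
    """Match an observed on/off sequence of a corneal dot to its source."""
    for name, code in CODES.items():
        if tuple(observed) == code:
            return name
    return None  # sequence corrupted or source unknown
```

In practice the codebook would need enough frames to keep the codes distinguishable under blinking of the eye itself; that robustness question is outside this sketch.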
FIG. 9 is a diagram illustrating a configuration example of the image display element 108. The image display element 108 includes pixels constituting an image and sub-pixels constituting the pixels. One pixel will be described by way of example. A pixel 900 constituting the image display element 108 includes sub-pixels 900r, 900g, 900b, and 900i. The sub-pixel 900r is a sub-pixel that emits red light (by light emission of a backlight, light emission of the sub-pixel itself, or both), the sub-pixel 900g is a sub-pixel that emits green light, and the sub-pixel 900b is a sub-pixel that emits blue light. In a normal display device, sub-pixels of these three colors are set as one pixel, or sub-pixels with white light added thereto are set as one pixel. In the image display element 108 according to the present invention, however, the sub-pixel 900i is also included. -
pixel 900 emits the near-infrared light according to an instruction from the video output unit 224, and information indicating whether or not the sub-pixel 900 i of eachpixel 900 emits the near-infrared light is included in display image data that the video output unit 224 outputs to the head-mounteddisplay 100. Thus, an emission pattern of the near-infrared light desired by an operator can be formed. Therefore, in a video to be displayed at that time, for example, formation of a pattern can also be realized such that the near-infrared light is not emitted according to content of the image in a pixel that strongly emits red light. The light emission of the sub-pixel 900 i may be executed by a display unit included in the head-mounted display shown in the above embodiment, or may be executed by an irradiation unit that controls the sub-pixel 900 i that emits near-infrared light. - A configuration for emitting near-infrared light in the image is effective regardless of a type of the display device, and can be applied to various display devices such as an LCD, a plasma display, an organic EL display. Further, even when a sub-pixel for the near-infrared light is included in the pixel, the user do not feel uncomfortable when viewing the image by setting a wavelength of the near-infrared light to be radiated to be outside a range of wavelengths that can be perceived by a person with respect to an actually displayed image.
- Further, control of the sub-pixel 900 i of which of the respective pixels of the
image display element 108 is caused to emit light may be executed by the display unit or the irradiation unit of the head-mounted display 100, or may be executed by the display unit or the irradiation unit of the head-mounted display according to a designation from the image generation unit 220. Thus, the structured light shown in the above embodiment can be realized. Further, the turned-on sub-pixels 900i may be changed as appropriate. In particular, when a moving image is displayed on the image display element 108, the sub-pixels 900i that emit near-infrared light may be changed in time series each time a predetermined time elapses. Here, the predetermined time may be defined in seconds; in the case of a moving image, it may instead be defined by a number of frames, or defined for each blanking period. In this case, the frame number of the moving image and the coordinate position information, on the image display element 108, of the sub-pixels 900i emitting near-infrared light at that time are stored in the head-mounted display or the line-of-sight detection device in association with each other, so that line-of-sight detection can be appropriately executed at each time. Further, the blinking pattern of the sub-pixels 900i that emit near-infrared light may be changed in a predetermined period. - Further, the timing at which the sub-pixel 900i is turned on and the timing at which the sub-pixels 900r, 900g, and 900b are turned on may be different. The
camera 116 may be configured to execute imaging only at the timing at which the sub-pixel 900i is turned on. As a scheme for realizing this configuration, for example, the blanking period of the sub-pixels 900r, 900g, and 900b and the blanking period of the sub-pixel 900i may be set to different time zones. More specifically, it is preferable for the blanking period of the sub-pixels 900r, 900g, and 900b to be set as the turn-on period of the sub-pixel 900i, and for the blanking period of the sub-pixel 900i to be set as the turn-on period of the sub-pixels 900r, 900g, and 900b. - Further, in the above embodiment, the image is displayed on the
image display element 108 provided on the head-mounted display 100, and the video is thereby provided to the user, but the present invention is not limited thereto. FIG. 10 illustrates an example, which may also be adopted by the head-mounted display 100, in which the video is provided to the user without being displayed on the image display element 108. - A
display system 1000 illustrated in FIG. 10 is a display system in which the image displayed on the image display element 108 is not visually recognized by the user on the element itself; instead, the video is directly projected onto the eyes of the user. In recent years, the development of virtual retinal displays that directly project video onto the eyes of the user has been remarkable. Image light can be directly projected without causing any adverse effect on the eyes of the user, such that the image can be recognized by the user. This technology can also be applied to the head-mounted display 100 according to the present invention. As illustrated in FIG. 10, the display system 1000 directly projects the image data transmitted from the video output unit 224 onto the eyes 303 of the user via the convex lens 114 using an optical fiber 1001. In this case, an invisible-light pattern as shown in the above embodiment is also included in the image 1102 and is likewise radiated onto the eyes 303 of the user. Therefore, line-of-sight detection can be realized by imaging, with the camera 116, the cornea that reflects the invisible light radiated onto the eyes of the user. Although FIG. 10 shows an example in which the image is directly projected from the optical fiber 1001 to the eyes of the user, the image light from the optical fiber 1001 may be reflected by a hot mirror or the like and projected onto the eyes 303 of the user via the convex lens 114. - In the above embodiment, an example is shown in which, assuming line-of-sight detection, a marker image is displayed and gazed at by the user, mapping information indicating the relationship between the cornea and the monitor obtained by calibration is stored, and line-of-sight detection is performed when the user views an actual video. However, it goes without saying that the line-of-sight detection scheme is not limited to the above algorithm.
Line-of-sight detection using the following scheme is also included in the idea of the present invention.
-
FIG. 11(a) and FIG. 11(b) illustrate an example of a line-of-sight detection method. The line-of-sight detection unit 230 stores the position of the corneal center of the user when the user views the center of the image. FIG. 11(a) is a diagram illustrating an image 1100 obtained by capturing a visible light image including the left eye of the user. Although the left eye is used here, the same applies to the right eye. It is generally known that the landscape the user views is reflected in the user's eyes. As illustrated in FIG. 11(a), an image 1110 displayed on the image display element 108 illustrated in FIG. 11(b) is reflected in the eyes of the user. - When line-of-sight detection using an image reflected in the eyes is realized, a visible light camera is used as the
camera 116. Accordingly, an image based on normal visible light can be captured, and an image as illustrated in FIG. 11(a) can be acquired. The line-of-sight detection unit 230 specifies feature points by performing image analysis, such as edge analysis and corner analysis, on the obtained image. In FIG. 11(a), for example, feature points of the reflected image are specified. The line-of-sight detection unit 230 compares a detected feature point with the position of the corneal center of the user, specifies the amount of movement from the stored position of the corneal center of the user when viewing the center of the image, and detects the place (line-of-sight direction) at which the user is gazing. Such a configuration may be used. When line-of-sight detection using the image reflected in the eyes of the user is performed, the head-mounted display 100 needs to include, in place of the hot mirror 112, a half mirror that partly reflects and partly transmits visible light, or needs to include a visible light camera that directly images the eyes of the user separately from the camera 116. - Further, although the position of the corneal center of the user viewing the center of the image is stored in the above description, line-of-sight detection can also be performed without storing that position information. That is, feature points are detected from a first frame of the moving image output by the video output unit 224 and from a second frame following it (the second frame need not be the frame immediately after the first, but at least a part of the same object displayed in the first frame must still be displayed). Further, the position of the corneal center of the user gazing at the first frame and the position of the corneal center of the user gazing at the second frame are detected. The line-of-sight detection unit 230 may be configured to detect the point (line-of-sight direction) of the second frame at which the user is gazing on the basis of the movement distance and movement direction, on the image display element 108, from a feature point in the first frame to the corresponding feature point in the second frame, and the movement distance and movement direction from the position of the corneal center of the user in the first frame to that in the second frame. - According to these schemes, it is not necessary to execute calibration by displaying the marker image shown in the above embodiment. Therefore, prior preparation for performing line-of-sight detection using the head-mounted
display 100 need not be performed, and the convenience of the user can be improved. -
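The two-frame scheme above can be sketched in heavily simplified form: the on-screen motion of a displayed object (feature displacement between the frames) is compared with the motion of the corneal center, and eye motion beyond what content motion explains is taken as the gaze shift. Everything here, including the unit gain relating eye motion to screen coordinates, is an assumption for illustration only:

```python
def gaze_shift(feature_1, feature_2, cornea_1, cornea_2, gain=1.0):
    """Gaze-point shift between two frames from feature and cornea motion.

    feature_1/feature_2: (x, y) of the matched feature point in each frame.
    cornea_1/cornea_2: (x, y) of the corneal center in each frame.
    """
    fdx, fdy = feature_2[0] - feature_1[0], feature_2[1] - feature_1[1]
    cdx, cdy = cornea_2[0] - cornea_1[0], cornea_2[1] - cornea_1[1]
    # Eye motion beyond mere content motion is interpreted as gaze change.
    return (gain * (cdx - fdx), gain * (cdy - fdy))
```

If the eye tracks a moving object exactly (cornea and feature move together), the computed shift is zero, which matches the intuition of smooth pursuit; any real system would need a calibrated, nonunit gain.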
-
-
- 1 Image system
- 100 Head mounted display
- 103 Near-infrared light source
- 104 LED
- 108 Image display element
- 112 Hot mirror
- 114 Convex lens
- 116 Camera
- 118 Image output unit
- 120 Reflection region
- 130 Image display system
- 150 Housing
- 160 Mounting fixture
- 170 Headphone
- 200 Image reproducing device
- 210 Reception and transmission unit
- 220 Image generation unit
- 230 Eyesight detection unit
Claims (10)
1. A line-of-sight detection system comprising a head-mounted display and a line-of-sight detection device,
the head-mounted display includes
an image display element that includes a plurality of pixels, each pixel including sub-pixels that emit red, green, blue, and invisible light, and that displays an image to be viewed by a user;
an imaging unit that images an eye of the user wearing the head-mounted display on the basis of the invisible light emitted from the sub-pixel that emits the invisible light; and
a transmission unit that transmits a captured image captured by the imaging unit, and
the line-of-sight detection device includes
a reception unit that receives the captured image; and
a line-of-sight detection unit that detects a line of sight of the eye of the user based on the captured image.
2. The line-of-sight detection system according to claim 1 , wherein the head-mounted display further includes a control unit that selects the pixel that emits invisible light among the plurality of pixels constituting the image display element and causes the selected pixel to emit light.
3. The line-of-sight detection system according to claim 2 , wherein the control unit changes the pixel that emits the invisible light when a predetermined time elapses.
4. The line-of-sight detection system according to claim 2 , wherein the control unit switches and controls a light emission timing of a sub-pixel that emits the invisible light and the sub-pixel other than the sub-pixel that emits the invisible light.
5. A head-mounted display to be mounted on a head of a user for use, the head-mounted display comprising:
a convex lens disposed in a position facing a cornea of the user when the head-mounted display is mounted;
an image display element that includes a plurality of pixels, each pixel including sub-pixels that emit red, green, blue, and invisible light, and that displays an image to be viewed by a user;
a camera that images a video including the cornea of the user as a subject; and
a housing that houses the convex lens, the image display element, and the camera.
6. The head-mounted display according to claim 5 , further comprising a control unit that selects the pixel that emits invisible light among the plurality of pixels constituting the image display element and causes the selected pixel to emit light.
7. The head-mounted display according to claim 6 , wherein the control unit changes the pixel that emits the invisible light when a predetermined time elapses.
8. The head-mounted display according to claim 6 , wherein the control unit switches and controls a light emission timing of a sub-pixel that emits the invisible light and the sub-pixel other than the sub-pixel that emits the invisible light.
9. A line-of-sight detection method using a line-of-sight detection system comprising a line-of-sight detection device and a head-mounted display including an image display element that includes a plurality of pixels, each pixel including sub-pixels that emit red, green, blue, and invisible light, and that displays an image to be viewed by a user, the line-of-sight detection method comprising:
an irradiation step of emitting, by the head-mounted display, invisible light from sub-pixels of the plurality of pixels and irradiating a cornea of the user with the invisible light;
an imaging step of imaging, by the head-mounted display, a subject that is irradiated with the invisible light emitted from the sub-pixels and that includes the cornea of the user viewing the image displayed on the image display element, to generate a captured image;
a transmission step of transmitting, by the head-mounted display, the captured image to the line-of-sight detection device;
a reception step of receiving the captured image by the line-of-sight detection device; and
a detection step of detecting, by the line-of-sight detection device, a direction of a line of sight of the user on the basis of the captured image.
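The five steps of claim 9 can be sketched as a pipeline between two components. This is a hypothetical illustration only: the classes, method names, and the stubbed "detection" (which simply echoes the glint position rather than performing the corneal-reflection analysis left to the specification) are all assumptions.

```python
# Hypothetical sketch of the claim-9 pipeline: irradiation, imaging,
# transmission, reception, detection. Not the patent's actual implementation.
from dataclasses import dataclass, field

@dataclass
class CapturedImage:
    glint_xy: tuple  # centroid of the invisible-light reflection on the cornea

class HeadMountedDisplay:
    def irradiate(self):
        # Irradiation step: drive the invisible-light sub-pixels.
        return "ir_on"

    def image_cornea(self) -> CapturedImage:
        # Imaging step: capture the cornea with the glint visible.
        return CapturedImage(glint_xy=(12.0, -3.5))

    def transmit(self, img, device):
        # Transmission step: send the captured image to the detection device.
        device.receive(img)

@dataclass
class DetectionDevice:
    inbox: list = field(default_factory=list)

    def receive(self, img):
        # Reception step.
        self.inbox.append(img)

    def detect(self):
        # Detection step (stub): a real system would derive the gaze vector
        # from the glint position relative to the pupil; here we echo it.
        return self.inbox[-1].glint_xy

hmd, device = HeadMountedDisplay(), DetectionDevice()
hmd.irradiate()
hmd.transmit(hmd.image_cornea(), device)
gaze = device.detect()
```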
10. A line-of-sight detection system comprising a head-mounted display and a line-of-sight detection device,
wherein the head-mounted display includes
an image display element that displays an image to be viewed by a user;
an imaging unit that images an eye of the user in which the image displayed on the image display element is reflected;
a transmission unit that transmits a captured image captured by the imaging unit, and
the line-of-sight detection device includes
a reception unit that receives the captured image; and
a line-of-sight detection unit that detects a direction of a line of sight of the eye of the user on the basis of a feature point of the image displayed on the image display element as it appears reflected in the captured image and a corresponding feature point of the image as displayed on the image display element.
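The claim-10 idea of comparing feature points of the displayed image with their corneal reflections can be illustrated crudely: the displacement between the two matched point sets serves here as a proxy for gaze direction. This is a hypothetical sketch under that simplifying assumption; the patent's actual mapping from feature-point correspondences to a gaze vector is left to the specification.

```python
# Hypothetical sketch only -- the mean-displacement proxy is an assumption,
# not the patent's detection method.

def gaze_offset(displayed_pts, reflected_pts):
    """Mean displacement (reflected - displayed) of matched feature points,
    used here as a crude stand-in for the line-of-sight direction."""
    assert len(displayed_pts) == len(reflected_pts) and displayed_pts
    n = len(displayed_pts)
    dx = sum(r[0] - d[0] for d, r in zip(displayed_pts, reflected_pts)) / n
    dy = sum(r[1] - d[1] for d, r in zip(displayed_pts, reflected_pts)) / n
    return (dx, dy)
```

A real system would fit a full geometric transformation (the specification discusses matrix transformations) rather than a mean shift, since the corneal reflection is distorted by the cornea's curvature.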
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JPPCT/JP2015/060398 | 2015-04-01 | ||
PCT/JP2015/060398 WO2016157486A1 (en) | 2015-04-01 | 2015-04-01 | Head mounted display |
PCT/JP2016/056078 WO2016158151A1 (en) | 2015-04-01 | 2016-02-29 | Sight line detection system, head mounted display, and sight line detection method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180067306A1 true US20180067306A1 (en) | 2018-03-08 |
Family
ID=57004015
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/907,999 Active US9625989B2 (en) | 2015-04-01 | 2015-04-01 | Head mounted display |
US15/563,444 Abandoned US20180067306A1 (en) | 2015-04-01 | 2016-02-29 | Head mounted display |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/907,999 Active US9625989B2 (en) | 2015-04-01 | 2015-04-01 | Head mounted display |
Country Status (4)
Country | Link |
---|---|
US (2) | US9625989B2 (en) |
KR (1) | KR20170133390A (en) |
CN (1) | CN107850937A (en) |
WO (2) | WO2016157486A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180131926A1 (en) * | 2016-11-10 | 2018-05-10 | Mark Shanks | Near eye wavefront emulating display |
US20200125169A1 (en) * | 2018-10-18 | 2020-04-23 | Eyetech Digital Systems, Inc. | Systems and Methods for Correcting Lens Distortion in Head Mounted Displays |
US10990816B1 (en) * | 2019-05-01 | 2021-04-27 | Facebook Technologies, Llc | Apparatuses, systems, and methods for mapping corneal curvature |
TWI726445B (en) * | 2019-04-08 | 2021-05-01 | 宏達國際電子股份有限公司 | Head mounted display apparatus |
US11030975B2 (en) * | 2016-07-04 | 2021-06-08 | Sony Corporation | Information processing apparatus and information processing method |
US11256097B2 (en) * | 2018-05-30 | 2022-02-22 | Sony Interactive Entertainment Inc. | Image generation apparatus, image display system, image generation method, and computer program |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL235594A0 (en) | 2014-11-09 | 2015-02-26 | Ron Schneider | Surgical lens system |
WO2016157485A1 (en) * | 2015-04-01 | 2016-10-06 | Fove, Inc. | Head mounted display |
US9870049B2 (en) | 2015-07-31 | 2018-01-16 | Google Llc | Reflective lenses to auto-calibrate a wearable system |
US10552676B2 (en) * | 2015-08-03 | 2020-02-04 | Facebook Technologies, Llc | Methods and devices for eye tracking based on depth sensing |
US10338451B2 (en) | 2015-08-03 | 2019-07-02 | Facebook Technologies, Llc | Devices and methods for removing zeroth order leakage in beam steering devices |
US10459305B2 (en) | 2015-08-03 | 2019-10-29 | Facebook Technologies, Llc | Time-domain adjustment of phase retardation in a liquid crystal grating for a color display |
US10297180B2 (en) | 2015-08-03 | 2019-05-21 | Facebook Technologies, Llc | Compensation of chromatic dispersion in a tunable beam steering device for improved display |
US10274730B2 (en) | 2015-08-03 | 2019-04-30 | Facebook Technologies, Llc | Display with an embedded eye tracker |
US10247858B2 (en) | 2015-10-25 | 2019-04-02 | Facebook Technologies, Llc | Liquid crystal half-wave plate lens |
US10416454B2 (en) | 2015-10-25 | 2019-09-17 | Facebook Technologies, Llc | Combination prism array for focusing light |
US10203566B2 (en) | 2015-12-21 | 2019-02-12 | Facebook Technologies, Llc | Enhanced spatial resolution using a segmented electrode array |
US9946074B2 (en) | 2016-04-07 | 2018-04-17 | Google Llc | See-through curved eyepiece with patterned optical combiner |
JP2018054782A (en) * | 2016-09-28 | 2018-04-05 | セイコーエプソン株式会社 | Optical element and display device |
US10877556B2 (en) | 2016-10-21 | 2020-12-29 | Apple Inc. | Eye tracking system |
US10816814B2 (en) * | 2016-12-08 | 2020-10-27 | Sony Corporation | Imaging device |
CN108399633A (en) * | 2017-02-06 | 2018-08-14 | Roboteam Home Ltd. | Method and apparatus for stereoscopic vision |
WO2018147455A1 (en) * | 2017-02-13 | 2018-08-16 | Monoprodime Co., Ltd. | Wake-up method and device using same |
CN106932904A (en) | 2017-02-27 | 2017-07-07 | Alibaba Group Holding Limited | Virtual reality helmet |
CN106932905A (en) * | 2017-02-27 | 2017-07-07 | Alibaba Group Holding Limited | Virtual reality helmet |
US10546518B2 (en) | 2017-05-15 | 2020-01-28 | Google Llc | Near-eye display with extended effective eyebox via eye tracking |
US10629105B2 (en) * | 2017-06-15 | 2020-04-21 | Google Llc | Near-eye display with frame rendering based on reflected wavefront analysis for eye characterization |
US20200175144A1 (en) * | 2017-08-10 | 2020-06-04 | Nec Corporation | Information acquisition system, information acquisition method, and storage medium |
US10989921B2 (en) * | 2017-12-29 | 2021-04-27 | Letinar Co., Ltd. | Augmented reality optics system with pinpoint mirror |
US10989922B2 (en) * | 2017-12-29 | 2021-04-27 | Letinar Co., Ltd. | Augmented reality optics system with pin mirror |
CN108279496B (en) * | 2018-02-09 | 2021-02-19 | BOE Technology Group Co., Ltd. | Eyeball tracking module and method for video glasses |
US10775616B1 (en) * | 2018-03-21 | 2020-09-15 | Facebook Technologies, Llc | Lenses integrated with micro-light emitting diodes |
US10942349B2 (en) | 2018-08-21 | 2021-03-09 | Facebook Technologies, Llc | Illumination assembly with in-field micro devices |
US10607353B2 (en) * | 2018-08-30 | 2020-03-31 | Facebook Technologies, Llc | Structured light depth sensing |
US10902627B2 (en) * | 2018-11-30 | 2021-01-26 | Hins Sas | Head mounted device for virtual or augmented reality combining reliable gesture recognition with motion tracking algorithm |
CN109766820A (en) * | 2019-01-04 | 2019-05-17 | Beijing 7invensun Information Technology Co., Ltd. | Eyeball tracking device, head-mounted device, and eye image acquisition method |
US11561336B2 (en) * | 2019-10-05 | 2023-01-24 | Meta Platforms Technologies, Llc | Transparent illumination layer with transparent waveguide structure |
KR20220030774A (en) * | 2020-09-03 | 2022-03-11 | 삼성전자주식회사 | Electronic device and method for obtaining gaze information of user thereof |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3936605A (en) | 1972-02-14 | 1976-02-03 | Textron, Inc. | Eyeglass mounted visual display |
US4181405A (en) * | 1978-08-07 | 1980-01-01 | The Singer Company | Head-up viewing display |
JP2939988B2 (en) | 1989-04-05 | 1999-08-25 | キヤノン株式会社 | Eye gaze detection device |
US6204974B1 (en) * | 1996-10-08 | 2001-03-20 | The Microoptical Corporation | Compact image display system for eyeglasses or other head-borne frames |
JPH1173274A (en) * | 1997-08-27 | 1999-03-16 | Canon Inc | Visual line input intention transmitting device and method therefor and storage medium |
US6426740B1 (en) | 1997-08-27 | 2002-07-30 | Canon Kabushiki Kaisha | Visual-axis entry transmission apparatus and method therefor |
JP2003502711A (en) * | 1999-06-21 | 2003-01-21 | The Microoptical Corporation | Eyepiece display lens system using off-axis optical design |
JP2009282085A (en) * | 2008-05-20 | 2009-12-03 | Panasonic Corp | Optical device and image display equipped with the same |
JP5329882B2 (en) * | 2008-09-17 | 2013-10-30 | Pioneer Corporation | Display device |
JP5290092B2 (en) * | 2009-08-31 | 2013-09-18 | Olympus Corporation | Eyeglass-type image display device |
WO2012169064A1 (en) * | 2011-06-10 | 2012-12-13 | Pioneer Corporation | Image display device, image display method, and image display program |
KR102264765B1 (en) * | 2012-01-24 | 2021-06-11 | The Arizona Board of Regents on Behalf of the University of Arizona | Compact eye-tracked head-mounted display |
US20140160157A1 (en) | 2012-12-11 | 2014-06-12 | Adam G. Poulos | People-triggered holographic reminders |
US9625723B2 (en) * | 2013-06-25 | 2017-04-18 | Microsoft Technology Licensing, Llc | Eye-tracking system using a freeform prism |
US20150015478A1 (en) * | 2013-07-11 | 2015-01-15 | Samsung Display Co., Ltd. | Ir emissive display facilitating remote eye tracking |
CN103760973B (en) * | 2013-12-18 | 2017-01-11 | Microsoft Technology Licensing, LLC | Reality-enhancing information detail |
JP6539654B2 (en) * | 2014-06-27 | 2019-07-03 | Fove, Inc. | Gaze detection device |
WO2016103525A1 (en) * | 2014-12-27 | 2016-06-30 | Fove, Inc. | Head mount display |
2015
- 2015-04-01 WO PCT/JP2015/060398 patent/WO2016157486A1/en active Application Filing
- 2015-04-01 US US14/907,999 patent/US9625989B2/en active Active

2016
- 2016-02-29 WO PCT/JP2016/056078 patent/WO2016158151A1/en active Application Filing
- 2016-02-29 KR KR1020177030135A patent/KR20170133390A/en unknown
- 2016-02-29 US US15/563,444 patent/US20180067306A1/en not_active Abandoned
- 2016-02-29 CN CN201680020039.7A patent/CN107850937A/en not_active Withdrawn
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11030975B2 (en) * | 2016-07-04 | 2021-06-08 | Sony Corporation | Information processing apparatus and information processing method |
US20180131926A1 (en) * | 2016-11-10 | 2018-05-10 | Mark Shanks | Near eye wavefront emulating display |
US10757400B2 (en) * | 2016-11-10 | 2020-08-25 | Manor Financial, Inc. | Near eye wavefront emulating display |
US11303880B2 (en) * | 2016-11-10 | 2022-04-12 | Manor Financial, Inc. | Near eye wavefront emulating display |
US11256097B2 (en) * | 2018-05-30 | 2022-02-22 | Sony Interactive Entertainment Inc. | Image generation apparatus, image display system, image generation method, and computer program |
US20200125169A1 (en) * | 2018-10-18 | 2020-04-23 | Eyetech Digital Systems, Inc. | Systems and Methods for Correcting Lens Distortion in Head Mounted Displays |
TWI726445B (en) * | 2019-04-08 | 2021-05-01 | 宏達國際電子股份有限公司 | Head mounted display apparatus |
US10990816B1 (en) * | 2019-05-01 | 2021-04-27 | Facebook Technologies, Llc | Apparatuses, systems, and methods for mapping corneal curvature |
US11715331B1 (en) * | 2019-05-01 | 2023-08-01 | Meta Platforms Technologies, Llc | Apparatuses, systems, and methods for mapping corneal curvature |
Also Published As
Publication number | Publication date |
---|---|
US9625989B2 (en) | 2017-04-18 |
KR20170133390A (en) | 2017-12-05 |
CN107850937A (en) | 2018-03-27 |
WO2016157486A1 (en) | 2016-10-06 |
US20170038834A1 (en) | 2017-02-09 |
WO2016158151A1 (en) | 2016-10-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180067306A1 (en) | Head mounted display | |
US10460165B2 (en) | Head mounted display | |
US10725300B2 (en) | Display device, control method for display device, and program | |
AU2015318074B2 (en) | Display with eye-discomfort reduction | |
US11200646B2 (en) | Compensation for deformation in head mounted display systems | |
US10467770B2 (en) | Computer program for calibration of a head-mounted display device and head-mounted display device using the computer program for calibration of a head-mounted display device | |
CN109791605A (en) | Auto-adaptive parameter in image-region based on eyctracker information | |
US20180032133A1 (en) | Eye-gaze detection system, displacement detection method, and displacement detection program | |
US9870048B2 (en) | Head-mounted display device, method of controlling the same, and computer program | |
US20140118685A1 (en) | Visual function testing device | |
US20170344112A1 (en) | Gaze detection device | |
US20160170482A1 (en) | Display apparatus, and control method for display apparatus | |
JP5694257B2 (en) | Display device, display method, and program | |
US9846305B2 (en) | Head mounted display, method for controlling head mounted display, and computer program | |
JP6903998B2 (en) | Head mounted display | |
US20190035364A1 (en) | Display apparatus, method of driving display apparatus, and electronic apparatus | |
WO2017047271A1 (en) | Display device and optical filter | |
KR20230079138A (en) | Eyewear with strain gauge estimation function | |
US20200213467A1 (en) | Image display system, image display method, and image display program | |
US20170371408A1 (en) | Video display device system, heartbeat specifying method, heartbeat specifying program | |
JP6500570B2 (en) | Image display apparatus and image display method | |
US20180182124A1 (en) | Estimation system, estimation method, and estimation program | |
US20200319709A1 (en) | Information processing system, operation method, and operation program | |
JP2017045068A (en) | Head-mounted display |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |