WO2019069812A1 - Sight-line detection system for head-mounted display, head-mounted display, and sight-line detection method for head-mounted display - Google Patents


Info

Publication number
WO2019069812A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
face
sight
mounted display
optical fiber
Prior art date
Application number
PCT/JP2018/036290
Other languages
French (fr)
Japanese (ja)
Inventor
敬志 畑田
クマール ラヴィ
Original Assignee
Sony Interactive Entertainment Inc. (株式会社ソニー・インタラクティブエンタテインメント)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc.
Publication of WO2019069812A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02: Viewing or reading apparatus

Definitions

  • The present invention relates to a gaze detection system for a head mounted display, a head mounted display, and a gaze detection method for a head mounted display.
  • In a head mounted display (HMD) that a user wears on the head to view images, techniques are known for detecting the user's line of sight based on an image of the user's eye captured by a camera arranged inside the housing.
  • In gaze detection with a conventional HMD, the camera cannot always be placed at a position suitable for detecting the user's line of sight because of constraints on the shape and size of the housing, so the user's line of sight sometimes cannot be detected with sufficient accuracy.
  • The present invention has been made in view of the above situation, and one of its objects is to provide a gaze detection system for a head mounted display, a head mounted display, and a gaze detection method for a head mounted display that can accurately detect the line of sight of a user wearing the head mounted display.
  • In order to solve the above problems, the gaze detection system for a head mounted display according to the present invention includes a plurality of optical fibers supported by the housing of the head mounted display and arranged such that one end face of each fiber faces the user's eye at a different position, a reflected light detection unit that detects light reflected by the user's eye which enters the one end face and exits from the other end face of each of the plurality of optical fibers, and a gaze detection unit that detects the user's line of sight based on the result of the detection.
  • In one aspect of the present invention, the reflected light detection unit is a camera that captures the other end faces of the plurality of optical fibers, and the gaze detection unit detects the user's line of sight based on the image of the other end faces captured by the camera.
  • A head mounted display according to the present invention includes a housing worn by a user, a plurality of optical fibers supported by the housing and arranged such that one end face of each fiber faces the user's eye at a different position, a reflected light detection unit that detects light reflected by the user's eye which enters the one end face and exits from the other end face of each of the plurality of optical fibers, and a gaze detection unit that detects the user's line of sight based on the result of the detection.
  • A gaze detection method for a head mounted display according to the present invention includes a step of detecting, for a plurality of optical fibers supported by the housing of the head mounted display and arranged such that one end face of each fiber faces the user's eye at a different position, light reflected by the user's eye which enters the one end face and exits from the other end face, and a step of detecting the user's line of sight based on the result of the detection.
  • FIG. 3 is a cross-sectional view taken along line III-III of the gaze detection system shown in FIGS. 1 and 2. FIG. 4 is a diagram showing an example of an end face image.
  • FIG. 5 is a functional block diagram showing an example of functions implemented by the gaze detection device according to an embodiment of the present invention. FIG. 6 is a flow diagram showing an example of the flow of processing performed by the gaze detection device according to an embodiment of the present invention.
  • FIG. 1 is a plan view showing an example of a gaze detection system 1 according to an embodiment of the present invention.
  • FIG. 2 is a left side view of the sight line detection system 1 shown in FIG.
  • FIG. 3 is a cross-sectional view taken along line III-III of the gaze detection system 1 shown in FIGS. 1 and 2.
  • the gaze detection system 1 includes a head mounted display (HMD) 10, a display unit 12 for the left eye, a display unit 14 for the right eye, and a gaze detection apparatus 20.
  • the sight line detection device 20 includes a main body 20a, an infrared light emitting diode (LED) 20b, a camera 20c for the left eye, and a camera 20d for the right eye.
  • the display unit 12 and the display unit 14 are displays such as a liquid crystal display and an organic EL display, for example.
  • the camera 20c and the camera 20d are, for example, infrared cameras.
  • The HMD 10 includes a housing 24 that the user 22 wears on the head, an optical fiber 26 for the left eye that guides the image for the left eye, and an optical fiber 28 for the right eye that guides the image for the right eye.
  • the optical fiber 26 and the optical fiber 28 according to the present embodiment have, for example, flexibility.
  • A support 30 that supports the optical fiber 26 and a support 32 that supports the optical fiber 28 are provided on the front surface of the housing 24.
  • As shown in FIG. 3, the support 30 is a frame that supports the optical fiber 26 in front of the left eye of the user 22, and the support 32 is a frame that supports the optical fiber 28 in front of the right eye of the user 22.
  • The housing 24 also includes a connection portion 34 disposed at the back of the head of the user 22.
  • The optical fiber 26 is connected to the support 30, the connection portion 34, and the display surface of the display unit 12.
  • The optical fiber 28 is connected to the support 32, the connection portion 34, and the display surface of the display unit 14.
  • Each of the optical fibers 26 according to the present embodiment is associated with at least one pixel constituting the display surface of the display unit 12. Then, each of the optical fibers 26 is connected to at least one pixel constituting the display surface of the display unit 12 that is associated with the optical fiber 26. Each of the optical fibers 26 guides the light of at least one pixel that is associated with the optical fiber 26. Further, each of the optical fibers 28 according to the present embodiment is associated with at least one pixel constituting the display surface of the display unit 14. Each of the optical fibers 28 is connected to at least one pixel constituting the display surface of the display unit 14 that is associated with the optical fiber 28. Each of the optical fibers 28 then guides the light of at least one pixel that is associated with the optical fiber 28.
  • the pixels forming the display surface of the display unit 12 or the display unit 14 may be formed of a plurality of sub-pixels such as, for example, R pixels, G pixels, and B pixels.
  • The optical fibers 26 connected to the display surface of the display unit 12 are guided to the support 30 via the connection portion 34, along the left side of the head of the user 22.
  • the support unit 30 supports the bundle of optical fibers 26 from the outside in front of the left eye of the user 22 so that one end face of the optical fiber 26 that emits light of an image faces the left eye of the user 22. Then, the light of the image from the display unit 12 is incident on the other end face of the optical fiber 26.
  • the other end face of the optical fiber 26 may be connected to the display unit 12.
  • The optical fibers 28 connected to the display surface of the display unit 14 are guided to the support 32 via the connection portion 34, along the right side of the head of the user 22.
  • the support portion 32 supports the bundle of optical fibers 28 from the outside in front of the right eye of the user 22 so that one end face of the optical fiber 28 emitting light of an image faces the right eye of the user 22. Then, the light of the image from the display unit 14 is incident on the other end surface of the optical fiber 28.
  • the other end face of the optical fiber 28 may be connected to the display unit 14.
  • The light of the image for the left eye, output by the display unit 12 and guided by the optical fibers 26, enters the left eye of the user 22. The light of the image for the right eye, output by the display unit 14 and guided by the optical fibers 28, enters the right eye of the user 22.
  • The user 22 wearing the HMD 10 can thus view, for example, three-dimensional images output by the display unit 12 and the display unit 14.
  • the HMD 10 also includes an optical fiber 40 for emitting infrared light to the left eye of the user 22 and an optical fiber 42 for emitting infrared light to the right eye of the user 22.
  • the optical fiber 40 is supported by the support 30 in front of the left eye of the user 22 with the bundle of optical fibers 26.
  • the optical fiber 42 is also supported by the support 32 in front of the right eye of the user 22 together with the bundle of optical fibers 28.
  • the infrared LED 20b of the sight line detection device 20 makes infrared rays incident on one end surface of the optical fiber 40 and one end surface of the optical fiber 42. Then, the infrared ray is emitted from the other end face 44 of the optical fiber 40 shown in FIG. 3 toward the left eye of the user 22. Further, infrared rays are emitted from the other end face 46 of the optical fiber 42 shown in FIG. 3 to the right eye of the user 22.
  • Although the other end face 44 of the optical fiber 40 is located at the center of the bundle of optical fibers 26 in the example of FIG. 3, the end face 44 does not have to be located at the center of the bundle of optical fibers 26.
  • the other end face 46 of the optical fiber 42 is located at the center of the bundle of optical fibers 28, but the end face 46 does not have to be located at the center of the bundle of optical fibers 28.
  • The HMD 10 also includes optical fibers 48 (48a to 48e) that guide light reflected by the left eye of the user 22 and optical fibers 50 (50a to 50e) that guide light reflected by the right eye of the user 22.
  • the support portion 30 of the housing 24 supports the plurality of optical fibers 48 such that one end face 52 is disposed at a different position. Further, the plurality of optical fibers 48 are arranged such that one end face 52 faces the left eye of the user 22.
  • For example, one end face 52a of the optical fiber 48a is supported by the support 30 so as to be located below the support 30.
  • One end face 52b of the optical fiber 48b is supported by the support 30 so as to be located on the left of the support 30.
  • One end face 52c of the optical fiber 48c is supported by the support 30 so as to be located at the upper left of the support 30.
  • One end face 52d of the optical fiber 48d is supported by the support 30 so as to be located on the right of the support 30.
  • One end face 52e of the optical fiber 48e is supported by the support 30 so as to be located at the upper right of the support 30.
  • the optical fiber 48 penetrates the hole formed in the frame of the support portion 30. However, the optical fiber 48 does not have to penetrate the hole formed in the frame of the support portion 30.
  • the support portion 32 of the housing 24 supports the plurality of optical fibers 50 such that one end face 54 is disposed at a different position.
  • the plurality of optical fibers 50 are arranged such that one end face 54 faces the right eye of the user 22.
  • For example, one end face 54a of the optical fiber 50a is supported by the support 32 so as to be located below the support 32.
  • One end face 54b of the optical fiber 50b is supported by the support 32 so as to be located on the right of the support 32.
  • One end face 54c of the optical fiber 50c is supported by the support 32 so as to be located at the upper right of the support 32.
  • One end face 54d of the optical fiber 50d is supported by the support 32 so as to be located on the left of the support 32.
  • One end face 54e of the optical fiber 50e is supported by the support 32 so as to be located at the upper left of the support 32.
  • the optical fiber 50 passes through the hole formed in the frame of the support portion 32. However, the optical fiber 50 does not have to pass through the hole formed in the frame of the support portion 32.
  • an infrared ray emitted from the other end surface 44 of the optical fiber 40 and reflected by the left eye of the user 22 is incident on one end surface 52 of the optical fiber 48. Further, the infrared ray emitted from the other end face 46 of the optical fiber 42 and reflected by the right eye of the user 22 is incident on one end face 54 of the optical fiber 50.
  • In the present embodiment, for example, the camera 20c captures the other end faces 56 of the plurality of optical fibers 48.
  • For example, the camera 20c photographs the other end face 56a of the optical fiber 48a, the other end face 56b of the optical fiber 48b, the other end face 56c of the optical fiber 48c, the other end face 56d of the optical fiber 48d, and the other end face 56e of the optical fiber 48e.
  • Similarly, the camera 20d captures the other end faces 58 of the plurality of optical fibers 50.
  • For example, the camera 20d photographs the other end face 58a of the optical fiber 50a, the other end face 58b of the optical fiber 50b, the other end face 58c of the optical fiber 50c, the other end face 58d of the optical fiber 50d, and the other end face 58e of the optical fiber 50e.
  • FIG. 4 is a view showing an example of the left end surface image 60 which is an image obtained by photographing the other end surface 56 of the plurality of optical fibers 48 by the camera 20c. As shown in FIG. 4, in the left end surface image 60, images of the end faces 56a to 56e are shown.
  • In the present embodiment, for example, the gaze detection device 20 detects the line of sight of the left eye of the user 22 based on the detection result of the reflected light from the left eye of the user 22 that enters the end faces 52 of the optical fibers 48 and exits from the end faces 56.
  • the gaze detection apparatus 20 may detect the gaze of the left eye of the user 22 based on the left end surface image 60.
  • Similarly, the gaze detection device 20 detects the line of sight of the right eye of the user 22 based on the detection result of the reflected light from the right eye of the user 22 that enters the end faces 54 of the optical fibers 50 and exits from the end faces 58.
  • The main body 20a of the gaze detection device 20 may be, for example, a computer including a processor such as a microprocessor, storage elements such as ROM and RAM, and a storage unit such as a hard disk drive.
  • the processor included in the main unit 20a may operate in accordance with a program installed in the main unit 20a.
  • FIG. 5 is a functional block diagram showing an example of functions implemented by the visual axis detection device 20 according to the present embodiment.
  • In the gaze detection device 20 according to the present embodiment, not all of the functions shown in FIG. 5 need to be implemented, and functions other than those shown in FIG. 5 may be implemented.
  • the visual axis detection device 20 functionally includes, for example, an infrared light emitting unit 70, a reflected light detection unit 72, and a visual axis detection unit 74.
  • The infrared emitting unit 70 is implemented mainly by, for example, the infrared LED 20b.
  • The reflected light detection unit 72 is implemented mainly by, for example, the processor of the main body 20a, the camera 20c, and the camera 20d.
  • The gaze detection unit 74 is implemented mainly by, for example, the processor and the storage unit of the main body 20a.
  • The above functions may be implemented by the processor included in the main body 20a of the gaze detection device 20, which is a computer, executing a program that is installed in the main body 20a and that includes instructions corresponding to the above functions.
  • This program may be supplied to the main body 20a of the gaze detection device 20 via a computer readable information storage medium such as an optical disk, a magnetic disk, a magnetic tape, a magneto-optical disk, or a flash memory, or via the Internet, for example.
  • the infrared emitting unit 70 causes infrared rays to be incident on one end surface of the optical fiber 40 and one end surface of the optical fiber 42, for example.
  • the reflected light detection unit 72 detects, for example, the reflected light of the left eye of the user 22 that is incident on one end surface 52 of the optical fiber 48 and emitted from the other end surface 56. Further, in the present embodiment, for example, the reflected light detection unit 72 detects the reflected light of the right eye of the user 22 that is incident on one end surface 54 of the optical fiber 50 and emitted from the other end surface 58.
  • The reflected light detection unit 72 may be a camera. The reflected light detection unit 72 may then generate, for example, a left end face image by photographing the other end faces 56 of the optical fibers 48, which emit the light that entered the one end faces 52; for example, the reflected light detection unit 72 may generate the left end face image 60 shown in FIG. 4. The reflected light detection unit 72 may likewise generate a right end face image by photographing the other end faces 58 of the optical fibers 50, which emit the light that entered the one end faces 54.
  • the line-of-sight detection unit 74 detects the line of sight of the user 22, such as the movement of the pupil or the direction of the line of sight, based on the detection result by the reflected light detection unit 72.
  • the line-of-sight detection unit 74 may detect the line of sight of the left eye of the user 22 based on, for example, a left end face image which is an image of the other end face 56 of the optical fiber 48 captured by the camera 20c.
  • the line-of-sight detection unit 74 may detect the line of sight of the right eye of the user 22, for example, based on a right end face image which is an image of the other end face 58 of the optical fiber 50 captured by the camera 20d.
  • the amount of infrared rays incident on each of the end faces 52a to 52e corresponds to the direction of the line of sight of the left eye of the user 22.
  • the brightness of the end faces 56a to 56e shown in the left end face image 60 corresponds to the amount of infrared rays incident on the end faces 52a to 52e, respectively.
  • the line-of-sight detection unit 74 detects the line of sight of the left eye of the user 22 based on the left end face image in which the images of the end faces 56a to 56e are shown.
  • the gaze detection unit 74 detects the gaze of the right eye of the user 22 based on the right end face image.
  • the method of detecting the sight line based on the left end face image and the right end face image is not particularly limited.
  • the line of sight of the user 22 may be detected by a machine learning method such as deep learning.
  • For example, the gaze detection unit 74 may store a trained model that has learned training data including, as learning input data, images showing the end faces 56a to 56e and, as teacher data, data indicating the direction of the line of sight of the left eye of the user 22 at the time each image was captured.
  • In this case, the trained model is a machine learning model on which supervised learning has been performed.
  • The gaze detection unit 74 may then detect (estimate) the line of sight of the left eye of the user 22 at the time a left end face image was captured, based on the output obtained when that left end face image is input to the trained model.
  • Similarly, the gaze detection unit 74 may store a trained model that has learned training data including, as learning input data, images showing the end faces 58a to 58e and, as teacher data, data indicating the direction of the line of sight of the right eye of the user 22 at the time each image was captured.
  • In this case, too, the trained model is a machine learning model on which supervised learning has been performed.
  • The gaze detection unit 74 may then detect (estimate) the line of sight of the right eye of the user 22 at the time a right end face image was captured, based on the output obtained when that right end face image is input to the trained model.
  • Alternatively, for example, the gaze detection unit 74 may generate data that associates the positions at which the end faces 52a to 52e are arranged with values indicating the brightness of the end faces 56a to 56e shown in the left end face image, and may detect the line of sight of the left eye of the user 22 based on the distribution of the brightness values associated with those positions. Similarly, the gaze detection unit 74 may generate data that associates the positions at which the end faces 54a to 54e are arranged with values indicating the brightness of the end faces 58a to 58e shown in the right end face image, and may detect the line of sight of the right eye of the user 22 based on the distribution of the brightness values associated with those positions.
  • the infrared emitting unit 70 waits for the infrared emitting timing (S101).
  • infrared rays are emitted at an interval of one second.
  • the infrared emitting unit 70 emits the infrared rays toward one end face of the optical fiber 40 and one end face of the optical fiber 42 (S102).
  • the reflected light detection unit 72 generates the left end surface image and the right end surface image (S103).
  • The gaze detection unit 74 detects the line of sight of the left eye of the user 22 based on the left end face image generated in S103, and detects the line of sight of the right eye of the user 22 based on the right end face image generated in S103 (S104). Then, the process returns to S101 (a minimal sketch of this loop is given after this list).
  • The infrared light that entered one end face of the optical fiber 40 in S102 is emitted from the other end face 44 of the optical fiber 40 toward the left eye of the user 22 and is reflected by the left eye of the user 22.
  • the infrared ray incident on one end face of the optical fiber 42 in the process shown in S102 is emitted from the other end face 46 of the optical fiber 42 toward the right eye of the user 22 and is reflected by the right eye of the user 22.
  • the infrared light reflected by the left eye of the user 22 is incident on one end face 52a to 52e of the optical fibers 48a to 48e.
  • the infrared light reflected by the right eye of the user 22 is incident on one end face 54a to 54e of the optical fibers 50a to 50e.
  • the infrared rays incident on one end face 52a to 52e of the optical fibers 48a to 48e are emitted from the other end faces 56a to 56e of the optical fibers 48a to 48e.
  • the infrared rays incident on one of the end faces 54a to 54e of the optical fibers 50a to 50e are emitted from the other end faces 58a to 58e of the optical fibers 50a to 50e.
  • the display of the image by the display unit 12 and the display unit 14 may be suppressed at the timing when the infrared light is emitted.
  • For example, the display of the image may be suppressed once every 60 frames. In this way, it is possible to suppress the influence of the image output from the display unit 12 or the display unit 14 on the detection of the user's line of sight.
  • the display unit 12 and the display unit 14 may output visible light of a predetermined color at predetermined time intervals (for example, at intervals of 1 second) instead of an image to be viewed by the user.
  • In this case, the reflected light detection unit 72 may generate a left end face image by photographing the end faces 56a to 56e, which emit the visible light reflected by the left eye of the user 22, and a right end face image by photographing the end faces 58a to 58e, which emit the visible light reflected by the right eye of the user 22.
  • This makes it possible to detect the line of sight of the user 22 without providing the infrared LED 20b.
  • When the line of sight of the user 22 is detected based on an image of the eye of the user 22 captured by a camera arranged inside the housing 24, the camera cannot always be disposed at a position suitable for detecting the line of sight of the user 22 due to restrictions on the shape and size of the housing 24, and the line of sight of the user 22 may not be detected with sufficient accuracy.
  • In the gaze detection system 1 according to the present embodiment, it is not necessary to dispose a camera for detecting the line of sight of the user 22 inside the housing 24 of the HMD 10. Instead, the line of sight of the user 22 is detected based on the detection result of the light guided by the optical fibers 48 and 50, whose placement has a high degree of freedom. Thus, the gaze detection system 1 according to the present embodiment can accurately detect the line of sight of the user 22 wearing the HMD 10. Processing according to the line of sight of the user 22 detected in this manner may then be executed, for example, by a game program running on a game console or by a program running on a personal computer.
  • the present invention is not limited to the above-described embodiment.
  • the reflected light detection unit 72 does not have to be a camera.
  • the reflected light detection unit 72 may be a light receiving element such as a light sensor capable of detecting the amount of infrared light.
  • the sight line detector 74 may detect the sight line of the user 22 based on the detection result of the amount of infrared rays detected by the light receiving element.
  • the sight line detection unit 74 may detect the line of sight of the user 22 based on the detection result of the light guided by a larger number of optical fibers or a smaller number of optical fibers.
  • the HMD 10 may have the function of the visual axis detection device 20.
  • the line-of-sight detection device 20 may be incorporated in the connection unit 34 of the HMD 10.
  • the display unit 12 or the display unit 14 may be incorporated in the connection unit 34 of the HMD 10. This allows the user 22 to move his head freely.
  • the present invention is not limited to the gaze detection system 1 including the HMD 10 described above.
  • For example, there are HMDs in which a display unit is disposed at the front of a housing worn on the user's head so that the user directly views the image displayed on the display unit, and HMDs in which the display unit is disposed at the side of the housing and a mirror that reflects the image light emitted forward from the display unit is disposed at the front of the housing so that the user views the image output from the display unit and reflected by the mirror.
  • The present invention is equally applicable to gaze detection systems including such HMDs.
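The S101 to S104 processing flow described above, together with the optional suppression of the displayed image while the infrared light is emitted, can be summarised by the loop sketched below. This is a hedged illustration, not the patent's implementation: the one-second interval is taken from the embodiment, but the function and parameter names, the blank()/restore() display hooks, and the generator structure are assumptions standing in for the hardware operations the patent describes.

```python
import time

IR_INTERVAL_SECONDS = 1.0  # the embodiment emits infrared at roughly one-second intervals

def gaze_detection_loop(emitter, detector, gaze_detector, display=None):
    """S101-S104 sketch: wait for the emission timing, emit infrared into the
    fibers, capture both end face images, detect both gazes, then repeat."""
    while True:
        time.sleep(IR_INTERVAL_SECONDS)          # S101: wait for the emission timing
        if display is not None:
            display.blank()                      # optionally suppress the displayed image
        emitter.emit()                           # S102: infrared into fibers 40 and 42
        left_img = detector.capture_left()       # S103: left end face image
        right_img = detector.capture_right()     #       right end face image
        left_gaze, right_gaze = gaze_detector.detect(left_img, right_img)  # S104
        if display is not None:
            display.restore()
        yield left_gaze, right_gaze              # hand the result to the application
```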

Landscapes

  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Medical Informatics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)

Abstract

Provided are a sight-line detection system for a head-mounted display, whereby the sight line of a user wearing a head-mounted display can be accurately detected, a head-mounted display, and a sight-line detection method for a head-mounted display. A plurality of optical fibers (48a)-(48e) are supported by a housing (24) of a head-mounted display (10) so that one-end surfaces (52a)-(52e) thereof are disposed in different positions facing an eye of a user (22). A sight-line detection device (20) detects, for the plurality of optical fibers (48a)-(48e), light reflected by the eye of the user (22), the reflected light being incident on the one-end surfaces (52a)-(52e) and emitted from other-end surfaces (56a)-(56e). The sight-line detection device (20) detects the sight line of the user (22) on the basis of the result of the above detection.

Description

Eye-gaze detection system for head-mounted display, head-mounted display, and eye-gaze detection method for head-mounted display
The present invention relates to a gaze detection system for a head mounted display, a head mounted display, and a gaze detection method for a head mounted display.

In a head mounted display (HMD) that a user wears on the head to view images, techniques are known for detecting the user's line of sight based on an image of the user's eye captured by a camera arranged inside the housing.

In gaze detection with a conventional HMD, the camera cannot always be placed at a position suitable for detecting the user's line of sight because of constraints on the shape and size of the housing, so the user's line of sight sometimes cannot be detected with sufficient accuracy.

The present invention has been made in view of the above situation, and one of its objects is to provide a gaze detection system for a head mounted display, a head mounted display, and a gaze detection method for a head mounted display that can accurately detect the line of sight of a user wearing the head mounted display.

In order to solve the above problems, the gaze detection system for a head mounted display according to the present invention includes a plurality of optical fibers supported by the housing of the head mounted display and arranged such that one end face of each fiber faces the user's eye at a different position, a reflected light detection unit that detects light reflected by the user's eye which enters the one end face and exits from the other end face of each of the plurality of optical fibers, and a gaze detection unit that detects the user's line of sight based on the result of the detection.

In one aspect of the present invention, the reflected light detection unit is a camera that captures the other end faces of the plurality of optical fibers, and the gaze detection unit detects the user's line of sight based on the image of the other end faces captured by the camera.

A head mounted display according to the present invention includes a housing worn by a user, a plurality of optical fibers supported by the housing and arranged such that one end face of each fiber faces the user's eye at a different position, a reflected light detection unit that detects light reflected by the user's eye which enters the one end face and exits from the other end face of each of the plurality of optical fibers, and a gaze detection unit that detects the user's line of sight based on the result of the detection.

A gaze detection method for a head mounted display according to the present invention includes a step of detecting, for a plurality of optical fibers supported by the housing of the head mounted display and arranged such that one end face of each fiber faces the user's eye at a different position, light reflected by the user's eye which enters the one end face and exits from the other end face, and a step of detecting the user's line of sight based on the result of the detection.
FIG. 1 is a plan view showing an example of a gaze detection system according to an embodiment of the present invention. FIG. 2 is a left side view of the gaze detection system shown in FIG. 1. FIG. 3 is a cross-sectional view taken along line III-III of the gaze detection system shown in FIGS. 1 and 2. FIG. 4 is a diagram showing an example of an end face image. FIG. 5 is a functional block diagram showing an example of functions implemented by the gaze detection device according to an embodiment of the present invention. FIG. 6 is a flow diagram showing an example of the flow of processing performed by the gaze detection device according to an embodiment of the present invention.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings.

FIG. 1 is a plan view showing an example of a gaze detection system 1 according to an embodiment of the present invention. FIG. 2 is a left side view of the gaze detection system 1 shown in FIG. 1. FIG. 3 is a cross-sectional view taken along line III-III of the gaze detection system 1 shown in FIGS. 1 and 2.

As shown in FIGS. 1 and 2, the gaze detection system 1 according to the present embodiment includes a head mounted display (HMD) 10, a display unit 12 for the left eye, a display unit 14 for the right eye, and a gaze detection device 20. The gaze detection device 20 includes a main body 20a, an infrared LED (Light Emitting Diode) 20b, a camera 20c for the left eye, and a camera 20d for the right eye. The display unit 12 and the display unit 14 are displays such as liquid crystal displays or organic EL displays, for example. The camera 20c and the camera 20d are, for example, infrared cameras.

The HMD 10 includes a housing 24 that the user 22 wears on the head, an optical fiber 26 for the left eye that guides the image for the left eye, and an optical fiber 28 for the right eye that guides the image for the right eye. The optical fiber 26 and the optical fiber 28 according to the present embodiment are, for example, flexible.

A support 30 that supports the optical fiber 26 and a support 32 that supports the optical fiber 28 are provided on the front surface of the housing 24. As shown in FIG. 3, the support 30 is a frame that supports the optical fiber 26 in front of the left eye of the user 22, and the support 32 is a frame that supports the optical fiber 28 in front of the right eye of the user 22.

The housing 24 according to the present embodiment also includes a connection portion 34 disposed at the back of the head of the user 22. The optical fiber 26 is connected to the support 30, the connection portion 34, and the display surface of the display unit 12. The optical fiber 28 is connected to the support 32, the connection portion 34, and the display surface of the display unit 14.
Each of the optical fibers 26 according to the present embodiment is associated with at least one pixel constituting the display surface of the display unit 12. Each of the optical fibers 26 is connected to the at least one pixel of the display surface of the display unit 12 associated with it, and guides the light of that pixel or those pixels. Similarly, each of the optical fibers 28 according to the present embodiment is associated with at least one pixel constituting the display surface of the display unit 14, is connected to the at least one associated pixel, and guides the light of that pixel or those pixels.
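The fiber-to-pixel association described in the preceding paragraph can be pictured as a simple lookup table. The sketch below is illustrative only and is not taken from the patent; the fiber identifiers, pixel coordinates, and groupings are hypothetical assumptions.

```python
# Hypothetical sketch of the association between image-guiding fibers and
# display-surface pixels described above. Identifiers and coordinates are
# illustrative assumptions, not values from the patent.
fiber_to_pixels: dict[str, list[tuple[int, int]]] = {
    "fiber_26_0000": [(0, 0)],          # a fiber may carry the light of a single pixel...
    "fiber_26_0001": [(0, 1)],
    "fiber_26_0002": [(0, 2), (0, 3)],  # ...or of a small group of pixels
}

def pixels_for(fiber_id: str) -> list[tuple[int, int]]:
    """Return the display pixels whose light the given fiber guides."""
    return fiber_to_pixels.get(fiber_id, [])
```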
The pixels forming the display surface of the display unit 12 or the display unit 14 may each be composed of a plurality of sub-pixels such as, for example, R, G, and B sub-pixels.

In the present embodiment, for example, the optical fibers 26 connected to the display surface of the display unit 12 are guided to the support 30 via the connection portion 34, along the left side of the head of the user 22. The support 30 supports the bundle of optical fibers 26 from the outside in front of the left eye of the user 22 so that the one end face of each optical fiber 26 that emits the image light faces the left eye of the user 22. The image light from the display unit 12 enters the other end face of each optical fiber 26. Here, for example, the other end face of the optical fiber 26 may be connected to the display unit 12.

Likewise, the optical fibers 28 connected to the display surface of the display unit 14 are guided to the support 32 via the connection portion 34, along the right side of the head of the user 22. The support 32 supports the bundle of optical fibers 28 from the outside in front of the right eye of the user 22 so that the one end face of each optical fiber 28 that emits the image light faces the right eye of the user 22. The image light from the display unit 14 enters the other end face of each optical fiber 28. Here, for example, the other end face of the optical fiber 28 may be connected to the display unit 14.

The light of the image for the left eye, output by the display unit 12 and guided by the optical fibers 26, enters the left eye of the user 22. The light of the image for the right eye, output by the display unit 14 and guided by the optical fibers 28, enters the right eye of the user 22. The user 22 wearing the HMD 10 can thus view, for example, three-dimensional images output by the display unit 12 and the display unit 14.
The HMD 10 according to the present embodiment also includes an optical fiber 40 that emits infrared light toward the left eye of the user 22 and an optical fiber 42 that emits infrared light toward the right eye of the user 22. As shown in FIG. 3, the optical fiber 40 is supported by the support 30, together with the bundle of optical fibers 26, in front of the left eye of the user 22. The optical fiber 42 is supported by the support 32, together with the bundle of optical fibers 28, in front of the right eye of the user 22.

In the present embodiment, for example, the infrared LED 20b of the gaze detection device 20 makes infrared light enter one end face of the optical fiber 40 and one end face of the optical fiber 42. The infrared light is then emitted from the other end face 44 of the optical fiber 40 shown in FIG. 3 toward the left eye of the user 22, and from the other end face 46 of the optical fiber 42 shown in FIG. 3 toward the right eye of the user 22. Although the other end face 44 of the optical fiber 40 is located at the center of the bundle of optical fibers 26 in the example of FIG. 3, the end face 44 does not have to be located at the center of the bundle. Likewise, although the other end face 46 of the optical fiber 42 is located at the center of the bundle of optical fibers 28, the end face 46 does not have to be located at the center of the bundle.

The HMD 10 according to the present embodiment further includes optical fibers 48 (48a to 48e) that guide light reflected by the left eye of the user 22 and optical fibers 50 (50a to 50e) that guide light reflected by the right eye of the user 22.

As shown in FIGS. 1 to 3, the support 30 of the housing 24 supports the plurality of optical fibers 48 such that their one end faces 52 are disposed at different positions, and the plurality of optical fibers 48 are arranged such that the one end faces 52 face the left eye of the user 22. For example, the one end face 52a of the optical fiber 48a is supported by the support 30 so as to be located below the support 30, the one end face 52b of the optical fiber 48b on the left, the one end face 52c of the optical fiber 48c at the upper left, the one end face 52d of the optical fiber 48d on the right, and the one end face 52e of the optical fiber 48e at the upper right. In the present embodiment the optical fibers 48 pass through holes formed in the frame of the support 30, but they do not have to.

Similarly, as shown in FIGS. 1 to 3, the support 32 of the housing 24 supports the plurality of optical fibers 50 such that their one end faces 54 are disposed at different positions, and the plurality of optical fibers 50 are arranged such that the one end faces 54 face the right eye of the user 22. For example, the one end face 54a of the optical fiber 50a is supported by the support 32 so as to be located below the support 32, the one end face 54b of the optical fiber 50b on the right, the one end face 54c of the optical fiber 50c at the upper right, the one end face 54d of the optical fiber 50d on the left, and the one end face 54e of the optical fiber 50e at the upper left. In the present embodiment the optical fibers 50 pass through holes formed in the frame of the support 32, but they do not have to.
In the present embodiment, for example, the infrared light emitted from the other end face 44 of the optical fiber 40 and reflected by the left eye of the user 22 enters the one end faces 52 of the optical fibers 48, and the infrared light emitted from the other end face 46 of the optical fiber 42 and reflected by the right eye of the user 22 enters the one end faces 54 of the optical fibers 50.

In the present embodiment, for example, the camera 20c captures the other end faces 56 of the plurality of optical fibers 48. For example, the camera 20c photographs the other end face 56a of the optical fiber 48a, the other end face 56b of the optical fiber 48b, the other end face 56c of the optical fiber 48c, the other end face 56d of the optical fiber 48d, and the other end face 56e of the optical fiber 48e.

Similarly, in the present embodiment, for example, the camera 20d captures the other end faces 58 of the plurality of optical fibers 50. For example, the camera 20d photographs the other end face 58a of the optical fiber 50a, the other end face 58b of the optical fiber 50b, the other end face 58c of the optical fiber 50c, the other end face 58d of the optical fiber 50d, and the other end face 58e of the optical fiber 50e.

Hereinafter, an image obtained by photographing the other end faces 56 of the optical fibers 48 is referred to as a left end face image, and an image obtained by photographing the other end faces 58 of the optical fibers 50 is referred to as a right end face image. FIG. 4 shows an example of a left end face image 60, which is an image obtained by photographing the other end faces 56 of the plurality of optical fibers 48 with the camera 20c. As shown in FIG. 4, the left end face image 60 shows images of the end faces 56a to 56e.
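Because the end faces 56a to 56e appear at fixed, known locations in the left end face image, the per-fiber reflected-light intensity can be read out by averaging a small window around each location. The following is a minimal sketch under assumed coordinates; the pixel positions, window radius, and function name are hypothetical and are not taken from the patent.

```python
import numpy as np

# Assumed pixel coordinates of the fiber end faces 56a-56e in the left
# end face image (hypothetical values for illustration only).
END_FACE_POSITIONS = {
    "56a": (120, 160), "56b": (60, 100), "56c": (60, 40),
    "56d": (180, 100), "56e": (180, 40),
}

def end_face_brightness(image: np.ndarray, radius: int = 5) -> dict[str, float]:
    """Average the grayscale intensity in a small window around each end face."""
    values = {}
    for name, (x, y) in END_FACE_POSITIONS.items():
        patch = image[y - radius:y + radius + 1, x - radius:x + radius + 1]
        values[name] = float(patch.mean())
    return values

# Example usage with a dummy 200x240 grayscale frame:
frame = np.zeros((200, 240), dtype=np.uint8)
print(end_face_brightness(frame))
```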
In the present embodiment, for example, the gaze detection device 20 detects the line of sight of the left eye of the user 22 based on the detection result of the reflected light from the left eye of the user 22 that enters the end faces 52 of the optical fibers 48 and exits from the end faces 56. Here, for example, the gaze detection device 20 may detect the line of sight of the left eye of the user 22 based on the left end face image 60.

Similarly, the gaze detection device 20 detects the line of sight of the right eye of the user 22 based on the detection result of the reflected light from the right eye of the user 22 that enters the end faces 54 of the optical fibers 50 and exits from the end faces 58.

The functions of the gaze detection device 20 and the processing executed by the gaze detection device 20 are described in more detail below.
The main body 20a of the gaze detection device 20 according to the present embodiment may be, for example, a computer including a processor such as a microprocessor, storage elements such as ROM and RAM, and a storage unit such as a hard disk drive. The processor included in the main body 20a may operate in accordance with a program installed in the main body 20a.

FIG. 5 is a functional block diagram showing an example of functions implemented by the gaze detection device 20 according to the present embodiment. Note that not all of the functions shown in FIG. 5 need to be implemented in the gaze detection device 20 according to the present embodiment, and functions other than those shown in FIG. 5 may be implemented.
As shown in FIG. 5, the gaze detection device 20 functionally includes, for example, an infrared emitting unit 70, a reflected light detection unit 72, and a gaze detection unit 74. The infrared emitting unit 70 is implemented mainly by, for example, the infrared LED 20b. The reflected light detection unit 72 is implemented mainly by, for example, the processor of the main body 20a, the camera 20c, and the camera 20d. The gaze detection unit 74 is implemented mainly by, for example, the processor and the storage unit of the main body 20a.
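The three functional blocks above can be pictured with the following skeleton. This is a structural sketch only, not code from the patent; the class names, method signatures, and return types are assumptions introduced for illustration.

```python
from typing import Protocol, Tuple
import numpy as np

class InfraredEmitter(Protocol):
    """Infrared emitting unit 70: drives the infrared LED 20b."""
    def emit(self) -> None: ...

class ReflectedLightDetector(Protocol):
    """Reflected light detection unit 72: camera 20c / camera 20d plus the processor."""
    def capture_left(self) -> np.ndarray: ...   # left end face image
    def capture_right(self) -> np.ndarray: ...  # right end face image

class GazeDetector:
    """Gaze detection unit 74: estimates gaze from the end face images.
    The estimation method (trained model, brightness distribution, ...) is left open."""
    def detect(self, left_image: np.ndarray, right_image: np.ndarray) -> Tuple[object, object]:
        return self._estimate(left_image), self._estimate(right_image)

    def _estimate(self, image: np.ndarray):
        raise NotImplementedError("plug in a concrete estimation method here")
```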
The above functions may be implemented by the processor included in the main body 20a of the gaze detection device 20, which is a computer, executing a program that is installed in the main body 20a and that includes instructions corresponding to the above functions. This program may be supplied to the main body 20a of the gaze detection device 20 via a computer readable information storage medium such as an optical disk, a magnetic disk, a magnetic tape, a magneto-optical disk, or a flash memory, or via the Internet, for example.
In the present embodiment, the infrared emitting unit 70 causes infrared light to enter, for example, one end face of the optical fiber 40 and one end face of the optical fiber 42.

In the present embodiment, the reflected light detection unit 72 detects, for example, the reflected light from the left eye of the user 22 that enters the one end faces 52 of the optical fibers 48 and exits from the other end faces 56, and the reflected light from the right eye of the user 22 that enters the one end faces 54 of the optical fibers 50 and exits from the other end faces 58.

As described above, the reflected light detection unit 72 may be a camera. The reflected light detection unit 72 may then generate, for example, a left end face image by photographing the other end faces 56 of the optical fibers 48, which emit the light that entered the one end faces 52; for example, the reflected light detection unit 72 may generate the left end face image 60 shown in FIG. 4. The reflected light detection unit 72 may likewise generate a right end face image by photographing the other end faces 58 of the optical fibers 50, which emit the light that entered the one end faces 54.

In the present embodiment, the gaze detection unit 74 detects, for example, the line of sight of the user 22, such as the movement of the pupil or the direction of the line of sight, based on the detection result by the reflected light detection unit 72. The gaze detection unit 74 may detect the line of sight of the left eye of the user 22 based on, for example, the left end face image, which is an image of the other end faces 56 of the optical fibers 48 captured by the camera 20c, and may detect the line of sight of the right eye of the user 22 based on, for example, the right end face image, which is an image of the other end faces 58 of the optical fibers 50 captured by the camera 20d.
 例えば端面52a~52eのそれぞれに入射する赤外線の量は、ユーザ22の左眼の視線の向きに応じたものとなる。また左端面画像60に示されている端面56a~56eの明るさは、それぞれ、端面52a~52eに入射する赤外線の量に応じたものとなる。このことを踏まえて、本実施形態では視線検出部74が、端面56a~56eの像が示されている左端面画像に基づいて、ユーザ22の左眼の視線を検出する。同様に、本実施形態では視線検出部74が、右端面画像に基づいて、ユーザ22の右眼の視線を検出する。 For example, the amount of infrared rays incident on each of the end faces 52a to 52e corresponds to the direction of the line of sight of the left eye of the user 22. The brightness of the end faces 56a to 56e shown in the left end face image 60 corresponds to the amount of infrared rays incident on the end faces 52a to 52e, respectively. Based on this, in the present embodiment, the line-of-sight detection unit 74 detects the line of sight of the left eye of the user 22 based on the left end face image in which the images of the end faces 56a to 56e are shown. Similarly, in the present embodiment, the gaze detection unit 74 detects the gaze of the right eye of the user 22 based on the right end face image.
 The method of detecting the line of sight from the left end face image and the right end face image is not particularly limited. For example, the line of sight of the user 22 may be detected by a machine learning technique such as deep learning.
 For example, the line-of-sight detection unit 74 may store a trained model obtained by learning training data that includes, as learning input data, images showing the end faces 56a to 56e and, as teacher data, data indicating the direction of the line of sight of the left eye of the user 22 at the time each image was captured. In this case, the trained model is a machine learning model produced by supervised learning. The line-of-sight detection unit 74 may then detect (estimate) the line of sight of the left eye of the user 22 at the time a left end face image was captured, based on the output obtained when that left end face image is input to the trained model.
 Likewise, the line-of-sight detection unit 74 may store a trained model obtained by learning training data that includes, as learning input data, images showing the end faces 58a to 58e and, as teacher data, data indicating the direction of the line of sight of the right eye of the user 22 at the time each image was captured. In this case too, the trained model is a machine learning model produced by supervised learning. The line-of-sight detection unit 74 may then detect (estimate) the line of sight of the right eye of the user 22 at the time a right end face image was captured, based on the output obtained when that right end face image is input to the trained model.
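 By way of illustration only, the following sketch shows one way such a supervised model could be set up. It is not part of the embodiment: PyTorch, the network architecture, and the (yaw, pitch) output parameterization are assumptions introduced here for concreteness.

```python
# Illustrative sketch only: a small supervised model mapping an end face image
# to a gaze direction. PyTorch, the layer sizes, and the (yaw, pitch) output
# are assumptions, not part of the described embodiment.
import torch
import torch.nn as nn

class GazeFromEndFaceImage(nn.Module):
    """Maps a single-channel end face image to a 2-D gaze direction (yaw, pitch)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Linear(32 * 4 * 4, 2)  # (yaw, pitch), e.g. in radians

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train_step(model, optimizer, end_face_image, gaze_label):
    """One supervised update: end face image as input, measured gaze direction as teacher data."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(end_face_image), gaze_label)
    loss.backward()
    optimizer.step()
    return loss.item()
```

 At inference time, passing a newly captured left end face image (or a right end face image, with a separately trained model) through such a network would yield the estimated gaze direction.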
 Alternatively, the line-of-sight detection unit 74 may generate data that associates the positions at which the end faces 52a to 52e are arranged with values indicating the brightness of the end faces 56a to 56e shown in the left end face image, and may detect the line of sight of the left eye of the user 22 based on the distribution of the brightness values associated with the positions of the end faces 52a to 52e indicated by that data. Similarly, the line-of-sight detection unit 74 may generate data that associates the positions at which the end faces 54a to 54e are arranged with values indicating the brightness of the end faces 58a to 58e shown in the right end face image, and may detect the line of sight of the right eye of the user 22 based on the distribution of the brightness values associated with the positions of the end faces 54a to 54e indicated by that data.
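 As a purely illustrative sketch of this brightness-distribution approach: the fiber positions, the pixel coordinates of the end faces in the image, and the brightness-weighted-centroid rule below are all assumptions; the embodiment does not prescribe a specific rule.

```python
# Illustrative sketch only: estimate gaze from the distribution of end face
# brightness. FIBER_POSITIONS and IMAGE_COORDS are hypothetical values.
import numpy as np

# Assumed positions (relative to the eye) of the receiving end faces 52a-52e,
# and the pixel coordinates where end faces 56a-56e appear in the left end face image.
FIBER_POSITIONS = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, -1.0], [0.0, 1.0], [0.0, 0.0]])
IMAGE_COORDS = [(10, 10), (10, 30), (30, 10), (30, 30), (20, 20)]

def end_face_brightness(end_face_image, coords, window=3):
    """Mean brightness of a small window around each end face in the captured image."""
    h = window // 2
    return np.array([
        end_face_image[y - h:y + h + 1, x - h:x + h + 1].mean() for (y, x) in coords
    ])

def estimate_gaze(end_face_image):
    """Brightness-weighted centroid of the fiber positions as a crude gaze estimate."""
    brightness = end_face_brightness(end_face_image, IMAGE_COORDS)
    weights = brightness / (brightness.sum() + 1e-9)
    return weights @ FIBER_POSITIONS  # 2-D vector pointing toward the strongest reflection
```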
 An example of the flow of the line-of-sight detection process performed by the line-of-sight detection device 20 according to the present embodiment will now be described with reference to the flowchart illustrated in FIG. 6.
 First, the infrared emitting unit 70 waits for the infrared emission timing (S101). In the present embodiment, infrared light is emitted, for example, at one-second intervals. When the emission timing arrives, the infrared emitting unit 70 emits infrared light toward one end face of the optical fiber 40 and one end face of the optical fiber 42 (S102).
 The reflected light detection unit 72 then generates the left end face image and the right end face image (S103).
 The line-of-sight detection unit 74 then detects the line of sight of the left eye of the user 22 based on the left end face image generated in S103, and detects the line of sight of the right eye of the user 22 based on the right end face image generated in S103 (S104). The process then returns to S101.
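 The loop S101 to S104 can be summarized with the following illustrative sketch. The names emit_infrared, capture_end_face_images, and detect_gaze are hypothetical stand-ins for the infrared emitting unit 70, the reflected light detection unit 72, and the line-of-sight detection unit 74; they are not an API defined by the embodiment.

```python
# Illustrative sketch only of the S101-S104 loop; the callback names are assumptions.
import time

EMISSION_INTERVAL_S = 1.0  # infrared is emitted at one-second intervals in this example

def gaze_detection_loop(emit_infrared, capture_end_face_images, detect_gaze):
    while True:
        time.sleep(EMISSION_INTERVAL_S)                       # S101: wait for the emission timing
        emit_infrared()                                       # S102: emit toward fibers 40 and 42
        left_image, right_image = capture_end_face_images()   # S103: left and right end face images
        left_gaze = detect_gaze(left_image)                   # S104: left-eye line of sight
        right_gaze = detect_gaze(right_image)                 # S104: right-eye line of sight
        yield left_gaze, right_gaze                           # then return to S101
```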
 As described above, the infrared light that enters one end face of the optical fiber 40 in S102 exits from the other end face 44 of the optical fiber 40 toward the left eye of the user 22 and is reflected by the left eye of the user 22. Likewise, the infrared light that enters one end face of the optical fiber 42 in S102 exits from the other end face 46 of the optical fiber 42 toward the right eye of the user 22 and is reflected by the right eye of the user 22.
 The infrared light reflected by the left eye of the user 22 enters the end faces 52a to 52e of the optical fibers 48a to 48e, and the infrared light reflected by the right eye of the user 22 enters the end faces 54a to 54e of the optical fibers 50a to 50e.
 The infrared light that enters the end faces 52a to 52e of the optical fibers 48a to 48e exits from the other end faces 56a to 56e, and the infrared light that enters the end faces 54a to 54e of the optical fibers 50a to 50e exits from the other end faces 58a to 58e.
 In S103, the left end face image is generated by capturing the end faces 56a to 56e from which the infrared light is emitted in this way, and the right end face image is generated by capturing the end faces 58a to 58e from which the infrared light is emitted in this way.
 In this processing example, the display of video by the display unit 12 and the display unit 14 may be suppressed at the timing at which the infrared light is emitted. For example, when the frame rate of the display unit 12 and the display unit 14 is 60 fps, the display of video may be suppressed once every 60 frames. This reduces the influence that the video output by the display unit 12 and the display unit 14 has on the detection of the line of sight of the user 22.
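 A minimal illustrative sketch of this timing follows, assuming a 60 fps display so that one frame per second is blanked; render_frame, blank_frame, and emit_infrared are hypothetical callbacks, not part of the embodiment.

```python
# Illustrative sketch only: blank one display frame per second and emit infrared
# during that frame so the displayed video does not disturb the measurement.
SUPPRESS_EVERY_N_FRAMES = 60  # at 60 fps, one blanked frame per second

def present_frame(frame_index, render_frame, blank_frame, emit_infrared):
    if frame_index % SUPPRESS_EVERY_N_FRAMES == 0:
        blank_frame()      # suppress the video for this frame
        emit_infrared()    # measure while nothing is being displayed
    else:
        render_frame()
```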
 Alternatively, for example, instead of the video to be viewed by the user, the display unit 12 and the display unit 14 may output visible light of a predetermined color at predetermined time intervals (for example, at one-second intervals). The reflected light detection unit 72 may then generate a left end face image by capturing the end faces 56a to 56e from which the visible light reflected by the left eye of the user 22 is emitted, and a right end face image by capturing the end faces 58a to 58e from which the visible light reflected by the right eye of the user 22 is emitted. In this way, the line of sight of the user 22 can be detected without providing the infrared LED 20b.
 In line-of-sight detection in a conventional HMD 10, the line of sight of the user 22 is detected based on an image of the eye of the user 22 captured by a camera arranged inside the housing 24. In such an HMD 10, the camera cannot always be placed at a position suitable for detecting the line of sight of the user 22 because of constraints on the shape and size of the housing 24, and the line of sight of the user 22 may not be detected with sufficient accuracy.
 In the line-of-sight detection system 1 according to the present embodiment, there is no need to place a camera for detecting the line of sight of the user 22 inside the housing 24 of the HMD 10. Instead, the line of sight of the user 22 is detected based on the detection results of the light guided by the optical fibers 48 and 50, which can be arranged with a high degree of freedom. The line-of-sight detection system 1 according to the present embodiment can therefore accurately detect the line of sight of the user 22 wearing the HMD 10. Processing according to the line of sight of the user 22 detected in this way may then be executed, for example, in a game program executed on a game console or in a program executed on a personal computer.
 The present invention is not limited to the embodiment described above.
 For example, the reflected light detection unit 72 does not have to be a camera. The reflected light detection unit 72 may be a light-receiving element, such as a photosensor, capable of detecting the amount of infrared light. In this case, the line-of-sight detection unit 74 may detect the line of sight of the user 22 based on the amounts of infrared light detected by the light-receiving element.
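 As a purely illustrative sketch of this camera-less variant: one light-receiving element per receiving fiber reports the amount of infrared light it sees, and the readings are combined directly. read_sensor and the brightness-weighted-centroid rule (the same illustrative rule as above) are assumptions, not part of the embodiment.

```python
# Illustrative sketch only: gaze estimation from per-fiber light-receiving
# elements instead of a camera. read_sensor(i) is a hypothetical function
# returning the infrared amount measured behind the i-th receiving fiber.
import numpy as np

FIBER_POSITIONS = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, -1.0], [0.0, 1.0], [0.0, 0.0]])

def estimate_gaze_from_sensors(read_sensor, num_fibers=5):
    readings = np.array([read_sensor(i) for i in range(num_fibers)], dtype=float)
    weights = readings / (readings.sum() + 1e-9)
    return weights @ FIBER_POSITIONS  # crude 2-D estimate of where the reflection is strongest
```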
 Moreover, the line-of-sight detection unit 74 does not have to detect the line of sight of the user 22 based on the detection results of the light guided by five optical fibers; it may detect the line of sight of the user 22 based on the detection results of the light guided by a larger or smaller number of optical fibers.
 The HMD 10 may also have the functions of the line-of-sight detection device 20. For example, the line-of-sight detection device 20 may be built into the connection unit 34 of the HMD 10, and the display unit 12 and the display unit 14 may also be built into the connection unit 34 of the HMD 10. This allows the user 22 to move his or her head freely.
 The present invention is also not limited to a line-of-sight detection system 1 including the HMD 10 described above. Some HMDs have a display unit arranged in front of the housing worn on the user's head so that the user can directly view the video displayed on the display unit. In other HMDs, a display unit is arranged on a side surface of the housing, and a mirror that reflects the video light emitted forward from the display unit is arranged on the front surface of the housing, so that the user can view the video output from the display unit and reflected by the mirror. The present invention is equally applicable to line-of-sight detection systems including such HMDs.

Claims (4)

  1.  A sight-line detection system for a head mounted display, comprising:
      a plurality of optical fibers supported by a housing of the head mounted display, one end face of each of the optical fibers being arranged at a different position so as to face an eye of a user;
      a reflected light detection unit that detects light reflected by the eye of the user, the reflected light entering the one end faces of the plurality of optical fibers and exiting from the other end faces thereof; and
      a sight-line detection unit that detects a line of sight of the user based on a result of the detection.
  2.  The sight-line detection system for a head mounted display according to claim 1, wherein
      the reflected light detection unit is a camera that captures the other end faces of the plurality of optical fibers, and
      the sight-line detection unit detects the line of sight of the user based on an image of the other end faces captured by the camera.
  3.  A head mounted display comprising:
      a housing to be worn by a user;
      a plurality of optical fibers supported by the housing, one end face of each of the optical fibers being arranged at a different position so as to face an eye of the user;
      a reflected light detection unit that detects light reflected by the eye of the user, the reflected light entering the one end faces of the plurality of optical fibers and exiting from the other end faces thereof; and
      a sight-line detection unit that detects a line of sight of the user based on a result of the detection.
  4.  A sight-line detection method for a head mounted display, comprising:
      detecting light reflected by an eye of a user, the reflected light entering one end faces of a plurality of optical fibers supported by a housing of the head mounted display, the one end faces being arranged at different positions so as to face the eye of the user, and exiting from the other end faces of the plurality of optical fibers; and
      detecting a line of sight of the user based on a result of the detection.
PCT/JP2018/036290 2017-10-05 2018-09-28 Sight-line detection system for head-mounted display, head-mounted display, and sight-line detection method for head-mounted display WO2019069812A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-194956 2017-10-05
JP2017194956 2017-10-05

Publications (1)

Publication Number Publication Date
WO2019069812A1 (en)

Family

ID=65994606

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/036290 WO2019069812A1 (en) 2017-10-05 2018-09-28 Sight-line detection system for head-mounted display, head-mounted display, and sight-line detection method for head-mounted display

Country Status (1)

Country Link
WO (1) WO2019069812A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7538647B2 (en) 2020-07-27 2024-08-22 キヤノン株式会社 Gaze position processing device, imaging device, learning device, gaze position processing method, learning method, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10232364A (en) * 1996-07-26 1998-09-02 Universal Ventures Focus adjustment by spatial relation of eyes
JP2000258724A (en) * 1999-03-09 2000-09-22 Agency Of Ind Science & Technol Picture display device
JP2016520891A (en) * 2013-03-15 2016-07-14 マジック リープ, インコーポレイテッド Display system and method
US20160363763A1 (en) * 2015-06-15 2016-12-15 Electronics And Telecommunications Research Institute Human factor-based wearable display apparatus
JP2017507400A (en) * 2013-12-31 2017-03-16 アイフルエンス, インコーポレイテッドEyefluence, Inc. System and method for media selection and editing by gaze

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18864291; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18864291; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)