WO2022044395A1 - Sight line detection device, sight line detection method, and sight line detection program


Info

Publication number: WO2022044395A1
Authority: WO (WIPO (PCT))
Application number: PCT/JP2021/010240
Other languages: French (fr), Japanese (ja)
Prior art keywords: light source, light, center, detection, subject
Inventor: 美玖 小村田
Original Assignee: 株式会社Jvcケンウッド

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 - Objective types for determining or recording eye movement
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/038 - Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry

Description

  • The present disclosure relates to a line-of-sight detection device, a line-of-sight detection method, and a line-of-sight detection program.
  • A line-of-sight detection device is known in which a light source emits detection light to irradiate a subject's eyeball, an image of the eyeball irradiated with the detection light is acquired, the pupil center and the corneal curvature center are calculated based on the image of the pupil and the reflected image of the detection light in the acquired image, and a vector from the corneal curvature center to the pupil center is detected as the subject's line-of-sight direction (see, for example, Patent Document 1).
  • Depending on the position the subject is looking at, the reflected image of the detection light emitted to the subject's eyeball may lie at the boundary between the cornea and the sclera. In this case, the shape of the reflected image of the detection light is distorted, so that the detection accuracy of the line of sight may decrease.
  • The present disclosure has been made in view of the above, and an object of the present disclosure is to provide a line-of-sight detection device, a line-of-sight detection method, and a line-of-sight detection program capable of suppressing a decrease in detection accuracy.
  • The line-of-sight detection device according to the present disclosure includes: a display unit that displays an image; a plurality of light sources that emit detection light to irradiate at least one eyeball of a subject; an image pickup unit that captures an image of the eyeball irradiated with the detection light; a position detection unit that detects, from the captured image, the position of the pupil center indicating the center of the pupil of the eyeball irradiated with the detection light and the position of the corneal reflection center indicating the center of the corneal reflection; a gaze point detection unit that calculates the position of the subject's gaze point based on the position of the pupil center and the position of the corneal curvature center; and a light source control unit that controls light emission and non-emission of the plurality of light sources and switches the light source that emits the detection light among the plurality of light sources according to the position of the subject's gaze point.
  • The line-of-sight detection method according to the present disclosure includes: displaying an image on a display unit; emitting detection light from a plurality of light sources to irradiate at least one eyeball of a subject; capturing an image of the eyeball irradiated with the detection light; detecting, from the captured image, the position of the pupil center and the position of the corneal reflection center; calculating the position of the subject's gaze point based on the position of the pupil center and the position of the corneal curvature center; and controlling light emission and non-emission of the plurality of light sources to switch the light source that emits the detection light among the plurality of light sources according to the position of the subject's gaze point.
  • The line-of-sight detection program according to the present disclosure causes a computer to execute: a process of displaying an image on a display unit; a process of emitting detection light from a plurality of light sources to irradiate at least one eyeball of a subject; a process of capturing an image of the eyeball irradiated with the detection light; a process of detecting, from the captured image, the position of the pupil center indicating the center of the pupil of the eyeball irradiated with the detection light and the position of the corneal reflection center indicating the center of the corneal reflection; a process of calculating the position of the subject's gaze point based on the position of the pupil center and the position of the corneal curvature center; and a process of controlling light emission and non-emission of the plurality of light sources to switch the light source that emits the detection light among the plurality of light sources according to the position of the subject's gaze point.
  • FIG. 1 is a perspective view schematically showing an example of a line-of-sight detection device according to the present embodiment.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the line-of-sight detection device according to the present embodiment.
  • FIG. 3 is a functional block diagram showing an example of the line-of-sight detection device according to the present embodiment.
  • FIG. 4 is a diagram showing an example in which the eyeball is illuminated by the first light source and the second light source.
  • FIG. 5 is a schematic diagram for explaining the principle of the calibration process according to the present embodiment.
  • FIG. 6 is a schematic diagram for explaining the principle of the line-of-sight detection process according to the present embodiment.
  • FIG. 7 is a diagram showing an example of an eyeball on which a reflected image of the detection light is formed.
  • FIG. 8 is a diagram showing another example of an eyeball on which a reflected image of the detection light is formed.
  • FIG. 9 is a diagram showing an example of the ranges of the gaze point in which the reflected image of the detection light fits on the cornea of the subject for each light source.
  • FIG. 10 is a diagram showing an example of the operation of the display unit and the lighting device in the calibration process.
  • FIG. 11 is a diagram showing an example of the operation of the display unit and the lighting device in the calibration process.
  • FIG. 12 is a diagram showing an example of the operation of the display unit and the lighting device in the line-of-sight detection process.
  • FIG. 13 is a diagram showing an example of the operation of the display unit and the lighting device in the line-of-sight detection process.
  • FIG. 14 is a diagram showing another example of the operation of the display unit and the lighting device in the line-of-sight detection process.
  • FIG. 15 is a diagram showing another example of the operation of the display unit and the lighting device in the line-of-sight detection process.
  • FIG. 16 is a flowchart showing an example of the calibration process in the line-of-sight detection method according to the present embodiment.
  • FIG. 17 is a flowchart showing an example of the line-of-sight detection process in the line-of-sight detection method according to the present embodiment.
  • In the following description, the direction parallel to the first axis of a predetermined plane is the X-axis direction, the direction parallel to the second axis of the predetermined plane orthogonal to the first axis is the Y-axis direction, and the direction parallel to the third axis orthogonal to each of the first axis and the second axis is the Z-axis direction. The predetermined plane includes the XY plane.
  • FIG. 1 is a perspective view schematically showing an example of the line-of-sight detection device 100 according to the present embodiment.
  • The line-of-sight detection device 100 includes a display unit 101, a stereo camera device 102, and a lighting device 103.
  • The display unit 101 includes a flat panel display such as a liquid crystal display (LCD) or an organic electroluminescence display (OLED). The display unit 101 displays an image. In the present embodiment, the display unit 101 displays, for example, an index for evaluating the visual function of the subject.
  • The display unit 101 is substantially parallel to the XY plane. The X-axis direction is the left-right direction of the display unit 101, the Y-axis direction is the vertical direction of the display unit 101, and the Z-axis direction is the depth direction orthogonal to the display unit 101.
  • The stereo camera device 102 has a first camera 102A and a second camera 102B. The stereo camera device 102 is arranged below the display unit 101. The first camera 102A and the second camera 102B are arranged in the X-axis direction, with the first camera 102A in the −X direction with respect to the second camera 102B.
  • Each of the first camera 102A and the second camera 102B includes an infrared camera and has, for example, an optical system capable of transmitting near-infrared light having a wavelength of 850 [nm] and an image pickup element capable of receiving the near-infrared light.
  • The lighting device (light source) 103 has a first light source (lower first light source) 103A, a second light source (lower second light source) 103B, and a third light source (upper light source) 103C. The first light source 103A and the second light source 103B are arranged below the display unit 101, side by side in the X-axis direction. The first light source 103A is arranged in the −X direction with respect to the first camera 102A, and the second light source 103B is arranged in the +X direction with respect to the second camera 102B. The third light source 103C is arranged above the display unit 101, at a position between the first camera 102A and the second camera 102B in the X direction, corresponding to the center of the display unit 101 in the X direction. The first light source 103A, the second light source 103B, and the third light source 103C each include an LED (light emitting diode) light source and can emit near-infrared light having a wavelength of, for example, 850 [nm].
  • In the present embodiment, the first light source 103A and the second light source 103B may be arranged between the first camera 102A and the second camera 102B. Further, the stereo camera device 102 may be arranged above the display unit 101.
  • The lighting device 103 emits near-infrared light, which is the detection light, to illuminate the subject's eyeball 111.
  • The stereo camera device 102 photographs a part of the eyeball 111 (hereinafter, this part is also simply referred to as the "eyeball") with the second camera 102B when the detection light emitted from the first light source 103A is applied to the eyeball 111, and photographs the eyeball 111 with the first camera 102A when the detection light emitted from the second light source 103B is applied to the eyeball 111.
  • The stereo camera device 102 photographs the eyeball 111 with both the first camera 102A and the second camera 102B when the detection light emitted from the third light source 103C is applied to the eyeball 111.
  • A frame synchronization signal is output from at least one of the first camera 102A and the second camera 102B. The first light source 103A, the second light source 103B, and the third light source 103C emit the detection light based on the frame synchronization signal. The first camera 102A captures image data of the eyeball 111 when the detection light emitted from the second light source 103B irradiates the eyeball 111, and the second camera 102B captures image data of the eyeball 111 when the detection light emitted from the first light source 103A irradiates the eyeball 111. The first camera 102A and the second camera 102B alternately capture image data of the eyeball 111 when the detection light emitted from the third light source 103C is applied to the eyeball 111.
  • When the eyeball 111 is irradiated with the detection light, a part of the detection light is reflected by the pupil 112, and the light from the pupil 112 is incident on the stereo camera device 102. Further, when the eyeball 111 is irradiated with the detection light, a corneal reflection image 113, which is a virtual image of the light source, is formed on the eyeball 111, and the light from the corneal reflection image 113 is incident on the stereo camera device 102.
  • The intensity of the light incident on the stereo camera device 102 from the pupil 112 is low, and the intensity of the light incident on the stereo camera device 102 from the corneal reflection image 113 is high. That is, the image of the pupil 112 taken by the stereo camera device 102 has low brightness, and the corneal reflection image 113 has high brightness. The stereo camera device 102 can detect the position of the pupil 112 and the position of the corneal reflection image 113 based on the brightness of the captured image.
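  • As a minimal illustration of this brightness-based detection (a sketch, not the patent's specified algorithm), the following Python code assumes an 8-bit grayscale eye image as a NumPy array and estimates each center as the centroid of thresholded pixels; the threshold values are illustrative placeholders.

```python
import numpy as np

def detect_centers(gray, pupil_thresh=30, reflex_thresh=220):
    """Estimate the pupil center (dark blob) and the corneal reflection
    center (bright blob) from an 8-bit grayscale eye image by taking the
    centroid of pixels below / above illustrative brightness thresholds."""
    ys, xs = np.nonzero(gray < pupil_thresh)    # low-brightness pixels: pupil 112
    pupil = (xs.mean(), ys.mean()) if xs.size else None
    ys, xs = np.nonzero(gray > reflex_thresh)   # high-brightness pixels: reflection image 113
    reflex = (xs.mean(), ys.mean()) if xs.size else None
    return pupil, reflex
```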
  • FIG. 2 is a diagram showing an example of the hardware configuration of the line-of-sight detection device 100 according to the present embodiment.
  • The line-of-sight detection device 100 includes the display unit 101, the stereo camera device 102, the lighting device 103, a computer system (control unit) 20, an input/output interface device 30, a drive circuit 40, an output device 50, and an input device 60.
  • The computer system 20 includes an arithmetic processing unit 20A and a storage device 20B. The arithmetic processing unit 20A includes a microprocessor such as a CPU (central processing unit). The storage device 20B includes a memory or storage such as a ROM (read only memory) and a RAM (random access memory). The arithmetic processing unit 20A performs arithmetic processing according to a computer program 20C stored in the storage device 20B.
  • The drive circuit 40 generates drive signals and outputs them to the display unit 101, the stereo camera device 102, and the lighting device 103. Further, the drive circuit 40 supplies the image data of the eyeball 111 taken by the stereo camera device 102 to the computer system 20 via the input/output interface device 30.
  • The output device 50 includes a display unit such as a flat panel display, and may include a printing device. The input device 60 generates input data when operated, and includes a keyboard or mouse for a computer system. The input device 60 may include a touch sensor provided on the display screen of the output device 50, which is a display unit.
  • In the present embodiment, the display unit 101 and the computer system 20 are separate devices, but they may be integrated. For example, a tablet-type personal computer may include the computer system 20, the input/output interface device 30, the drive circuit 40, and the display unit 101.
  • FIG. 3 is a functional block diagram showing an example of the line-of-sight detection device 100 according to the present embodiment.
  • The input/output interface device 30 has an input/output unit 302.
  • The drive circuit 40 has: a display device drive unit 402 that generates a drive signal for driving the display unit 101 and outputs it to the display unit 101; a first camera input/output unit 404A that generates a drive signal for driving the first camera 102A and outputs it to the first camera 102A; a second camera input/output unit 404B that generates a drive signal for driving the second camera 102B and outputs it to the second camera 102B; and a light source drive unit 406 that generates drive signals for driving the first light source 103A, the second light source 103B, and the third light source 103C and outputs them to the respective light sources.
  • The first camera input/output unit 404A supplies the image data of the eyeball 111 taken by the first camera 102A to the computer system 20 via the input/output unit 302. The second camera input/output unit 404B supplies the image data of the eyeball 111 taken by the second camera 102B to the computer system 20 via the input/output unit 302.
  • The computer system 20 controls the line-of-sight detection device 100.
  • The computer system 20 has a display control unit 21, a light source control unit 22, an image data acquisition unit 23, a position detection unit 24, a curvature center calculation unit 25, a gaze point detection unit 26, a storage unit 27, and an output control unit 28. The functions of the computer system 20 are exhibited by the arithmetic processing unit 20A and the storage device 20B.
  • The display control unit 21 causes the display unit 101 to display an image to be shown to the subject. The display control unit 21 can display, for example, a target image in the calibration process at a plurality of positions (target positions) on the display unit 101. The display control unit 21 may display the target image while switching it among the plurality of target positions one by one, or may display the target image so that it moves sequentially to the plurality of target positions on the display unit 101. The number of target positions at which the target image is displayed can be set, for example, by an operator's input using the input device 60 or the like.
  • The light source control unit 22 controls the light source drive unit 406 to control light emission and non-emission of the first light source 103A, the second light source 103B, and the third light source 103C. The light source control unit 22 switches the light source that emits the detection light among the first light source 103A, the second light source 103B, and the third light source 103C according to the position of the subject's gaze point. The light source control unit 22 causes the first light source 103A and the second light source 103B to emit the detection light when the position of the subject's gaze point is in the lower region of the display unit 101; in this case, the light source control unit 22 controls the first light source 103A and the second light source 103B so that they emit the detection light at different timings, that is, alternately. The light source control unit 22 causes the third light source 103C to emit the detection light when the position of the subject's gaze point is in the upper region of the display unit 101.
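  • As a rough illustration of this switching rule, the following sketch assumes the gaze point is given in display coordinates with y increasing downward; the function name and light source labels are ours, not the patent's.

```python
def select_light_sources(gaze_y, display_height):
    """Select the light sources for the next detection based on the
    region of the display unit 101 in which the previous gaze point fell."""
    if gaze_y >= display_height / 2:
        # Lower region 101D: lower light sources 103A/103B, lit alternately.
        return ["103A", "103B"]
    # Upper region 101U: upper light source 103C.
    return ["103C"]
```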
  • The image data acquisition unit 23 acquires the image data of the subject's eyeball 111 taken by the stereo camera device 102 including the first camera 102A and the second camera 102B, via the input/output unit 302.
  • The position detection unit 24 detects the position data of the pupil center and the position data of the corneal reflection center based on the image data of the eyeball 111 acquired by the image data acquisition unit 23. The pupil center is the center of the pupil 112, and the corneal reflection center is the center of the corneal reflection image 113. The position detection unit 24 detects these position data for each of the left and right eyeballs 111 of the subject.
  • The curvature center calculation unit 25 calculates the position data of the corneal curvature center of the eyeball 111 based on the image data of the eyeball 111 acquired by the image data acquisition unit 23.
  • The gaze point detection unit 26 detects the position data of the subject's gaze point based on the image data of the eyeball 111 acquired by the image data acquisition unit 23. In the present embodiment, the position data of the gaze point means the position data of the intersection of the subject's line-of-sight vector defined in the three-dimensional global coordinate system and the display unit 101. The gaze point detection unit 26 detects the line-of-sight vectors of the subject's left and right eyeballs 111 based on the position data of the pupil center and the position data of the corneal curvature center acquired from the image data of the eyeballs 111, and then detects the position data of the gaze point indicating the intersection of the line-of-sight vector and the display unit 101.
  • The storage unit 27 stores various data and programs related to the above-mentioned line-of-sight detection. The storage unit 27 can store, for example, data about images to be displayed on the display unit 101 for each color and brightness of the background image. Further, the storage unit 27 stores the position data of the gaze point calculated in each calibration process.
  • The storage unit 27 also stores a line-of-sight detection program that causes a computer to execute: a process of displaying an image on the display unit; a process of emitting detection light from a plurality of light sources to irradiate at least one eyeball of the subject; a process of capturing an image of the eyeball irradiated with the detection light; a process of detecting, from the captured image, the position of the pupil center and the position of the corneal reflection center; a process of calculating the position of the subject's gaze point based on the position of the pupil center and the position of the corneal curvature center; and a process of controlling light emission and non-emission of the plurality of light sources to switch the light source that emits the detection light according to the position of the gaze point.
  • The output control unit 28 outputs data to at least one of the display unit 101 and the output device 50.
  • In the following, the case where the eyeball 111 is illuminated by the first light source 103A and the second light source 103B and photographed by two cameras, the first camera 102A and the second camera 102B, will be described. The configuration is not limited to two light sources and two cameras; the same explanation holds when there is one light source and one camera.
  • First, the principle of the line-of-sight detection method according to the present embodiment will be described.
  • FIG. 4 is a diagram showing an example in which the eyeball 111 is illuminated by the first light source 103A and the second light source 103B.
  • As shown in FIG. 4, the pair of the first camera 102A and the second light source 103B and the pair of the second camera 102B and the first light source 103A are arranged symmetrically with respect to a straight line passing through the intermediate position between the first camera 102A and the second camera 102B. A virtual light source (reference position of the light sources) 103V can be considered to exist at this intermediate position.
  • The corneal reflection center 121 indicates the corneal reflection center in the image obtained by photographing the eyeball 111 with the second camera 102B. The corneal reflection center 122 indicates the corneal reflection center in the image obtained by photographing the eyeball 111 with the first camera 102A. The corneal reflection center 124 indicates the corneal reflection center corresponding to the virtual light source 103V. The position data of the corneal reflection center 124 is calculated based on the position data of the corneal reflection center 121 and the position data of the corneal reflection center 122 taken by the stereo camera device 102.
  • The stereo camera device 102 detects the position data of the corneal reflection center 121 and the position data of the corneal reflection center 122 in the three-dimensional local coordinate system defined by the stereo camera device 102.
  • The stereo camera device 102 is subjected to camera calibration by the stereo calibration method in advance, and conversion parameters for converting the three-dimensional local coordinate system of the stereo camera device 102 into the three-dimensional global coordinate system are calculated. The conversion parameters are stored in the storage unit 27. The curvature center calculation unit 25 converts the position data of the corneal reflection center 121 and the position data of the corneal reflection center 122 taken by the stereo camera device 102 into position data in the three-dimensional global coordinate system using the conversion parameters, and calculates the position data of the corneal reflection center 124 in the three-dimensional global coordinate system based on the position data of the corneal reflection center 121 and the position data of the corneal reflection center 122 defined in the three-dimensional global coordinate system.
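  • For illustration only, the conversion might look like the sketch below, which assumes the stored conversion parameters are represented as a 4x4 homogeneous rigid transform T and, as a further simplification, takes the corneal reflection center 124 to be the midpoint of the two converted centers; neither representation is mandated by the text.

```python
import numpy as np

def local_to_global(p_local, T):
    """Convert a 3D point from the stereo camera's local coordinate system
    to the global coordinate system using an assumed 4x4 homogeneous
    transform T obtained by stereo calibration."""
    p = np.append(np.asarray(p_local, dtype=float), 1.0)
    return (T @ p)[:3]

def reflex_center_124(c121_local, c122_local, T):
    """Corneal reflection center 124 for the virtual light source 103V,
    approximated here as the midpoint of the converted centers 121 and 122."""
    return 0.5 * (local_to_global(c121_local, T) + local_to_global(c122_local, T))
```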
  • The corneal curvature center 110 exists on a straight line 123 connecting the virtual light source 103V and the corneal reflection center 124. The curvature center calculation unit 25 calculates, as the position of the corneal curvature center 110, a position on the straight line 123 at which the distance from the corneal reflection center 124 is a predetermined value. As the predetermined value, a corneal radius of curvature 109 is used. The corneal radius of curvature 109 is the distance between the corneal surface and the corneal curvature center 110, and a value predetermined from a general value of the radius of curvature of the cornea can be used.
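  • In code, this step reduces to moving from the corneal reflection center 124 along the straight line 123, away from the virtual light source 103V, by the corneal radius of curvature 109 (a sketch with assumed variable names):

```python
import numpy as np

def curvature_center_110(light_source_103V, reflex_center_124, radius_109):
    """Corneal curvature center 110: the point on the straight line 123
    at the predetermined distance (corneal radius of curvature 109) from
    the corneal reflection center 124, on the side away from the light source."""
    p = np.asarray(reflex_center_124, dtype=float)
    d = p - np.asarray(light_source_103V, dtype=float)
    d /= np.linalg.norm(d)            # unit direction of the line 123
    return p + radius_109 * d         # step onward, into the eyeball
```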
  • When the third light source 103C, which is arranged above the display unit 101 at the intermediate position between the first camera 102A and the second camera 102B in the X direction, is used, the corneal curvature center 110 is calculated by the same method as described above.
  • FIG. 5 is a schematic diagram for explaining the principle of the calibration process according to the present embodiment.
  • In the calibration process, a target position 130 is set so that the subject gazes at it. The target position 130 is defined in the three-dimensional global coordinate system. The display control unit 21 displays the target image at the set target position 130.
  • The first light source 103A and the second light source 103B illuminate the eyeball 111, and the first camera 102A and the second camera 102B photograph the eyeball 111. When the detection light is emitted from the first light source 103A, the eyeball 111 is photographed by the second camera 102B. When the detection light is emitted from the second light source 103B, the eyeball 111 is photographed by the first camera 102A.
  • When the third light source 103C is used instead of the first light source 103A and the second light source 103B, the detection light is emitted from the third light source 103C, and the eyeball 111 is photographed by the first camera 102A and the second camera 102B, for example, at different timings.
  • The position detection unit 24 detects the position data of the pupil center 112C and the position data of the corneal reflection center 113C based on the image data of the eyeball 111 acquired by the image data acquisition unit 23, and converts each detected position data into the global coordinate system.
  • The curvature center calculation unit 25 calculates the position data of the corneal curvature center 110 based on the position data of the virtual light source 103V, the position data of the target position 130, the position data of the pupil center 112C, and the position data of the corneal reflection center 113C. Specifically, the curvature center calculation unit 25 obtains a first straight line 141 connecting the virtual light source 103V and the corneal reflection center 113C. When the detection light is emitted from the third light source 103C, the curvature center calculation unit 25 uses the position data of the third light source 103C instead of the position data of the virtual light source 103V; that is, it obtains a first straight line connecting the third light source 103C and the corneal reflection center 113C.
  • Further, the curvature center calculation unit 25 obtains a second straight line 142 connecting the target position 130 and the pupil center 112C, and obtains the intersection of the first straight line 141 and the second straight line 142 as the position data of the corneal curvature center 110. Then, the curvature center calculation unit 25 calculates the distance 127 between the corneal curvature center 110 and the pupil center 112C, and stores it in the storage unit 27 as calibration data.
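  • This calibration geometry can be sketched as follows. Because two measured 3D lines are generally skew, the example takes the midpoint of their closest points as the "intersection"; that numerical treatment, like the variable names, is our assumption.

```python
import numpy as np

def line_intersection(p1, d1, p2, d2):
    """Approximate the intersection of the lines p1 + t*d1 and p2 + s*d2
    as the midpoint of their closest points (measured 3D lines rarely
    intersect exactly)."""
    p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                  # near zero if the lines are parallel
    t = (b * e - c * d) / denom            # parameter on line 1
    s = (a * e - b * d) / denom            # parameter on line 2
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

def calibration_distance_127(light_source, reflex_113C, target_130, pupil_112C):
    """Distance 127: distance from the corneal curvature center 110 (the
    intersection of the first straight line 141 and the second straight
    line 142) to the pupil center 112C."""
    center_110 = line_intersection(
        light_source, np.subtract(reflex_113C, light_source),
        target_130, np.subtract(pupil_112C, target_130))
    return float(np.linalg.norm(center_110 - np.asarray(pupil_112C, dtype=float)))
```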
  • FIG. 6 is a schematic diagram for explaining the principle of the line-of-sight detection process according to the present embodiment.
  • In the line-of-sight detection process, the eyeball 111 is illuminated by using the first light source 103A and the second light source 103B, or by using the third light source 103C, as in the calibration process, and the first camera 102A and the second camera 102B photograph the eyeball 111. The position detection unit 24 detects the position data of the pupil center 112C and the position data of the corneal reflection center 113C based on the image data of the eyeball 111 acquired by the image data acquisition unit 23.
  • When the detection light is emitted using the first light source 103A and the second light source 103B, the curvature center calculation unit 25 calculates the position data of the corneal curvature center 110 based on the position data of the virtual light source 103V, the position data of the pupil center 112C, the position data of the corneal reflection center 113C, and the distance 127 between the corneal curvature center 110 and the pupil center 112C calculated in the calibration process. Specifically, the curvature center calculation unit 25 obtains a straight line 173 connecting the virtual light source 103V and the corneal reflection center 113C. When the detection light is emitted from the third light source 103C, the curvature center calculation unit 25 uses the position data of the third light source 103C instead of the position data of the virtual light source 103V; that is, it obtains a straight line connecting the third light source 103C and the corneal reflection center 113C. Then, the curvature center calculation unit 25 obtains, as the position data of the corneal curvature center 110, a position on this straight line separated from the pupil center 112C toward the inside of the eyeball 111 by a distance corresponding to the distance 127.
  • The gaze point detection unit 26 obtains a straight line 178 connecting the pupil center 112C and the corneal curvature center 110, and calculates the position data of the intersection 166 between the straight line 178 and the display unit 101 as the position data of the gaze point.
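  • A corresponding sketch of the gaze point computation, under the assumption that global coordinates are chosen so that the display unit 101 lies in the plane z = display_z (the display is parallel to the XY plane in FIG. 1):

```python
import numpy as np

def gaze_point_166(pupil_112C, curvature_center_110, display_z=0.0):
    """Intersection 166 of the straight line 178 (through the corneal
    curvature center 110 and the pupil center 112C) with the display
    plane, assumed here to be z = display_z."""
    p = np.asarray(curvature_center_110, dtype=float)
    d = np.asarray(pupil_112C, dtype=float) - p   # line-of-sight direction
    t = (display_z - p[2]) / d[2]                 # parameter where the line meets the plane
    return p + t * d                              # (x, y, display_z)
```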
  • FIG. 7 is a diagram showing an example of an eyeball on which a reflected image of the detected light is formed.
  • As shown in FIG. 7, when the subject looks at a position not far from the virtual light source 103V or the third light source 103C, the reflected image 113 of the detection light emitted to the subject's eyeball 111 lies on the cornea 111a. In this case, the shape of the reflected image 113 is, for example, substantially circular with a small flattening, so that the position data of the corneal reflection center can be detected with high accuracy.
  • FIG. 8 is a diagram showing another example of the eyeball in which the reflected image of the detected light is formed.
  • On the other hand, when the subject looks at a position far from the light source, the reflected image 113 of the detection light emitted to the subject's eyeball 111 may lie at the boundary between the cornea 111a and the sclera 111b, as shown in FIG. 8. In this case, the shape of the reflected image 113 is distorted by the difference in radius of curvature and reflectance between the cornea and the sclera, becoming, for example, an elliptical shape with a large flattening. Therefore, the detection accuracy of the position data of the corneal reflection center may decrease, and the detection accuracy of the line of sight may decrease.
  • Therefore, in the present embodiment, switching of the light source that emits the detection light is controlled according to the position of the subject's gaze point so that the reflected image of the detection light is formed at a position that does not protrude from the cornea 111a of the subject.
  • The position where the reflected image of the detection light is formed is determined by, for example, the position of the subject's eyeball and the positional relationship between the camera and the light source.
  • FIG. 9 shows an example of the ranges of the gaze point in which the reflected image 113 of the detection light fits on the cornea 111a of the subject for each light source, assuming that the position of the subject's eyeball and the distance between the subject's eyeball and each light source are constant. The gaze point range of the first light source 103A (hereinafter referred to as the first gaze point range) Sa, the gaze point range of the second light source 103B (hereinafter referred to as the second gaze point range) Sb, and the gaze point range of the third light source 103C (hereinafter referred to as the third gaze point range) Sc are elliptical regions centered on the respective light sources.
  • As shown in FIG. 9, the overlapping range Sd in which the first gaze point range Sa and the second gaze point range Sb overlap includes at least the central portion and the lower half region (hereinafter referred to as the lower region) 101D of the display unit 101. Therefore, when the subject's gaze point is in the central portion or the lower region 101D of the display unit 101, the reflected images 113 of the detection light emitted by the first light source 103A and by the second light source 103B are each formed on the cornea 111a of the subject. In FIG. 9, the alternate long and short dash line indicating the overlapping range Sd is drawn shifted inward with respect to the solid line indicating the outer periphery of the display unit 101; in practice, the overlapping range Sd can be a region including the outer periphery of the display unit 101.
  • The third gaze point range Sc includes at least the central portion and the upper half region (hereinafter referred to as the upper region) 101U of the display unit 101. Therefore, when the subject's gaze point is in the central portion of the display unit 101 or the upper region 101U, the reflected image 113 of the detection light emitted by the third light source 103C is formed on the cornea 111a of the subject.
  • FIGS. 10 and 11 are diagrams showing an example of the operation of the display unit 101 and the lighting device 103 in the calibration process.
  • In the present embodiment, the calibration process is performed for each of the case where the detection light is emitted from the first light source 103A and the second light source 103B and the case where the detection light is emitted from the third light source 103C.
  • When the detection light is emitted from the first light source 103A and the second light source 103B in the calibration process, the display control unit 21 controls the display so that the target image M is displayed in, for example, the central portion of the display unit 101, which is included in both the overlapping range Sd and the third gaze point range Sc shown in FIG. 9. The display control unit 21 may also control the target image M to be displayed in, for example, the lower region of the display unit 101.
  • When the detection light is emitted from the third light source 103C in the calibration process, the display control unit 21 likewise controls the display so that the target image M is displayed in the central portion of the display unit 101, in the same manner as when the detection light is emitted from the first light source 103A and the second light source 103B. The display control unit 21 may also control the target image M to be displayed in, for example, the upper region of the display unit 101.
  • The image data acquisition unit 23 acquires the image data of the left and right eyeballs. The position detection unit 24 detects the position data of the pupil center and the position data of the corneal reflection center, and converts each position data into the global coordinate (world coordinate) system.
  • When the detection light is emitted from the first light source 103A and the second light source 103B (the lower light sources), the curvature center calculation unit 25 obtains a first straight line connecting the virtual light source 103V and the corneal reflection center 113C, obtains a second straight line connecting the target position 130 and the pupil center 112C, and obtains the intersection of the first straight line and the second straight line as the position data of the corneal curvature center 110. Then, the curvature center calculation unit 25 calculates the distance Ra between the corneal curvature center 110 and the pupil center 112C, and stores the calculated value of the distance Ra in the storage unit 27 as calibration data for the case where the detection light is emitted from the lower light sources.
  • When the detection light is emitted from the third light source 103C (the upper light source), the curvature center calculation unit 25 obtains a first straight line connecting the third light source 103C and the corneal reflection center 113C, obtains a second straight line connecting the target position 130 and the pupil center 112C, and obtains the intersection of the first straight line and the second straight line as the position data of the corneal curvature center 110. Then, the curvature center calculation unit 25 calculates the distance Rb between the corneal curvature center 110 and the pupil center 112C, and stores the calculated value of the distance Rb in the storage unit 27 as calibration data for the case where the detection light is emitted from the upper light source.
  • In this way, in the calibration process of the present embodiment, two types of distance between the corneal curvature center 110 and the pupil center 112C are calculated: the distance Ra for the case where the light source that emits the detection light is the lower light source, and the distance Rb for the case where the light source that emits the detection light is the upper light source.
  • FIGS. 12 and 13 are diagrams showing an example of the operation of the display unit 101 and the lighting device 103 in the line-of-sight detection process.
  • When detecting the gaze point for the first time, the light source control unit 22 controls so that the detection light is emitted from, for example, the first light source 103A and the second light source 103B.
  • In the detection of the gaze point from the next time onward, when the position of the subject's previous gaze point is in the lower region 101D of the display unit 101 as shown in FIG. 12, for example, the light source control unit 22 controls so that the detection light is emitted from the first light source 103A and the second light source 103B, which are the lower light sources. When the position of the subject's previous gaze point is in the upper region 101U of the display unit 101 as shown in FIG. 13, for example, the light source control unit 22 controls so that the detection light is emitted from the third light source 103C, which is the upper light source.
  • When the detection light is emitted from the first light source 103A and the second light source 103B but the gaze point cannot be detected normally, the light source that emits the detection light may be changed to the third light source 103C and detection may be performed again. Similarly, when the detection light is emitted from the third light source 103C but the gaze point cannot be detected normally, the light sources that emit the detection light may be changed to the first light source 103A and the second light source 103B and detection may be performed again.
  • In the line-of-sight detection process, under the control of the light source control unit 22, one of the first light source 103A and the second light source 103B is caused to emit light to irradiate the eyeball 111 with the detection light, and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from the light source that emitted the light. Next, the other of the first light source 103A and the second light source 103B is caused to emit light to irradiate the eyeball 111 with the detection light, and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from that light source. For example, when the detection light is emitted from the first light source 103A, the eyeball 111 is photographed by the second camera 102B, and when the detection light is emitted from the second light source 103B, the eyeball 111 is photographed by the first camera 102A.
  • The image data acquisition unit 23 acquires the image data, and the position detection unit 24 detects the position data of the pupil center and the corneal reflection center based on the acquired image data.
  • The position detection unit 24 determines whether the position data of the corneal reflection center has been detected normally. If the reflected image of the detection light fits on the subject's cornea, a normal value is likely to be detected. On the other hand, if the reflected image of the detection light does not fit on the subject's cornea and is distorted by protruding onto the sclera, for example, a normal value is unlikely to be detected.
  • When a normal value is detected, the position data of the gaze point is acquired by the respective processes in the curvature center calculation unit 25 and the gaze point detection unit 26. When a normal value is not detected, the gaze point detection unit 26 can make an error determination in the line-of-sight detection process, for example.
  • When a normal value is detected, the curvature center calculation unit 25 calculates the corneal curvature center based on the detected value. When the light source that emitted the detection light in the line-of-sight detection process is the lower light source (the first light source 103A and the second light source 103B), the curvature center calculation unit 25 calculates the corneal curvature center using, of the two types of distance (the distance between the corneal curvature center and the pupil center) Ra and Rb calculated in the calibration process, the distance Ra calculated when the light source emitting the detection light was the lower light source. When the light source that emitted the detection light is the upper light source (the third light source 103C), the curvature center calculation unit 25 calculates the corneal curvature center using the distance Rb calculated in the calibration process when the light source emitting the detection light was the upper light source.
  • FIG. 14 is a diagram showing another example of the operation of the display unit 101 and the lighting device 103 in the line-of-sight detection process.
  • As shown in FIG. 14, the light source control unit 22 may predict the position of the subject's next gaze point based on the transition of the subject's gaze point positions up to the previous time, and select the light source that emits the detection light, based on the prediction result, from the first light source 103A and the second light source 103B, which are the lower light sources, and the third light source 103C, which is the upper light source. For example, when the gaze point up to the previous time is moving upward in the lower region 101D and it is predicted that the next gaze point will be in the upper region 101U, the light source control unit 22 can control so that the detection light is emitted from the third light source 103C, which is the upper light source, at the time of the next gaze point detection. Conversely, when the gaze point up to the previous time is moving downward in the upper region 101U and it is predicted that the next gaze point will be in the lower region 101D, the light source control unit 22 can control so that the detection light is emitted from the first light source 103A and the second light source 103B, which are the lower light sources, at the time of the next gaze point detection.
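  • The embodiment does not prescribe a particular predictor; as one possibility, the following sketch extrapolates the last two gaze points at constant velocity and picks the light sources for the predicted region (coordinates again assume y increasing downward, and all names are illustrative).

```python
def predict_light_sources(prev_gaze_points, display_height):
    """Choose the light sources for the next detection by linearly
    extrapolating the last two gaze points (an assumed, simple predictor)."""
    (x0, y0), (x1, y1) = prev_gaze_points[-2], prev_gaze_points[-1]
    y_next = y1 + (y1 - y0)            # constant-velocity extrapolation
    if y_next < display_height / 2:
        return ["103C"]                # predicted upper region 101U
    return ["103A", "103B"]            # predicted lower region 101D
```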
  • FIG. 15 is a diagram showing another example of the operation of the display unit 101 and the lighting device 103 in the line-of-sight detection process.
  • As shown in FIG. 15, the lower region 101D can be set to the range corresponding to the above-mentioned overlapping range Sd (see FIG. 9) on the display unit 101. In this case, the region outside the lower region 101D becomes the upper region 101U. When the gaze point is in the lower region 101D, the light source control unit 22 can control so that the detection light is emitted from the first light source 103A and the second light source 103B. When the gaze point is in the upper region 101U, the light source control unit 22 causes the detection light to be emitted from the third light source 103C.
  • The shape of the lower region 101D is not limited to a shape along the overlapping range Sd, and may be another shape as long as it includes at least a part of the overlapping range Sd. In the present embodiment, the shape of the lower region 101D is set so that the upper region 101U has a shape included in the third gaze point range Sc (see FIG. 9). As a result, the upper region 101U is included in the third gaze point range Sc, and the lower region 101D is included in the overlapping range Sd.
  • In FIG. 15, as in FIG. 9, the alternate long and short dash line indicating the overlapping range Sd is drawn shifted inward with respect to the solid line indicating the outer periphery of the display unit 101; in practice, the overlapping range Sd can be a region including the outer periphery of the display unit 101. Similarly, the broken line indicating the boundary between the upper region 101U and the lower region 101D is drawn shifted inward with respect to the alternate long and short dash line indicating the outer periphery of the overlapping range Sd; in practice, the boundary between the upper region 101U and the lower region 101D can be arranged on the outer periphery of the overlapping range Sd.
  • FIG. 16 is a flowchart showing an example of the calibration process in the line-of-sight detection method according to the present embodiment.
  • FIG. 17 is a flowchart showing an example of the line-of-sight detection process in the line-of-sight detection method according to the present embodiment.
  • In the calibration process, the display control unit 21 displays the target image M in the central portion of the display unit 101 (step S101). Next, under the control of the light source control unit 22, one of the first light source 103A and the second light source 103B is caused to emit light to irradiate the eyeball 111 with the detection light (step S102), and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from the light source that emitted the light (step S103). Next, the other of the first light source 103A and the second light source 103B is caused to emit light to irradiate the eyeball 111 with the detection light (step S104), and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from that light source (step S105). For example, when the detection light is emitted from the first light source 103A, the eyeball 111 is photographed by the second camera 102B, and when the detection light is emitted from the second light source 103B, the eyeball 111 is photographed by the first camera 102A.
  • The image data acquisition unit 23 acquires the image data of the left and right eyeballs. The position detection unit 24 detects the position data of the pupil center and the position data of the corneal reflection center (step S106), and converts each position data into the global coordinate (world coordinate) system (step S107).
  • The curvature center calculation unit 25 obtains a first straight line connecting the virtual light source 103V and the corneal reflection center 113C (step S108). Further, the curvature center calculation unit 25 obtains a second straight line connecting the target position 130 and the pupil center 112C (step S109), and obtains the intersection of the first straight line and the second straight line as the position data of the corneal curvature center 110 (step S110). Then, the curvature center calculation unit 25 calculates the distance Ra between the corneal curvature center 110 and the pupil center 112C (step S111).
  • Next, the third light source 103C is caused to emit light under the control of the light source control unit 22 to irradiate the eyeball 111 with the detection light (step S112), the subject's eyeball 111 is photographed by one of the first camera 102A and the second camera 102B (step S113), and then the subject's eyeball 111 is photographed by the other camera (step S114).
  • The image data acquisition unit 23 acquires the image data of the left and right eyeballs. The position detection unit 24 detects the position data of the pupil center and the position data of the corneal reflection center (step S115), and converts each position data into the global coordinate (world coordinate) system (step S116).
  • The curvature center calculation unit 25 obtains a first straight line connecting the third light source 103C and the corneal reflection center 113C (step S117). Further, the curvature center calculation unit 25 obtains a second straight line connecting the target position 130 and the pupil center 112C (step S118), and obtains the intersection of the first straight line and the second straight line as the position data of the corneal curvature center 110 (step S119). Then, the curvature center calculation unit 25 calculates the distance Rb between the corneal curvature center 110 and the pupil center 112C (step S120).
  • After the calibration process, the line-of-sight detection process is performed. As shown in FIG. 17, in the line-of-sight detection process, the light source control unit 22 selects whether the light source that emits the detection light is the third light source 103C (step S201). When the third light source 103C is not selected as the light source for emitting the detection light (No in step S201), the light source control unit 22 uses the first light source 103A and the second light source 103B as the light sources for emitting the detection light.
  • In this case, under the control of the light source control unit 22, one of the first light source 103A and the second light source 103B is caused to emit light to irradiate the eyeball 111 with the detection light (step S202), and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from the light source that emitted the light (step S203). Next, the other of the first light source 103A and the second light source 103B is caused to emit light to irradiate the eyeball 111 with the detection light (step S204), and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from that light source (step S205). For example, when the detection light is emitted from the first light source 103A, the eyeball 111 is photographed by the second camera 102B, and when the detection light is emitted from the second light source 103B, the eyeball 111 is photographed by the first camera 102A.
  • When the third light source 103C is selected (Yes in step S201), the light source control unit 22 causes the third light source 103C to emit light to irradiate the eyeball 111 with the detection light (step S206), the subject's eyeball 111 is photographed by one of the first camera 102A and the second camera 102B (step S207), and then the subject's eyeball 111 is photographed by the other camera (step S208).
  • The image data acquisition unit 23 acquires the image data, and the position detection unit 24 detects the position data of the pupil center and the corneal reflection center based on the acquired image data. The position detection unit 24 then determines whether the position data of the corneal reflection center has been detected normally (step S209).
  • When it is determined that a normal value has been detected (Yes in step S209), the position data of the gaze point is acquired by the respective processes in the curvature center calculation unit 25 and the gaze point detection unit 26 (step S210).
  • In step S210, when the light source that emitted the detection light in the line-of-sight detection process is the lower light source (the first light source 103A and the second light source 103B), the curvature center calculation unit 25 calculates the corneal curvature center using the distance Ra calculated in the calibration process when the light source emitting the detection light was the lower light source. When the light source that emitted the detection light is the upper light source (the third light source 103C), the curvature center calculation unit 25 calculates the corneal curvature center using the distance Rb calculated in the calibration process when the light source emitting the detection light was the upper light source. When it is determined in step S209 that a normal value has not been detected (No in step S209), an error determination is made (step S211).
  • In step S212, the gaze point detection unit 26 determines whether to end the gaze point detection. When the detection of the gaze point is to be ended (Yes in step S212), the process ends. When the detection of the gaze point is not to be ended (No in step S212), the processes from step S201 onward are repeated.
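  • Putting the steps of FIG. 17 together, one possible shape of the control loop is sketched below; the helper callables (capture, detect, compute) stand in for the processing units described above and are passed in as parameters, since the text does not define such an API, and the retry-by-switching on error follows the fallback behavior described earlier.

```python
def gaze_detection_loop(calib, display_height, done, capture, detect, compute):
    """Sketch of the line-of-sight detection flow of FIG. 17 (steps
    S201-S212). calib maps "Ra"/"Rb" to the calibrated distances; capture,
    detect and compute are placeholder callables for the units above."""
    use_upper = False                                           # first detection: lower light sources
    while not done():                                           # S212: continue?
        sources = ["103C"] if use_upper else ["103A", "103B"]   # S201: select light sources
        images = capture(sources)                               # S202-S205 or S206-S208
        centers = detect(images)                                # pupil / corneal reflection centers
        if centers is None:                                     # S209: not detected normally
            use_upper = not use_upper                           # S211: error, switch sources and retry
            continue
        distance = calib["Rb"] if use_upper else calib["Ra"]    # pick Ra or Rb per light source group
        gaze = compute(centers, distance)                       # S210: gaze point position data
        use_upper = gaze[1] < display_height / 2                # next sources from gaze region (y down)
```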
  • As described above, the line-of-sight detection device 100 according to the present embodiment includes: the display unit 101 that displays an image; the plurality of light sources (the first light source 103A, the second light source 103B, and the third light source 103C) that emit detection light to irradiate at least one eyeball 111 of the subject; the first camera 102A and the second camera 102B that capture images of the eyeball 111 irradiated with the detection light; the position detection unit 24 that detects, from the captured image, the position of the pupil center and the position of the corneal reflection center indicating the center of the corneal reflection; the gaze point detection unit 26 that calculates the position of the subject's gaze point based on the position of the pupil center and the position of the corneal curvature center; and the light source control unit 22 that controls light emission and non-emission of the plurality of light sources and switches the light source that emits the detection light among the plurality of light sources according to the position of the subject's gaze point.
  • The line-of-sight detection method according to the present embodiment includes: displaying an image on the display unit 101; emitting detection light from a plurality of light sources to irradiate at least one eyeball 111 of the subject; capturing an image of the eyeball 111 irradiated with the detection light; detecting, from the captured image, the position of the pupil center indicating the center of the pupil of the eyeball 111 irradiated with the detection light and the position of the corneal reflection center indicating the center of the corneal reflection; calculating the position of the gazing point of the subject based on the position of the pupil center and the position of the corneal curvature center; and controlling light emission and non-emission of the plurality of light sources and switching the light source that emits the detection light among the plurality of light sources according to the position of the gazing point of the subject.
  • Likewise, the line-of-sight detection program according to the present embodiment causes a computer to execute: a process of displaying an image on the display unit 101; a process of emitting detection light from a plurality of light sources to irradiate at least one eyeball 111 of the subject; a process of capturing an image of the eyeball 111 irradiated with the detection light; a process of detecting, from the captured image, the position of the pupil center indicating the center of the pupil of the eyeball 111 irradiated with the detection light and the position of the corneal reflection center indicating the center of the corneal reflection; a process of calculating the position of the gazing point of the subject based on the position of the pupil center and the position of the corneal curvature center; and a process of controlling light emission and non-emission of the plurality of light sources and switching the light source that emits the detection light among the plurality of light sources according to the position of the gazing point of the subject.
  • According to this configuration, since the light source that emits the detection light can be switched among the plurality of light sources according to the position of the gazing point of the subject, the reflected image of the detection light emitted to the subject's eyeball can be prevented from protruding outside the cornea. As a result, distortion of the reflected image of the detection light can be suppressed, and a decrease in line-of-sight detection accuracy can therefore be suppressed.
  • In the line-of-sight detection device 100, the plurality of light sources include the first light source 103A and the second light source 103B installed below the display unit 101, and the third light source 103C installed above the display unit 101. The light source control unit 22 causes the first light source 103A and the second light source 103B to emit the detection light when the position of the gazing point of the subject is in the lower region 101D of the display unit 101, and causes the third light source 103C to emit the detection light when the position of the gazing point of the subject is in the upper region 101U of the display unit 101. As a result, the reflected image of the detection light can be reliably placed within the cornea of the subject.
  • In the line-of-sight detection device 100, the lower light source includes the first light source 103A and the second light source 103B, and when the position of the gazing point of the subject is in the lower region of the display unit 101, the light source control unit 22 causes the first light source 103A and the second light source 103B to emit light alternately. This makes it possible to reliably form a single reflected image of the detection light within the cornea of the subject, so that the position data of the center of the reflected image (the corneal reflection center) can be obtained with high accuracy.
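As a concrete illustration of the alternating drive, each lower light source can be paired with the camera farther from it, so that each captured frame contains exactly one reflected image. The frame-based schedule below is an assumption for illustration; in the device itself the timing is derived from the frame synchronization signal as described elsewhere in this disclosure.

```python
# Hypothetical frame schedule for the alternating lower light sources.
# Even frames: first light source 103A fires, second camera 102B captures.
# Odd frames:  second light source 103B fires, first camera 102A captures.
def schedule(frame_index: int) -> tuple[str, str]:
    if frame_index % 2 == 0:
        return ("light_103A", "camera_102B")
    return ("light_103B", "camera_102A")
```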
  • While embodiments of the present disclosure have been described above, the disclosure is not limited by the contents of these embodiments. The components described above include those that can easily be conceived by those skilled in the art and those that are substantially the same, that is, those within a so-called range of equivalents. Furthermore, the components described above can be combined as appropriate, and various omissions, substitutions, or changes of the components can be made without departing from the gist of the embodiments described above.
  • In the above embodiment, the configuration including the lower light sources (first light source 103A and second light source 103B) arranged at the lower part of the display unit 101 and the upper light source (third light source 103C) arranged at the upper part of the display unit 101 has been described as an example, but the present invention is not limited to this.
  • the light sources may be arranged on the left and right side portions of the display unit 101.
  • Further, the upper light source may include a plurality of light sources.
  • the configuration in which the first camera 102A and the second camera 102B are arranged at the lower part of the display unit 101 has been described as an example, but the present invention is not limited to this.
  • the first camera 102A and the second camera 102B may be arranged on the upper portion or the side portion of the display unit 101.
  • the line-of-sight detection device, line-of-sight detection method, and line-of-sight detection program according to the present disclosure can be used, for example, in a processing device such as a computer.
  • 100 ... line-of-sight detection device, 101 ... display unit, 101D ... lower region, 101U ... upper region, 102 ... stereo camera device, 102A ... first camera, 102B ... second camera, 102C, 103C ... third light source, 103 ... lighting device, 103A ... first light source, 103B ... second light source, 103V ... virtual light source, 109, 127, Ra, Rb ... distance (distance between the corneal curvature center and the pupil center), 110 ... corneal curvature center, 111 ... eyeball, 111a ... cornea, 111b ... sclera, 112 ... pupil, 112C ... pupil center, 113 ... corneal reflection image

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)
  • Position Input By Displaying (AREA)

Abstract

This sight line detection device is provided with: a display unit that displays an image; a plurality of light sources that emit detection light to irradiate at least one of the eyes of a subject; an imaging unit that captures an image of the eye being irradiated with the detection light; a position detection unit that detects a pupil center position indicating the center of the pupil of the eye being irradiated with the detection light and a corneal reflection center position indicating the center of corneal reflection, from the captured image; a gazing point detection unit that calculates the position of a gazing point of the subject on the basis of the pupil center position and the position of the center of the corneal curvature; and a light source control unit that controls the plurality of light sources to emit or not to emit light, and switches a light source that emits the detection light among the plurality of light sources on the basis of the position of the gazing point of the subject.

Description

Line-of-sight detection device, line-of-sight detection method, and line-of-sight detection program
The present disclosure relates to a line-of-sight detection device, a line-of-sight detection method, and a line-of-sight detection program.
A line-of-sight detection device is known in which a light source emits detection light to irradiate a subject's eyeball, an image of the eyeball irradiated with the detection light is acquired, the pupil center and the corneal curvature center are calculated based on the image of the pupil and the reflected image of the detection light in the acquired image, and a vector from the corneal curvature center toward the pupil center is detected as the line-of-sight direction of the subject (see, for example, Patent Document 1).
Patent Document 1: Japanese Unexamined Patent Publication No. 2016-85588
In a line-of-sight detection device of this type, when the subject looks at a position far away from the light source, for example, the reflected image of the detection light emitted to the subject's eyeball may lie on the boundary between the cornea and the sclera. In this case, the shape of the reflected image of the detection light is distorted, and the line-of-sight detection accuracy may decrease.
The present disclosure has been made in view of the above, and an object of the present disclosure is to provide a line-of-sight detection device, a line-of-sight detection method, and a line-of-sight detection program capable of suppressing a decrease in detection accuracy.
The line-of-sight detection device according to the present disclosure includes: a display unit that displays an image; a plurality of light sources that emit detection light to irradiate at least one eyeball of a subject; an imaging unit that captures an image of the eyeball irradiated with the detection light; a position detection unit that detects, from the captured image, the position of a pupil center indicating the center of the pupil of the eyeball irradiated with the detection light and the position of a corneal reflection center indicating the center of corneal reflection; a gazing point detection unit that calculates the position of a gazing point of the subject based on the position of the pupil center and the position of the corneal curvature center; and a light source control unit that controls light emission and non-emission of the plurality of light sources and switches the light source that emits the detection light among the plurality of light sources according to the position of the gazing point of the subject.
The line-of-sight detection method according to the present disclosure includes: displaying an image on a display unit; emitting detection light from a plurality of light sources to irradiate at least one eyeball of a subject; capturing an image of the eyeball irradiated with the detection light; detecting, from the captured image, the position of a pupil center indicating the center of the pupil of the eyeball irradiated with the detection light and the position of a corneal reflection center indicating the center of corneal reflection; calculating the position of a gazing point of the subject based on the position of the pupil center and the position of the corneal curvature center; and controlling light emission and non-emission of the plurality of light sources and switching the light source that emits the detection light among the plurality of light sources according to the position of the gazing point of the subject.
The line-of-sight detection program according to the present disclosure causes a computer to execute: a process of displaying an image on a display unit; a process of emitting detection light from a plurality of light sources to irradiate at least one eyeball of a subject; a process of capturing an image of the eyeball irradiated with the detection light; a process of detecting, from the captured image, the position of a pupil center indicating the center of the pupil of the eyeball irradiated with the detection light and the position of a corneal reflection center indicating the center of corneal reflection; a process of calculating the position of a gazing point of the subject based on the position of the pupil center and the position of the corneal curvature center; and a process of controlling light emission and non-emission of the plurality of light sources and switching the light source that emits the detection light among the plurality of light sources according to the position of the gazing point of the subject.
According to the present disclosure, it is possible to suppress a decrease in detection accuracy.
FIG. 1 is a perspective view schematically showing an example of the line-of-sight detection device according to the present embodiment.
FIG. 2 is a diagram showing an example of the hardware configuration of the line-of-sight detection device according to the present embodiment.
FIG. 3 is a functional block diagram showing an example of the line-of-sight detection device according to the present embodiment.
FIG. 4 is a diagram showing an example in which the eyeball is illuminated by the first light source and the second light source.
FIG. 5 is a schematic diagram for explaining the principle of the calibration process according to the present embodiment.
FIG. 6 is a schematic diagram for explaining the principle of the line-of-sight detection process according to the present embodiment.
FIG. 7 is a diagram showing an example of an eyeball on which a reflected image of the detection light is formed.
FIG. 8 is a diagram showing another example of an eyeball on which a reflected image of the detection light is formed.
FIG. 9 is a diagram showing, for each light source, an example of the range of gazing points for which the reflected image of the detection light falls within the subject's cornea.
FIG. 10 is a diagram showing an example of the operation of the display unit and the lighting device in the calibration process.
FIG. 11 is a diagram showing an example of the operation of the display unit and the lighting device in the calibration process.
FIG. 12 is a diagram showing an example of the operation of the display unit and the lighting device in the line-of-sight detection process.
FIG. 13 is a diagram showing an example of the operation of the display unit and the lighting device in the line-of-sight detection process.
FIG. 14 is a diagram showing another example of the operation of the display unit and the lighting device in the line-of-sight detection process.
FIG. 15 is a diagram showing another example of the operation of the display unit and the lighting device in the line-of-sight detection process.
FIG. 16 is a flowchart showing an example of the calibration process in the line-of-sight detection method according to the present embodiment.
FIG. 17 is a flowchart showing an example of the line-of-sight detection process in the line-of-sight detection method according to the present embodiment.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. The present invention is not limited by these embodiments. The components in the following embodiments include those that can be easily replaced by those skilled in the art and those that are substantially the same.
In the following description, a three-dimensional global coordinate system is set to describe the positional relationship of each part. A direction parallel to a first axis of a predetermined plane is the X-axis direction, a direction parallel to a second axis of the predetermined plane orthogonal to the first axis is the Y-axis direction, and a direction parallel to a third axis orthogonal to each of the first axis and the second axis is the Z-axis direction. The predetermined plane includes the XY plane.
(Line-of-sight detection device)
FIG. 1 is a perspective view schematically showing an example of the line-of-sight detection device 100 according to the present embodiment. As shown in FIG. 1, the line-of-sight detection device 100 includes a display unit 101, a stereo camera device 102, and a lighting device 103.
The display unit 101 includes a flat panel display such as a liquid crystal display (LCD) or an organic electroluminescence display (OLED). In the present embodiment, the display unit 101 displays an image; for example, it displays an index for evaluating the visual function of the subject. The display unit 101 is substantially parallel to the XY plane. The X-axis direction is the left-right direction of the display unit 101, the Y-axis direction is the up-down direction of the display unit 101, and the Z-axis direction is the depth direction orthogonal to the display unit 101.
The stereo camera device 102 has a first camera 102A and a second camera 102B and is arranged below the display unit 101. The first camera 102A and the second camera 102B are arranged in the X-axis direction, with the first camera 102A in the -X direction of the second camera 102B. Each of the first camera 102A and the second camera 102B includes an infrared camera and has, for example, an optical system capable of transmitting near-infrared light with a wavelength of 850 [nm] and an image pickup element capable of receiving that near-infrared light.
The lighting device (light source) 103 has a first light source (lower first light source) 103A, a second light source (lower second light source) 103B, and a third light source (upper light source) 103C. The first light source 103A and the second light source 103B are arranged below the display unit 101, and the third light source 103C is arranged above the display unit 101.
The first light source 103A and the second light source 103B are arranged in the X-axis direction. The first light source 103A is arranged in the -X direction of the first camera 102A, and the second light source 103B is arranged in the +X direction of the second camera 102B. The third light source 103C is arranged at a position between the first camera 102A and the second camera 102B in the X direction; in the present embodiment, it is arranged at a position corresponding to the center of the display unit 101 in the X direction.
Each of the first light source 103A, the second light source 103B, and the third light source 103C includes an LED (light emitting diode) light source and can emit near-infrared light with a wavelength of, for example, 850 [nm]. The first light source 103A and the second light source 103B may be arranged between the first camera 102A and the second camera 102B. Further, the stereo camera device 102 may be arranged above the display unit 101.
The lighting device 103 emits near-infrared light as the detection light to illuminate the subject's eyeball 111. The stereo camera device 102 photographs part of the eyeball 111 (hereinafter collectively referred to as the "eyeball") with the second camera 102B when the detection light emitted from the first light source 103A irradiates the eyeball 111, and photographs the eyeball 111 with the first camera 102A when the detection light emitted from the second light source 103B irradiates the eyeball 111. Further, the stereo camera device 102 photographs the eyeball 111 with the first camera 102A and the second camera 102B when the detection light emitted from the third light source 103C irradiates the eyeball 111.
A frame synchronization signal is output from at least one of the first camera 102A and the second camera 102B, and the first light source 103A, the second light source 103B, and the third light source 103C emit the detection light based on the frame synchronization signal. The first camera 102A captures image data of the eyeball 111 when the detection light emitted from the second light source 103B irradiates the eyeball 111, and the second camera 102B captures image data of the eyeball 111 when the detection light emitted from the first light source 103A irradiates the eyeball 111. Further, the first camera 102A and the second camera 102B alternately capture image data of the eyeball 111 when the detection light emitted from the third light source 103C irradiates the eyeball 111.
When the eyeball 111 is irradiated with the detection light, part of the detection light is reflected by the pupil 112, and the light from the pupil 112 enters the stereo camera device 102. In addition, a corneal reflection image 113 (a virtual image formed by the cornea) is formed on the eyeball 111, and the light from the corneal reflection image 113 enters the stereo camera device 102.
By appropriately setting the relative positions of the first camera 102A and second camera 102B with respect to the first light source 103A, second light source 103B, and third light source 103C, the intensity of the light entering the stereo camera device 102 from the pupil 112 becomes low, while the intensity of the light entering from the corneal reflection image 113 becomes high. That is, the image of the pupil 112 captured by the stereo camera device 102 has low brightness, and the image of the corneal reflection image 113 has high brightness. The stereo camera device 102 can therefore detect the position of the pupil 112 and the position of the corneal reflection image 113 based on the brightness of the captured image.
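This brightness contrast lends itself to simple threshold-based detection. The following sketch, written with OpenCV-style operations on a single-channel near-infrared image, is only an illustration of the idea; the threshold values and the moment-based centroid step are assumptions, not the disclosed implementation.

```python
import cv2
import numpy as np

# Illustrative only: locate a dark pupil blob and a bright corneal
# reflection blob in a single-channel near-infrared eye image and return
# their centroids. Threshold values are assumptions for illustration.
def detect_centers(eye_image: np.ndarray,
                   pupil_thresh: int = 40,
                   reflex_thresh: int = 220):
    # Pupil: low-brightness region (little detection light returned).
    _, pupil_mask = cv2.threshold(eye_image, pupil_thresh, 255,
                                  cv2.THRESH_BINARY_INV)
    # Corneal reflection image: high-brightness virtual image of the source.
    _, reflex_mask = cv2.threshold(eye_image, reflex_thresh, 255,
                                   cv2.THRESH_BINARY)

    def centroid(mask: np.ndarray):
        m = cv2.moments(mask, binaryImage=True)
        if m["m00"] == 0:
            return None  # nothing above/below the threshold
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    return centroid(pupil_mask), centroid(reflex_mask)
```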
FIG. 2 is a diagram showing an example of the hardware configuration of the line-of-sight detection device 100 according to the present embodiment. As shown in FIG. 2, the line-of-sight detection device 100 includes the display unit 101, the stereo camera device 102, the lighting device 103, a computer system (control unit) 20, an input/output interface device 30, a drive circuit 40, an output device 50, and an input device 60.
The computer system 20, the drive circuit 40, the output device 50, and the input device 60 perform data communication via the input/output interface device 30. The computer system 20 includes an arithmetic processing unit 20A and a storage device 20B. The arithmetic processing unit 20A includes a microprocessor such as a CPU (central processing unit), and the storage device 20B includes memory or storage such as ROM (read only memory) and RAM (random access memory). The arithmetic processing unit 20A performs arithmetic processing according to a computer program 20C stored in the storage device 20B.
The drive circuit 40 generates drive signals and outputs them to the display unit 101, the stereo camera device 102, and the lighting device 103. The drive circuit 40 also supplies the image data of the eyeball 111 captured by the stereo camera device 102 to the computer system 20 via the input/output interface device 30.
The output device 50 includes a display unit such as a flat panel display and may also include a printing device. The input device 60 generates input data by being operated and includes a keyboard or a mouse for the computer system. The input device 60 may include a touch sensor provided on the display screen of the output device 50, which is a display unit.
In the present embodiment, the display unit 101 and the computer system 20 are separate devices, but they may be integrated. For example, when the line-of-sight detection device 100 includes a tablet personal computer, the computer system 20, the input/output interface device 30, the drive circuit 40, and the display unit 101 may be mounted on the tablet personal computer.
FIG. 3 is a functional block diagram showing an example of the line-of-sight detection device 100 according to the present embodiment. As shown in FIG. 3, the input/output interface device 30 has an input/output unit 302. The drive circuit 40 has: a display device drive unit 402 that generates a drive signal for driving the display unit 101 and outputs it to the display unit 101; a first camera input/output unit 404A that generates a drive signal for driving the first camera 102A and outputs it to the first camera 102A; a second camera input/output unit 404B that generates a drive signal for driving the second camera 102B and outputs it to the second camera 102B; and a light source drive unit 406 that generates drive signals for driving the first light source 103A, the second light source 103B, and the third light source 103C and outputs them to those light sources. The first camera input/output unit 404A supplies the image data of the eyeball 111 captured by the first camera 102A to the computer system 20 via the input/output unit 302, and the second camera input/output unit 404B likewise supplies the image data of the eyeball 111 captured by the second camera 102B to the computer system 20 via the input/output unit 302.
The computer system 20 controls the line-of-sight detection device 100. The computer system 20 has a display control unit 21, a light source control unit 22, an image data acquisition unit 23, a position detection unit 24, a curvature center calculation unit 25, a gazing point detection unit 26, a storage unit 27, and an input/output control unit 28. The functions of the computer system 20 are realized by the arithmetic processing unit 20A and the storage device 20B.
The display control unit 21 causes the display unit 101 to display an image to be shown to the subject. The display control unit 21 can display, for example, a target image used in the calibration process at a plurality of positions (target positions) on the display unit 101. The display control unit 21 may switch the target image sequentially among the plurality of target positions one position at a time, or may display the target image so that it moves sequentially through the plurality of target positions within the display unit 101. The number of target positions at which the target image is displayed can be set, for example, by an operator through the input device 60 or the like.
The light source control unit 22 controls the light source drive unit 406 to control light emission and non-emission of the first light source 103A, the second light source 103B, and the third light source 103C. The light source control unit 22 switches the light source that emits the detection light among the first light source 103A, the second light source 103B, and the third light source 103C according to the position of the gazing point of the subject. When the position of the gazing point of the subject is in the lower region of the display unit 101, the light source control unit 22 causes the first light source 103A and the second light source 103B to emit the detection light; in this case, it controls them so that they emit the detection light at different timings, for example alternately. When the position of the gazing point of the subject is in the upper region of the display unit 101, the light source control unit 22 causes the third light source 103C to emit the detection light.
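Expressed as a rule, this switching could be sketched as below. This is an illustrative assumption, not the disclosed implementation; the region label and the names are hypothetical.

```python
# Illustrative sketch of the light source switching rule described above.
# `gaze_region` is assumed to be "lower" or "upper" depending on where the
# subject's gazing point lies on the display unit 101.
def select_light_sources(gaze_region: str) -> list[str]:
    if gaze_region == "lower":
        # Lower region: first and second light sources, driven alternately.
        return ["light_103A", "light_103B"]
    # Upper region: third light source only.
    return ["light_103C"]
```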
The image data acquisition unit 23 acquires the image data of the subject's eyeball 111 captured by the stereo camera device 102, which includes the first camera 102A and the second camera 102B, from the stereo camera device 102 via the input/output unit 302.
The position detection unit 24 detects the position data of the pupil center and the position data of the corneal reflection center based on the image data of the eyeball 111 acquired by the image data acquisition unit 23. The pupil center is the center of the pupil 112, and the corneal reflection center is the center of the corneal reflection image 113. The position detection unit 24 detects both sets of position data for each of the subject's left and right eyeballs 111.
The curvature center calculation unit 25 calculates the position data of the corneal curvature center of the eyeball 111 based on the image data of the eyeball 111 acquired by the image data acquisition unit 23.
The gazing point detection unit 26 detects the position data of the gazing point of the subject based on the image data of the eyeball 111 acquired by the image data acquisition unit 23. In the present embodiment, the position data of the gazing point means the position data of the intersection of the subject's line-of-sight vector, defined in the three-dimensional global coordinate system, with the display unit 101. The gazing point detection unit 26 detects the line-of-sight vector of each of the subject's left and right eyeballs 111 based on the position data of the pupil center and the position data of the corneal curvature center acquired from the image data of the eyeballs 111. After the line-of-sight vector is detected, the gazing point detection unit 26 detects the position data of the gazing point indicating the intersection of the line-of-sight vector with the display unit 101.
The storage unit 27 stores various data and programs related to the line-of-sight detection described above. The storage unit 27 can store, for example, data on the images to be displayed on the display unit 101 for each color and brightness of the background image, and stores the position data of the gazing point calculated in each calibration process.
The storage unit 27 also stores a line-of-sight detection program that causes a computer to execute: a process of displaying an image on the display unit; a process of emitting detection light from a plurality of light sources to irradiate at least one eyeball of the subject; a process of capturing an image of the eyeball irradiated with the detection light; a process of detecting, from the captured image, the position of the pupil center indicating the center of the pupil of the eyeball irradiated with the detection light and the position of the corneal reflection center indicating the center of the corneal reflection; a process of calculating the position of the gazing point of the subject based on the position of the pupil center and the position of the corneal curvature center; and a process of controlling light emission and non-emission of the plurality of light sources and switching the light source that emits the detection light among the plurality of light sources according to the position of the gazing point of the subject.
The input/output control unit 28 outputs data to at least one of the display unit 101 and the output device 50.
Next, an outline of the processing of the curvature center calculation unit 25 according to the present embodiment will be described. Here, the case where the eyeball 111 is illuminated by the first light source 103A and the second light source 103B and photographed by the two cameras, the first camera 102A and the second camera 102B, is described. The configuration is not limited to two light sources and two cameras; the same explanation applies to a configuration with one light source and one camera. The principle of the line-of-sight detection method according to the present embodiment is described below.
FIG. 4 is a diagram showing an example in which the eyeball 111 is illuminated by the first light source 103A and the second light source 103B. As shown in FIG. 4, in the present embodiment, the pair of the first camera 102A and second light source 103B and the pair of the second camera 102B and first light source 103A are arranged at left-right symmetric positions with respect to a straight line passing through the intermediate position between the first camera 102A and the second camera 102B. A virtual light source (reference position of the light source) 103V can be regarded as existing at the intermediate position between the first camera 102A and the second camera 102B.
The corneal reflection center 121 indicates the corneal reflection center in the image of the eyeball 111 captured by the second camera 102B. The corneal reflection center 122 indicates the corneal reflection center in the image of the eyeball 111 captured by the first camera 102A. The corneal reflection center 124 indicates the corneal reflection center corresponding to the virtual light source 103V.
The position data of the corneal reflection center 124 is calculated based on the position data of the corneal reflection center 121 and the position data of the corneal reflection center 122 captured by the stereo camera device 102. The stereo camera device 102 detects these position data in the three-dimensional local coordinate system defined for the stereo camera device 102. Camera calibration by a stereo calibration method is performed in advance for the stereo camera device 102, and conversion parameters for converting its three-dimensional local coordinate system into the three-dimensional global coordinate system are calculated and stored in the storage unit 27. Using these conversion parameters, the curvature center calculation unit 25 converts the position data of the corneal reflection centers 121 and 122 into position data in the three-dimensional global coordinate system, and calculates the position data of the corneal reflection center 124 in the three-dimensional global coordinate system based on the converted position data.
The corneal curvature center 110 lies on a straight line 123 connecting the virtual light source 103V and the corneal reflection center 124. The curvature center calculation unit 25 calculates, as the position of the corneal curvature center 110, the position on the straight line 123 at which the distance from the corneal reflection center 124 equals a predetermined value. The corneal curvature radius 109 is used as this predetermined value; it is the distance between the corneal surface and the corneal curvature center 110, and a value predetermined from, for example, a general corneal curvature radius can be used.
When the third light source 103C is used, the corneal curvature center 110 is calculated by the same method, with the third light source 103C arranged at the upper part of the display unit 101 at the intermediate position between the first camera 102A and the second camera 102B in the X direction.
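In vector terms, the computation described above amounts to stepping from the corneal reflection center along the line 123, away from the (virtual) light source, by the corneal curvature radius 109. A minimal sketch, assuming the positions are available as NumPy arrays in global coordinates:

```python
import numpy as np

# Minimal sketch: the corneal curvature center 110 lies on the line 123
# from the (virtual) light source through the corneal reflection center,
# at the corneal curvature radius (109) beyond the reflection center.
def curvature_center(light_source_pos: np.ndarray,
                     reflex_center: np.ndarray,
                     corneal_radius: float) -> np.ndarray:
    direction = reflex_center - light_source_pos
    direction = direction / np.linalg.norm(direction)  # unit vector along line 123
    return reflex_center + corneal_radius * direction
```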
[Line-of-sight detection method]
In the line-of-sight detection method according to the present embodiment, the gazing point detection process is performed after a calibration process. First, the principle of the calibration process will be described. FIG. 5 is a schematic diagram for explaining the principle of the calibration process according to the present embodiment. In the calibration process, a target position 130 is set for the subject to gaze at. The target position 130 is defined in the three-dimensional global coordinate system. The display control unit 21 displays a target image at the set target position 130.
The first light source 103A and the second light source 103B illuminate the eyeball 111, and the first camera 102A and the second camera 102B photograph the eyeball 111. For example, when the detection light is emitted from the first light source 103A, the eyeball 111 is photographed by the second camera 102B; when the detection light is emitted from the second light source 103B, the eyeball 111 is photographed by the first camera 102A. When the third light source 103C is used instead of the first light source 103A and the second light source 103B, the detection light is emitted from the third light source 103C, and the eyeball 111 is photographed by, for example, the first camera 102A and the second camera 102B at different timings. The position detection unit 24 detects the position data of the pupil center 112C and the position data of the corneal reflection center 113C based on the image data of the eyeball 111 acquired by the image data acquisition unit 23, and converts each detected position data into the global coordinate system.
When the detection light is emitted from the first light source 103A and the second light source 103B, the curvature center calculation unit 25 calculates the position data of the corneal curvature center 110 based on the position data of the virtual light source 103V, the position data of the target position 130, the position data of the pupil center 112C, and the position data of the corneal reflection center 113C. Specifically, the curvature center calculation unit 25 obtains a first straight line 141 connecting the virtual light source 103V and the corneal reflection center 113C. When the detection light is emitted from the third light source 103C, the curvature center calculation unit 25 calculates the position data of the corneal curvature center 110 by using the position data of the third light source 103C instead of the position data of the virtual light source 103V; that is, it obtains a first straight line connecting the third light source 103C and the corneal reflection center 113C.
The curvature center calculation unit 25 also obtains a second straight line 142 connecting the target position 130 and the pupil center 112C, and obtains the intersection of the first straight line 141 and the second straight line 142 as the position data of the corneal curvature center 110. The curvature center calculation unit 25 then calculates the distance 127 between the corneal curvature center 110 and the pupil center 112C and stores it in the storage unit 27 as calibration data.
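Numerically, two measured lines in three dimensions rarely intersect exactly, so the "intersection" of the first straight line 141 and the second straight line 142 is commonly approximated by the midpoint of the closest points between the lines. The sketch below makes that assumption; it illustrates one way to realize the computation and is not the disclosed algorithm itself.

```python
import numpy as np

# Sketch: approximate the intersection of two 3D lines (each given by a
# point p and a direction d) as the midpoint of their closest points.
def line_intersection(p1, d1, p2, d2):
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)
    denom = float(np.dot(n, n))
    if denom < 1e-12:                  # nearly parallel lines: no stable answer
        return (p1 + p2) / 2
    t1 = np.dot(np.cross(p2 - p1, d2), n) / denom
    t2 = np.dot(np.cross(p2 - p1, d1), n) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2

# The calibration data (distance 127, i.e. Ra or Rb) then follows as the
# norm of the difference between curvature center and pupil center.
def calibration_distance(curvature_center, pupil_center) -> float:
    return float(np.linalg.norm(curvature_center - pupil_center))
```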
Next, the principle of the line-of-sight detection process will be described. FIG. 6 is a schematic diagram for explaining the principle of the line-of-sight detection process according to the present embodiment. In the line-of-sight detection process, as in the calibration process, the eyeball 111 is illuminated using the first light source 103A and the second light source 103B, or using the third light source 103C, and the first camera 102A and the second camera 102B photograph the eyeball 111. The position detection unit 24 detects the position data of the pupil center 112C and the position data of the corneal reflection center 113C based on the image data of the eyeball 111 acquired by the image data acquisition unit 23.
When the detection light is emitted using the first light source 103A and the second light source 103B, the curvature center calculation unit 25 calculates the position data of the corneal curvature center 110 based on the position data of the virtual light source 103V, the position data of the pupil center 112C, the position data of the corneal reflection center 113C, and the distance 127 between the corneal curvature center 110 and the pupil center 112C calculated in the calibration process. Specifically, the curvature center calculation unit 25 obtains a straight line 173 connecting the virtual light source 103V and the corneal reflection center 113C. When the detection light is emitted using the third light source 103C, the curvature center calculation unit 25 uses the position data of the third light source 103C instead of the position data of the virtual light source 103V; that is, it obtains a straight line connecting the third light source 103C and the corneal reflection center 113C.
The curvature center calculation unit 25 then obtains, as the position data of the corneal curvature center 110, the position separated from the pupil center 112C toward the inside of the eyeball 111 by a distance corresponding to the distance 127. The gazing point detection unit 26 obtains a straight line 178 connecting the pupil center 112C and the corneal curvature center 110, and calculates the position data of the intersection 166 of the straight line 178 with the display unit 101 as the position data of the gazing point.
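The intersection 166 of the straight line 178 with the display unit 101 is a standard line-plane intersection. A minimal sketch, assuming the display surface is described by a point on the plane and a normal vector in global coordinates:

```python
import numpy as np

# Minimal sketch: gazing point as the intersection of the line through the
# corneal curvature center and the pupil center with the display plane.
def gaze_point(curvature_center, pupil_center, plane_point, plane_normal):
    direction = pupil_center - curvature_center     # line-of-sight vector
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-12:
        return None  # line of sight parallel to the display plane
    t = float(np.dot(plane_normal, plane_point - curvature_center)) / denom
    return curvature_center + t * direction
```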
FIG. 7 is a diagram showing an example of an eyeball on which a reflected image of the detection light is formed. As shown in FIG. 7, when the reflected image 113 of the detection light emitted to the subject's eyeball 111 lies on the cornea 111a, such as when the subject looks at a position not far from the virtual light source 103V or the third light source 103C, the shape of the reflected image 113 is, for example, a substantially circular shape with a small flattening, so the position data of the corneal reflection center can be detected with high accuracy.
FIG. 8 is a diagram showing another example of an eyeball on which a reflected image of the detection light is formed. As shown in FIG. 8, when the subject looks at a position far away from the virtual light source 103V or the third light source 103C, the reflected image 113 of the detection light emitted to the subject's eyeball 111 may lie on the boundary between the cornea 111a and the sclera 111b. In this case, the shape of the reflected image 113 of the detection light is distorted by the differences in curvature radius and reflectance between the two, becoming, for example, an ellipse with a large flattening. As a result, the detection accuracy of the position data of the corneal reflection center decreases, and the line-of-sight detection accuracy may decrease.
In contrast, in the present embodiment, the light source that emits the detection light is switched according to the position of the gazing point of the subject so that the reflected image of the detection light is formed at a position that does not protrude from the subject's cornea 111a. The position at which the reflected image of the detection light is formed is determined by, for example, the position of the subject's eyeball and the positional relationship between the camera and the light source. FIG. 9 shows, for each light source, an example of the range of gazing points (hereinafter referred to as the gazing point range) for which the reflected image 113 of the detection light falls within the subject's cornea 111a, with the position of the subject's eyeball and the distance between the subject's eyeball and each light source held constant. In the example shown in FIG. 9, the gazing point range of the first light source 103A (hereinafter, first gazing point range) Sa, the gazing point range of the second light source 103B (hereinafter, second gazing point range) Sb, and the gazing point range of the third light source 103C (hereinafter, third gazing point range) Sc are each elliptical regions centered on the respective light source.
When the detection light is emitted from the first light source 103A and the second light source 103B, the reflected images 113 produced by both light sources must each lie within the cornea 111a. In the example shown in FIG. 9, the overlapping range Sd where the first gazing point range Sa and the second gazing point range Sb overlap includes at least the central portion and the lower half of the display unit 101 (hereinafter referred to as the lower region 101D). Therefore, when the subject's gazing point lies in the central portion or the lower region 101D of the display unit 101, the reflected images 113 of the detection light emitted by both the first light source 103A and the second light source 103B are each formed on the subject's cornea 111a. In FIG. 9, to make the figure easier to read, the dash-dot line indicating the overlapping range Sd is drawn shifted inward from the solid line indicating the outer periphery of the display unit 101; in the present embodiment, the overlapping range Sd can be a region including the outer periphery of the display unit 101.
 In the example shown in FIG. 9, the third gaze point range Sc includes at least the central portion and the upper half region (hereinafter referred to as the upper region) 101U of the display unit 101. Therefore, when the subject's gaze point is in the central portion or the upper region 101U of the display unit 101, the reflected image 113 of the detection light emitted by the third light source 103C is formed on the subject's cornea 111a.
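 Although the disclosure defines these ranges only geometrically, the containment test they imply is straightforward. The following Python sketch illustrates one possible reading, assuming the gaze point ranges Sa, Sb, and Sc are axis-aligned ellipses in display-plane coordinates; all numeric values and names are hypothetical placeholders, not values taken from this disclosure:

```python
from dataclasses import dataclass

@dataclass
class Ellipse:
    cx: float  # center x on the display plane
    cy: float  # center y on the display plane
    ax: float  # semi-axis along x
    ay: float  # semi-axis along y

    def contains(self, x: float, y: float) -> bool:
        # Standard ellipse membership test.
        return ((x - self.cx) / self.ax) ** 2 + ((y - self.cy) / self.ay) ** 2 <= 1.0

# Hypothetical gaze point ranges, centered on the light source positions.
Sa = Ellipse(cx=-0.3, cy=-0.6, ax=1.2, ay=0.9)  # first light source 103A (below display)
Sb = Ellipse(cx=0.3, cy=-0.6, ax=1.2, ay=0.9)   # second light source 103B (below display)
Sc = Ellipse(cx=0.0, cy=0.6, ax=1.4, ay=1.0)    # third light source 103C (above display)

def in_overlap_sd(x: float, y: float) -> bool:
    # A gaze point lies in the overlapping range Sd when it is inside
    # both the first and the second gaze point ranges.
    return Sa.contains(x, y) and Sb.contains(x, y)
```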
 FIGS. 10 and 11 are diagrams showing an example of the operation of the display unit 101 and the illumination device 103 in the calibration process. In the present embodiment, the calibration process is performed both for the case where the detection light is emitted from the first light source 103A and the second light source 103B and for the case where the detection light is emitted from the third light source 103C.
 As shown in FIG. 10, when the detection light is emitted from the first light source 103A and the second light source 103B in the calibration process, the display control unit 21 controls the display unit 101 to display the target image M in, for example, the central portion, which is included in both the overlapping range Sd and the third gaze point range Sc shown in FIG. 9. The display control unit 21 may instead control the display unit 101 to display the target image M in, for example, the lower region.
 As shown in FIG. 11, when the detection light is emitted from the third light source 103C in the calibration process, the display control unit 21 controls the display unit 101 to display the target image M in its central portion, as in the case where the detection light is emitted from the first light source 103A and the second light source 103B. The display control unit 21 may instead control the display unit 101 to display the target image M in, for example, the upper region.
 The image data acquisition unit 23 acquires image data of the left and right eyeballs. The position detection unit 24 detects the position data of the pupil center and the position data of the corneal reflection center, and converts each piece of position data into a global (world) coordinate system.
 In the calibration process, when the light sources that emit the detection light are the lower light sources (the first light source 103A and the second light source 103B), the curvature center calculation unit 25 obtains a first straight line connecting the virtual light source 103V and the corneal reflection center 113C, obtains a second straight line connecting the target position 130 and the pupil center 112C, and obtains the intersection of the first straight line and the second straight line as the position data of the corneal curvature center 110. The curvature center calculation unit 25 then calculates the distance Ra between the corneal curvature center 110 and the pupil center 112C, and stores the calculated value of the distance Ra in the storage unit 27 as calibration data for the case where the detection light is emitted from the lower light sources.
 On the other hand, in the calibration process, when the light source that emits the detection light is the upper light source (the third light source 103C), the curvature center calculation unit 25 obtains a first straight line connecting the third light source 103C and the corneal reflection center 113C, obtains a second straight line connecting the target position 130 and the pupil center 112C, and obtains the intersection of the first straight line and the second straight line as the position data of the corneal curvature center 110. The curvature center calculation unit 25 then calculates the distance Rb between the corneal curvature center 110 and the pupil center 112C, and stores the calculated value of the distance Rb in the storage unit 27 as calibration data for the case where the detection light is emitted from the upper light source.
 As described above, in the calibration process of the present embodiment, two distances between the corneal curvature center 110 and the pupil center 112C are calculated: the distance Ra for the case where the light sources emitting the detection light are the lower light sources, and the distance Rb for the case where the light source emitting the detection light is the upper light source.
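 Geometrically, each calibration reduces to intersecting the first straight line (light source to corneal reflection center 113C) with the second straight line (target position 130 to pupil center 112C). Since two measured lines in 3D rarely intersect exactly, the sketch below, a minimal illustration rather than the disclosed implementation, takes the midpoint of the closest approach between the two lines as the corneal curvature center; the coordinate values are hypothetical:

```python
import numpy as np

def closest_point_between_lines(p1, d1, p2, d2):
    """Midpoint of the shortest segment between line p1 + t*d1 and line p2 + s*d2."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)
    denom = np.dot(n, n)
    if denom < 1e-12:  # lines are (nearly) parallel
        return (p1 + p2) / 2.0
    t = np.dot(np.cross(p2 - p1, d2), n) / denom
    s = np.dot(np.cross(p2 - p1, d1), n) / denom
    return (p1 + t * d1 + p2 + s * d2) / 2.0

# First line: light source (virtual light source 103V, or third light source 103C
# in the upper-source case) to corneal reflection center 113C.
# Second line: target position 130 to pupil center 112C.
light_source = np.array([0.0, -0.2, 0.0])        # hypothetical world coordinates
reflection_center = np.array([0.01, 0.02, 0.6])
target_position = np.array([0.0, 0.0, 0.0])
pupil_center = np.array([0.012, 0.025, 0.6])

curvature_center = closest_point_between_lines(
    light_source, reflection_center - light_source,
    target_position, pupil_center - target_position)
Ra = np.linalg.norm(curvature_center - pupil_center)  # stored as calibration data
```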
 FIGS. 12 and 13 are diagrams showing an example of the operation of the display unit 101 and the illumination device 103 in the line-of-sight detection process. When detecting the gaze point for the first time in the line-of-sight detection process, the light source control unit 22 controls, for example, the first light source 103A and the second light source 103B to emit the detection light.
 In subsequent gaze point detections, as shown in FIG. 12, when the previous gaze point of the subject was, for example, in the lower region 101D of the display unit 101, the light source control unit 22 controls the lower light sources, namely the first light source 103A and the second light source 103B, to emit the detection light. As shown in FIG. 13, when the previous gaze point of the subject was, for example, in the upper region 101U of the display unit 101, the light source control unit 22 controls the upper light source, namely the third light source 103C, to emit the detection light.
 Further, if, for example, the first light source 103A and the second light source 103B were controlled to emit the detection light in the previous detection but the gaze point could not be detected normally, the light source that emits the detection light may be changed to the third light source 103C and the detection may be performed again. Conversely, if the third light source 103C was controlled to emit the detection light in the previous detection but the gaze point could not be detected normally, the light sources that emit the detection light may be changed to the first light source 103A and the second light source 103B and the detection may be performed again.
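 Taken together, the region rule and the failure fallback amount to a small selection function. The sketch below is one hedged reading of that rule; the types and names are placeholders, not code from the disclosure:

```python
from enum import Enum, auto

class Region(Enum):
    LOWER = auto()   # lower region 101D
    UPPER = auto()   # upper region 101U

class Sources(Enum):
    LOWER_PAIR = auto()  # first light source 103A and second light source 103B
    UPPER = auto()       # third light source 103C

def select_sources(prev_region, prev_sources, prev_detection_ok):
    # Fallback rule: if the previous detection failed, try the other source set.
    if not prev_detection_ok:
        return Sources.UPPER if prev_sources == Sources.LOWER_PAIR else Sources.LOWER_PAIR
    # Otherwise follow the region of the previous gaze point.
    return Sources.LOWER_PAIR if prev_region == Region.LOWER else Sources.UPPER
```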
 In this case, the light source control unit 22 causes one of the first light source 103A and the second light source 103B to emit light so as to irradiate the eyeball 111 with the detection light, and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from the light source that emitted the light. After that, the other of the first light source 103A and the second light source 103B is caused to emit light so as to irradiate the eyeball 111 with the detection light, and the subject's eyeball 111 is again photographed by the camera farther from the light source that emitted the light. For example, when the detection light is emitted from the first light source 103A, the eyeball 111 is photographed by the second camera 102B; when the detection light is emitted from the second light source 103B, the eyeball 111 is photographed by the first camera 102A.
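 The pairing of each lower light source with the camera farther from it can be written as a lookup table. A minimal hypothetical sketch follows; the emit and capture callables stand in for the light source drive unit 406 and the camera input/output units 404A and 404B:

```python
# Each lower light source is paired with the camera farther from it:
# when the first light source 103A emits, the second camera 102B captures;
# when the second light source 103B emits, the first camera 102A captures.
FAR_CAMERA = {
    "light_103A": "camera_102B",
    "light_103B": "camera_102A",
}

def capture_with_lower_pair(emit, capture):
    """Alternate the two lower light sources, capturing with the far camera."""
    frames = {}
    for source, camera in FAR_CAMERA.items():
        emit(source)                      # irradiate the eyeball 111 with detection light
        frames[source] = capture(camera)  # photograph with the far camera
    return frames
```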
 The image data acquisition unit 23 acquires the image data. The position detection unit 24 detects the position data of the pupil center and the corneal reflection center based on the acquired image data, and determines whether the position data of the corneal reflection center has been detected normally. When the reflected image of the detection light fits on the subject's cornea, a normal value is likely to be detected. On the other hand, when the reflected image of the detection light does not fit on the subject's cornea and is distorted, for example by protruding onto the sclera, a normal value is unlikely to be detected. When a normal value is detected, the position data of the gaze point is acquired through the processing in the curvature center calculation unit 25 and the gaze point detection unit 26. When a normal value is not detected, the gaze point detection unit 26 can, for example, treat the line-of-sight detection process as an error.
 When a normal value of the corneal reflection center is detected, the curvature center calculation unit 25 calculates the corneal curvature center based on the detected value. At this time, when the light sources emitting the detection light in the line-of-sight detection process are the lower light sources (the first light source 103A and the second light source 103B), the curvature center calculation unit 25 calculates the corneal curvature center using, of the two distances (between the corneal curvature center and the pupil center) Ra and Rb calculated in the calibration process, the distance Ra calculated when the light sources emitting the detection light were the lower light sources. When the light source emitting the detection light in the line-of-sight detection process is the upper light source (the third light source 103C), the curvature center calculation unit 25 calculates the corneal curvature center using the distance Rb calculated when the light source emitting the detection light in the calibration process was the upper light source.
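 In other words, during detection the corneal curvature center is the point on the line from the light source through the corneal reflection center whose distance from the pupil center equals the calibrated value Ra or Rb. A sketch of that step follows; the choice of quadratic root (the solution farther from the light source) is an assumption made here for illustration, not a detail taken from the disclosure:

```python
import numpy as np

def curvature_center_from_calibration(light_source, reflection_center,
                                      pupil_center, r_calib):
    """Point on the light-source -> reflection-center line whose distance
    from the pupil center equals the calibrated distance (Ra or Rb)."""
    d = reflection_center - light_source
    d = d / np.linalg.norm(d)
    w = light_source - pupil_center
    # Solve |w + t*d| = r_calib, i.e. t^2 + 2*(w.d)*t + (|w|^2 - r^2) = 0.
    b = np.dot(w, d)
    c = np.dot(w, w) - r_calib ** 2
    disc = b * b - c
    if disc < 0.0:
        return None  # no solution: treat as a detection error
    t = -b + np.sqrt(disc)  # assumed: take the root farther from the light source
    return light_source + t * d
```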
 FIG. 14 is a diagram showing another example of the operation of the display unit 101 and the illumination device 103 in the line-of-sight detection process. As shown in FIG. 14, the light source control unit 22 can predict the position of the subject's next gaze point based on the transition of the positions of the subject's gaze points up to the previous detection, and based on the prediction result, select the light sources that emit the detection light from between the lower light sources, namely the first light source 103A and the second light source 103B, and the upper light source, namely the third light source 103C. For example, when the gaze points up to the previous detection have been moving upward within the lower region 101D and the next gaze point is predicted to fall in the upper region 101U, the light source control unit 22 can control the upper light source, namely the third light source 103C, to emit the detection light at the next gaze point detection. Conversely, when the gaze points up to the previous detection have been moving downward within the upper region 101U and the next gaze point is predicted to fall in the lower region 101D, the light source control unit 22 can control the lower light sources, namely the first light source 103A and the second light source 103B, to emit the detection light at the next gaze point detection.
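 The disclosure does not specify a predictor, so the sketch below uses a simple constant-velocity extrapolation from the last two gaze points as one hypothetical realization:

```python
def predict_next_gaze(history):
    """Constant-velocity extrapolation from the last two gaze points.

    `history` is a list of (x, y) gaze points, oldest first. This is a
    hypothetical predictor; any trend-based estimate could be used instead.
    """
    if len(history) < 2:
        return history[-1] if history else None
    (x0, y0), (x1, y1) = history[-2], history[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

# The predicted point is then classified into the upper region 101U or the
# lower region 101D to choose the light source set for the next detection.
```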
 FIG. 15 is a diagram showing another example of the operation of the display unit 101 and the illumination device 103 in the line-of-sight detection process. As shown in FIG. 15, the lower region 101D can be set to the range of the display unit 101 corresponding to the overlapping range Sd described above (see FIG. 9). In this case, the region of the display unit 101 outside the lower region 101D becomes the upper region 101U. When the subject's gaze point P is in the lower region 101D, that is, when the gaze point P is inside the overlapping range Sd, the light source control unit 22 can control the first light source 103A and the second light source 103B to emit the detection light. When the subject's gaze point P is in the upper region 101U, that is, when the gaze point P is outside the overlapping range Sd, the light source control unit 22 can control the third light source 103C to emit the detection light. The shape of the lower region 101D is not limited to a shape following the overlapping range Sd and may be another shape as long as it includes at least a part of the overlapping range Sd. In that case, the shape of the lower region 101D is set so that the upper region 101U is contained in the third gaze point range Sc (see FIG. 9). As a result, the upper region 101U is included in the third gaze point range Sc and the lower region 101D is included in the overlapping range Sd, so that the reflected image of the detection light can be formed on the subject's cornea no matter which position on the display unit 101 the subject gazes at. In FIG. 15, to keep the figure legible, the dash-dot line indicating the overlapping range Sd is drawn shifted inward from the solid line indicating the outer periphery of the display unit 101; in the present embodiment, the overlapping range Sd can be a region that includes the outer periphery of the display unit 101. Similarly, in FIG. 15, the broken line indicating the boundary between the upper region 101U and the lower region 101D is drawn shifted inward from the dash-dot line indicating the outer periphery of the overlapping range Sd; in the present embodiment, this boundary can be arranged on the outer periphery of the overlapping range Sd.
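 Reusing the containment predicate from the ellipse sketch above, the FIG. 15 rule collapses into a single branch; the return labels are placeholders:

```python
def select_sources_by_region(x, y, in_overlap_sd):
    """FIG. 15 rule: gaze point inside the overlapping range Sd -> lower
    light sources 103A and 103B; outside Sd -> upper light source 103C.

    `in_overlap_sd` is a containment predicate such as the one sketched
    earlier; the returned labels are hypothetical identifiers.
    """
    return "lower_103A_103B" if in_overlap_sd(x, y) else "upper_103C"
```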
 Next, an example of the line-of-sight detection method according to the present embodiment will be described with reference to FIGS. 16 and 17. FIG. 16 is a flowchart showing an example of the calibration process in the line-of-sight detection method according to the present embodiment. FIG. 17 is a flowchart showing an example of the line-of-sight detection process in the line-of-sight detection method according to the present embodiment.
 As shown in FIG. 16, in the calibration process, the display control unit 21 displays the target image M in the central portion of the display unit 101 (step S101). Under the control of the light source control unit 22, one of the first light source 103A and the second light source 103B is caused to emit light so as to irradiate the eyeball 111 with the detection light (step S102), and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from the light source that emitted the light (step S103). Then, under the control of the light source control unit 22, the other of the first light source 103A and the second light source 103B is caused to emit light so as to irradiate the eyeball 111 with the detection light (step S104), and the subject's eyeball 111 is photographed by the camera farther from the light source that emitted the light (step S105). For example, when the detection light is emitted from the first light source 103A, the eyeball 111 is photographed by the second camera 102B; when the detection light is emitted from the second light source 103B, the eyeball 111 is photographed by the first camera 102A.
 The image data acquisition unit 23 acquires image data of the left and right eyeballs. The position detection unit 24 detects the position data of the pupil center and the position data of the corneal reflection center (step S106), and converts each piece of position data into a global (world) coordinate system (step S107).
 The curvature center calculation unit 25 obtains a first straight line connecting the virtual light source 103V and the corneal reflection center 113C (step S108). The curvature center calculation unit 25 also obtains a second straight line connecting the target position 130 and the pupil center 112C (step S109). The curvature center calculation unit 25 obtains the intersection of the first straight line and the second straight line as the position data of the corneal curvature center 110 (step S110), and then calculates the distance Ra between the corneal curvature center 110 and the pupil center 112C (step S111).
 Next, under the control of the light source control unit 22, the third light source 103C is caused to emit light so as to irradiate the eyeball 111 with the detection light (step S112), the subject's eyeball 111 is photographed by one of the first camera 102A and the second camera 102B (step S113), and then the subject's eyeball 111 is photographed by the other camera (step S114).
 The image data acquisition unit 23 acquires image data of the left and right eyeballs. The position detection unit 24 detects the position data of the pupil center and the position data of the corneal reflection center (step S115), and converts each piece of position data into a global (world) coordinate system (step S116).
 The curvature center calculation unit 25 obtains a first straight line connecting the third light source 103C and the corneal reflection center 113C (step S117). The curvature center calculation unit 25 also obtains a second straight line connecting the target position 130 and the pupil center 112C (step S118). The curvature center calculation unit 25 obtains the intersection of the first straight line and the second straight line as the position data of the corneal curvature center 110 (step S119), and then calculates the distance Rb between the corneal curvature center 110 and the pupil center 112C (step S120).
 After the calibration process is completed, the line-of-sight detection process is performed. As shown in FIG. 17, in the line-of-sight detection process, the light source control unit 22 selects whether or not the light source that emits the detection light is to be the third light source 103C (step S201). When there is no reason to use the third light source 103C as the light source that emits the detection light (No in step S201), the light source control unit 22 uses the first light source 103A and the second light source 103B as the light sources that emit the detection light. In this case, under the control of the light source control unit 22, one of the first light source 103A and the second light source 103B is caused to emit light so as to irradiate the eyeball 111 with the detection light (step S202), and the subject's eyeball 111 is photographed by whichever of the first camera 102A and the second camera 102B is farther from the light source that emitted the light (step S203). Then, the other of the first light source 103A and the second light source 103B is caused to emit light so as to irradiate the eyeball 111 with the detection light (step S204), and the subject's eyeball 111 is photographed by the camera farther from the light source that emitted the light (step S205). For example, when the detection light is emitted from the first light source 103A, the eyeball 111 is photographed by the second camera 102B; when the detection light is emitted from the second light source 103B, the eyeball 111 is photographed by the first camera 102A.
 On the other hand, when there is a reason to use the third light source 103C as the light source that emits the detection light (Yes in step S201), the light source control unit 22 causes the third light source 103C to emit light so as to irradiate the eyeball 111 with the detection light (step S206), the subject's eyeball 111 is photographed by one of the first camera 102A and the second camera 102B (step S207), and then the subject's eyeball 111 is photographed by the other camera (step S208).
 After step S205 or step S208, the image data acquisition unit 23 acquires the image data. The position detection unit 24 detects the position data of the pupil center and the corneal reflection center based on the acquired image data, and determines whether the position data of the corneal reflection center has been detected normally (step S209). When it is determined in step S209 that a normal value has been detected (Yes in step S209), the position data of the gaze point is acquired through the processing in the curvature center calculation unit 25 and the gaze point detection unit 26 (step S210). In step S210, when the light sources emitting the detection light in the line-of-sight detection process are the lower light sources (the first light source 103A and the second light source 103B), the curvature center calculation unit 25 calculates the corneal curvature center using the distance Ra calculated in the calibration process with the lower light sources emitting the detection light. When the light source emitting the detection light in the line-of-sight detection process is the upper light source (the third light source 103C), the curvature center calculation unit 25 calculates the corneal curvature center using the distance Rb calculated in the calibration process with the upper light source emitting the detection light. When it is determined in step S209 that a normal value has not been detected (No in step S209), an error determination is made (step S211).
 After step S210 or step S211, the gaze point detection unit 26 determines whether or not to end the gaze point detection (step S212). When the gaze point detection is to be ended as a result of the determination in step S212 (Yes in step S212), the process ends. When the gaze point detection is not to be ended (No in step S212), the processing from step S201 onward is repeated.
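 As a structural summary of FIG. 17, the detection loop might be organized as follows; every callable is a hypothetical placeholder standing in for the processing units described above, not an API from the disclosure:

```python
def gaze_detection_loop(ra, rb, use_upper_source, capture_upper,
                        capture_lower, detect_centers, report, should_stop):
    """Skeleton of steps S201-S212 in FIG. 17 (hypothetical structure).

    ra, rb: calibrated distances; the callables are injected placeholders.
    """
    while not should_stop():                       # step S212
        if use_upper_source():                     # step S201
            frames, r_calib = capture_upper(), rb  # steps S206-S208
        else:
            frames, r_calib = capture_lower(), ra  # steps S202-S205
        centers = detect_centers(frames)           # pupil / reflection centers
        if centers is not None:                    # step S209: normal value?
            report(centers, r_calib)               # step S210: gaze point
        else:
            report(None, r_calib)                  # step S211: error case
```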
 As described above, the line-of-sight detection device 100 according to the present embodiment includes: the display unit 101 that displays an image; a plurality of light sources (the first light source 103A, the second light source 103B, and the third light source 103C) that emit detection light so as to irradiate at least one eyeball 111 of a subject; the stereo camera device 102 that captures an image of the eyeball 111 irradiated with the detection light; the position detection unit 24 that detects, from the captured image, the position of the pupil center indicating the center of the pupil of the eyeball 111 irradiated with the detection light and the position of the corneal reflection center indicating the center of the corneal reflection; the gaze point detection unit 26 that calculates the position of the subject's gaze point based on the position of the pupil center and the position of the corneal curvature center; and the light source control unit 22 that controls light emission and non-emission of the plurality of light sources and switches, according to the position of the subject's gaze point, the light source that emits the detection light among the plurality of light sources.
 The line-of-sight detection method according to the present embodiment includes: displaying an image on the display unit 101; emitting detection light from a plurality of light sources so as to irradiate at least one eyeball 111 of a subject; capturing an image of the eyeball 111 irradiated with the detection light; detecting, from the captured image, the position of the pupil center indicating the center of the pupil of the eyeball 111 irradiated with the detection light and the position of the corneal reflection center indicating the center of the corneal reflection; calculating the position of the subject's gaze point based on the position of the pupil center and the position of the corneal curvature center; and controlling light emission and non-emission of the plurality of light sources and switching, according to the position of the subject's gaze point, the light source that emits the detection light among the plurality of light sources.
 The line-of-sight detection program according to the present embodiment causes a computer to execute: a process of displaying an image on the display unit 101; a process of emitting detection light from a plurality of light sources so as to irradiate at least one eyeball 111 of a subject; a process of capturing an image of the eyeball 111 irradiated with the detection light; a process of detecting, from the captured image, the position of the pupil center indicating the center of the pupil of the eyeball 111 irradiated with the detection light and the position of the corneal reflection center indicating the center of the corneal reflection; a process of calculating the position of the subject's gaze point based on the position of the pupil center and the position of the corneal curvature center; and a process of controlling light emission and non-emission of the plurality of light sources and switching, according to the position of the subject's gaze point, the light source that emits the detection light among the plurality of light sources.
 According to the configuration of the present embodiment, the light source that emits the detection light can be switched among the plurality of light sources according to the position of the subject's gaze point, so that the reflected image of the detection light irradiating the subject's eyeball can be prevented from protruding outside the cornea. This suppresses distortion of the reflected image of the detection light, and thus suppresses a decrease in the detection accuracy of the line of sight.
 In the line-of-sight detection device 100 according to the present embodiment, the plurality of light sources include the first light source 103A and the second light source 103B installed below the display unit 101 and the third light source 103C installed above the display unit 101. The light source control unit 22 causes the first light source 103A and the second light source 103B to emit the detection light when the position of the subject's gaze point is in the lower region 101D of the display unit 101, and causes the third light source 103C to emit the detection light when the position of the subject's gaze point is in the upper region 101U of the display unit 101. As a result, the reflected image of the detection light can be reliably placed within the subject's cornea.
 In the line-of-sight detection device 100 according to the present embodiment, the lower light sources include the first light source 103A and the second light source 103B, and the light source control unit 22 causes the first light source 103A and the second light source 103B to emit light alternately when the position of the subject's gaze point is in the lower region of the display unit 101. This makes it possible to reliably form a single reflected image of the detection light within the subject's cornea, so that the position data of the center of the reflected image (the corneal reflection center) can be obtained with high accuracy.
 Although embodiments of the present disclosure have been described above, the embodiments are not limited by their contents. The components described above include those that can be easily conceived by a person skilled in the art, those that are substantially the same, and those within a so-called range of equivalents. Furthermore, the components described above can be combined as appropriate, and various omissions, substitutions, and modifications of the components can be made without departing from the gist of the embodiments described above.
 For example, in the above embodiment, a configuration has been described in which, as the plurality of light sources, the lower light sources (the first light source 103A and the second light source 103B) arranged at the lower part of the display unit 101 and the upper light source (the third light source 103C) arranged at the upper part of the display unit 101 are provided; however, the configuration is not limited to this. For example, light sources may be arranged on the left and right sides of the display unit 101 in addition to the lower and upper light sources or in place of at least one of them.
 In the above embodiment, a configuration in which one light source (the third light source 103C) is provided as the upper light source has been described as an example; however, the configuration is not limited to this. For example, a plurality of light sources may be provided as the upper light source.
 In the above embodiment, a configuration in which the first camera 102A and the second camera 102B are arranged below the display unit 101 has been described as an example; however, the configuration is not limited to this. The first camera 102A and the second camera 102B may be arranged above or to the side of the display unit 101.
 The line-of-sight detection device, line-of-sight detection method, and line-of-sight detection program according to the present disclosure can be used, for example, in a processing device such as a computer.
 A ... corresponding region, M ... target image, P ... gaze point, Sa ... first gaze point range, Sb ... second gaze point range, Sc ... third gaze point range, Sd ... overlapping range, 20 ... computer system, 20A ... arithmetic processing device, 20B ... storage device, 20C ... computer program, 21 ... display control unit, 22 ... light source control unit, 23 ... image data acquisition unit, 24 ... position detection unit, 25 ... curvature center calculation unit, 26 ... gaze point detection unit, 27 ... storage unit, 28 ... output control unit, 30 ... input/output interface device, 40 ... drive circuit, 50 ... output device, 60 ... input device, 100 ... line-of-sight detection device, 101 ... display unit, 101D ... lower region, 101U ... upper region, 102 ... stereo camera device, 102A ... first camera, 102B ... second camera, 102C, 103C ... third light source, 103 ... illumination device, 103A ... first light source, 103B ... second light source, 103V ... virtual light source, 109, 127, Ra, Rb ... distance (distance between corneal curvature center and pupil center), 110 ... corneal curvature center, 111 ... eyeball, 111a ... cornea, 111b ... sclera, 112 ... pupil, 112C ... pupil center, 113 ... corneal reflection image (reflected image), 113C, 121, 122, 124 ... corneal reflection center, 123, 173, 178 ... straight line, 130 ... target position, 141 ... first straight line, 142 ... second straight line, 166 ... intersection, 302 ... input/output unit, 402 ... display device drive unit, 404A ... first camera input/output unit, 404B ... second camera input/output unit, 406 ... light source drive unit

Claims (5)

  1.  A line-of-sight detection device comprising:
     a display unit that displays an image;
     a plurality of light sources that emit detection light so as to irradiate at least one eyeball of a subject;
     an imaging unit that captures an image of the eyeball irradiated with the detection light;
     a position detection unit that detects, from the captured image, a position of a pupil center indicating a center of a pupil of the eyeball irradiated with the detection light and a position of a corneal reflection center indicating a center of a corneal reflection;
     a gaze point detection unit that calculates a position of a gaze point of the subject based on the position of the pupil center and a position of a corneal curvature center; and
     a light source control unit that controls light emission and non-emission of the plurality of light sources and switches, according to the position of the gaze point of the subject, the light source that emits the detection light among the plurality of light sources.
  2.  The line-of-sight detection device according to claim 1, wherein
     the plurality of light sources include a lower light source installed below the display unit and an upper light source installed above the display unit, and
     the light source control unit causes the lower light source to emit the detection light when the position of the gaze point of the subject is in a lower region of the display unit, and causes the upper light source to emit the detection light when the position of the gaze point of the subject is in an upper region of the display unit.
  3.  The line-of-sight detection device according to claim 2, wherein
     the lower light source includes a first lower light source and a second lower light source, and
     the light source control unit causes the first lower light source and the second lower light source to emit light alternately when the position of the gaze point of the subject is in the lower region of the display unit.
  4.  A line-of-sight detection method comprising:
     displaying an image on a display unit;
     emitting detection light from a plurality of light sources so as to irradiate at least one eyeball of a subject;
     capturing an image of the eyeball irradiated with the detection light;
     detecting, from the captured image, a position of a pupil center indicating a center of a pupil of the eyeball irradiated with the detection light and a position of a corneal reflection center indicating a center of a corneal reflection;
     calculating a position of a gaze point of the subject based on the position of the pupil center and a position of a corneal curvature center; and
     controlling light emission and non-emission of the plurality of light sources and switching, according to the position of the gaze point of the subject, the light source that emits the detection light among the plurality of light sources.
  5.  A line-of-sight detection program that causes a computer to execute:
     a process of displaying an image on a display unit;
     a process of emitting detection light from a plurality of light sources so as to irradiate at least one eyeball of a subject;
     a process of capturing an image of the eyeball irradiated with the detection light;
     a process of detecting, from the captured image, a position of a pupil center indicating a center of a pupil of the eyeball irradiated with the detection light and a position of a corneal reflection center indicating a center of a corneal reflection;
     a process of calculating a position of a gaze point of the subject based on the position of the pupil center and a position of a corneal curvature center; and
     a process of controlling light emission and non-emission of the plurality of light sources and switching, according to the position of the gaze point of the subject, the light source that emits the detection light among the plurality of light sources.
PCT/JP2021/010240 2020-08-24 2021-03-12 Sight line detection device, sight line detection method, and sight line detection program WO2022044395A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020140664A JP2022036452A (en) 2020-08-24 2020-08-24 Line of sight detection device, line of sight detection method, and line of sight detection program
JP2020-140664 2020-08-24

Publications (1)

Publication Number Publication Date
WO2022044395A1 (en)

Family

ID=80355043

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010240 WO2022044395A1 (en) 2020-08-24 2021-03-12 Sight line detection device, sight line detection method, and sight line detection program

Country Status (2)

Country Link
JP (1) JP2022036452A (en)
WO (1) WO2022044395A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012065719A (en) * 2010-09-21 2012-04-05 Fujitsu Ltd View line detection device, view line detection method, view line-detecting computer program and mobile terminal
JP2017111746A (en) * 2015-12-18 2017-06-22 国立大学法人静岡大学 Sight line detection device and sight line detection method
JP2018197974A (en) * 2017-05-24 2018-12-13 富士通株式会社 Line-of-sight detection computer program, line-of-sight detection device and line-of-sight detection method

Also Published As

Publication number Publication date
JP2022036452A (en) 2022-03-08

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21860827; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21860827; Country of ref document: EP; Kind code of ref document: A1)