WO2015080003A1 - Pupil detection device, line-of-sight detection device, and pupil detection method - Google Patents

Pupil detection device, line-of-sight detection device, and pupil detection method

Info

Publication number
WO2015080003A1
Authority
WO
WIPO (PCT)
Prior art keywords
pupil
region
unit
center
center position
Prior art date
Application number
PCT/JP2014/080682
Other languages
English (en)
Japanese (ja)
Inventor
修二 箱嶋
首藤 勝行
賢 二宮
Original Assignee
株式会社Jvcケンウッド
Priority date
Filing date
Publication date
Application filed by 株式会社Jvcケンウッド
Publication of WO2015080003A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Definitions

  • the present invention relates to a pupil detection technique.
  • a line-of-sight detection device detects the position at which the subject is gazing on an observation surface, such as a monitor screen, from a face image captured by a camera.
  • a non-contact type that does not require a device or the like to be attached to the head has been developed, and a highly accurate gaze detection device is required.
  • gaze detection is performed based on the positional relationship between the pupil on the camera image and the corneal reflection. For this reason, it is important to accurately obtain the pupil center coordinates and the corneal reflection center coordinates on the camera image.
  • Patent Document 1 proposes a technique capable of stably detecting a pupil by actively using information of the corneal reflection image even when a large portion of the pupil is hidden by corneal reflection.
  • the present invention has been made in view of the above, and an object of the present invention is to provide a pupil detection device, a line-of-sight detection device, and a pupil detection method capable of detecting a pupil with a smaller calculation amount and higher accuracy.
  • the present invention provides a specifying unit that specifies a pupil region and a corneal reflection region from an image obtained by imaging an eye; a first estimation unit that estimates the center position of the pupil using a first region included in the pupil region; a second estimation unit that estimates the center position of the pupil using a second region that is included in the pupil region and is different from the first region; and a pupil position detection unit that detects the center position of the pupil based on the center positions estimated by the first and second estimation units.
  • the pupil detection device, line-of-sight detection device, and pupil detection method according to the present invention can detect the pupil with higher accuracy and a smaller amount of calculation.
  • FIG. 1 is a diagram illustrating an arrangement of a display unit, a stereo camera, and a light source used in the first embodiment.
  • FIG. 2 is a diagram illustrating an outline of functions of the diagnosis support apparatus.
  • FIG. 3 is a block diagram showing detailed functions of the respective units shown in FIG. 2.
  • FIG. 4 is a diagram showing eye and distance detection when two cameras are used.
  • FIG. 5 is a diagram illustrating a captured image captured by the stereo camera.
  • FIG. 6 is a diagram illustrating an example of an eye image cut out from the captured image of FIG. 5.
  • FIG. 7 is a diagram illustrating an example of a luminance change of an image near the cornea reflection region.
  • FIG. 8 is a diagram illustrating an example of luminance change of an image near the pupil region.
  • FIGS. 9 to 14 are diagrams schematically showing the eye image of FIG. 6.
  • FIG. 15 is a flowchart illustrating pupil detection processing according to the first embodiment.
  • FIG. 16 is a diagram illustrating an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject according to the second embodiment.
  • FIG. 17 is a diagram illustrating an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject according to the second embodiment.
  • FIG. 18 is a diagram illustrating an outline of functions of the diagnosis support apparatus.
  • FIG. 19 is a block diagram illustrating an example of detailed functions of the respective units illustrated in FIG. 18.
  • FIG. 20 is a diagram illustrating an outline of processing executed by the diagnosis support apparatus according to the second embodiment.
  • FIG. 21 is an explanatory diagram showing the difference between the method using two light sources and the second embodiment using one light source.
  • FIG. 22 is a diagram for explaining calculation processing for calculating the distance between the pupil center position and the corneal curvature center position.
  • FIG. 23 is a flowchart illustrating an example of calculation processing according to the second embodiment.
  • FIG. 24 is a diagram illustrating a method of calculating the position of the corneal curvature center using the distance obtained in advance.
  • FIG. 25 is a flowchart illustrating an example of a line-of-sight detection process according to the second embodiment.
  • FIG. 26 is a diagram for explaining a calculation process of the modification.
  • FIG. 27 is a flowchart illustrating an example of a modification calculation process.
  • non-contact gaze point detection is known as one usage scene of the pupil detection device.
  • as a specific method, there is a method of irradiating the eye with a near-infrared point light source (an LED or the like) and estimating the line of sight from the light source image reflected by the cornea and the position of the pupil.
  • the gaze point detection performance largely depends on the detection accuracy of the center coordinates of the pupil and the cornea reflection.
  • the pupil detection device (diagnosis support device) of the first embodiment detects the pupil center position in the eye region image by setting individual regions in each of the X direction and the Y direction, for example, and obtaining the luminance centroid of each region. This eliminates the influence of corneal reflection and enables highly accurate detection with a small amount of computation.
  • the pupil detection device of the present embodiment can be used for a non-contact gaze detection device or the like; improving the accuracy of pupil detection improves the overall performance of the line-of-sight detection device.
  • the pupil detection device of the present embodiment can be used for a diagnosis support device that supports the diagnosis of developmental disabilities using the pupil detection result. Below, an example in which the pupil detection device is used in such a diagnosis support device is described. Applicable devices are not limited to the line-of-sight detection device and the diagnosis support device.
  • FIG. 1 is a diagram illustrating an example of an arrangement of a display unit, a stereo camera, and a light source used in the first embodiment.
  • a set of stereo cameras 102 is arranged below the display screen 101.
  • the stereo camera 102 is an imaging unit that can perform stereo shooting with infrared rays, and includes a right camera 202 and a left camera 204.
  • infrared LED (Light Emitting Diode) light sources 203 and 205 are arranged circumferentially around the lenses of the two cameras, respectively.
  • the infrared LED light sources 203 and 205 are light sources that irradiate near infrared rays having a wavelength of 850 nm, for example.
  • the pupils of the subject are detected using illumination from the infrared LED light sources 203 and 205. Details of the pupil detection method will be described later.
  • to specify positions, the space is expressed in coordinates: the center position of the display screen 101 is the origin, the vertical direction is the Y coordinate (up is +), the horizontal direction is the X coordinate (right is +), and the depth is the Z coordinate (front is +).
  • FIG. 2 is a diagram showing an outline of functions of the diagnosis support apparatus 100.
  • FIG. 2 shows a part of the configuration of FIG. 1 together with the components used to drive it.
  • the diagnosis support apparatus 100 includes a right camera 202, a left camera 204, infrared LED light sources 203 and 205, a speaker 105, a drive / IF (interface) unit 208, and a control unit 300.
  • in FIG. 2, the display screen 101 is drawn to show its positional relationship with the right camera 202 and the left camera 204; the display screen 101 is the screen displayed on the display unit 210.
  • the drive unit and the IF unit may be integrated or separate.
  • the speaker 105 functions as an audio output unit that outputs audio or the like for alerting the subject during calibration or the like.
  • the drive / IF unit 208 drives each unit included in the stereo camera 102.
  • the drive / IF unit 208 serves as an interface between each unit included in the stereo camera 102 and the control unit 300.
  • the control unit 300 can be realized by a computer including a control device such as a CPU (Central Processing Unit), storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory), a communication I/F for connecting to a network, and a bus connecting these units.
  • the storage unit 150 stores various information such as a control program, a measurement result, and a diagnosis support result.
  • the storage unit 150 stores, for example, an image to be displayed on the display unit 210.
  • the display unit 210 displays various information such as a target image for diagnosis.
  • FIG. 3 is a block diagram showing an example of detailed functions of each unit shown in FIG. 2. As shown in FIG. 3, a display unit 210 and a drive/IF unit 208 are connected to the control unit 300.
  • the drive / IF unit 208 includes camera IFs 314 and 315, an LED drive control unit 316, and a speaker drive unit 322.
  • the right camera 202 and the left camera 204 are connected to the drive / IF unit 208 via the camera IFs 314 and 315, respectively.
  • the driving / IF unit 208 drives these cameras to image the subject.
  • the infrared LED light source 203 and the infrared LED light source 205 are light sources that irradiate near-infrared rays of 850 nm, for example.
  • the wavelength of the infrared rays to be irradiated is not limited to the above.
  • the speaker driving unit 322 drives the speaker 105.
  • the diagnosis support apparatus 100 may include an interface (printer IF) for connecting to a printer as a printing unit.
  • the printer may be provided inside the diagnosis support apparatus 100.
  • the control unit 300 controls the entire diagnosis support apparatus 100.
  • the control unit 300 includes a specifying unit 351, a first estimation unit 352, a second estimation unit 353, a gaze detection unit 354, a viewpoint detection unit 355, an output control unit 356, an evaluation unit 357, and a pupil position detection unit 358.
  • the pupil detection device may include at least the specifying unit 351, the first estimation unit 352, and the second estimation unit 353.
  • each element included in the control unit 300 (the specifying unit 351, first estimation unit 352, second estimation unit 353, gaze detection unit 354, viewpoint detection unit 355, output control unit 356, evaluation unit 357, and pupil position detection unit 358) may be realized by software (a program), by a hardware circuit, or by a combination of the two.
  • when implemented as a program, the program is recorded as a file in an installable or executable format on a computer-readable recording medium such as a CD-ROM (Compact Disk Read Only Memory), a flexible disk (FD), a CD-R (Compact Disk Recordable), or a DVD (Digital Versatile Disk), and provided as a computer program product.
  • the program may be provided by being stored on a computer connected to a network such as the Internet and downloaded via the network.
  • the program may be provided or distributed via a network such as the Internet.
  • the program may be provided by being incorporated in advance in a ROM or the like.
  • the specifying unit 351 specifies the pupil region and the corneal reflection region from the captured image (an image of the eye) captured by the imaging unit (stereo camera 102).
  • as the specifying method used by the specifying unit 351, for example, a method of specifying a low-luminance (dark) region in the image as the pupil region and a high-luminance (bright) region as the corneal reflection region has conventionally been used; any conventionally used method can be applied.
  • the first estimation unit 352 estimates the center position of the pupil using the region (first region) included in the pupil region.
  • the second estimation unit 353 estimates the center position of the pupil using a region (second region) that is included in the pupil region and is different from the first region.
  • the first estimation unit 352, for example, estimates the center position of the pupil in the first tangential direction using a region (first region) that is bounded by a tangent line (first tangent line) touching the corneal reflection region and by the outer peripheral line of the pupil region, and that does not include the corneal reflection region.
  • similarly, the second estimation unit 353 estimates the center position of the pupil in the second tangential direction using a region (second region) that is bounded by a tangent line (second tangent line) touching the corneal reflection region and orthogonal to the first tangent line, and by the outer peripheral line, and that does not include the corneal reflection region.
  • the pupil position detection unit 358 detects the center position of the pupil based on the center position of the pupil estimated by the first estimation unit 352 and the center position of the pupil estimated by the second estimation unit 353.
  • the pupil position detection unit 358 includes, for example, a straight line that passes through the position estimated by the first estimation unit 352 and is orthogonal to the first tangent line, and a straight line that passes through the position estimated by the second estimation unit 353 and is orthogonal to the second tangent line. The center position of the pupil is detected from the intersection.
  • the first tangent can be a straight line extending in the horizontal direction (X direction), and the second tangent can be a straight line extending in the vertical direction (Y direction).
  • the directions of the first tangent line and the second tangent line are not limited to the horizontal direction and the vertical direction, and can be any direction as long as they are orthogonal to each other.
  • a case where the first tangent is in the horizontal direction (X direction) and the second tangent is in the vertical direction (Y direction) will be described as an example.
  • the center position of the first region in the first tangential direction may be estimated as the center position of the pupil in the first tangential direction.
  • the center position of the second region in the second tangent direction may be estimated as the center position of the pupil in the second tangent direction.
  • the corneal reflection diameter needs to be smaller than the pupil diameter, but this condition is usually satisfied.
  • the gaze detection unit 354 detects the gaze (gaze direction) of the subject using the detected center position of the pupil.
  • the viewpoint detection unit 355 detects the viewpoint of the subject using the detected gaze direction. For example, the viewpoint detection unit 355 detects a viewpoint (gaze point) that is a point that the subject gazes out of the target images displayed on the display screen 101.
  • any conventionally used method can be applied as the line-of-sight detection method; here, a case where the gaze direction and gaze point of the subject are detected using a stereo camera will be described as an example.
  • the line-of-sight detection unit 354 calculates the position (eye position) of the subject's pupil in the three-dimensional world coordinate system using a stereo vision technique.
  • the line-of-sight detection unit 354 calculates the position of the subject's corneal reflection using images taken by the left and right cameras. Then, the gaze detection unit 354 calculates a gaze vector representing the gaze direction of the subject from the position of the pupil of the subject and the position of corneal reflection.
  • the method for detecting the subject's line of sight is not limited to this.
  • the subject's line of sight may be detected by analyzing an image captured using visible light instead of infrared light.
  • the viewpoint detection unit 355 detects, for example, the intersection of the line-of-sight vector represented in the coordinate system as shown in FIG. 1 and the XY plane as the gaze point of the subject.
  • the gaze point may be measured by obtaining the intersection of the left and right gazes of the subject.
  • FIG. 4 is a diagram showing an example of eye and distance detection when two cameras (the right camera 202 and the left camera 204) are used.
  • a camera calibration theory based on a stereo calibration method is applied in advance to obtain camera parameters.
  • as the stereo calibration method, any conventionally used method, such as a method using Tsai's camera calibration theory, can be applied.
  • the three-dimensional coordinates of the eye in the world coordinate system are thereby obtained, and, for example, the distance between the eye and the stereo camera 102 can be estimated.
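  • as an illustration of this triangulation step, the following is a minimal sketch using OpenCV, assuming 3x4 projection matrices P_right and P_left obtained from the prior stereo calibration; the function and variable names are illustrative, not from the patent.

```python
import cv2
import numpy as np

def triangulate_eye(P_right, P_left, pt_right, pt_left):
    """Recover the eye's 3D world coordinates from its 2D image position
    in the right and left camera images of the calibrated stereo pair."""
    pts_r = np.asarray(pt_right, dtype=np.float64).reshape(2, 1)
    pts_l = np.asarray(pt_left, dtype=np.float64).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_right, P_left, pts_r, pts_l)  # 4x1 homogeneous
    X = (X_h[:3] / X_h[3]).ravel()  # Euclidean (x, y, z) in world coordinates
    return X  # the eye-to-camera distance can then be read from the geometry
```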
  • the output control unit 356 controls the output of various information to the display unit 210, the speaker 105, and the like.
  • the output control unit 356 controls the output to the display unit 210 such as the diagnostic image and the evaluation result by the evaluation unit 357.
  • the diagnostic image may be an image corresponding to the evaluation process based on the detection results of the pupil, the line of sight, the viewpoint, and the like.
  • a diagnostic image including an image (such as a geometric pattern image) preferred by a subject with a developmental disorder and other images (such as a person image) may be used.
  • the evaluation unit 357 performs an evaluation process based on the diagnostic image and the gazing point detected by the viewpoint detection unit 355. For example, in the case of diagnosing a developmental disorder, the evaluation unit 357 analyzes the diagnostic image and the gazing point, and evaluates whether or not the image preferred by the subject with the developmental disorder has been gazed.
  • FIG. 5 is a diagram illustrating an example of a captured image captured by the stereo camera 102.
  • the captured image in FIG. 5 is an example of an image obtained by capturing the face of the subject, and includes an eye region 501.
  • FIG. 6 is a diagram illustrating an example of an eye image obtained by cutting out the eye region 501 from the captured image of FIG. 5.
  • the eye image includes a pupil 601, an iris 602, and a corneal reflection 603.
  • the image may be defocused due to the influence of the depth of field of the photographing lens.
  • FIG. 7 is a diagram showing an example of the luminance change of the image near the cornea reflection region.
  • FIG. 8 is a diagram illustrating an example of luminance change of an image near the pupil region.
  • the corneal reflection 603 has a conical luminance distribution 701 that becomes brighter toward the center, as shown in FIG. 7. For this reason, it is common practice to obtain the center coordinates easily and accurately using the luminance centroid. That is, when the pixels in a certain region (the corneal reflection region or the pupil region) are numbered 1, 2, 3, ..., n, the X coordinate and the Y coordinate of the luminance centroid of the region can be calculated using the following equations (1) and (2), respectively.
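  • the equations themselves did not survive extraction here. A standard luminance-centroid form consistent with the surrounding description — a reconstruction, not the patent's verbatim formulas — with pixel $i$ at $(x_i, y_i)$ carrying luminance weight $B_i$, is:

$$x_c = \frac{\sum_{i=1}^{n} B_i\, x_i}{\sum_{i=1}^{n} B_i} \quad (1) \qquad\qquad y_c = \frac{\sum_{i=1}^{n} B_i\, y_i}{\sum_{i=1}^{n} B_i} \quad (2)$$

For the dark pupil, an inverted luminance (for example, $255 - B_i$) would serve as the weight so that darker pixels count more.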
  • the pupil region has a conical luminance distribution 801 that becomes darker toward the center, as shown in FIG. 8. For this reason, as in the case of corneal reflection, a method of obtaining the luminance centroid of the pupil by equations (1) and (2) is conceivable.
  • in the present embodiment, the pupil region used for calculating the luminance centroid is handled separately for the X direction and the Y direction and is limited, in each case, to a range not occluded by corneal reflection. Thereby, the influence of corneal reflection can be avoided and the pupil can be detected with higher accuracy.
  • the formula for calculating the pupil center for example, the above formulas (1) and (2) can be applied, so that the position of the pupil can be calculated with a small amount of calculation.
  • FIGS. 9 to 14 are diagrams schematically showing the eye image of FIG. 6. For the sake of explanation, blurred areas due to defocusing are omitted.
  • FIG. 15 is a flowchart illustrating an example of pupil detection processing according to the present embodiment.
  • the specifying unit 351 cuts out the eye region from the captured image (step S101). Note that when the captured image already contains only the eye, the clipping process may be omitted.
  • the specifying unit 351 specifies the pupil region from the pixel luminance in the eye region (step S102). As illustrated in FIG. 9, the specifying unit 351 specifies, for example, a set of pixels having brightness that is equal to or less than a predetermined threshold as the pupil region 901.
  • the specifying unit 351 obtains the minimum value x1 and maximum value x2 of the X coordinate and the minimum value y1 and maximum value y2 of the Y coordinate of the pupil region (pupil region 901 in FIG. 9) (steps S103 and S104).
  • the specifying unit 351 specifies the corneal reflection region from the pixel luminance in the eye region (step S105). As illustrated in FIG. 9, the specifying unit 351 specifies, for example, a set of pixels having brightness equal to or higher than a predetermined threshold as the cornea reflection region 902. The specifying unit 351 obtains the minimum value x11 and maximum value x22 of the X coordinate and the minimum value y11 and maximum value y22 of the Y coordinate of the cornea reflection region (corneal reflection region 902 in FIG. 9) (steps S106 and S107).
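  • a minimal sketch of steps S102 to S107 follows; the threshold values are assumptions for illustration, not the patent's:

```python
import numpy as np

PUPIL_MAX_LUMA = 50    # assumed threshold: at or below -> pupil region 901
REFLEX_MIN_LUMA = 220  # assumed threshold: at or above -> corneal reflection 902

def region_bounds(mask):
    """Bounding coordinates (min_x, max_x, min_y, max_y) of a boolean mask."""
    ys, xs = np.nonzero(mask)
    return xs.min(), xs.max(), ys.min(), ys.max()

def specify_regions(eye):
    """Steps S102-S107: threshold a grayscale eye image and return the
    bounds of the pupil region and of the corneal reflection region."""
    pupil_mask = eye <= PUPIL_MAX_LUMA      # step S102
    reflex_mask = eye >= REFLEX_MIN_LUMA    # step S105
    x1, x2, y1, y2 = region_bounds(pupil_mask)       # steps S103-S104
    x11, x22, y11, y22 = region_bounds(reflex_mask)  # steps S106-S107
    return pupil_mask, (x1, x2, y1, y2), (x11, x22, y11, y22)
```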
  • the first estimation unit 352 calculates the luminance centroid in the X direction (step S108).
  • An example of a method for obtaining the luminance gravity center in the X direction will be described with reference to FIGS. 10 and 11.
  • the first estimation unit 352 separates the set of pixels (x, y) of the pupil region used in equation (1) into one or two regions.
  • the first region is a region included in “(x1, y22) ⁇ (x, y) ⁇ (x2, y2)”, which is the upper portion of the pupil region.
  • this region corresponds to a region (first region or third region) that is bounded by a tangent line touching the upper portion of the corneal reflection region and by the outer peripheral line of the pupil region, and does not include the corneal reflection region; in FIG. 10, it corresponds to the hatched region 1001.
  • the second region is a region included in “(x1, y11) ⁇ (x, y) ⁇ (x2, y1)”, which is the lower part of the pupil region.
  • this region corresponds to a region (first region or third region) that is bounded by a tangent line touching the lower portion of the corneal reflection region and by the outer peripheral line of the pupil region, and does not include the corneal reflection region.
  • of these, a region satisfying the condition is used for calculating the luminance centroid; in the example of FIG. 10, the region 1001 is used. Since the region 1001 is bilaterally symmetric, the X coordinate of the luminance centroid can be obtained using equation (1).
  • a straight line 1101 in FIG. 11 is a straight line indicating the luminance gravity center in the X direction calculated in this way. It is estimated that the pupil center exists on this straight line 1101.
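  • a sketch of the X-direction centroid of step S108 over the band clear of corneal reflection follows; note that the patent's Y axis points up while NumPy row indices grow downward, so the band bounds would be flipped for raw image arrays, and the inverted-luminance weight is an assumption consistent with the dark pupil:

```python
import numpy as np

def centroid_x(eye, pupil_mask, y_lo, y_hi):
    """Luminance centroid in X (step S108) over pupil pixels whose y lies
    in the band [y_lo, y_hi] clear of corneal reflection (e.g. region 1001)."""
    ys, xs = np.nonzero(pupil_mask)
    keep = (ys >= y_lo) & (ys <= y_hi)
    # inverted luminance as weight: darker (more pupil-like) pixels count more
    w = 255.0 - eye[ys[keep], xs[keep]].astype(np.float64)
    return float((w * xs[keep]).sum() / w.sum())  # equation (1) over the band
```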
  • the second estimation unit 353 calculates the luminance centroid in the Y direction (step S109).
  • An example of a method for obtaining the luminance centroid in the Y direction will be described with reference to FIGS. 12 and 13.
  • the second estimation unit 353 separates a set of pixels (x, y) in the pupil region into one or two regions in Equation (2).
  • the first region is a region included in “(x1, y1) ⁇ (x, y) ⁇ (x11, y2)”, which is the left part of the pupil region.
  • this region corresponds to a region (second region or fourth region) that is bounded by a tangent line touching the left portion of the corneal reflection region and by the outer peripheral line of the pupil region, and does not include the corneal reflection region; in FIG. 12, it corresponds to the hatched region 1201.
  • the second region is a region included in “(x22, y1) ⁇ (x, y) ⁇ (x2, y2)”, which is the right part of the pupil region.
  • this region corresponds to a region (second region or fourth region) that is bounded by a tangent line touching the right portion of the corneal reflection region and by the outer peripheral line of the pupil region, and does not include the corneal reflection region; in FIG. 12, it corresponds to the hatched region 1202.
  • the second estimation unit 353 calculates the luminance centroid using, for example, the region with the larger area. In FIG. 12, the regions 1201 and 1202 are compared, and the larger one, region 1202, is used for the calculation. Since the region 1202 is vertically symmetric, the Y coordinate of the luminance centroid can be obtained using equation (2).
  • a straight line 1301 in FIG. 13 is a straight line indicating the luminance gravity center in the Y direction calculated in this way. It is estimated that the pupil center exists on this straight line 1301.
  • alternatively, the average of the Y coordinate of the luminance centroid obtained from the first region and the Y coordinate of the luminance centroid obtained from the second region may be calculated as the luminance centroid in the Y direction.
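  • a sketch of the Y-direction centroid of step S109, choosing the larger of the left and right bands, under the same assumptions as above:

```python
import numpy as np

def centroid_y(eye, pupil_mask, x1, x11, x22, x2):
    """Luminance centroid in Y (step S109) using whichever of the left band
    [x1, x11] (region 1201) or the right band [x22, x2] (region 1202) of
    the pupil region has the larger area."""
    ys, xs = np.nonzero(pupil_mask)
    left = (xs >= x1) & (xs <= x11)
    right = (xs >= x22) & (xs <= x2)
    keep = right if right.sum() >= left.sum() else left
    w = 255.0 - eye[ys[keep], xs[keep]].astype(np.float64)
    return float((w * ys[keep]).sum() / w.sum())  # equation (2) over the band
```

  • the pupil center of step S110 is then simply the point (centroid_x(...), centroid_y(...)).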
  • as shown in FIG. 14, the point where the straight line 1401 indicating the luminance centroid in the X direction intersects the straight line 1402 indicating the luminance centroid in the Y direction is the pupil center 1403 (step S110).
  • the pupil position detection unit 358 detects an intersection where the straight line 1401 and the straight line 1402 intersect as the pupil center 1403.
  • the pupil center can be obtained accurately with a small amount of calculation.
  • according to the present embodiment, the following effects can be obtained: (1) when the corneal reflection diameter is smaller than the pupil diameter, the pupil center coordinates can be obtained more accurately; (2) since the amount of calculation is small, the pupil can be detected even with a less powerful CPU.
  • the gaze detection apparatus and the gaze detection method of the second embodiment will be described in detail based on the drawings.
  • this invention is not limited by this embodiment.
  • an example in which the line-of-sight detection apparatus is used as a diagnosis support apparatus that supports the diagnosis of developmental disabilities using the line-of-sight detection result will be described; applicable devices are not limited to diagnosis support devices.
  • the line-of-sight detection apparatus (diagnosis support apparatus) of the present embodiment detects the line of sight using an illumination unit installed at one place.
  • the line-of-sight detection device (diagnosis support device) of the present embodiment calculates the corneal curvature center position with high accuracy by using a result obtained by gazing at one point on the subject before the line-of-sight detection.
  • an illumination unit is an element that includes a light source and can irradiate light onto the subject's eyeball.
  • the light source is an element that generates light, such as an LED (Light Emitting Diode).
  • a light source may be composed of one LED, or may be composed of a plurality of LEDs combined and arranged at one place. Hereinafter, "light source" may be used as a term representing the illumination unit in this way.
  • FIGS. 16 and 17 are diagrams illustrating an example of the arrangement of the display unit, the stereo camera, the infrared light source, and the subject according to the second embodiment.
  • the same components as those in the first embodiment may be given the same reference numerals, and description thereof may be omitted.
  • the diagnosis support apparatus includes a display unit 210, a stereo camera 2102, and an LED light source 2103.
  • the stereo camera 2102 is disposed below the display unit 210.
  • the LED light source 2103 is arranged at the center position of two cameras included in the stereo camera 2102.
  • the LED light source 2103 is a light source that irradiates near infrared rays having a wavelength of 850 nm, for example.
  • FIG. 16 shows an example in which an LED light source 2103 (illumination unit) is configured by nine LEDs.
  • the stereo camera 2102 uses a lens that can transmit near-infrared light having a wavelength of 850 nm.
  • the stereo camera 2102 includes a right camera 2202 and a left camera 2203.
  • the LED light source 2103 irradiates near-infrared light toward the eyeball 111 of the subject.
  • in the camera image, the pupil 112 appears dark with low luminance, and the corneal reflection 113, produced as a virtual image in the eyeball 111, appears bright with high luminance. Accordingly, the positions of the pupil 112 and the corneal reflection 113 on the image can be acquired by each of the two cameras (the right camera 2202 and the left camera 2203).
  • the three-dimensional world coordinate values of the positions of the pupil 112 and the corneal reflection 113 are calculated from the positions of the pupil 112 and the corneal reflection 113 obtained by two cameras.
  • as in the first embodiment, the vertical direction is the Y coordinate (up is +), the horizontal direction is the X coordinate (right is +), and the depth is the Z coordinate (front is +).
  • FIG. 18 is a diagram illustrating an outline of functions of the diagnosis support apparatus 2100 according to the second embodiment.
  • FIG. 18 shows a part of the configuration shown in FIGS. 16 and 17 and a configuration used for driving the configuration.
  • the diagnosis support apparatus 2100 includes a right camera 2202, a left camera 2203, an LED light source 2103, a speaker 105, a drive / IF (interface) unit 208, a control unit 2300, and a storage unit 150.
  • in FIG. 18, the display screen 101 is drawn to show its positional relationship with the right camera 2202 and the left camera 2203; the display screen 101 is the screen displayed on the display unit 210.
  • the drive unit and the IF unit may be integrated or separate.
  • the speaker 105 functions as an audio output unit that outputs audio or the like for alerting the subject during calibration or the like.
  • the drive / IF unit 208 drives each unit included in the stereo camera 2102.
  • the drive / IF unit 208 serves as an interface between each unit included in the stereo camera 2102 and the control unit 2300.
  • the control unit 2300 can be realized, for example, by a computer including a control device such as a CPU (Central Processing Unit), storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory), a communication I/F for connecting to a network, and a bus connecting these units.
  • the storage unit 150 stores various information such as a control program, a measurement result, and a diagnosis support result.
  • the storage unit 150 stores, for example, an image to be displayed on the display unit 210.
  • the display unit 210 displays various information such as a target image for diagnosis.
  • FIG. 19 is a block diagram illustrating an example of detailed functions of each unit illustrated in FIG. 18. As shown in FIG. 19, a display unit 210 and a drive/IF unit 208 are connected to the control unit 2300.
  • the drive / IF unit 208 includes camera IFs 314 and 315, an LED drive control unit 316, and a speaker drive unit 322.
  • the right camera 2202 and the left camera 2203 are connected to the drive / IF unit 208 via the camera IFs 314 and 315, respectively.
  • the driving / IF unit 208 drives these cameras to image the subject.
  • the speaker driving unit 322 drives the speaker 105.
  • the diagnosis support apparatus 2100 may include an interface (printer IF) for connecting to a printer as a printing unit. Further, the printer may be provided inside the diagnosis support apparatus 2100.
  • the control unit 2300 controls the entire diagnosis support apparatus 2100.
  • the control unit 2300 includes a first calculation unit 2351, a second calculation unit (corneal reflection center calculation unit) 2352, a third calculation unit (corneal curvature center calculation unit) 2353, a line-of-sight detection unit 2354, a viewpoint detection unit 2355, an output control unit 2356, and an evaluation unit 2357.
  • the line-of-sight detection device may include at least the first calculation unit 2351, the second calculation unit 2352, the third calculation unit 2353, and the line-of-sight detection unit 2354.
  • each element included in the control unit 2300 (the first calculation unit 2351, second calculation unit 2352, third calculation unit 2353, line-of-sight detection unit 2354, viewpoint detection unit 2355, output control unit 2356, and evaluation unit 2357) may be realized by software (a program), by a hardware circuit, or by a combination of the two.
  • when implemented as a program, the program is recorded as a file in an installable or executable format on a computer-readable recording medium such as a CD-ROM (Compact Disk Read Only Memory), a flexible disk (FD), a CD-R (Compact Disk Recordable), or a DVD (Digital Versatile Disk), and provided as a computer program product.
  • the program may be provided by being stored on a computer connected to a network such as the Internet and downloaded via the network.
  • the program may be provided or distributed via a network such as the Internet.
  • the program may be provided by being incorporated in advance in a ROM or the like.
  • the first calculation unit 2351 calculates the position (first position) of the pupil center indicating the center of the pupil from the eyeball image captured by the stereo camera 2102.
  • the second calculator 2352 calculates the position of the corneal reflection center (second position) indicating the center of corneal reflection from the captured image of the eyeball.
  • the third calculation unit 2353 calculates the corneal curvature center (fourth position) from the straight line (first straight line) connecting the LED light source 2103 and the corneal reflection center. For example, the third calculation unit 2353 calculates a position on the straight line where the distance from the corneal reflection center is a predetermined value as the corneal curvature center. As the predetermined value, a value determined in advance from a general radius of curvature of the cornea or the like can be used.
  • the third calculation unit 2353 may also calculate the corneal curvature center in consideration of individual differences. In this case, using the pupil center and the corneal reflection center calculated while the subject gazes at a target position (third position), the third calculation unit 2353 first calculates the intersection of the straight line (second straight line) connecting the pupil center and the target position with the first straight line connecting the corneal reflection center and the LED light source 2103. The third calculation unit 2353 then calculates the distance (first distance) between the pupil center and the calculated intersection and stores it in, for example, the storage unit 150.
  • the target position may be any predetermined position whose three-dimensional world coordinate value can be calculated.
  • the center position of the display screen 101 (the origin of the three-dimensional world coordinates) can be set as the target position.
  • the output control unit 2356 displays, at the target position on the display screen 101, an image (target image) or the like that causes the subject to gaze at that position. Thereby, the subject can be made to gaze at the target position.
  • the target image may be any image that draws the subject's attention.
  • for example, an image whose display mode, such as luminance or color, changes, or an image whose display mode differs from other regions, can be used as the target image.
  • the target position is not limited to the center of the display screen 101 and may be an arbitrary position; however, setting the center of the screen as the target position minimizes the distance to any edge of the screen, which, for example, makes measurement errors during gaze detection smaller.
  • the processing up to the calculation of the distance is executed in advance, for example, before actual gaze detection is started.
  • at the time of gaze detection, the third calculation unit 2353 calculates, as the corneal curvature center, the position on the straight line connecting the LED light source 2103 and the corneal reflection center whose distance from the pupil center equals the previously calculated distance.
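  • a sketch of this computation: find the point on the light-source-to-corneal-reflection line at the stored distance from the pupil center. Taking the root deeper in the eye is an assumption for illustration; the names are not from the patent.

```python
import numpy as np

def curvature_center_from_distance(light_pos, reflex_center, pupil_center, dist):
    """Point on the line from the LED light source through the corneal
    reflection center at the pre-calibrated distance `dist` from the
    pupil center."""
    d = reflex_center - light_pos
    d = d / np.linalg.norm(d)
    oc = light_pos - pupil_center
    # |light_pos + t*d - pupil_center|^2 = dist^2  ->  t^2 + b*t + c = 0
    b = 2.0 * d.dot(oc)
    c = oc.dot(oc) - dist ** 2
    disc = b * b - 4.0 * c
    if disc < 0.0:
        raise ValueError("no point on the line at the required distance")
    t = (-b + np.sqrt(disc)) / 2.0  # farther root: behind the corneal surface
    return light_pos + t * d
```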
  • the line-of-sight detection unit 2354 detects the line of sight of the subject from the pupil center and the corneal curvature center.
  • the gaze detection unit 2354 detects the direction from the corneal curvature center to the pupil center as the gaze direction of the subject.
  • the viewpoint detection unit 2355 detects the viewpoint of the subject using the detected gaze direction.
  • the viewpoint detection unit 2355 detects, for example, a viewpoint (gaze point) that is a point on the display screen 101 where the subject gazes.
  • the viewpoint detection unit 2355 detects, for example, the intersection of the line-of-sight vector represented in the three-dimensional world coordinate system as shown in FIG. 17 and the XY plane as the gaze point of the subject.
  • the output control unit 2356 controls the output of various information to the display unit 210, the speaker 105, and the like. For example, the output control unit 2356 outputs the target image at the target position on the display unit 210. Further, the output control unit 2356 controls the output to the display unit 210 such as the diagnostic image and the evaluation result by the evaluation unit 2357.
  • the diagnostic image may be an image according to the evaluation process based on the line-of-sight (viewpoint) detection result.
  • a diagnostic image including an image (such as a geometric pattern image) preferred by a subject with a developmental disorder and other images (such as a person image) may be used.
  • Evaluation unit 2357 performs an evaluation process based on the diagnostic image and the gazing point detected by the viewpoint detection unit 2355. For example, in the case of diagnosing a developmental disorder, the evaluation unit 2357 analyzes the diagnostic image and the gazing point, and evaluates whether or not the image preferred by the subject with the developmental disorder has been gazed.
  • the output control unit 2356 may display the same diagnostic image as in the first embodiment, and the evaluation unit 2357 may perform the same evaluation process as the evaluation unit 357 in the first embodiment.
  • the pupil detection processing (specifying unit 351, first estimation unit 352, second estimation unit 353, pupil position detection unit 358) and gaze detection processing (gaze detection unit 354) of the first embodiment may be used as the pupil detection processing (first calculation unit 2351) and gaze detection processing (second calculation unit 2352, third calculation unit 2353, gaze detection unit 2354) of the second embodiment.
  • FIG. 20 is a diagram illustrating an outline of processing executed by the diagnosis support apparatus 2100 of the present embodiment.
  • the elements described in FIGS. 16 to 19 are denoted by the same reference numerals and description thereof is omitted.
  • the pupil center 407 and the corneal reflection center 408 represent the center of the pupil detected when the LED light source 2103 is turned on and the center of the corneal reflection point, respectively.
  • the corneal curvature radius 409 represents the distance from the corneal surface to the corneal curvature center 410.
  • FIG. 21 is an explanatory diagram showing the difference between a method using two light sources (illumination units) (hereinafter, method A) and the present embodiment, which uses one light source (illumination unit).
  • Method A uses two LED light sources 511 and 512 instead of the LED light source 2103.
  • in method A, the intersection of a straight line 515, connecting the corneal reflection center 513 and the LED light source 511 when the LED light source 511 irradiates the eye, with a straight line 516, connecting the corneal reflection center 514 and the LED light source 512 when the LED light source 512 irradiates the eye, is calculated. This intersection is the corneal curvature center 505.
  • in the present embodiment, consider a straight line 523 connecting the corneal reflection center 522 and the LED light source 2103 when the LED light source 2103 irradiates the eye. The straight line 523 passes through the corneal curvature center 505. The radius of curvature of the cornea is also known to be almost constant, with little variation between individuals. Thus, the corneal curvature center under illumination by the LED light source 2103 lies on the straight line 523 and can be calculated using a general curvature radius value.
  • however, when the viewpoint is calculated using a general curvature radius value, the detected viewpoint position may deviate from the true position due to individual differences in eyeballs, and accurate viewpoint detection may not be possible.
  • FIG. 22 is a diagram for explaining calculation processing for calculating the corneal curvature center position and the distance between the pupil center position and the corneal curvature center position before performing viewpoint detection (line-of-sight detection).
  • the elements described in FIGS. 16 to 19 are denoted by the same reference numerals and description thereof is omitted.
  • the connection between the left and right cameras (the right camera 2202 and the left camera 2203) and the control unit 2300 is omitted from the figure.
  • the target position 605 is a position at one point on the display unit 210 at which a target image or the like is displayed and at which the subject is made to stare; in the present embodiment, it is the center position of the display screen 101.
  • a straight line 613 is a straight line connecting the LED light source 2103 and the corneal reflection center 612.
  • a straight line 614 is a straight line connecting the target position 605 (gaze point) that the subject looks at and the pupil center 611.
  • a corneal curvature center 615 is an intersection of the straight line 613 and the straight line 614.
  • the third calculation unit 2353 calculates and stores the distance 616 between the pupil center 611 and the corneal curvature center 615.
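  • two measured 3D lines rarely intersect exactly; a common numerical treatment (an assumption here, not spelled out in the text) takes the midpoint of the shortest segment between them as the "intersection", i.e. the corneal curvature center 615:

```python
import numpy as np

def nearest_point_of_two_lines(p1, d1, p2, d2):
    """Pseudo-intersection of two 3D lines p + t*d: the midpoint of the
    shortest segment between them (e.g. lines 613 and 614)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    a, b, c = d1.dot(d1), d1.dot(d2), d2.dot(d2)
    w = p1 - p2
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("lines are parallel")
    # parameters minimizing |(p1 + t1*d1) - (p2 + t2*d2)|
    t1 = (b * d2.dot(w) - c * d1.dot(w)) / denom
    t2 = (a * d2.dot(w) - b * d1.dot(w)) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
```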
  • FIG. 23 is a flowchart illustrating an example of calculation processing according to the present embodiment.
  • the output control unit 2356 reproduces the target image at one point on the display screen 101 (step S201), and causes the subject to gaze at the one point.
  • the control unit 2300 turns on the LED light source 2103 toward the eyes of the subject using the LED drive control unit 316 (step S202).
  • the control unit 2300 images the eyes of the subject with the left and right cameras (the right camera 2202 and the left camera 2203) (step S203).
  • under irradiation by the LED light source 2103, the pupil part is detected as a dark part (dark pupil), and the corneal reflection virtual image produced by the LED irradiation is detected as a bright part (the corneal reflection point, or corneal reflection center). That is, the first calculation unit 2351 detects the pupil part from the captured image and calculates coordinates indicating the position of the pupil center. For example, the first calculation unit 2351 detects, as the pupil part, a region of predetermined brightness or lower that includes the darkest part within a certain region including the eye, and detects, as the corneal reflection, a region of predetermined brightness or higher that includes the brightest part.
  • the second calculation unit 2352 detects a corneal reflection portion from the captured image, and calculates coordinates indicating the position of the corneal reflection center.
  • the first calculation unit 2351 and the second calculation unit 2352 calculate each coordinate value for each of two images acquired by the left and right cameras (step S204).
  • the left and right cameras are pre-calibrated with a stereo calibration method to obtain three-dimensional world coordinates, and conversion parameters are calculated.
  • as the stereo calibration method, any conventionally used method, such as a method using Tsai's camera calibration theory, can be applied.
  • the first calculation unit 2351 and the second calculation unit 2352 use the conversion parameters to convert the coordinates of the left and right cameras into the three-dimensional world coordinates of the pupil center and the corneal reflection center (step S205).
  • the third calculation unit 2353 calculates a straight line connecting the world coordinates of the corneal reflection center and the world coordinates of the center of the LED light source 2103 (step S206).
  • the third calculation unit 2353 calculates a straight line connecting the world coordinates of the center of the target image displayed at one point on the display screen 101 and the world coordinates of the pupil center (step S207).
  • the third calculation unit 2353 obtains the intersection of the straight line calculated in step S206 and the straight line calculated in step S207 (step S208); this intersection is the corneal curvature center.
  • the third calculation unit 2353 calculates the distance between the pupil center and the corneal curvature center at this time, and stores it in the storage unit 150 or the like (step S209). The stored distance is used to calculate the corneal curvature center at the time of subsequent detection of the viewpoint (line of sight).
  • it is assumed that the distance between the pupil center and the corneal curvature center obtained while looking at one point on the display unit 210 in the calculation process remains constant within the range in which the viewpoint on the display unit 210 is detected.
  • the distance between the pupil center and the corneal curvature center may be obtained from the average of all the values calculated during playback of the target image, or from the average of a subset of those values.
  • FIG. 24 is a diagram showing a method of calculating the corrected position of the corneal curvature center using the distance between the pupil center and the corneal curvature center obtained in advance when performing viewpoint detection.
  • a gazing point 805 represents a gazing point obtained from a corneal curvature center calculated using a general curvature radius value.
  • a gazing point 806 represents a gazing point obtained from a corneal curvature center calculated using a distance obtained in advance.
  • the pupil center 811 and the corneal reflection center 812 indicate the position of the pupil center and the position of the corneal reflection center calculated at the time of viewpoint detection, respectively.
  • a straight line 813 is a straight line connecting the LED light source 2103 and the corneal reflection center 812.
  • the corneal curvature center 814 is the position of the corneal curvature center calculated from a general curvature radius value.
  • the distance 815 is the distance between the pupil center and the corneal curvature center calculated by the prior calculation process.
  • the corneal curvature center 816 is the position of the corneal curvature center calculated using the distance obtained in advance.
  • the corneal curvature center 816 is obtained from the fact that the corneal curvature center exists on the straight line 813 and the distance between the pupil center and the corneal curvature center is the distance 815.
  • the line of sight 817 calculated when a general radius of curvature value is used is corrected to the line of sight 818.
  • the gazing point on the display screen 101 is corrected from the gazing point 805 to the gazing point 806.
  • the connection between the left and right cameras (the right camera 2202 and the left camera 2203) and the control unit 2300 is omitted from the figure.
  • FIG. 25 is a flowchart illustrating an example of a line-of-sight detection process according to the present embodiment.
  • the line-of-sight detection process of FIG. 25 can be executed as the process of detecting the line of sight in the diagnostic process using the diagnostic image.
  • a process for displaying a diagnostic image, an evaluation process by the evaluation unit 2357 using the detection result of the gazing point, and the like are executed.
  • Steps S301 to S305 are the same as steps S202 to S206 in FIG. 23.
  • the third calculation unit 2353 calculates, as the corneal curvature center, a position that is on the straight line calculated in step S305 and whose distance from the pupil center is equal to the distance obtained by the previous calculation process (step S306).
  • the line-of-sight detection unit 2354 obtains a vector (line-of-sight vector) connecting the pupil center and the corneal curvature center (step S307). This vector indicates the line-of-sight direction viewed by the subject.
  • the viewpoint detection unit 2355 calculates the three-dimensional world coordinate value of the intersection between the line-of-sight direction and the display screen 101 (step S308). This value is a coordinate value representing one point on the display unit 210 that the subject gazes in world coordinates.
  • the viewpoint detection unit 2355 converts the obtained three-dimensional world coordinate value into a coordinate value (x, y) represented in the two-dimensional coordinate system of the display unit 210 (step S309). Thereby, the viewpoint (gaze point) on the display part 210 which a test subject looks at can be calculated.
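  • a sketch of steps S307 to S308, intersecting the line of sight with the display plane, taken here as the world plane z = 0 (an assumption consistent with the coordinate system of FIG. 17, where the screen lies in the XY plane):

```python
import numpy as np

def gaze_point(curvature_center, pupil_center):
    """Steps S307-S308: the line-of-sight vector runs from the corneal
    curvature center through the pupil center; return its intersection
    with the display plane z = 0 as world (x, y)."""
    g = pupil_center - curvature_center   # line-of-sight vector (step S307)
    if g[2] == 0.0:
        raise ValueError("line of sight is parallel to the display plane")
    t = -curvature_center[2] / g[2]       # parameter at which z reaches 0
    hit = curvature_center + t * g        # 3D gaze point (step S308)
    return hit[0], hit[1]
```

  • step S309 would then map these world coordinates into the display's two-dimensional coordinate system (a scale-and-offset transform whose details are not given in the text).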
  • the calculation process for calculating the distance between the pupil center position and the corneal curvature center position is not limited to the method described with reference to FIGS. 22 and 23. Hereinafter, another example of the calculation process will be described with reference to FIGS. 26 and 27.
  • FIG. 26 is a diagram for explaining the calculation process of the present modification. The elements described in FIGS. 16 to 19 and FIG. 22 are given the same reference numerals, and description thereof is omitted.
  • the line segment 1101 is a line segment (first line segment) connecting the target position 605 and the LED light source 2103.
  • a line segment 1102 is a line segment (second line segment) that is parallel to the line segment 1101 and connects the pupil center 611 and the straight line 613.
  • the distance 616 between the pupil center 611 and the corneal curvature center 615 is calculated and stored using the line segment 1101 and the line segment 1102 as follows.
  • FIG. 27 is a flowchart showing an example of calculation processing of the present modification.
  • Steps S401 to S407 are the same as steps S201 to S207 in FIG. 23.
  • the third calculation unit 2353 calculates a line segment (line segment 1101 in FIG. 26) connecting the center of the target image displayed at one point on the display screen 101 and the center of the LED light source 2103, and calculates the length of this line segment (L1101) (step S408).
  • the third calculation unit 2353 calculates a line segment (line segment 1102 in FIG. 26) that passes through the pupil center 611 and is parallel to the line segment calculated in step S408, and calculates its length (L1102) (step S409).
  • the third calculation unit 2353 calculates the distance 616 between the pupil center 611 and the corneal curvature center 615 based on the fact that the triangle having the corneal curvature center 615 as its apex and the line segment calculated in step S408 as its base is similar to the triangle having the corneal curvature center 615 as its apex and the line segment calculated in step S409 as its base (step S410). For example, the third calculation unit 2353 calculates the distance 616 so that the ratio of the length of the line segment 1102 to the length of the line segment 1101 equals the ratio of the distance 616 to the distance between the target position 605 and the corneal curvature center 615.
  • The distance 616 can then be calculated by the following equation (3), where L614 is the distance from the target position 605 to the pupil center 611 (a numeric sketch follows the equation):
  • Distance 616 = (L614 × L1102) / (L1101 − L1102)   (3)
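  As a numeric check on equation (3), a short sketch under the similar-triangle relation of step S410 (Python; the function name and the validity check are hypothetical additions):

```python
def pupil_to_curvature_distance(L614, L1101, L1102):
    """Equation (3), from the two triangles sharing the apex 615:
    616 / (616 + L614) = L1102 / L1101  =>  616 = (L614 * L1102) / (L1101 - L1102)."""
    if L1101 <= L1102:
        raise ValueError("line segment 1101 must be longer than line segment 1102")
    return (L614 * L1102) / (L1101 - L1102)

# Example: L1101 = 100, L1102 = 20, L614 = 40 gives distance 616 = 10,
# and indeed 10 / (10 + 40) = 20 / 100.
```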
  • The third calculation unit 2353 stores the calculated distance 616 in the storage unit 150 or the like (step S411).
  • The stored distance is used to calculate the corneal curvature center in subsequent viewpoint (line-of-sight) detection.
  • As described above, this modification provides the following effects: (1) light sources (illumination units) need not be arranged at two locations, so line-of-sight detection can be performed with a light source arranged at a single location; and (2) since there is only one light source, the apparatus can be made compact and its cost can be reduced.
  • The pupil detection device, the line-of-sight detection device, and the pupil detection method according to the present invention are suitable for a diagnosis support device and a diagnosis support method for developmental disabilities that use captured images.
  • Diagnosis support apparatus 101 Display screen 102, 2102 Stereo camera 105 Speaker 150 Storage unit 202, 2202 Right camera 203, 205 Infrared LED light source 204, 2204 Left camera 208 Drive/IF unit 210 Display unit 300, 2300 Control unit 316 LED drive control unit 322 Speaker drive unit 351 Identification unit 352 First estimation unit 353 Second estimation unit 354, 2354 Line-of-sight detection unit 355, 2355 Viewpoint detection unit 356, 2356 Output control unit 357, 2357 Evaluation unit 2351 First calculation unit 2352 Second calculation unit (corneal reflection center calculation unit) 2353 Third calculation unit (corneal curvature center calculation unit)

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Multimedia (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

A pupil detection device is provided with: a specification unit (351) that specifies, from a captured image of an eye, a pupil region and a corneal reflection region; a first estimation unit (352) that estimates the center position of the pupil using a first region contained in the pupil region; a second estimation unit (353) that estimates the center position of the pupil using a second region contained in the pupil region and different from the first region; and a pupil position detection unit (358) that detects the center position of the pupil on the basis of the center position of the pupil estimated by the first estimation unit (352) and the center position of the pupil estimated by the second estimation unit (353).
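For orientation, a minimal sketch of the architecture the abstract describes might look as follows (Python/NumPy; the left/right split of the pupil region and the plain averaging in the combination step are illustrative assumptions, not the partition or combination rule actually claimed):

```python
import numpy as np

def centroid(mask):
    """Centroid (x, y) of the True pixels in a boolean mask."""
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])

def detect_pupil_center(pupil_mask):
    """Estimate the pupil center from two different sub-regions of the pupil
    region (cf. units 352 and 353) and combine the estimates (cf. unit 358)."""
    xs = np.nonzero(pupil_mask)[1]
    mid = int(round(xs.mean()))                        # arbitrary dividing column
    first = pupil_mask.copy()
    first[:, mid:] = False                             # first region of the pupil
    second = pupil_mask.copy()
    second[:, :mid] = False                            # a different, second region
    return (centroid(first) + centroid(second)) / 2.0  # plain mean as a placeholder
```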
PCT/JP2014/080682 2013-11-29 2014-11-19 Pupil detection device, line-of-sight detection device, and pupil detection method WO2015080003A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2013-247956 2013-11-29
JP2013247956 2013-11-29
JP2014043098A JP6269177B2 (ja) 2013-11-29 2014-03-05 Pupil detection device, line-of-sight detection device, and pupil detection method
JP2014-043098 2014-03-05

Publications (1)

Publication Number Publication Date
WO2015080003A1 (fr) 2015-06-04

Family

ID=53198947

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/080682 WO2015080003A1 (fr) 2013-11-29 2014-11-19 Pupil detection device, line-of-sight detection device, and pupil detection method

Country Status (2)

Country Link
JP (1) JP6269177B2 (fr)
WO (1) WO2015080003A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3011894A1 (fr) * 2014-10-24 2016-04-27 JVC KENWOOD Corporation Gaze detection apparatus and gaze detection method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08327887A (ja) * 1995-05-31 1996-12-13 Matsushita Electric Ind Co Ltd Line-of-sight detection device
JP2003144388A (ja) * 2001-11-16 2003-05-20 Canon Inc Line-of-sight detection device
JP2008264341A (ja) * 2007-04-24 2008-11-06 Chube Univ Eye movement measurement method and eye movement measurement device

Also Published As

Publication number Publication date
JP6269177B2 (ja) 2018-01-31
JP2015126850A (ja) 2015-07-09

Similar Documents

Publication Publication Date Title
US10722113B2 (en) Gaze detection apparatus and gaze detection method
EP3075304B1 (fr) Line-of-sight detection assistance device and line-of-sight detection assistance method
US10896324B2 (en) Line-of-sight detection device and method for detecting line of sight
EP3123943B1 (fr) Detection device and detection method
US11023039B2 (en) Visual line detection apparatus and visual line detection method
JP6201956B2 (ja) Line-of-sight detection device and line-of-sight detection method
JP6245093B2 (ja) Diagnosis support device and diagnosis support method
JP2016028669A (ja) Pupil detection device and pupil detection method
JP2020038734A (ja) Line-of-sight detection device and line-of-sight detection method
EP3028644B1 (fr) Diagnosis assistance device and diagnosis assistance method
JP6269177B2 (ja) Pupil detection device, line-of-sight detection device, and pupil detection method
US20130321608A1 (en) Eye direction detecting apparatus and eye direction detecting method
JP6187347B2 (ja) Detection device and detection method
JP6471533B2 (ja) Line-of-sight detection device and line-of-sight detection method
JP2017131446A (ja) Pupil detection device and pupil detection method
JP2020119583A (ja) Line-of-sight detection device and line-of-sight detection method
JP2016157326A (ja) Line-of-sight detection device and line-of-sight detection method
JP2015181797A (ja) Detection device and detection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14866490

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14866490

Country of ref document: EP

Kind code of ref document: A1