WO2021131335A1 - Corneal reflex detection device, line-of-sight detection device, corneal reflex detection method, and corneal reflex detection program - Google Patents

Corneal reflex detection device, line-of-sight detection device, corneal reflex detection method, and corneal reflex detection program Download PDF

Info

Publication number
WO2021131335A1
WO2021131335A1 (PCT/JP2020/041241, JP2020041241W)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
subject
eyeball
corneal reflex
light
Prior art date
Application number
PCT/JP2020/041241
Other languages
English (en)
Japanese (ja)
Inventor
修二 箱嶋
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Jvcケンウッド
Publication of WO2021131335A1

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras

Definitions

  • The present disclosure relates to a corneal reflex detection device, a line-of-sight detection device, a corneal reflex detection method, and a corneal reflex detection program.
  • The corneal reflex method is known as one of the line-of-sight detection techniques. In this method, a subject is irradiated with infrared light emitted from a light source, the subject's eyeball is photographed with a camera, and the subject's line of sight is detected from the position of the pupil relative to the corneal reflex, which is a reflection image of the light source on the corneal surface.
  • For example, a method of detecting the position of the pupil using a difference image of a dark pupil and a bright pupil has been proposed (see, for example, Patent Document 1).
  • However, the method of Patent Document 1 cannot easily and accurately detect the position of the corneal reflex.
  • The present disclosure has been made in view of the above, and its purpose is to provide a corneal reflex detection device, a line-of-sight detection device, a corneal reflex detection method, and a corneal reflex detection program capable of easily and accurately detecting the position of a corneal reflex.
  • The corneal reflex detection device includes: an image acquisition device having a light source capable of irradiating the eyeball of a subject and a photographing unit that photographs the eyeball of the subject to generate image data, the image acquisition device being capable of changing the positional relationship between the emission position of the light source and the photographing position of the photographing unit; a photographing control unit that causes the image acquisition device to perform a first photographing mode, in which the subject's eyeball is irradiated with first light while the emission position and the photographing position are arranged in a first positional relationship and the eyeball irradiated with the first light is photographed by the photographing unit to generate first image data, and a second photographing mode, in which the subject's eyeball is irradiated with second light while the emission position and the photographing position are arranged in a second positional relationship different from the first positional relationship and the eyeball irradiated with the second light is photographed by the photographing unit to generate second image data; a calculation unit that performs a luminance calculation on the luminance of corresponding pixels of the first image data and the second image data; and a processing unit that estimates the position of the corneal reflex in the subject's eyeball based on the calculation result.
  • The line-of-sight detection device includes the above-described corneal reflex detection device.
  • The corneal reflex detection method includes: irradiating the eyeball of a subject with first light while the emission position of a light source and the photographing position of a photographing unit are arranged in a first positional relationship, and photographing the eyeball irradiated with the first light with the photographing unit to generate first image data; irradiating the eyeball with second light while the emission position and the photographing position are arranged in a second positional relationship different from the first positional relationship, and photographing the eyeball irradiated with the second light with the photographing unit to generate second image data; performing a luminance calculation on the luminance of corresponding pixels of the first image data and the second image data; and estimating the position of the corneal reflex in the subject's eyeball based on the calculation result.
  • The corneal reflex detection program causes a computer to execute: a process of irradiating the eyeball of a subject with first light while the emission position of a light source and the photographing position of a photographing unit are arranged in a first positional relationship, and photographing the eyeball irradiated with the first light with the photographing unit to generate first image data; a process of irradiating the eyeball with second light while the emission position and the photographing position are arranged in a second positional relationship different from the first positional relationship, and photographing the eyeball irradiated with the second light with the photographing unit to generate second image data; a process of performing a luminance calculation on the luminance of corresponding pixels of the first image data and the second image data; and a process of estimating the position of the corneal reflex in the subject's eyeball based on the calculation result.
  • According to the present disclosure, the position of the corneal reflex can be easily and accurately detected.
  • FIG. 1 is a diagram schematically showing an example of a line-of-sight detection device according to the present embodiment.
  • FIG. 2 is a functional block diagram showing an example of the line-of-sight detection device.
  • FIG. 3 is a diagram schematically showing the principle of line-of-sight detection by the line-of-sight detection device.
  • FIG. 4 is a diagram showing an example of the first image data and the second image data.
  • FIG. 5 is a diagram showing an example of difference image data.
  • FIG. 6 is an enlarged view showing a part of FIG. 5 (the portion of both eyes of the subject).
  • FIG. 7 is a diagram showing some pixels of the difference image data.
  • FIG. 8 is a diagram showing another example of the difference image data.
  • FIG. 9 is a flowchart showing an example of the corneal reflex detection method according to the present embodiment.
  • FIG. 10 is a diagram showing another example of the line-of-sight detection device.
  • FIG. 11 is a diagram showing another example of the line-of-sight detection device.
  • Hereinafter, embodiments of the corneal reflex detection device, the line-of-sight detection device, the corneal reflex detection method, and the corneal reflex detection program according to the present disclosure will be described with reference to the drawings.
  • The present invention is not limited to this embodiment.
  • The components in the following embodiments include those that can be easily replaced by those skilled in the art, and those that are substantially the same.
  • In the following description, a three-dimensional global coordinate system is set to describe positional relationships: the direction parallel to the first axis of a predetermined plane is the X-axis direction, the direction parallel to the second axis of the predetermined plane orthogonal to the first axis is the Y-axis direction, and the direction orthogonal to both the first axis and the second axis is the Z-axis direction.
  • The predetermined plane includes the XY plane.
  • FIG. 1 is a diagram schematically showing an example of the line-of-sight detection device 100 according to the present embodiment.
  • The line-of-sight detection device 100 according to the present embodiment detects the subject's line of sight and outputs the detection result.
  • The line-of-sight detection device 100 detects the line of sight based on, for example, the position of the subject's pupil and the position of the corneal reflex image.
  • The line-of-sight detection device 100 includes a display device 10, an image acquisition device 20, a computer system 30, an output device 40, an input device 50, and an input/output interface device 60.
  • The display device 10, the image acquisition device 20, the computer system 30, the output device 40, and the input device 50 perform data communication via the input/output interface device 60.
  • The display device 10 and the image acquisition device 20 each have a drive circuit (not shown).
  • The display device 10 includes a flat panel display such as a liquid crystal display (LCD) or an organic electroluminescence display (OLED).
  • The display device 10 has a display unit 11, which displays information such as images.
  • The display unit 11 is substantially parallel to the XY plane: the X-axis direction is the left-right direction of the display unit 11, the Y-axis direction is its vertical direction, and the Z-axis direction is the depth direction orthogonal to the display unit 11.
  • The display device 10 may be a head-mounted display device. In the case of a head-mounted display, components such as the image acquisition device 20 are arranged in the head-mounted module.
  • The image acquisition device 20 acquires image data of the subject's left and right eyeballs EB and transmits the acquired image data to the computer system 30.
  • The image acquisition device 20 has a photographing device (photographing unit) 21.
  • The photographing device 21 generates image data by photographing the subject's left and right eyeballs EB.
  • The photographing device 21 has a camera suited to the method of detecting the subject's line of sight. In the present embodiment, which detects the line of sight based on the position of the subject's pupil and the position of the corneal reflex image, the photographing device 21 has an infrared camera with an optical system capable of transmitting near-infrared light with a wavelength of, for example, 850 [nm], and an imaging element capable of receiving that near-infrared light.
  • The photographing device 21 outputs a frame synchronization signal.
  • The period of the frame synchronization signal can be, for example, 20 [msec], but is not limited to this.
  • The photographing device 21 is configured as a stereo camera having a first camera 21A and a second camera 21B.
  • The first camera 21A is arranged at the first photographing position Q1.
  • The second camera 21B is arranged at the second photographing position Q2.
  • The image data acquired by the first camera 21A and the second camera 21B consists, for example, of pixels whose luminance is set by an 8-bit value (0 to 255), arranged two-dimensionally.
  • Pixels with a luminance value of 0 are displayed as black in the image data; pixels with a luminance value of 255 are displayed as white.
  • The first camera 21A and the second camera 21B have, for example, the same resolution. In this case, the image data acquired by the two cameras have the same number of vertical and horizontal pixels.
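  • As a reference point for the structure just described, the following is a minimal sketch of this image-data layout in Python/NumPy, assuming each frame is a two-dimensional array of 8-bit luminance values (0 = black, 255 = white) and that both cameras share one resolution; the shape values are illustrative assumptions, not values from the present disclosure.

        import numpy as np

        HEIGHT, WIDTH = 480, 640  # illustrative resolution, identical for both cameras

        # First and second camera frames: 2-D arrays of 8-bit luminance values.
        im1 = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)      # all pixels black (0)
        im2 = np.full((HEIGHT, WIDTH), 255, dtype=np.uint8)  # all pixels white (255)
        assert im1.shape == im2.shape  # same vertical and horizontal pixel counts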
  • The image acquisition device 20 has a lighting device (light source) 22 that illuminates the subject's eyeball EB.
  • The lighting device 22 includes an LED (light emitting diode) light source and can emit near-infrared light with a wavelength of, for example, 850 [nm].
  • The lighting device 22 emits detection light in synchronization with the frame synchronization signal of the photographing device 21.
  • The lighting device 22 has a first light source 22A and a second light source 22B.
  • The first light source 22A is arranged at the first emission position P1 and emits the first light L1.
  • The second light source 22B is arranged at the second emission position P2 and emits the second light L2.
  • The computer system 30 comprehensively controls the operation of the line-of-sight detection device 100.
  • The computer system 30 includes an arithmetic processing device 30A and a storage device 30B.
  • The arithmetic processing device 30A includes a microprocessor such as a CPU (central processing unit).
  • The storage device 30B includes memory or storage such as a ROM (read only memory) and a RAM (random access memory).
  • The arithmetic processing device 30A performs arithmetic processing according to the computer program 30C stored in the storage device 30B.
  • The output device 40 includes a display device such as a flat panel display, and may also include a printing device.
  • The input device 50 generates input data by being operated, and includes a keyboard or mouse for a computer system.
  • The input device 50 may include a touch sensor provided on the display unit of the output device 40, which is a display device.
  • In the present embodiment, the display device 10 and the computer system 30 are separate devices, but they may be integrated.
  • For example, the line-of-sight detection device 100 may include a tablet-type personal computer equipped with a display device, an image acquisition device, a computer system, an input device, an output device, and the like.
  • FIG. 2 is a functional block diagram showing an example of the line-of-sight detection device 100.
  • The computer system 30 includes a photographing control unit 31, an input/output control unit 32, a calculation unit 33, a processing unit 34, a gazing point detection unit 35, and a storage unit 36.
  • The functions of the computer system 30 are realized by the arithmetic processing device 30A and the storage device 30B (see FIG. 1).
  • Some functions of the computer system 30 may be provided outside the line-of-sight detection device 100.
  • The corneal reflex detection device 70 is composed of the photographing device (photographing unit) 21 and the lighting device (light source) 22 of the image acquisition device 20, the photographing control unit 31, the input/output control unit 32, the calculation unit 33, and the processing unit 34.
  • The corneal reflex detection device 70 may further include at least one of the gazing point detection unit 35 and the storage unit 36.
  • The photographing control unit 31 controls the photographing device 21 and the lighting device 22.
  • The photographing control unit 31 controls the emission timing, emission duration, and the like of the detection light for each of the first light source 22A and the second light source 22B of the lighting device 22.
  • The photographing control unit 31 also controls the photographing timing and the like for each of the first camera 21A and the second camera 21B of the photographing device 21.
  • The photographing control unit 31 can change the positional relationship between the emission position of the lighting device 22 and the photographing position of the photographing device 21.
  • The photographing control unit 31 causes the photographing device 21 and the lighting device 22 to perform the first photographing mode and the second photographing mode described below.
  • In the first photographing mode, the subject's eyeball EB is irradiated with the first light from the first emission position P1, and the eyeball EB irradiated with the first light is photographed from the first photographing position Q1 to generate the first image data.
  • That is, the positional relationship between the emission position of the lighting device 22 and the photographing position of the photographing device 21 is the first positional relationship, in which the emission position is the first emission position P1 and the photographing position is the first photographing position Q1.
  • The photographing control unit 31 controls the photographing device 21 and the lighting device 22 so that the first light source 22A emits the first light L1 and the first camera 21A photographs the subject's eyeball in this state.
  • In the second photographing mode, the subject's eyeball EB is irradiated with the second light from the second emission position P2, and the eyeball EB irradiated with the second light is photographed from the second photographing position Q2 to generate the second image data.
  • That is, the positional relationship between the emission position of the lighting device 22 and the photographing position of the photographing device 21 is the second positional relationship, in which the emission position is the second emission position P2 and the photographing position is the second photographing position Q2.
  • The photographing control unit 31 controls the photographing device 21 and the lighting device 22 so that the second light source 22B emits the second light L2 and the second camera 21B photographs the subject's eyeball in this state.
  • The input/output control unit 32 acquires data from at least one of the image acquisition device 20 and the input device 50.
  • The input/output control unit 32 acquires the image data (first image data and second image data) generated by the image acquisition device 20 and stores them in the storage unit 36.
  • The input/output control unit 32 also outputs data to at least one of the display device 10 and the output device 40.
  • The calculation unit 33 performs a luminance calculation on the luminance of corresponding pixels of the first image data and the second image data.
  • As the luminance calculation, the calculation unit 33 may obtain the difference in luminance of corresponding pixels of the first image data and the second image data. The calculation unit 33 may also obtain the difference and then calculate the absolute value of the obtained luminance difference. Alternatively, the calculation unit 33 may perform a calculation in which the luminance difference is set to "0" when the obtained difference is a negative value.
  • The luminance calculation is not limited to obtaining the luminance difference. For example, the calculation unit 33 may, as the luminance calculation, compare the luminance of corresponding pixels of the first image data and the second image data and replace the luminance value with "0" when both are equal to or less than a predetermined threshold value.
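  • As an illustration, the following Python/NumPy sketch implements the luminance-calculation variants described above, assuming the two image data are same-sized 8-bit grayscale arrays; the function names and the threshold value are illustrative, not part of the present disclosure, and the threshold-zeroing variant is one possible reading of the text.

        import numpy as np

        def luminance_difference(im1, im2):
            # Absolute per-pixel luminance difference; signed arithmetic avoids
            # uint8 wraparound before the absolute value is taken.
            return np.abs(im1.astype(np.int16) - im2.astype(np.int16)).astype(np.uint8)

        def clamped_difference(im1, im2):
            # Variant: negative differences are replaced with 0 instead of |.|.
            return np.clip(im1.astype(np.int16) - im2.astype(np.int16), 0, 255).astype(np.uint8)

        def threshold_zeroed(im1, im2, threshold=30):
            # Variant: pixels at or below the threshold in both images are
            # forced to 0 before the result is used.
            diff = luminance_difference(im1, im2)
            diff[(im1 <= threshold) & (im2 <= threshold)] = 0
            return diff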
  • The processing unit 34 estimates the position of the corneal reflex in the subject's eyeball based on the calculation result.
  • When the calculation unit 33 calculates absolute values of the luminance difference, the processing unit 34 can estimate the position of the corneal reflex based on the calculated absolute values.
  • The processing unit 34 can, for example, generate difference image data in which the calculated absolute value is the luminance of each pixel, and estimate the position of the corneal reflex based on the generated difference image data.
  • When estimating the position of the corneal reflex based on the difference image data, the processing unit 34 first obtains the highest-luminance pixel, that is, the pixel with the highest luminance, from the difference image data.
  • Next, the processing unit 34 extracts high-luminance pixels whose luminance exceeds a predetermined luminance threshold around the obtained highest-luminance pixel.
  • The periphery of the highest-luminance pixel can be, for example, the region included in a square extending ±32 pixels vertically and horizontally around the highest-luminance pixel.
  • The predetermined luminance threshold can be set, for example when luminance is represented by 8 bits, to 30 below the highest luminance; that is, a pixel whose luminance exceeds the luminance of the highest-luminance pixel minus 30 can be regarded as a high-luminance pixel.
  • The processing unit 34 then calculates the area of the extracted high-luminance pixels and determines whether or not that area is smaller than a predetermined area threshold.
  • The predetermined area threshold can be, for example, a continuous region of 20 pixels.
  • The processing unit 34 determines that the highest-luminance pixel is included in the corneal reflex when the area of the high-luminance pixels is smaller than the predetermined area threshold. On the other hand, when the area is equal to or larger than the predetermined area threshold, the processing unit 34 determines that the high-luminance pixels are not included in the corneal reflex.
  • When the highest-luminance pixel is determined to be included in the corneal reflex, the processing unit 34 estimates the position of the highest-luminance pixel as the position of the corneal reflex.
  • The gazing point detection unit 35 detects the position data (line-of-sight data) of the subject's gazing point.
  • The gazing point detection unit 35 detects the subject's line-of-sight vector, defined in the three-dimensional global coordinate system, based on the image data of the subject's left and right eyeballs EB acquired by the image acquisition device 20.
  • The gazing point detection unit 35 detects the position data of the intersection of the detected line-of-sight vector and the display unit 11 of the display device 10 as the position data of the subject's gazing point. That is, in the present embodiment, the gazing point position data is the position data of the intersection of the subject's line-of-sight vector, defined in the three-dimensional global coordinate system, and the display unit 11 of the display device 10.
  • The gazing point is a designated point on the display unit 11 specified by being gazed at by the subject.
  • The gazing point detection unit 35 detects the position data of the subject's gazing point at a predetermined sampling cycle.
  • This sampling cycle can be, for example, the period of the frame synchronization signal output from the photographing device 21 (for example, every 20 [msec]).
  • The gazing point detection unit 35 detects the pupil region and the corneal reflex region based on the image data of the left and right eyeballs EB, and calculates the positions of the pupil center and the corneal reflex center based on the detection result. The gazing point detection unit 35 then calculates the corneal curvature center based on a virtual straight line connecting the intermediate position between the first camera 21A and the second camera 21B (the virtual light source position) and the calculated position of the corneal reflex center. Finally, the gazing point detection unit 35 detects the direction of the virtual straight line connecting the calculated pupil center and corneal curvature center as the line-of-sight direction G (see FIG. 3).
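  • As an illustration of this last step, the following sketch assumes the pupil center and the corneal curvature center have already been obtained as 3-D points in the global coordinate system and that the display unit 11 lies in the plane z = 0; the function and parameter names are illustrative, not from the present disclosure.

        import numpy as np

        def gaze_direction(pupil_center, cornea_curvature_center):
            # Unit vector along the virtual straight line from the corneal
            # curvature center through the pupil center (line-of-sight direction G).
            g = pupil_center - cornea_curvature_center
            return g / np.linalg.norm(g)

        def gaze_point_on_display(pupil_center, cornea_curvature_center, display_z=0.0):
            # Intersection of the gaze ray with the display plane z = display_z,
            # i.e. the gazing point on the display unit 11.
            g = gaze_direction(pupil_center, cornea_curvature_center)
            t = (display_z - pupil_center[2]) / g[2]  # ray parameter to the plane
            return pupil_center + t * g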
  • The storage unit 36 stores the first image data and the second image data acquired by the input/output control unit 32, and also stores the difference image data generated by the processing unit 34.
  • The storage unit 36 further stores a corneal reflex detection program that causes a computer to execute: a process of irradiating the subject's eyeball EB with the first light L1 while the emission position of the lighting device 22 and the photographing position of the photographing device 21 are arranged in the first positional relationship (first emission position P1, first photographing position Q1), and photographing the eyeball EB irradiated with the first light L1 with the photographing device 21 to generate the first image data IM1 including a plurality of pixels; a process of irradiating the eyeball EB with the second light L2 while the emission position and the photographing position are arranged in the second positional relationship (second emission position P2, second photographing position Q2), and photographing the eyeball EB irradiated with the second light L2 with the photographing device 21 to generate the second image data IM2 including a plurality of pixels; a process of performing a luminance calculation on the luminance of corresponding pixels between the pixels constituting the first image data IM1 and the pixels constituting the second image data IM2; and a process of estimating the position of the corneal reflex CR in the subject's eyeball EB based on the calculation result.
  • FIG. 3 is a diagram schematically showing the principle of line-of-sight detection by the line-of-sight detection device 100.
  • Although FIG. 3 shows an example of the subject's right eye ER, the same explanation applies to the left eye EL.
  • The gazing point detection unit 35 detects the pupil region and the corneal reflex region based on image data obtained by photographing the right eye ER, and calculates the positions of the pupil center PUC and the corneal reflex centers CRA and CRB based on the detection result.
  • The gazing point detection unit 35 calculates the corneal curvature center CRC based on the straight lines connecting the position P3 of the virtual light source 21C, set between the first camera 21A and the second camera 21B, and the calculated positions of the corneal reflex centers CRA and CRB.
  • The gazing point detection unit 35 detects the direction of the virtual straight line connecting the calculated pupil center PUC and corneal curvature center CRC as the line-of-sight direction G.
  • The image data obtained by photographing the subject's right eye ER and left eye EL with the photographing device 21 can be large as data.
  • If the gazing point detection unit 35 were to calculate the positions of the corneal reflex centers CRA and CRB from the entire image data, the processing could take time.
  • In the present embodiment, the position of the corneal reflex CR can be estimated by the corneal reflex detection device 70, and the gazing point detection unit 35 can calculate the positions of the corneal reflex centers CRA and CRB using the estimation result.
  • In this way, the positions of the corneal reflex centers CRA and CRB can be easily and accurately detected even from large image data.
  • Next, the procedure for detecting the corneal reflex using the corneal reflex detection device 70 will be described.
  • First, the photographing control unit 31 causes the photographing device 21 and the lighting device 22 to perform the first photographing mode.
  • That is, the photographing control unit 31 controls the photographing device 21 and the lighting device 22 so that the first light source 22A emits the first light L1 and the first camera 21A photographs the subject's eyeball in this state.
  • As a result, the image acquisition device 20 generates the first image data, in which the subject's eyeball irradiated with the first light L1 from the first emission position P1 is photographed from the first photographing position Q1.
  • Next, the photographing control unit 31 causes the photographing device 21 and the lighting device 22 to perform the second photographing mode.
  • That is, the photographing control unit 31 controls the photographing device 21 and the lighting device 22 so that the second light source 22B emits the second light L2 and the second camera 21B photographs the subject's eyeball in this state.
  • As a result, the image acquisition device 20 generates the second image data, in which the subject's eyeball irradiated with the second light L2 from the second emission position P2 is photographed from the second photographing position Q2.
  • FIG. 4 is a diagram showing an example of the first image data and the second image data.
  • In the first image data IM1 and the second image data IM2, the same number of pixels are arranged vertically and horizontally in the drawing.
  • The subject's face appears at substantially the same position, size, and range in both.
  • In each image, the corneal reflex CR appears on the subject's eyeballs EB (right eye ER and left eye EL).
  • In the first image data IM1, the first corneal reflex CR1 formed by the first light L1 appears.
  • The first corneal reflex CR1 is located to the right in the figure with respect to the center of the subject's iris.
  • In the second image data IM2, the second corneal reflex CR2 formed by the second light L2 appears.
  • The second corneal reflex CR2 is located to the left in the figure with respect to the center of the subject's iris.
  • Thus, between the two image data, the positions of the corneal reflexes CR differ from each other, while the other parts are captured as nearly identical images.
  • The input/output control unit 32 acquires the first image data IM1 and the second image data IM2 generated by the image acquisition device 20 via the input/output interface device 60.
  • The calculation unit 33 performs a luminance calculation on the luminance of corresponding pixels of the first image data IM1 and the second image data IM2.
  • The luminance calculation includes obtaining the difference in luminance of corresponding pixels of the first image data IM1 and the second image data IM2, and calculating the absolute value of the obtained difference. In the present embodiment, the luminance calculation is performed for each pixel by, for example, the calculation represented by the following equation (1).
  • IM3(X, Y) = |IM1(X, Y) − IM2(X, Y)| … (1)
  • Here, (X, Y) are the position coordinates of a pixel in the vertical and horizontal directions, IM1(X, Y) and IM2(X, Y) are the luminance values of the pixel (X, Y) in the first and second image data, and IM3(X, Y) is the luminance value of the pixel (X, Y) in the difference image data.
  • The calculation unit 33 performs the above luminance calculation, that is, obtains the luminance difference between corresponding pixels for all pixels of the first image data IM1 and the second image data IM2, and calculates the absolute value of each difference.
  • The calculation unit 33 associates the calculation results with the pixels and stores them in the storage unit 36.
  • In the portions where the first image data IM1 and the second image data IM2 show nearly identical images, the corresponding pixels have the same or similar luminance values, so the absolute value of the luminance difference at such pixels is 0 or close to 0. In contrast, in the first image data IM1 and the second image data IM2, the positions of the first corneal reflex CR1 and the second corneal reflex CR2 with respect to the center of the iris differ, so the absolute value of the luminance difference at the pixels where the first corneal reflex CR1 and the second corneal reflex CR2 appear is large. By performing the luminance calculation in this way, a large gap arises in the absolute value of the luminance difference between the portions where the first corneal reflex CR1 and the second corneal reflex CR2 appear and the other portions.
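  • As a worked example with illustrative values: at a skin pixel with luminance 120 in both IM1 and IM2, equation (1) gives |120 − 120| = 0, while at a pixel where the first corneal reflex appears only in IM1 (say 210 in IM1 and 40 in IM2) it gives |210 − 40| = 170, so the reflex pixels stand out sharply in the difference image data.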
  • The processing unit 34 estimates the position of the corneal reflex CR in the subject's eyeball EB based on the calculation result. First, the processing unit 34 generates the difference image data based on the calculation result.
  • The difference image data is image data in which the absolute value of the luminance difference calculated by the calculation unit 33 for each pixel is the luminance of that pixel.
  • FIG. 5 is a diagram showing an example of the difference image data IM3.
  • FIG. 6 is an enlarged view showing a part of FIG. 5 (the portion of both eyes of the subject).
  • In the difference image data IM3, the pixels corresponding to the first corneal reflex CR1 and the second corneal reflex CR2 have large luminance values (absolute values of the luminance difference) and are displayed close to white.
  • The pixels in the other portions have luminance values (absolute values of the luminance difference) of 0 or close to 0 and are displayed close to black.
  • The processing unit 34 estimates the position of the corneal reflex CR based on the difference image data IM3 generated in this way.
  • FIG. 7 is a diagram showing some pixels K of the difference image data IM3.
  • The processing unit 34 first obtains the highest-luminance pixel K1, that is, the pixel with the highest luminance among the pixels K constituting the difference image data IM3.
  • The processing unit 34 may obtain the highest-luminance pixel K1 by performing image processing on the difference image data IM3, or by selecting the pixel corresponding to the maximum value among the luminances (absolute values of the luminance difference) stored in the storage unit 36.
  • Next, the processing unit 34 extracts the high-luminance pixels K2 whose luminance exceeds the predetermined luminance threshold among the pixels K around the highest-luminance pixel K1.
  • The processing unit 34 may extract the high-luminance pixels K2 by performing image processing on the difference image data IM3, or based on the luminance values (absolute values of the luminance difference) stored in the storage unit 36.
  • The processing unit 34 sets the region composed of the highest-luminance pixel K1 and the high-luminance pixels K2 as the high-luminance region AR.
  • FIG. 8 is a diagram showing another example of the difference image data.
  • FIG. 8 shows the difference image data IM4 for the subject wearing the glasses.
  • When the subject wears glasses, the processing unit 34 may obtain a pixel included in the reflected light L3 of the glasses, rather than in the corneal reflex CR, as the highest-luminance pixel K1.
  • In the present embodiment, processing is therefore performed to prevent a highest-luminance pixel K1 included in a region other than the corneal reflex CR, such as the reflected light L3 of the glasses, from being mistakenly recognized as part of the corneal reflex CR. That is, the processing unit 34 calculates the area of the high-luminance region AR and determines, based on the calculation result, whether or not the highest-luminance pixel K1 is included in the corneal reflex CR. Reflected light L3 such as that shown in FIG. 8 tends to form over a wider region than the corneal reflex CR. The processing unit 34 therefore obtains the area of the high-luminance region AR in the difference image data and determines whether or not that area is smaller than the predetermined area threshold.
  • When the area of the high-luminance region AR is equal to or larger than the area threshold, the processing unit 34 replaces the luminance of each pixel constituting the high-luminance region AR (including the highest-luminance pixel K1) with 0.
  • After that, the processing unit 34 obtains a new highest-luminance pixel K1, extracts the high-luminance region AR based on it, and again determines whether or not the area of the high-luminance region AR is smaller than the predetermined area threshold.
  • When the area of the high-luminance region AR is determined to be smaller than the area threshold, the processing unit 34 estimates the position of the highest-luminance pixel K1 as the position of the corneal reflex CR.
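  • The loop just described can be sketched as follows in Python/NumPy, assuming an 8-bit difference image. The window size (±32 pixels), luminance margin (30), and area threshold (20 pixels) follow the example values given above; the function name is illustrative, and counting every bright pixel in the window is a simplification of the continuous-region area described in the text.

        import numpy as np

        def estimate_corneal_reflex(diff, window=32, margin=30, area_threshold=20):
            # Returns (row, col) of the estimated corneal reflex, or None.
            work = diff.astype(np.int16).copy()
            h, w = work.shape
            while work.max() > 0:
                # Highest-luminance pixel K1.
                r, c = np.unravel_index(np.argmax(work), work.shape)
                peak = work[r, c]
                # High-luminance pixels K2: inside a +/-window square around K1,
                # with luminance exceeding (peak - margin).
                r0, r1 = max(0, r - window), min(h, r + window + 1)
                c0, c1 = max(0, c - window), min(w, c + window + 1)
                region = work[r0:r1, c0:c1] > peak - margin
                if region.sum() < area_threshold:
                    return (r, c)  # small bright spot: judged to be the corneal reflex
                # Large bright area (e.g. a glasses reflection): zero it and retry.
                work[r0:r1, c0:c1][region] = 0
            return None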
  • FIG. 9 is a flowchart showing an example of the corneal reflex detection method according to the present embodiment.
  • First, the photographing control unit 31 causes the image acquisition device 20 to perform the first photographing mode (step S101).
  • In step S101, the first light L1 is emitted from the first light source 22A and irradiates the subject's eyeball EB.
  • The subject's eyeball EB, irradiated with the first light L1, is photographed by the first camera 21A, and the first image data IM1 is generated.
  • The input/output control unit 32 acquires the generated first image data IM1 (step S102) and stores it in the storage unit 36.
  • Next, the photographing control unit 31 causes the image acquisition device 20 to perform the second photographing mode (step S103).
  • In step S103, the second light L2 is emitted from the second light source 22B and irradiates the subject's eyeball EB. The subject's eyeball EB, irradiated with the second light L2, is photographed by the second camera 21B, and the second image data IM2 is generated.
  • The input/output control unit 32 acquires the generated second image data IM2 (step S104) and stores it in the storage unit 36.
  • Next, the calculation unit 33 performs the luminance calculation on the luminance of corresponding pixels of the first image data IM1 and the second image data IM2 (step S105).
  • That is, the calculation unit 33 obtains the luminance difference of corresponding pixels of the first image data IM1 and the second image data IM2, calculates the absolute value of each difference, associates the calculation results with the pixels, and stores them in the storage unit 36.
  • In step S106, image data in which the absolute value of the luminance difference calculated for each pixel is the luminance value of that pixel is generated as the difference image data IM3.
  • The processing unit 34 obtains the highest-luminance pixel K1, the pixel with the highest luminance, from the difference image data IM3 (step S107).
  • The processing unit 34 then extracts the high-luminance pixels whose luminance exceeds the predetermined luminance threshold around the highest-luminance pixel K1 and calculates the area of the high-luminance region AR composed of the extracted pixels (step S108).
  • The processing unit 34 determines whether or not the calculated area is smaller than the predetermined area threshold (step S109). When the area is determined to be equal to or larger than the area threshold (No in step S109), the processing unit 34 replaces the luminance of each pixel constituting the high-luminance region AR (including the highest-luminance pixel K1) with 0 (step S110), and then performs the processing from step S107 again.
  • When the area is determined to be smaller than the area threshold (Yes in step S109), the processing unit 34 determines that the highest-luminance pixel K1 is included in the corneal reflex CR (step S111). In this case, the processing unit 34 estimates the position of the highest-luminance pixel K1 as the position of the corneal reflex CR and ends the processing.
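  • Putting the flow of FIG. 9 together, a minimal end-to-end sketch might look as follows, where capture_first_mode and capture_second_mode are hypothetical stand-ins for the two photographing modes and estimate_corneal_reflex is the sketch shown earlier:

        import numpy as np

        def detect_corneal_reflex(capture_first_mode, capture_second_mode):
            im1 = capture_first_mode()   # steps S101-S102: first image data IM1
            im2 = capture_second_mode()  # steps S103-S104: second image data IM2
            # Steps S105-S106: per-pixel |luminance difference| -> difference image IM3.
            im3 = np.abs(im1.astype(np.int16) - im2.astype(np.int16)).astype(np.uint8)
            # Steps S107-S111: estimate the corneal reflex position from IM3.
            return estimate_corneal_reflex(im3)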
  • As described above, the corneal reflex detection device 70 according to the present embodiment includes: the image acquisition device 20, which has the lighting device 22 capable of irradiating the subject's eyeball EB and the photographing device 21 that photographs the subject's eyeball EB to generate image data including a plurality of pixels, and which can change the positional relationship between the emission position of the lighting device 22 and the photographing position of the photographing device 21; the photographing control unit 31, which causes the lighting device 22 and the photographing device 21 to perform the first photographing mode, in which the subject's eyeball EB is irradiated with the first light L1 while the emission position and the photographing position are arranged in the first positional relationship (first emission position P1, first photographing position Q1) and the eyeball EB irradiated with the first light L1 is photographed by the photographing device 21 to acquire the first image data IM1, and the second photographing mode, in which the eyeball EB is irradiated with the second light L2 while the emission position and the photographing position are arranged in the second positional relationship (second emission position P2, second photographing position Q2) different from the first positional relationship and the eyeball EB irradiated with the second light L2 is photographed by the photographing device 21 to acquire the second image data IM2; the calculation unit 33, which performs a luminance calculation on the luminance of corresponding pixels between the pixels constituting the first image data IM1 and the pixels constituting the second image data IM2; and the processing unit 34, which estimates the position of the corneal reflex CR in the subject's eyeball EB based on the calculation result.
  • The corneal reflex detection method according to the present embodiment includes: irradiating the subject's eyeball EB with the first light L1 while the emission position of the lighting device 22 and the photographing position of the photographing device 21 are arranged in the first positional relationship (first emission position P1, first photographing position Q1), and photographing the eyeball EB irradiated with the first light L1 with the photographing device 21 to generate the first image data IM1 including a plurality of pixels; irradiating the eyeball EB with the second light L2 while the emission position and the photographing position are arranged in the second positional relationship (second emission position P2, second photographing position Q2), and photographing the eyeball EB irradiated with the second light L2 with the photographing device 21 to generate the second image data IM2 including a plurality of pixels; performing a luminance calculation on the luminance of corresponding pixels of the two image data; and estimating the position of the corneal reflex CR in the subject's eyeball EB based on the calculation result.
  • The corneal reflex detection program according to the present embodiment causes a computer to execute: a process of irradiating the subject's eyeball EB with the first light L1 while the emission position of the lighting device 22 and the photographing position of the photographing device 21 are arranged in the first positional relationship (first emission position P1, first photographing position Q1), and photographing the eyeball EB irradiated with the first light L1 with the photographing device 21 to generate the first image data IM1 including a plurality of pixels; a process of irradiating the eyeball EB with the second light L2 while the emission position and the photographing position are arranged in the second positional relationship (second emission position P2, second photographing position Q2), and photographing the eyeball EB irradiated with the second light L2 with the photographing device 21 to generate the second image data IM2 including a plurality of pixels; a process of performing a luminance calculation on the luminance of corresponding pixels between the pixels constituting the first image data IM1 and the pixels constituting the second image data IM2; and a process of estimating the position of the corneal reflex CR in the subject's eyeball EB based on the calculation result.
  • In these configurations, the first image data and the second image data are generated with the emission position of the lighting device 22 and the photographing position of the photographing device 21 set in the first positional relationship and the second positional relationship, respectively.
  • As a result, the corneal reflex CR is formed at different positions with respect to the center of the subject's eyeball EB in the first image data and the second image data. Therefore, the position of the subject's corneal reflex CR can be easily and accurately detected by performing the luminance calculation on corresponding pixels of the first image data and the second image data.
  • In the corneal reflex detection device 70, at least one of the lighting device 22 and the photographing device 21 is provided as two units. As a result, one of the two units can serve as a component of the first positional relationship and the other as a component of the second positional relationship, which makes it easy to set the first positional relationship and the second positional relationship.
  • In the present embodiment, the luminance calculation includes obtaining the difference in luminance of corresponding pixels of the first image data IM1 and the second image data IM2. With this configuration, pixels corresponding to the corneal reflex CR can easily be distinguished from the other pixels by obtaining the luminance difference.
  • In the present embodiment, the calculation unit 33 calculates the absolute value of the obtained difference, and the processing unit 34 estimates the position of the corneal reflex CR based on the calculated absolute value. With this configuration, pixels corresponding to the corneal reflex CR can be distinguished from the other pixels with higher accuracy.
  • In the present embodiment, the processing unit 34 generates the difference image data IM3, in which the calculated absolute value is the luminance of each pixel, and estimates the position of the corneal reflex CR based on the generated difference image data IM3. With this configuration, the position of the subject's corneal reflex CR can be detected easily and with high accuracy using the difference image data IM3.
  • In the present embodiment, the processing unit 34 obtains the highest-luminance pixel K1, the pixel with the highest luminance, from the difference image data IM3, extracts the pixels around the highest-luminance pixel K1 whose luminance exceeds the predetermined luminance threshold, determines based on the extraction result whether or not the highest-luminance pixel K1 is included in the corneal reflex CR, and, when it determines that the highest-luminance pixel K1 is included in the corneal reflex CR, estimates the position of the highest-luminance pixel K1 as the position of the corneal reflex CR. With this configuration, the position of the subject's corneal reflex CR can be estimated with high accuracy.
  • The line-of-sight detection device 100 according to the present embodiment includes the above-described corneal reflex detection device 70.
  • The image data obtained by photographing the subject's right eye ER and left eye EL with the photographing device 21 can be large as data, and if the gazing point detection unit 35 were to calculate the positions of the corneal reflex centers CRA and CRB from the entire image data, the processing could take time.
  • In the present embodiment, the position of the corneal reflex CR is estimated with high accuracy by the corneal reflex detection device 70, and the gazing point detection unit 35 calculates the positions of the corneal reflex centers CRA and CRB using the estimation result. This makes it possible to provide a line-of-sight detection device 100 capable of easily and accurately detecting the positions of the corneal reflex centers CRA and CRB.
  • In the above embodiment, a configuration using a total of two light sources (the first light source 22A and the second light source 22B), one at each of the first emission position P1 and the second emission position P2, has been described as an example, but the configuration is not limited to this. A single light source may be used as long as it can be arranged at the first emission position P1 and the second emission position P2. In this case, possible configurations include one in which the single light source is swapped between the first emission position P1 and the second emission position P2, and one in which the single light source is moved between the first emission position P1 and the second emission position P2.
  • Likewise, a configuration using a total of two cameras (the first camera 21A and the second camera 21B), one at each of the first photographing position Q1 and the second photographing position Q2, has been described as an example, but the configuration is not limited to this. A single camera may be used as long as it can be arranged at the first photographing position Q1 and the second photographing position Q2. In this case, possible configurations include one in which the single camera is swapped between the first photographing position Q1 and the second photographing position Q2, and one in which the single camera is moved between the first photographing position Q1 and the second photographing position Q2.
  • In the above embodiment, the configuration in which the corneal reflex detection device 70 is included in the line-of-sight detection device 100 has been described as an example, but the present invention is not limited to this.
  • The corneal reflex detection device 70 may, for example, be provided as an independent device.
  • The corneal reflex detection device 70 may also be provided in a device other than the line-of-sight detection device 100, for example, a terminal device such as a smartphone, an image analysis device that analyzes images from a security camera or the like, or a device with a face recognition function that recognizes a target person's face.
  • FIG. 10 is a diagram showing another example of the line-of-sight detection device.
  • The line-of-sight detection device 100A shown in FIG. 10 has the same configuration as the line-of-sight detection device 100 described in the above embodiment in that the photographing device 21 of the image acquisition device 20 has the first camera 21A and the second camera 21B.
  • The line-of-sight detection device 100A differs from the above-described line-of-sight detection device 100 in that the lighting device 22 has only one light source 22D, arranged at the emission position P4.
  • In this configuration, the first photographing mode can be performed using the light source 22D and the first camera 21A, which are in the positional relationship between the emission position P4 and the photographing position Q1 (the first positional relationship).
  • Likewise, the second photographing mode can be performed using the light source 22D and the second camera 21B, which are in the positional relationship between the emission position P4 and the photographing position Q2 (the second positional relationship). Therefore, the present disclosure can be applied even to a configuration with one light source and two cameras.
  • FIG. 11 is a diagram showing another example of the line-of-sight detection device.
  • The line-of-sight detection device 200 shown in FIG. 11 is a terminal device such as a smartphone.
  • The line-of-sight detection device 200 includes a main body 210 and an image acquisition device 220.
  • The main body 210 has, for example, a display device that displays images, an output device capable of outputting sound and vibration, an input device such as a touch panel and buttons for inputting user operations, a communication device for communicating information with the outside, and a control device that controls each of these devices (none of which are shown).
  • The image acquisition device 220 includes a photographing device (photographing unit) 221 and a lighting device (light source) 222.
  • The lighting device 222 has a first light source 222A and a second light source 222B.
  • The first light source 222A and the second light source 222B are arranged on either side of the photographing device 221 in the left-right direction.
  • The distance between the first light source 222A and the photographing device 221 is equal to the distance between the second light source 222B and the photographing device 221, and the distance between the first light source 222A and the second light source 222B is preferably large.
  • In this configuration, the first photographing mode can be performed using the first light source 222A and the photographing device 221, which are in the positional relationship between the emission position P5 and the photographing position Q3 (the first positional relationship).
  • Likewise, the second photographing mode can be performed using the second light source 222B and the photographing device 221, which are in the positional relationship between the emission position P6 and the photographing position Q3 (the second positional relationship).
  • Therefore, the present disclosure can be applied even to a configuration with two light sources and one camera.
  • The corneal reflex detection device, line-of-sight detection device, corneal reflex detection method, and corneal reflex detection program of the present disclosure can be used, for example, in a processing device such as a computer or in a terminal device such as a smartphone.
  • AR ... high-luminance region, CR ... corneal reflex, CRA ... corneal reflex center, CRC ... corneal curvature center, CR1 ... first corneal reflex, CR2 ... second corneal reflex, EB ... eyeball, EL ... left eye, ER ... right eye, G ... line-of-sight direction, IM1 ... first image data, IM2 ... second image data, IM3, IM4 ... difference image data, K ... pixel, K1 ... highest-luminance pixel, K2 ... high-luminance pixel, L1 ... first light, L2 ... second light, L3 ... reflected light, PUC ... pupil center, P1 ... first emission position, P2 ...

Abstract

This corneal reflex detection device: exposes a subject's eyeball to first light in a state in which an emission position of a light source and a photographing position of a photographing unit are arranged in a first positional relationship; generates first image data by photographing, with the photographing unit, the subject's eyeball exposed to the first light; exposes the subject's eyeball to second light in a state in which the emission position of the light source and the photographing position of the photographing unit are arranged in a second positional relationship; generates second image data by photographing, with the photographing unit, the subject's eyeball exposed to the second light; performs a luminance calculation regarding the luminance between corresponding pixels among the pixels constituting the first image data and the pixels constituting the second image data; and estimates, based on the calculation result, the position of the corneal reflex in the subject's eyeball.
PCT/JP2020/041241 2019-12-23 2020-11-04 Corneal reflex detection device, line-of-sight detection device, corneal reflex detection method, and corneal reflex detection program WO2021131335A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019231141A JP2021099664A (ja) 2019-12-23 2019-12-23 Corneal reflex detection device, line-of-sight detection device, corneal reflex detection method, and corneal reflex detection program
JP2019-231141 2019-12-23

Publications (1)

Publication Number Publication Date
WO2021131335A1 (fr)

Family

ID=76542011

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/041241 WO2021131335A1 (fr) 2019-12-23 2020-11-04 Corneal reflex detection device, line-of-sight detection device, corneal reflex detection method, and corneal reflex detection program

Country Status (2)

Country Link
JP (1) JP2021099664A (fr)
WO (1) WO2021131335A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06323832A * 1993-05-13 1994-11-25 Nissan Motor Co Ltd Vehicle interface
JPH09251539A * 1996-03-18 1997-09-22 Nissan Motor Co Ltd Line-of-sight measurement device
WO2017014137A1 * 2015-07-17 2017-01-26 ソニー株式会社 Eyeball observation device, eyewear terminal, line-of-sight detection method, and program

Also Published As

Publication number Publication date
JP2021099664A (ja) 2021-07-01

Similar Documents

Publication Publication Date Title
US10896324B2 (en) Line-of-sight detection device and method for detecting line of sight
WO2018030515A1 (fr) Line-of-sight detection device
CN108140244B (zh) Line-of-sight detection device and line-of-sight detection method
EP3011894B1 (fr) Apparatus and method for gaze detection
JP2019025195A (ja) Line-of-sight detection device and line-of-sight detection method
WO2018164104A1 (fr) Eye image processing device
WO2021059745A1 (fr) Gaze data correction device, evaluation device, gaze data correction method, evaluation method, gaze data correction program, and evaluation program
WO2021131335A1 (fr) Corneal reflex detection device, line-of-sight detection device, corneal reflex detection method, and corneal reflex detection program
JP4451195B2 (ja) Line-of-sight detection device
US11937928B2 (en) Evaluation apparatus, evaluation method, and evaluation program
JP2019146915A (ja) Evaluation device, evaluation method, and evaluation program
US10996747B2 (en) Line-of-sight detection device, line-of-sight detection method, and medium
JP7063045B2 (ja) Line-of-sight detection device, line-of-sight detection method, and line-of-sight detection program
JP6370168B2 (ja) Illumination imaging device and line-of-sight detection device equipped with the same
WO2020026574A1 (fr) Line-of-sight detection device, line-of-sight detection method, and line-of-sight detection program
JP2018163411A (ja) Face orientation estimation device and face orientation estimation method
WO2020183792A1 (fr) Display device, display method, and display program
JP2020024750A (ja) Line-of-sight detection device, line-of-sight detection method, and computer program
WO2023008023A1 (fr) Line-of-sight detection device, line-of-sight detection method, and line-of-sight detection program
WO2021246012A1 (fr) Line-of-sight detection device, line-of-sight detection method, and line-of-sight detection program
WO2022038814A1 (fr) Corneal curvature radius calculation device, line-of-sight detection device, corneal curvature radius calculation method, and corneal curvature radius calculation program
WO2021010122A1 (fr) Evaluation device, evaluation method, and evaluation program
JP2019024776A (ja) Evaluation device, evaluation method, and evaluation program
WO2022038815A1 (fr) Distance calculation device, distance calculation method, and distance calculation program
WO2021059746A1 (fr) Gaze data processing device, evaluation device, gaze data processing method, evaluation method, gaze data processing program, and evaluation program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20906779

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20906779

Country of ref document: EP

Kind code of ref document: A1