WO2021171396A1 - Pupil position detection device, visual line detection apparatus, and pupil position detection method - Google Patents

Pupil position detection device, visual line detection apparatus, and pupil position detection method

Info

Publication number
WO2021171396A1
Authority
WO
WIPO (PCT)
Prior art keywords
pupil position
pupil
reliability
unit
image
Prior art date
Application number
PCT/JP2020/007602
Other languages
French (fr)
Japanese (ja)
Inventor
遼平 村地
亮介 虎間
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2020/007602
Publication of WO2021171396A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement

Definitions

  • the present disclosure relates to a pupil position detection device, a line-of-sight detection device, and a pupil position detection method.
  • the conventional pupil position detection device executes template matching to detect pupil position candidates from an image of the eye region, calculates the likelihood of each detected pupil position candidate, and determines the pupil position candidate having the maximum likelihood to be the pupil position (see, for example, Patent Document 1).
  • in the conventional device, however, the calculated likelihood may not be an appropriate value depending on the imaging environment, so there is a problem that the reliability of the pupil position cannot be evaluated correctly.
  • the present disclosure has been made to solve the above problems, and an object of the present disclosure is to evaluate the reliability of the pupil position without being affected by the imaging environment.
  • the pupil position detection device according to the present disclosure includes an image acquisition unit that acquires an image in which a subject is captured, an eye region extraction unit that extracts an eye region from the image acquired by the image acquisition unit, a pupil position detection unit that detects a plurality of pupil positions from the eye region extracted by the eye region extraction unit, and a second reliability calculation unit that calculates a degree of spread of the plurality of pupil positions based on the positional relationship among the plurality of pupil positions detected by the pupil position detection unit and calculates, based on the degree of spread, a second reliability that is an index of the certainty of the plurality of pupil positions.
  • according to the present disclosure, since the degree of spread of the plurality of pupil positions is not affected by the imaging environment, the reliability of the pupil positions can be evaluated without being affected by the imaging environment.
  • FIG. 2A and FIG. 2B are diagrams showing a hardware configuration example of the line-of-sight detection device according to the first embodiment.
  • FIG. 3 is a diagram showing an example of the eye region image extracted by the eye region extraction unit.
  • FIG. 4 is an explanatory diagram showing an example of a filter.
  • FIG. 5 is an explanatory diagram showing an example of a sweep by the filter.
  • FIG. 6 is an explanatory diagram showing an example of a vector corresponding to the first luminance gradient value, an example of a vector corresponding to the second luminance gradient value, and an example of the luminance gradient vector.
  • FIG. 11A is a diagram showing an example of three pupil positions having a small degree of spread, and FIG. 11B is an explanatory diagram showing an example of a method of calculating the second reliability in the example of FIG. 11A.
  • FIG. 12A is a diagram showing an example of three pupil positions having a large degree of spread, and FIG. 12B is an explanatory diagram showing an example of a method of calculating the second reliability in the example of FIG. 12A.
  • FIG. 13 is an explanatory diagram showing an example of a method of calculating the line-of-sight angle with respect to the yaw direction.
  • FIG. 14 is an explanatory diagram showing an example of a method of calculating the line-of-sight angle with respect to the pitch direction.
  • FIG. 15 is a flowchart showing an operation example of the line-of-sight detection device according to the first embodiment.
  • FIG. 16 is a block diagram showing a configuration example of the line-of-sight detection device according to the second embodiment.
  • FIG. 17 is an explanatory diagram showing an example of the pupil position in the first reliability calculation region and the luminance value distribution around it.
  • FIG. 1 is a block diagram showing a configuration example of the line-of-sight detection device 10 according to the first embodiment.
  • the line-of-sight detection device 10 includes a pupil position detection device 1 and a line-of-sight angle calculation unit 11.
  • the pupil position detection device 1 includes an image acquisition unit 2, an eye region extraction unit 3, a pupil position detection unit 4, and a second reliability calculation unit 6. Further, the image pickup device 12 is connected to the pupil position detection device 1.
  • FIGS. 2A and 2B are diagrams showing a hardware configuration example of the line-of-sight detection device 10 according to the first embodiment.
  • the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11 in the line-of-sight detection device 10 are realized by a processing circuit. That is, the line-of-sight detection device 10 includes a processing circuit for realizing the above functions.
  • the processing circuit may be a processing circuit 100 as dedicated hardware, or may be a processor 101 that executes a program stored in the memory 102.
  • when the processing circuit is dedicated hardware, the processing circuit 100 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11 may be realized by a plurality of processing circuits 100, or may be collectively realized by one processing circuit 100.
  • when the processing circuit is the processor 101, the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11 are realized by software, firmware, or a combination of software and firmware.
  • the software or firmware is described as a program and stored in the memory 102.
  • the processor 101 realizes the functions of each unit by reading and executing the program stored in the memory 102. That is, the line-of-sight detection device 10 includes the memory 102 for storing a program that, when executed by the processor 101, results in the execution of the steps shown in the flowchart of FIG. 15 described later.
  • it can also be said that this program causes a computer to execute the procedures or methods of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11.
  • the processor 101 is a CPU (Central Processing Unit), a processing device, an arithmetic unit, a microprocessor, or the like.
  • the memory 102 may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), or a flash memory; a magnetic disk such as a hard disk or a flexible disk; or an optical disc such as a CD (Compact Disc) or a DVD (Digital Versatile Disc).
  • the processing circuit in the line-of-sight detection device 10 can realize the above-mentioned functions by hardware, software, firmware, or a combination thereof.
  • the imaging device 12 captures, in chronological order, images of the area around the face of the subject whose pupil position and line of sight are to be detected.
  • the image pickup device 12 outputs the captured image I1 to the image acquisition unit 2.
  • the image acquisition unit 2 acquires the image I1 of the subject imaged by the image pickup device 12 from the image pickup device 12 and outputs it to the eye region extraction unit 3.
  • the eye region extraction unit 3 extracts the subject's eye region from the image I1 acquired by the image acquisition unit 2, and outputs the extracted image of the eye region (hereinafter referred to as “eye region image I2”) to the pupil position detection unit 4.
  • FIG. 3 is a diagram showing an example of the eye region image I2 extracted by the eye region extraction unit 3.
  • the eye area extraction unit 3 detects the face area of the subject from the image I1 acquired by the image acquisition unit 2, detects from the detected face area the outer eye corner FP1, the inner eye corner FP2, the upper eyelid FP3, and the lower eyelid FP4, which are the feature points of the right eye, and extracts the right-eye region image I2 from the image I1 based on the positions of the detected eye feature points.
  • similarly, the eye region extraction unit 3 detects the outer eye corner, the inner eye corner, the upper eyelid, and the lower eyelid, which are the feature points of the left eye, from the face region, and extracts the left-eye region image from the image I1 based on the positions of the detected eye feature points.
  • the eye region image I2 is composed of a plurality of units (hereinafter referred to as "image units") U arranged in two directions (that is, the X direction and the Y direction) orthogonal to each other.
  • each image unit U may be composed of one pixel, or may be composed of a plurality of pixels adjacent to each other.
  • the eye region extraction unit 3 may extract either the right eye region or the left eye region from the image I1. In this case, the pupil position detection unit 4 detects the pupil position of that one eye, and the second reliability calculation unit 6 calculates the second reliability of the pupil position of that one eye detected by the pupil position detection unit 4.
  • alternatively, the eye region extraction unit 3 may extract both the right eye region and the left eye region. In this case, the pupil position detection unit 4 detects the pupil position of the right eye and the pupil position of the left eye, and the second reliability calculation unit 6 calculates the second reliability of the pupil position of the right eye and the second reliability of the pupil position of the left eye.
  • the pupil position detection unit 4 detects, from the eye region image I2 extracted by the eye region extraction unit 3, one or more positions estimated to be within the pupil region (hereinafter referred to as “pupil positions”).
  • the pupil position detection unit 4 outputs information including the position of the eye feature point and one or more pupil positions to the second reliability calculation unit 6.
  • the pupil position detection unit 4 may detect the pupil position from the eye region image I2 by a well-known method, and the method is not limited.
  • the template matching as described above may be used, or the method based on the luminance gradient vector as described below may be used.
  • the likelihood in template matching corresponds to the evaluation value E in the method based on the luminance gradient vector described later.
  • FIG. 4 is an explanatory diagram showing an example of the filter F.
  • FIG. 5 is an explanatory diagram showing an example of sweeping by the filter F.
  • the filter F shown in FIG. 4 is used for calculating the luminance gradient vector V2.
  • the filter F is applied so as to sweep the eye region image I2 as shown in FIG.
  • for each image unit (hereinafter sometimes referred to as “attention image unit”) U_I in the eye area image I2, the filter F calculates the difference value (hereinafter referred to as “first luminance gradient value”) ΔB_X between the luminance value B_L in the image unit U_L arranged to the left of the attention image unit U_I and the luminance value B_R in the image unit U_R arranged to the right of the attention image unit U_I.
  • in addition, the filter F calculates the difference value (hereinafter referred to as “second luminance gradient value”) ΔB_Y between the luminance value B_U in the image unit U_U arranged above the attention image unit U_I and the luminance value B_D in the image unit U_D arranged below the attention image unit U_I.
  • FIG. 6 is an explanatory diagram showing an example of the vector V2_X corresponding to the first luminance gradient value ΔB_X, an example of the vector V2_Y corresponding to the second luminance gradient value ΔB_Y, and an example of the luminance gradient vector V2.
  • the luminance gradient vector V2 is represented by the sum of the vector V2_X corresponding to the first luminance gradient value ΔB_X and the vector V2_Y corresponding to the second luminance gradient value ΔB_Y.
  • the pupil position detection unit 4 calculates the first luminance gradient value ΔB_X and the second luminance gradient value ΔB_Y, and uses them to calculate the angle (hereinafter referred to as “luminance gradient angle”) θ corresponding to the direction of the luminance gradient vector V2.
  • FIG. 7 shows an example of the luminance value B_I in the attention image unit U_I and an example of the luminance values B_D, B_L, B_R, B_U in the individual image units U_D, U_L, U_R, U_U arranged around the attention image unit U_I.
  • in the example of FIG. 7, the luminance value B_L is 44, the luminance value B_R is 16, the luminance value B_U is 47, and the luminance value B_D is 18. When the filter F is applied to the attention image unit U_I, the first luminance gradient value ΔB_X is calculated as -28, the second luminance gradient value ΔB_Y is calculated as -29, and the luminance gradient angle θ is calculated as 46°.
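  • as an illustration of the filter F computation just described, the following is a minimal sketch in Python; the sign convention (right minus left, bottom minus top) and the folding of the angle into the range 0° to 180° are assumptions chosen so that the FIG. 7 example reproduces -28, -29, and about 46°, since the text does not spell out these conventions.

```python
import math

def luminance_gradient(B_L, B_R, B_U, B_D):
    """Sketch of the filter F: horizontal and vertical luminance differences
    and the resulting luminance gradient angle in degrees."""
    dB_X = B_R - B_L  # first luminance gradient value (sign convention assumed)
    dB_Y = B_D - B_U  # second luminance gradient value (sign convention assumed)
    # gradient angle, folded into [0, 180) so that opposite directions share one angle
    theta = math.degrees(math.atan2(dB_Y, dB_X)) % 180.0
    return dB_X, dB_Y, theta

# FIG. 7 example: B_L = 44, B_R = 16, B_U = 47, B_D = 18
print(luminance_gradient(44, 16, 47, 18))  # -> (-28, -29, ~46.0)
```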
  • FIG. 8 is an explanatory diagram showing an example of the luminance gradient vector V2 corresponding to the attention image unit U_I.
  • the pupil position detection unit 4 uses the luminance gradient vector V2 to calculate the evaluation value E corresponding to each image unit U in the eye region image I2.
  • the evaluation value E is based on the number n of the luminance gradient vectors V2 toward the individual image unit U (that is, the attention image unit U_I). A method of calculating the evaluation value E will be described with reference to FIGS. 9 and 10.
  • FIG. 9 is an explanatory diagram showing an example of the evaluation area EA.
  • the pupil position detection unit 4 sets a region for evaluation (hereinafter referred to as “evaluation region”) EA including the attention image unit U_I.
  • the evaluation area EA includes the attention image unit U_I and includes N image units (hereinafter, may be referred to as “evaluation image unit”) U_E different from the attention image unit U_I.
  • N is any integer greater than or equal to 2.
  • FIG. 9 shows an example of the evaluation area EA.
  • the shape of the evaluation region EA is square, and the image unit U_I of interest is arranged at the center of the evaluation region EA.
  • the evaluation area EA has a size smaller than a predetermined size (hereinafter referred to as "reference iris size").
  • the reference iris size corresponds to the size of a standard human iris.
  • the pupil position detection unit 4 calculates the angle θ' corresponding to the inclination of the straight line connecting the attention image unit U_I and each individual evaluation image unit U_E.
  • the pupil position detection unit 4 calculates, for each evaluation image unit U_E, the difference value Δθ between the corresponding luminance gradient angle θ and the corresponding angle θ'.
  • when the calculated difference value Δθ is less than a threshold value Δθth, the pupil position detection unit 4 determines that the corresponding luminance gradient vector V2 has a direction toward the attention image unit U_I.
  • when the calculated difference value Δθ is equal to or greater than the threshold value Δθth, the pupil position detection unit 4 determines that the corresponding luminance gradient vector V2 does not have a direction toward the attention image unit U_I.
  • the pupil position detection unit 4 calculates the number n of the luminance gradient vectors V2 toward the attention image unit U_I based on the result of such determination.
  • the number n is calculated as a value of 0 or more and N or less.
  • the pupil position detection unit 4 calculates an evaluation value E according to the calculated number n. That is, the larger the number n, the higher the evaluation value E is calculated. In other words, the smaller the number n, the lower the evaluation value E is calculated.
  • FIG. 10 is an explanatory diagram showing an example of the luminance gradient vector V2 corresponding to each evaluation image unit U_E.
  • the solid line arrow indicates the luminance gradient vector V2 having a direction toward the attention image unit U_I.
  • the arrow by the broken line indicates the luminance gradient vector V2 having no direction toward the attention image unit U_I.
  • in the example of FIG. 10, the pupil position detection unit 4 calculates the number n as 19. Further, the pupil position detection unit 4 calculates the evaluation value E according to the calculated number n; for example, the evaluation value E is calculated based on n / N.
  • the pupil position detection unit 4 detects the pupil position PP in the eye region image I2 based on the calculated evaluation value E. More specifically, the pupil position detection unit 4 extracts an evaluation value E that is equal to or higher than a predetermined threshold value from a plurality of evaluation values E corresponding to the plurality of image units U in the eye region image I2. , The position corresponding to the extracted evaluation value E is detected as the pupil position PP.
  • the pupil position detection unit 4 may detect a plurality of pupil position PPs when there are a plurality of evaluation values E that are equal to or higher than the threshold value. Further, the pupil position detection unit 4 detects coordinate values C_X and C_Y indicating the position of the image unit U corresponding to the pupil position PP in the eye region image I2 for each of the detected one or more pupil position PPs.
  • alternatively, the pupil position detection unit 4 may detect, as the pupil position PP, the position corresponding to the maximum value of the plurality of evaluation values E corresponding to the plurality of image units U in the eye region image I2.
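  • the following is a minimal sketch of the evaluation value E for one attention image unit, under stated assumptions: the threshold Δθth of 20° and the convention of comparing each evaluation unit's gradient angle with the angle of the line from that unit toward the attention unit are illustrative choices, since the text only says that a difference value Δθ and a threshold Δθth are used.

```python
import math

def evaluation_value(grad_angles, attention_xy, eval_units, theta_th=20.0):
    """Evaluation value E = n / N for one attention image unit.
    grad_angles: dict mapping (x, y) -> luminance gradient angle in degrees.
    eval_units: the N evaluation image units U_E inside the evaluation region EA."""
    ax, ay = attention_xy
    n = 0
    for (ex, ey) in eval_units:
        # inclination θ' of the straight line from the evaluation unit toward the attention unit
        theta_line = math.degrees(math.atan2(ay - ey, ax - ex))
        # angular difference Δθ, wrapped into [0, 180]
        diff = abs((grad_angles[(ex, ey)] - theta_line + 180.0) % 360.0 - 180.0)
        if diff < theta_th:  # gradient vector regarded as pointing toward the attention unit
            n += 1
    return n / len(eval_units)

# positions whose evaluation value E is at or above a predetermined threshold
# (or the position with the maximum E) are then taken as pupil positions PP
```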
  • the eye region image I2 is usually composed of a region corresponding to the inside of the ocular fissure (hereinafter referred to as “intraocular fissure region”) and a region corresponding to the outside of the ocular fissure (hereinafter referred to as “extraocular fissure region”).
  • the extraocular region is located around the intraocular region.
  • the intraocular fissure region includes a region corresponding to the pupil (hereinafter referred to as “pupil region”), a region corresponding to the iris (hereinafter referred to as “iris region”), and a region corresponding to the sclera (hereinafter referred to as “sclera region”).
  • the scleral region is located around the iris region.
  • the iris region is located around the pupil region.
  • the shape of the iris region is circular.
  • the shape of the pupil region is circular.
  • the brightness in the pupil region is usually lower than the brightness in the iris region. Therefore, in the intraocular fissure region, an edge based on the brightness discontinuity is generated at the boundary between the pupil region and the iris region. Moreover, the brightness in the iris region is lower than the brightness in the sclera region. Therefore, in the intraocular fissure region, an edge based on the brightness discontinuity is generated at the boundary between the iris region and the scleral region.
  • the number n corresponding to the image unit U in the pupil region is larger than the number n corresponding to the image unit U outside the pupil region. Therefore, the evaluation value E corresponding to the image unit U in the pupil region is likely to be higher than the evaluation value E corresponding to the image unit U outside the pupil region. Therefore, the pupil position PP can be detected by detecting the evaluation value E which is equal to or higher than the threshold value among the evaluation values E.
  • the second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions based on the positional relationship among the plurality of pupil positions.
  • the second reliability calculation unit 6 calculates the second reliability, which is an index of the certainty of the plurality of pupil positions, based on the calculated spread degree.
  • the second reliability calculation unit 6 outputs information including the position of the eye feature point, the plurality of pupil positions, the evaluation value E of each pupil position, and the second reliability common to the plurality of pupil positions to the line-of-sight angle calculation unit 11.
  • the evaluation using the second reliability is based on the idea that when the plurality of detected pupil positions are widely spread, there is a high possibility that the pupil position detection unit 4 has erroneously detected the pupil positions. Since the pupil position detection unit 4 tries to detect pupil positions in the pupil region, it tends to detect a plurality of pupil positions having substantially the same evaluation value E in the pupil region. Therefore, when the spread of the plurality of pupil positions detected by the pupil position detection unit 4 is small, it is highly possible that the plurality of pupil positions exist in the pupil region, and thus it can be said that the plurality of pupil positions are highly reliable.
  • FIG. 11A is a diagram showing an example of three pupil positions PP1, PP2, and PP3 having a small degree of spread.
  • FIG. 11B is an explanatory diagram showing an example of a method for calculating the second reliability in the example of FIG. 11A.
  • the pupil position detection unit 4 detects three pupil positions PP1, PP2, PP3 from the eye region image I2.
  • the pupil position image unit U_21 corresponds to the pupil position PP1
  • the pupil position image unit U_22 corresponds to the pupil position PP2
  • the pupil position image unit U_23 corresponds to the pupil position PP3.
  • the second reliability calculation unit 6 sets the circumscribed rectangle 21 circumscribing the pupil position image units U_21, U_22, and U_23 with respect to the eye region image I2.
  • the second reliability calculation unit 6 calculates the diagonal length L1 based on the minimum coordinates (xmin, ymin) and the maximum coordinates (xmax, ymax) in the set circumscribing rectangle 21.
  • FIG. 12A is a diagram showing an example of three pupil positions PP4, PP5, and PP6 having a large degree of spread.
  • FIG. 12B is an explanatory diagram showing an example of a method for calculating the second reliability in the example of FIG. 12A.
  • the pupil position detection unit 4 detects three pupil positions PP4, PP5, and PP6 from the eye region image I2.
  • the pupil position image unit U_24 corresponds to the pupil position PP4
  • the pupil position image unit U_25 corresponds to the pupil position PP5
  • the pupil position image unit U_26 corresponds to the pupil position PP6.
  • the second reliability calculation unit 6 sets the circumscribed rectangle 22 circumscribing the pupil position image units U_24, U_25, and U_26 with respect to the eye region image I2.
  • the second reliability calculation unit 6 calculates the diagonal length L2 based on the minimum coordinates (xmin, ymin) and the maximum coordinates (xmax, ymax) in the set circumscribing rectangle 22.
  • the diagonal length L1 is a value representing the degree of spread of the pupil position image units U_21, U_22, and U_23.
  • the diagonal length L2 is a value representing the degree of spread of the pupil position image units U_24, U_25, and U_26.
  • the smaller the value of the diagonal length L2, the higher the reliability of the pupil positions PP4, PP5, and PP6 corresponding to the pupil position image units U_24, U_25, and U_26.
  • the second reliability calculation unit 6 may use the reciprocal of the diagonal lengths L1 and L2 as the second reliability, or may calculate the second reliability according to the diagonal lengths L1 and L2.
  • when calculating the second reliability according to the diagonal lengths L1 and L2, the shorter the diagonal length, the larger the calculated second reliability, and the longer the diagonal length, the smaller the calculated second reliability.
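  • as a rough illustration of the above, the following sketch computes a second reliability from the circumscribed-rectangle diagonal, using the reciprocal-of-diagonal option mentioned above; the +1 offset that avoids division by zero when all positions coincide is an assumption.

```python
import math

def second_reliability(pupil_positions):
    """Second reliability from the spread of detected pupil positions.
    pupil_positions: list of (x, y) image-unit coordinates of the detected pupil positions."""
    xs = [x for x, _ in pupil_positions]
    ys = [y for _, y in pupil_positions]
    # circumscribed rectangle defined by the minimum and maximum coordinates
    diagonal = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    # a small spread (short diagonal) yields a high second reliability
    return 1.0 / (diagonal + 1.0)

# FIG. 11A style (small spread) scores higher than FIG. 12A style (large spread)
print(second_reliability([(10, 10), (11, 10), (10, 12)]))  # relatively high
print(second_reliability([(5, 5), (20, 8), (12, 18)]))     # relatively low
```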
  • in the above example, the second reliability calculation unit 6 sets a circumscribed rectangle circumscribing the plurality of pupil positions, but a figure having a shape other than a rectangle may be set. Further, although the second reliability calculation unit 6 calculates the diagonal length as the degree of spread, an index other than the diagonal length may be calculated as long as it indicates the degree of spread (variation) of the plurality of pupil positions.
  • the second reliability calculation unit 6 may acquire, from the pupil position detection unit 4, a plurality of pupil positions corresponding to evaluation values E larger than a predetermined value among the plurality of evaluation values E.
  • in that case, the second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions acquired from the pupil position detection unit 4, and as a result, the reliability of the pupil position having the maximum evaluation value E detected by the pupil position detection unit 4 can be evaluated.
  • the line-of-sight angle calculation unit 11 acquires the information output by the pupil position detection device 1, including the position of the eye feature point, the plurality of pupil positions, the evaluation value E of each pupil position, and the second reliability common to the plurality of pupil positions.
  • the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ of the subject using the acquired information. For example, when the pupil position detection device 1 outputs the above information regarding only the right eye, the line-of-sight angle calculation unit 11, provided that the second reliability included in this information is equal to or higher than a predetermined threshold value, selects the pupil position having the maximum evaluation value E from among the plurality of pupil positions and calculates the line-of-sight angle θ based on the selected pupil position. Further, for example, when the pupil position detection device 1 outputs the above information regarding the right eye and the above information regarding the left eye, and both the second reliability included in the information of the right eye and the second reliability included in the information of the left eye are equal to or higher than the above threshold value, the line-of-sight angle calculation unit 11 selects the pupil position having the maximum evaluation value E from among the plurality of pupil positions included in the information of the right eye and the plurality of pupil positions included in the information of the left eye, and calculates the line-of-sight angle θ based on the selected pupil position.
  • when at least one of the second reliabilities is equal to or higher than the above threshold value, the line-of-sight angle calculation unit 11 selects the pupil position having the maximum evaluation value E from among the plurality of pupil positions corresponding to the second reliability that is equal to or higher than the threshold value, and calculates the line-of-sight angle θ based on the selected pupil position. If the second reliability is less than the above threshold value, the line-of-sight angle calculation unit 11 does not calculate the line-of-sight angle θ based on that information.
  • in the above description, the second reliability calculation unit 6 is configured to output information including the position of the eye feature point, the plurality of pupil positions, the evaluation value E of each pupil position, and the second reliability common to the plurality of pupil positions, but it may instead be configured to output information including the position of the eye feature point, the pupil position having the maximum evaluation value E, and the second reliability. In this case, if the second reliability included in the information acquired from the pupil position detection device 1 is equal to or higher than a predetermined threshold value, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ based on the pupil position having the maximum evaluation value E.
  • the line-of-sight angle θ in this example includes the line-of-sight angle θ_X with respect to the yaw direction and the line-of-sight angle θ_Y with respect to the pitch direction.
  • a method of calculating the line-of-sight angle θ_X will be described with reference to FIG. 13.
  • FIG. 13 is an explanatory diagram showing an example of a method of calculating the line-of-sight angle θ_X with respect to the yaw direction. Further, a method of calculating the line-of-sight angle θ_Y will be described with reference to FIG. 14.
  • FIG. 14 is an explanatory diagram showing an example of a method of calculating the line-of-sight angle θ_Y with respect to the pitch direction.
  • the line-of-sight angle calculation unit 11 calculates the positions P_FP1_X and P_FP2_X in the X direction using the information of the eye feature points.
  • the position P_FP1_X corresponds to the outer corner of the eye FP1.
  • the position P_FP2_X corresponds to the inner corner FP2.
  • the line-of-sight angle calculation unit 11 calculates the position (hereinafter referred to as "first reference position") P_C_X in the X direction based on the calculated positions P_FP1_X and P_FP2_X.
  • the first reference position P_C_X corresponds to an intermediate position between the outer corner FP1 and the inner corner FP2. That is, the first reference position P_C_X corresponds to the central portion of the intraocular fissure region with respect to the X direction.
  • the line-of-sight angle calculation unit 11 calculates the position P_PP_X with respect to the X direction using the information on the pupil position.
  • the position P_PP_X corresponds to the pupil position PP.
  • the line-of-sight angle calculation unit 11 calculates, with the position P_FP1_X or the position P_FP2_X as a reference, the interval L_X_1 to the first reference position P_C_X and the interval L_X_2 to the position P_PP_X.
  • FIG. 13 shows an example when the position P_FP2_X is used as a reference for the intervals L_X_1 and L_X_2.
  • a value indicating the maximum value θmax_X of the line-of-sight angle θ_X is preset in the line-of-sight angle calculation unit 11.
  • the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ_X by the following equation (1) using the set maximum value θmax_X and the calculated intervals L_X_1 and L_X_2.
  • θ_X = θmax_X × (L_X_1 - L_X_2) / L_X_1  (1)
  • the maximum value θmax_X is based on, for example, the following model M_X. That is, in the model M_X, if the position P_PP_X is the same as the first reference position P_C_X, the line-of-sight angle θ_X is 0°. Further, in the model M_X, if the position P_PP_X is the same as the position P_FP1_X, the line-of-sight angle θ_X becomes a value (for example, -60°) corresponding to the maximum value θmax_X.
  • similarly, if the position P_PP_X is the same as the position P_FP2_X, the line-of-sight angle θ_X becomes a value (for example, +60°) corresponding to the maximum value θmax_X.
  • the line-of-sight angle calculation unit 11 calculates the positions P_FP3_Y and P_FP4_Y in the Y direction by using the information of the eye feature points.
  • the position P_FP3_Y corresponds to the upper eyelid FP3.
  • the position P_FP4_Y corresponds to the lower eyelid FP4.
  • the line-of-sight angle calculation unit 11 calculates the position P_C_Y in the Y direction (hereinafter referred to as "second reference position") based on the calculated positions P_FP3_Y and P_FP4_Y.
  • the second reference position P_C_Y corresponds to an intermediate position between the upper eyelid FP3 and the lower eyelid FP4. That is, the second reference position P_C_Y corresponds to the central portion of the intraocular fissure region with respect to the Y direction.
  • the line-of-sight angle calculation unit 11 calculates the position P_PP_Y with respect to the Y direction using the information on the pupil position.
  • the position P_PP_Y corresponds to the pupil position PP.
  • the line-of-sight angle calculation unit 11 calculates, with the position P_FP3_Y or the position P_FP4_Y as a reference, the interval L_Y_1 to the second reference position P_C_Y and the interval L_Y_2 to the position P_PP_Y.
  • FIG. 14 shows an example when the position P_FP3_Y is used as a reference for the intervals L_Y_1 and L_Y_2.
  • a value indicating the maximum value θmax_Y of the line-of-sight angle θ_Y is preset in the line-of-sight angle calculation unit 11.
  • the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ_Y by the following equation (2) using the set maximum value θmax_Y and the calculated intervals L_Y_1 and L_Y_2.
  • θ_Y = θmax_Y × (L_Y_1 - L_Y_2) / L_Y_1  (2)
  • the maximum value θmax_Y is based on, for example, the following model M_Y. That is, in the model M_Y, if the position P_PP_Y is the same as the second reference position P_C_Y, the line-of-sight angle θ_Y is 0°. Further, in the model M_Y, if the position P_PP_Y is the same as the position P_FP3_Y, the line-of-sight angle θ_Y becomes a value (for example, +20°) corresponding to the maximum value θmax_Y.
  • similarly, if the position P_PP_Y is the same as the position P_FP4_Y, the line-of-sight angle θ_Y becomes a value (for example, -20°) corresponding to the maximum value θmax_Y.
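  • as a sketch of equation (1), the following computes the yaw line-of-sight angle from the X coordinates of the outer eye corner FP1, the inner eye corner FP2, and the pupil position PP; the choice of P_FP2_X as the reference for the intervals follows the FIG. 13 example, and θmax_X = 60° follows the example value above. Equation (2) for the pitch direction has the identical form with FP3, FP4, and θmax_Y.

```python
def gaze_angle_yaw(p_fp1_x, p_fp2_x, p_pp_x, theta_max_x=60.0):
    """Line-of-sight angle θ_X in the yaw direction by equation (1)."""
    p_c_x = (p_fp1_x + p_fp2_x) / 2.0   # first reference position P_C_X (eye centre in X)
    l_x_1 = abs(p_c_x - p_fp2_x)        # interval from the reference point P_FP2_X to P_C_X
    l_x_2 = abs(p_pp_x - p_fp2_x)       # interval from the reference point P_FP2_X to P_PP_X
    return theta_max_x * (l_x_1 - l_x_2) / l_x_1   # equation (1)

# pupil at the eye centre -> 0°, at the inner corner FP2 -> +60°, at the outer corner FP1 -> -60°
print(gaze_angle_yaw(p_fp1_x=100.0, p_fp2_x=160.0, p_pp_x=130.0))  # 0.0
print(gaze_angle_yaw(p_fp1_x=100.0, p_fp2_x=160.0, p_pp_x=160.0))  # 60.0
print(gaze_angle_yaw(p_fp1_x=100.0, p_fp2_x=160.0, p_pp_x=100.0))  # -60.0
```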
  • the method of calculating the line-of-sight angles θ_X and θ_Y is not limited to these specific examples.
  • Various known techniques can be used to calculate the line-of-sight angles ⁇ _X and ⁇ _Y. Detailed description of these techniques will be omitted.
  • the line-of-sight detection device 10 includes a line-of-sight angle calculation unit 11, and the line-of-sight angle calculation unit 11 calculates the line-of-sight angle ⁇ of the target person, but the present invention is not limited to this configuration.
  • the line-of-sight detection device 10 may include a line-of-sight direction calculation unit (not shown), and the line-of-sight direction calculation unit may calculate the line-of-sight direction of the target person.
  • Various known techniques can be used for the calculation method of the line-of-sight direction. Detailed description of these techniques will be omitted.
  • At least one of the line-of-sight angle and the line-of-sight direction of the target person calculated by the line-of-sight detection device 10 is used, for example, for determining the state of the target person.
  • the line-of-sight detection device 10 and the image pickup device 12 are mounted on the vehicle, and the image pickup device 12 takes an image of the driver as a target person.
  • the line-of-sight detection device 10 detects at least one of the line-of-sight angle and the line-of-sight direction of the driver who is the target person.
  • the driver monitoring device (not shown) mounted on the vehicle uses at least one of the driver's line-of-sight angle and line-of-sight direction to determine whether the driver's state is a predetermined state (hereinafter referred to as “warning target state”).
  • when the driver monitoring device determines that the driver is in the warning target state, it outputs a voice or the like to warn the driver.
  • examples of the warning target state include a state where the driver is looking aside while driving, a state where the driver does not check behind at a timing when a backward check should be made, a state where the driver's attention is low, a state where the driver is dozing, and a state where the driver is unable to drive.
  • Various known techniques can be used to determine the state subject to warning. Detailed description of these techniques will be omitted.
  • FIG. 15 is a flowchart showing an operation example of the line-of-sight detection device 10 according to the first embodiment. The process shown in FIG. 15 is repeatedly executed when a predetermined condition is satisfied (for example, when the ignition power of the vehicle equipped with the line-of-sight detection device 10 is turned on).
  • the image acquisition unit 2 acquires the image I1 in which the subject is captured from the image pickup device 12 (step ST1).
  • the eye region extraction unit 3 extracts the eye region image I2 from the image I1 acquired by the image acquisition unit 2 (step ST2).
  • the pupil position detection unit 4 detects a plurality of pupil position PPs from the eye region image I2 extracted by the eye region extraction unit 3 (step ST3).
  • the second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil position PPs based on the positional relationship of the plurality of pupil position PPs detected by the pupil position detection unit 4, and calculates, based on the degree of spread, the second reliability, which is an index of the certainty of the plurality of pupil position PPs (step ST4).
  • when the value of the second reliability calculated by the second reliability calculation unit 6 is equal to or greater than a predetermined threshold value, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ of the subject based on the pupil position PP having the highest evaluation value E (step ST5).
  • the pupil position detection device 1 includes an image acquisition unit 2, an eye region extraction unit 3, a pupil position detection unit 4, and a second reliability calculation unit 6.
  • the image acquisition unit 2 acquires an image in which the subject is captured.
  • the eye region extraction unit 3 extracts the eye region from the image acquired by the image acquisition unit 2.
  • the pupil position detection unit 4 detects a plurality of pupil positions from the eye region extracted by the eye region extraction unit 3.
  • the second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions based on the positional relationship of the plurality of pupil positions detected by the pupil position detection unit 4, and calculates, based on the degree of spread, the second reliability, which is an index of the certainty of the plurality of pupil positions. Since the degree of spread of the plurality of pupil positions is not affected by the imaging environment, the pupil position detection device 1 can evaluate the reliability of the pupil positions without being affected by the imaging environment.
  • the line-of-sight detection device 10 includes a pupil position detection device 1 and a line-of-sight angle calculation unit 11.
  • the line-of-sight angle calculation unit 11 calculates the line-of-sight angle based on the plurality of pupil positions output by the pupil position detection device 1 and the second reliability common to the plurality of pupil positions. Since the line-of-sight detection device 10 calculates the line-of-sight angle based on the highly reliable pupil position evaluated by a method that is not affected by the imaging environment, the line-of-sight angle can be calculated accurately.
  • FIG. 16 is a block diagram showing a configuration example of the line-of-sight detection device 10 according to the second embodiment.
  • the line-of-sight detection device 10 according to the second embodiment has a configuration in which a first reliability calculation unit 5 and a third reliability calculation unit 7 are added to the line-of-sight detection device 10 of the first embodiment shown in FIG. 1.
  • the same or corresponding parts as those in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • the first reliability calculation unit 5 and the third reliability calculation unit 7 shown in FIG. 16 are realized by the processing circuit 100 shown in FIG. 2A.
  • alternatively, the functions of the first reliability calculation unit 5 and the third reliability calculation unit 7 are realized by the processor 101 shown in FIG. 2B executing a program stored in the memory 102.
  • the pupil position detection unit 4 of the second embodiment outputs information including the positions of the eye feature points and the plurality of pupil positions to the first reliability calculation unit 5 in addition to the second reliability calculation unit 6.
  • the second reliability calculation unit 6 of the second embodiment provides information including the positions of the eye feature points, the plurality of pupil positions, and the second reliability common to the plurality of pupil positions, in the third reliability calculation unit 7. Output to.
  • the first reliability calculation unit 5 calculates the brightness difference between the pixels at the pupil position and the pixels around the pupil position for each pupil position detected by the pupil position detection unit 4.
  • the first reliability calculation unit 5 calculates the first reliability, which is an index of the certainty of the pupil position, based on the calculated luminance difference.
  • the first reliability calculation unit 5 outputs information including the positions of the eye feature points, the plurality of pupil positions, and the first reliability of each pupil position to the third reliability calculation unit 7.
  • the pupil position evaluation method using the first reliability is an evaluation method based on the idea that the pupil is darker than the periphery (iris, etc.). An example of the first reliability calculation method will be described with reference to FIGS. 17 and 18.
  • FIG. 17 is an explanatory diagram showing an example of the pupil position image unit U_11 in the first reliability calculation region U_10 and the brightness value distribution around it.
  • the pupil position image unit U_11 is the image unit U corresponding to the pupil position PP detected by the pupil position detection unit 4. In FIG. 17, the larger the numerical value, the brighter the luminance value in the image unit U, and the smaller the numerical value, the darker the luminance value.
  • the first reliability calculation unit 5 sets, in the eye region image I2, a region of a predetermined size centered on the pupil position image unit U_11 (hereinafter referred to as “first reliability calculation area U_10”).
  • in this example, the first reliability calculation area U_10 is a circular area having a radius of 4 image units. Since the pupil is circular and expands or contracts depending on the external light environment, it is desirable that the first reliability calculation region U_10 is also circular and that its size is at least equal to or larger than the size of a standard human pupil.
  • FIG. 18 is an explanatory diagram showing an example of concentric image unit groups U_12, U_13, U_14, and U_15 set in the first reliability calculation area U_10.
  • the first reliability calculation unit 5 sets, in the first reliability calculation region U_10, m concentric image unit groups centered on the pupil position image unit U_11, where m is any integer greater than or equal to 1.
  • in the example of FIG. 18, m = 4, and the first reliability calculation unit 5 sets four concentric image unit groups U_12, U_13, U_14, and U_15 centered on the pupil position image unit U_11. If one image unit U is one pixel, the m image unit groups become “m pixel groups”.
  • the first reliability calculation unit 5 selects the darkest image unit U_12a from the concentric image unit group U_12.
  • the brightness value of the image unit U_12a is 21.
  • the first reliability calculation unit 5 selects the darkest image unit U_13a from the concentric image unit group U_13.
  • the brightness value of the image unit U_13a is 24.
  • the first reliability calculation unit 5 selects the darkest image unit U_14a from the concentric image unit group U_14.
  • the brightness value of the image unit U_14a is 26.
  • the first reliability calculation unit 5 selects the darkest image unit U_15a from the concentric image unit group U_15.
  • the brightness value of the image unit U_15a is 51. In this way, the first reliability calculation unit 5 selects one image unit from each of the m image unit groups, that is, m image units in total.
  • the first reliability calculation unit 5 may use the value of the luminance difference as it is as the first reliability, or may calculate the first reliability according to the value of the luminance difference. When calculating the first reliability according to the luminance difference value, the larger the luminance difference value, the larger the calculated first reliability, and the smaller the luminance difference value, the smaller the calculated first reliability.
  • the first reliability calculation unit 5 calculates the first reliability for each of the plurality of pupil positions.
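  • the following is a minimal sketch of the first-reliability idea that the pupil should be darker than its surroundings; averaging the m selected luminance values and subtracting the luminance at the pupil position is an assumption, since the extracted text only states that a luminance difference between the pupil position and its surroundings is used, and the pupil-position luminance of 15 in the example call is likewise assumed.

```python
def first_reliability(pupil_luminance, ring_min_luminances):
    """First reliability from the luminance difference between the pupil position
    and its surroundings.
    ring_min_luminances: the darkest luminance selected from each of the m concentric
    image unit groups (21, 24, 26, 51 in the FIG. 18 example)."""
    avg_surround = sum(ring_min_luminances) / len(ring_min_luminances)
    luminance_diff = avg_surround - pupil_luminance
    # the larger the luminance difference, the larger the first reliability
    return max(luminance_diff, 0.0)

# FIG. 18 example, with an assumed luminance of 15 at the pupil position image unit U_11
print(first_reliability(15, [21, 24, 26, 51]))  # -> 15.5
```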
  • the third reliability calculation unit 7 calculates the third reliability using the first reliability calculated by the first reliability calculation unit 5 and the second reliability calculated by the second reliability calculation unit 6.
  • the third reliability calculation unit 7 outputs information including the position of the eye feature point, the plurality of pupil positions, and the third reliability of each pupil position to the line-of-sight angle calculation unit 11. A method of calculating the third reliability will be described with reference to FIG. 19.
  • FIG. 19 is an explanatory diagram showing an example of a method for calculating the third reliability.
  • the first reliability calculation unit 5 calculates that the first reliability of the pupil position PP4 is 60%, the first reliability of the pupil position PP5 is 90%, and the first reliability of the pupil position PP6 is 55%.
  • the second reliability calculation unit 6 calculates that the second reliability common to the pupil positions PP4, PP5, and PP6 is 30%.
  • the third reliability calculation unit 7 uses a predetermined weight for the first reliability (for example, 0.8) and a predetermined weight for the second reliability (for example, 0.2) and, as shown in FIG. 19, calculates the third reliability of the pupil position PP4 as 54%, the third reliability of the pupil position PP5 as 78%, and the third reliability of the pupil position PP6 as 50%.
  • in this example, the third reliability calculation unit 7 calculates, as the third reliability, the weighted average of the first reliability and the second reliability, but the third reliability may be calculated by another method such as a simple average.
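  • the weighted average used in the FIG. 19 example can be written as the short sketch below; it reproduces the 54%, 78%, and 50% values from the first reliabilities 60%, 90%, and 55% and the common second reliability 30% with the weights 0.8 and 0.2.

```python
def third_reliability(first_rel, second_rel, w1=0.8, w2=0.2):
    """Third reliability as the weighted average of the first and second reliabilities
    (values in percent), using the example weights 0.8 and 0.2."""
    return w1 * first_rel + w2 * second_rel

# FIG. 19 example: the second reliability common to PP4, PP5, and PP6 is 30%
for pp, first_rel in [("PP4", 60), ("PP5", 90), ("PP6", 55)]:
    print(pp, third_reliability(first_rel, 30))  # -> 54.0, 78.0, 50.0
```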
  • the line-of-sight angle calculation unit 11 acquires information output by the pupil position detection device 1 including the position of the eye feature point, the plurality of pupil positions, and the third reliability of each of the pupil positions.
  • the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ of the subject using the acquired information. For example, when the pupil position detection device 1 outputs the above information regarding only the right eye, the line-of-sight angle calculation unit 11 selects the pupil position having the highest third reliability from the plurality of pupil positions included in this information. If the third reliability of the selected pupil position is equal to or higher than a predetermined threshold value, the line-of-sight angle θ is calculated based on the selected pupil position.
  • similarly, the line-of-sight angle calculation unit 11 selects the pupil position having the largest third reliability from among the plurality of pupil positions included in the information, and if the third reliability of the selected pupil position is equal to or greater than the above threshold value, calculates the line-of-sight angle θ based on the selected pupil position. The line-of-sight angle calculation unit 11 does not calculate the line-of-sight angle θ when there is no pupil position whose third reliability is equal to or higher than the above threshold value.
  • the pupil position detection device 1 shown in FIG. 16 is configured to calculate the third reliability from the first reliability and the second reliability and to output the third reliability to the line-of-sight angle calculation unit 11, but it may instead output the first reliability and the second reliability to the line-of-sight angle calculation unit 11. In that case, the pupil position detection device 1 does not include the third reliability calculation unit 7.
  • in that case, if the reliabilities output by the pupil position detection device 1 do not satisfy a predetermined condition, the line-of-sight angle calculation unit 11 discards the eye area image I2 and does not calculate the line-of-sight angle θ; otherwise, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ based on the pupil position having the highest reliability among the pupil positions detected from the eye area image I2.
  • FIG. 20 is a flowchart showing an operation example of the line-of-sight detection device 10 according to the second embodiment.
  • the process shown in FIG. 20 is repeatedly executed when a predetermined condition is satisfied (for example, when the ignition power of the vehicle equipped with the line-of-sight detection device 10 is turned on). Since the operations of steps ST1 to ST4 in FIG. 20 are the same as the operations of steps ST1 to ST4 in FIG. 15, the description thereof will be omitted.
  • in step ST11, the first reliability calculation unit 5 calculates, for each pupil position PP detected by the pupil position detection unit 4, the luminance difference between the pixel at the pupil position PP and the pixels around the pupil position PP, and calculates, based on the luminance difference, the first reliability, which is an index of the certainty of the pupil position PP.
  • the operation of step ST4 and the operation of step ST11 may be performed in the reverse order or in parallel.
  • the third reliability calculation unit 7 calculates, as the third reliability, the weighted average of the first reliability for each pupil position PP calculated by the first reliability calculation unit 5 and the second reliability common to the plurality of pupil position PPs calculated by the second reliability calculation unit 6 (step ST12).
  • the line-of-sight angle calculation unit 11 selects, from the plurality of pupil position PPs detected by the pupil position detection unit 4, the pupil position PP having the highest third reliability calculated by the third reliability calculation unit 7, and calculates the line-of-sight angle θ of the subject based on the selected pupil position (step ST13).
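  • the selection in step ST13 amounts to a simple argmax over the third reliabilities, as in the self-contained sketch below, which assumes the third reliabilities are given as a list parallel to the pupil positions.

```python
def select_pupil_position(pupil_positions, third_reliabilities):
    """Return the pupil position PP with the highest third reliability (step ST13)."""
    best_index = max(range(len(pupil_positions)), key=lambda i: third_reliabilities[i])
    return pupil_positions[best_index]

# FIG. 19 example: PP5 has the highest third reliability (78%)
print(select_pupil_position(["PP4", "PP5", "PP6"], [54, 78, 50]))  # -> PP5
```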
  • the pupil position detecting device 1 includes the first reliability calculation unit 5.
  • the first reliability calculation unit 5 calculates, for each pupil position detected by the pupil position detection unit 4, the luminance difference between the pixel at the pupil position and the pixels around the pupil position, and calculates, based on the luminance difference, the first reliability, which is an index of the certainty of the pupil position. Even if the luminance value of each pixel in the eye region is affected by the imaging environment, the luminance difference between the pixel at the pupil position and the pixels around the pupil position is not affected by the imaging environment, so the pupil position detection device 1 can evaluate the reliability of the pupil position without being affected by the imaging environment.
  • the pupil position detection device 1 includes a third reliability calculation unit 7.
  • the third reliability calculation unit 7 calculates the third reliability using the first reliability calculated by the first reliability calculation unit 5 and the second reliability calculated by the second reliability calculation unit 6. By combining the first reliability and the second reliability, which are based on different evaluation methods, the pupil position detection device 1 can evaluate the reliability of the pupil position more accurately.
  • in the above examples, the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the first reliability calculation unit 5, the second reliability calculation unit 6, the third reliability calculation unit 7, and the line-of-sight angle calculation unit 11 are integrated in an in-vehicle device mounted on the vehicle, but they may be distributed among a server device on a network, a mobile terminal such as a smartphone, the in-vehicle device, and the like.
  • the line-of-sight detection device according to the present disclosure is suitable for use in, for example, a driver monitoring device that monitors the state of a driver in a vehicle, an occupant monitoring device that monitors the state of each occupant including the driver in a vehicle, and the like.
  • 1 pupil position detection device, 2 image acquisition unit, 3 eye area extraction unit, 4 pupil position detection unit, 5 first reliability calculation unit, 6 second reliability calculation unit, 7 third reliability calculation unit, 10 line-of-sight detection device, 11 line-of-sight angle calculation unit, 12 image pickup device, 21, 22 circumscribed rectangle, 100 processing circuit, 101 processor, 102 memory.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

An eye region extraction unit (3) extracts an eye region from an image acquired by an image acquisition unit (2). A pupil position detection unit (4) detects a plurality of pupil positions from the eye region extracted by the eye region extraction unit (3). A second reliability calculation unit (6) calculates a spread degree of the plurality of pupil positions on the basis of positional relationships between the plurality of pupil positions detected by the pupil position detection unit (4), and calculates a second reliability that serves as an index of certainty of the plurality of pupil positions, on the basis of the spread degree.

Description

Pupil position detection device, line-of-sight detection device, and pupil position detection method
 本開示は、瞳孔位置検出装置、視線検出装置、及び瞳孔位置検出方法に関するものである。 The present disclosure relates to a pupil position detection device, a line-of-sight detection device, and a pupil position detection method.
 従来の瞳孔位置検出装置は、テンプレートマッチングを実行して目領域の画像から瞳孔位置候補を検出すると共に検出した各瞳孔位置候補の尤度を算出し、尤度が最大となる瞳孔位置候補を瞳孔位置と判定していた(例えば、特許文献1参照)。 The conventional pupil position detection device executes template matching to detect pupil position candidates from the image of the eye region, calculates the likelihood of each detected pupil position candidate, and selects the pupil position candidate having the maximum likelihood. It was determined to be the position (see, for example, Patent Document 1).
特開2013-215549号公報Japanese Unexamined Patent Publication No. 2013-215549
 テンプレートマッチングにおいて全ての撮像環境に対応するためには、膨大な数のテンプレートが必要であるが、このような膨大な数を事前に用意しておくことは困難である。したがって、現実に用意されるテンプレートの数は、全ての撮像環境に対応するために必要な数には満たないものであった。このため、従来の瞳孔位置検出装置においては、撮像環境によっては、算出された尤度が適切な値ではなくなることがあり、瞳孔位置の信頼性を正しく評価できないという課題があった。 In template matching, a huge number of templates are required to support all imaging environments, but it is difficult to prepare such a huge number in advance. Therefore, the number of templates actually prepared is less than the number required to support all imaging environments. Therefore, in the conventional pupil position detecting device, the calculated likelihood may not be an appropriate value depending on the imaging environment, and there is a problem that the reliability of the pupil position cannot be evaluated correctly.
 本開示は、上記のような課題を解決するためになされたもので、撮像環境の影響を受けることなく瞳孔位置の信頼性を評価することを目的とする。 The present disclosure has been made to solve the above problems, and an object of the present disclosure is to evaluate the reliability of the pupil position without being affected by the imaging environment.
 本開示に係る瞳孔位置検出装置は、対象者が撮像された画像を取得する画像取得部と、画像取得部により取得された画像から目領域を抽出する目領域抽出部と、目領域抽出部により抽出された目領域から複数の瞳孔位置を検出する瞳孔位置検出部と、瞳孔位置検出部により検出された複数の瞳孔位置の位置関係に基づいて当該複数の瞳孔位置の広がり度を算出し、当該広がり度に基づいて当該複数の瞳孔位置の確からしさの指標となる第2信頼度を算出する第2信頼度算出部とを備えるものである。 The pupil position detection device according to the present disclosure is composed of an image acquisition unit that acquires an image captured by a subject, an eye area extraction unit that extracts an eye area from an image acquired by the image acquisition unit, and an eye area extraction unit. The degree of spread of the plurality of pupil positions is calculated based on the positional relationship between the pupil position detection unit that detects a plurality of pupil positions from the extracted eye region and the plurality of pupil positions detected by the pupil position detection unit. It is provided with a second reliability calculation unit that calculates a second reliability that is an index of the certainty of the plurality of pupil positions based on the degree of spread.
 本開示によれば、複数の瞳孔位置の広がり度は、撮像環境の影響を受けないため、撮像環境の影響を受けることなく瞳孔位置の信頼性を評価できる。 According to the present disclosure, since the degree of spread of a plurality of pupil positions is not affected by the imaging environment, the reliability of the pupil positions can be evaluated without being affected by the imaging environment.
FIG. 1 is a block diagram showing a configuration example of the line-of-sight detection device according to Embodiment 1.
FIGS. 2A and 2B are diagrams showing a hardware configuration example of the line-of-sight detection device according to Embodiment 1.
FIG. 3 is a diagram showing an example of the eye region image extracted by the eye region extraction unit.
FIG. 4 is an explanatory diagram showing an example of the filter.
FIG. 5 is an explanatory diagram showing an example of sweeping by the filter.
FIG. 6 is an explanatory diagram showing an example of the vector corresponding to the first luminance gradient value, an example of the vector corresponding to the second luminance gradient value, and an example of the luminance gradient vector.
FIG. 7 is an explanatory diagram showing an example of the luminance value in the attention image unit and examples of the luminance values in the individual image units arranged around the attention image unit.
FIG. 8 is an explanatory diagram showing an example of the luminance gradient vector corresponding to the attention image unit.
FIG. 9 is an explanatory diagram showing an example of the evaluation region.
FIG. 10 is an explanatory diagram showing an example of the luminance gradient vectors corresponding to the individual evaluation image units.
FIG. 11A is a diagram showing an example of three pupil positions having a small degree of spread, and FIG. 11B is an explanatory diagram showing an example of the method of calculating the second reliability in the example of FIG. 11A.
FIG. 12A is a diagram showing an example of three pupil positions having a large degree of spread, and FIG. 12B is an explanatory diagram showing an example of the method of calculating the second reliability in the example of FIG. 12A.
FIG. 13 is an explanatory diagram showing an example of the method of calculating the line-of-sight angle with respect to the yaw direction.
FIG. 14 is an explanatory diagram showing an example of the method of calculating the line-of-sight angle with respect to the pitch direction.
FIG. 15 is a flowchart showing an operation example of the line-of-sight detection device according to Embodiment 1.
FIG. 16 is a block diagram showing a configuration example of the line-of-sight detection device according to Embodiment 2.
FIG. 17 is an explanatory diagram showing an example of the pupil position and the surrounding luminance value distribution in the first reliability calculation region.
FIG. 18 is an explanatory diagram showing an example of concentric image unit groups set in the first reliability calculation region.
FIG. 19 is an explanatory diagram showing an example of the method of calculating the third reliability.
FIG. 20 is a flowchart showing an operation example of the line-of-sight detection device according to Embodiment 2.
 Hereinafter, in order to describe the present disclosure in more detail, modes for carrying out the present disclosure will be described with reference to the accompanying drawings.
Embodiment 1.
 FIG. 1 is a block diagram showing a configuration example of the line-of-sight detection device 10 according to Embodiment 1. The line-of-sight detection device 10 includes a pupil position detection device 1 and a line-of-sight angle calculation unit 11. The pupil position detection device 1 includes an image acquisition unit 2, an eye region extraction unit 3, a pupil position detection unit 4, and a second reliability calculation unit 6. Further, an image pickup device 12 is connected to the pupil position detection device 1.
 FIGS. 2A and 2B are diagrams showing a hardware configuration example of the line-of-sight detection device 10 according to Embodiment 1. The functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11 in the line-of-sight detection device 10 are realized by a processing circuit. That is, the line-of-sight detection device 10 includes a processing circuit for realizing the above functions. The processing circuit may be a processing circuit 100 as dedicated hardware, or may be a processor 101 that executes a program stored in a memory 102.
 As shown in FIG. 2A, when the processing circuit is dedicated hardware, the processing circuit 100 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof. The functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11 may be realized by a plurality of processing circuits 100, or the functions of the units may be collectively realized by a single processing circuit 100.
 As shown in FIG. 2B, when the processing circuit is the processor 101, the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11 are realized by software, firmware, or a combination of software and firmware. The software or firmware is described as a program and stored in the memory 102. The processor 101 realizes the function of each unit by reading and executing the program stored in the memory 102. That is, the line-of-sight detection device 10 includes the memory 102 for storing a program which, when executed by the processor 101, results in execution of the steps shown in the flowchart of FIG. 15 described later. It can also be said that this program causes a computer to execute the procedures or methods of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11.
 Here, the processor 101 is a CPU (Central Processing Unit), a processing device, an arithmetic device, a microprocessor, or the like.
 The memory 102 may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), or a flash memory, may be a magnetic disk such as a hard disk or a flexible disk, or may be an optical disc such as a CD (Compact Disc) or a DVD (Digital Versatile Disc).
 Note that some of the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the second reliability calculation unit 6, and the line-of-sight angle calculation unit 11 may be realized by dedicated hardware and the rest by software or firmware. In this way, the processing circuit in the line-of-sight detection device 10 can realize the above functions by hardware, software, firmware, or a combination thereof.
 The image pickup device 12 captures, in time series, images of the area around the face of the subject whose pupil position and line of sight are to be detected. The image pickup device 12 outputs the captured image I1 to the image acquisition unit 2.
 The image acquisition unit 2 acquires the image I1 of the subject captured by the image pickup device 12 from the image pickup device 12 and outputs it to the eye region extraction unit 3.
 The eye region extraction unit 3 extracts the eye region of the subject from the image I1 acquired by the image acquisition unit 2, and outputs an image of the extracted eye region (hereinafter referred to as the "eye region image I2") to the pupil position detection unit 4.
 FIG. 3 is a diagram showing an example of the eye region image I2 extracted by the eye region extraction unit 3. The eye region extraction unit 3 detects the face region of the subject from the image I1 acquired by the image acquisition unit 2, detects the feature points of the right eye, namely the outer eye corner FP1, the inner eye corner FP2, the upper eyelid FP3, and the lower eyelid FP4, from the detected face region, and extracts the right-eye eye region image I2 from the image I1 based on the positions of these detected eye feature points. Similarly, the eye region extraction unit 3 detects the outer eye corner, the inner eye corner, the upper eyelid, and the lower eyelid, which are the feature points of the left eye, from the face region, and extracts the left-eye eye region image from the image I1 based on the positions of these detected eye feature points.
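As a rough illustration of this extraction step, the following sketch crops an eye region image from the four eye feature points. It assumes the feature points have already been obtained from some face landmark detector; the function name, the margin parameter, and the array layout are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def extract_eye_region(image_i1: np.ndarray, fp1, fp2, fp3, fp4, margin: int = 5) -> np.ndarray:
    """Crop an eye region image I2 from image I1 using the four eye feature points.

    fp1..fp4 are (x, y) tuples for the outer eye corner, inner eye corner,
    upper eyelid, and lower eyelid of one eye.
    """
    xs = [fp1[0], fp2[0], fp3[0], fp4[0]]
    ys = [fp1[1], fp2[1], fp3[1], fp4[1]]
    h, w = image_i1.shape[:2]
    x0 = max(min(xs) - margin, 0)
    x1 = min(max(xs) + margin, w)
    y0 = max(min(ys) - margin, 0)
    y1 = min(max(ys) + margin, h)
    return image_i1[y0:y1, x0:x1]
```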
 The eye region image I2 is composed of a plurality of units (hereinafter referred to as "image units") U arranged in two mutually orthogonal directions (that is, the X direction and the Y direction). Here, each image unit U is composed of one pixel. Alternatively, each image unit U is composed of a plurality of mutually adjacent pixels.
 Note that the eye region extraction unit 3 extracts either the right-eye eye region or the left-eye eye region from the image I1. When the eye region extraction unit 3 extracts the eye region of only one eye, that is, only the right eye or only the left eye, the pupil position detection unit 4 detects the pupil position of that eye, and the second reliability calculation unit 6 calculates the second reliability of the pupil position of that eye detected by the pupil position detection unit 4.
 Alternatively, the eye region extraction unit 3 may extract both the right-eye eye region and the left-eye eye region. In this case, the pupil position detection unit 4 detects the pupil position of the right eye and the pupil position of the left eye, and the second reliability calculation unit 6 calculates the second reliability of the pupil position of the right eye and the second reliability of the pupil position of the left eye.
 The pupil position detection unit 4 detects, from the eye region image I2 extracted by the eye region extraction unit 3, one or more positions indicating the inside of the pupil region (hereinafter referred to as "pupil positions"). The pupil position detection unit 4 outputs information including the positions of the eye feature points and the one or more pupil positions to the second reliability calculation unit 6. The pupil position detection unit 4 may detect the pupil positions from the eye region image I2 by any well-known method; the method is not limited. Template matching as described above may be used, or a method based on luminance gradient vectors as described below may be used. Note that the likelihood in template matching corresponds to the evaluation value E in the method based on luminance gradient vectors described later.
 A method of calculating the luminance gradient vector V2 will be described with reference to FIGS. 4 to 8.
 FIG. 4 is an explanatory diagram showing an example of the filter F. FIG. 5 is an explanatory diagram showing an example of sweeping by the filter F. The filter F shown in FIG. 4 is used for calculating the luminance gradient vector V2. As shown in FIG. 5, the filter F is applied so as to sweep over the eye region image I2.
 For each individual image unit (hereinafter sometimes referred to as the "attention image unit") U_I in the eye region image I2, the filter F calculates the difference value (hereinafter referred to as the "first luminance gradient value") ΔB_X between the luminance value B_L of the image unit U_L arranged to the left of the attention image unit U_I and the luminance value B_R of the image unit U_R arranged to the right of the attention image unit U_I. The filter F also calculates the difference value (hereinafter referred to as the "second luminance gradient value") ΔB_Y between the luminance value B_U of the image unit U_U arranged above the attention image unit U_I and the luminance value B_D of the image unit U_D arranged below the attention image unit U_I.
 FIG. 6 is an explanatory diagram showing an example of the vector V2_X corresponding to the first luminance gradient value ΔB_X, an example of the vector V2_Y corresponding to the second luminance gradient value ΔB_Y, and an example of the luminance gradient vector V2. The luminance gradient vector V2 is represented by the sum of the vector V2_X corresponding to the first luminance gradient value ΔB_X and the vector V2_Y corresponding to the second luminance gradient value ΔB_Y. The pupil position detection unit 4 therefore calculates the first luminance gradient value ΔB_X and the second luminance gradient value ΔB_Y, and uses the calculated values to calculate the angle (hereinafter referred to as the "luminance gradient angle") θ corresponding to the direction of the luminance gradient vector V2.
 FIG. 7 is an explanatory diagram showing an example of the luminance value B_I of the attention image unit U_I and examples of the luminance values B_D, B_L, B_R, and B_U of the individual image units U_D, U_L, U_R, and U_U arranged around the attention image unit U_I. For example, as shown in FIG. 7, assume that the luminance value B_L is 44, the luminance value B_R is 16, the luminance value B_U is 47, and the luminance value B_D is 18. In this case, by applying the filter F to the attention image unit U_I, the first luminance gradient value ΔB_X is calculated to be -28 and the second luminance gradient value ΔB_Y is calculated to be -29. The luminance gradient angle θ is calculated to be 46°. FIG. 8 is an explanatory diagram showing an example of the luminance gradient vector V2 corresponding to the attention image unit U_I.
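The following sketch reproduces this filter computation for a single attention image unit, assuming 8-bit luminance values and an atan2-based angle convention (the disclosure only states that θ corresponds to the direction of V2, so the sign convention here is an assumption). The worked values from FIG. 7 are used as a check.

```python
import math

def luminance_gradient(b_left: float, b_right: float, b_up: float, b_down: float):
    """Apply the cross-shaped filter F at one attention image unit U_I.

    Returns (dBx, dBy, theta_deg): the first and second luminance gradient
    values and the luminance gradient angle of the vector V2 = V2_X + V2_Y.
    """
    dBx = b_right - b_left   # first luminance gradient value ΔB_X
    dBy = b_down - b_up      # second luminance gradient value ΔB_Y
    theta_deg = math.degrees(math.atan2(dBy, dBx))
    return dBx, dBy, theta_deg

# Worked example from FIG. 7: B_L = 44, B_R = 16, B_U = 47, B_D = 18
dBx, dBy, theta = luminance_gradient(44, 16, 47, 18)
# dBx = -28, dBy = -29; the slope |ΔB_Y / ΔB_X| corresponds to about 46°,
# matching the value given for FIG. 8 (up to the sign convention chosen here).
```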
 The pupil position detection unit 4 uses the luminance gradient vectors V2 to calculate an evaluation value E corresponding to each image unit U in the eye region image I2. The evaluation value E is based on the number n of luminance gradient vectors V2 directed toward the individual image unit U (that is, toward the attention image unit U_I). A method of calculating the evaluation value E will be described with reference to FIGS. 9 and 10.
 FIG. 9 is an explanatory diagram showing an example of the evaluation region EA. First, the pupil position detection unit 4 sets a region for evaluation (hereinafter referred to as the "evaluation region") EA including the attention image unit U_I. The evaluation region EA includes the attention image unit U_I and N image units (hereinafter sometimes referred to as "evaluation image units") U_E different from the attention image unit U_I. N is an arbitrary integer of 2 or more.
 FIG. 9 shows an example of the evaluation region EA. In the example shown in FIG. 9, the evaluation region EA has a square shape, and the attention image unit U_I is arranged at the center of the evaluation region EA. The evaluation region EA includes 48 evaluation image units U_E; that is, N = 48.
 The evaluation region EA has a size smaller than a predetermined size (hereinafter referred to as the "reference iris size"). The reference iris size corresponds to the size of a standard human iris.
 Next, the pupil position detection unit 4 calculates the angle θ' corresponding to the inclination of the straight line connecting the attention image unit U_I and each evaluation image unit U_E. Next, for each evaluation image unit U_E, the pupil position detection unit 4 calculates the difference value Δθ between the corresponding luminance gradient angle θ and the corresponding angle θ'. When the calculated difference value Δθ is less than a predetermined threshold value Δθth, the pupil position detection unit 4 determines that the corresponding luminance gradient vector V2 is directed toward the attention image unit U_I. On the other hand, when the calculated difference value Δθ is equal to or greater than the threshold value Δθth, the pupil position detection unit 4 determines that the corresponding luminance gradient vector V2 is not directed toward the attention image unit U_I.
 Next, based on the results of these determinations, the pupil position detection unit 4 calculates the number n of luminance gradient vectors V2 directed toward the attention image unit U_I. At this time, the number n is a value of 0 or more and N or less. The pupil position detection unit 4 calculates the evaluation value E according to the calculated number n. That is, the larger the number n, the higher the evaluation value E; in other words, the smaller the number n, the lower the evaluation value E.
 FIG. 10 is an explanatory diagram showing an example of the luminance gradient vectors V2 corresponding to the individual evaluation image units U_E. For example, as shown in FIG. 10, assume that the evaluation region EA includes 48 evaluation image units U_E (N = 48), and that, of the 48 luminance gradient vectors V2 corresponding to the 48 evaluation image units U_E, 19 luminance gradient vectors V2 are directed toward the attention image unit U_I (n = 19). In the figure, the solid arrows indicate luminance gradient vectors V2 directed toward the attention image unit U_I, while the broken arrows indicate luminance gradient vectors V2 not directed toward the attention image unit U_I.
 In this case, the pupil position detection unit 4 calculates the number n to be 19, and calculates the evaluation value E according to the calculated number n, for example an evaluation value E based on n/N. The pupil position detection unit 4 detects the pupil position PP in the eye region image I2 based on the calculated evaluation values E. More specifically, the pupil position detection unit 4 extracts, from the plurality of evaluation values E corresponding to the plurality of image units U in the eye region image I2, evaluation values E that are equal to or greater than a predetermined threshold value, and detects the positions corresponding to the extracted evaluation values E as pupil positions PP. At this time, when there are a plurality of evaluation values E equal to or greater than the threshold value, the pupil position detection unit 4 may detect a plurality of pupil positions PP. Further, for each of the detected one or more pupil positions PP, the pupil position detection unit 4 detects the coordinate values C_X and C_Y indicating the position of the image unit U corresponding to the pupil position PP in the eye region image I2.
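A minimal sketch of this counting step is shown below, assuming a square evaluation region and a directed comparison between the luminance gradient angle θ and the angle θ' toward the attention image unit; the threshold Δθth, the region size, and whether the comparison is directed or treats the connecting line as undirected are not fixed by the text and are assumptions here. With half = 3 the region contains 48 evaluation image units, matching the example of FIG. 10.

```python
import math

def angle_diff_deg(a: float, b: float) -> float:
    """Smallest absolute difference between two angles, in degrees."""
    d = (a - b) % 360.0
    return min(d, 360.0 - d)

def evaluation_value(grad_angle_deg, cx: int, cy: int, half: int = 3, dtheta_th: float = 30.0) -> float:
    """Evaluation value E = n / N for the attention image unit at (cx, cy).

    grad_angle_deg[y][x] holds the luminance gradient angle θ of each image
    unit (for example from the gradient sketch above). The evaluation region
    EA is a square of side (2*half + 1) centred on (cx, cy); dtheta_th plays
    the role of Δθth.
    """
    height = len(grad_angle_deg)
    width = len(grad_angle_deg[0])
    n = 0
    total = 0
    for y in range(cy - half, cy + half + 1):
        for x in range(cx - half, cx + half + 1):
            if x == cx and y == cy:
                continue  # skip the attention image unit itself
            if not (0 <= x < width and 0 <= y < height):
                continue  # units outside the image are ignored in this sketch
            total += 1    # counts the evaluation image units U_E actually used
            # angle θ' of the straight line from this evaluation unit toward (cx, cy)
            theta_prime = math.degrees(math.atan2(cy - y, cx - x))
            if angle_diff_deg(grad_angle_deg[y][x], theta_prime) < dtheta_th:
                n += 1    # V2 is regarded as directed toward the attention unit
    return n / total if total else 0.0
```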
 Note that the pupil position detection unit 4 may detect, as the pupil position PP, the position corresponding to the maximum value (that is, a local maximum) among the plurality of evaluation values E corresponding to the plurality of image units U in the eye region image I2.
 Here, the principle by which the pupil position PP is detected based on the evaluation value E will be described.
 First, the eye region image I2 is usually composed of a region corresponding to the inside of the ocular fissure (hereinafter referred to as the "intraocular fissure region") and a region corresponding to the outside of the ocular fissure (hereinafter referred to as the "extraocular fissure region"). The extraocular fissure region is arranged around the intraocular fissure region. The intraocular fissure region includes a region corresponding to the pupil (hereinafter referred to as the "pupil region"), a region corresponding to the iris (hereinafter referred to as the "iris region"), and a region corresponding to the sclera (hereinafter referred to as the "sclera region"). The sclera region is arranged around the iris region, and the iris region is arranged around the pupil region. The shape of the iris region is circular, and the shape of the pupil region is circular.
 Second, the luminance in the pupil region is usually lower than the luminance in the iris region. For this reason, in the intraocular fissure region, an edge based on the discontinuity of luminance occurs at the boundary between the pupil region and the iris region. The luminance in the iris region is also lower than the luminance in the sclera region, so an edge based on the discontinuity of luminance likewise occurs at the boundary between the iris region and the sclera region.
 By these principles, the number n corresponding to an image unit U inside the pupil region is highly likely to be larger than the number n corresponding to an image unit U outside the pupil region. Therefore, the evaluation value E corresponding to an image unit U inside the pupil region is highly likely to be higher than the evaluation value E corresponding to an image unit U outside the pupil region. Thus, the pupil position PP can be detected by detecting evaluation values E that are equal to or greater than the threshold value.
 When the pupil position detection unit 4 detects a plurality of pupil positions, the second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions based on the positional relationship among the plurality of pupil positions. Based on the calculated degree of spread, the second reliability calculation unit 6 calculates the second reliability, which is an index of the certainty of the plurality of pupil positions. The second reliability calculation unit 6 outputs, to the line-of-sight angle calculation unit 11, information including the positions of the eye feature points, the plurality of pupil positions, the evaluation value E of each pupil position, and the second reliability common to the plurality of pupil positions.
 The evaluation method of pupil positions using the second reliability is based on the idea that if the distance-wise spread of the plurality of pupil positions is large, in other words, if the positional variation is large, there is a high possibility that the pupil position detection unit 4 has erroneously detected the pupil positions. Since the pupil position detection unit 4 attempts to detect pupil positions within the pupil region, it tends to detect a plurality of pupil positions whose evaluation values E are approximately the same within the pupil region. Therefore, when the distance-wise spread of the plurality of pupil positions detected by the pupil position detection unit 4 is small, those pupil positions are highly likely to exist within the pupil region, and they can be said to be highly reliable. On the other hand, when the distance-wise spread of the plurality of detected pupil positions is large, at least one of those pupil positions is highly likely to exist outside the pupil region, and they can be said to have low reliability. A method of calculating the second reliability will be described with reference to FIGS. 11A, 11B, 12A, and 12B.
 FIG. 11A is a diagram showing an example of three pupil positions PP1, PP2, and PP3 having a small degree of spread. FIG. 11B is an explanatory diagram showing an example of the method of calculating the second reliability in the example of FIG. 11A. As shown in FIG. 11A, assume that the pupil position detection unit 4 has detected three pupil positions PP1, PP2, and PP3 from the eye region image I2. In FIG. 11B, the pupil position image unit U_21 corresponds to the pupil position PP1, the pupil position image unit U_22 corresponds to the pupil position PP2, and the pupil position image unit U_23 corresponds to the pupil position PP3. First, the second reliability calculation unit 6 sets, on the eye region image I2, a circumscribed rectangle 21 that circumscribes the pupil position image units U_21, U_22, and U_23. Next, the second reliability calculation unit 6 calculates the diagonal length L1 based on the minimum coordinates (xmin, ymin) and the maximum coordinates (xmax, ymax) of the set circumscribed rectangle 21.
 FIG. 12A is a diagram showing an example of three pupil positions PP4, PP5, and PP6 having a large degree of spread. FIG. 12B is an explanatory diagram showing an example of the method of calculating the second reliability in the example of FIG. 12A. As shown in FIG. 12A, assume that the pupil position detection unit 4 has detected three pupil positions PP4, PP5, and PP6 from the eye region image I2. In FIG. 12B, the pupil position image unit U_24 corresponds to the pupil position PP4, the pupil position image unit U_25 corresponds to the pupil position PP5, and the pupil position image unit U_26 corresponds to the pupil position PP6. First, the second reliability calculation unit 6 sets, on the eye region image I2, a circumscribed rectangle 22 that circumscribes the pupil position image units U_24, U_25, and U_26. Next, the second reliability calculation unit 6 calculates the diagonal length L2 based on the minimum coordinates (xmin, ymin) and the maximum coordinates (xmax, ymax) of the set circumscribed rectangle 22.
 The diagonal length L1 is a value representing the degree of spread of the pupil position image units U_21, U_22, and U_23. The smaller the value of the diagonal length L1, the higher the reliability of the pupil positions PP1, PP2, and PP3 corresponding to the pupil position image units U_21, U_22, and U_23. Similarly, the diagonal length L2 is a value representing the degree of spread of the pupil position image units U_24, U_25, and U_26; the smaller the value of the diagonal length L2, the higher the reliability of the pupil positions PP4, PP5, and PP6 corresponding to the pupil position image units U_24, U_25, and U_26. The second reliability calculation unit 6 may use the reciprocal of the diagonal length L1 or L2 as the second reliability, or may calculate a second reliability according to the diagonal length L1 or L2. When the second reliability is calculated according to the diagonal length, the shorter the diagonal length, the larger the calculated second reliability, and the longer the diagonal length, the smaller the calculated second reliability.
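A minimal sketch of this spread-degree computation is given below; it uses the reciprocal of the diagonal length as the second reliability, which is one of the options mentioned above (the +1 guard in the denominator is an illustrative addition, not part of the disclosure).

```python
import math

def second_reliability(pupil_positions):
    """Second reliability from the spread of detected pupil positions.

    pupil_positions is a list of (x, y) coordinates (C_X, C_Y) of the detected
    pupil positions PP. The spread degree is the diagonal length of the axis-
    aligned rectangle circumscribing the positions.
    """
    xs = [p[0] for p in pupil_positions]
    ys = [p[1] for p in pupil_positions]
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)
    diagonal = math.hypot(xmax - xmin, ymax - ymin)  # degree of spread
    return 1.0 / (diagonal + 1.0)

# Tightly clustered candidates yield a higher reliability than scattered ones.
r_small = second_reliability([(10, 12), (11, 12), (10, 13)])
r_large = second_reliability([(5, 12), (18, 9), (11, 20)])
assert r_small > r_large
```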
 In the above description, the second reliability calculation unit 6 sets a circumscribed rectangle circumscribing the plurality of pupil positions, but a figure of a shape other than a rectangle may be set. In addition, although the second reliability calculation unit 6 calculates the diagonal length as the degree of spread, an index other than the diagonal length may be calculated as long as it is an index representing the degree of spread (variation) of the plurality of pupil positions.
 Note that when the pupil position detection unit 4 is configured to detect only the pupil position corresponding to the maximum evaluation value E among the plurality of evaluation values E, the second reliability calculation unit 6 may acquire, from the pupil position detection unit 4, a plurality of pupil positions corresponding to evaluation values E larger than a predetermined value among the plurality of evaluation values E. By calculating the degree of spread of the plurality of pupil positions acquired from the pupil position detection unit 4, the second reliability calculation unit 6 can, as a result, evaluate the reliability of the pupil position having the maximum evaluation value E detected by the pupil position detection unit 4.
 The line-of-sight angle calculation unit 11 acquires the information output by the pupil position detection device 1, that is, the information including the positions of the eye feature points, the plurality of pupil positions, the evaluation value E of each pupil position, and the second reliability common to the plurality of pupil positions. The line-of-sight angle calculation unit 11 calculates the line-of-sight angle φ of the subject using the acquired information. For example, when the pupil position detection device 1 outputs the above information only for the right eye, and the second reliability included in this information is equal to or greater than a predetermined threshold value, the line-of-sight angle calculation unit 11 selects the pupil position having the maximum evaluation value E from among the plurality of pupil positions and calculates the line-of-sight angle φ based on the selected pupil position. Further, for example, when the pupil position detection device 1 outputs the above information for the right eye and the above information for the left eye, and both the second reliability included in the right-eye information and the second reliability included in the left-eye information are equal to or greater than the threshold value, the line-of-sight angle calculation unit 11 selects the pupil position having the maximum evaluation value E from among the plurality of pupil positions included in the right-eye information and the plurality of pupil positions included in the left-eye information, and calculates the line-of-sight angle φ based on the selected pupil position. On the other hand, when only one of the second reliability included in the right-eye information and the second reliability included in the left-eye information is equal to or greater than the threshold value, the line-of-sight angle calculation unit 11 selects the pupil position having the maximum evaluation value E from among the plurality of pupil positions corresponding to the second reliability that is equal to or greater than the threshold value, and calculates the line-of-sight angle φ based on the selected pupil position. Note that when the second reliability is less than the threshold value, the line-of-sight angle calculation unit 11 does not calculate the line-of-sight angle φ based on that information.
 In the above description, the second reliability calculation unit 6 outputs information including the positions of the eye feature points, the plurality of pupil positions, the evaluation value E of each pupil position, and the second reliability common to the plurality of pupil positions; however, it may instead output information including the positions of the eye feature points, the pupil position having the maximum evaluation value E, and the second reliability. In this case, if the second reliability included in the information acquired from the pupil position detection device 1 is equal to or greater than the predetermined threshold value, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle φ based on the pupil position having the maximum evaluation value E.
 The line-of-sight angle φ in this example includes the line-of-sight angle φ_X with respect to the yaw direction and the line-of-sight angle φ_Y with respect to the pitch direction. A method of calculating the line-of-sight angle φ_X will be described with reference to FIG. 13, which is an explanatory diagram showing an example of the method of calculating the line-of-sight angle φ_X with respect to the yaw direction. A method of calculating the line-of-sight angle φ_Y will then be described with reference to FIG. 14, which is an explanatory diagram showing an example of the method of calculating the line-of-sight angle φ_Y with respect to the pitch direction.
 First, the line-of-sight angle calculation unit 11 calculates the positions P_FP1_X and P_FP2_X in the X direction using the information on the eye feature points. The position P_FP1_X corresponds to the outer eye corner FP1, and the position P_FP2_X corresponds to the inner eye corner FP2.
 Next, based on the calculated positions P_FP1_X and P_FP2_X, the line-of-sight angle calculation unit 11 calculates a position in the X direction (hereinafter referred to as the "first reference position") P_C_X. The first reference position P_C_X corresponds to the midpoint between the outer eye corner FP1 and the inner eye corner FP2; that is, the first reference position P_C_X corresponds to the center of the intraocular fissure region in the X direction.
 Next, the line-of-sight angle calculation unit 11 calculates the position P_PP_X in the X direction using the information on the pupil position. The position P_PP_X corresponds to the pupil position PP.
 Next, for the position P_FP1_X or the position P_FP2_X, the line-of-sight angle calculation unit 11 calculates the interval L_X_1 to the first reference position P_C_X and the interval L_X_2 to the position P_PP_X. FIG. 13 shows an example in which the position P_FP2_X is used as the reference for the intervals L_X_1 and L_X_2.
 A value indicating the maximum value φmax_X of the line-of-sight angle φ_X is set in advance in the line-of-sight angle calculation unit 11. Using the set maximum value φmax_X and the calculated intervals L_X_1 and L_X_2, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle φ_X by the following equation (1).
 φ_X = φmax_X × (L_X_1 - L_X_2) / L_X_1  (1)
 The maximum value φmax_X is based on, for example, the following model M_X. That is, in the model M_X, if the position P_PP_X is the same as the first reference position P_C_X, the line-of-sight angle φ_X is 0°. In the model M_X, if the position P_PP_X is the same as the position P_FP1_X, the line-of-sight angle φ_X takes a value corresponding to the maximum value φmax_X (for example, -60°). In the model M_X, if the position P_PP_X is the same as the position P_FP2_X, the line-of-sight angle φ_X takes a value corresponding to the maximum value φmax_X (for example, +60°).
 The line-of-sight angle calculation unit 11 also calculates the positions P_FP3_Y and P_FP4_Y in the Y direction using the information on the eye feature points. The position P_FP3_Y corresponds to the upper eyelid FP3, and the position P_FP4_Y corresponds to the lower eyelid FP4.
 Next, based on the calculated positions P_FP3_Y and P_FP4_Y, the line-of-sight angle calculation unit 11 calculates a position in the Y direction (hereinafter referred to as the "second reference position") P_C_Y. The second reference position P_C_Y corresponds to the midpoint between the upper eyelid FP3 and the lower eyelid FP4; that is, the second reference position P_C_Y corresponds to the center of the intraocular fissure region in the Y direction.
 Next, the line-of-sight angle calculation unit 11 calculates the position P_PP_Y in the Y direction using the information on the pupil position. The position P_PP_Y corresponds to the pupil position PP.
 Next, for the position P_FP3_Y or the position P_FP4_Y, the line-of-sight angle calculation unit 11 calculates the interval L_Y_1 to the second reference position P_C_Y and the interval L_Y_2 to the position P_PP_Y. FIG. 14 shows an example in which the position P_FP3_Y is used as the reference for the intervals L_Y_1 and L_Y_2.
 A value indicating the maximum value φmax_Y of the line-of-sight angle φ_Y is set in advance in the line-of-sight angle calculation unit 11. Using the set maximum value φmax_Y and the calculated intervals L_Y_1 and L_Y_2, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle φ_Y by the following equation (2).
 φ_Y = φmax_Y × (L_Y_1 - L_Y_2) / L_Y_1  (2)
 The maximum value φmax_Y is based on, for example, the following model M_Y. That is, in the model M_Y, if the position P_PP_Y is the same as the second reference position P_C_Y, the line-of-sight angle φ_Y is 0°. In the model M_Y, if the position P_PP_Y is the same as the position P_FP3_Y, the line-of-sight angle φ_Y takes a value corresponding to the maximum value φmax_Y (for example, +20°). In the model M_Y, if the position P_PP_Y is the same as the position P_FP4_Y, the line-of-sight angle φ_Y takes a value corresponding to the maximum value φmax_Y (for example, -20°).
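Equations (1) and (2) can be written down directly; the sketch below assumes the intervals are absolute distances along each axis and uses the ±60° and ±20° example values from the models M_X and M_Y.

```python
def gaze_angle_yaw(p_fp2_x: float, p_c_x: float, p_pp_x: float, phi_max_x: float = 60.0) -> float:
    """Line-of-sight angle φ_X from equation (1), using the inner eye corner
    position P_FP2_X as the reference, as in FIG. 13.

    L_X_1: interval between P_FP2_X and the first reference position P_C_X.
    L_X_2: interval between P_FP2_X and the pupil position P_PP_X.
    """
    l_x_1 = abs(p_c_x - p_fp2_x)
    l_x_2 = abs(p_pp_x - p_fp2_x)
    return phi_max_x * (l_x_1 - l_x_2) / l_x_1

def gaze_angle_pitch(p_fp3_y: float, p_c_y: float, p_pp_y: float, phi_max_y: float = 20.0) -> float:
    """Line-of-sight angle φ_Y from equation (2), using the upper eyelid
    position P_FP3_Y as the reference, as in FIG. 14."""
    l_y_1 = abs(p_c_y - p_fp3_y)
    l_y_2 = abs(p_pp_y - p_fp3_y)
    return phi_max_y * (l_y_1 - l_y_2) / l_y_1

# Sanity checks against the model M_X:
# pupil at the first reference position -> 0°, pupil at the inner eye corner -> +60°.
assert gaze_angle_yaw(p_fp2_x=100.0, p_c_x=80.0, p_pp_x=80.0) == 0.0
assert gaze_angle_yaw(p_fp2_x=100.0, p_c_x=80.0, p_pp_x=100.0) == 60.0
```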
 Note that the methods of calculating the line-of-sight angles φ_X and φ_Y are not limited to these specific examples. Various known techniques can be used to calculate the line-of-sight angles φ_X and φ_Y, and a detailed description of these techniques is omitted.
 In the above description, the line-of-sight detection device 10 includes the line-of-sight angle calculation unit 11, which calculates the line-of-sight angle φ of the subject; however, the configuration is not limited to this. For example, the line-of-sight detection device 10 may include a line-of-sight direction calculation unit (not shown), and this line-of-sight direction calculation unit may calculate the line-of-sight direction of the subject. Various known techniques can be used for the method of calculating the line-of-sight direction, and a detailed description of these techniques is omitted.
 At least one of the line-of-sight angle and the line-of-sight direction of the subject calculated by the line-of-sight detection device 10 is used, for example, to determine the state of the subject. Here, consider an example in which the line-of-sight detection device 10 and the image pickup device 12 are mounted on a vehicle and the image pickup device 12 captures images of the driver as the subject. In this example, the line-of-sight detection device 10 detects at least one of the line-of-sight angle and the line-of-sight direction of the driver who is the subject. A driver monitoring device (not shown) mounted on the vehicle then uses at least one of the driver's line-of-sight angle and line-of-sight direction to determine whether the driver's state is a predetermined state (hereinafter referred to as the "warning target state"). When the driver monitoring device determines that the driver is in the warning target state, it warns the driver by outputting a sound or the like. The warning target states include a state in which the driver is looking aside while driving, a state in which the driver has not checked behind at a timing when the driver should check behind, a state in which the driver's attentiveness is reduced, a state in which the driver is dozing, and a state in which the driver is incapable of driving. Various known techniques can be used for the determination of the warning target state, and a detailed description of these techniques is omitted.
 FIG. 15 is a flowchart showing an operation example of the line-of-sight detection device 10 according to Embodiment 1. The processing shown in FIG. 15 is repeatedly executed while a predetermined condition is satisfied (for example, while the ignition power of the vehicle on which the line-of-sight detection device 10 is mounted is on).
 First, the image acquisition unit 2 acquires the image I1 in which the subject is captured from the image pickup device 12 (step ST1). Next, the eye region extraction unit 3 extracts the eye region image I2 from the image I1 acquired by the image acquisition unit 2 (step ST2). Next, the pupil position detection unit 4 detects a plurality of pupil positions PP from the eye region image I2 extracted by the eye region extraction unit 3 (step ST3). Next, the second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions PP based on the positional relationship among the plurality of pupil positions PP detected by the pupil position detection unit 4, and calculates, based on the degree of spread, the second reliability that is an index of the certainty of the plurality of pupil positions PP (step ST4). Next, if the second reliability calculated by the second reliability calculation unit 6 is equal to or greater than a predetermined threshold value, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle φ of the subject based on the pupil position PP having the highest evaluation value E (step ST5).
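As a hedged sketch of how steps ST1 to ST5 fit together, the following function composes the earlier sketches; detect_pupil_candidates stands in for the evaluation-value-based detection of step ST3 and is not defined by the disclosure, and the reliability threshold is illustrative.

```python
def detect_gaze_once(image_i1, feature_points, reliability_th=0.05):
    """One pass over steps ST1 to ST5, composed from the sketches above.

    feature_points is assumed to be a dict holding the (x, y) positions of
    FP1 to FP4. detect_pupil_candidates is a hypothetical helper returning a
    list of (x, y, E) tuples for the candidate pupil positions.
    """
    eye_image = extract_eye_region(image_i1,
                                   feature_points["FP1"], feature_points["FP2"],
                                   feature_points["FP3"], feature_points["FP4"])  # ST2
    candidates = detect_pupil_candidates(eye_image)                               # ST3
    reliability = second_reliability([(x, y) for x, y, _ in candidates])          # ST4
    if reliability < reliability_th:
        return None  # second reliability too low: no line-of-sight angle is calculated
    best_x, best_y, _ = max(candidates, key=lambda c: c[2])
    # ST5: the line-of-sight angles would then follow from gaze_angle_yaw / gaze_angle_pitch
    return best_x, best_y, reliability
```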
 As described above, the pupil position detection device 1 according to Embodiment 1 includes the image acquisition unit 2, the eye region extraction unit 3, the pupil position detection unit 4, and the second reliability calculation unit 6. The image acquisition unit 2 acquires an image in which the subject is captured. The eye region extraction unit 3 extracts the eye region from the image acquired by the image acquisition unit 2. The pupil position detection unit 4 detects a plurality of pupil positions from the eye region extracted by the eye region extraction unit 3. The second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions based on the positional relationship among the plurality of pupil positions detected by the pupil position detection unit 4, and calculates, based on the degree of spread, the second reliability that is an index of the certainty of the plurality of pupil positions. Since the degree of spread of the plurality of pupil positions is not affected by the imaging environment, the pupil position detection device 1 can evaluate the reliability of the pupil positions without being affected by the imaging environment.
 The line-of-sight detection device 10 according to Embodiment 1 includes the pupil position detection device 1 and the line-of-sight angle calculation unit 11. The line-of-sight angle calculation unit 11 calculates the line-of-sight angle based on the plurality of pupil positions output by the pupil position detection device 1 and the second reliability common to the plurality of pupil positions. Since the line-of-sight detection device 10 calculates the line-of-sight angle based on a highly reliable pupil position evaluated by a method that is not affected by the imaging environment, it can calculate the line-of-sight angle with high accuracy.
実施の形態2.
 図16は、実施の形態2に係る視線検出装置10の構成例を示すブロック図である。実施の形態2に係る視線検出装置10は、図1に示された実施の形態1の視線検出装置10に対して第1信頼度算出部5及び第3信頼度算出部7が追加された構成である。図16において図1と同一又は相当する部分は、同一の符号を付し説明を省略する。
Embodiment 2.
FIG. 16 is a block diagram showing a configuration example of the line-of-sight detection device 10 according to the second embodiment. The line-of-sight detection device 10 according to the second embodiment has a configuration in which a first reliability calculation unit 5 and a third reliability calculation unit 7 are added to the line-of-sight detection device 10 of the first embodiment shown in FIG. 1. In FIG. 16, the same or corresponding parts as those in FIG. 1 are designated by the same reference numerals, and the description thereof is omitted.
 図16に示される第1信頼度算出部5及び第3信頼度算出部7は、図2Aに示された処理回路100により実現される。あるいは、第1信頼度算出部5及び第3信頼度算出部7の機能は、図2Bに示されたプロセッサ101がメモリ102に格納されるプログラムを実行することにより実現される。 The first reliability calculation unit 5 and the third reliability calculation unit 7 shown in FIG. 16 are realized by the processing circuit 100 shown in FIG. 2A. Alternatively, the functions of the first reliability calculation unit 5 and the third reliability calculation unit 7 are realized by the processor 101 shown in FIG. 2B executing a program stored in the memory 102.
 実施の形態2の瞳孔位置検出部4は、目特徴点の位置と複数の瞳孔位置とを含む情報を、第2信頼度算出部6に加え第1信頼度算出部5へも出力する。実施の形態2の第2信頼度算出部6は、目特徴点の位置と複数の瞳孔位置と当該複数の瞳孔位置に共通の第2信頼度とを含む情報を、第3信頼度算出部7へ出力する。 The pupil position detection unit 4 of the second embodiment outputs information including the positions of the eye feature points and the plurality of pupil positions not only to the second reliability calculation unit 6 but also to the first reliability calculation unit 5. The second reliability calculation unit 6 of the second embodiment outputs information including the positions of the eye feature points, the plurality of pupil positions, and the second reliability common to the plurality of pupil positions to the third reliability calculation unit 7.
 第1信頼度算出部5は、瞳孔位置検出部4が検出した瞳孔位置ごとに、当該瞳孔位置の画素と当該瞳孔位置の周囲の画素との輝度差を算出する。第1信頼度算出部5は、算出した輝度差に基づいて、当該瞳孔位置の確からしさの指標となる第1信頼度を算出する。第1信頼度算出部5は、目特徴点の位置と、複数の瞳孔位置と、瞳孔位置それぞれの第1信頼度とを含む情報を、第3信頼度算出部7へ出力する。第1信頼度を用いた瞳孔位置の評価方法は、瞳孔が周辺(虹彩等)に比べて暗いという考え方に基づいた評価方法である。図17及び図18を参照して、第1信頼度の算出方法の一例について説明する。 The first reliability calculation unit 5 calculates the brightness difference between the pixels at the pupil position and the pixels around the pupil position for each pupil position detected by the pupil position detection unit 4. The first reliability calculation unit 5 calculates the first reliability, which is an index of the certainty of the pupil position, based on the calculated luminance difference. The first reliability calculation unit 5 outputs information including the positions of the eye feature points, the plurality of pupil positions, and the first reliability of each pupil position to the third reliability calculation unit 7. The pupil position evaluation method using the first reliability is an evaluation method based on the idea that the pupil is darker than the periphery (iris, etc.). An example of the first reliability calculation method will be described with reference to FIGS. 17 and 18.
 図17は、第1信頼度算出領域U_10における瞳孔位置画像単位U_11とその周囲の輝度値分布の例を示す説明図である。図17において、瞳孔位置画像単位U_11は、瞳孔位置検出部4が検出した瞳孔位置PPに対応する画像単位Uである。画像単位Uにおける輝度値は、値が大きいほど明るく、値が小さいほど暗い。第1信頼度算出部5は、目領域画像I2に対して、瞳孔位置画像単位U_11を中心とする予め定められた領域の画像単位（以下「第1信頼度算出領域U_10」という。）を設定する。例えば、第1信頼度算出領域U_10は、半径が4画像単位の円形領域である。瞳孔は、円形であり、かつ外光環境により拡縮することから、第1信頼度算出領域U_10も円形であることが望ましく、かつ標準的な人の瞳孔のサイズと少なくとも同等のサイズ又はそれ以上のサイズであることが望ましい。 FIG. 17 is an explanatory diagram showing an example of the pupil position image unit U_11 in the first reliability calculation region U_10 and the distribution of brightness values around it. In FIG. 17, the pupil position image unit U_11 is the image unit U corresponding to the pupil position PP detected by the pupil position detection unit 4. As for the brightness value of an image unit U, a larger value means a brighter unit and a smaller value means a darker unit. The first reliability calculation unit 5 sets, for the eye region image I2, a group of image units in a predetermined region centered on the pupil position image unit U_11 (hereinafter referred to as the "first reliability calculation region U_10"). For example, the first reliability calculation region U_10 is a circular region having a radius of four image units. Since the pupil is circular and expands or contracts depending on the external light environment, the first reliability calculation region U_10 is desirably also circular, and is desirably at least equal to or larger than the size of a standard human pupil.
 図18は、第1信頼度算出領域U_10に設定された同心円状の画像単位群U_12,U_13,U_14,U_15の例を示す説明図である。第1信頼度算出部5は、第1信頼度算出領域U_10に対して、瞳孔位置画像単位U_11を中心とするm個の同心円状の画像単位群を設定する。mは、1以上の任意の整数である。図18においては、m=4であり、第1信頼度算出部5は、瞳孔位置画像単位U_11を中心とする4個の同心円状の画像単位群U_12,U_13,U_14,U_15を設定する。なお、1個の画像単位Uが1個の画素であれば、m個の画像単位群は「m個の画素群」になる。 FIG. 18 is an explanatory diagram showing an example of concentric image unit groups U_12, U_13, U_14, and U_15 set in the first reliability calculation area U_10. The first reliability calculation unit 5 sets m concentric image unit groups centered on the pupil position image unit U_11 with respect to the first reliability calculation region U_10. m is any integer greater than or equal to 1. In FIG. 18, m = 4, and the first reliability calculation unit 5 sets four concentric image unit groups U_12, U_13, U_14, and U_15 centered on the pupil position image unit U_11. If one image unit U is one pixel, the m image unit group becomes the "m pixel group".
 第1信頼度算出部5は、同心円状の画像単位群U_12の中から最も暗い画像単位U_12aを選択する。画像単位U_12aの輝度値は21である。また、第1信頼度算出部5は、同心円状の画像単位群U_13の中から最も暗い画像単位U_13aを選択する。画像単位U_13aの輝度値は24である。また、第1信頼度算出部5は、同心円状の画像単位群U_14の中から最も暗い画像単位U_14aを選択する。画像単位U_14aの輝度値は26である。また、第1信頼度算出部5は、同心円状の画像単位群U_15の中から最も暗い画像単位U_15aを選択する。画像単位U_15aの輝度値は51である。このように、第1信頼度算出部5は、m個の画像単位群からm個の画像単位を選択する。 The first reliability calculation unit 5 selects the darkest image unit U_12a from the concentric image unit group U_12. The brightness value of the image unit U_12a is 21. Further, the first reliability calculation unit 5 selects the darkest image unit U_13a from the concentric image unit group U_13. The brightness value of the image unit U_13a is 24. Further, the first reliability calculation unit 5 selects the darkest image unit U_14a from the concentric image unit group U_14. The brightness value of the image unit U_14a is 26. Further, the first reliability calculation unit 5 selects the darkest image unit U_15a from the concentric image unit group U_15. The brightness value of the image unit U_15a is 51. In this way, the first reliability calculation unit 5 selects m image units from the m image unit group.
 次いで、第1信頼度算出部5は、選択したm=4個の最も暗い画像単位U_12a,U_13a,U_14a,U_15aの中から、最も明るい画像単位U_15aを選択する。次いで、第1信頼度算出部5は、上記選択した最も明るい画像単位U_15aの輝度値51と、瞳孔位置画像単位U_11の輝度値21との輝度差30(=51-21)を算出する。瞳孔は周辺(虹彩等)に比べて暗いため、両輝度値の差が大きい場合、瞳孔位置画像単位U_11に対応する瞳孔位置PPの信頼性が高い。第1信頼度算出部5は、輝度差の値をそのまま第1信頼度としてもよいし、輝度差の値に応じた第1信頼度を算出してもよい。輝度差の値に応じた第1信頼度を算出する場合、輝度差の値が大きいほど、第1信頼度が大きい値に算出され、輝度差の値が小さいほど、第1信頼度が小さい値に算出される。 Next, the first reliability calculation unit 5 selects the brightest image unit U_15a from the selected m = 4 darkest image units U_12a, U_13a, U_14a, and U_15a. Next, the first reliability calculation unit 5 calculates the brightness difference of 30 (= 51 - 21) between the brightness value 51 of the selected brightest image unit U_15a and the brightness value 21 of the pupil position image unit U_11. Since the pupil is darker than its periphery (iris, etc.), a large difference between the two brightness values indicates that the pupil position PP corresponding to the pupil position image unit U_11 is highly reliable. The first reliability calculation unit 5 may use the value of the brightness difference as the first reliability as it is, or may calculate the first reliability according to the value of the brightness difference. When the first reliability is calculated according to the value of the brightness difference, a larger brightness difference yields a larger first reliability, and a smaller brightness difference yields a smaller first reliability.
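 The ring-based comparison of FIGS. 17 and 18 could be sketched as follows. This is a minimal Python sketch assuming an 8-bit grayscale eye-region image indexed as (row, column); building each concentric image-unit group from the rounded Euclidean distance is one possible reading of the figures, not a prescription of this disclosure, and the function name is hypothetical.

```python
import numpy as np

def first_reliability(eye_img, pupil_rc, m=4):
    """Darkest pixel on each of m concentric rings around the pupil pixel,
    then the brightness difference between the brightest of those minima
    and the pupil pixel itself (e.g. 51 - 21 = 30 in FIG. 18)."""
    r0, c0 = pupil_rc
    rows, cols = np.indices(eye_img.shape)
    ring_index = np.rint(np.hypot(rows - r0, cols - c0)).astype(int)
    ring_minima = []
    for r in range(1, m + 1):
        ring = eye_img[ring_index == r]
        if ring.size:                      # a ring may be clipped at the image border
            ring_minima.append(int(ring.min()))
    if not ring_minima:
        return 0
    diff = max(ring_minima) - int(eye_img[r0, c0])
    return max(diff, 0)                    # larger difference -> more pupil-like
```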
 瞳孔位置検出部4が複数個の瞳孔位置を検出した場合、第1信頼度算出部5は、複数個の瞳孔位置のそれぞれについて第1信頼度を算出する。 When the pupil position detection unit 4 detects a plurality of pupil positions, the first reliability calculation unit 5 calculates the first reliability for each of the plurality of pupil positions.
 第3信頼度算出部7は、第1信頼度算出部5が算出した第1信頼度と、第2信頼度算出部6が算出した第2信頼度とを用いて第3信頼度を算出する。第3信頼度算出部7は、目特徴点の位置と、複数の瞳孔位置と、瞳孔位置それぞれの第3信頼度とを含む情報を、視線角度算出部11へ出力する。図19を参照して、第3信頼度の算出方法について説明する。 The third reliability calculation unit 7 calculates the third reliability using the first reliability calculated by the first reliability calculation unit 5 and the second reliability calculated by the second reliability calculation unit 6. The third reliability calculation unit 7 outputs information including the positions of the eye feature points, the plurality of pupil positions, and the third reliability of each pupil position to the line-of-sight angle calculation unit 11. A method of calculating the third reliability will be described with reference to FIG. 19.
 図19は、第3信頼度の算出方法の例を示す説明図である。ここでは、瞳孔位置検出部4が、図12Aに示される瞳孔位置PP4,PP5,PP6を検出したものとする。第1信頼度算出部5により、瞳孔位置PP4の第1信頼度が60%、瞳孔位置PP5の第1信頼度が90%、瞳孔位置PP6の第1信頼度が55%と算出される。第2信頼度算出部6により、瞳孔位置PP4,PP5,PP6に共通する第2信頼度が30%と算出される。この場合、第3信頼度算出部7は、予め定められた第1信頼度の重み(例えば、0.8)と第2信頼度の重み(例えば、0.2)とを用いて、図19に示す如く瞳孔位置PP4の第3信頼度を54%、瞳孔位置PP5の第3信頼度を78%、瞳孔位置PP6の第3信頼度を50%と算出する。 FIG. 19 is an explanatory diagram showing an example of a method for calculating the third reliability. Here, it is assumed that the pupil position detection unit 4 has detected the pupil positions PP4, PP5, and PP6 shown in FIG. 12A. The first reliability calculation unit 5 calculates the first reliability of the pupil position PP4 as 60%, the first reliability of the pupil position PP5 as 90%, and the first reliability of the pupil position PP6 as 55%. The second reliability calculation unit 6 calculates the second reliability common to the pupil positions PP4, PP5, and PP6 as 30%. In this case, using a predetermined weight of the first reliability (for example, 0.8) and a predetermined weight of the second reliability (for example, 0.2), the third reliability calculation unit 7 calculates, as shown in FIG. 19, the third reliability of the pupil position PP4 as 54%, the third reliability of the pupil position PP5 as 78%, and the third reliability of the pupil position PP6 as 50%.
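 A minimal Python sketch of this weighted average, reproducing the example of FIG. 19 (the 0.8 and 0.2 weights are the example weights given above, not fixed values of this disclosure):

```python
def third_reliability(first, second, w_first=0.8, w_second=0.2):
    """Weighted average of a candidate's first reliability and the
    second reliability common to all candidates (values in %)."""
    return w_first * first + w_second * second

second = 30  # common to PP4, PP5, PP6
for name, first in (("PP4", 60), ("PP5", 90), ("PP6", 55)):
    print(name, third_reliability(first, second))  # PP4 54.0, PP5 78.0, PP6 50.0
```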
 上記説明では、第3信頼度算出部7は、第1信頼度と第2信頼度とを加重平均した加重平均値を第3信頼度として算出したが、単純平均等、他の方法により第3信頼度を算出してもよい。 In the above description, the third reliability calculation unit 7 calculates, as the third reliability, the weighted average of the first reliability and the second reliability; however, the third reliability may be calculated by another method such as a simple average.
 視線角度算出部11は、瞳孔位置検出装置1が出力する、目特徴点の位置と複数の瞳孔位置と瞳孔位置それぞれの第3信頼度とを含む情報を取得する。視線角度算出部11は、取得した情報を用いて、対象者の視線角度φを算出する。例えば、瞳孔位置検出装置1が右目のみに関する上記情報を出力した場合、視線角度算出部11は、この情報に含まれる複数の瞳孔位置の中から、第3信頼度が最も大きい瞳孔位置を選択し、選択した瞳孔位置の第3信頼度が予め定められた閾値以上であれば、当該選択した瞳孔位置に基づいて視線角度φを算出する。また、例えば、瞳孔位置検出装置1が右目に関する上記情報と左目に関する上記情報とを出力した場合、視線角度算出部11は、これらの情報に含まれる複数の瞳孔位置の中から、第3信頼度が最も大きい瞳孔位置を選択し、選択した瞳孔位置の第3信頼度が上記閾値以上であれば、当該選択した瞳孔位置に基づいて視線角度φを算出する。なお、視線角度算出部11は、上記閾値以上となる第3信頼度をもつ瞳孔位置が存在しない場合、視線角度φの算出を行わない。 The line-of-sight angle calculation unit 11 acquires the information output by the pupil position detection device 1, which includes the positions of the eye feature points, the plurality of pupil positions, and the third reliability of each pupil position. The line-of-sight angle calculation unit 11 calculates the line-of-sight angle φ of the subject using the acquired information. For example, when the pupil position detection device 1 outputs the above information regarding only the right eye, the line-of-sight angle calculation unit 11 selects, from the plurality of pupil positions included in this information, the pupil position having the largest third reliability, and if the third reliability of the selected pupil position is equal to or greater than a predetermined threshold value, calculates the line-of-sight angle φ based on the selected pupil position. Further, for example, when the pupil position detection device 1 outputs the above information regarding the right eye and the above information regarding the left eye, the line-of-sight angle calculation unit 11 selects, from the plurality of pupil positions included in these pieces of information, the pupil position having the largest third reliability, and if the third reliability of the selected pupil position is equal to or greater than the above threshold value, calculates the line-of-sight angle φ based on the selected pupil position. The line-of-sight angle calculation unit 11 does not calculate the line-of-sight angle φ when there is no pupil position having a third reliability equal to or greater than the above threshold value.
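 The selection logic described here might look like the following sketch (the candidate tuple layout, the function name, and the threshold value of 0.6 are assumptions for illustration; the disclosure only states that a predetermined threshold is used):

```python
def select_pupil(candidates, threshold=0.6):
    """candidates: list of (x, y, third_reliability) gathered from one or
    both eyes. Returns the (x, y) of the most reliable candidate, or None
    when no candidate reaches the threshold (no gaze angle is computed)."""
    if not candidates:
        return None
    best = max(candidates, key=lambda c: c[2])
    if best[2] < threshold:
        return None
    return best[0], best[1]
```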
 図16に示された瞳孔位置検出装置1は、第1信頼度と第2信頼度とから第3信頼度を算出して当該第3信頼度を視線角度算出部11へ出力する構成であったが、第1信頼度と第2信頼度とを視線角度算出部11へ出力する構成であってもよい。この構成の場合、瞳孔位置検出装置1は第3信頼度算出部7を備えない。また、この構成の場合、例えば、ある目領域画像I2について算出された第2信頼度が予め定められた閾値未満である場合、視線角度算出部11は、当該目領域画像I2を破棄し、視線角度φの算出を行わない。一方、当該目領域画像I2について算出された第2信頼度が上記閾値以上である場合、視線角度算出部11は、当該目領域画像I2から検出された瞳孔位置のうちの第1信頼度が最も大きい瞳孔位置に基づいて視線角度φを算出する。 The pupil position detection device 1 shown in FIG. 16 is configured to calculate the third reliability from the first reliability and the second reliability and output the third reliability to the line-of-sight angle calculation unit 11; however, it may be configured to output the first reliability and the second reliability to the line-of-sight angle calculation unit 11. In this configuration, the pupil position detection device 1 does not include the third reliability calculation unit 7. Further, in this configuration, for example, when the second reliability calculated for a certain eye region image I2 is less than a predetermined threshold value, the line-of-sight angle calculation unit 11 discards the eye region image I2 and does not calculate the line-of-sight angle φ. On the other hand, when the second reliability calculated for the eye region image I2 is equal to or greater than the above threshold value, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle φ based on the pupil position having the largest first reliability among the pupil positions detected from the eye region image I2.
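 In this variant without the third reliability calculation unit 7, the gating and selection could be sketched as follows (again only an illustrative reading; the data layout, function name, and threshold value are assumed):

```python
def select_pupil_without_third(candidates, second_reliability, threshold=0.6):
    """Gate the whole eye-region image on the common second reliability,
    then pick the candidate with the largest first reliability.
    candidates: list of (x, y, first_reliability)."""
    if second_reliability < threshold or not candidates:
        return None                      # the eye-region image is discarded
    best = max(candidates, key=lambda c: c[2])
    return best[0], best[1]
```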
 図20は、実施の形態2に係る視線検出装置10の動作例を示すフローチャートである。図20に示される処理は、所定の条件が満たされているとき(例えば、視線検出装置10が搭載された車両のイグニッション電源がオンされているとき)、繰り返し実行される。なお、図20のステップST1~ST4の動作は、図15のステップST1~ST4の動作と同じであるため、説明を省略する。 FIG. 20 is a flowchart showing an operation example of the line-of-sight detection device 10 according to the second embodiment. The process shown in FIG. 20 is repeatedly executed when a predetermined condition is satisfied (for example, when the ignition power of the vehicle equipped with the line-of-sight detection device 10 is turned on). Since the operations of steps ST1 to ST4 in FIG. 20 are the same as the operations of steps ST1 to ST4 in FIG. 15, the description thereof will be omitted.
 ステップST11において、第1信頼度算出部5は、瞳孔位置検出部4により検出された瞳孔位置PPごとに、当該瞳孔位置PPの画素と当該瞳孔位置PPの周囲の画素との輝度差を算出し、当該輝度差に基づいて当該瞳孔位置PPの確からしさの指標となる第1信頼度を算出する。なお、ステップST4の動作とステップST11の動作は、逆の順番で行われてもよいし、並列に行われてもよい。 In step ST11, the first reliability calculation unit 5 calculates, for each pupil position PP detected by the pupil position detection unit 4, the brightness difference between the pixel at the pupil position PP and the pixels around the pupil position PP, and calculates, based on the brightness difference, the first reliability serving as an index of the certainty of the pupil position PP. The operation of step ST4 and the operation of step ST11 may be performed in the reverse order, or may be performed in parallel.
 次いで、第3信頼度算出部7が、第1信頼度算出部5により算出された瞳孔位置PPごとの第1信頼度と、第2信頼度算出部6により算出された複数の瞳孔位置PPに共通する第2信頼度との加重平均を算出して第3信頼度を求める(ステップST12)。次いで、視線角度算出部11が、瞳孔位置検出部4により検出された複数の瞳孔位置PPのうち、第3信頼度算出部7により算出された第3信頼度の値が最も高い瞳孔位置PPに基づいて、対象者の視線角度φを算出する(ステップST13)。 Next, the third reliability calculation unit 7 calculates the weighted average of the first reliability of each pupil position PP calculated by the first reliability calculation unit 5 and the second reliability common to the plurality of pupil positions PP calculated by the second reliability calculation unit 6, thereby obtaining the third reliability (step ST12). Next, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle φ of the subject based on the pupil position PP having the highest third reliability calculated by the third reliability calculation unit 7 among the plurality of pupil positions PP detected by the pupil position detection unit 4 (step ST13).
 以上のように、実施の形態2に係る瞳孔位置検出装置1は、第1信頼度算出部5を備える。第1信頼度算出部5は、瞳孔位置検出部4により検出された瞳孔位置ごとに、当該瞳孔位置の画素と当該瞳孔位置の周囲の画素との輝度差を算出し、当該輝度差に基づいて瞳孔位置の確からしさの指標となる第1信頼度を算出する。目領域における各画素の輝度値が撮像環境の影響を受けたとしても、瞳孔位置の画素と当該瞳孔位置の周囲の画素との輝度差は、撮像環境の影響を受けないため、瞳孔位置検出装置1は、撮像環境の影響を受けることなく瞳孔位置の信頼性を評価できる。 As described above, the pupil position detection device 1 according to the second embodiment includes the first reliability calculation unit 5. The first reliability calculation unit 5 calculates, for each pupil position detected by the pupil position detection unit 4, the brightness difference between the pixel at the pupil position and the pixels around the pupil position, and calculates, based on the brightness difference, the first reliability serving as an index of the certainty of the pupil position. Even if the brightness value of each pixel in the eye region is affected by the imaging environment, the brightness difference between the pixel at the pupil position and the pixels around the pupil position is not affected by the imaging environment, so the pupil position detection device 1 can evaluate the reliability of the pupil position without being affected by the imaging environment.
 また、実施の形態2に係る瞳孔位置検出装置1は、第3信頼度算出部7を備える。第3信頼度算出部7は、第1信頼度算出部5により算出された第1信頼度と、第2信頼度算出部6により算出された第2信頼度とを用いて第3信頼度を算出する。評価方法が異なる第1信頼度と第2信頼度とを組み合わせることで、瞳孔位置検出装置1は、瞳孔位置の信頼性をより精度よく評価できる。 Further, the pupil position detection device 1 according to the second embodiment includes a third reliability calculation unit 7. The third reliability calculation unit 7 calculates the third reliability using the first reliability calculated by the first reliability calculation unit 5 and the second reliability calculated by the second reliability calculation unit 6. calculate. By combining the first reliability and the second reliability with different evaluation methods, the pupil position detecting device 1 can evaluate the reliability of the pupil position more accurately.
 上記説明では、画像取得部2、目領域抽出部3、瞳孔位置検出部4、第1信頼度算出部5、第2信頼度算出部6、第3信頼度算出部7、及び視線角度算出部11の機能が、車両に搭載される車載器に集約された構成であったが、ネットワーク上のサーバ装置、スマートフォン等の携帯端末、及び車載器等に分散されていてもよい。 In the above description, the functions of the image acquisition unit 2, the eye region extraction unit 3, the pupil position detection unit 4, the first reliability calculation unit 5, the second reliability calculation unit 6, the third reliability calculation unit 7, and the line-of-sight angle calculation unit 11 are integrated in an in-vehicle device mounted on a vehicle; however, they may be distributed among a server device on a network, a mobile terminal such as a smartphone, an in-vehicle device, and the like.
 なお、本開示はその開示の範囲内において、各実施の形態の自由な組み合わせ、各実施の形態の任意の構成要素の変形、又は各実施の形態の任意の構成要素の省略が可能である。 It should be noted that, within the scope of the present disclosure, free combinations of the embodiments, modification of any component of each embodiment, or omission of any component of each embodiment is possible.
 本開示に係る視線検出装置は、車両に搭乗している運転者の状態を監視する運転者監視装置、及び車両に搭乗している運転者を含む各乗員の状態を監視する乗員監視装置等に用いるのに適している。 The line-of-sight detection device according to the present disclosure is suitable for use in a driver monitoring device that monitors the state of a driver in a vehicle, an occupant monitoring device that monitors the state of each occupant including the driver in a vehicle, and the like.
 1 瞳孔位置検出装置、2 画像取得部、3 目領域抽出部、4 瞳孔位置検出部、5 第1信頼度算出部、6 第2信頼度算出部、7 第3信頼度算出部、10 視線検出装置、11 視線角度算出部、12 撮像装置、21,22 外接矩形、100 処理回路、101 プロセッサ、102 メモリ。 1 pupil position detection device, 2 image acquisition unit, 3 eye region extraction unit, 4 pupil position detection unit, 5 first reliability calculation unit, 6 second reliability calculation unit, 7 third reliability calculation unit, 10 line-of-sight detection device, 11 line-of-sight angle calculation unit, 12 image pickup device, 21, 22 circumscribed rectangle, 100 processing circuit, 101 processor, 102 memory.

Claims (7)

  1.  対象者が撮像された画像を取得する画像取得部と、
     前記画像取得部により取得された画像から目領域を抽出する目領域抽出部と、
     前記目領域抽出部により抽出された目領域から複数の瞳孔位置を検出する瞳孔位置検出部と、
     前記瞳孔位置検出部により検出された複数の瞳孔位置の位置関係に基づいて当該複数の瞳孔位置の広がり度を算出し、当該広がり度に基づいて当該複数の瞳孔位置の確からしさの指標となる第2信頼度を算出する第2信頼度算出部とを備える瞳孔位置検出装置。
    A pupil position detection device comprising:
    an image acquisition unit that acquires an image in which a subject is captured;
    an eye region extraction unit that extracts an eye region from the image acquired by the image acquisition unit;
    a pupil position detection unit that detects a plurality of pupil positions from the eye region extracted by the eye region extraction unit; and
    a second reliability calculation unit that calculates a degree of spread of the plurality of pupil positions based on a positional relationship among the plurality of pupil positions detected by the pupil position detection unit, and calculates, based on the degree of spread, a second reliability serving as an index of a certainty of the plurality of pupil positions.
  2.  前記第2信頼度算出部は、前記複数の瞳孔位置に外接する外接矩形を設定し、当該外接矩形の対角線の長さを広がり度とすることを特徴とする請求項1記載の瞳孔位置検出装置。 The pupil position detection device according to claim 1, wherein the second reliability calculation unit sets a circumscribed rectangle circumscribing the plurality of pupil positions and takes the length of a diagonal of the circumscribed rectangle as the degree of spread.
  3.  前記瞳孔位置検出部により検出された瞳孔位置ごとに、当該瞳孔位置の画素と当該瞳孔位置の周囲の画素との輝度差を算出し当該輝度差に基づいて当該瞳孔位置の確からしさの指標となる第1信頼度を算出する第1信頼度算出部を備えることを特徴とする請求項1記載の瞳孔位置検出装置。 The pupil position detection device according to claim 1, further comprising a first reliability calculation unit that calculates, for each pupil position detected by the pupil position detection unit, a brightness difference between the pixel at the pupil position and pixels around the pupil position, and calculates, based on the brightness difference, a first reliability serving as an index of the certainty of the pupil position.
  4.  前記第1信頼度算出部は、前記瞳孔位置検出部により検出された瞳孔位置を中心とするm個(mは1以上の整数)の同心円状の画素群を設定し、当該m個の画素群のそれぞれから最も暗い画素を選択し、選択したm個の画素の中の最も明るい画素と当該瞳孔位置の画素との輝度差を算出することを特徴とする請求項3記載の瞳孔位置検出装置。 The pupil position detection device according to claim 3, wherein the first reliability calculation unit sets m (m is an integer of 1 or more) concentric pixel groups centered on the pupil position detected by the pupil position detection unit, selects the darkest pixel from each of the m pixel groups, and calculates the brightness difference between the brightest pixel among the selected m pixels and the pixel at the pupil position.
  5.  前記第1信頼度算出部により算出された第1信頼度と、前記第2信頼度算出部により算出された第2信頼度とを用いて第3信頼度を算出する第3信頼度算出部を備えることを特徴とする請求項3記載の瞳孔位置検出装置。 The pupil position detection device according to claim 3, further comprising a third reliability calculation unit that calculates a third reliability using the first reliability calculated by the first reliability calculation unit and the second reliability calculated by the second reliability calculation unit.
  6.  請求項1記載の瞳孔位置検出装置と、
     前記瞳孔位置検出装置が出力する複数の瞳孔位置と当該複数の瞳孔位置に共通する第2信頼度とに基づき視線角度を算出する視線角度算出部とを備える視線検出装置。
    A line-of-sight detection device comprising:
    the pupil position detection device according to claim 1; and
    a line-of-sight angle calculation unit that calculates a line-of-sight angle based on the plurality of pupil positions output by the pupil position detection device and the second reliability common to the plurality of pupil positions.
  7.  画像取得部が、対象者が撮像された画像を取得し、
     目領域抽出部が、前記画像取得部により取得された画像から目領域を抽出し、
     瞳孔位置検出部が、前記目領域抽出部により抽出された目領域から複数の瞳孔位置を検出し、
     第2信頼度算出部が、前記瞳孔位置検出部により検出された複数の瞳孔位置の位置関係に基づいて当該複数の瞳孔位置の広がり度を算出し、当該広がり度に基づいて当該複数の瞳孔位置の確からしさの指標となる第2信頼度を算出する瞳孔位置検出方法。
    A pupil position detection method comprising:
    acquiring, by an image acquisition unit, an image in which a subject is captured;
    extracting, by an eye region extraction unit, an eye region from the image acquired by the image acquisition unit;
    detecting, by a pupil position detection unit, a plurality of pupil positions from the eye region extracted by the eye region extraction unit; and
    calculating, by a second reliability calculation unit, a degree of spread of the plurality of pupil positions based on a positional relationship among the plurality of pupil positions detected by the pupil position detection unit, and calculating, based on the degree of spread, a second reliability serving as an index of a certainty of the plurality of pupil positions.
PCT/JP2020/007602 2020-02-26 2020-02-26 Pupil position detection device, visual line detection appratus, and pupil position detection method WO2021171396A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/007602 WO2021171396A1 (en) 2020-02-26 2020-02-26 Pupil position detection device, visual line detection appratus, and pupil position detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/007602 WO2021171396A1 (en) 2020-02-26 2020-02-26 Pupil position detection device, visual line detection appratus, and pupil position detection method

Publications (1)

Publication Number Publication Date
WO2021171396A1 true WO2021171396A1 (en) 2021-09-02

Family

ID=77489979

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/007602 WO2021171396A1 (en) 2020-02-26 2020-02-26 Pupil position detection device, visual line detection appratus, and pupil position detection method

Country Status (1)

Country Link
WO (1) WO2021171396A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006025969A (en) * 2004-07-14 2006-02-02 Matsushita Electric Ind Co Ltd Pupil detecting device, and iris certifying apparatus
WO2010035472A1 (en) * 2008-09-26 2010-04-01 パナソニック株式会社 Line-of-sight direction determination device and line-of-sight direction determination method
US20140098198A1 (en) * 2012-10-09 2014-04-10 Electronics And Telecommunications Research Institute Apparatus and method for eye tracking
JP2017010337A (en) * 2015-06-23 2017-01-12 富士通株式会社 Pupil detection program, pupil detection method, pupil detection apparatus and line of sight detection system
JP2017182739A (en) * 2016-03-31 2017-10-05 富士通株式会社 Gaze detection device, gaze detection method and computer program for gaze detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TIMM FABIAN ET AL.: "ACCURATE EYE CENTRE LOCALISATION BY MEANS OF GRADIENTS", PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON COMPUTER VISION THEORY, APPLICATIONS, 7 March 2011 (2011-03-07), pages 125 - 130, XP055208015 *

Similar Documents

Publication Publication Date Title
ES2756677T3 (en) Image processing apparatus, image processing procedure and program
EP3671313B1 (en) Gaze tracking using mapping of pupil center position
US10842364B2 (en) Image processing device, endoscope apparatus, information storage device, and image processing method
JP6548171B2 (en) Pupil detection system, gaze detection system, pupil detection method, and pupil detection program
JP6582604B2 (en) Pupil detection program, pupil detection method, pupil detection device, and gaze detection system
US10163027B2 (en) Apparatus for and method of processing image based on object region
US7599524B2 (en) Method and apparatus for providing a robust object finder
TWI694809B (en) Method for detecting eyeball movement, program thereof, storage media for the program and device for detecting eyeball movement
EP1499110A2 (en) Detecting and correcting red-eye in a digital-image
US10417495B1 (en) Systems and methods for determining biometric information
US10809800B2 (en) Robust convergence signal
JP5472463B2 (en) Image processing program and image processing apparatus
WO2020093566A1 (en) Cerebral hemorrhage image processing method and device, computer device and storage medium
JP2012239550A (en) Corneal reflection determining program, corneal reflection determining device, and method for determining corneal reflection
WO2019024350A1 (en) Biometric recognition method and apparatus
WO2021171395A1 (en) Pupil position detection device, sight line detection device, and pupil position detection method
JP6906943B2 (en) On-board unit
WO2021171396A1 (en) Pupil position detection device, visual line detection appratus, and pupil position detection method
CN114429670A (en) Pupil detection method, device, equipment and storage medium
JP7278425B2 (en) Pupil detection device, line-of-sight detection device, passenger monitoring system, and pupil detection method
JP2009059165A (en) Outline detection apparatus, sight line detection apparatus using the same, program for causing computer to remove false outline data, program for causing computer to detect sight line direction, and computer-readable recording medium with the program recorded
CN111401223B (en) Face shape comparison method, device and equipment
CN115484860A (en) Real-time detection and correction of shadows in hyperspectral retinal images
CN110929672A (en) Pupil positioning method and electronic equipment
JP5699497B2 (en) Image processing apparatus and image processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20920897

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 20920897

Country of ref document: EP

Kind code of ref document: A1