WO2023112606A1 - Face direction determination system

Info

Publication number: WO2023112606A1
Application number: PCT/JP2022/042961
Authority: WO (WIPO, PCT)
Prior art keywords: face, person, detection unit, eye, distance
Other languages: French (fr), Japanese (ja)
Inventor: 俊剛 関
Original assignee / applicant: Yazaki Corporation (矢崎総業株式会社)
Publication: WO2023112606A1 (patent application, English)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras

Definitions

  • the present invention relates to a face orientation determination system.
  • Patent Document 1 describes a driver monitoring device that estimates the driver's face orientation by detecting the three-dimensional position of at least one of the driver's eyebrows, eye corners, and mouth as facial feature points.
  • the driver monitoring device described in Patent Document 1 may be unable to estimate the driver's face orientation when, for example, only some of the facial feature points can be detected, such as when the driver looks aside.
  • the present invention has been made in view of the above, and its object is to provide a face orientation determination system that can appropriately determine the orientation of a person's face.
  • a face orientation detection system includes an imaging unit that captures an image of a person's face; an eye corner detection unit that detects the positions of the person's eye corners based on the image of the person's face captured by the imaging unit; and a face orientation detection unit that detects the orientation of the person's face based on the eye corner positions detected by the eye corner detection unit and the position of a head rotation reference about which the person's head turns.
  • the face orientation detection system can determine the orientation of a person's face even when only one eye corner can be detected, such as when the person looks aside; as a result, the orientation of the person's face can be determined appropriately.
  • FIG. 1 is a block diagram showing a configuration example of a face orientation detection system according to an embodiment.
  • FIG. 2 is a diagram showing the range of the head axis according to the embodiment.
  • FIG. 3 is a diagram showing a calculation example of the position of the head axis according to the embodiment.
  • FIG. 4 is a diagram illustrating an example of calculation of the face orientation angle in the vertical direction according to the embodiment.
  • FIG. 5 is a diagram illustrating an example of calculation of the face orientation angle in the vertical direction according to the embodiment.
  • FIG. 6 is a diagram illustrating a calculation example of the face direction angle in the horizontal direction according to the embodiment.
  • FIG. 7 is a diagram illustrating a calculation example of the face direction angle in the horizontal direction according to the embodiment.
  • FIG. 8 is a diagram illustrating an example of determination of left and right facing directions according to the embodiment.
  • FIG. 9 is a diagram showing the positional relationship between the facial image capturing camera and the driver according to the embodiment.
  • FIG. 10 is a diagram illustrating an update example (increase) of parameters according to the embodiment.
  • FIG. 11 is a diagram illustrating an example of parameter update (decrease) according to the embodiment.
  • FIG. 12 is a diagram showing the installation angles of the facial image acquisition camera in the horizontal direction according to the embodiment.
  • FIG. 13 is a diagram showing installation angles of the face image capturing camera in the vertical direction according to the embodiment.
  • FIG. 14 is a diagram illustrating average errors of face direction detection results (angles) according to the embodiment.
  • FIG. 15 is a flowchart showing inattentiveness detection processing according to the embodiment.
  • FIG. 16 is a flowchart showing face orientation detection processing according to the embodiment.
  • FIG. 17 is a flow chart showing processing for calculating the position of the head axis according to the embodiment.
  • FIG. 18 is a flowchart showing parameter update processing according to the embodiment.
  • FIG. 19 is a flowchart showing processing for calculating the position of the head axis according to the first modified example of the embodiment.
  • FIG. 20 is a flow chart showing calculation processing of the position of the head axis according to the second modification of the embodiment.
  • FIG. 21 is a flow chart showing processing for calculating the position of the head axis according to the third modification of the embodiment.
  • the face orientation detection system 1 detects the orientation of a person's face. For example, it is installed in a vehicle and detects the orientation of the face of a driver sitting in the driver's seat of the vehicle.
  • the face orientation detection system 1 includes, for example, a face image acquisition camera 10 as an imaging section, a facial feature point detection unit 20 as an eye corner detection section, and a face orientation calculation unit 30, as shown in FIG.
  • with the driver facing forward, the direction along the line connecting the driver's left and right eye corners E is referred to as the left-right direction X, the vertical direction is referred to as the up-down direction Y, and the direction along the driver's line of sight is referred to as the depth direction Z.
  • the left-right direction X, the up-down direction Y, and the depth direction Z intersect one another and are typically orthogonal.
  • the face image acquisition camera 10 captures the driver's face; for example, it is mounted on a vehicle and captures the face of the driver seated in the driver's seat of the vehicle.
  • the face image acquisition camera 10 is arranged in front of the driver, for example, on the vehicle's dashboard or at the front of the vehicle's roof, is positioned so that the center of the driver's eyellipse (the statistical ellipse of driver eye locations) coincides with the center of the imaging range, and images the driver's face from the front of the vehicle.
  • the face image acquisition camera 10 is connected to the facial feature point detection unit 20 and outputs image data (moving images (video) or still images) representing the captured driver's face to the facial feature point detection unit 20.
  • the facial feature point detection unit 20 detects facial feature points from image data.
  • the facial feature point detection unit 20 detects facial feature points from the image data using, for example, a known image recognition technique (machine learning) such as OpenCV.
  • the facial feature point detection unit 20 detects, for example, the left and right corners of the eyes E, the left and right pupils, etc. as facial feature points from the image data, and detects the positions of these facial parts.
  • the facial feature point detection unit 20 treats information on the position of the detected facial part as coordinate information.
  • the facial feature point detection unit 20 is connected to the face direction calculation unit 30 and outputs information representing the detected positions of the left and right corners of the eye E (information representing the position coordinates of the left and right corners of the eye E) to the face direction calculation unit 30 .
  • the face orientation calculation unit 30 calculates the orientation of the driver's face and includes an electronic circuit built mainly around a well-known microcomputer with a CPU, ROM and RAM constituting the memory, and an interface.
  • the face orientation calculation unit 30 includes an eye corner distance calculation unit 31, a storage unit 32, a face orientation detection unit 33, and a parameter update unit 34, and handles facial part position information as coordinate information.
  • the eye corner distance calculation unit 31 calculates the distance d0 (see FIG. 3) between the left and right eye corners E when the person is facing forward.
  • the eye corner distance calculation unit 31 is connected to the facial feature point detection unit 20; based on the information representing the positions of the left and right eye corners E (information representing their position coordinates) output from the facial feature point detection unit 20, it calculates the distance d0 between the left and right eye corners E (the left eye corner LE and the right eye corner RE) and the center position of the left and right eye corners E.
  • for example, as shown in FIG. 3, the eye corner distance calculation unit 31 calculates the distance d0 between the position coordinates (xLE0, yLE0) of the left eye corner LE and the position coordinates (xRE0, yRE0) of the right eye corner RE, and also calculates the center position coordinates (xC0, yC0) between them.
  • the eye corner distance calculation unit 31 is connected to the face orientation detection unit 33 and outputs the calculated distance d0 and the center position coordinates (xC0, yC0) to the face orientation detection unit 33.
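  • as a non-authoritative illustration, the computation above can be sketched as follows; the function name and the sample coordinates are assumptions, since the text only states that a distance d0 and a center position (xC0, yC0) are computed from the two eye-corner coordinates.

```python
import math

def eye_corner_distance_and_center(le, re):
    """le, re: (x, y) position coordinates of the left/right eye corners LE, RE."""
    d0 = math.hypot(le[0] - re[0], le[1] - re[1])            # distance d0 between LE and RE
    center = ((le[0] + re[0]) / 2.0, (le[1] + re[1]) / 2.0)  # center position (xC0, yC0)
    return d0, center

# illustrative coordinates only
d0, (x_c0, y_c0) = eye_corner_distance_and_center((412.0, 233.0), (318.0, 230.0))
```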
  • the storage unit 32 stores information, and includes, for example, volatile memory such as RAM, and nonvolatile memory such as ROM and flash memory.
  • the storage unit 32 stores, for example, conditions and information necessary for various processes in the face orientation calculation unit 30, various programs and applications to be executed in the face orientation calculation unit 30, control data, and the like.
  • the storage unit 32 stores predetermined standard human body dimensions based on a human body dimension database that statistically represents human body dimensions.
  • the storage unit 32 stores, as standard human body dimensions such as those shown in FIG. 2, a standard distance d between the left and right eye corners E, a standard distance w in the depth direction Z between the center position of the left and right eye corners E and the axis G, and a standard distance h in the up-down direction Y between the center position of the left and right eye corners E and the axis G.
  • the storage unit 32 also stores the position of the head axis P (standard head axis P) determined from the standard distances d, w, and h.
  • the face direction detection unit 33 detects the face direction of the driver.
  • the face orientation detection unit 33 detects the orientation of the driver's face, for example, based on the position of the outer corner of the eye E detected by the facial feature point detection unit 20 and the position of the head axis P as a head rotation reference.
  • the head axis P is a reference portion for turning the driver's head, and is set between the axis G and the ischium J as shown in FIG. In other words, the driver's head pivots about the head axis P.
  • the face orientation detection unit 33 performs personalization processing for adjusting the standard head axis P stored in the storage unit 32 to match the individual.
  • specifically, the face orientation detection unit 33 detects the orientation of the driver's face based on the corrected position of the head axis P, obtained by correcting the position of the predetermined standard head axis P according to the distance d0 between the driver's left and right eye corners E, which is obtained from the face image captured by the face image acquisition camera 10 while the driver faces forward.
  • the face orientation detection unit 33 uses Equation (1), for example, to obtain the X coordinate xA (= xC0) of the head axis P from the position coordinates (xLE0, yLE0) of the left eye corner LE and the position coordinates (xRE0, yRE0) of the right eye corner RE.
  • the face orientation detection unit 33 also obtains the Y coordinate yA of the head axis P using Equations (2) to (4).
  • using Equation (2), the face orientation detection unit 33 obtains the parameter Ph from the standard distance d between the left and right eye corners E and the standard distance h in the up-down direction Y between the position of the eye corners E (for example, the center position of the left and right eye corners E) and the standard head axis P; using Equation (3), it obtains the corrected distance h0 from the distance d0 between the left and right eye corners E and the parameter Ph; and using Equation (4), it calculates the corrected Y coordinate of the head axis P from the position coordinates (xLE0, yLE0) of the left eye corner LE, the position coordinates (xRE0, yRE0) of the right eye corner RE, and the corrected distance h0.
  • in other words, the face orientation detection unit 33 personalizes the Y coordinate yA of the head axis P using the distance d0 between the left and right eye corners E.
  • the face orientation detection unit 33 obtains the distance w0 in the depth direction Z between the center position of the left and right eye corners E and the head axis P using Equations (5) and (6).
  • using Equation (5), for example, the face orientation detection unit 33 obtains the parameter Pw from the standard distance d between the left and right eye corners E and the standard distance w in the depth direction Z between the center position of the left and right eye corners E and the standard head axis P; using Equation (6), it obtains the corrected distance w0 from the distance d0 between the left and right eye corners E and the parameter Pw. That is, the face orientation detection unit 33 personalizes the Z coordinate of the head axis P using the distance d0 between the left and right eye corners E.
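  • a minimal sketch of this personalization follows. The bodies of Equations (1) to (6) are not reproduced in this text, so the ratio forms below (Ph = h/d, h0 = Ph·d0, and likewise for Pw and w0) and the sign of the Y offset are assumptions that merely match the description: parameters derived from the standard dimensions, then scaled by the measured eye-corner distance d0.

```python
def personalize_head_axis(d, h, w, d0, x_c0, y_c0):
    """d, h, w: standard dimensions from the storage unit 32;
    d0, (x_c0, y_c0): measured eye-corner distance and center."""
    p_h = h / d        # Eq. (2): parameter Ph (assumed ratio form)
    h0 = p_h * d0      # Eq. (3): personalized vertical offset h0
    p_w = w / d        # Eq. (5): parameter Pw (assumed ratio form)
    w0 = p_w * d0      # Eq. (6): personalized depth offset w0
    x_a = x_c0         # Eq. (1): head-axis X equals the eye-corner center X
    y_a = y_c0 - h0    # Eq. (4): head axis h0 below the eye-corner center (sign assumed)
    return (x_a, y_a), w0
```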
  • the face orientation detection unit 33 detects the orientation of the driver's face based on the personalized head axis P, that is, the corrected head axis P. For example, as shown in FIGS. 4 and 5, the face orientation detection unit 33 detects the distance h1 in the up-down direction Y between the position coordinates (yRE1, zRE1) of the eye corner E and the position coordinates of the head axis P. Based on the detected distance h1, the face orientation detection unit 33 obtains the angle of the face in the up-down direction Y using Equations (7) to (18).
  • in Equations (7) to (18), "k" represents the hypotenuse of a right triangle, "θP0" represents the angle between the eye corner E and the head axis P when the face is facing forward, "θP1" represents the angle of the face in the up-down direction Y, and "θP2" represents the angle between the depth direction Z and the hypotenuse k; the position coordinates of the head axis P are taken as the origin (0, 0).
  • the face orientation detection unit 33 obtains the angle θP2 using Equation (7).
  • the angle θP2 is also expressed by Equation (8) using a trigonometric function.
  • the face orientation detection unit 33 combines Equations (7) and (8) to obtain Equation (9), and by successively transforming it, Equation (18) can be obtained; as a result, the angle θP1 of the face in the up-down direction Y can be obtained.
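  • since Equations (7) to (18) are not reproduced here, the following is only a hedged geometric sketch of what they describe: with the head axis P at the origin of the Y-Z plane, the vertical face angle is the difference between the current angle of the eye corner and its facing-forward angle. The trigonometric forms are assumptions consistent with the stated symbols k, θP0, θP1, and θP2.

```python
import math

def vertical_face_angle(y_e, z_e, h0, w0):
    """y_e, z_e: eye-corner coordinates in the Y-Z plane, head axis P at the origin;
    h0, w0: personalized offsets of the eye corner when facing forward."""
    k = math.hypot(y_e, z_e)        # hypotenuse k (in the spirit of Eq. (7))
    theta_p2 = math.asin(y_e / k)   # angle between depth direction Z and k (Eq. (8) style)
    theta_p0 = math.atan2(h0, w0)   # the same angle when the face is facing forward
    theta_p1 = theta_p2 - theta_p0  # vertical face angle (result of Eqs. (9)-(18))
    return math.degrees(theta_p1)
```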
  • the face orientation detection unit 33 also detects the position coordinates (xRE1, zRE1) of the right eye corner RE, as shown in FIGS. 6 and 7, for example. Based on the detected position coordinates (xRE1, zRE1) of the right eye corner RE, the face orientation detection unit 33 obtains the angle of the face in the left-right direction X using Equations (19) to (28).
  • in Equations (19) to (28), "θY1" represents the angle of the face in the left-right direction X, in the range 0° ≤ θY1 ≤ 90°; "xc1" represents the center position coordinate (X coordinate) of the left and right eye corners E; and "dE" represents the displacement of the eye corner E.
  • the position coordinates of the head axis P are the origin (0, 0).
  • the face orientation detection unit 33 obtains the angle θY1 using Equation (19).
  • "xc1" in Equation (19) is expressed by Equation (20), and Equation (21) is obtained by substituting Equation (20) into Equation (19).
  • in Equation (21), the only unknown variable is "dx".
  • Equation (21) can be transformed into Equation (22) using trigonometric functions, and Equation (22) can be transformed into Equations (23) and (24).
  • from Equations (23) and (24), Equation (26) can be derived, and Equations (27) and (28) can be derived from Equation (26).
  • the face orientation detection unit 33 can calculate "dx" from Equation (28), and by substituting the calculated "dx" into Equation (21), the angle θY1 of the face in the left-right direction X can be obtained.
  • the face orientation detection unit 33 determines that the face is facing left when the position coordinates (xRE1, zRE1) of the right eye corner RE are larger than the position coordinates (xRE0, zRE0) of the right eye corner RE when facing forward; conversely, when the position coordinates (xRE1, zRE1) are smaller than (xRE0, zRE0), the face orientation detection unit 33 determines that the face is facing right.
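  • the dx/dE machinery of Equations (19) to (28) is not reproduced in this text, so the following sketch substitutes a deliberately simplified model: the eye-corner center is assumed to rotate on a circle of radius w0 about the head axis P, and the left/right sign is taken from the movement of the right eye corner RE as described above. This is an illustrative stand-in, not the patented derivation.

```python
import math

def horizontal_face_angle(x_c1, w0, x_re1, x_re0):
    """x_c1: eye-corner-center X with the head axis P at the origin;
    w0: personalized depth offset; x_re1/x_re0: current and facing-forward
    X coordinates of the right eye corner RE."""
    ratio = max(-1.0, min(1.0, x_c1 / w0))          # clamp for numerical safety
    theta_y1 = abs(math.degrees(math.asin(ratio)))  # magnitude in 0..90 degrees
    if x_re1 > x_re0:
        direction = "left"   # RE moved outward: face turned left (FIG. 8)
    elif x_re1 < x_re0:
        direction = "right"
    else:
        direction = "front"
    return theta_y1, direction
```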
  • the parameter update unit 34 updates the above-described parameters Ph and Pw according to the relative position (positional deviation) between the face image acquisition camera 10 and the driver; in other words, it corrects the position of the standard head axis P according to the relative position of the face image acquisition camera 10 with respect to the person's front position.
  • the facial image acquisition camera 10 is often arranged at a position shifted from the front of the driver, as shown in FIG. 9, for example.
  • the face image capturing camera 10 captures an image of the driver's face that is shifted by an angle ⁇ a from the front direction M when the driver looks straight ahead.
  • the parameter update unit 34 subtracts the angle θa corresponding to the deviation of the face image acquisition camera 10 from the face angle detected by the face orientation detection unit 33; in this way, the angle θa corresponding to the deviation of the face image acquisition camera 10 can be offset.
  • however, even when the driver actually faces forward, the value obtained by subtracting the angle θa from the face angle detected by the face orientation detection unit 33 may not be zero. Therefore, the parameter update unit 34 performs processing to bring the absolute value of this difference closer to zero.
  • as shown in FIG. 10, the parameter update unit 34 increases the above-described parameters Ph and Pw, that is, the parameters D1 to D4, step by step.
  • the face orientation detection unit 33 detects the angle of the face based on the parameters Ph and Pw that are increased stepwise.
  • as the parameters Ph and Pw, that is, the parameters D1 to D4, are increased in order, the absolute value of the value obtained by subtracting the angle θa corresponding to the deviation of the face image acquisition camera 10 from the face angle detected by the face orientation detection unit 33 gradually decreases; when the parameter D3 gives the smallest absolute value among the parameters D1 to D4, the parameter update unit 34 adopts D3 as the updated parameters (Ph, Pw).
  • likewise, as shown in FIG. 11, the parameter update unit 34 decreases the parameters Ph and Pw, that is, the parameters D1 to D4, step by step.
  • the face orientation detection unit 33 detects the angle of the face based on the parameters Ph and Pw that are decreased stepwise.
  • as the parameters Ph and Pw, that is, the parameters D1 to D4, are decreased in order, the absolute value of the value obtained by subtracting the angle θa corresponding to the deviation of the face image acquisition camera 10 from the face angle detected by the face orientation detection unit 33 gradually decreases; when the parameter D3 gives the smallest absolute value among the parameters D1 to D4, the parameter update unit 34 adopts D3 as the updated parameters (Ph, Pw).
  • by updating the parameters Ph and Pw, as shown in FIG. 14 for example, the average error of the face orientation detection result (angle), which was 19.6 in the previous model, is reduced to 4.5 after the improvement, and the error that was 7.9 is reduced to 3.8.
  • the angle θa (see FIG. 9) corresponding to the deviation of the face image acquisition camera 10 described above is the deviation in the left-right direction X. In general, the deviation of the face image acquisition camera 10 includes both the yaw angle θCYaw in the left-right direction X shown in FIG. 12 and the pitch angle θCPitch in the up-down direction Y shown in FIG. 13.
  • when obtaining the face angle in the up-down direction Y, the parameter update unit 34 subtracts the pitch deviation θCPitch, and when obtaining the face angle in the left-right direction X, it subtracts the yaw deviation θCYaw.
  • FIG. 15 is a flowchart showing inattentiveness detection processing according to the embodiment.
  • the face direction detection system 1 includes a line-of-sight detection unit 40 that detects the driver's line of sight.
  • the face image capturing camera 10 captures an image of the driver's face from the front of the vehicle and outputs image data representing the captured face of the driver to the face feature point detection unit 20 .
  • the facial feature point detection unit 20 detects a face from the image data (step S1), detects the positions of the left and right eye corners E, the left and right pupils, etc. as feature points of the detected face, outputs information representing the positions of the left and right eye corners E (information representing their position coordinates) to the face orientation calculation unit 30, and outputs information representing the positions of the left and right pupils to the line-of-sight detection unit 40 (step S2).
  • the face orientation calculation unit 30 detects the orientation of the face based on the information representing the positions of the left and right corners of the eyes E output from the facial feature point detection unit 20 (step S3; detection processing shown in the flowchart of FIG. 16 to be described later).
  • the line-of-sight detection unit 40 detects the direction of the line of sight based on the information representing the positions of the left and right pupils output from the facial feature point detection unit 20 (step S4).
  • the inattention determination processing unit determines whether the driver is looking aside based on the face orientation detected by the face orientation calculation unit 30 and the line-of-sight direction detected by the line-of-sight detection unit 40 (step S5). If it determines that the driver is looking aside, the inattention determination processing unit issues a notification, by voice or the like, that the driver is looking aside (step S6); otherwise, it does not notify.
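  • a hedged sketch of this flow (FIG. 15) follows. The component interfaces and the thresholds are assumptions introduced for illustration; the patent does not specify how "looking aside" is decided from the two angles.

```python
FACE_ANGLE_LIMIT_DEG = 30.0   # illustrative thresholds, not taken from the patent
GAZE_ANGLE_LIMIT_DEG = 20.0

def inattention_step(frame, feature_unit, face_unit, gaze_unit, notify):
    landmarks = feature_unit.detect(frame)                # steps S1-S2: face and feature points
    if landmarks is None:
        return
    face_angle = face_unit.detect_orientation(landmarks.eye_corners)  # step S3
    gaze_angle = gaze_unit.detect_gaze(landmarks.pupils)              # step S4
    if abs(face_angle) > FACE_ANGLE_LIMIT_DEG or abs(gaze_angle) > GAZE_ANGLE_LIMIT_DEG:
        notify("driver is looking aside")                 # steps S5-S6: notify by voice or the like
```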
  • FIG. 16 is a flowchart showing face orientation detection processing according to the embodiment.
  • based on the information representing the positions of the left and right eye corners E (information representing their position coordinates) output from the facial feature point detection unit 20, the eye corner distance calculation unit 31 calculates the distance d0 between the left and right eye corners E and the position coordinates of the center of the left and right eye corners E (step T1).
  • the face orientation detection unit 33 then obtains the X coordinate of the head axis P (step T2).
  • using Equation (2), the face orientation detection unit 33 obtains the initial parameter Ph from the standard distance d between the left and right eye corners E and the standard distance h in the up-down direction Y between the center position of the left and right eye corners E and the standard head axis P; using Equation (5), it obtains the initial parameter Pw from the standard distance d and the standard distance w in the depth direction Z between the center position of the left and right eye corners E and the standard head axis P (step T3).
  • using Equation (3), the face orientation detection unit 33 obtains the corrected distance h0 from the distance d0 between the left and right eye corners E and the parameter Ph; using Equation (6), it obtains the corrected distance w0 from the distance d0 and the parameter Pw; and, based on the corrected head axis P determined from the initial distances h0 and w0, it obtains the angle of the face in the up-down direction Y (Equations (7) to (18)) and the angle of the face in the left-right direction X (step T4).
  • the parameter update unit 34 acquires the predetermined installation angles (angle θCPitch, angle θCYaw) of the face image acquisition camera 10 (step T5), and the initial face angle obtained by the face orientation detection unit 33 is output (step T6). The parameter update unit 34 then obtains the absolute value of the value obtained by subtracting the camera angles (θCPitch, θCYaw) from the initial face angle, and increases or decreases the parameters Ph and Pw to update them (steps T7 and T9).
  • using Equation (3), the face orientation detection unit 33 obtains the corrected distance h0 from the distance d0 between the left and right eye corners E and the updated parameter Ph; using Equation (6), it obtains the corrected distance w0 from the distance d0 and the updated parameter Pw; and, based on the corrected head axis P determined from the updated distances h0 and w0, it obtains the angle of the face in the up-down direction Y (Equations (7) to (18)) and the angle of the face in the left-right direction X (step T10), and outputs the orientation of the face (step T11).
  • FIG. 17 is a flow chart showing calculation processing of the position of the head axis P according to the embodiment.
  • the facial image acquisition camera 10 images the driver's face while the driver's face is facing forward (steps U1 and U2).
  • the facial feature point detection unit 20 detects a face from the image data captured by the face image acquisition camera 10 (step U3), detects the left and right eye corners E, the left and right pupils, etc. as facial feature points, and detects the positions of these facial parts (step U4).
  • based on the information representing the positions of the left and right eye corners E output from the facial feature point detection unit 20, the eye corner distance calculation unit 31 calculates the distance d0 between the left and right eye corners E and the center position of the left and right eye corners E (step U5).
  • the face orientation detection unit 33 calculates the corrected position of the head axis P by correcting the position of the predetermined standard head axis P according to the distance d0 between the driver's left and right eye corners E obtained from the image of the driver's face captured by the face image acquisition camera 10 while the driver faces forward (step U6).
  • FIG. 18 is a flowchart showing parameter update processing according to the embodiment.
  • the face orientation detection unit 33 inputs the initial parameters Ph and Pw (step L1). Then, the face orientation detection unit 33 obtains the corrected distance w0 from the distance d0 between the left and right eye corners E and the parameter Pw, obtains the corrected distance h0 from the distance d0 and the parameter Ph, and calculates the driver's initial face angle based on the corrected head axis P (step L2).
  • the parameter update unit 34 calculates the initial difference between the initial face angle calculated by the face orientation detection unit 33 and the angle θa (θCPitch, θCYaw) corresponding to the deviation of the face image acquisition camera 10 (step L3).
  • the parameter update unit 34 increases the parameters Ph and Pw (step L4); the face orientation detection unit 33 obtains the corrected distance w0 from the distance d0 between the left and right eye corners E and the increased parameter Pw, obtains the corrected distance h0 from the distance d0 and the increased parameter Ph, and calculates the angle of the driver's face based on the corrected head axis P (step L5).
  • the parameter update unit 34 calculates the difference between the face angle calculated by the face orientation detection unit 33 and the angle θa (θCPitch, θCYaw) corresponding to the deviation of the face image acquisition camera 10 (step L6).
  • if the calculated difference is smaller than the initial difference (step L7; Yes), the parameter update unit 34 further increases the parameters Ph and Pw (step L8).
  • the face orientation detection unit 33 obtains the corrected distance w0 from the distance d0 between the left and right eye corners E and the increased parameter Pw, obtains the corrected distance h0 from the distance d0 and the increased parameter Ph, and calculates the angle of the driver's face based on the corrected head axis P (step L9).
  • the parameter update unit 34 calculates the difference between the face angle calculated by the face orientation detection unit 33 and the angle θa (θCPitch, θCYaw) corresponding to the deviation of the face image acquisition camera 10 (step L10). If the calculated difference is larger than the previous difference (step L11; Yes), the parameter update unit 34 sets the previous parameters Ph and Pw as the updated values (step L12) and terminates the process. On the other hand, if the calculated difference is smaller than the previous difference (step L11; No), the parameter update unit 34 returns to step L8 and increases the parameters Ph and Pw again.
  • on the other hand, when the difference calculated in step L6 is larger than the initial difference (step L7; No), the parameter update unit 34 decreases the parameters Ph and Pw (step L13).
  • the face orientation detection unit 33 obtains the corrected distance w0 from the distance d0 between the left and right eye corners E and the decreased parameter Pw, obtains the corrected distance h0 from the distance d0 and the decreased parameter Ph, and calculates the angle of the driver's face based on the corrected head axis P (step L14).
  • the parameter update unit 34 calculates the difference between the face angle calculated by the face orientation detection unit 33 and the angle θa (θCPitch, θCYaw) corresponding to the deviation of the face image acquisition camera 10 (step L15). If the calculated difference is larger than the previous difference (step L16; Yes), the parameter update unit 34 sets the previous parameters Ph and Pw as the updated values (step L17) and terminates the process. On the other hand, if the calculated difference is smaller than the previous difference (step L16; No), the parameter update unit 34 returns to step L13 and decreases the parameters Ph and Pw again.
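  • a hedged sketch of this update (FIG. 18) follows: starting from the initial Ph and Pw, step in one direction while the absolute difference between the recomputed face angle and θa keeps shrinking, and keep the last values before it grows again. The step size, iteration limit, and the eval_angle callback are assumptions for illustration.

```python
def update_parameters(p_h, p_w, eval_angle, theta_a, step=0.02, max_iter=50):
    """eval_angle(p_h, p_w) -> face angle recomputed with those parameters;
    theta_a: camera deviation angle to offset."""
    def diff(ph, pw):
        return abs(eval_angle(ph, pw) - theta_a)

    best = diff(p_h, p_w)                      # initial difference (step L3)
    # probe the increasing direction first (steps L4-L7; Yes);
    # fall back to decreasing if that makes it worse (step L7; No -> L13)
    direction = step if diff(p_h + step, p_w + step) < best else -step
    for _ in range(max_iter):
        candidate = diff(p_h + direction, p_w + direction)
        if candidate >= best:    # difference grew again: keep the previous
            break                # values as the update (steps L12 / L17)
        p_h += direction
        p_w += direction
        best = candidate
    return p_h, p_w
```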
  • the face orientation detection system 1 includes the face image acquisition camera 10, the facial feature point detection unit 20, and the face orientation detection section 33.
  • a face image capturing camera 10 captures an image of a person's face.
  • the facial feature point detection unit 20 detects the positions of the corners of the eyes E of the person based on the image of the person's face taken by the face image acquisition camera 10 .
  • the face orientation detection unit 33 detects the orientation of the person's face based on the position of the eye corner E detected by the facial feature point detection unit 20 and the position of the head axis P that serves as a reference when the person's head turns.
  • the face orientation detection system 1 can determine the orientation of the person's face even if only one eye corner can be detected when the person looks aside, or even if the face is partially covered by an object such as a mask.
  • moreover, the face orientation detection system 1 does not detect the face orientation using a three-dimensional rotation matrix or the like built from a large number of facial feature points as in the conventional art, but instead detects it based on two-dimensional coordinate data; this suppresses an increase in the amount of calculation and reduces the processing load. As a result, the face orientation determination system can properly detect the orientation of the person's face.
  • the face orientation detection unit 33 detects the orientation of the person's face in the up-down direction Y based on the distance h1 in the up-down direction Y between the position of the head axis P and the position of the eye corner E. With this configuration, the face orientation detection system 1 can appropriately detect the orientation of the person's face in the up-down direction Y while suppressing an increase in the amount of calculation.
  • the face orientation detection unit 33 detects the orientation of the person's face in the left-right direction X based on the position of the head axis P and the position of the eye corner E (for example, the position coordinates (xRE1, zRE1) of the right eye corner RE).
  • with this configuration, the face orientation detection system 1 can appropriately detect the orientation of the person's face in the left-right direction X while suppressing an increase in the amount of calculation.
  • the face orientation detection unit 33 detects the orientation of the person's face based on the corrected position of the head axis P, obtained by correcting the predetermined standard position of the head axis P according to the distance d0 between the person's left and right eye corners E obtained from the image of the person's face captured by the face image acquisition camera 10 while the person faces forward.
  • the face direction detection system 1 can correct the head axis P according to individual differences of the person, so that the detection accuracy of the person's face direction can be improved.
  • the face orientation detection unit 33 detects the orientation of the person's face based on the corrected position of the head axis P, obtained by correcting, according to the distance d0 between the person's left and right eye corners E, the distance h in the up-down direction Y between the position of the head axis P and the position of the eye corner E, and the distance w in the depth direction Z, which intersects the up-down direction Y, between the position of the head axis P and the center position of the left and right eye corners E. With this configuration, the face orientation detection system 1 can correct the head axis P along the up-down direction Y and the depth direction Z according to individual differences of the person, which improves the detection accuracy of the person's face orientation.
  • the face orientation detection unit 33 detects the orientation of the person's face based on the corrected position of the head axis P, obtained by correcting the predetermined standard position of the head axis P according to the relative position of the face image acquisition camera 10 with respect to the person's front position.
  • the face orientation detection system 1 can correct an error in the installation accuracy of the face image acquisition camera 10, etc., so that it is possible to suppress a decrease in accuracy in detecting the face orientation of a person.
  • the face orientation detection system 1 further includes a parameter update unit 34 that corrects, according to the relative position of the face image acquisition camera 10 with respect to the person's front position, the distance h in the up-down direction Y between the position of the head axis P and the position of the eye corner E, and the distance w in the depth direction Z, which intersects the up-down direction Y, between the position of the head axis P and the center position of the left and right eye corners E.
  • with this configuration, the face orientation detection system 1 can correct errors in the installation accuracy of the face image acquisition camera 10 in the up-down direction Y and the depth direction Z, thereby suppressing a decrease in the accuracy of detecting the person's face orientation.
  • FIG. 19 is a flow chart showing the calculation processing of the position of the head axis P according to the first modified example of the embodiment.
  • the calculation processing of the position of the head axis P according to the first modification differs from the calculation processing of the position of the head axis P according to the embodiment in that unique information is stored by face authentication.
  • the face image capturing camera 10 captures an image of the driver's face while the driver's face is facing forward (steps V1 and V2).
  • the facial feature point detection unit 20 detects a face from the image data captured by the face image acquisition camera 10 (step V3), detects the left and right eye corners E, the left and right pupils, etc. as facial feature points, and detects the positions of these facial parts (step V4).
  • based on the information representing the positions of the left and right eye corners E output from the facial feature point detection unit 20, the eye corner distance calculation unit 31 calculates the distance d0 between the left and right eye corners E and the center position of the left and right eye corners E (step V5).
  • the face orientation detection unit 33 calculates the corrected position of the head axis P by correcting the position of the predetermined standard head axis P according to the distance d0 between the driver's left and right eye corners E obtained from the image of the driver's face captured by the face image acquisition camera 10 while the driver faces forward (step V6).
  • a face authentication unit (not shown) authenticates the driver's face (step V7), and stores the face image, the position of the head axis P, the seat position, etc. in the storage unit 32 (step V8).
  • FIG. 20 is a flow chart showing the calculation processing of the position of the head axis P according to the second modified example of the embodiment.
  • the face orientation detection system 1 includes an acceleration sensor 50 that detects the acceleration of the vehicle, and a front determination section 60 that determines whether the driver is facing forward.
  • the face direction detection system 1 detects the acceleration of the vehicle using the acceleration sensor 50 (step W1).
  • the acceleration sensor 50 outputs the detected acceleration to the front determination section 60 .
  • when the acceleration sensor 50 detects acceleration (step W2; Yes), the front determination unit 60 determines that the vehicle is moving forward and that the driver is facing forward.
  • the face image capturing camera 10 captures the face of the driver while the driver faces the front (step W3).
  • the facial feature point detection unit 20 detects a face from the image data captured by the face image acquisition camera 10 (step W4), detects the left and right eye corners E, the left and right pupils, etc. as facial feature points, and detects the positions of these facial parts (step W5).
  • based on the information representing the positions of the left and right eye corners E output from the facial feature point detection unit 20, the eye corner distance calculation unit 31 calculates the distance d0 between the left and right eye corners E and the center position of the left and right eye corners E (step W6).
  • the face orientation detection unit 33 calculates the corrected position of the head axis P by correcting the position of the predetermined standard head axis P according to the distance d0 between the driver's left and right eye corners E obtained from the image of the driver's face captured by the face image acquisition camera 10 while the driver faces forward (step W7). If the acceleration sensor 50 does not detect acceleration (step W2; No), the face orientation detection system 1 returns to step W1 and detects the acceleration again. As described above, the face orientation detection system 1 includes the acceleration sensor 50 that detects the acceleration of the vehicle and the front determination unit 60 that determines whether the driver is facing forward based on the acceleration of the vehicle detected by the acceleration sensor 50.
  • the face image acquisition camera 10 captures an image of the driver's face based on the front determination unit 60's determination result derived from the vehicle's acceleration.
  • the face orientation detection system 1 can automatically determine that the driver is facing forward based on the acceleration of the vehicle, and capture an image of the front face of the driver.
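  • a minimal sketch of this front determination follows. The patent only says that detected acceleration implies forward motion and a forward-facing driver; the threshold, sample count, and function interface below are illustrative assumptions.

```python
FORWARD_ACCEL_MIN = 1.0   # m/s^2, illustrative threshold
REQUIRED_SAMPLES = 10     # consecutive samples, illustrative

def driver_facing_forward(accel_samples):
    """accel_samples: recent longitudinal accelerations from the acceleration sensor 50."""
    recent = accel_samples[-REQUIRED_SAMPLES:]
    return (len(recent) == REQUIRED_SAMPLES
            and all(a >= FORWARD_ACCEL_MIN for a in recent))
```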
  • FIG. 21 is a flow chart showing the calculation processing of the position of the head axis P according to the third modified example of the embodiment.
  • the face direction detection system 1 detects the line of sight of the driver using the line of sight detection unit 40 (step F1).
  • the line of sight detection unit 40 outputs the detected line of sight of the driver to the front determination unit 60 .
  • when the driver's detected line of sight is directed toward the front, the front determination unit 60 determines that the driver is facing forward.
  • the face image acquisition camera 10 captures the face of the driver while the driver faces the front (step F3).
  • the facial feature point detection unit 20 detects a face from the image data captured by the face image acquisition camera 10 (step F4), detects the left and right eye corners E, the left and right pupils, etc. as facial feature points, and detects the positions of these facial parts (step F5). Based on the information representing the positions of the left and right eye corners E (information representing their position coordinates) output from the facial feature point detection unit 20, the eye corner distance calculation unit 31 calculates the distance d0 between the left and right eye corners E and the center position of the left and right eye corners E (step F6).
  • the face orientation detection unit 33 calculates the corrected position of the head axis P by correcting the position of the predetermined standard head axis P according to the distance d0 between the driver's left and right eye corners E obtained from the image of the driver's face captured by the face image acquisition camera 10 while the driver faces forward (step F7).
  • if the driver is not determined to be facing forward, the face orientation detection system 1 returns to step F1 and detects the line of sight again.
  • as described above, the face orientation detection system 1 further includes the line-of-sight detection unit 40 that detects the driver's line of sight and the front determination unit 60 that determines whether the driver is facing forward based on the driver's line of sight detected by the line-of-sight detection unit 40.
  • the face image acquisition camera 10 captures an image of the driver's face based on the front determination unit 60's determination result derived from the driver's line of sight.
  • the face orientation detection system 1 can automatically determine that the driver is facing forward based on the line of sight of the driver, and capture an image of the front face of the driver.
  • in the embodiment described above, the face orientation detection unit 33 performs personalization processing for correcting the standard head axis P stored in the storage unit 32 according to the individual, but the invention is not limited to this; the face orientation may be detected based on the standard head axis P without performing the personalization processing.
  • likewise, the parameter update unit 34 performs parameter update processing for correcting the standard head axis P stored in the storage unit 32 according to the relative position of the face image acquisition camera 10, but the invention is not limited to this; the face orientation may be detected based on the standard head axis P without performing the parameter update processing.
  • Reference signs: 1 face orientation detection system; 10 face image acquisition camera (imaging unit); 20 facial feature point detection unit (eye corner detection unit); 33 face orientation detection unit; 34 parameter update unit; 40 line-of-sight detection unit; 50 acceleration sensor; E eye corner; P head axis (head rotation reference); Y up-down direction (vertical direction); X left-right direction; Z depth direction; h, h1, d0, w distances


Abstract

A face direction detection system (1) includes a face video acquisition camera (10), a face feature point detection unit (20), and a face direction detection unit (33). The face video acquisition camera (10) captures an image of the face of a person. The face feature point detection unit (20) detects positions of corners of eyes of the person on the basis of the image of the face of the person captured by the face video acquisition camera (10). The face direction detection unit (33) detects a direction of the face of the person on the basis of the positions of the corners of the eyes detected by the face feature point detection unit (20) and a position of a head axis serving as a reference when the head of the person turns. As a result, the face direction detection system (1) can appropriately determine the direction of the face of a driver.

Description

Face orientation determination system
 The present invention relates to a face orientation determination system.
 Conventionally, as a face orientation determination system, for example, Patent Document 1 describes a driver monitoring device that estimates the driver's face orientation by detecting the three-dimensional position of at least one of the driver's eyebrows, eye corners, and mouth as facial feature points.
JP 2011-154721 A
 However, the driver monitoring device described in Patent Document 1 may be unable to estimate the driver's face orientation when, for example, only some of the facial feature points can be detected, such as when the driver looks aside.
 Therefore, the present invention has been made in view of the above, and its object is to provide a face orientation determination system that can appropriately determine the orientation of a person's face.
 In order to solve the above problems and achieve the object, a face orientation detection system according to the present invention includes an imaging unit that captures an image of a person's face; an eye corner detection unit that detects the positions of the person's eye corners based on the image of the person's face captured by the imaging unit; and a face orientation detection unit that detects the orientation of the person's face based on the eye corner positions detected by the eye corner detection unit and the position of a head rotation reference about which the person's head turns.
 The face orientation detection system according to the present invention can determine the orientation of a person's face even when only one eye corner can be detected, such as when the person looks aside; as a result, the orientation of the person's face can be determined appropriately.
FIG. 1 is a block diagram showing a configuration example of a face orientation detection system according to an embodiment. FIG. 2 is a diagram showing the range of the head axis according to the embodiment. FIG. 3 is a diagram showing a calculation example of the position of the head axis according to the embodiment. FIG. 4 is a diagram showing a calculation example of the face orientation angle in the vertical direction according to the embodiment. FIG. 5 is a diagram showing a calculation example of the face orientation angle in the vertical direction according to the embodiment. FIG. 6 is a diagram showing a calculation example of the face orientation angle in the horizontal direction according to the embodiment. FIG. 7 is a diagram showing a calculation example of the face orientation angle in the horizontal direction according to the embodiment. FIG. 8 is a diagram showing an example of determination of left and right face directions according to the embodiment. FIG. 9 is a diagram showing the positional relationship between the face image acquisition camera and the driver according to the embodiment. FIG. 10 is a diagram showing an example of parameter update (increase) according to the embodiment. FIG. 11 is a diagram showing an example of parameter update (decrease) according to the embodiment. FIG. 12 is a diagram showing the installation angle of the face image acquisition camera in the horizontal direction according to the embodiment. FIG. 13 is a diagram showing the installation angle of the face image acquisition camera in the vertical direction according to the embodiment. FIG. 14 is a diagram showing average errors of face orientation detection results (angles) according to the embodiment. FIG. 15 is a flowchart showing inattention detection processing according to the embodiment. FIG. 16 is a flowchart showing face orientation detection processing according to the embodiment. FIG. 17 is a flowchart showing processing for calculating the position of the head axis according to the embodiment. FIG. 18 is a flowchart showing parameter update processing according to the embodiment. FIG. 19 is a flowchart showing processing for calculating the position of the head axis according to a first modification of the embodiment. FIG. 20 is a flowchart showing processing for calculating the position of the head axis according to a second modification of the embodiment. FIG. 21 is a flowchart showing processing for calculating the position of the head axis according to a third modification of the embodiment.
 The mode (embodiment) for carrying out the present invention will be described in detail with reference to the drawings. The present invention is not limited by the contents described in the following embodiment. The components described below include those that can be easily conceived by a person skilled in the art and those that are substantially the same. Furthermore, the configurations described below can be combined as appropriate, and various omissions, substitutions, or changes in configuration can be made without departing from the gist of the present invention.
[Embodiment]
 A face orientation detection system 1 according to an embodiment will be described with reference to the drawings. The face orientation detection system 1 detects the orientation of a person's face; for example, it is mounted on a vehicle and detects the orientation of the face of the driver seated in the driver's seat of the vehicle. As shown in FIG. 1, the face orientation detection system 1 includes, for example, a face image acquisition camera 10 as an imaging unit, a facial feature point detection unit 20 as an eye corner detection unit, and a face orientation calculation unit 30.
 Here, with the driver facing forward, the direction along the line connecting the driver's left and right eye corners E is referred to as the left-right direction X, the vertical direction is referred to as the up-down direction Y, and the direction along the driver's line of sight is referred to as the depth direction Z. The left-right direction X, the up-down direction Y, and the depth direction Z intersect one another and are typically orthogonal.
 The face image acquisition camera 10 captures the driver's face; for example, it is mounted on a vehicle and captures the face of the driver seated in the driver's seat of the vehicle. The face image acquisition camera 10 is arranged in front of the driver, for example, on the vehicle's dashboard or at the front of the vehicle's roof, is positioned so that the center of the driver's eyellipse coincides with the center of the imaging range, and images the driver's face from the front of the vehicle. The face image acquisition camera 10 is connected to the facial feature point detection unit 20 and outputs image data (moving images (video) or still images) representing the captured driver's face to the facial feature point detection unit 20.
The facial feature point detection unit 20 detects facial feature points from the image data. It does so using, for example, a known image recognition technique (machine learning) such as OpenCV. For example, the facial feature point detection unit 20 detects the left and right outer corners of the eyes E, the left and right pupils, and the like as facial feature points from the image data, and detects the positions of these facial parts. Here, the facial feature point detection unit 20 handles the detected positions of the facial parts as coordinate information. The facial feature point detection unit 20 is connected to the face orientation calculation unit 30 and outputs information representing the detected positions of the left and right outer corners of the eyes E (information representing their position coordinates) to the face orientation calculation unit 30.
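The patent does not specify a particular landmark detector. As a minimal sketch, assuming a generic dlib-style 68-point landmark array (an assumption; OpenCV is only named above as an example of a known technique), the eye-corner positions handed to the face orientation calculation unit 30 could be picked out as follows:

```python
import numpy as np

# Indices 36 and 45 are the outer eye corners in the common 68-point
# landmark convention; this numbering is an assumption, not from the patent.
RIGHT_OUTER_EYE_CORNER = 36
LEFT_OUTER_EYE_CORNER = 45

def eye_corner_positions(landmarks: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return (RE, LE) position coordinates from a (68, 2) landmark array."""
    return landmarks[RIGHT_OUTER_EYE_CORNER], landmarks[LEFT_OUTER_EYE_CORNER]
```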
The face orientation calculation unit 30 calculates the orientation of the driver's face and includes an electronic circuit built mainly around a known microcomputer including a CPU, a ROM and a RAM constituting a memory, and an interface. The face orientation calculation unit 30 includes an eye corner distance calculation unit 31, a storage unit 32, a face orientation detection unit 33, and a parameter update unit 34, and handles information on the positions of facial parts as coordinate information.
The eye corner distance calculation unit 31 calculates the distance d0 (see FIG. 3) between the left and right outer corners of the eyes E with the person facing forward. The eye corner distance calculation unit 31 is connected to the facial feature point detection unit 20 and, based on the information representing the positions of the left and right outer corners of the eyes E output from that unit (information representing their position coordinates), calculates the distance d0 between the left and right outer corners of the eyes E (the left outer eye corner LE and the right outer eye corner RE) as well as the position of their midpoint. For example, as shown in FIG. 3, the eye corner distance calculation unit 31 calculates the distance d0 between the position coordinates (xLE0, yLE0) of the left outer eye corner LE and the position coordinates (xRE0, yRE0) of the right outer eye corner RE, and also calculates the midpoint coordinates (xC0, yC0) between them. The eye corner distance calculation unit 31 is connected to the face orientation detection unit 33 and outputs the calculated distance d0 and the midpoint coordinates (xC0, yC0) to the face orientation detection unit 33.
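A minimal sketch of this computation, assuming 2D pixel coordinates for the two outer eye corners (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def eye_corner_distance_and_center(le: np.ndarray, re: np.ndarray):
    """Compute d0 and the midpoint (xC0, yC0) of the outer eye corners."""
    d0 = float(np.linalg.norm(le - re))  # distance between LE and RE
    center = (le + re) / 2.0             # midpoint of LE and RE
    return d0, center
```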
The storage unit 32 stores information and includes, for example, a volatile memory such as a RAM and a nonvolatile memory such as a ROM or a flash memory. The storage unit 32 stores, for example, conditions and information necessary for the various processes in the face orientation calculation unit 30, various programs and applications executed by the face orientation calculation unit 30, control data, and the like. The storage unit 32 also stores standard human body dimensions predetermined on the basis of a human body dimension database that statistically represents the dimensions of the human body. As shown in FIG. 3, the storage unit 32 stores, as the standard human body dimensions, for example, the standard distance d between the left and right outer corners of the eyes E, the standard distance w in the depth direction Z between the midpoint of the left and right outer corners of the eyes E and the axis vertebra G, and the distance h in the vertical direction Y between the midpoint of the left and right outer corners of the eyes E and the axis vertebra G. That is, the storage unit 32 stores the position of the axis vertebra G (the standard head axis P) determined from the standard distance d between the left and right outer corners of the eyes E, the standard distance w between their midpoint and the axis vertebra G, and the standard distance h between their midpoint and the axis vertebra G.
The face orientation detection unit 33 detects the orientation of the driver's face. The face orientation detection unit 33 detects the orientation of the driver's face based on, for example, the position of the outer corner of the eye E detected by the facial feature point detection unit 20 and the position of the head axis P serving as the head rotation reference. Here, the head axis P is the reference about which the driver's head rotates and, as shown in FIG. 2, is set between the axis vertebra G and the ischium J. In other words, the driver's head turns about the head axis P. The face orientation detection unit 33 performs personalization processing that adjusts the standard head axis P stored in the storage unit 32 to the individual. Specifically, the face orientation detection unit 33 detects the orientation of the driver's face based on the corrected position of the head axis P, obtained by correcting the predetermined standard position of the head axis P according to the distance d0 between the driver's left and right outer corners of the eyes E determined from the image of the driver's face captured by the face image acquisition camera 10 with the driver facing forward. For example, the face orientation detection unit 33 obtains the X coordinate xA (= xC0) of the head axis P from the position coordinates (xLE0, yLE0) of the left outer eye corner LE and the position coordinates (xRE0, yRE0) of the right outer eye corner RE using equation (1) below. The face orientation detection unit 33 also obtains the Y coordinate yA of the head axis P using equations (2) to (4) below. For example, using equation (2), it obtains the parameter Ph from the standard distance d between the left and right outer corners of the eyes E and the standard distance h in the vertical direction Y between the position of the outer corners of the eyes E (for example, their midpoint) and the standard head axis P; using equation (3), it obtains the corrected distance h0 from the distance d0 between the left and right outer corners of the eyes E and the parameter Ph; and using equation (4), it obtains the corrected Y coordinate yA of the head axis P from the position coordinates (xLE0, yLE0) of the left outer eye corner LE, the position coordinates (xRE0, yRE0) of the right outer eye corner RE, and the corrected distance h0. That is, the face orientation detection unit 33 personalizes the Y coordinate yA of the head axis P using the distance d0 between the left and right outer corners of the eyes E. The face orientation detection unit 33 also obtains the distance w0 in the depth direction Z between the midpoint of the left and right outer corners of the eyes E and the head axis P using equations (5) and (6) below. For example, using equation (5), it obtains the parameter Pw from the standard distance d between the left and right outer corners of the eyes E and the standard distance w in the depth direction Z between their midpoint and the standard head axis P, and using equation (6), it obtains the corrected distance w0 from the distance d0 between the left and right outer corners of the eyes E and the parameter Pw. That is, the face orientation detection unit 33 personalizes the Z coordinate of the head axis P using the distance d0 between the left and right outer corners of the eyes E.
[Math. 1: Equations (1) to (6)]
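The equation images themselves are not reproduced in this text. The following sketch implements one plausible reading of the description above, under the assumption that equations (2), (3), (5), and (6) scale the standard offsets h and w by the ratio of the measured eye-corner distance d0 to the standard distance d, and that equation (1) takes the midpoint X coordinate; the sign convention in equation (4) is likewise an assumption:

```python
import numpy as np

# A sketch of the head-axis personalization, assuming Ph = h/d, h0 = Ph*d0,
# Pw = w/d, w0 = Pw*d0, and xA = (xLE0 + xRE0)/2. The "+" in yA (head axis
# below the eyes in image coordinates) is also an assumption.
def personalize_head_axis(le, re, d_std, h_std, w_std):
    d0 = float(np.linalg.norm(np.asarray(le) - np.asarray(re)))
    p_h = h_std / d_std                # equation (2), assumed form
    p_w = w_std / d_std                # equation (5), assumed form
    h0 = p_h * d0                      # equation (3), assumed form
    w0 = p_w * d0                      # equation (6), assumed form
    x_a = (le[0] + re[0]) / 2.0        # equation (1), assumed form
    y_a = (le[1] + re[1]) / 2.0 + h0   # equation (4), assumed form
    return x_a, y_a, w0, h0
```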
The face orientation detection unit 33 detects the driver's face orientation based on the personalized head axis P, that is, the corrected head axis P. For example, as shown in FIGS. 4 and 5, the face orientation detection unit 33 detects the distance h1 in the vertical direction Y between the position coordinates (yRE1, zRE1) of the outer corner of the eye E and the position coordinates of the head axis P. Based on the detected distance h1, the face orientation detection unit 33 obtains the angle of the face in the vertical direction Y using equations (7) to (18) below. In equations (7) to (18), "k" denotes the hypotenuse of a right triangle, "θP0" denotes the angle between the outer corner of the eye E and the head axis P when the face is directed forward, "θP1" denotes the angle of the face in the vertical direction Y, and "θP2" denotes the angle between the depth direction Z and the hypotenuse k. In this example, the position coordinates of the head axis P are taken as the origin (0, 0). The face orientation detection unit 33 obtains the angle θP2 using equation (7). The angle θP2 is also expressed trigonometrically by equation (8). Combining equations (7) and (8) yields equation (9), and proceeding with the calculation in order as shown in equations (9) to (18) finally yields equation (18), from which the angle θP1 of the face in the vertical direction Y is obtained.
[Math. 2: Equations (7) to (18)]
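Since the equation images for (7) to (18) are not reproduced, the following sketch assumes the geometry the text implies: facing forward, the eye corner sits at height h0 and depth w0 from the head axis, the hypotenuse is k, and pitching the head by θP1 changes the observed vertical offset to h1. The closed form below is an assumption, not the patent's own derivation:

```python
import math

# Assumed model: k = sqrt(h0^2 + w0^2), thetaP0 = atan2(h0, w0), and the
# observed vertical offset is h1 = k * sin(thetaP0 + thetaP1).
def pitch_angle(h1: float, h0: float, w0: float) -> float:
    k = math.hypot(h0, w0)
    theta_p0 = math.atan2(h0, w0)
    ratio = max(-1.0, min(1.0, h1 / k))          # clamp for numeric safety
    theta_p1 = math.asin(ratio) - theta_p0       # solve for the pitch angle
    return math.degrees(theta_p1)
```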
For example, as shown in FIGS. 6 and 7, the face orientation detection unit 33 detects the position coordinates (xRE1, zRE1) of the right outer eye corner RE. Based on the detected position coordinates (xRE1, zRE1), the face orientation detection unit 33 obtains the angle of the face in the left-right direction X using equations (19) to (28) below. In equations (19) to (28), "θY1" denotes the angle of the face in the left-right direction X, with a range of 0° ≤ θY1 ≤ 90°, "xc1" denotes the position coordinate (X coordinate) of the midpoint of the left and right outer corners of the eyes E, and "dE" denotes the displacement of the outer corner of the eye E. In this example, the position coordinates of the head axis P are taken as the origin (0, 0). The face orientation detection unit 33 obtains the angle θY1 using equation (19). Here, "xc1" in equation (19) is expressed by equation (20), and substituting equation (20) into equation (19) yields equation (21). In equation (21), the only unknown variable is "dx". Equation (21) can be transformed trigonometrically into equation (22), and equation (22) can be transformed into equations (23) and (24). Replacing variables as shown in equation (25) then makes it possible to derive equation (26), and equations (27) and (28) can be derived from equation (26). The face orientation detection unit 33 can calculate "dx" from equation (28), and by substituting the calculated "dx" into equation (21), it obtains the angle θY1 of the face in the left-right direction X.
[Math. 3: Equations (19) to (28)]
Note that in equation (20), "dx ≥ 0", and when the face is turned to the right, "xc1 = xLE1 + dx".
As shown in FIG. 8, the face orientation detection unit 33 determines that the face is turned to the left when the position coordinates (xRE1, zRE1) of the right outer eye corner RE are larger than its position coordinates (xRE0, zRE0) when facing forward. Conversely, the face orientation detection unit 33 determines that the face is turned to the right when the position coordinates (xRE1, zRE1) of the right outer eye corner RE are smaller than its position coordinates (xRE0, zRE0) when facing forward.
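Equations (19) to (28) are likewise not reproduced. As a rough sketch under an assumed model in which the outer eye corner rotates about the head axis on a circle of radius sqrt((d0/2)^2 + w0^2), the yaw angle and the left/right decision of FIG. 8 could look as follows:

```python
import math

# Assumed model, not the patent's derivation: the eye corner's X coordinate
# is r*sin(phi) for its current angular position phi about the head axis,
# and yaw is the change of phi relative to the forward-facing pose.
def yaw_angle(x_re1: float, x_re0: float, d0: float, w0: float) -> float:
    r = math.hypot(d0 / 2.0, w0)
    phi1 = math.asin(max(-1.0, min(1.0, x_re1 / r)))
    phi0 = math.asin(max(-1.0, min(1.0, x_re0 / r)))
    return math.degrees(phi1 - phi0)

def facing_side(x_re1: float, x_re0: float) -> str:
    # FIG. 8: a larger right-eye-corner X than the forward pose means the
    # face is turned left; a smaller one means it is turned right.
    return "left" if x_re1 > x_re0 else "right"
```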
Next, the parameter update unit 34 will be described. The parameter update unit 34 corrects the position of the standard head axis P according to the relative position of the face image acquisition camera 10 with respect to the person's frontal position, by updating the above-described parameters Ph and Pw according to the relative position (positional deviation) between the face image acquisition camera 10 and the driver. Here, due to the structure of the vehicle, the face image acquisition camera 10 is often arranged at a position offset from the front of the driver, for example, as shown in FIG. 9. In this case, the face image acquisition camera 10 images the driver's face from a direction offset by an angle θa from the frontal direction M in which the driver looks straight ahead. The parameter update unit 34 therefore subtracts the angle θa, corresponding to the offset of the face image acquisition camera 10, from the face angle detected by the face orientation detection unit 33. In theory, this cancels out the angle θa; in practice, however, because of errors in the mounting accuracy of the face image acquisition camera 10 and individual differences among drivers (the face may not be directed forward even while the line of sight is directed forward), subtracting the angle θa from the detected face angle may not yield zero. The parameter update unit 34 therefore performs processing that brings the absolute value of the detected face angle minus the angle θa closer to zero.
For example, as shown in FIG. 10, the parameter update unit 34 increases the above-described parameters Ph and Pw stepwise, that is, through parameter values D1 to D4. At this time, the face orientation detection unit 33 detects the face angle for each of the stepwise-increased parameters Ph and Pw. As shown in FIG. 10, as the parameters Ph and Pw are increased in the order D1 to D4, the absolute value of the detected face angle minus the angle θa corresponding to the camera offset gradually decreases, and D3 gives the smallest value among D1 to D4; the parameter update unit 34 therefore updates the parameters Ph and Pw to the values represented by D3.
As shown in FIG. 11, the parameter update unit 34 likewise decreases the parameters Ph and Pw stepwise, that is, through parameter values D1 to D4. At this time, the face orientation detection unit 33 detects the face angle for each of the stepwise-decreased parameters Ph and Pw. As shown in FIG. 11, as the parameters Ph and Pw are decreased in the order D1 to D4, the absolute value of the detected face angle minus the angle θa corresponding to the camera offset gradually decreases, and D3 gives the smallest value among D1 to D4; the parameter update unit 34 therefore updates the parameters Ph and Pw to the values represented by D3. By updating the parameters Ph and Pw, as shown in FIG. 14 for example, the average error of the face orientation detection result (angle) in the pitch direction decreases from 19.6 before the improvement to 4.5 after the improvement, and the average error in the yaw direction decreases from 7.9 to 3.8. Note that the angle θa corresponding to the offset of the face image acquisition camera 10 (see FIG. 9) is an offset in the left-right direction X; in general, however, the offset of the face image acquisition camera 10 includes both the angle θCYaw in the left-right direction X shown in FIG. 12 and the angle θCPitch in the vertical direction Y shown in FIG. 13. The parameter update unit 34 subtracts the offset angle θCPitch in the vertical direction Y when obtaining the face angle in the vertical direction Y, and subtracts the angle θCYaw in the left-right direction X when obtaining the face angle in the left-right direction X.
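A small sketch of the stepwise search of FIGS. 10 and 11 (detect_angle is a stand-in for the face-angle computation with a candidate parameter; it is not a name from the patent):

```python
def best_parameter(candidates, theta_a, detect_angle):
    """Pick the candidate (e.g. D1..D4) minimizing |detected angle - theta_a|."""
    return min(candidates, key=lambda p: abs(detect_angle(p) - theta_a))
```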
Next, the inattentive-driving detection processing will be described. FIG. 15 is a flowchart showing the inattentive-driving detection processing according to the embodiment. The face orientation detection system 1 includes a line-of-sight detection unit 40 that detects the driver's line of sight. The face image acquisition camera 10 images the driver's face from the front of the vehicle and outputs image data representing the captured face of the driver to the facial feature point detection unit 20. The facial feature point detection unit 20 detects a face from the image data (step S1), detects the positions of the left and right outer corners of the eyes E, the left and right pupils, and the like as feature points of the detected face, outputs information representing the positions of the left and right outer corners of the eyes E (information representing their position coordinates) to the face orientation calculation unit 30, and outputs information representing the positions of the left and right pupils to the line-of-sight detection unit 40 (step S2). The face orientation calculation unit 30 detects the orientation of the face based on the information representing the positions of the left and right outer corners of the eyes E output from the facial feature point detection unit 20 (step S3; the detection processing shown in the flowchart of FIG. 16 described later). The line-of-sight detection unit 40 detects the direction of the line of sight based on the information representing the positions of the left and right pupils output from the facial feature point detection unit 20 (step S4). An inattention determination processing unit (not shown) determines whether the driver is looking aside based on the face orientation detected by the face orientation calculation unit 30 and the line-of-sight direction detected by the line-of-sight detection unit 40 (step S5). When it determines that the driver is looking aside, the inattention determination processing unit issues a warning, for example by voice, that the driver is driving inattentively (step S6); when it determines that the driver is not looking aside, it does not issue the warning.
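A minimal sketch of the step S5 decision, with assumed thresholds (the patent does not give numeric criteria for "looking aside"):

```python
def is_inattentive(face_yaw_deg: float, gaze_yaw_deg: float,
                   face_limit: float = 30.0, gaze_limit: float = 30.0) -> bool:
    """Step S5: combine face orientation (step S3) and gaze direction (step S4).

    Both limits are illustrative assumptions, not values from the patent.
    """
    return abs(face_yaw_deg) > face_limit and abs(gaze_yaw_deg) > gaze_limit
```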
Next, the face orientation detection processing will be described. FIG. 16 is a flowchart showing the face orientation detection processing according to the embodiment. In the face orientation detection system 1, the eye corner distance calculation unit 31 calculates the distance d0 between the left and right outer corners of the eyes E and their midpoint coordinates based on the information representing the positions of the left and right outer corners of the eyes E output from the facial feature point detection unit 20 (step T1), and takes the X coordinate of that midpoint as the X coordinate of the head axis P (step T2). The face orientation detection unit 33 acquires the X coordinate of the head axis P from step T2. The face orientation detection unit 33 also obtains the initial parameter Ph from the standard distance d between the left and right outer corners of the eyes E and the standard distance h in the vertical direction Y between their midpoint and the standard head axis P using equation (2) above, and obtains the initial parameter Pw from the standard distance d and the standard distance w in the depth direction Z between their midpoint and the standard head axis P using equation (5) (step T3). The face orientation detection unit 33 then obtains the corrected distance h0 from the distance d0 and the parameter Ph using equation (3), obtains the corrected distance w0 from the distance d0 and the parameter Pw using equation (6), and, based on the corrected head axis P determined from these initial distances h0 and w0, obtains the angle of the face in the vertical direction Y (equations (7) to (18)) and the angle of the face in the left-right direction X (equations (19) to (28)) (step T4). The parameter update unit 34 acquires the predetermined angles of the face image acquisition camera 10 (angle θCPitch, angle θCYaw) (step T5), and the initial face angle obtained by the face orientation detection unit 33 is output (step T6). The parameter update unit 34 then obtains the absolute value of the initial face angle obtained by the face orientation detection unit 33 minus the angles of the face image acquisition camera 10 (angle θCPitch, angle θCYaw), and updates the initial parameters Ph and Pw by increasing or decreasing them (steps T7 and T9). The face orientation detection unit 33 obtains the corrected distance h0 from the distance d0 and the updated parameter Ph using equation (3), obtains the corrected distance w0 from the distance d0 and the updated parameter Pw using equation (6), obtains the angle of the face in the vertical direction Y (equations (7) to (18)) and the angle of the face in the left-right direction X (equations (19) to (28)) based on the corrected head axis P determined from these updated distances h0 and w0 (step T10), and outputs the orientation of the face (step T11).
Next, the calculation processing of the position of the head axis P will be described. FIG. 17 is a flowchart showing the calculation processing of the position of the head axis P according to the embodiment. The face image acquisition camera 10 images the driver's face with the driver's face directed forward (steps U1 and U2). The facial feature point detection unit 20 detects a face from the image data captured by the face image acquisition camera 10 (step U3), detects the left and right outer corners of the eyes E, the left and right pupils, and the like as feature points of the face, and detects the positions of these facial parts (step U4). The eye corner distance calculation unit 31 calculates the distance d0 between the left and right outer corners of the eyes E and the position of their midpoint based on the information representing the positions of the left and right outer corners of the eyes E output from the facial feature point detection unit 20 (step U5). The face orientation detection unit 33 corrects the predetermined standard position of the head axis P according to the distance d0 between the driver's left and right outer corners of the eyes E determined from the image of the driver's face captured with the driver facing forward, and calculates the corrected position of the head axis P (step U6).
Next, the parameter update processing will be described. FIG. 18 is a flowchart showing the parameter update processing according to the embodiment. The face orientation detection unit 33 receives the initial parameters Ph and Pw as input (step L1). The face orientation detection unit 33 then obtains the corrected distance w0 from the distance d0 between the left and right outer corners of the eyes E and the parameter Pw, obtains the corrected distance h0 from the distance d0 and the parameter Ph, and calculates the driver's initial face angle based on the corrected head axis P (step L2). The parameter update unit 34 calculates the initial difference between the initial face angle calculated by the face orientation detection unit 33 and the angle θa (angle θCPitch, angle θCYaw) corresponding to the offset of the face image acquisition camera 10 (step L3). The parameter update unit 34 increases the parameters Ph and Pw (step L4), and the face orientation detection unit 33 obtains the corrected distance w0 from the distance d0 and the increased parameter Pw, obtains the corrected distance h0 from the distance d0 and the increased parameter Ph, and calculates the driver's face angle based on the corrected head axis P (step L5). The parameter update unit 34 again calculates the difference between the face angle calculated by the face orientation detection unit 33 and the angle θa (angle θCPitch, angle θCYaw) (step L6). When the calculated difference is smaller than the initial value (the initial difference) (step L7; Yes), the parameter update unit 34 increases the parameters Ph and Pw (step L8). The face orientation detection unit 33 obtains the corrected distance w0 from the distance d0 and the increased parameter Pw, obtains the corrected distance h0 from the distance d0 and the increased parameter Ph, and calculates the driver's face angle based on the corrected head axis P (step L9). The parameter update unit 34 calculates the difference between the face angle calculated by the face orientation detection unit 33 and the angle θa (angle θCPitch, angle θCYaw) (step L10). When the calculated difference is larger than the previous difference (step L11; Yes), the parameter update unit 34 sets the previous parameters Ph and Pw as the updated values (step L12) and ends the processing. On the other hand, when the calculated difference is smaller than the previous difference (step L11; No), the parameter update unit 34 returns to step L8 and increases the parameters Ph and Pw again.
Note that, in step L7 above, when the calculated difference is larger than the initial value (the initial difference) (step L7; No), the parameter update unit 34 decreases the parameters Ph and Pw (step L13). The face orientation detection unit 33 obtains the corrected distance w0 from the distance d0 between the left and right outer corners of the eyes E and the decreased parameter Pw, obtains the corrected distance h0 from the distance d0 and the decreased parameter Ph, and calculates the driver's face angle based on the corrected head axis P (step L14). The parameter update unit 34 calculates the difference between the face angle calculated by the face orientation detection unit 33 and the angle θa (angle θCPitch, angle θCYaw) (step L15). When the calculated difference is larger than the previous difference (step L16; Yes), the parameter update unit 34 sets the previous parameters Ph and Pw as the updated values (step L17) and ends the processing. On the other hand, when the calculated difference is smaller than the previous difference (step L16; No), the parameter update unit 34 returns to step L13 and decreases the parameters Ph and Pw again.
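A sketch of the FIG. 18 loop: step the parameter in whichever direction shrinks the residual |detected angle - θa| and keep the last value before the residual starts growing again (detect_angle, the step size, and the iteration cap are assumptions, not patent values):

```python
def update_parameter(p0, theta_a, detect_angle, step=0.05, max_steps=1000):
    """Hill-climb one parameter (Ph or Pw) as in FIG. 18."""
    def residual(p):
        return abs(detect_angle(p) - theta_a)  # steps L3 / L6 / L10 / L15
    r0 = residual(p0)
    # Step L7: increase if the first increase helped, otherwise decrease.
    direction = step if residual(p0 + step) < r0 else -step
    prev_p, prev_r = p0, r0
    for _ in range(max_steps):
        p = prev_p + direction                 # steps L8 / L13
        r = residual(p)                        # steps L9-L10 / L14-L15
        if r > prev_r:                         # steps L11 / L16
            return prev_p                      # steps L12 / L17
        prev_p, prev_r = p, r
    return prev_p
```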
As described above, the face orientation detection system 1 according to the embodiment includes the face image acquisition camera 10, the facial feature point detection unit 20, and the face orientation detection unit 33. The face image acquisition camera 10 captures images of a person's face. The facial feature point detection unit 20 detects the positions of the person's outer corners of the eyes E based on the image of the person's face captured by the face image acquisition camera 10. The face orientation detection unit 33 detects the orientation of the person's face based on the positions of the outer corners of the eyes E detected by the facial feature point detection unit 20 and the position of the head axis P serving as the reference about which the person's head rotates.
With this configuration, the face orientation detection system 1 can determine the orientation of a person's face even when only one outer eye corner can be detected because the person is looking aside, or when part of the face is covered by a mask or other worn item. In addition, because the face orientation detection system 1 detects the face orientation based on two-dimensional coordinate data rather than using a three-dimensional rotation matrix or the like built from a large number of facial feature points as in conventional systems, it can suppress an increase in the amount of computation and reduce the processing load. As a result, the face orientation determination system can properly detect the orientation of a person's face.
In the above face orientation detection system 1, the face orientation detection unit 33 detects the orientation of the person's face in the vertical direction Y based on the distance h1 between the position of the head axis P and the position of the outer corner of the eye E in the vertical direction Y. With this configuration, the face orientation detection system 1 can properly detect the orientation of the person's face in the vertical direction Y while suppressing an increase in the amount of computation.
In the above face orientation detection system 1, the face orientation detection unit 33 detects the orientation of the person's face in the left-right direction X, which intersects the vertical direction Y, based on the position of the head axis P and the position of the outer corner of the eye E in the left-right direction X (for example, the position coordinates (xRE1, zRE1) of the right outer eye corner RE). With this configuration, the face orientation detection system 1 can properly detect the orientation of the person's face in the left-right direction X while suppressing an increase in the amount of computation.
In the above face orientation detection system 1, the face orientation detection unit 33 detects the orientation of the person's face based on the corrected position of the head axis P, obtained by correcting the predetermined standard position of the head axis P, as the position of the head axis P, according to the distance d0 between the person's left and right outer corners of the eyes E determined from the image of the person's face captured by the face image acquisition camera 10 with the person facing forward. With this configuration, the face orientation detection system 1 can correct the head axis P according to individual differences among persons, and can therefore improve the accuracy of detecting the orientation of the person's face.
In the above face orientation detection system 1, the face orientation detection unit 33 detects the orientation of the person's face based on the corrected position of the head axis P, obtained by correcting, according to the distance d0 between the person's left and right outer corners of the eyes E, the distance h between the position of the head axis P and the position of the outer corner of the eye E in the vertical direction Y, and the distance w between the position of the head axis P and the midpoint of the left and right outer corners of the eyes E in the depth direction Z intersecting the vertical direction Y. With this configuration, the face orientation detection system 1 can correct the head axis P along the vertical direction Y and the depth direction Z according to individual differences among persons, and can therefore improve the accuracy of detecting the orientation of the person's face.
In the above face orientation detection system 1, the face orientation detection unit 33 detects the orientation of the person's face based on the corrected position of the head axis P, obtained by correcting the predetermined standard position of the head axis P, as the position of the head axis P, according to the relative position of the face image acquisition camera 10 with respect to the person's frontal position. With this configuration, the face orientation detection system 1 can correct errors in the mounting accuracy of the face image acquisition camera 10 and the like, and can therefore suppress a decrease in the accuracy of detecting the orientation of the person's face.
The above face orientation detection system 1 further includes the parameter update unit 34, which corrects, according to the relative position of the face image acquisition camera 10 with respect to the person's frontal position, the distance h between the position of the head axis P and the position of the outer corner of the eye E in the vertical direction Y, and the distance w between the position of the head axis P and the midpoint of the left and right outer corners of the eyes E in the depth direction Z intersecting the vertical direction Y. With this configuration, the face orientation detection system 1 can correct errors in the mounting accuracy of the face image acquisition camera 10 and the like in the vertical direction Y and the depth direction Z, and can therefore suppress a decrease in the accuracy of detecting the orientation of the person's face.
[Modifications]
Next, modifications of the embodiment will be described. In the modifications, components equivalent to those of the embodiment are denoted by the same reference signs, and detailed description thereof is omitted. FIG. 19 is a flowchart showing the calculation processing of the position of the head axis P according to a first modification of the embodiment. The calculation processing of the position of the head axis P according to the first modification differs from that of the embodiment in that information unique to the driver is stored using face authentication. In FIG. 19, the face image acquisition camera 10 images the driver's face with the driver's face directed forward (steps V1 and V2). The facial feature point detection unit 20 detects a face from the image data captured by the face image acquisition camera 10 (step V3), detects the left and right outer corners of the eyes E, the left and right pupils, and the like as feature points of the face, and detects the positions of these facial parts (step V4). The eye corner distance calculation unit 31 calculates the distance d0 between the left and right outer corners of the eyes E and the position of their midpoint based on the information representing the positions of the left and right outer corners of the eyes E output from the facial feature point detection unit 20 (step V5). The face orientation detection unit 33 corrects the predetermined standard position of the head axis P according to the distance d0 between the driver's left and right outer corners of the eyes E determined from the image of the driver's face captured with the driver facing forward, and calculates the corrected position of the head axis P (step V6). A face authentication unit (not shown) authenticates the driver's face (step V7) and stores the face image, the position of the head axis P, the seat position, and the like in the storage unit 32 (step V8).
FIG. 20 is a flowchart showing the calculation processing of the position of the head axis P according to a second modification of the embodiment. The face orientation detection system 1 includes an acceleration sensor 50 that detects the acceleration of the vehicle and a front determination unit 60 that determines that the driver is facing forward. In FIG. 20, the face orientation detection system 1 detects the acceleration of the vehicle with the acceleration sensor 50 (step W1). The acceleration sensor 50 outputs the detected acceleration to the front determination unit 60. When the acceleration sensor 50 detects the acceleration of the vehicle (step W2; Yes), the front determination unit 60 determines that the vehicle is moving forward and that the driver is facing forward. When the front determination unit 60 determines that the driver is facing forward, the face image acquisition camera 10 images the driver's face with the driver's face directed forward (step W3). The facial feature point detection unit 20 detects a face from the image data captured by the face image acquisition camera 10 (step W4), detects the left and right outer corners of the eyes E, the left and right pupils, and the like as feature points of the face, and detects the positions of these facial parts (step W5). The eye corner distance calculation unit 31 calculates the distance d0 between the left and right outer corners of the eyes E and the position of their midpoint based on the information representing the positions of the left and right outer corners of the eyes E output from the facial feature point detection unit 20 (step W6). The face orientation detection unit 33 corrects the predetermined standard position of the head axis P according to the distance d0 between the driver's left and right outer corners of the eyes E determined from the image of the driver's face captured with the driver facing forward, and calculates the corrected position of the head axis P (step W7). When the acceleration sensor 50 does not detect acceleration (step W2; No), the face orientation detection system 1 returns to step W1 and detects the acceleration again. As described above, the face orientation detection system 1 further includes the acceleration sensor 50, which detects the acceleration of the vehicle, and the front determination unit 60, which determines that the driver is facing forward based on the acceleration of the vehicle detected by the acceleration sensor 50. The face image acquisition camera 10 images the driver's face based on the determination result of the front determination unit 60 based on the acceleration of the vehicle. With this configuration, the face orientation detection system 1 can automatically determine that the driver is facing forward based on the acceleration of the vehicle and capture an image of the driver's face from the front.
FIG. 21 is a flowchart showing the calculation processing of the position of the head axis P according to a third modification of the embodiment. In FIG. 21, the face orientation detection system 1 detects the driver's line of sight with the line-of-sight detection unit 40 (step F1). The line-of-sight detection unit 40 outputs the detected line of sight of the driver to the front determination unit 60. When the line of sight detected by the line-of-sight detection unit 40 is directed forward (step F2; Yes), the front determination unit 60 determines that the driver is facing forward. When the front determination unit 60 determines that the driver is facing forward, the face image acquisition camera 10 images the driver's face with the driver's face directed forward (step F3). The facial feature point detection unit 20 detects a face from the image data captured by the face image acquisition camera 10 (step F4), detects the left and right outer corners of the eyes E, the left and right pupils, and the like as feature points of the face, and detects the positions of these facial parts (step F5). The eye corner distance calculation unit 31 calculates the distance d0 between the left and right outer corners of the eyes E and the position of their midpoint based on the information representing the positions of the left and right outer corners of the eyes E output from the facial feature point detection unit 20 (step F6). The face orientation detection unit 33 corrects the predetermined standard position of the head axis P according to the distance d0 between the driver's left and right outer corners of the eyes E determined from the image of the driver's face captured with the driver facing forward, and calculates the corrected position of the head axis P (step F7). When the line of sight detected by the line-of-sight detection unit 40 is not directed forward (step F2; No), the face orientation detection system 1 returns to step F1 and detects the line of sight again. As described above, the face orientation detection system 1 further includes the line-of-sight detection unit 40, which detects the driver's line of sight, and the front determination unit 60, which determines that the driver is facing forward based on the driver's line of sight detected by the line-of-sight detection unit 40. The face image acquisition camera 10 images the driver's face based on the determination result of the front determination unit 60 based on the driver's line of sight. With this configuration, the face orientation detection system 1 can automatically determine that the driver is facing forward based on the driver's line of sight and capture an image of the driver's face from the front.
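A small sketch combining the front-determination triggers of the second and third modifications, with assumed thresholds for "accelerating forward" and "gaze directed forward" (the patent gives no numeric values):

```python
ACCEL_THRESHOLD = 0.5   # m/s^2, assumed
GAZE_THRESHOLD = 5.0    # degrees from the frontal direction, assumed

def driver_facing_forward(accel: float | None = None,
                          gaze_yaw_deg: float | None = None) -> bool:
    """Return True when either trigger says the driver is facing forward."""
    if accel is not None and accel > ACCEL_THRESHOLD:                    # FIG. 20, step W2
        return True
    if gaze_yaw_deg is not None and abs(gaze_yaw_deg) < GAZE_THRESHOLD:  # FIG. 21, step F2
        return True
    return False
```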
In the above description, an example was described in which the face orientation detection unit 33 performs personalization processing that corrects the standard head axis P stored in the storage unit 32 to match the individual; however, the present invention is not limited to this, and the face orientation may be detected based on the standard head axis P without performing the personalization processing.
Likewise, an example was described in which the parameter update unit 34 performs parameter update processing that corrects the standard head axis P stored in the storage unit 32 according to the relative position of the face image acquisition camera 10; however, the present invention is not limited to this, and the face orientation may be detected based on the standard head axis P without performing the parameter update processing.
1 Face orientation detection system
10 Face image acquisition camera (imaging unit)
20 Facial feature point detection unit (eye corner detection unit)
33 Face orientation detection unit
34 Parameter update unit
40 Line-of-sight detection unit
50 Acceleration sensor
E Outer corner of eye
P Head axis (head rotation reference)
Y Up-down direction (vertical direction)
X Left-right direction
Z Depth direction
h, h1, d0, w Distances

Claims (9)

  1.  A face orientation detection system comprising:
      an imaging unit that captures an image of a person's face;
      an eye corner detection unit that detects a position of an outer corner of an eye of the person based on the image of the person's face captured by the imaging unit; and
      a face orientation detection unit that detects an orientation of the person's face based on the position of the outer corner of the eye detected by the eye corner detection unit and a position of a head rotation reference serving as a reference about which the person's head rotates.
  2.  The face orientation detection system according to claim 1, wherein the face orientation detection unit detects the orientation of the person's face in a vertical direction based on a distance in the vertical direction between the position of the head rotation reference and the positions of the outer eye corners.
  3.  The face orientation detection system according to claim 1 or 2, wherein the face orientation detection unit detects the orientation of the person's face in a left-right direction intersecting the vertical direction based on the position of the head rotation reference and the positions of the outer eye corners in the left-right direction.
  4.  The face orientation detection system according to any one of claims 1 to 3, wherein the face orientation detection unit detects the orientation of the person's face based on a corrected position of the head rotation reference, obtained by correcting a predetermined standard position of the head rotation reference according to a distance between the person's left and right outer eye corners determined from the image of the person's face captured by the imaging unit while the person faces the front.
  5.  The face orientation detection system according to claim 4, wherein the face orientation detection unit detects the orientation of the person's face based on the corrected position of the head rotation reference, obtained by correcting, according to the distance between the person's left and right outer eye corners, a distance in the vertical direction between the position of the head rotation reference and the positions of the outer eye corners, and a distance in a depth direction intersecting the vertical direction between the position of the head rotation reference and a position of a midpoint between the left and right outer eye corners.
  6.  The face orientation detection system according to any one of claims 1 to 5, wherein the face orientation detection unit detects the orientation of the person's face based on a corrected position of the head rotation reference, obtained by correcting a predetermined standard position of the head rotation reference according to a relative position of the imaging unit with respect to a front position of the person.
  7.  The face orientation detection system according to claim 6, further comprising a parameter update unit that corrects, according to the relative position of the imaging unit with respect to the front position of the person, the distance in the vertical direction between the position of the head rotation reference and the positions of the outer eye corners, and the distance in the depth direction intersecting the vertical direction between the position of the head rotation reference and the position of the midpoint between the left and right outer eye corners.
  8.  The face orientation detection system according to any one of claims 1 to 7, further comprising:
      an acceleration sensor that detects an acceleration of a vehicle; and
      a front determination unit that determines, based on the acceleration of the vehicle detected by the acceleration sensor, that the person is facing the front,
      wherein the imaging unit captures an image of the person's face according to the determination result of the front determination unit based on the acceleration of the vehicle.
  9.  The face orientation detection system according to any one of claims 1 to 8, further comprising:
      a line-of-sight detection unit that detects a line of sight of the person; and
      a front determination unit that determines, based on the line of sight of the person detected by the line-of-sight detection unit, that the person is facing the front,
      wherein the imaging unit captures an image of the person's face according to the determination result of the front determination unit based on the line of sight of the person.
PCT/JP2022/042961 2021-12-14 2022-11-21 Face direction determination system WO2023112606A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021202576A JP7418386B2 (en) 2021-12-14 2021-12-14 Face orientation determination system
JP2021-202576 2021-12-14

Publications (1)

Publication Number Publication Date
WO2023112606A1 true WO2023112606A1 (en) 2023-06-22

Family

ID=86774113

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/042961 WO2023112606A1 (en) 2021-12-14 2022-11-21 Face direction determination system

Country Status (2)

Country Link
JP (1) JP7418386B2 (en)
WO (1) WO2023112606A1 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018163411A 2017-03-24 2018-10-18 Alps Electric Co., Ltd. Face direction estimation device and face direction estimation method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009104524A (en) * 2007-10-25 2009-05-14 Hitachi Ltd Gaze direction measuring method and gaze direction measuring device
WO2015186519A1 * 2014-06-06 2015-12-10 Sharp Corporation Image processing device and image display device
JP2020144546A * 2019-03-05 2020-09-10 Alpine Electronics, Inc. Facial feature detection apparatus and facial feature detection method

Also Published As

Publication number Publication date
JP7418386B2 (en) 2024-01-19
JP2023087986A (en) 2023-06-26


Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22907137

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE