WO2023095229A1 - Wakefulness estimation device and wakefulness estimation method - Google Patents

Wakefulness estimation device and wakefulness estimation method

Info

Publication number
WO2023095229A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
distance
face
driver
feature points
Prior art date
Application number
PCT/JP2021/043119
Other languages
French (fr)
Japanese (ja)
Inventor
智大 松本
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2021/043119 priority Critical patent/WO2023095229A1/en
Priority to JP2023561121A priority patent/JP7403729B2/en
Publication of WO2023095229A1 publication Critical patent/WO2023095229A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • the present disclosure relates to an arousal level estimation device and an arousal level estimation method.
  • A person wrinkles the brows not only when sleepy with decreased alertness; people may also wrinkle their eyebrows when they feel glare. Therefore, it is difficult to distinguish, from the wrinkled-brow expression alone, whether the person's arousal level has decreased or whether the person feels dazzled. The conventional technology represented by the technology disclosed in Patent Document 1 cannot make this distinction from the wrinkled-brow expression alone, and may therefore erroneously infer that a person's arousal level has decreased.
  • The present disclosure has been made to solve the above problem, and an object of the present disclosure is to provide an arousal level estimation device that prevents erroneous estimation of a decrease in a person's arousal level by estimating the person's arousal level while taking into consideration whether, when the person is wrinkling their eyebrows, the surrounding situation is one in which the person can be assumed to feel dazzled.
  • The arousal level estimation device includes a face determination information acquisition unit that acquires face determination information indicating whether or not a person has a face with wrinkles between the eyebrows; a brightness detection unit that detects brightness in a captured image of a range in which the person's face should be captured; and an estimation unit that estimates the degree of arousal of the person based on the face determination information acquired by the face determination information acquisition unit and the brightness detected by the brightness detection unit.
  • According to the present disclosure, by estimating the degree of arousal of a person in consideration of whether the surrounding situation is one in which the person can be assumed to feel dazzled when the person is wrinkling the eyebrows, it is possible to prevent misestimation of a decrease in the person's arousal level.
  • FIG. 1 is a diagram showing a configuration example of an arousal level estimation device according to Embodiment 1.
  • FIG. 2 is a flowchart for explaining the operation of the wakefulness estimation device according to Embodiment 1.
  • FIG. 3 is a flowchart for explaining details of the patient face determination process by the face determination unit in step ST5 of FIG. 2.
  • FIG. 4 is a flowchart for explaining details of the arousal level estimation process by the estimation unit in step ST8 of FIG. 2.
  • FIG. 5 is a flowchart for explaining the calculation of the distance between reference feature points by the reference distance calculation unit in Embodiment 1.
  • FIG. 6A and FIG. 6B are diagrams showing examples of the hardware configuration of the wakefulness estimation device according to Embodiment 1.
  • FIG. 7 is a diagram showing a configuration example of an arousal level estimation device provided with a personal authentication function in Embodiment 1.
  • FIG. 8 is a flowchart for explaining the operation of the awakening level estimation device having a personal authentication function in Embodiment 1.
  • Embodiment 1 estimates a person's arousal level based on a captured image in which at least the face of the person is captured.
  • When a person's face has wrinkles between the eyebrows, the arousal level estimating device distinguishes whether that face is due to a decrease in the person's arousal level or due to the person feeling dazzled, and estimates the person's degree of arousal accordingly.
  • In the following, a face with wrinkles between the eyebrows is referred to as a "patient face".
  • A "patient face" is a so-called frowning face with wrinkled eyebrows and narrowed eyes.
  • the person whose awakening level is estimated by the awakening level estimation device is the driver of the vehicle.
  • The result of estimating whether the driver's wakefulness has decreased, in other words whether the driver is drowsy, is used, for example, to detect the driver dozing off.
  • FIG. 1 is a diagram showing a configuration example of an arousal level estimation device 1 according to Embodiment 1.
  • Arousal level estimation device 1 includes face determination device 2 , face determination information acquisition unit 11 , luminance detection unit 12 , and estimation unit 13 .
  • Arousal level estimation device 1 is connected to imaging device 3 .
  • the imaging device 3 is mounted on a vehicle (not shown) and installed so as to be capable of imaging at least the range in which the driver's face should exist.
  • the imaging device 3 may be shared with a so-called DMS (Driver Monitoring System) installed for the purpose of monitoring the interior of the vehicle.
  • the imaging device 3 is a visible light camera or an infrared camera.
  • the imaging device 3 outputs the captured image to the face determination device 2 included in the wakefulness estimation device 1 .
  • the face determination device 2 acquires the captured image and determines whether or not the driver is making a patient face based on the acquired captured image.
  • the face determination device 2 outputs information (hereinafter referred to as “face determination information”) indicating whether or not the driver is making a patient face to the face determination information acquisition section 11 .
  • the face determination information acquisition unit 11 acquires face determination information from the face determination device 2 .
  • the face determination information acquisition unit 11 outputs the acquired face determination information to the estimation unit 13 .
  • the brightness detection unit 12 detects brightness in the captured image captured by the imaging device 3 .
  • the luminance detection unit 12 may acquire the captured image via the face determination device 2 .
  • the brightness in the captured image is, for example, the average value of brightness values of pixels in the captured image.
  • the luminance detection unit 12 may set, for example, the median value of the luminance values of the pixels in the captured image as the luminance in the captured image.
  • Note that the luminance detection unit 12 need only detect the luminance of at least an area including the driver's face (hereinafter referred to as "face area") on the captured image.
  • the brightness detection unit 12 may identify the driver's face area from the captured image using a known image processing technique.
  • Alternatively, the brightness detection unit 12 may, for example, acquire information on feature points indicating the driver's face (hereinafter referred to as "feature point information") from a feature point detection unit 22 (described later) of the face determination device 2.
  • the luminance detection unit 12 outputs information about the detected luminance in the captured image (hereinafter referred to as “luminance information”) to the estimation unit 13 .
  • the luminance detection unit 12 uses information that associates the captured image with the detected luminance as luminance information.
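  • As a minimal sketch of this luminance detection step (the function name and the numpy-based image representation are our assumptions, not from the patent), the representative luminance of a grayscale captured image could be computed as follows:

```python
import numpy as np

def detect_luminance(image: np.ndarray, use_median: bool = False) -> float:
    """Representative luminance of a captured image.

    `image` is assumed to be a single-channel (grayscale or infrared)
    frame whose pixel values encode luminance. The text allows either
    the average or the median of the pixel luminance values to serve
    as "the luminance in the captured image".
    """
    values = image.astype(np.float64).ravel()
    return float(np.median(values)) if use_median else float(values.mean())

# Usage: the luminance information could pair a frame with its luminance.
# frame = np.zeros((480, 640), dtype=np.uint8)   # placeholder frame
# print(detect_luminance(frame))                  # -> 0.0
```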
  • the estimating unit 13 estimates the arousal level of the driver based on the face determination information acquired by the face determination information acquiring unit 11 and the brightness of the captured image detected by the brightness detecting unit 12 .
  • the degree of arousal estimated by the estimation unit 13 may be, for example, a degree of arousal represented by whether the degree of arousal is high or low, or may be a degree of arousal represented by stages such as five stages.
  • the estimation unit 13 first determines whether or not the face determination information is face determination information indicating that the driver is making a patient face.
  • If the face determination information indicates that the driver is making a patient face, the estimating unit 13 determines whether the luminance in the captured image is less than a preset threshold (hereinafter referred to as "luminance determination threshold"). If the luminance in the captured image is less than the luminance determination threshold, the estimating unit 13 estimates that the driver's wakefulness has decreased, in other words, that the driver is drowsy. If the luminance in the captured image is equal to or higher than the luminance determination threshold, the estimating unit 13 estimates that the driver's arousal level has not decreased.
  • the state in which the driver's arousal level has not decreased includes a state in which the driver's arousal level has increased and a state in which the driver's arousal level is maintained without change.
  • Situations in which the driver makes a patient face are assumed to include situations in which the driver's arousal level has decreased and situations in which the surroundings are bright and the driver feels dazzled.
  • The driver feels dazzled, for example, when sunlight shines into the area around the driver, in this case the interior of the vehicle.
  • When the driver feels dazzled, he or she may make a patient face even when not drowsy. That is, when the driver makes a patient face in a situation where the surroundings are assumed to be bright, it can be estimated that the driver is not drowsy but made the patient face due to the glare.
  • Conversely, when the driver makes a patient face in a situation where the surroundings are not bright enough to feel dazzling, it can be inferred that the driver made the patient face because he or she is drowsy.
  • Therefore, if the face determination information indicates that the driver is making a patient face and the luminance in the captured image is less than the luminance determination threshold, the estimating unit 13 presumes that the patient face was made because the driver's arousal level has decreased, that is, it estimates that the driver's arousal level has decreased. If the luminance in the captured image is less than the luminance determination threshold, it can be determined that the surroundings of the driver are not bright enough to make the driver feel dazzled.
  • The luminance determination threshold is set by an administrator or the like based on the luminance of a captured image taken under brightness that is assumed to make the driver feel dazzled.
  • the imaging range here is the interior of the vehicle including at least the range where the driver's face should be present.
  • If the face determination information indicates that the driver is making a patient face and the luminance in the captured image is equal to or higher than the luminance determination threshold, the estimating unit 13 presumes that the patient face was not caused by a decrease in the driver's arousal level, that is, it estimates that the driver's arousal level has not decreased. In other words, it presumes that the patient face was made because the driver felt dazzled. If the luminance in the captured image is equal to or higher than the luminance determination threshold, it can be determined that the surroundings of the driver are bright enough to make the driver feel dazzled.
  • In this way, the estimating unit 13 discriminates, based on the brightness around the driver, whether the patient face is due to a decrease in the driver's arousal level or due to the driver feeling dazzled rather than a decrease in arousal level.
  • Note that even when the luminance in the captured image is equal to or higher than the luminance determination threshold, the estimation unit 13 may estimate that the driver's arousal level has decreased if it determines, based on the luminance distribution on the captured image, that there is a shadow around the driver's face.
  • the luminance information output from the luminance detection unit 12 to the estimation unit 13 includes the captured image.
  • the estimation unit 13 may determine the luminance distribution on the captured image from the luminance value of each pixel on the captured image. Also, the estimation unit 13 may identify the driver's face area on the captured image using a known image processing technique.
  • The face area is, for example, the minimal rectangular area that includes all feature points indicating the outline of the driver's face. Then, for example, if the average luminance value of the pixels included in the face area is equal to or less than a preset threshold (hereinafter referred to as "shadow determination threshold"), the estimation unit 13 determines that there is a shadow around the driver's face. Alternatively, the estimation unit 13 may determine whether there is a shadow around the driver's face by comparing the average luminance value of pixels included in the area where the driver's eyes exist (hereinafter referred to as "eye area") with the shadow determination threshold.
  • The eye area is, for example, the minimal rectangular area that includes all feature points indicating the driver's eyes, such as the inner and outer corners of the eyes.
  • On the other hand, even when the surroundings are not bright enough to feel dazzling, only the driver's face may be illuminated. In that case, the area around the face is locally bright, and the driver may feel dazzled and make a patient face due to the glare. When the driver makes a patient face in a situation where the area around the driver's face is locally bright, it can therefore be inferred that the face is not due to a decrease in arousal level but due to the feeling of being dazzled.
  • Therefore, even when the luminance in the captured image is less than the luminance determination threshold, the estimation unit 13 may estimate that the driver's arousal level has not decreased if it determines, based on the luminance distribution on the captured image, that the area around the driver's face is bright. For example, the estimation unit 13 determines that the area around the driver's face is bright if the average luminance value of the pixels included in the driver's face area on the captured image is equal to or greater than a preset threshold (hereinafter referred to as "brightness determination threshold"). Alternatively, the estimating unit 13 may determine whether the area around the driver's face is bright by comparing the average luminance value of pixels included in the driver's eye area with the brightness determination threshold.
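  • A sketch of these region-based checks, assuming the face and eye areas are given as minimal bounding rectangles of their feature points (function names and threshold handling are illustrative, not from the patent):

```python
import numpy as np

def region_mean_luminance(image: np.ndarray, points: np.ndarray) -> float:
    """Mean luminance inside the minimal rectangle bounding `points`
    (e.g. facial-outline feature points for the face area, or the
    inner/outer eye corners for the eye area). `points` is (N, 2)
    in (x, y) pixel coordinates."""
    x0, y0 = points.min(axis=0)
    x1, y1 = points.max(axis=0)
    return float(image[int(y0):int(y1) + 1, int(x0):int(x1) + 1].mean())

def face_shadowed(image: np.ndarray, points: np.ndarray,
                  shadow_threshold: float) -> bool:
    # Shadow determination: region luminance at or below the threshold.
    return region_mean_luminance(image, points) <= shadow_threshold

def face_locally_bright(image: np.ndarray, points: np.ndarray,
                        brightness_threshold: float) -> bool:
    # Brightness determination: region luminance at or above the threshold.
    return region_mean_luminance(image, points) >= brightness_threshold
```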
  • The estimating unit 13 first determines whether the face determination information indicates that the driver is making a patient face. When the face determination information indicates that the driver is not making a patient face, in other words, when the face determination device 2 has determined that the driver is not making a patient face, the estimating unit 13 estimates the driver's degree of arousal using a known image-based arousal level estimation technique. For example, the estimating unit 13 estimates that the driver's wakefulness has decreased when the driver's degree of eye opening detected based on the captured image is equal to or less than a preset threshold, and that it has not decreased when the degree of eye opening is larger than that threshold.
  • When the estimation unit 13 estimates that the driver's arousal level has decreased, it outputs information for alerting the driver (hereinafter referred to as "warning information"). For example, the estimation unit 13 outputs warning information for outputting a warning sound to an output device (not shown).
  • the output device is assumed to be, for example, a speaker mounted on a vehicle.
  • When the estimation unit 13 outputs the warning information, the output device outputs a warning sound.
  • the estimation unit 13 may store the estimation result of the degree of arousal of the driver in the storage unit 26 in association with the estimation date and time. In this case, the estimation unit 13 can determine the tendency of the degree of arousal of the driver, such as how much the degree of arousal of the driver has changed, based on the past estimation result.
  • the face determination device 2 includes an image acquisition section 21 , a feature point detection section 22 , a distance calculation section 23 , a reference distance calculation section 24 , a face determination section 25 and a storage section 26 .
  • the face determination section 25 includes a recalculation instruction section 251 .
  • the image acquisition unit 21 acquires a captured image from the imaging device 3 .
  • the image acquisition unit 21 outputs the acquired captured image to the feature point detection unit 22 and the brightness detection unit 12 .
  • the feature point detection unit 22 detects a plurality of feature points of the driver's face on the captured image based on the captured image acquired by the image acquisition unit 21 .
  • the plurality of feature points detected by the feature point detection unit 22 include at least feature points indicating the inner corners of the driver's eyebrows.
  • In Embodiment 1, the following description assumes that the plurality of feature points detected by the feature point detection unit 22 are the feature points indicating the inner corners of the driver's eyebrows.
  • the feature point detection unit 22 outputs feature point information regarding the detected feature points, specifically, feature points indicating the inner corners of the eyebrows of the driver, to the distance calculation unit 23 .
  • the feature point detection unit 22 may detect feature points indicating the upper and lower eyelids of both eyes of the driver and output feature point information indicating the feature points to the estimation unit 13 .
  • the estimating unit 13 can, for example, calculate the degree of eye opening of the driver based on the feature point information output from the feature point detecting unit 22, and estimate the degree of arousal of the driver from the calculated degree of eye opening.
  • The distance calculation unit 23 converts the coordinates on the captured image of the plurality of feature points detected by the feature point detection unit 22 into three-dimensional coordinates. Then, the distance calculation unit 23 calculates the distances between the plurality of feature points (hereinafter referred to as "inter-feature point distances") from the converted three-dimensional coordinates. Specifically, here, the distance calculation unit 23 converts the coordinates on the captured image of the feature points indicating the inner corners of the driver's eyebrows into three-dimensional coordinates, and calculates the distance between the inner corners of the driver's eyebrows (hereinafter referred to as "eyebrow distance") from the converted three-dimensional coordinates as the inter-feature point distance.
  • the distance calculation unit 23 may convert the coordinates of the feature points on the captured image into three-dimensional coordinates using a known technique for converting two-dimensional coordinates into three-dimensional coordinates, such as AAM (Active Appearance Model).
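  • Once the feature point coordinates have been lifted to three dimensions (the lifting step, e.g. via an AAM-style model fit, is assumed to happen upstream and is not shown), the distance computation itself reduces to a Euclidean norm; a minimal sketch:

```python
import numpy as np

def inter_feature_point_distance(p1_3d: np.ndarray, p2_3d: np.ndarray) -> float:
    """Euclidean distance between two feature points given as 3-D
    coordinates (x, y, z), e.g. the two inner-eyebrow points."""
    return float(np.linalg.norm(p1_3d - p2_3d))

# Example with two hypothetical 3-D points (units arbitrary, e.g. mm):
left_brow = np.array([-15.0, 40.0, 520.0])
right_brow = np.array([15.0, 41.0, 522.0])
eyebrow_distance = inter_feature_point_distance(left_brow, right_brow)  # ~30.08
```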
  • the distance calculation unit 23 outputs information about the calculated distance between feature points (hereinafter referred to as “distance information”) to the face determination unit 25 . Further, the distance calculation unit 23 causes the storage unit 26 to store the distance information. Note that the distance calculation unit 23 stores the distance information in chronological order in association with the calculation time of the inter-feature point distance. The distance information stored in the storage unit 26 is deleted when the power of the vehicle is turned off.
  • The reference distance calculation unit 24 refers to the distance information stored in the storage unit 26 and calculates the reference inter-feature point distance based on the history of inter-feature point distances calculated by the distance calculation unit 23 over a preset period (hereinafter referred to as "reference calculation target period").
  • The reference inter-feature point distance is the inter-feature point distance used as a reference when the face determination device 2 determines whether or not the driver is making a patient face, and is the distance regarded as the inter-feature point distance when the driver is not making a patient face.
  • the determination of the patient face is performed by the face determination unit 25 . Details of the face determination unit 25 will be described later.
  • the reference distance calculator 24 sets the mode of the inter-feature point distances for the reference calculation target period as the reference inter-feature point distance. Note that this is merely an example, and the reference distance calculation unit 24 may use, for example, the average value or the median value of the distances between feature points for the reference calculation target period as the reference distance between feature points.
  • The reference distance calculator 24 calculates the reference inter-feature point distance at a predetermined timing (hereinafter referred to as "reference calculation timing"). Specifically, the reference distance calculator 24 calculates the reference inter-feature point distance from the distance information for the reference calculation target period that is stored in the storage unit 26 after it is determined that the reference calculation timing has come. In Embodiment 1, the reference calculation timing is when the power of the vehicle is turned on and when the recalculation instruction unit 251 outputs an instruction to recalculate the reference inter-feature point distance. Details of the recalculation instruction unit 251 will be described later. If the storage unit 26 has not yet accumulated the distance information for the reference calculation target period after the reference calculation timing, the reference distance calculation unit 24 waits until that distance information is stored. The reference distance calculation unit 24 causes the storage unit 26 to store information about the calculated reference inter-feature point distance (hereinafter referred to as "reference distance information").
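  • A sketch of the reference calculation, assuming the stored distance information is a time-ordered list of eyebrow distances for the reference calculation target period (the binning used to take a mode over real-valued distances is our assumption):

```python
from collections import Counter
from statistics import mean, median

def reference_inter_feature_point_distance(distances: list[float],
                                           method: str = "mode",
                                           bin_width: float = 0.5) -> float:
    """Reference distance over the reference calculation target period.

    Embodiment 1 uses the mode; the mean or the median may be used
    instead. Real-valued distances rarely repeat exactly, so here the
    mode is taken over values rounded into bins of `bin_width`.
    """
    if method == "mean":
        return mean(distances)
    if method == "median":
        return median(distances)
    bins = Counter(round(d / bin_width) for d in distances)
    most_common_bin = bins.most_common(1)[0][0]
    return most_common_bin * bin_width
```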
  • the face determination unit 25 determines whether or not the driver is making a patient face by comparing the distance between feature points calculated by the distance calculation unit 23 and the reference distance between feature points.
  • The face determination unit 25 can identify the reference inter-feature point distance from the reference distance information stored in the storage unit 26. Specifically, when the inter-feature point distance, here the driver's eyebrow distance, is smaller than the reference inter-feature point distance by a preset threshold (hereinafter referred to as "first threshold") or more, in other words, when "reference inter-feature point distance - inter-feature point distance ≥ first threshold", the face determination unit 25 determines that the driver has wrinkled eyebrows, that is, that the driver is making a patient face. If the inter-feature point distance is not smaller than the reference inter-feature point distance by the first threshold or more, in other words, if "reference inter-feature point distance - inter-feature point distance < first threshold", the face determination unit 25 determines that the driver is not making a patient face.
  • When it is determined that the driver is not making a patient face, the face determination unit 25 further determines whether the inter-feature point distance is greater than the reference inter-feature point distance by a preset threshold (hereinafter referred to as "second threshold") or more, in other words, whether "inter-feature point distance - reference inter-feature point distance ≥ second threshold".
  • If the inter-feature point distance is greater than the reference inter-feature point distance by the second threshold or more, the face determination unit 25 notifies the recalculation instruction unit 251 to that effect.
  • When notified by the face determination unit 25 that the inter-feature point distance is larger than the reference inter-feature point distance by the second threshold or more, the recalculation instruction unit 251 instructs the reference distance calculation unit 24 to recalculate the reference inter-feature point distance. If the inter-feature point distance is larger than the reference inter-feature point distance by the second threshold or more, the reference inter-feature point distance was presumably calculated based on inter-feature point distances measured while the driver was making a patient face. As described above, the reference inter-feature point distance is the distance regarded as the inter-feature point distance (eyebrow distance) when the driver is not making a patient face.
  • A reference inter-feature point distance calculated based on inter-feature point distances measured while the driver was making a patient face is not suitable as the reference. Therefore, when the face determination unit 25 determines that the inter-feature point distance is greater than the reference inter-feature point distance by the second threshold or more, the recalculation instruction unit 251 has the reference distance calculation unit 24 recalculate the reference inter-feature point distance.
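  • Putting the two comparisons together (a sketch; the function name and return convention are illustrative, and the threshold values themselves are set in advance and not specified here):

```python
def judge_patient_face(distance: float, reference: float,
                       first_threshold: float,
                       second_threshold: float) -> tuple[bool, bool]:
    """Return (is_patient_face, needs_reference_recalculation).

    - reference - distance >= first_threshold  -> patient face.
    - distance - reference >= second_threshold -> the stored reference
      was presumably computed while the driver was making a patient
      face, so recalculation of the reference should be instructed
      (and no face determination information is output in that case).
    """
    if reference - distance >= first_threshold:
        return True, False
    return False, distance - reference >= second_threshold
```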
  • the face determination unit 25 outputs to the face determination information acquisition unit 11 face determination information indicating whether or not it is determined that the driver is making a patient face. Specifically, when determining that the driver is making a patient face, the face determination unit 25 outputs face determination information indicating that the driver is making a patient face to the face determination information acquisition unit 11 . When the face determination unit 25 determines that the driver is not making a patient face, the face determination unit 25 outputs face determination information indicating that the driver is not making a patient face to the face determination information acquisition unit 11 .
  • Note that when the face determination unit 25 has the recalculation instruction unit 251 instruct recalculation of the reference inter-feature point distance, it does not output face determination information. This is because the determination that the driver is not making a patient face may, in that case, be an erroneous determination.
  • the storage unit 26 stores distance information and reference distance information. Although the storage unit 26 is provided in the face determination device 2 in Embodiment 1, this is merely an example. The storage unit 26 may be provided outside the face determination device 2 at a location that the face determination device 2 can refer to.
  • FIG. 2 is a flowchart for explaining the operation of the wakefulness estimation device 1 according to the first embodiment.
  • the processing of steps ST1 to ST5 is processing performed by the face determination device 2 included in the arousal level estimation device 1.
  • the image acquisition unit 21 acquires a captured image from the imaging device 3 (step ST1).
  • the image acquisition unit 21 outputs the acquired captured image to the feature point detection unit 22 and the brightness detection unit 12 .
  • the feature point detection unit 22 detects a plurality of feature points of the driver's face on the captured image based on the captured image acquired by the image acquisition unit 21 in step ST1 (step ST2). In Embodiment 1, the feature point detection unit 22 detects feature points indicating the inner corners of the eyebrows of the driver. The feature point detection section 22 outputs the feature point information to the distance calculation section 23 .
  • The distance calculator 23 converts the coordinates of the plurality of feature points detected by the feature point detector 22 in step ST2 into three-dimensional coordinates (step ST3). Then, the distance calculation unit 23 calculates the inter-feature point distances between the plurality of feature points from the converted three-dimensional coordinates (step ST4). In Embodiment 1, the distance calculation unit 23 converts the coordinates on the captured image of the feature points indicating the inner corners of the driver's eyebrows into three-dimensional coordinates, and calculates the eyebrow distance from the converted three-dimensional coordinates as the inter-feature point distance. The distance calculation section 23 outputs the distance information to the face determination section 25.
  • the face determination unit 25 compares the distance between feature points calculated by the distance calculation unit 23 in step ST4 with the reference distance between feature points, thereby determining whether or not the driver is making a patient face. Processing is performed (step ST5).
  • the face determination information acquisition unit 11 acquires face determination information from the face determination device 2 (step ST6). Specifically, the face determination information acquisition unit 11 acquires the face determination information output by the face determination unit 25 of the face determination device 2 performing the patient face determination process in step ST5. The face determination information acquisition unit 11 outputs the acquired face determination information to the estimation unit 13 .
  • the brightness detection unit 12 detects brightness in the captured image captured by the imaging device 3 (step ST7). Specifically, the brightness detection unit 12 acquires the captured image acquired by the image acquisition unit 21 of the face determination device 2 in step ST1, and detects the brightness in the captured image. The luminance detection unit 12 outputs luminance information to the estimation unit 13 .
  • The estimating unit 13 performs arousal level estimation processing (step ST8) based on the face determination information acquired by the face determination information acquiring unit 11 in step ST6 and the luminance in the captured image detected by the luminance detection unit 12 in step ST7.
  • Note that in FIG. 2 the processing is performed in the order of step ST6 and then step ST7, but the processing order of step ST6 and step ST7 is not limited to this. The processing of step ST7 may be performed before step ST6, or the processing of step ST6 and the processing of step ST7 may be performed in parallel. The process of step ST7 may be performed at any timing after step ST1 and before the process of step ST8.
  • FIG. 3 is a flowchart for explaining the details of the patient face determination process by the face determination unit 25 in step ST5 of FIG.
  • The face determination unit 25 determines whether the inter-feature point distance, here the driver's eyebrow distance, is smaller than the reference inter-feature point distance by the first threshold or more, in other words, whether "reference inter-feature point distance - inter-feature point distance ≥ first threshold" (step ST21). If the inter-feature point distance is smaller than the reference inter-feature point distance by the first threshold or more ("YES" in step ST21), the face determination unit 25 determines that the driver is making a patient face (step ST22). If not, in other words if "reference inter-feature point distance - inter-feature point distance < first threshold" ("NO" in step ST21), the face determination unit 25 determines that the driver is not making a patient face (step ST24).
  • When it is determined that the driver is not making a patient face, the face determining unit 25 determines whether the inter-feature point distance is greater than the reference inter-feature point distance by the second threshold or more, in other words, whether "inter-feature point distance - reference inter-feature point distance ≥ second threshold" (step ST25). If so ("YES" in step ST25), the face determination section 25 notifies the recalculation instruction section 251 to that effect, and the recalculation instruction section 251 instructs the reference distance calculation section 24 to recalculate the reference inter-feature point distance (step ST26).
  • The face determination section 25 outputs the face determination information to the face determination information acquisition section 11 (step ST23). Note that when the face determination unit 25 determines in step ST25 that the inter-feature point distance is larger than the reference inter-feature point distance by the second threshold or more, that is, when recalculation of the reference inter-feature point distance must be instructed, it does not output face determination information. This is because the determination that the driver is not making a patient face (see step ST24) may be an erroneous determination.
  • FIG. 4 is a flowchart for explaining the details of the arousal level estimation process by the estimation unit 13 in step ST8 of FIG.
  • The estimation unit 13 determines whether the face determination information acquired by the face determination information acquisition unit 11 in step ST6 of FIG. 2 indicates that the driver is making a patient face, in other words, whether the face determination device 2 has determined that the driver is making a patient face (step ST31).
  • If the face determination information indicates that the driver is making a patient face ("YES" in step ST31), the estimating unit 13 determines whether the luminance in the captured image detected by the luminance detecting unit 12 in step ST7 of FIG. 2 is less than the luminance determination threshold (step ST32).
  • When it is determined in step ST32 that the luminance in the captured image is less than the luminance determination threshold ("YES" in step ST32), the estimating unit 13 determines, based on the luminance distribution on the captured image, whether the area around the driver's face is bright (step ST33).
  • When it is determined in step ST33 that the area around the driver's face is bright ("YES" in step ST33), the estimation unit 13 estimates that the driver's arousal level has not decreased (step ST35). The estimating unit 13 may store the result of estimating the driver's degree of wakefulness in the storage unit 26. When it is determined in step ST33 that the area around the driver's face is not bright ("NO" in step ST33), the processing of the estimating section 13 proceeds to step ST34.
  • When it is determined in step ST32 that the luminance in the captured image is not less than the luminance determination threshold, in other words, when the luminance is equal to or greater than the luminance determination threshold ("NO" in step ST32), the estimation unit 13 determines, based on the luminance distribution on the captured image, whether there is a shadow around the driver's face (step ST36).
  • When it is determined in step ST36 that there is a shadow around the driver's face ("YES" in step ST36), or when it is determined in step ST33 that the area around the driver's face is not bright ("NO" in step ST33), the estimation unit 13 estimates that the driver's wakefulness has decreased, in other words, that the driver is drowsy (step ST34). Then, the estimation unit 13 outputs warning information.
  • the estimating unit 13 may store the result of estimating the degree of wakefulness of the driver in the storage unit 26 .
  • When it is determined in step ST36 that there is no shadow around the driver's face ("NO" in step ST36), the estimation unit 13 estimates that the driver's arousal level has not decreased (step ST37).
  • the estimating unit 13 may store the result of estimating the degree of wakefulness of the driver in the storage unit 26 .
  • If the face determination information indicates that the driver is not making a patient face ("NO" in step ST31), the estimating unit 13 estimates the driver's degree of arousal using a known image-based wakefulness estimation technique (step ST38).
  • the estimation unit 13 outputs warning information when it is estimated that the degree of alertness of the driver has decreased.
  • the estimating unit 13 may store the result of estimating the degree of wakefulness of the driver in the storage unit 26 .
  • Note that the processing of steps ST33, ST35, and ST36 can be omitted. In that case, when it is determined in step ST32 that the luminance in the captured image is equal to or greater than the luminance determination threshold ("NO" in step ST32), the estimation unit 13 proceeds to the process of step ST37.
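  • The whole decision flow of FIG. 4 can be condensed into one function (a sketch under our reading of steps ST31 to ST38; the names and string return values are illustrative, not from the patent):

```python
def estimate_arousal(is_patient_face: bool,
                     luminance: float,
                     luminance_threshold: float,
                     face_locally_bright: bool,
                     face_shadowed: bool) -> str:
    """Returns "decreased", "not_decreased", or "fallback"
    ("fallback" = use a known image-based estimation, step ST38)."""
    if not is_patient_face:                 # ST31 "NO"
        return "fallback"                   # ST38
    if luminance < luminance_threshold:     # ST32 "YES": scene is dark
        if face_locally_bright:             # ST33 "YES": local glare
            return "not_decreased"          # ST35
        return "decreased"                  # ST34: dark and frowning -> drowsy
    if face_shadowed:                       # ST36 "YES": bright scene, shaded face
        return "decreased"                  # ST34
    return "not_decreased"                  # ST37: bright -> dazzled, not drowsy
```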
  • FIG. 5 is a flowchart for explaining the calculation operation of the distance between reference feature points by the reference distance calculator 24 in the first embodiment.
  • the operation shown in the flowchart of FIG. 5 is performed in parallel with the operation of the face determination device 2 described using the flowchart of FIG.
  • The reference distance calculator 24 determines whether the reference calculation timing has come (step ST41). Specifically, the reference distance calculation unit 24 determines whether the power of the vehicle has been turned on or whether an instruction to recalculate the reference inter-feature point distance has been output from the recalculation instruction unit 251 (see step ST26 in FIG. 3). If it is determined that the reference calculation timing has not come ("NO" in step ST41), the reference distance calculator 24 waits until the reference calculation timing comes.
  • When it is determined that the reference calculation timing has come ("YES" in step ST41), the reference distance calculation unit 24 refers to the distance information stored in the storage unit 26, obtains the inter-feature point distances for the reference calculation target period (step ST42), and calculates the reference inter-feature point distance based on the obtained inter-feature point distances (step ST43).
  • Specifically, the reference distance calculation unit 24 acquires the inter-feature point distances from the distance information for the reference calculation target period that is stored in the storage unit 26 from the point in time when it is determined that the reference calculation timing has come. If that distance information has not yet been stored in the storage unit 26, the reference distance calculation unit 24 waits until the distance information for the reference calculation target period is stored, and then calculates the reference inter-feature point distance.
  • the reference distance calculation unit 24 causes the storage unit 26 to store the reference distance information (step ST44).
  • Based on the reference distance information calculated by the reference distance calculation unit 24 and stored in the storage unit 26 in this way, the face determination unit 25 performs the patient face determination process (see step ST5 in FIG. 2). Until the reference distance calculation unit 24 completes the calculation of the reference inter-feature point distance and the reference distance information is stored in the storage unit 26, the face determination unit 25 does not perform the patient face determination process.
  • For example, when the reference distance calculation unit 24 determines that the reference calculation timing has come ("YES" in step ST41 of FIG. 5), it sets a reference-calculation-in-progress flag to "1".
  • The reference-calculation-in-progress flag is set at a location that can be referred to by both the reference distance calculation unit 24 and the face determination unit 25.
  • The reference distance calculation unit 24 sets the initial value of the flag to "0". For example, when the reference-calculation-in-progress flag is set to "1", the face determination unit 25 outputs information indicating standby (hereinafter referred to as "standby notification") to the face determination information acquisition unit 11.
  • Upon acquiring the standby notification, the face determination information acquisition unit 11 outputs the standby notification to the estimation unit 13.
  • When the standby notification is output, the estimation unit 13 estimates the driver's degree of wakefulness by the same method as when face determination information indicating that the driver is not making a patient face is output.
  • That is, the estimation unit 13 estimates the driver's degree of arousal using a known image-based arousal level estimation technique. Alternatively, for example, while the reference-calculation-in-progress flag is set to "1", the face determination unit 25 may determine whether the driver is making a patient face by using, as the reference inter-feature point distance, a preset distance (hereinafter referred to as "alternative reference distance") that assumes the general distance between the inner corners of the eyebrows when a person is not making a patient face. The alternative reference distance is set in advance by an administrator or the like and stored in the face determination section 25.
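  • A small sketch of this fallback choice while the reference is being (re)calculated (the flag handling and names are our illustration, not the patent's):

```python
from typing import Optional

def current_reference(reference_in_progress: bool,
                      stored_reference: Optional[float],
                      alternative_reference: Optional[float]) -> Optional[float]:
    """While the reference-calculation-in-progress flag is "1", either
    return the administrator-set alternative reference distance, or
    None to signal a standby notification (the caller then falls back
    to a known image-based arousal estimation technique)."""
    if reference_in_progress:
        return alternative_reference  # may be None -> standby notification
    return stored_reference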
  • the wakefulness estimation device 1 acquires face determination information indicating whether or not the driver is making a patient face from the face determination device 2, and detects the luminance in the captured image. Then, the awakening level estimation device 1 estimates the driver's awakening level based on the acquired face determination information and the detected brightness.
  • As described above, the arousal level estimating device 1 estimates the driver's arousal level by considering whether the surrounding situation is one in which the driver can be assumed to feel dazzled when the driver is making a patient face, and can thereby prevent erroneous estimation of a decrease in the driver's arousal level.
  • Specifically, the arousal level estimation device 1 estimates that the driver's arousal level has decreased when the acquired face determination information indicates that the driver is making a patient face and the detected luminance in the captured image is less than the luminance determination threshold.
  • Therefore, the arousal level estimation device 1 can distinguish whether the patient face is a face made due to a decrease in the driver's arousal level or a face made because the driver felt dazzled. As a result, the awakening level estimation device 1 can prevent erroneously estimating that the driver's awakening level has decreased.
  • Further, the arousal level estimation device 1 may estimate that the driver's arousal level has not decreased when the acquired face determination information indicates that the driver is making a patient face, the detected luminance in the captured image is less than the luminance determination threshold, and it is determined based on the luminance distribution on the captured image that the area around the driver's face is bright. As a result, the arousal level estimation device 1 can handle the situation in which the area around the driver's face becomes locally bright and the driver feels dazzled even though the surroundings are in shadow or otherwise not bright enough for the driver to feel dazzled overall.
  • Therefore, the arousal level estimation device 1 can more reliably prevent erroneously estimating that the driver's arousal level has decreased.
  • Further, the arousal level estimation device 1 may estimate that the driver's arousal level has decreased when the acquired face determination information indicates that the driver is making a patient face, the detected luminance in the captured image is equal to or higher than the luminance determination threshold, and it is determined based on the luminance distribution on the captured image that there is a shadow around the driver's face. As a result, even when the surroundings are bright enough that the driver would ordinarily feel dazzled, the wakefulness estimation device 1 can handle the situation in which the area around the driver's face is locally shadowed and the driver does not feel dazzled.
  • the wakefulness estimation device 1 can more accurately estimate the decrease in the driver's wakefulness.
  • The arousal level estimation device 1, more specifically the face determination device 2 of the arousal level estimation device 1, converts the coordinates on the captured image of the plurality of feature points of the driver's face detected from the captured image (the inner corners of both eyebrows) into three-dimensional coordinates, and calculates the inter-feature point distance (eyebrow distance) between the plurality of feature points from the converted three-dimensional coordinates. Then, the arousal level estimation device 1 determines whether the driver is making a patient face by comparing the inter-feature point distance with the reference inter-feature point distance, and determines that the driver is making a patient face when the inter-feature point distance is smaller than the reference inter-feature point distance by the first threshold or more.
  • Since the arousal level estimation device 1 converts the coordinates of the plurality of feature points on the captured image into three-dimensional coordinates before calculating the inter-feature point distance, the calculation is unaffected by the driver's face turning or tilting to the left or right. That is, the wakefulness estimation apparatus 1 can calculate the inter-feature point distance with higher accuracy than when calculating it without converting the coordinates of the feature points on the captured image into three-dimensional coordinates. As a result, the awakening level estimation device 1 can accurately determine whether the driver is making a patient face.
  • The awakening level estimation device 1 calculates the reference inter-feature point distance based on the history of the calculated inter-feature point distances for the reference calculation target period, and compares the inter-feature point distance with the reference inter-feature point distance.
  • Therefore, the arousal level estimation device 1 can set the reference inter-feature point distance according to the driver whose arousal level is to be estimated. Further, when the arousal level estimation device 1 determines, as a result of comparing the inter-feature point distance and the reference inter-feature point distance, that the inter-feature point distance is greater than the reference inter-feature point distance by the second threshold or more, it recalculates the reference inter-feature point distance.
  • Therefore, the arousal level estimation device 1 can avoid determining whether the driver is making a patient face based on a reference inter-feature point distance that is unsuitable as the distance regarded as the inter-feature point distance when the driver is not making a patient face.
  • FIG. 6A and 6B are diagrams showing an example of the hardware configuration of the wakefulness estimation device 1 according to Embodiment 1.
  • In the arousal level estimation device 1, the functions of the face determination information acquisition unit 11, the luminance detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 are realized by a processing circuit 101. That is, the arousal level estimating device 1 includes the processing circuit 101 for performing control to estimate the driver's arousal level based on the result of the face determination device 2 determining, from the captured image, whether the driver is making a patient face and on the luminance in the captured image.
  • the processing circuitry 101 may be dedicated hardware, as shown in FIG. 6A, or a processor 104 executing a program stored in memory, as shown in FIG. 6B.
  • When the processing circuit is dedicated hardware, the processing circuit 101 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • When the processing circuit is the processor 104, the functions of the face determination information acquisition unit 11, the luminance detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 are implemented by software, firmware, or a combination of software and firmware.
  • Software or firmware is written as a program and stored in memory 105 .
  • The processor 104 implements the functions of the face determination information acquisition unit 11, the luminance detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 by reading out and executing the programs stored in the memory 105.
  • wakefulness estimation apparatus 1 includes memory 105 for storing a program that, when executed by processor 104, results in execution of steps ST1 to ST8 in FIG. 2 described above.
  • In other words, it can be said that the programs stored in the memory 105 cause a computer to execute the procedures or methods performed by the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25.
  • The memory 105 corresponds to, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read-Only Memory).
  • the face determination information acquisition unit 11, the luminance detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 may be partly implemented by dedicated hardware and partly implemented by software or firmware.
  • the function of the image acquisition unit 21 is realized by a processing circuit 101 as dedicated hardware.
  • the functions of the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 can be realized by the processor 104 reading and executing programs stored in the memory 105.
  • the storage unit 26 is configured by, for example, a memory.
  • The wakefulness estimation apparatus 1 also includes an input interface device 102 and an output interface device 103 that perform wired or wireless communication with devices such as the imaging device 3 or an output device (not shown).
  • In Embodiment 1 described above, the plurality of feature points detected by the feature point detection unit 22 are the feature points indicating the inner corners of the driver's eyebrows, the distance calculation unit 23 calculates the eyebrow distance as the inter-feature point distance based on the coordinates on the captured image of those feature points, and the face determination unit 25 determines whether the driver is making a patient face by comparing the eyebrow distance calculated by the distance calculation unit 23 with the reference inter-feature point distance. However, this is only an example.
  • For example, the plurality of feature points detected by the feature point detection unit 22 may be feature points indicating the inner corners of the driver's eyebrows and the inner corners of the driver's eyes, and the distance calculation unit 23 may calculate the eyebrow distance and the distance between the inner corners of the eyes as inter-feature point distances based on the coordinates of those feature points. In this case, the face determination unit 25 determines whether the driver is making a patient face by comparing the eyebrow distance and the distance between the inner corners of the eyes with the corresponding reference inter-feature point distances, and the reference distance calculator 24 calculates a reference inter-feature point distance corresponding to the eyebrow distance and a reference inter-feature point distance corresponding to the distance between the inner corners of the eyes.
  • Alternatively, the plurality of feature points detected by the feature point detection unit 22 may be feature points indicating the inner corners of the driver's eyes, and the distance calculation unit 23 may calculate the distance between the inner corners of the eyes as the inter-feature point distance based on the coordinates on the captured image of those feature points. In this case, the face determination unit 25 determines whether the driver is making a patient face by comparing the distance between the inner corners of the eyes with the reference inter-feature point distance, and the reference distance calculator 24 calculates the reference inter-feature point distance corresponding to the distance between the inner corners of the eyes.
  • note that feature points related to the eyes are more likely to go undetected than feature points related to the eyebrows; in some situations, the feature point detection unit 22 cannot detect feature points related to the driver's eyes at all. Therefore, in the wakefulness estimation devices 1 and 1a, it is preferable to determine whether or not the driver is making a patient face by comparing the distance between the eyebrows, calculated based on the feature points indicating the inner ends of the eyebrows, with the distance between reference feature points.
  • by making the determination based on the distance between the eyebrows in this way, the arousal level estimation devices 1 and 1a can reduce the possibility of falling into a situation where the determination cannot be made.
  • the arousal level estimation device can also have a personal authentication function.
  • with the personal authentication function, the arousal level estimation device can determine whether or not a given driver is making a patient face using the distance between reference feature points calculated during previous driving, without waiting for the distance between reference feature points to be calculated when the driver starts driving. In this case, the wakefulness estimation device does not have to delete the reference distance information stored in the storage unit when the power of the vehicle is turned off.
  • FIG. 7 is a diagram showing a configuration example of the awakening level estimation device 1a provided with a personal authentication function in Embodiment 1. In FIG. 7, the same components as those of the awakening level estimation device 1 described with reference to FIG. 1 are denoted by the same reference numerals, and redundant description is omitted. Here, it is assumed that the awakening level estimation device 1a has the personal authentication function in the face determination device 2a.
  • the face determination device 2a shown in FIG. 7 differs from the face determination device 2 described with reference to FIG. 1 in that it includes a personal authentication unit 27. In addition, the specific operations of the reference distance calculation unit 24 and the face determination unit 25 in the face determination device 2a differ from those in the face determination device 2.
  • the personal authentication unit 27 performs personal authentication of the driver.
  • for example, the personal authentication unit 27 stores a face image of the driver in advance and performs personal authentication of the driver by pattern matching against the captured image obtained from the image acquisition unit 21.
  • the personal authentication unit 27 may instead perform personal authentication of the driver using any known personal authentication technique.
  • the personal authentication unit 27 outputs the result of personal authentication (hereinafter referred to as the "personal authentication result") to the reference distance calculation unit 24 and the face determination unit 25.
  • the reference distance calculation unit 24 calculates the distance between reference feature points for each individual based on the result of personal authentication by the personal authentication unit 27. Specifically, the reference distance calculation unit 24 calculates the distance between reference feature points as that of the driver authenticated by the personal authentication unit 27.
  • the reference distance calculation unit 24 causes the storage unit 26 to store, as reference distance information, information in which the calculated distance between reference feature points is associated with information that can identify the individual driver. It is assumed that the personal authentication result includes information that can identify the individual driver.
  • the face determination unit 25 determines whether or not the driver is making a patient face based on the result of personal authentication by the personal authentication unit 27.
  • specifically, the face determination unit 25 collates the personal authentication result from the personal authentication unit 27 with the reference distance information stored in the storage unit 26 to identify the distance between the reference feature points of the driver. The face determination unit 25 then determines whether or not the driver is making a patient face by comparing the distance between feature points calculated by the distance calculation unit 23 with the identified driver's distance between reference feature points. When the face determination unit 25 determines that the distance between feature points is greater than the distance between reference feature points by the second threshold or more and notifies the recalculation instruction unit 251 to that effect, it also notifies the recalculation instruction unit 251 of the information that can identify the individual driver.
  • when the recalculation instruction unit 251 instructs the reference distance calculation unit 24 to recalculate the distance between reference feature points, it also outputs the information that can identify the individual driver. After recalculating the distance between the reference feature points of that driver, the reference distance calculation unit 24 updates the reference distance information corresponding to the driver stored in the storage unit 26.
  • because the reference distance calculation unit 24 stores in the storage unit 26 reference distance information in which information that can identify an individual is associated with the distance between reference feature points, the face determination unit 25 can determine whether or not the authenticated driver is making a patient face using the distance between reference feature points stored for that driver (a minimal storage sketch is given after this list).
  • FIG. 8 is a flowchart for explaining the operation of the awakening level estimation device 1a provided with the personal authentication function in Embodiment 1. The specific operations of step ST51, steps ST53 to ST55, and steps ST57 to ST59 in FIG. 8 are similar to those of steps ST1 to ST4 and steps ST6 to ST8 in FIG. 2, so their explanation is omitted.
  • the personal authentication unit 27 performs personal authentication of the driver (step ST52) and outputs the personal authentication result to the reference distance calculation unit 24 and the face determination unit 25.
  • the face determination unit 25 performs the patient face determination process of determining whether or not the driver is making a patient face based on the result of personal authentication by the personal authentication unit 27 in step ST52 (step ST56).
  • specifically, the face determination unit 25 acquires the distance between the reference feature points of the driver based on the reference distance information, and compares it with the distance between feature points calculated by the distance calculation unit 23.
  • when the recalculation instruction unit 251 instructs the reference distance calculation unit 24 to recalculate the distance between reference feature points, it also outputs the information that can identify the individual driver.
  • upon receiving the instruction, the reference distance calculation unit 24 performs the operation described with reference to FIG. 5 and calculates the distance between reference feature points for each individual. The reference distance calculation unit 24 then causes the storage unit 26 to store, as reference distance information, information in which the calculated distance between reference feature points is associated with the information that can identify the individual driver.
  • FIGS. 6A and 6B also show an example of the hardware configuration of the wakefulness estimation device 1a when it has the configuration shown in FIG. 7.
  • the functions of the face determination information acquisition unit 11, the luminance detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27 are realized by the processing circuit 101. That is, the arousal level estimation device 1a includes the processing circuit 101 for estimating the driver's arousal level based on the brightness in the captured image and on the result of the face determination device 2a determining, based on the captured image, whether or not the driver is making a patient face, for performing personal authentication, and for performing control to store the personal authentication result and the reference distance information in association with each other.
  • when the functions are realized by software or firmware, the processor 104 reads out and executes programs stored in the memory 105, thereby executing the functions of the face determination information acquisition unit 11, the luminance detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27. That is, the wakefulness estimation apparatus 1a includes the memory 105 for storing programs that, when executed, result in the execution of steps ST51 to ST59 in FIG. 8.
  • it can also be said that these programs cause a computer to execute the procedures or methods of the face determination information acquisition unit 11, the luminance detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27.
  • the wakefulness estimation device 1a also includes an input interface device 102 and an output interface device 103 that perform wired or wireless communication with devices such as the imaging device 3 or an output device (not shown).
  • in the description above, the person whose arousal level is estimated by the arousal level estimation device 1 is the driver of the vehicle, and the arousal level is estimated for a single person. However, this is just one example.
  • the arousal level estimation device may estimate the arousal levels of a plurality of vehicle occupants, including the driver, a passenger in the front passenger seat, or passengers in the rear seats.
  • in that case, the feature point detection unit 22 determines the seat position of each occupant based on the captured image and detects a plurality of feature points for each seat position. The feature point detection unit 22 then outputs to the distance calculation unit 23, as feature point information, information in which the plurality of detected feature points are associated with information that can identify each occupant.
  • here, the information that can identify each occupant is information about the seat position.
  • the distance calculation unit 23 calculates the distance between feature points for each occupant corresponding to the seat position determined by the feature point detection unit 22.
  • the distance calculation unit 23 causes the storage unit 26 to store, as distance information, information that associates the information that can identify each occupant with the distance between feature points, and outputs the distance information to the face determination unit 25.
  • the reference distance calculation unit 24 calculates the distance between reference feature points for each occupant corresponding to the seat position determined by the feature point detection unit 22.
  • the reference distance calculation unit 24 causes the storage unit 26 to store, as reference distance information, information that associates the information that can identify each occupant with the distance between reference feature points.
  • the face determination unit 25 determines whether or not each occupant is making a patient face.
  • the face determination unit 25 outputs to the face determination information acquisition unit 11, as face determination information, information in which the information that can identify each occupant is associated with information indicating whether or not that occupant is making a patient face.
  • when the recalculation instruction unit 251 instructs the reference distance calculation unit 24 to recalculate the distance between reference feature points, it also outputs the information that can identify the relevant occupant. After recalculating the distance between reference feature points, the reference distance calculation unit 24 updates the reference distance information of the corresponding occupant stored in the storage unit 26.
  • the estimation unit 13 estimates the degree of wakefulness for each occupant. Note that the estimation unit 13 can identify each occupant from, for example, the feature point information output from the feature point detection unit 22 and the face determination information output from the face determination information acquisition unit 11.
  • alternatively, in the wakefulness estimation device 1a, the estimation unit 13 may estimate the degree of wakefulness for each occupant based on the personal authentication result.
  • in that case, the feature point detection unit 22 detects a plurality of feature points for each occupant based on the result of personal authentication by the personal authentication unit 27. Note that the arrow from the personal authentication unit 27 to the feature point detection unit 22 is omitted in FIG. 7.
  • the feature point detection unit 22 outputs information in which the plurality of detected feature points and the personal authentication result are associated with each other to the distance calculation unit 23 as feature point information.
  • the distance calculation unit 23 calculates the distance between feature points for each occupant based on the result of personal authentication by the personal authentication unit 27.
  • the distance calculation unit 23 causes the storage unit 26 to store, as distance information, information in which the calculated distance between feature points and the personal authentication result are associated with each other, and outputs the distance information to the face determination unit 25.
  • the reference distance calculation unit 24 calculates the distance between reference feature points for each occupant based on the result of personal authentication by the personal authentication unit 27.
  • the reference distance calculation unit 24 causes the storage unit 26 to store, as reference distance information, information in which the distance between reference feature points and the personal authentication result are associated with each other.
  • the face determination unit 25 determines whether or not each occupant is making a patient face based on the result of personal authentication by the personal authentication unit 27.
  • the face determination unit 25 outputs to the face determination information acquisition unit 11, as face determination information, information in which the information indicating whether or not the occupant is making a patient face is associated with the personal authentication result.
  • the estimation unit 13 estimates the degree of wakefulness for each occupant based on the result of personal authentication by the personal authentication unit 27. Note that the estimation unit 13 can also identify each occupant from, for example, the feature point information output from the feature point detection unit 22 and the face determination information output from the face determination information acquisition unit 11.
  • in the description above, the wakefulness estimation devices 1 and 1a are in-vehicle devices mounted in a vehicle, and the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and, when the wakefulness estimation device 1a is configured as shown in FIG. 7, the personal authentication unit 27 are provided in the in-vehicle device.
  • alternatively, a wakefulness estimation system may be configured in which the face determination information acquisition unit 11, the luminance detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27 are all provided in a server.
  • in the description above, the face determination device 2 is provided in the wakefulness estimation device 1 and the face determination device 2a is provided in the wakefulness estimation device 1a, but this is merely an example.
  • the face determination device 2 may be provided outside the wakefulness estimation device 1 and connected to it.
  • likewise, the face determination device 2a may be provided outside the wakefulness estimation device 1a and connected to it.
  • alternatively, some of the components included in the face determination device 2 may be provided in the wakefulness estimation device 1, and some of the components included in the face determination device 2a may be provided in the wakefulness estimation device 1a.
  • for example, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, or the reference distance calculation unit 24 may be provided in the wakefulness estimation device 1, and the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, or the personal authentication unit 27 may be provided in the wakefulness estimation device 1a.
  • in the description above, the person whose awakening level is estimated by the awakening level estimation devices 1 and 1a is the driver of the vehicle or an occupant of the vehicle including the driver, but this is only an example.
  • the awakening degree estimation devices 1 and 1a can estimate the awakening degree of any person, not only vehicle occupants.
  • for example, the wakefulness estimation devices 1 and 1a can estimate the wakefulness of an occupant of a moving object other than a vehicle, or of a person in a living room.
  • the imaging device 3 is also not limited to one mounted on the vehicle.
  • the imaging device 3 may be installed at a position capable of imaging the range in which the face of a person whose arousal level is to be estimated by the arousal level estimating devices 1 and 1a should exist.
  • as described above, the arousal level estimation devices 1 and 1a acquire face determination information indicating whether or not a person has a face with wrinkles between the eyebrows (a patient face), detect the brightness in a captured image of the range in which the person's face should exist, and estimate the person's degree of arousal based on the acquired face determination information and the detected brightness.
  • in this way, when a person is in a state of wrinkling the eyebrows, the arousal level estimation device of the present disclosure estimates the person's degree of arousal by considering whether the surrounding situation is one in which the person would be assumed to feel dazzled, and can therefore prevent erroneous estimation of a decrease in the person's arousal level.
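As a rough illustration of the per-driver reference distance storage described in the list above, the following Python sketch keeps the distance between reference feature points keyed by a driver identifier. All names are hypothetical assumptions of the sketch; the actual device associates the personal authentication result with the reference distance information in the storage unit 26.

    # Minimal sketch, assuming personal authentication yields a driver identifier.
    reference_distance_store: dict[str, float] = {}  # driver ID -> reference distance

    def save_reference_distance(driver_id: str, reference_distance: float) -> None:
        """Associate the calculated reference distance with the identified driver."""
        reference_distance_store[driver_id] = reference_distance

    def lookup_reference_distance(driver_id: str) -> float | None:
        """Return the driver's stored reference distance, or None if it has not
        been calculated yet, in which case the device must wait for, or
        trigger, its calculation."""
        return reference_distance_store.get(driver_id)

Because the store is not cleared at power-off, the determination for an authenticated driver can start immediately with the reference calculated during previous driving.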

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention comprises: a facial determination information acquisition unit (11) that acquires facial determination information indicating whether a person is making a face with a furrowed brow; a luminance detection unit (12) that detects the luminance in a captured image capturing a range in which the face of the person should appear; and an estimation unit (13) that estimates the level of wakefulness of the person on the basis of the facial determination information acquired by the facial determination information acquisition unit (11) and the luminance detected by the luminance detection unit (12).

Description

Arousal level estimation device and arousal level estimation method
The present disclosure relates to an arousal level estimation device and an arousal level estimation method.
When people become drowsy and their arousal level decreases, they may wrinkle their brows. Conventionally, there is known a technique that detects a state in which a person wrinkles the brows from the fact that the distance between feature points representing the inner corners of both eyes and the inner ends of both eyebrows, detected based on an image of the person's face, has become equal to or less than a predetermined threshold, and that estimates the person's drowsiness state from the frequency with which that state is detected (for example, Patent Document 1).
Japanese Patent Application Laid-Open No. 2008-212298
A person does not make a face with wrinkled brows only when drowsy with a decreased arousal level. People may also wrinkle their brows when they feel dazzled. Therefore, it is difficult to distinguish, only from the expression of wrinkled brows, whether the person's arousal level has decreased or the person feels dazzled.
The conventional technology represented by the technique disclosed in Patent Document 1 does not take this difficulty into account, and may therefore erroneously estimate that a person's arousal level has decreased.
The present disclosure has been made to solve the above problem, and an object thereof is to provide an arousal level estimation device that prevents erroneous estimation of a decrease in a person's arousal level by estimating the person's degree of arousal in consideration of whether, when the person is wrinkling the brows, the surrounding situation is one in which the person would be assumed to feel dazzled.
The arousal level estimation device according to the present disclosure includes a face determination information acquisition unit that acquires face determination information indicating whether or not a person is making a face with wrinkled brows, a luminance detection unit that detects the luminance in a captured image of the range in which the person's face should exist, and an estimation unit that estimates the person's degree of arousal based on the face determination information acquired by the face determination information acquisition unit and the luminance detected by the luminance detection unit.
According to the present disclosure, by estimating a person's degree of arousal in consideration of whether, when the person is wrinkling the brows, the surrounding situation is one in which the person would be assumed to feel dazzled, it is possible to prevent erroneous estimation of a decrease in the person's arousal level.
FIG. 1 is a diagram showing a configuration example of the arousal level estimation device according to Embodiment 1.
FIG. 2 is a flowchart for explaining the operation of the arousal level estimation device according to Embodiment 1.
FIG. 3 is a flowchart for explaining the details of the patient face determination process by the face determination unit in step ST5 of FIG. 2.
FIG. 4 is a flowchart for explaining the details of the arousal level estimation process by the estimation unit in step ST8 of FIG. 2.
FIG. 5 is a flowchart for explaining the operation of calculating the distance between reference feature points by the reference distance calculation unit in Embodiment 1.
FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the arousal level estimation device according to Embodiment 1.
FIG. 7 is a diagram showing a configuration example of the arousal level estimation device provided with a personal authentication function in Embodiment 1.
FIG. 8 is a flowchart for explaining the operation of the arousal level estimation device provided with a personal authentication function in Embodiment 1.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings.
Embodiment 1.
The arousal level estimation device according to Embodiment 1 estimates a person's degree of arousal based on a captured image in which at least the person's face is captured. In particular, when a person makes a face with wrinkled brows, the arousal level estimation device distinguishes whether the face was made because the person's arousal level decreased or because the person felt dazzled, and estimates the person's degree of arousal accordingly.
In Embodiment 1, a face with wrinkles between the eyebrows is referred to as a "patient face". A "patient face" is a so-called frowning face with wrinkled brows and narrowed eyes.
In Embodiment 1, as an example, the person whose degree of arousal is estimated by the arousal level estimation device is the driver of a vehicle.
The estimation result of whether the driver's arousal level has decreased, in other words, whether the driver is drowsy, is used, for example, to detect the driver dozing off.
FIG. 1 is a diagram showing a configuration example of the arousal level estimation device 1 according to Embodiment 1.
The arousal level estimation device 1 includes a face determination device 2, a face determination information acquisition unit 11, a luminance detection unit 12, and an estimation unit 13.
The arousal level estimation device 1 is connected to an imaging device 3. The imaging device 3 is mounted on a vehicle (not shown) and installed so as to be capable of imaging at least the range in which the driver's face should exist. For example, the imaging device 3 may be shared with a so-called DMS (Driver Monitoring System) installed for the purpose of monitoring the vehicle interior. The imaging device 3 is a visible light camera or an infrared camera. The imaging device 3 outputs the captured image to the face determination device 2 included in the arousal level estimation device 1.
The face determination device 2 acquires the captured image and determines, based on the acquired captured image, whether or not the driver is making a patient face. The face determination device 2 outputs information indicating whether or not the driver is making a patient face (hereinafter referred to as "face determination information") to the face determination information acquisition unit 11. A detailed configuration example of the face determination device 2 will be described later.
The face determination information acquisition unit 11 acquires the face determination information from the face determination device 2.
The face determination information acquisition unit 11 outputs the acquired face determination information to the estimation unit 13.
The luminance detection unit 12 detects the luminance in the captured image captured by the imaging device 3. The luminance detection unit 12 may acquire the captured image via the face determination device 2.
In Embodiment 1, the luminance in the captured image is, for example, the average of the luminance values of the pixels in the captured image. This is merely an example; the luminance detection unit 12 may instead use, for example, the median of the luminance values of the pixels in the captured image as the luminance. It is sufficient that the luminance detection unit 12 detects the luminance of at least an area of the captured image that includes the area in which the driver's face exists (hereinafter referred to as the "face area"). The luminance detection unit 12 may identify the driver's face area from the captured image using a known image processing technique, for example. The luminance detection unit 12 may also acquire information on the feature points indicating the driver's face (hereinafter referred to as "feature point information") from a feature point detection unit 22 (described later) of the face determination device 2.
The luminance detection unit 12 outputs information about the detected luminance in the captured image (hereinafter referred to as "luminance information") to the estimation unit 13. The luminance information is information in which the captured image is associated with the detected luminance.
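As a concrete illustration of the luminance detection described above, the following Python sketch computes the average pixel luminance. It assumes the captured image is available as a two-dimensional NumPy array of grayscale pixel values; this representation, and the function names, are assumptions of the sketch rather than something the embodiment specifies.

    import numpy as np

    def detect_luminance(image: np.ndarray) -> float:
        """Average pixel luminance of the captured image.
        The median (np.median) could be used instead, as noted above."""
        return float(np.mean(image))

    def detect_face_area_luminance(image: np.ndarray,
                                   top: int, bottom: int,
                                   left: int, right: int) -> float:
        """Average luminance restricted to a face area given as pixel bounds.
        How the face area is found (image processing or feature points) is
        outside this sketch."""
        return float(np.mean(image[top:bottom, left:right]))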
The estimation unit 13 estimates the driver's degree of arousal based on the face determination information acquired by the face determination information acquisition unit 11 and the luminance in the captured image detected by the luminance detection unit 12. The degree of arousal estimated by the estimation unit 13 may be expressed, for example, as whether the arousal level is high or low, or in stages such as five levels.
Specifically, the estimation unit 13 first determines whether or not the face determination information indicates that the driver is making a patient face, in other words, whether or not the face determination device 2 has determined that the driver is making a patient face.
If the face determination information indicates that the driver is making a patient face, the estimation unit 13 determines whether the luminance in the captured image is less than a preset threshold (hereinafter referred to as the "luminance determination threshold").
If the luminance in the captured image is less than the luminance determination threshold, the estimation unit 13 estimates that the driver's arousal level has decreased, in other words, that the driver is drowsy.
If the luminance in the captured image is equal to or greater than the luminance determination threshold, the estimation unit 13 estimates that the driver's arousal level has not decreased. The state in which the driver's arousal level has not decreased includes a state in which the driver's degree of arousal has increased and a state in which the driver's degree of arousal is maintained without change.
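The estimation rule described so far can be summarized by the following minimal Python sketch. The threshold value is purely illustrative; in the device, the luminance determination threshold is preset as described below.

    LUMINANCE_DETERMINATION_THRESHOLD = 128.0  # illustrative value only

    def estimate_arousal_decreased(is_patient_face: bool, luminance: float) -> bool:
        """Return True if the driver's arousal level is estimated to have decreased.
        A patient face with dim surroundings is attributed to drowsiness;
        a patient face with bright surroundings is attributed to glare.
        The non-patient-face case, handled by a separate known technique such
        as the degree of eye opening, is out of scope of this sketch."""
        if not is_patient_face:
            return False  # placeholder: defer to another estimation technique
        return luminance < LUMINANCE_DETERMINATION_THRESHOLD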
Two situations are assumed in which the driver makes a patient face: a situation in which the arousal level has decreased, and a situation in which the surroundings are bright and the driver feels dazzled. For example, when sunlight shines into the driver's surroundings, here the vehicle interior, it is assumed that the driver feels dazzled. When feeling dazzled, the driver may make a patient face even without being drowsy. That is, when the driver makes a patient face in a situation where the surroundings are assumed to feel dazzling, it can be estimated that the driver made the patient face because of the glare, not because of drowsiness. In contrast, when the driver makes a patient face in a situation where the surroundings are not bright enough to feel dazzling, it can be estimated that the driver made the patient face because of drowsiness.
Therefore, if the face determination information indicates that the driver is making a patient face and the luminance in the captured image is less than the luminance determination threshold, the estimation unit 13 estimates that the patient face was made because the driver's arousal level decreased, that is, that the driver's arousal level has decreased. If the luminance in the captured image is less than the luminance determination threshold, it can be determined that the driver's surroundings are not bright enough to make the driver feel dazzled. The luminance determination threshold is set based on the luminance of captured images taken by an administrator or the like of the imaging range under brightness at which the driver is assumed to feel dazzled. Here, the imaging range is the vehicle interior including at least the range in which the driver's face should exist.
On the other hand, if the face determination information indicates that the driver is making a patient face and the luminance in the captured image is equal to or greater than the luminance determination threshold, the estimation unit 13 estimates that the patient face was not made because the driver's arousal level decreased, that is, that the driver's arousal level has not decreased. In other words, it estimates that the patient face was made because the driver felt dazzled. If the luminance in the captured image is equal to or greater than the luminance determination threshold, it can be determined that the driver's surroundings are bright enough to make the driver feel dazzled.
In this way, when it is determined that the driver is making a patient face, the estimation unit 13 determines the brightness around the driver, thereby distinguishing whether the patient face is due to a decrease in the driver's arousal level or due to the driver feeling dazzled rather than a decrease in arousal level.
However, even if the driver's surroundings are bright enough to make the driver feel dazzled, when the area around the driver's face is locally dark, for example, because a shadow falls on the driver's face, the driver may not feel dazzled. In this case, the driver does not make a patient face due to glare. That is, if the driver makes a patient face in a situation where the area around the driver's face is locally dark, it can be estimated that the face is due not to feeling dazzled but to a decreased arousal level.
Therefore, even if the face determination information indicates that the driver is making a patient face and the luminance in the captured image is equal to or greater than the luminance determination threshold, the estimation unit 13 may estimate that the driver's arousal level has decreased when it determines, based on the luminance distribution in the captured image, that there is a shadow around the driver's face. The luminance information output from the luminance detection unit 12 to the estimation unit 13 includes the captured image, and the estimation unit 13 may determine the luminance distribution in the captured image from the luminance value of each pixel. The estimation unit 13 may also identify the driver's face area in the captured image using a known image processing technique. The face area is, for example, the minimum rectangular area that includes all the feature points indicating the driver's facial contour. Then, for example, if the average luminance value of the pixels included in the face area is equal to or less than a preset threshold (hereinafter referred to as the "shadow determination threshold"), the estimation unit 13 determines that there is a shadow around the driver's face.
The estimation unit 13 may instead determine whether or not there is a shadow around the driver's face by comparing the average luminance value of the pixels included in the area where the driver's eyes exist (hereinafter referred to as the "eye area") with the shadow determination threshold. The eye area is, for example, the minimum rectangular area that includes all the feature points indicating the driver's eyes, such as the inner and outer corners of the eyes.
Conversely, even if the driver's surroundings are not bright enough to make the driver feel dazzled, the area around the driver's face may be locally bright, for example, because no shadow falls on the driver's face or light strikes only the driver's face, and the driver may feel dazzled. In this case, the driver makes a patient face due to glare. That is, if the driver makes a patient face in a situation where the area around the driver's face is locally bright, it can be estimated that the face is due not to a decreased arousal level but to feeling dazzled.
Therefore, even if the face determination information indicates that the driver is making a patient face and the luminance in the captured image is less than the luminance determination threshold, the estimation unit 13 may estimate that the driver's arousal level has not decreased when it determines, based on the luminance distribution in the captured image, that the area around the driver's face is bright. For example, the estimation unit 13 determines that the area around the driver's face is bright if the average luminance value of the pixels included in the driver's face area in the captured image is equal to or greater than a preset threshold (hereinafter referred to as the "brightness determination threshold").
The estimation unit 13 may instead determine whether or not the area around the driver's face is bright by comparing the average luminance value of the pixels included in the driver's eye area with the brightness determination threshold.
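Putting the two refinements together, a hedged sketch of an estimation that takes the local brightness around the face into account might look as follows. The threshold values, and taking a mean over a face-area array, are assumptions of the sketch.

    import numpy as np

    SHADOW_DETERMINATION_THRESHOLD = 60.0       # assumed value
    BRIGHTNESS_DETERMINATION_THRESHOLD = 200.0  # assumed value
    LUMINANCE_DETERMINATION_THRESHOLD = 128.0   # assumed value

    def estimate_with_local_brightness(is_patient_face: bool,
                                       overall_luminance: float,
                                       face_area: np.ndarray) -> bool:
        """Return True if the arousal level is estimated to have decreased,
        taking the luminance distribution around the face into account."""
        if not is_patient_face:
            return False  # defer to another technique, as in the base sketch

        face_luminance = float(np.mean(face_area))
        if overall_luminance >= LUMINANCE_DETERMINATION_THRESHOLD:
            # Surroundings look dazzling, but a shadowed face means the driver
            # may not feel dazzled: treat the patient face as drowsiness.
            return face_luminance <= SHADOW_DETERMINATION_THRESHOLD
        # Surroundings look dim, but a locally bright face suggests glare:
        # treat the patient face as glare, not drowsiness.
        return face_luminance < BRIGHTNESS_DETERMINATION_THRESHOLD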
If, as a result of determining whether the face determination information indicates that the driver is making a patient face, the estimation unit 13 determines that the face determination information indicates that the driver is not making a patient face, in other words, if the face determination device 2 has determined that the driver is not making a patient face, the estimation unit 13 estimates the driver's degree of arousal using a known image-based arousal level estimation technique. For example, the estimation unit 13 estimates that the driver's arousal level has decreased when the driver's degree of eye opening, detected based on the captured image, is equal to or less than a preset threshold. When the driver's degree of eye opening is greater than the preset threshold, the estimation unit 13 estimates that the driver's arousal level has not decreased.
When the estimation unit 13 estimates that the driver's arousal level has decreased, it outputs information for alerting the driver (hereinafter referred to as "warning information"). For example, the estimation unit 13 outputs warning information that causes an output device (not shown) to output a warning sound. The output device is assumed to be, for example, a speaker mounted on the vehicle. When the warning information is output from the estimation unit 13, the output device outputs the warning sound.
The estimation unit 13 may store the estimation result of the driver's degree of arousal in the storage unit 26 in association with the date and time of the estimation. In this case, the estimation unit 13 can determine trends in the driver's degree of arousal, for example, how much the driver's degree of arousal has changed, based on past estimation results.
A configuration example of the face determination device 2 will be described.
As shown in FIG. 1, the face determination device 2 includes an image acquisition unit 21, a feature point detection unit 22, a distance calculation unit 23, a reference distance calculation unit 24, a face determination unit 25, and a storage unit 26. The face determination unit 25 includes a recalculation instruction unit 251.
The image acquisition unit 21 acquires the captured image from the imaging device 3.
The image acquisition unit 21 outputs the acquired captured image to the feature point detection unit 22 and the luminance detection unit 12.
The feature point detection unit 22 detects a plurality of feature points of the driver's face in the captured image acquired by the image acquisition unit 21.
The plurality of feature points detected by the feature point detection unit 22 include at least feature points indicating the inner ends of both of the driver's eyebrows. In Embodiment 1, as an example, the plurality of feature points detected by the feature point detection unit 22 are the feature points indicating the inner ends of both of the driver's eyebrows.
The feature point detection unit 22 outputs feature point information regarding the detected feature points, specifically, the feature points indicating the inner ends of the driver's eyebrows, to the distance calculation unit 23.
The feature point detection unit 22 may also detect feature points indicating the upper and lower eyelids of both of the driver's eyes and output feature point information indicating those feature points to the estimation unit 13. In this case, the estimation unit 13 can, for example, calculate the driver's degree of eye opening based on the feature point information output from the feature point detection unit 22 and estimate the driver's degree of arousal from the calculated degree of eye opening.
The distance calculation unit 23 converts the coordinates of the plurality of feature points detected by the feature point detection unit 22 in the captured image into three-dimensional coordinates. The distance calculation unit 23 then calculates the distance between the plurality of feature points (hereinafter referred to as the "distance between feature points") from the converted three-dimensional coordinates.
Specifically, here, the distance calculation unit 23 converts the coordinates of the feature points indicating the inner ends of the driver's eyebrows in the captured image into three-dimensional coordinates, and calculates the distance between the inner ends of the driver's eyebrows (hereinafter referred to as the "distance between the eyebrows") from the converted three-dimensional coordinates as the distance between feature points.
The distance calculation unit 23 may convert the coordinates of the feature points in the captured image into three-dimensional coordinates using a known technique for converting two-dimensional coordinates into three-dimensional coordinates, such as an AAM (Active Appearance Model).
The distance calculation unit 23 outputs information about the calculated distance between feature points (hereinafter referred to as "distance information") to the face determination unit 25.
The distance calculation unit 23 also stores the distance information in the storage unit 26 in chronological order, in association with the time at which the distance between feature points was calculated.
The distance information stored in the storage unit 26 is deleted when the power of the vehicle is turned off.
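Once the feature point coordinates have been converted to three-dimensional coordinates, the distance between the eyebrows reduces to a Euclidean distance. The following sketch assumes NumPy arrays for the converted coordinates; the conversion step itself (for example, an AAM-based method) is outside the sketch.

    import numpy as np

    def distance_between_eyebrows(left_brow_3d: np.ndarray,
                                  right_brow_3d: np.ndarray) -> float:
        """Euclidean distance between the 3-D inner-eyebrow feature points."""
        return float(np.linalg.norm(left_brow_3d - right_brow_3d))

    # Example usage with made-up coordinates (units are arbitrary here):
    # d = distance_between_eyebrows(np.array([-1.0, 0.0, 10.0]),
    #                               np.array([1.0, 0.0, 10.0]))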
The reference distance calculation unit 24 refers to the distance information stored in the storage unit 26 and calculates a distance between reference feature points based on the history of the distances between feature points calculated by the distance calculation unit 23 over a preset period (hereinafter referred to as the "reference calculation target period"). The distance between reference feature points is the distance between feature points that serves as the reference when the face determination device 2 determines whether or not the driver is making a patient face, and is the distance regarded as the distance between feature points when the driver is not making a patient face. The patient face determination is performed by the face determination unit 25, whose details will be described later.
For example, the reference distance calculation unit 24 uses the mode of the distances between feature points over the reference calculation target period as the distance between reference feature points. This is merely an example; the reference distance calculation unit 24 may instead use, for example, the average or the median of the distances between feature points over the reference calculation target period.
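A minimal sketch of deriving the distance between reference feature points from the stored history is shown below. Real-valued distances rarely repeat exactly, so the rounding step that makes a mode meaningful is an assumption of the sketch; statistics.mean or statistics.median could be substituted as noted above.

    from collections import Counter

    def reference_distance_from_history(distances: list[float],
                                        precision: int = 1) -> float:
        """Mode of the distances after rounding to the given precision.
        Assumes a non-empty history for the reference calculation target
        period."""
        rounded = [round(d, precision) for d in distances]
        return Counter(rounded).most_common(1)[0][0]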
The reference distance calculation unit 24 calculates the distance between reference feature points at predetermined timings (hereinafter referred to as the "reference calculation timing"). Specifically, the reference distance calculation unit 24 calculates the distance between reference feature points from the distance information for the reference calculation target period stored in the storage unit 26 from the point at which it determines that a reference calculation timing has arrived.
In Embodiment 1, the reference calculation timings are when the power of the vehicle is turned on and when the recalculation instruction unit 251 outputs an instruction to recalculate the distance between reference feature points. Details of the recalculation instruction unit 251 will be described later.
If the distance information for the reference calculation target period has not yet been stored in the storage unit 26 after a reference calculation timing arrives, the reference distance calculation unit 24 waits until the distance information for the reference calculation target period has been stored.
The reference distance calculation unit 24 stores information about the calculated distance between reference feature points (hereinafter referred to as "reference distance information") in the storage unit 26.
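The timing behavior described above can be sketched as follows: at each reference calculation timing the history is restarted, and a reference is produced only once a full reference calculation target period of distance information has accumulated. Using a sample count to stand in for the period, and the class shape itself, are assumptions; the sketch reuses reference_distance_from_history from the previous example.

    class ReferenceDistanceCalculator:
        def __init__(self, required_samples: int):
            self.required_samples = required_samples  # stands in for the period
            self.history: list[float] = []

        def on_reference_calculation_timing(self) -> None:
            """Vehicle power-on or a recalculation instruction: collect afresh."""
            self.history.clear()

        def add_distance(self, distance: float) -> float | None:
            """Store a new distance; return a reference once enough samples
            exist, otherwise None (i.e., keep waiting)."""
            self.history.append(distance)
            if len(self.history) < self.required_samples:
                return None
            return reference_distance_from_history(self.history)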
The face determination unit 25 determines whether or not the driver is making a patient face by comparing the distance between feature points calculated by the distance calculation unit 23 with the distance between reference feature points. The face determination unit 25 can identify the distance between reference feature points from the reference distance information stored in the storage unit 26.
Specifically, the face determination unit 25 determines that the driver is making a patient face when the distance between feature points, here the driver's distance between the eyebrows, is smaller than the distance between reference feature points by a preset threshold (hereinafter referred to as the "first threshold") or more, in other words, when "distance between reference feature points - distance between feature points ≥ first threshold" holds.
In general, when a person makes a face with wrinkles between the eyebrows, the distance between the eyebrows becomes shorter than when the person is not wrinkling the brows. Therefore, when the driver's distance between the eyebrows is smaller than the distance between reference feature points by the first threshold or more, the face determination unit 25 determines that the driver is making a face with wrinkled brows, that is, a patient face.
If the distance between feature points is not smaller than the distance between reference feature points by the first threshold or more, in other words, if "distance between reference feature points - distance between feature points < first threshold" holds, the face determination unit 25 determines that the driver is not making a patient face.
When the face determination unit 25 determines that the driver is not making a patient face, it further determines whether the distance between feature points is greater than the distance between reference feature points by a preset threshold (hereinafter referred to as the "second threshold") or more, in other words, whether "distance between feature points - distance between reference feature points ≥ second threshold" holds.
When the distance between feature points is greater than the distance between reference feature points by the second threshold or more, the face determination unit 25 notifies the recalculation instruction unit 251 to that effect. Upon receiving the notification, the recalculation instruction unit 251 instructs the reference distance calculation unit 24 to recalculate the distance between reference feature points.
When the distance between feature points is greater than the distance between reference feature points by the second threshold or more, the distance between reference feature points is assumed to have been calculated based on distances between feature points obtained while the driver was making a patient face. As described above, the distance between reference feature points is the distance regarded as the distance between feature points (distance between the eyebrows) when the driver is not making a patient face, so a distance between reference feature points calculated from distances obtained while the driver was making a patient face is not suitable as the reference. Therefore, when the face determination unit 25 determines that the distance between feature points is greater than the distance between reference feature points by the second threshold or more, the recalculation instruction unit 251 causes the reference distance calculation unit 24 to recalculate the distance between reference feature points.
The face determination unit 25 outputs, to the face determination information acquisition unit 11, face determination information indicating whether or not it has determined that the driver is making a patient face. Specifically, when it determines that the driver is making a patient face, the face determination unit 25 outputs face determination information indicating that the driver is making a patient face to the face determination information acquisition unit 11. When it determines that the driver is not making a patient face, the face determination unit 25 outputs face determination information indicating that the driver is not making a patient face.
However, when it was necessary to issue an instruction to recalculate the reference distance between feature points, that is, when it determined that the distance between feature points is larger than the reference distance between feature points by the second threshold or more, the face determination unit 25 does not output face determination information. This is because the determination that the driver is not making a patient face may be erroneous.
The storage unit 26 stores the distance information and the reference distance information.
Although the storage unit 26 is provided in the face determination device 2 in Embodiment 1, this is merely an example. The storage unit 26 may be provided outside the face determination device 2, at a location that the face determination device 2 can refer to.
The operation of the wakefulness estimation device 1 according to Embodiment 1 will now be described.
FIG. 2 is a flowchart for explaining the operation of the wakefulness estimation device 1 according to Embodiment 1. In FIG. 2, the processing of steps ST1 to ST5 is performed by the face determination device 2 included in the wakefulness estimation device 1.
The image acquisition unit 21 acquires a captured image from the imaging device 3 (step ST1).
The image acquisition unit 21 outputs the acquired captured image to the feature point detection unit 22 and the brightness detection unit 12.
The feature point detection unit 22 detects a plurality of feature points of the driver's face on the captured image based on the captured image acquired by the image acquisition unit 21 in step ST1 (step ST2). In Embodiment 1, the feature point detection unit 22 detects feature points indicating the inner corners of both of the driver's eyebrows.
The feature point detection unit 22 outputs the feature point information to the distance calculation unit 23.
The distance calculation unit 23 converts the coordinates, on the captured image, of the plurality of feature points detected by the feature point detection unit 22 in step ST2 into three-dimensional coordinates (step ST3). The distance calculation unit 23 then calculates the distance between the plurality of feature points from the converted three-dimensional coordinates (step ST4). In Embodiment 1, the distance calculation unit 23 converts the coordinates of the feature points indicating the inner corners of the driver's eyebrows on the captured image into three-dimensional coordinates, and calculates the distance between the driver's eyebrows from the converted three-dimensional coordinates as the distance between feature points.
The distance calculation unit 23 outputs the distance information to the face determination unit 25.
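As an illustration only, and not part of the disclosed configuration, the distance calculation of steps ST3 and ST4 might be sketched as follows; the pinhole back-projection used to lift image coordinates to three-dimensional coordinates is a placeholder assumption, since the patent does not fix a particular conversion method, and the parameter names (fx, fy, cx, cy, depth) are introduced here only for the sketch.

    import math

    def to_3d(point_2d, depth, fx, fy, cx, cy):
        # Assumed pinhole-camera back-projection: lifts an image
        # coordinate (u, v) with a known depth to camera coordinates.
        u, v = point_2d
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        return (x, y, depth)

    def feature_point_distance(p1_3d, p2_3d):
        # Euclidean distance between two 3-D feature points, e.g. the
        # inner corners of both eyebrows (the inter-eyebrow distance).
        return math.dist(p1_3d, p2_3d)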
The face determination unit 25 performs a patient face determination process that determines whether or not the driver is making a patient face by comparing the distance between feature points calculated by the distance calculation unit 23 in step ST4 with the reference distance between feature points (step ST5).
The face determination information acquisition unit 11 acquires the face determination information from the face determination device 2 (step ST6). Specifically, the face determination information acquisition unit 11 acquires the face determination information output by the face determination unit 25 of the face determination device 2 as a result of the patient face determination process in step ST5.
The face determination information acquisition unit 11 outputs the acquired face determination information to the estimation unit 13.
The brightness detection unit 12 detects the brightness of the captured image captured by the imaging device 3 (step ST7). Specifically, the brightness detection unit 12 acquires the captured image acquired by the image acquisition unit 21 of the face determination device 2 in step ST1 and detects the brightness of that captured image.
The brightness detection unit 12 outputs the brightness information to the estimation unit 13.
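A minimal sketch of one way the brightness detection of step ST7 could be realized, assuming the captured image is available as a grayscale two-dimensional array; the patent does not prescribe a specific brightness measure, so the mean pixel intensity used here is an assumption made for the sketch.

    def detect_brightness(gray_image):
        # gray_image: 2-D sequence of pixel intensities (0-255).
        # Returns the mean intensity as a simple overall brightness value.
        total = sum(sum(row) for row in gray_image)
        count = sum(len(row) for row in gray_image)
        return total / count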
The estimation unit 13 performs a wakefulness estimation process that estimates the driver's degree of wakefulness based on the face determination information acquired by the face determination information acquisition unit 11 in step ST6 and the brightness of the captured image detected by the brightness detection unit 12 in step ST7 (step ST8).
In the flowchart of FIG. 2, the processing is performed in the order of step ST6 and then step ST7, but the order of steps ST6 and ST7 is not limited to this. The processing may be performed in the order of step ST7 and then step ST6, or the processing of step ST6 and the processing of step ST7 may be performed in parallel. The processing of step ST7 may be performed at any timing after step ST1 and before the processing of step ST8.
FIG. 3 is a flowchart for explaining the details of the patient face determination process performed by the face determination unit 25 in step ST5 of FIG. 2.
The face determination unit 25 determines whether or not the distance between feature points, here the distance between the driver's eyebrows, is smaller than the reference distance between feature points by the first threshold or more, in other words, whether or not "reference distance between feature points - distance between feature points ≥ first threshold" holds (step ST21).
If the distance between feature points is smaller than the reference distance between feature points by the first threshold or more ("YES" in step ST21), the face determination unit 25 determines that the driver is making a patient face (step ST22).
If the distance between feature points is not smaller than the reference distance between feature points by the first threshold or more, in other words, if "reference distance between feature points - distance between feature points < first threshold" ("NO" in step ST21), the face determination unit 25 determines that the driver is not making a patient face (step ST24).
If it determines in step ST24 that the driver is not making a patient face, the face determination unit 25 determines whether or not the distance between feature points is larger than the reference distance between feature points by the second threshold or more, in other words, whether or not "distance between feature points - reference distance between feature points ≥ second threshold" holds (step ST25).
If the distance between feature points is larger than the reference distance between feature points by the second threshold or more ("YES" in step ST25), the face determination unit 25 notifies the recalculation instruction unit 251 to that effect. The recalculation instruction unit 251 then instructs the reference distance calculation unit 24 to recalculate the reference distance between feature points (step ST26).
If the distance between feature points is not larger than the reference distance between feature points by the second threshold or more, in other words, if "distance between feature points - reference distance between feature points < second threshold" ("NO" in step ST25), and also when it determines in step ST22 that the driver is making a patient face, the face determination unit 25 outputs the face determination information to the face determination information acquisition unit 11 (step ST23).
Note that when it was necessary to issue an instruction to recalculate the reference distance between feature points, that is, when it determined in step ST25 that the distance between feature points is larger than the reference distance between feature points by the second threshold or more, the face determination unit 25 does not output face determination information. This is because the determination that the driver is not making a patient face (see step ST24) may be erroneous.
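Purely as a sketch of the flow of FIG. 3 (steps ST21 to ST26), and not the patented implementation itself, the two-threshold comparison could be expressed as follows; the function name and the return convention are assumptions made for this sketch.

    def judge_patient_face(distance, reference, thr1, thr2):
        # Returns (face_determination, needs_recalculation):
        #   face_determination is True / False for "patient face" /
        #   "not a patient face", or None when no face determination
        #   information is to be output (recalculation case).
        if reference - distance >= thr1:    # step ST21 "YES"
            return True, False              # step ST22, then output (ST23)
        if distance - reference >= thr2:    # step ST25 "YES"
            return None, True               # step ST26: recalculate, no output
        return False, False                 # step ST24, then output (ST23)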
FIG. 4 is a flowchart for explaining the details of the wakefulness estimation process performed by the estimation unit 13 in step ST8 of FIG. 2.
The estimation unit 13 determines whether or not the face determination information acquired by the face determination information acquisition unit 11 in step ST6 of FIG. 2 is face determination information indicating that the driver is making a patient face, in other words, whether or not the face determination device 2 has determined that the driver is making a patient face (step ST31).
If the face determination information indicates that the driver is making a patient face ("YES" in step ST31), the estimation unit 13 determines whether or not the brightness of the captured image detected by the brightness detection unit 12 in step ST7 of FIG. 2 is less than the brightness determination threshold (step ST32).
If it determines in step ST32 that the brightness of the captured image is less than the brightness determination threshold ("YES" in step ST32), the estimation unit 13 determines whether or not the area around the driver's face is bright based on the brightness distribution of the captured image (step ST33).
If it determines in step ST33 that the area around the driver's face is bright ("YES" in step ST33), the estimation unit 13 estimates that the driver's wakefulness has not decreased (step ST35). The estimation unit 13 may store the estimation result of the driver's degree of wakefulness in the storage unit 26.
If it determines in step ST33 that the area around the driver's face is not bright ("NO" in step ST33), the estimation unit 13 proceeds to step ST34.
If it determines in step ST32 that the brightness of the captured image is not less than the brightness determination threshold, in other words, that the brightness of the captured image is equal to or greater than the brightness determination threshold ("NO" in step ST32), the estimation unit 13 determines whether or not there is a shadow around the driver's face based on the brightness distribution of the captured image (step ST36).
If it determines in step ST36 that there is a shadow around the driver's face ("YES" in step ST36), or if it determines in step ST33 that the area around the driver's face is not bright ("NO" in step ST33), the estimation unit 13 estimates that the driver's wakefulness has decreased, in other words, that the driver is drowsy (step ST34). The estimation unit 13 then outputs warning information. The estimation unit 13 may store the estimation result of the driver's degree of wakefulness in the storage unit 26.
If it determines in step ST36 that there is no shadow around the driver's face ("NO" in step ST36), the estimation unit 13 estimates that the driver's wakefulness has not decreased (step ST37). The estimation unit 13 may store the estimation result of the driver's degree of wakefulness in the storage unit 26.
On the other hand, if the face determination information indicates that the driver is not making a patient face ("NO" in step ST31), the estimation unit 13 estimates the driver's degree of wakefulness using a known image-based wakefulness estimation technique (step ST38). If, as a result of estimating the driver's degree of wakefulness, the estimation unit 13 estimates that the driver's wakefulness has decreased, it outputs warning information. The estimation unit 13 may store the estimation result of the driver's degree of wakefulness in the storage unit 26.
Note that in the flowchart shown in FIG. 4, the processing of steps ST33, ST35, and ST36 can be omitted. When the processing of step ST36 is omitted and it is determined in step ST32 that the brightness of the captured image is not less than the brightness determination threshold ("NO" in step ST32), the estimation unit 13 proceeds to the processing of step ST37.
FIG. 5 is a flowchart for explaining the operation by which the reference distance calculation unit 24 calculates the reference distance between feature points in Embodiment 1.
The operation shown in the flowchart of FIG. 5 is performed in parallel with the operation of the face determination device 2 described using the flowchart of FIG. 2.
The reference distance calculation unit 24 determines whether or not the reference calculation timing has come (step ST41). Specifically, the reference distance calculation unit 24 determines whether the power of the vehicle has been turned on or whether the recalculation instruction unit 251 has output an instruction to recalculate the reference distance between feature points (see step ST26 in FIG. 3).
If it determines that the reference calculation timing has not come ("NO" in step ST41), the reference distance calculation unit 24 waits until the reference calculation timing comes.
If it determines that the reference calculation timing has come ("YES" in step ST41), the reference distance calculation unit 24 refers to the distance information stored in the storage unit 26 to acquire the distances between feature points for the reference calculation target period (step ST42), and calculates the reference distance between feature points based on the acquired distances between feature points for that period (step ST43). Note that the reference distance calculation unit 24 acquires the distances between feature points from the distance information for the reference calculation target period stored in the storage unit 26 from the point in time when it determined that the reference calculation timing had come. If the storage unit 26 has not yet stored distance information covering the reference calculation target period from that point in time, the reference distance calculation unit 24 waits until distance information for the reference calculation target period has been stored and then calculates the reference distance between feature points.
The reference distance calculation unit 24 stores the reference distance information in the storage unit 26 (step ST44).
Based on the reference distance information calculated by the reference distance calculation unit 24 and stored in the storage unit 26 in this way, the face determination unit 25 performs the patient face determination process (see step ST5 in FIG. 2).
Here, after the reference calculation timing has come, the face determination unit 25 does not perform the patient face determination process until the reference distance calculation unit 24 completes the calculation of the reference distance between feature points and the reference distance information is stored in the storage unit 26. For example, when the reference distance calculation unit 24 determines that it is the reference calculation timing ("YES" in step ST41 of FIG. 5), it sets a reference-calculation-in-progress flag to "1". This flag is set at a location that both the reference distance calculation unit 24 and the face determination unit 25 can refer to, and when the reference distance calculation unit 24 stores the reference distance information in the storage unit 26 (step ST44 in FIG. 5), the reference distance calculation unit 24 resets the flag to its initial value "0".
For example, while the reference-calculation-in-progress flag is set to "1", the face determination unit 25 outputs to the face determination information acquisition unit 11 information indicating that it is waiting (hereinafter referred to as a "waiting notification"). Upon acquiring the waiting notification, the face determination information acquisition unit 11 outputs it to the estimation unit 13. When the waiting notification is output from the face determination information acquisition unit 11, the estimation unit 13 estimates the driver's degree of wakefulness in the same way as when face determination information indicating that the driver is not making a patient face is output; that is, the estimation unit 13 estimates the driver's degree of wakefulness using a known image-based wakefulness estimation technique.
Alternatively, for example, while the reference-calculation-in-progress flag is set to "1", the face determination unit 25 may determine whether or not the driver is making a patient face using, as the reference distance between feature points, a prestored distance set on the assumption of a typical distance between the inner corners of both eyebrows when a person is not making a patient face (hereinafter referred to as the "alternative reference distance"). The alternative reference distance is set in advance by an administrator or the like and stored in the face determination unit 25.
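The behavior while the reference-calculation-in-progress flag is set could be sketched as follows; ALTERNATIVE_REFERENCE_DISTANCE is a hypothetical preset value standing in for the alternative reference distance described above, and the numeric value and units are assumptions made only for this sketch.

    ALTERNATIVE_REFERENCE_DISTANCE = 20.0  # assumed preset value (mm)

    def select_reference(calibrating, stored_reference, use_alternative):
        # While calibration is in progress, either signal "waiting" (so the
        # estimation falls back to the known image-based technique) or use
        # the preset alternative reference distance.
        if not calibrating:
            return stored_reference
        if use_alternative:
            return ALTERNATIVE_REFERENCE_DISTANCE
        return None  # interpreted as a waiting notification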
As described above, the wakefulness estimation device 1 acquires, from the face determination device 2, the face determination information indicating whether or not the driver is making a patient face, and detects the brightness of the captured image. The wakefulness estimation device 1 then estimates the driver's degree of wakefulness based on the acquired face determination information and the detected brightness.
When the driver is making a patient face, the wakefulness estimation device 1 estimates the driver's degree of wakefulness while considering whether the surrounding situation is one in which the driver can be assumed to feel dazzled, and can thereby prevent erroneous estimation that the driver's wakefulness has decreased.
In detail, when the acquired face determination information indicates that the driver is making a patient face and the detected brightness of the captured image is less than the brightness determination threshold, the wakefulness estimation device 1 estimates that the driver's wakefulness has decreased. From the face determination information and the brightness of the captured image, the wakefulness estimation device 1 can distinguish, when the driver makes a patient face, whether that face was made because the driver's wakefulness decreased or because the driver felt dazzled. As a result, the wakefulness estimation device 1 can prevent erroneous estimation that the driver's wakefulness has decreased.
Also, when the acquired face determination information indicates that the driver is making a patient face and the detected brightness of the captured image is equal to or greater than the brightness determination threshold, the wakefulness estimation device 1 estimates that the driver's wakefulness has not decreased; in other words, it estimates that the driver made a patient face because the driver felt dazzled. From the face determination information and the brightness of the captured image, the wakefulness estimation device 1 can distinguish whether the patient face was made because the driver's wakefulness decreased or because the driver felt dazzled, and can thereby prevent erroneous estimation that the driver's wakefulness has decreased.
Further, even when the acquired face determination information indicates that the driver is making a patient face and the detected brightness of the captured image is less than the brightness determination threshold, the wakefulness estimation device 1 may estimate that the driver's wakefulness has not decreased if it determines, based on the brightness distribution of the captured image, that the area around the driver's face is bright. This allows the wakefulness estimation device 1 to account for the case where, even though the surroundings are not bright enough for the driver to feel dazzled because of shadows or the like, the area around the driver's face is locally bright and the driver does feel dazzled, and to distinguish whether the driver's patient face was made because the driver's wakefulness decreased or because the driver felt dazzled. The wakefulness estimation device 1 can thus prevent erroneous estimation of a decrease in the driver's wakefulness with higher accuracy.
Further, even when the acquired face determination information indicates that the driver is making a patient face and the detected brightness of the captured image is equal to or greater than the brightness determination threshold, the wakefulness estimation device 1 may estimate that the driver's wakefulness has decreased if it determines, based on the brightness distribution of the captured image, that there is a shadow around the driver's face. This allows the wakefulness estimation device 1 to account for the case where, even though the surroundings are bright enough for the driver to feel dazzled, the area around the driver's face is locally in shadow and the driver does not feel dazzled, and to distinguish whether the driver's patient face was made because the driver's wakefulness decreased or because the driver felt dazzled. The wakefulness estimation device 1 can thus estimate a decrease in the driver's wakefulness with higher accuracy.
In addition, the wakefulness estimation device 1, more specifically the face determination device 2 of the wakefulness estimation device 1, converts the coordinates, on the captured image, of the plurality of feature points of the driver's face (the inner corners of both eyebrows) detected from the captured image into three-dimensional coordinates, and calculates the distance between the plurality of feature points (the distance between the eyebrows) from the converted three-dimensional coordinates. The wakefulness estimation device 1 then determines whether or not the driver is making a patient face by comparing the distance between feature points with the reference distance between feature points, and determines that the driver is making a patient face when the distance between feature points is smaller than the reference distance between feature points by the first threshold or more.
By converting the coordinates of the plurality of feature points on the captured image into three-dimensional coordinates before calculating the distance between them, the wakefulness estimation device 1 can keep the calculated distance between feature points from being affected by the forward-backward or left-right tilt of the driver's face. In other words, the wakefulness estimation device 1 can calculate the distance between feature points more accurately than when the distance is calculated without converting the coordinates of the feature points on the captured image into three-dimensional coordinates. As a result, the wakefulness estimation device 1 can accurately determine whether or not the driver is making a patient face.
The wakefulness estimation device 1 also calculates the reference distance between feature points based on the history of the calculated distances between feature points for the reference calculation target period, and compares the distance between feature points with the reference distance between feature points. The wakefulness estimation device 1 can thus set the reference distance between feature points to suit the driver whose degree of wakefulness is to be estimated.
Further, when the wakefulness estimation device 1 determines, as a result of comparing the distance between feature points with the reference distance between feature points, that the distance between feature points is larger than the reference distance between feature points by the second threshold or more, it recalculates the reference distance between feature points. This prevents the wakefulness estimation device 1 from determining whether or not the driver is making a patient face based on a reference distance that is not suitable to represent the distance between feature points when the driver is not making a patient face.
FIGS. 6A and 6B are diagrams showing examples of the hardware configuration of the wakefulness estimation device 1 according to Embodiment 1.
In Embodiment 1, the functions of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 are realized by a processing circuit 101. That is, the wakefulness estimation device 1 includes the processing circuit 101 for performing control to estimate the driver's degree of wakefulness based on the result of the face determination device 2 determining, from the captured image, whether or not the driver is making a patient face, and on the brightness of the captured image.
The processing circuit 101 may be dedicated hardware, as shown in FIG. 6A, or a processor 104 that executes a program stored in a memory 105, as shown in FIG. 6B.
When the processing circuit 101 is dedicated hardware, the processing circuit 101 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
When the processing circuit is the processor 104, the functions of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 are realized by software, firmware, or a combination of software and firmware. The software or firmware is written as a program and stored in the memory 105. The processor 104 reads out and executes the program stored in the memory 105, thereby executing the functions of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25. That is, the wakefulness estimation device 1 includes the memory 105 for storing a program that, when executed by the processor 104, results in the execution of steps ST1 to ST8 of FIG. 2 described above. It can also be said that the program stored in the memory 105 causes a computer to execute the procedures or methods of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25. Here, the memory 105 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read-Only Memory), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
Note that some of the functions of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 may be realized by dedicated hardware and some by software or firmware. For example, the function of the image acquisition unit 21 can be realized by the processing circuit 101 as dedicated hardware, while the functions of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, and the face determination unit 25 can be realized by the processor 104 reading out and executing the program stored in the memory 105.
The storage unit 26 is configured by, for example, a memory.
The wakefulness estimation device 1 also includes an input interface device 102 and an output interface device 103 that perform wired or wireless communication with devices such as the imaging device 3 or an output device (not shown).
In Embodiment 1 described above, the plurality of feature points detected by the feature point detection unit 22 are feature points indicating the inner corners of both of the driver's eyebrows, and the distance calculation unit 23 calculates the distance between the eyebrows as the distance between feature points based on the coordinates of those feature points on the captured image. The face determination unit 25 then compares the distance between the eyebrows calculated by the distance calculation unit 23 with the reference distance between feature points to determine whether or not the driver is making a patient face. However, this is merely an example.
For example, in Embodiment 1 described above, the plurality of feature points detected by the feature point detection unit 22 may be feature points indicating the inner corners of both of the driver's eyebrows and the inner corners of both of the driver's eyes, and the distance calculation unit 23 may calculate the distance between the eyebrows and the distance between the inner corners of the eyes, each as a distance between feature points, based on the coordinates of those feature points on the captured image. The face determination unit 25 compares the distance between the eyebrows and the distance between the inner corners of the eyes each with the corresponding reference distance between feature points to determine whether or not the driver is making a patient face. In this case, the reference distance calculation unit 24 calculates a reference distance between feature points corresponding to the distance between the eyebrows and a reference distance between feature points corresponding to the distance between the inner corners of the eyes.
Also, for example, in Embodiment 1 described above, the plurality of feature points detected by the feature point detection unit 22 may be feature points indicating the inner corners of both of the driver's eyes, and the distance calculation unit 23 may calculate the distance between the inner corners of the eyes as the distance between feature points based on the coordinates of those feature points on the captured image. The face determination unit 25 compares the distance between the inner corners of the eyes with the reference distance between feature points to determine whether or not the driver is making a patient face. In this case, the reference distance calculation unit 24 calculates the reference distance between feature points corresponding to the distance between the inner corners of the eyes.
However, feature points related to the eyes are more likely to go undetected than feature points related to the eyebrows. For example, when the driver is wearing sunglasses, the feature point detection unit 22 cannot detect feature points related to the driver's eyes. Therefore, in the wakefulness estimation devices 1 and 1a, it is preferable to determine whether or not the driver is making a patient face by comparing the distance between the eyebrows, calculated based on the feature points indicating the inner corners of both eyebrows, with the reference distance between feature points. By doing so, the wakefulness estimation devices 1 and 1a can reduce the possibility of falling into a situation in which that determination cannot be made.
In Embodiment 1 described above, the wakefulness estimation device may also include a personal authentication function. With a personal authentication function, when a given driver starts driving, the wakefulness estimation device can determine whether or not that driver is making a patient face using the reference distance between feature points already calculated during the previous drive, without waiting for the reference distance to be calculated again. In this case, the wakefulness estimation device does not have to delete the reference distance information stored in the storage unit when the power of the vehicle is turned off. This will be described in detail below.
FIG. 7 is a diagram showing a configuration example of a wakefulness estimation device 1a provided with a personal authentication function in Embodiment 1.
In FIG. 7, components similar to those of the wakefulness estimation device 1 described with reference to FIG. 1 are given the same reference numerals, and duplicate descriptions are omitted. Here, the wakefulness estimation device 1a is assumed to provide the personal authentication function in a face determination device 2a.
The face determination device 2a shown in FIG. 7 differs from the face determination device 2 described with reference to FIG. 1 in that it includes a personal authentication unit 27. The specific operations of the reference distance calculation unit 24 and the face determination unit 25 in the face determination device 2a shown in FIG. 7 also differ from those in the face determination device 2 shown in FIG. 1.
The personal authentication unit 27 performs personal authentication of the driver.
For example, the personal authentication unit 27 stores a face image of the driver in advance and performs personal authentication of the driver by pattern matching against the captured image acquired from the image acquisition unit 21. This is merely an example, and the personal authentication unit 27 may perform personal authentication of the driver using any known personal authentication technique.
The personal authentication unit 27 outputs the result of the personal authentication (hereinafter referred to as the "personal authentication result") to the reference distance calculation unit 24 and the face determination unit 25.
The reference distance calculation unit 24 calculates the reference distance between feature points for each individual based on the personal authentication result from the personal authentication unit 27. Specifically, the reference distance calculation unit 24 calculates the reference distance between feature points as that of the driver authenticated by the personal authentication unit 27.
The reference distance calculation unit 24 causes the storage unit 26 to store, as reference distance information, information in which the calculated reference distance between feature points is associated with information that can identify the individual driver. The information that can identify the individual driver is included in the personal authentication result.
The face determination unit 25 also determines whether or not the driver is making a patient face based on the personal authentication result from the personal authentication unit 27. Specifically, the face determination unit 25 matches the personal authentication result from the personal authentication unit 27 against the reference distance information stored in the storage unit 26 to identify the reference distance between feature points for that driver. The face determination unit 25 then determines whether or not the driver is making a patient face by comparing the distance between feature points calculated by the distance calculation unit 23 with the identified reference distance between feature points for that driver. When the face determination unit 25 determines that the distance between feature points is larger than the reference distance between feature points by the second threshold or more and notifies the recalculation instruction unit 251 to that effect, it also provides the information that can identify the individual driver. When the recalculation instruction unit 251 instructs the reference distance calculation unit 24 to recalculate the reference distance between feature points, it outputs the information that can identify the individual driver together with the instruction. After recalculating the reference distance between feature points for that driver, the reference distance calculation unit 24 updates the reference distance information corresponding to that driver stored in the storage unit 26.
In this way, because the reference distance calculation unit 24 stores in the storage unit 26 the reference distance information in which the information that can identify an individual is associated with the reference distance between feature points, when a given driver starts driving, the face determination unit 25 can determine whether or not that driver is making a patient face using a reference distance between feature points already calculated during the previous drive, if one exists, without waiting for a new reference distance to be calculated.
FIG. 8 is a flowchart for explaining the operation of the wakefulness estimation device 1a provided with the personal authentication function in Embodiment 1.
The specific operations of step ST51, steps ST53 to ST55, and steps ST57 to ST59 in FIG. 8 are the same as those of steps ST1 to ST4 and steps ST6 to ST8 in FIG. 2, respectively, so duplicate descriptions are omitted.
The personal authentication unit 27 performs personal authentication of the driver (step ST52).
The personal authentication unit 27 outputs the personal authentication result to the reference distance calculation unit 24 and the face determination unit 25.
The face determination unit 25 performs the patient face determination process to determine whether or not the driver is making a patient face based on the personal authentication result from the personal authentication unit 27 in step ST52 (step ST56).
In the patient face determination process, when reference distance information corresponding to the driver is stored in the storage unit 26, the face determination unit 25 acquires the reference distance between feature points for that driver based on the reference distance information and compares it with the distance between feature points.
Also, in the patient face determination process, when the recalculation instruction unit 251 instructs the reference distance calculation unit 24 to recalculate the reference distance between feature points, it outputs the information that can identify the individual driver together with the instruction.
When the wakefulness estimation device 1a has the configuration shown in FIG. 7, the reference distance calculation unit 24 performs the processing described with reference to FIG. 5 for each individual, based on the personal authentication result from the personal authentication unit 27 in step ST52 of FIG. 8, and calculates the reference distance between feature points for each individual.
The reference distance calculation unit 24 then causes the storage unit 26 to store, as reference distance information, information in which the calculated reference distance between feature points is associated with the information that can identify the individual driver.
An example of the hardware configuration of the wakefulness estimation device 1a having the configuration shown in FIG. 7 is as shown in FIGS. 6A and 6B.
The functions of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27 are realized by the processing circuit 101. That is, the wakefulness estimation device 1a includes the processing circuit 101 for performing control to estimate the driver's degree of wakefulness based on the result of the face determination device 2a determining, from the captured image, whether or not the driver is making a patient face, and on the brightness of the captured image, and also to perform personal authentication and store the personal authentication result in association with the reference distance information.
The processing circuit 101 reads out and executes the program stored in the memory 105, thereby executing the functions of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27. That is, the wakefulness estimation device 1a includes the memory 105 for storing a program that, when executed by the processing circuit 101, results in the execution of steps ST51 to ST59 of FIG. 8 described above. It can also be said that the program stored in the memory 105 causes a computer to execute the procedures or methods of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27.
The wakefulness estimation device 1a includes an input interface device 102 and an output interface device 103 that perform wired or wireless communication with devices such as the imaging device 3 or an output device (not shown).
In the first embodiment described above, it was assumed that the person whose degree of wakefulness is estimated by the wakefulness estimation device 1 is the driver of the vehicle and that there is only one such person; however, this is merely an example. There may be a plurality of people whose degree of wakefulness is estimated by the wakefulness estimation device. For example, the wakefulness estimation device may estimate the degrees of wakefulness of a plurality of vehicle occupants, including the driver, a front passenger, or a rear-seat passenger.
When the wakefulness estimation device 1 has the configuration shown in FIG. 1 and does not have a personal authentication function, the feature point detection unit 22, for example, determines the seat position of each occupant based on the captured image and detects a plurality of feature points for each seat position. The feature point detection unit 22 then outputs, to the distance calculation unit 23, information in which the detected feature points are associated with information that can identify each occupant, as feature point information. Here, the information that can identify each occupant is information about the seat position.
The distance calculation unit 23 calculates the distance between feature points for each occupant corresponding to the seat position determined by the feature point detection unit 22. The distance calculation unit 23 causes the storage unit 26 to store, as distance information, the distance between feature points associated with the information that can identify each occupant, and also outputs the distance information to the face determination unit 25.
The reference distance calculation unit 24 calculates the distance between reference feature points for each occupant corresponding to the seat position determined by the feature point detection unit 22. The reference distance calculation unit 24 causes the storage unit 26 to store, as reference distance information, the distance between reference feature points associated with the information that can identify each occupant.
Based on the distance information and the reference distance information, the face determination unit 25 determines, for each occupant, whether the occupant is making a patient face.
The face determination unit 25 outputs, to the face determination information acquisition unit 11, information in which the information that can identify each occupant is associated with information indicating whether the occupant is making a patient face, as face determination information.
When the recalculation instruction unit 251 instructs the reference distance calculation unit 24 to recalculate the distance between reference feature points, it also outputs the information that can identify each occupant. After recalculating the distance between reference feature points, the reference distance calculation unit 24 updates the reference distance information of the corresponding occupant stored in the storage unit 26.
The estimation unit 13 estimates the degree of wakefulness for each occupant. The estimation unit 13 can identify each occupant from, for example, the feature point information output from the feature point detection unit 22 and the face determination information output from the face determination information acquisition unit 11.
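A minimal sketch of the per-occupant determination described above, assuming feature points are grouped by seat position, is given below; the function names, the first threshold value, and the coordinate values are hypothetical and not taken from the disclosure.

```python
# Illustrative sketch of the per-occupant flow, with feature points keyed
# by seat position. The disclosure defines functional units, not this API.

def between_brows_distance(points: dict[str, tuple[float, float, float]]) -> float:
    """Euclidean distance between the inner ends of the eyebrows,
    given 3D coordinates for named feature points."""
    (x1, y1, z1) = points["left_brow_inner"]
    (x2, y2, z2) = points["right_brow_inner"]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2) ** 0.5

def is_patient_face(distance: float, reference: float, first_threshold: float) -> bool:
    # A patient face is judged when the inter-feature-point distance is
    # smaller than the reference distance by the first threshold or more.
    return (reference - distance) >= first_threshold

# Hypothetical per-seat inputs (arbitrary units)
occupants = {
    "driver_seat": {"left_brow_inner": (0.0, 0.0, 0.0),
                    "right_brow_inner": (18.0, 0.5, 0.2)},
    "front_passenger_seat": {"left_brow_inner": (0.0, 0.0, 0.0),
                             "right_brow_inner": (21.0, 0.1, 0.0)},
}
references = {"driver_seat": 21.0, "front_passenger_seat": 21.5}

for seat, points in occupants.items():
    d = between_brows_distance(points)
    print(seat, is_patient_face(d, references[seat], first_threshold=2.0))
```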
When the wakefulness estimation device 1a has the configuration shown in FIG. 7 and has a personal authentication function, the face determination device 2a may determine, for each occupant, whether the occupant is making a patient face based on the personal authentication result, and the estimation unit 13 may estimate the degree of wakefulness for each occupant based on the personal authentication result.
Specifically, the feature point detection unit 22 detects a plurality of feature points for each occupant based on the result of personal authentication by the personal authentication unit 27. Note that the arrow from the personal authentication unit 27 to the feature point detection unit 22 is omitted in FIG. 7. The feature point detection unit 22 outputs, to the distance calculation unit 23, information in which the detected feature points are associated with the personal authentication result, as feature point information.
The distance calculation unit 23 calculates the distance between feature points for each occupant based on the result of personal authentication by the personal authentication unit 27. The distance calculation unit 23 causes the storage unit 26 to store, as distance information, the calculated distance between feature points associated with the personal authentication result, and also outputs the distance information to the face determination unit 25.
The reference distance calculation unit 24 calculates the distance between reference feature points for each occupant based on the result of personal authentication by the personal authentication unit 27. The reference distance calculation unit 24 causes the storage unit 26 to store, as reference distance information, the distance between reference feature points associated with the personal authentication result.
The face determination unit 25 determines, for each occupant, whether the occupant is making a patient face based on the result of personal authentication by the personal authentication unit 27. The face determination unit 25 outputs, to the face determination information acquisition unit 11, information in which the information indicating whether the occupant is making a patient face is associated with the personal authentication result, as face determination information.
The estimation unit 13 estimates the degree of wakefulness for each occupant based on the result of personal authentication by the personal authentication unit 27. The estimation unit 13 can identify each occupant from, for example, the feature point information output from the feature point detection unit 22 and the face determination information output from the face determination information acquisition unit 11.
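The same flow keyed by a personal authentication result rather than a seat position might look like the following sketch; the `authenticate` placeholder and all identifiers are assumptions, since the disclosure does not specify an API.

```python
# Sketch of the FIG. 7 configuration: processing keyed by a personal
# authentication result. The authentication step is mocked.

def authenticate(face_image_region: bytes) -> str:
    # Placeholder: a real personal authentication unit 27 would match the
    # face against registered persons; here a fixed ID is returned.
    return "person_001"

def process_occupant(face_image_region: bytes,
                     distances: dict[str, float],
                     references: dict[str, float],
                     first_threshold: float) -> bool:
    person_id = authenticate(face_image_region)
    # Select the per-individual reference distance for this person.
    reference = references[person_id]
    return (reference - distances[person_id]) >= first_threshold

references = {"person_001": 21.0}   # hypothetical reference distance
distances = {"person_001": 18.2}    # hypothetical current distance
print(process_occupant(b"", distances, references, first_threshold=2.0))  # True
```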
Further, in the first embodiment described above, the wakefulness estimation devices 1 and 1a are in-vehicle devices mounted in a vehicle, and the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and, in the case where the wakefulness estimation device 1a has the configuration shown in FIG. 7, the personal authentication unit 27 are provided in the in-vehicle device.
This is not a limitation. Some of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27 may be mounted in the in-vehicle device of the vehicle, while the others are provided in a server connected to the in-vehicle device via a network, so that the in-vehicle device and the server together constitute a wakefulness estimation system.
Alternatively, all of the face determination information acquisition unit 11, the brightness detection unit 12, the estimation unit 13, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, the face determination unit 25, and the personal authentication unit 27 may be provided in the server.
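As one hedged illustration of such a split, the in-vehicle side could send per-frame measurements to a server hosting the determination and estimation units; the message format, field names, and threshold values below are assumptions of this sketch, not part of the disclosure.

```python
# Sketch of an in-vehicle/server split: the in-vehicle device serializes
# per-frame measurements; the server performs face determination and
# brightness-aware wakefulness estimation. Transport is assumed.

import json

def build_request(occupant_id: str, feature_distance: float, brightness: float) -> bytes:
    # Payload an in-vehicle device might send over the network.
    return json.dumps({
        "occupant_id": occupant_id,
        "feature_distance": feature_distance,
        "brightness": brightness,
    }).encode("utf-8")

def server_estimate(payload: bytes, reference: float,
                    first_threshold: float, brightness_threshold: float) -> str:
    # Server side: patient-face determination plus brightness check.
    msg = json.loads(payload)
    patient_face = (reference - msg["feature_distance"]) >= first_threshold
    if patient_face and msg["brightness"] < brightness_threshold:
        return "wakefulness decreased"
    return "wakefulness not decreased"

req = build_request("driver", feature_distance=18.2, brightness=45.0)
print(server_estimate(req, reference=21.0,
                      first_threshold=2.0, brightness_threshold=80.0))
```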
Further, in the first embodiment described above, the face determination device 2 is provided in the wakefulness estimation device 1 and the face determination device 2a is provided in the wakefulness estimation device 1a, but this is merely an example. The face determination device 2 may be external to the wakefulness estimation device 1 and connected to it, and the face determination device 2a may likewise be external to the wakefulness estimation device 1a and connected to it.
Some of the components of the face determination device 2 may instead be provided in the wakefulness estimation device 1, and some of the components of the face determination device 2a may be provided in the wakefulness estimation device 1a. For example, the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, or the reference distance calculation unit 24 may be provided in the wakefulness estimation device 1, and the image acquisition unit 21, the feature point detection unit 22, the distance calculation unit 23, the reference distance calculation unit 24, or the personal authentication unit 27 may be provided in the wakefulness estimation device 1a.
Further, in the first embodiment described above, the person whose degree of wakefulness is estimated by the wakefulness estimation devices 1 and 1a is the driver of the vehicle or an occupant of the vehicle including the driver, but this is merely an example. The wakefulness estimation devices 1 and 1a can estimate the degree of wakefulness of any person, not only a vehicle occupant. For example, the wakefulness estimation devices 1 and 1a can estimate the degree of wakefulness of a person in a moving body other than a vehicle, or of a person in a room. In this case, the imaging device 3 is also not limited to one mounted in a vehicle; it only needs to be installed at a position from which it can capture the range in which the face of the person whose degree of wakefulness is to be estimated should exist.
As described above, according to the first embodiment, the wakefulness estimation devices 1 and 1a include the face determination information acquisition unit 11 that acquires face determination information indicating whether a person has a face with wrinkled brows (a patient face), the brightness detection unit 12 that detects the brightness of a captured image in which the range where the person's face should exist is captured, and the estimation unit 13 that estimates the person's degree of wakefulness based on the face determination information acquired by the face determination information acquisition unit 11 and the brightness detected by the brightness detection unit 12. Therefore, when a person is wrinkling their brows, the wakefulness estimation device 1 estimates the person's degree of wakefulness in consideration of whether the surrounding situation is one in which the person would be expected to feel dazzled, and can thereby prevent erroneous estimation of a decrease in the person's wakefulness.
Note that any component of the embodiment may be modified, and any component of the embodiment may be omitted.
The wakefulness estimation device of the present disclosure estimates a person's degree of wakefulness in consideration of whether, when the person is wrinkling their brows, the surrounding situation is one in which the person would be expected to feel dazzled, and can therefore prevent erroneous estimation of a decrease in the person's wakefulness.
1, 1a: wakefulness estimation device; 11: face determination information acquisition unit; 12: brightness detection unit; 13: estimation unit; 2, 2a: face determination device; 21: image acquisition unit; 22: feature point detection unit; 23: distance calculation unit; 24: reference distance calculation unit; 25: face determination unit; 251: recalculation instruction unit; 26: storage unit; 27: personal authentication unit; 3: imaging device; 101: processing circuit; 102: input interface device; 103: output interface device; 104: processor; 105: memory.

Claims (13)

1. A wakefulness estimation device comprising:
   a face determination information acquisition unit that acquires face determination information indicating whether a person has a face with wrinkled brows;
   a brightness detection unit that detects brightness in a captured image in which a range where the person's face should exist is captured; and
   an estimation unit that estimates the person's degree of wakefulness based on the face determination information acquired by the face determination information acquisition unit and the brightness detected by the brightness detection unit.
2. The wakefulness estimation device according to claim 1, wherein the estimation unit estimates that the person's wakefulness has decreased when the face determination information acquired by the face determination information acquisition unit indicates that the person has a face with wrinkled brows and the brightness detected by the brightness detection unit is less than a brightness determination threshold.
3. The wakefulness estimation device according to claim 1, wherein the estimation unit estimates that the person's wakefulness has not decreased when the face determination information acquired by the face determination information acquisition unit indicates that the person has a face with wrinkled brows and the brightness detected by the brightness detection unit is equal to or greater than a brightness determination threshold.
4. The wakefulness estimation device according to claim 2, wherein, even when the face determination information acquired by the face determination information acquisition unit indicates that the person has a face with wrinkled brows and the brightness detected by the brightness detection unit is less than the brightness determination threshold, the estimation unit estimates that the person's wakefulness has not decreased if it determines, based on the brightness distribution of the captured image, that the area around the person's face is bright.
5. The wakefulness estimation device according to claim 3, wherein, even when the face determination information acquired by the face determination information acquisition unit indicates that the person has a face with wrinkled brows and the brightness detected by the brightness detection unit is equal to or greater than the brightness determination threshold, the estimation unit estimates that the person's wakefulness has decreased if it determines, based on the brightness distribution of the captured image, that there is a shadow around the person's face.
6. The wakefulness estimation device according to claim 1, wherein, when the face determination information acquired by the face determination information acquisition unit indicates that the person does not have a face with wrinkled brows, the estimation unit estimates the person's degree of wakefulness based on the captured image.
7. The wakefulness estimation device according to claim 1, wherein the brightness detection unit uses the average of the brightness values of the pixels of the captured image as the brightness of the captured image.
8. The wakefulness estimation device according to claim 1, further comprising:
   an image acquisition unit that acquires the captured image;
   a feature point detection unit that detects a plurality of feature points of the person's face in the captured image acquired by the image acquisition unit;
   a distance calculation unit that converts coordinates, in the captured image, of the plurality of feature points detected by the feature point detection unit into three-dimensional coordinates and calculates, from the converted three-dimensional coordinates, a distance between feature points, which is a distance between the plurality of feature points; and
   a face determination unit that determines whether the person has a face with wrinkled brows by comparing the distance between feature points calculated by the distance calculation unit with a distance between reference feature points, and determines that the person has a face with wrinkled brows when the distance between feature points is smaller than the distance between reference feature points by a first threshold or more.
9. The wakefulness estimation device according to claim 8, further comprising a reference distance calculation unit that calculates the distance between reference feature points based on a history, over a reference calculation target period, of the distance between feature points calculated by the distance calculation unit,
   wherein the face determination unit compares the distance between feature points calculated by the distance calculation unit with the distance between reference feature points calculated by the reference distance calculation unit.
10. The wakefulness estimation device according to claim 9, further comprising a recalculation instruction unit that instructs the reference distance calculation unit to recalculate the distance between reference feature points when, as a result of the face determination unit comparing the distance between feature points with the distance between reference feature points, the distance between feature points is determined to be larger than the distance between reference feature points by a second threshold or more,
   wherein the reference distance calculation unit recalculates the distance between reference feature points when instructed to do so by the recalculation instruction unit.
11. The wakefulness estimation device according to claim 9, further comprising a personal authentication unit that performs personal authentication of the person,
   wherein the reference distance calculation unit calculates the distance between reference feature points for each individual based on the result of the personal authentication by the personal authentication unit, and
   the face determination unit specifies the distance between reference feature points to be compared with the distance between feature points, based on the result of the personal authentication by the personal authentication unit and the distance between reference feature points for each individual calculated by the reference distance calculation unit.
12. The wakefulness estimation device according to claim 8, wherein the plurality of feature points are the inner ends of both eyebrows of the person, and the distance between feature points is the distance between the inner ends of both eyebrows of the person.
13. A wakefulness estimation method comprising:
   acquiring, by a face determination information acquisition unit, face determination information indicating whether a person has a face with wrinkled brows;
   detecting, by a brightness detection unit, brightness in a captured image in which a range where the person's face should exist is captured; and
   estimating, by an estimation unit, the person's degree of wakefulness based on the face determination information acquired by the face determination information acquisition unit and the brightness detected by the brightness detection unit.
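For illustration only, the decision logic of claims 2 to 7 can be sketched as follows; the threshold values, grayscale convention, and function names are assumptions of this sketch and do not form part of the claims.

```python
# Minimal sketch of the claimed estimation logic (claims 2 to 7).

def average_brightness(pixels: list[list[int]]) -> float:
    # Claim 7: the brightness of the captured image is the average of the
    # brightness values of its pixels.
    values = [v for row in pixels for v in row]
    return sum(values) / len(values)

def estimate_wakefulness(patient_face: bool,
                         brightness: float,
                         brightness_threshold: float,
                         face_area_bright: bool = False,
                         face_area_shadowed: bool = False) -> str:
    if not patient_face:
        # Claim 6: fall back to estimation from the captured image itself;
        # that estimation is not modeled here.
        return "estimate from captured image"
    if brightness < brightness_threshold:
        # Claims 2 and 4: a dark image suggests decreased wakefulness,
        # unless the area around the face is judged bright.
        return "not decreased" if face_area_bright else "decreased"
    # Claims 3 and 5: a bright image suggests dazzle (not decreased),
    # unless there is a shadow around the face.
    return "decreased" if face_area_shadowed else "not decreased"

image = [[40, 50], [60, 70]]   # hypothetical 2x2 grayscale image
b = average_brightness(image)  # 55.0
print(estimate_wakefulness(True, b, brightness_threshold=80.0))  # decreased
```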

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/043119 WO2023095229A1 (en) 2021-11-25 2021-11-25 Wakefulness estimation device and wakefulness estimation method
JP2023561121A JP7403729B2 (en) 2021-11-25 2021-11-25 Awakening level estimation device and arousal level estimation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/043119 WO2023095229A1 (en) 2021-11-25 2021-11-25 Wakefulness estimation device and wakefulness estimation method

Publications (1)

Publication Number Publication Date
WO2023095229A1 2023-06-01

Family

Family ID: 86539106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/043119 WO2023095229A1 (en) 2021-11-25 2021-11-25 Wakefulness estimation device and wakefulness estimation method

Country Status (2)

Country Link
JP (1) JP7403729B2 (en)
WO (1) WO2023095229A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007245911A (en) * 2006-03-15 2007-09-27 Omron Corp Monitoring device and method, recording medium and program
JP2008212298A (en) * 2007-03-01 2008-09-18 Toyota Central R&D Labs Inc Sleepiness judging apparatus and program
JP2009096384A (en) * 2007-10-18 2009-05-07 Aisin Seiki Co Ltd Driving assistance system and method
JP2009201676A (en) * 2008-02-27 2009-09-10 Toyota Motor Corp Awakening degree estimating apparatus
WO2020121425A1 (en) * 2018-12-12 2020-06-18 三菱電機株式会社 State determination device, state determination method, and state determination program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4375420B2 (en) 2007-03-26 2009-12-02 株式会社デンソー Sleepiness alarm device and program
JP5208711B2 (en) 2008-12-17 2013-06-12 アイシン精機株式会社 Eye open / close discrimination device and program
JP2011043961A (en) 2009-08-20 2011-03-03 Toyota Motor Corp Driver monitoring device
JP6075983B2 (en) 2012-07-03 2017-02-08 大和ハウス工業株式会社 Discomfort degree estimation system and discomfort degree estimation method

Also Published As

Publication number Publication date
JP7403729B2 (en) 2023-12-22
JPWO2023095229A1 (en) 2023-06-01

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21965601; Country of ref document: EP; Kind code of ref document: A1)
WWE WIPO information: entry into national phase (Ref document number: 2023561121; Country of ref document: JP)