WO2020174601A1 - Alertness level estimation device, automatic driving assistance device, and alertness level estimation method - Google Patents


Info

Publication number
WO2020174601A1
WO2020174601A1 (PCT/JP2019/007500)
Authority
WO
WIPO (PCT)
Prior art keywords
occupant
upper eyelid
downward
opening
estimation device
Prior art date
Application number
PCT/JP2019/007500
Other languages
French (fr)
Japanese (ja)
Inventor
Kazuki Kunihiro
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2021501448A priority Critical patent/JP7109649B2/en
Priority to PCT/JP2019/007500 priority patent/WO2020174601A1/en
Priority to DE112019006953.5T priority patent/DE112019006953T5/en
Publication of WO2020174601A1 publication Critical patent/WO2020174601A1/en

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 - Other medical applications
    • A61B5/4806 - Sleep evaluation
    • A61B5/4809 - Sleep detection, i.e. determining whether a subject is asleep or not
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models related to drivers or passengers
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • B60W50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/59 - Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 - Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/06 - Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0818 - Inactivity or incapacity of driver
    • B60W2040/0827 - Inactivity or incapacity of driver due to sleepiness
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • B60W50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 - Alarm means
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 - Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 - Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 - Image sensing, e.g. optical camera
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00 - Input parameters relating to occupants
    • B60W2540/229 - Attention level, e.g. attentive to driving, reading or sleeping

Definitions

  • the present invention relates to a technique for estimating the wakefulness of a vehicle occupant.
  • a wakefulness estimation device is known that estimates the wakefulness of a driver (the presence or absence of drowsiness) based on the degree of eye opening obtained from an image of the driver's face captured by an in-vehicle camera (for example, Patent Document 1).
  • the arousal level estimation device of Patent Document 1 estimates the arousal level from the driver's degree of eye opening and also determines whether the driver is in a downward-looking state. When it determines that the driver is looking downward, it prevents erroneous determination by correcting the arousal level estimated from the eye opening in the direction of increasing it.
  • in Patent Document 1, the driver's degree of eye opening is calculated from the distance between the highest point of the upper-eyelid contour and the lowest point of the lower-eyelid contour, and whether the driver is in the downward-looking state is determined from the shapes of the upper-eyelid and lower-eyelid contours.
  • however, the contour of the lower eyelid is less distinct than that of the upper eyelid and is difficult to detect with high accuracy. In the method of Patent Document 1, there is therefore a concern that the error in the calculated eye opening becomes large and, as a result, the reliability of the estimated arousal level of the driver becomes low.
  • one known countermeasure installs a light source in the vehicle that casts the shadow of the lower eyelid onto the driver's eyeball and photographs the shadow with a camera, thereby improving the detection accuracy of the lower-eyelid contour.
  • the present invention has been made to solve the above problems, and an object thereof is to provide a wakefulness estimation device capable of estimating the wakefulness of a vehicle occupant with high accuracy.
  • the arousal level estimation device includes: a face image acquisition unit that acquires a face image, which is an image of the face of a vehicle occupant; an upper eyelid opening calculation unit that calculates the opening of the upper eyelid based on the positions of the inner corner of the eye, the outer corner of the eye, and the highest point of the upper eyelid detected from the face image; a downward-viewing determination unit that, when the opening of the upper eyelid is smaller than a predetermined opening threshold, determines whether the occupant is in a downward-looking state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of a vertical line drawn from that highest point with the contour line of the lower eyelid; and a wakefulness estimation unit that estimates that the wakefulness of the occupant has decreased when the opening of the upper eyelid is smaller than the opening threshold and it is determined that the occupant is not in the downward-looking state.
  • the degree of wakefulness of the vehicle occupant is thus estimated based on the opening of the occupant's upper eyelid and on the determination of whether the occupant is in the downward-looking state (the downward-viewing determination). Since the opening of the upper eyelid can be calculated without using the position of the lower eyelid, it can be calculated with high accuracy.
  • the position of the lower-eyelid contour is used for the downward-viewing determination, but that determination is performed only while the upper-eyelid opening is small, that is, while the lower eyelid bulges and its contour is clear, so it too can be performed with high accuracy. As a result, the estimation accuracy of the occupant's wakefulness is also high.
  • FIG. 1 is a block diagram showing the configuration of the alertness estimation device according to the first embodiment.
  • FIG. 2 is a flowchart showing the operation of the arousal level estimation device according to the first embodiment.
  • FIG. 3 is a block diagram showing a modified example of the arousal level estimation device according to the first embodiment. FIG. 4 and FIG. 5 are diagrams each showing an example of the hardware configuration of the arousal level estimation device.
  • FIG. 6 is a block diagram showing a configuration of an automatic driving support device according to a second embodiment.
  • FIG. 1 is a block diagram showing the configuration of the arousal level estimation device 10 according to the first embodiment.
  • wakefulness estimation device 10 is installed in a vehicle.
  • the arousal level estimation device 10 is not limited to one permanently installed in the vehicle; it may be built into a portable device that can be brought into the vehicle, such as a mobile phone, a smartphone, or a portable navigation device. Further, part or all of the arousal level estimation device 10 may be implemented on a server that can communicate with the vehicle.
  • the alertness estimation device 10 is connected to a camera 21 and a warning device 22 provided in the vehicle.
  • the camera 21 is for photographing the inside of the vehicle, and is installed at a position where the face of the vehicle occupant can be photographed.
  • the camera 21 is assumed to be a wide-angle camera having a shooting range capable of shooting the faces of passengers in all seats of the vehicle.
  • the image capturing range of the camera 21 may be wide enough to capture only the driver's seat.
  • the arousal level estimation device 10 estimates the arousal level (presence or absence of drowsiness) of the occupant based on the image of the occupant's face captured by the camera 21 (hereinafter referred to as the "face image"), and outputs the estimation result to the warning device 22.
  • the alertness estimation device 10 estimates the alertness of all occupants captured by the camera 21.
  • the arousal level estimation device 10 outputs information on the seat position of each occupant together with the estimation result of each occupant's arousal level. That is, the wakefulness estimation device 10 outputs information indicating which seat is occupied by an occupant estimated to have high wakefulness and which seat is occupied by an occupant estimated to have low wakefulness.
  • the warning device 22 issues a warning inside the vehicle according to the wakefulness of the occupant estimated by the wakefulness estimation device 10, and includes, for example, a speaker for emitting a warning sound or warning message and a display for showing a warning screen.
  • the warning device 22 gives a warning when it is estimated that the awakening level of the occupant (driver) in the driver's seat has decreased.
  • the operation content of the warning device 22 is not limited, and for example, the warning may be issued even when it is estimated that the wakefulness of an occupant other than the driver has decreased.
  • the arousal level estimation device 10 includes a face image acquisition unit 11, an upper eyelid opening calculation unit 12, a downward vision determination unit 13, and an arousal level estimation unit 14.
  • the face image acquisition unit 11 acquires the face image of the occupant captured by the camera 21.
  • the face image acquisition unit 11 uses face recognition technology to extract a face image for each occupant from the in-vehicle image captured by the camera 21.
  • the upper eyelid opening degree calculation unit 12 detects, from the occupant's face image acquired by the face image acquisition unit 11, the position of the inner corner of the eye, the position of the outer corner of the eye, and the position of the highest point of the upper eyelid, and calculates the degree of opening of the upper eyelid (hereinafter referred to as the "opening") based on their positional relationship. More specifically, the upper eyelid opening degree calculation unit 12 calculates the opening of the upper eyelid based on the first distance, which is the distance between the straight line connecting the inner and outer corners of the eye and the highest point of the upper eyelid. As described above, the contour of the lower eyelid is more difficult to detect with high accuracy than the contour of the upper eyelid.
  • since the opening of the upper eyelid calculated by the upper eyelid opening degree calculation unit 12 can be obtained without using information on the contour line of the lower eyelid, its accuracy is high.
  • the highest point of the upper eyelid is the point (vertex) of the upper eyelid that is farthest from the straight line connecting the outer corner of the eye and the inner corner of the eye.
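For illustration, the selection of the highest point described above can be sketched as follows (a minimal Python sketch; the function name and coordinate convention are assumptions, not part of the publication):

```python
import math

def upper_eyelid_peak(inner_corner, outer_corner, upper_contour):
    """Highest point of the upper eyelid: the contour point farthest from
    the straight line through the inner and outer corners of the eye."""
    ax, ay = inner_corner
    bx, by = outer_corner
    norm = math.hypot(bx - ax, by - ay)

    def line_distance(point):
        # Perpendicular distance from `point` to the eye-corner line.
        px, py = point
        return abs((by - ay) * px - (bx - ax) * py + bx * ay - by * ax) / norm

    return max(upper_contour, key=line_distance)
```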
  • specifically, the upper eyelid opening degree calculation unit 12 obtains the flatness of the eye by dividing the first distance by the distance between the inner and outer corners of the eye, and calculates, as the opening of the upper eyelid, the value obtained by dividing that flatness by a predetermined reference value.
  • the reference value of the flatness of the eyes may be, for example, a constant value determined by the manufacturer of the alertness estimation device 10 as an average value of the flatness of the eyes.
  • if the wakefulness estimation device 10 can identify an individual occupant by face authentication using the face image or the like, a reference value for each occupant may be registered in the wakefulness estimation device 10. By using a different reference value for each occupant, the influence of individual differences in eye size on the calculated upper eyelid opening can be suppressed.
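The calculation described above (first distance, flatness, and division by a reference value) can be sketched as follows; the reference flatness of 0.35 is a hypothetical placeholder, since the publication leaves the constant to the manufacturer:

```python
import math

def upper_eyelid_opening(inner_corner, outer_corner, eyelid_peak,
                         reference_flatness=0.35):
    """Opening degree of the upper eyelid per the scheme described above.

    inner_corner, outer_corner, eyelid_peak: (x, y) pixel coordinates.
    reference_flatness: assumed average eye flatness (hypothetical value).
    """
    ax, ay = inner_corner
    bx, by = outer_corner
    px, py = eyelid_peak
    corner_dist = math.hypot(bx - ax, by - ay)
    # First distance: perpendicular distance from the eyelid peak to the
    # straight line through the inner and outer corners of the eye.
    first_distance = abs((by - ay) * px - (bx - ax) * py
                         + bx * ay - by * ax) / corner_dist
    flatness = first_distance / corner_dist
    return flatness / reference_flatness
```

A per-occupant `reference_flatness`, as suggested above, would simply replace the default argument.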
  • the downward viewing determination unit 13 performs the downward-viewing determination, which determines whether the occupant is in the downward-looking state, when the occupant's upper eyelid opening calculated by the upper eyelid opening calculation unit 12 is smaller than a predetermined threshold (hereinafter referred to as the "opening threshold"). Specifically, when the opening of the upper eyelid is smaller than the opening threshold, the downward viewing determination unit 13 detects, from the face image acquired by the face image acquisition unit 11, the position of the highest point of the upper eyelid and the contour line of the lower eyelid, and obtains the position of the intersection of the vertical line drawn from the highest point of the upper eyelid with the contour line of the lower eyelid (hereinafter referred to as the "specific point of the lower eyelid").
  • the specific point of the lower eyelid may be an intersection of a perpendicular line descending from the highest point of the upper eyelid toward a straight line connecting the inner corner of the eye and the outer corner of the eye and the contour line of the lower eyelid.
  • when the opening of the upper eyelid is small, the lower eyelid bulges and its contour becomes clear, so the contour line of the lower eyelid can be detected from the face image with high accuracy.
  • the position of the lower-eyelid contour is needed for the downward-viewing determination performed by the downward viewing determination unit 13, but since that determination is performed only when the opening of the upper eyelid is small, it uses a lower-eyelid contour position obtained with high accuracy. The downward-viewing determination performed by the downward viewing determination unit 13 is therefore highly accurate.
  • when the opening of the upper eyelid is smaller than the opening threshold, the downward viewing determination unit 13 determines whether the occupant is in the downward-looking state based on the second distance, which is the distance between the highest point of the upper eyelid and the specific point of the lower eyelid. That is, when the opening of the upper eyelid is smaller than the opening threshold and the second distance is larger than a predetermined threshold (hereinafter referred to as the "distance threshold"), the downward viewing determination unit 13 determines that the occupant is in the downward-looking state.
  • the condition "the upper eyelid opening is smaller than the opening threshold and the second distance is larger than the distance threshold” may be referred to as “downward viewing condition”.
  • the second distance may also be referred to as the “upper and lower eyelid distance”.
  • the distance threshold may be, for example, a constant value determined by the manufacturer of the arousal level estimation device 10 as the size of the minimum distance between the upper and lower eyelids required for downward viewing.
  • a distance threshold for each occupant may be registered in the wakefulness estimation device 10.
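Putting the pieces above together, the downward-viewing determination (specific point of the lower eyelid plus the downward-viewing condition) might look like the following sketch. The polyline contour representation, the assumption of an upright face (so the perpendicular is a vertical line), and both threshold values are hypothetical:

```python
def lower_eyelid_specific_point(eyelid_peak, lower_contour):
    """Point where a vertical line through the upper-eyelid peak crosses
    the lower-eyelid contour (the "specific point" described above).

    Assumes the face image is roughly upright, so the perpendicular to
    the eye-corner line is approximated by the vertical line x = peak_x.
    lower_contour: list of (x, y) points ordered by increasing x.
    """
    px, _ = eyelid_peak
    for (x0, y0), (x1, y1) in zip(lower_contour, lower_contour[1:]):
        if x0 <= px <= x1:
            t = (px - x0) / (x1 - x0) if x1 != x0 else 0.0
            return (px, y0 + t * (y1 - y0))
    raise ValueError("peak x lies outside the lower-eyelid contour")

def is_looking_down(opening, eyelid_peak, lower_contour,
                    opening_threshold=0.5, distance_threshold=4.0):
    """Downward-viewing condition described above: small upper-eyelid
    opening AND a large second (upper-lower eyelid) distance.
    Both thresholds are hypothetical, not from the publication."""
    if opening >= opening_threshold:
        return False
    _, sy = lower_eyelid_specific_point(eyelid_peak, lower_contour)
    second_distance = abs(sy - eyelid_peak[1])
    return second_distance > distance_threshold
```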
  • the awakening degree estimation unit 14 estimates whether the occupant's wakefulness has decreased based on the occupant's upper eyelid opening calculated by the upper eyelid opening calculation unit 12 and the result of the downward-viewing determination performed by the downward viewing determination unit 13. Specifically, the wakefulness estimation unit 14 estimates that the wakefulness of the occupant has decreased when the occupant's upper eyelid opening is smaller than the opening threshold and it is determined that the occupant is not in the downward-looking state. As described above, the upper eyelid opening calculation unit 12 can calculate the occupant's upper eyelid opening with high accuracy, and the downward viewing determination unit 13 can perform the downward-viewing determination with high accuracy, so the estimation of the occupant's wakefulness by the wakefulness estimation unit 14 is also highly accurate.
  • the condition that "the occupant's upper eyelid opening is smaller than the opening threshold value and the occupant is not in the downward-looking state” may also be referred to as "awakening degree lowering condition”.
  • as described above, according to the arousal level estimation device 10 of the first embodiment, the occupant's arousal level can be estimated with high accuracy while taking into account whether the occupant is in the downward-looking state. Furthermore, detection of the lower-eyelid contour, which is a relatively difficult process, need only be performed when the downward-viewing determination is performed (that is, when the opening of the upper eyelid is smaller than the opening threshold), which also contributes to reducing the processing load of the arousal level estimation device 10.
  • FIG. 2 is a flowchart showing the operation of the alertness estimation device 10 according to the first embodiment. The operation of the alertness estimation device 10 will be described below with reference to FIG.
  • the face image acquisition unit 11 acquires the face image of the occupant captured by the camera 21 (step S101).
  • here, for simplicity, it is assumed that the image captured by the camera 21 includes the face of only one occupant.
  • however, face images of a plurality of occupants may be acquired in step S101. In that case, the subsequent processes of steps S102 to S108 are performed on the face image of each occupant.
  • the upper eyelid opening degree calculation unit 12 detects, from the occupant's face image acquired in step S101, the position of the inner corner of the eye, the position of the outer corner of the eye, and the position of the highest point of the upper eyelid, and calculates the opening of the occupant's upper eyelid based on their positional relationship (step S102).
  • when the opening calculated in step S102 is smaller than the opening threshold (YES in step S103), the downward viewing determination unit 13 performs, based on the occupant's face image acquired in step S101, the downward-viewing determination of whether the occupant is in the downward-looking state (step S104).
  • when it is determined in step S104 that the occupant is not in the downward-looking state (NO in step S105), the awakening degree estimation unit 14 estimates that the awakening degree of the occupant has decreased (step S106).
  • conversely, when the opening of the occupant's upper eyelid calculated in step S102 is not smaller than the opening threshold (NO in step S103), or when it is determined in step S104 that the occupant is in the downward-looking state (YES in step S105), the awakening degree estimation unit 14 estimates that the awakening degree of the occupant is high (step S107).
  • the estimation result of the wakefulness of the occupant in step S106 or S107 is output to the warning device 22 (step S108).
  • the warning device 22 issues a warning according to the estimation result of the awakening degree of the occupant.
  • the arousal level estimation device 10 repeatedly executes the above flow.
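The repeated flow of FIG. 2 can be summarized in code. This is only a sketch: the callables are hypothetical stand-ins for the units of FIG. 1 and the camera/warning devices, and the threshold value is an assumed constant:

```python
OPENING_THRESHOLD = 0.5  # hypothetical value; the publication leaves it unspecified

def estimation_flow(face_images, measure, looking_down, report):
    """One pass over the flow of FIG. 2 (steps S101-S108) per face image.

    face_images  -- iterable of per-occupant face images (S101)
    measure      -- callable returning the upper-eyelid opening (S102)
    looking_down -- callable performing the downward-view determination (S104)
    report       -- callable standing in for output to warning device 22 (S108)
    Returns a list of booleans: True = wakefulness estimated high.
    """
    results = []
    for img in face_images:
        opening = measure(img)                       # S102
        if opening < OPENING_THRESHOLD:              # YES in S103
            awake = looking_down(img)                # S104/S105 -> S106 or S107
        else:                                        # NO in S103
            awake = True                             # S107
        report(img, awake)                           # S108
        results.append(awake)
    return results
```

Note that, as stated above, the (relatively costly) downward-view determination is invoked only when the opening is below the threshold.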
  • this flow may be executed constantly after the awakening degree estimation device 10 is activated (after the vehicle's ignition is turned on), or only while the vehicle is running (that is, not executed while the vehicle is stopped).
  • in the former case, the wakefulness of the occupant is estimated even before the vehicle starts moving, which can contribute to improved vehicle safety.
  • in the above description, the wakefulness estimation unit 14 outputs the estimation result of the occupant's wakefulness to the warning device 22, but the output destination is not limited to the warning device 22 and may be any device.
  • the estimation result of the wakefulness of the occupant by the wakefulness estimation unit 14 may be input to the automatic driving control device 23 that is an ECU (Electronic Control Unit) that automatically drives the vehicle.
  • for example, an operation such as the following becomes possible: when the alertness estimation unit 14 estimates that the alertness of the occupant (driver) in the driver's seat has decreased, the warning device 22 issues a warning, and if the driver's alertness remains low, the automatic driving control device 23 automatically evacuates the vehicle to a safe place (for example, a wide road shoulder or a parking area) and stops it.
  • in the above description, the downward viewing determination unit 13 determines that the occupant is in the downward-looking state when the occupant's upper eyelid opening is smaller than the opening threshold and the occupant's upper-lower eyelid distance (second distance) is larger than the distance threshold. Alternatively, the downward viewing determination unit 13 may determine that the occupant is in the downward-looking state only when the occupant's eyes have satisfied the downward-viewing condition for a certain length of time. Specifically, it may determine that the occupant is in the downward-looking state when the proportion of a fixed period during which the downward-viewing condition is satisfied exceeds a predetermined threshold (time threshold).
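The time-ratio variant described above admits a simple sliding-window sketch; the window length and time threshold are hypothetical values, and per-frame evaluation of the downward-viewing condition is assumed:

```python
from collections import deque

class DownwardViewFilter:
    """Temporal filtering of the downward-viewing condition: the occupant
    is judged to be looking down only when the fraction of recent frames
    satisfying the condition exceeds a time threshold."""

    def __init__(self, window=30, time_threshold=0.8):
        self.history = deque(maxlen=window)   # per-frame condition results
        self.time_threshold = time_threshold

    def update(self, condition_met):
        """Record one frame's result; return the filtered determination."""
        self.history.append(bool(condition_met))
        ratio = sum(self.history) / len(self.history)
        return ratio > self.time_threshold
```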
  • time threshold value a predetermined threshold value
  • further, the downward viewing determination unit 13 may determine whether the occupant is in the downward-looking state by also taking into account the orientation of the occupant's face detected from the face image. For example, it may determine that the occupant is in the downward-looking state when the occupant's eyes satisfy the downward-viewing condition and the occupant's face is oriented downward (in particular, toward the in-vehicle meter or the screen of the navigation device).
  • similarly, the awakening degree estimation unit 14 may estimate the occupant's wakefulness by also taking into account the orientation of the occupant's face detected from the face image. For example, it may estimate that the occupant's wakefulness has decreased when the condition that the opening of the occupant's upper eyelid is smaller than the opening threshold and the occupant is not in the downward-looking state (the wakefulness-lowering condition) is satisfied and the variation in the occupant's face orientation is smaller than a predetermined threshold.
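A sketch of this variant, treating "variation in face orientation" as the population standard deviation of recent pitch angles (an assumed interpretation; the publication specifies neither the measure nor the thresholds):

```python
import statistics

def wakefulness_decreased_with_face(opening, looking_down, pitch_history,
                                    opening_threshold=0.5,
                                    variation_threshold=2.0):
    """Wakefulness-lowering condition plus a steady-face check.

    pitch_history: recent face pitch angles in degrees (hypothetical input).
    Thresholds are hypothetical placeholder values.
    """
    base = opening < opening_threshold and not looking_down
    steady = statistics.pstdev(pitch_history) < variation_threshold
    return base and steady
```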
  • the downward viewing determination unit 13 may also determine whether the occupant is in the downward-looking state by taking into account the operation status of vehicle-mounted devices. For example, it may determine that the occupant is in the downward-looking state when the occupant's eyes satisfy the downward-viewing condition and the occupant is operating a vehicle-mounted device located lower than the face.
  • the information on the operating status of the vehicle-mounted device can be acquired from, for example, the ECU of the vehicle.
  • the awakening degree estimation unit 14 may likewise estimate the occupant's wakefulness in consideration of the operation status of vehicle-mounted devices. For example, it may estimate that the occupant's wakefulness has decreased when the occupant's eyes satisfy the wakefulness-lowering condition and the occupant is not operating an in-vehicle device.
  • Hardware configuration example: FIGS. 4 and 5 are diagrams each showing an example of the hardware configuration of the arousal level estimation device 10.
  • each function of the components of the alertness estimation device 10 shown in FIG. 1 is realized by, for example, the processing circuit 50 shown in FIG. 4. That is, the alertness estimation device 10 includes the processing circuit 50 for acquiring a face image, which is an image of the face of a vehicle occupant; calculating the opening of the upper eyelid based on the positions of the inner corner of the eye, the outer corner of the eye, and the highest point of the upper eyelid detected from the face image; determining, when the opening of the upper eyelid is smaller than the opening threshold, whether the occupant is in the downward-looking state; and estimating that the wakefulness of the occupant has decreased.
  • the processing circuit 50 may be dedicated hardware, or may be configured using a processor (also referred to as a CPU (Central Processing Unit), processing device, arithmetic device, microprocessor, microcomputer, or DSP (Digital Signal Processor)) that executes a program stored in a memory.
  • when the processing circuit 50 is dedicated hardware, the processing circuit 50 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • Each function of the components of the alertness estimation device 10 may be realized by an individual processing circuit, or those functions may be collectively realized by one processing circuit.
  • FIG. 5 shows an example of the hardware configuration of the alertness estimation device 10 when the processing circuit 50 is configured using the processor 51 that executes a program.
  • the functions of the constituent elements of the alertness estimation device 10 are realized by software or the like (software, firmware, or a combination of software and firmware).
  • the software and the like are written as a program and stored in the memory 52.
  • the processor 51 realizes the function of each unit by reading and executing the program stored in the memory 52.
  • in other words, the awakening level estimation apparatus 10 includes the memory 52 storing a program that, when executed by the processor 51, results in the execution of: a process of acquiring a face image, which is an image of the face of a vehicle occupant; a process of calculating the opening of the upper eyelid based on the positions of the inner corner of the eye, the outer corner of the eye, and the highest point of the upper eyelid detected from the face image; a process of determining, when the opening of the upper eyelid is smaller than a predetermined opening threshold, whether the occupant is in the downward-looking state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of the vertical line drawn from that highest point with the contour line of the lower eyelid; and a process of estimating that the wakefulness of the occupant has decreased when the opening of the upper eyelid is smaller than the opening threshold and it is determined that the occupant is not in the downward-looking state.
  • In other words, this program causes a computer to execute the procedures and methods of operation of the components of the alertness estimation device 10.
  • The memory 52 may be, for example, a non-volatile or volatile semiconductor memory such as RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc) and its drive device, or any storage medium to be used in the future.
  • The above describes configurations in which the functions of the components of the alertness estimation device 10 are realized entirely by either hardware or software.
  • However, the configuration is not limited to this; some components of the alertness estimation device 10 may be implemented by dedicated hardware while the remaining components are implemented by software or the like.
  • For some components, the function is realized by the processing circuit 50 as dedicated hardware, while for the other components, the function is realized by the processing circuit 50 as the processor 51 reading and executing the program stored in the memory 52.
  • In this way, the alertness estimation device 10 can realize each of the above functions by hardware, software, or a combination thereof.
  • the second embodiment shows an example in which the awakening degree estimation device 10 described in the first embodiment is applied to an automatic driving support device that supports automatic driving of a vehicle.
  • Level 0 No driving automation: The driver performs all dynamic driving tasks.
  • Level 1 Driver assistance: The system performs either the longitudinal or the lateral vehicle motion control subtask in a limited area.
  • Level 2 Partial driving automation: The system performs both the longitudinal and the lateral vehicle motion control subtasks in a limited area.
  • Level 3 Conditional driving automation: The system performs all dynamic driving tasks in a limited area; however, when it is difficult to continue operation, the driver appropriately responds to intervention requests from the system.
  • Level 4 High driving automation: The system performs all dynamic driving tasks and the responses when it is difficult to continue operation, within a limited area.
  • Level 5 Full driving automation: The system performs all dynamic driving tasks and the responses when it is difficult to continue operation, without limitation to a limited area.
  • Here, "dynamic driving task" means all of the operational and tactical functions (excluding strategic functions such as trip planning and the selection of waypoints) that must be performed in real time when operating a vehicle in road traffic.
  • "Limited area" means the specific conditions (including geographical, road, environmental, traffic, speed, and temporal constraints) under which the system or its function is designed to operate.
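The level definitions above can be collected into a small enumeration; this is an illustrative Python sketch, and the attribute names are labels chosen here, not terms defined in the disclosure. The predicate singles out level 3, the situation the occupant selection of Embodiment 2 is designed for.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Driving automation levels as summarized above (SAE J3016-style).

    The attribute names are illustrative labels, not terms defined in
    this disclosure.
    """
    NO_AUTOMATION = 0   # driver performs all dynamic driving tasks
    ASSISTANCE = 1      # longitudinal or lateral control, limited area
    PARTIAL = 2         # longitudinal and lateral control, limited area
    CONDITIONAL = 3     # all tasks in limited area; driver answers intervention requests
    HIGH = 4            # all tasks and fallback responses, within limited area
    FULL = 5            # all tasks and fallback responses, area-unlimited

def driver_fallback_required(level: AutomationLevel) -> bool:
    """True only at level 3, where the driver must respond to an
    intervention request when continuing automated driving is difficult."""
    return level == AutomationLevel.CONDITIONAL
```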
  • FIG. 6 is a block diagram showing the configuration of the automatic driving support device 30 according to the second embodiment.
  • elements having the same functions as those shown in FIG. 1 are designated by the same reference numerals as those in FIG. 1, and description thereof will be omitted here.
  • The automatic driving support device 30 is connected to the camera 21, the automatic driving control device 23, and the notification device 24.
  • The automatic driving control device 23 is an ECU that performs automatic driving of the vehicle and corresponds to the above-mentioned "system". In the present embodiment, the automatic driving control device 23 performs at least level 3 automatic driving.
  • The notification device 24 issues notifications to the occupants of the vehicle, and includes a speaker that emits notification sounds or spoken notification messages, a display that shows alarm screens, and the like.
  • The automatic driving support device 30 includes the alertness estimation device 10 and an occupant selection unit 31.
  • The alertness estimation device 10 estimates the alertness of the vehicle occupants as described in the first embodiment and outputs the estimation results to the occupant selection unit 31.
  • The occupant selection unit 31 selects the occupant who is to intervene in driving the vehicle (the occupant who is to become the driver) in response to an intervention request from the automatic driving control device 23.
  • The intervention request is input to the occupant selection unit 31 when the automatic driving control device 23 is performing level 3 automatic driving and it becomes difficult to continue the automatic driving.
  • Upon receiving the intervention request, the occupant selection unit 31 checks the alertness estimation results from the alertness estimation device 10 and selects an occupant estimated to have high alertness as the occupant to intervene in driving the vehicle.
  • The selection result of the occupant selection unit 31 is input to the notification device 24.
  • The notification device 24 notifies the occupant selected by the occupant selection unit 31, instructing that occupant to intervene in driving the vehicle.
  • If the selected occupant is in the driver's seat, the driving authority of the vehicle can be transferred to that occupant without stopping the vehicle.
  • If the selected occupant is not in the driver's seat, the automatic driving control device 23 stops the vehicle in a safe place, has the selected occupant move to the driver's seat, and then transfers the driving authority to that occupant.
  • When a plurality of occupants are estimated to have high alertness, the occupant selection unit 31 may select any one of them, but may preferentially select the occupant in the driver's seat.
  • In this way, the automatic driving support device 30 supports the so-called handover process, in which driving authority is transferred from the system to an occupant, and can prevent driving authority from being transferred to an occupant with low alertness.
  • 10 alertness estimation device, 11 face image acquisition unit, 12 upper eyelid opening calculation unit, 13 downward view determination unit, 14 alertness estimation unit, 21 camera, 22 warning device, 23 automatic driving control device, 24 notification device, 30 automatic driving support device, 31 occupant selection unit, 50 processing circuit, 51 processor, 52 memory.
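The occupant selection described in Embodiment 2 — on an intervention request, pick an occupant estimated to have high alertness, preferring the driver's seat — can be sketched as follows. This is an illustrative Python sketch under stated assumptions: the seat labels, the `"high"`/`"low"` result encoding, and the tie-breaking order among non-driver seats are not fixed by the disclosure.

```python
def select_intervention_occupant(alertness_by_seat):
    """Occupant selection on an intervention request (Embodiment 2 sketch).

    alertness_by_seat maps a seat label to the estimation result of the
    alertness estimation device 10, encoded here as "high" or "low".
    Returns the seat of the occupant selected to intervene in driving,
    or None if no occupant is estimated to be sufficiently alert.
    """
    candidates = [seat for seat, level in alertness_by_seat.items()
                  if level == "high"]
    if not candidates:
        return None
    # Preferentially select the occupant in the driver's seat, since no
    # handover movement is then required.
    if "driver" in candidates:
        return "driver"
    return candidates[0]  # tie-breaking order is an assumption
```

If `None` is returned, no safe handover target exists, which in practice would correspond to the system bringing the vehicle to a stop rather than transferring authority.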

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Automation & Control Theory (AREA)
  • Child & Adolescent Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Psychiatry (AREA)
  • Mechanical Engineering (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • Mathematical Physics (AREA)
  • Anesthesiology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)

Abstract

In an alertness level estimation device (10), a face image acquisition unit (11) acquires a captured face image representing the face of an occupant in a vehicle. An upper eyelid opening degree calculation unit (12) calculates the degree of opening of an upper eyelid on the basis of the positions, detected from the face image, of an eye inner corner, an eye outer corner, and the highest point of the upper eyelid. When the degree of opening of the upper eyelid is less than a predetermined threshold value, a downward view determination unit (13) determines whether or not the occupant is in a downward view state, on the basis of the position of the highest point of the upper eyelid detected from the face image and the position of an intersection point of a profile line of a lower eyelid and a vertical line drawn downward from the highest point of the upper eyelid. When it is determined that the degree of opening of the upper eyelid becomes less than the threshold value and the occupant is not in a downward view state, an alertness level estimation unit (14) of the alertness level estimation device (10) estimates that the alertness level of the occupant has decreased.

Description

Alertness level estimation device, automatic driving assistance device, and alertness level estimation method
The present invention relates to a technique for estimating the alertness of a vehicle occupant.
An alertness estimation device is known that estimates a driver's alertness (presence or absence of drowsiness) based on the degree of eye opening obtained from an image of the driver's face captured by an in-vehicle camera (for example, Patent Document 1).
A driver may direct his or her line of sight downward not only to look ahead of the vehicle but also to view in-vehicle meters, the screen of a navigation device, and the like (hereinafter, directing the line of sight downward is referred to as "downward viewing"). Since the degree of eye opening during downward viewing is close to the degree of eye opening when alertness has decreased, a challenge for an alertness estimation device is to prevent an erroneous determination that alertness has decreased when the driver merely looks downward.
The alertness estimation device of Patent Document 1 estimates alertness from the driver's degree of eye opening and also determines whether or not the driver is in a downward-viewing state. When the driver is determined to be in the downward-viewing state, the above erroneous determination is prevented by correcting the driver's alertness, estimated from the degree of eye opening, in the direction of higher alertness.
JP 2008-171065 A; JP 2011-048531 A; JP 2007-241937 A
In the alertness estimation device of Patent Document 1, the driver's degree of eye opening is calculated based on the distance between the highest point of the upper eyelid contour and the lowest point of the lower eyelid contour, and whether or not the driver is in the downward-viewing state is determined based on the shapes of the upper and lower eyelid contours. In general, the contour of the lower eyelid is less distinct than the contour of the upper eyelid, and it is difficult to detect the lower eyelid contour with high accuracy. Therefore, with the method of Patent Document 1, there is a concern that the error in the calculated degree of eye opening becomes large and, as a result, the reliability of the estimated driver alertness becomes low. In Patent Document 1, a light source that casts a shadow of the lower eyelid onto the driver's eyeball is installed in the vehicle, and the shadow is photographed by a camera to improve the detection accuracy of the lower eyelid contour.
The present invention has been made to solve the above problems, and an object thereof is to provide an alertness estimation device capable of estimating the alertness of a vehicle occupant with high accuracy.
The alertness estimation device according to the present invention includes: a face image acquisition unit that acquires a face image, which is a captured image of the face of an occupant of a vehicle; an upper eyelid opening calculation unit that calculates the opening degree of the upper eyelid based on the positions of the inner eye corner, the outer eye corner, and the highest point of the upper eyelid detected from the face image; a downward view determination unit that, when the opening degree of the upper eyelid is smaller than a predetermined opening threshold, determines whether or not the occupant is in a downward-viewing state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of a vertical line dropped from that highest point with the contour line of the lower eyelid; and an alertness estimation unit that estimates that the alertness of the occupant has decreased when the opening degree of the upper eyelid falls below the opening threshold and the occupant is determined not to be in the downward-viewing state.
In the present invention, the alertness of a vehicle occupant is estimated based on the opening degree of the occupant's upper eyelid and the result of determining whether or not the occupant is in a downward-viewing state (downward view determination). The opening degree of the upper eyelid can be calculated with high accuracy because it can be calculated without using the position of the lower eyelid. The downward view determination uses the position of the lower eyelid contour, but it is performed in a state in which the opening degree of the upper eyelid is small, that is, a state in which the lower eyelid bulges and its contour is distinct, and can therefore be performed with high accuracy. As a result, the accuracy of estimating the occupant's alertness is also high.
The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a block diagram showing the configuration of the alertness estimation device according to the first embodiment. FIG. 2 is a flowchart showing the operation of the alertness estimation device according to the first embodiment. FIG. 3 is a block diagram showing a modified example of the alertness estimation device according to the first embodiment. FIG. 4 is a diagram showing an example of the hardware configuration of the alertness estimation device. FIG. 5 is a diagram showing another example of the hardware configuration of the alertness estimation device. FIG. 6 is a block diagram showing the configuration of the automatic driving support device according to the second embodiment.
<Embodiment 1>
FIG. 1 is a block diagram showing the configuration of the alertness estimation device 10 according to the first embodiment. In the present embodiment, it is assumed that the alertness estimation device 10 is installed in a vehicle. However, the alertness estimation device 10 is not limited to being permanently installed in a vehicle; it may, for example, be built into a portable device that can be brought into the vehicle, such as a mobile phone, smartphone, or portable navigation device. Part or all of the alertness estimation device 10 may also be built on a server capable of communicating with the vehicle.
As shown in FIG. 1, the alertness estimation device 10 is connected to a camera 21 and a warning device 22 provided in the vehicle.
The camera 21 captures images of the vehicle interior and is installed at a position from which it can capture the faces of the vehicle occupants. In the present embodiment, the camera 21 is assumed to be a wide-angle camera whose imaging range covers the faces of the occupants in all seats of the vehicle. However, if, for example, the alertness estimation device 10 only needs to estimate the alertness of the occupant in the driver's seat (the driver), the imaging range of the camera 21 need only cover the driver's seat.
The alertness estimation device 10 estimates the alertness (presence or absence of drowsiness) of each occupant based on the image of the occupant's face captured by the camera 21 (hereinafter referred to as the "face image"), and outputs the estimation result to the warning device 22. In the present embodiment, the alertness estimation device 10 estimates the alertness of all occupants captured by the camera 21. In this case, the alertness estimation device 10 outputs, together with the alertness estimation result for each occupant, information on the seat position of each occupant. That is, the alertness estimation device 10 outputs information such as which seats are occupied by occupants estimated to have high alertness and which by occupants estimated to have low alertness.
The warning device 22 issues warnings inside the vehicle according to the occupant alertness estimated by the alertness estimation device 10, and includes a speaker that emits warning sounds or warning messages, a display that shows a warning screen, and the like. In the present embodiment, the warning device 22 issues a warning when the alertness of the occupant in the driver's seat (the driver) is estimated to have decreased. However, the operation of the warning device 22 is not restricted to this; for example, a warning may also be issued when the alertness of an occupant other than the driver is estimated to have decreased.
As shown in FIG. 1, the alertness estimation device 10 includes a face image acquisition unit 11, an upper eyelid opening calculation unit 12, a downward view determination unit 13, and an alertness estimation unit 14.
The face image acquisition unit 11 acquires the occupant face images captured by the camera 21. In the present embodiment, since the camera 21 is a wide-angle camera that can capture all seats of the vehicle, the face image acquisition unit 11 uses face recognition technology to extract a face image for each occupant from the in-vehicle image captured by the camera 21.
The upper eyelid opening calculation unit 12 detects, from the face image acquired by the face image acquisition unit 11, the positions of the occupant's inner eye corner, outer eye corner, and the highest point of the upper eyelid, and calculates how far the upper eyelid is open (hereinafter referred to as the "opening degree") based on their positional relationship. More specifically, the upper eyelid opening calculation unit 12 calculates the opening degree of the upper eyelid based on a first distance, which is the distance between the straight line connecting the inner and outer eye corners and the highest point of the upper eyelid. As described above, the contour of the lower eyelid is difficult to detect with as high accuracy as the contour of the upper eyelid. Since the opening degree calculated by the upper eyelid opening calculation unit 12 is obtained without using information on the lower eyelid contour, its accuracy is high. The highest point of the upper eyelid is the point (apex) of the upper eyelid farthest from the straight line connecting the outer and inner eye corners.
In the present embodiment, the upper eyelid opening calculation unit 12 obtains the flatness of the eye by dividing the first distance by the distance between the inner and outer eye corners, and calculates the value obtained by dividing this flatness by a predetermined reference value as the opening degree of the upper eyelid.
The reference value of the eye flatness may be, for example, a fixed value determined by the manufacturer of the alertness estimation device 10 as an average eye flatness. Alternatively, if the alertness estimation device 10 can identify individual occupants, for example by face authentication using the face image, a reference value for each occupant may be registered in the alertness estimation device 10. Using a different reference value for each occupant suppresses the influence of individual differences in eye size on the calculated opening degree of the upper eyelid.
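The flatness-based opening calculation described above can be sketched as follows. This is an illustrative Python sketch under stated assumptions: the (x, y) pixel coordinate convention, the point names, and the reference flatness value 0.30 are choices made here, not values fixed by the disclosure.

```python
import math

def upper_eyelid_opening(inner_corner, outer_corner, eyelid_peak,
                         reference_flatness=0.30):
    """Opening degree of the upper eyelid, as described for the upper
    eyelid opening calculation unit 12.

    All points are (x, y) positions detected from the face image.
    reference_flatness stands in for the predetermined reference value of
    the eye flatness (a per-occupant value may be registered instead).
    """
    (x1, y1) = inner_corner
    (x2, y2) = outer_corner
    (px, py) = eyelid_peak
    eye_width = math.hypot(x2 - x1, y2 - y1)  # inner-to-outer corner distance
    # First distance: perpendicular distance from the highest point of the
    # upper eyelid to the straight line through the two eye corners.
    first_distance = abs((x2 - x1) * (y1 - py) - (x1 - px) * (y2 - y1)) / eye_width
    flatness = first_distance / eye_width     # eye "flatness"
    return flatness / reference_flatness      # normalized opening degree
```

For example, with corners at (0, 0) and (30, 0) and an eyelid peak 4.5 pixels above the corner line, the flatness is 0.15 and the opening degree under these assumptions is 0.5.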
The downward view determination unit 13 performs a downward view determination, which determines whether or not the occupant is in a downward-viewing state, when the opening degree of the occupant's upper eyelid calculated by the upper eyelid opening calculation unit 12 is smaller than a predetermined threshold (hereinafter referred to as the "opening threshold"). Specifically, when the opening degree of the upper eyelid is smaller than the opening threshold, the downward view determination unit 13 detects, from the face image acquired by the face image acquisition unit 11, the position of the highest point of the upper eyelid, the contour line of the lower eyelid, and the position of the intersection of the vertical line dropped from the highest point of the upper eyelid with the lower eyelid contour (hereinafter referred to as the "specific point of the lower eyelid"), and determines whether or not the occupant is in the downward-viewing state based on the positional relationship between the highest point of the upper eyelid and the specific point of the lower eyelid. The specific point of the lower eyelid may instead be the intersection of the lower eyelid contour with the perpendicular dropped from the highest point of the upper eyelid to the straight line connecting the inner and outer eye corners.
In general, when a person casts the eyes down or narrows them so that the opening degree of the upper eyelid becomes small, the lower eyelid bulges and its contour becomes distinct, so the lower eyelid contour can be detected from the face image with high accuracy. The downward view determination performed by the downward view determination unit 13 requires the position of the lower eyelid contour, but since it is performed only when the opening degree of the upper eyelid is small, it uses a lower eyelid contour position obtained with high accuracy. The downward view determination performed by the downward view determination unit 13 is therefore highly accurate.
In the present embodiment, when the opening degree of the upper eyelid is smaller than the opening threshold, the downward view determination unit 13 determines whether or not the occupant is in the downward-viewing state based on a second distance, which is the distance between the highest point of the upper eyelid and the specific point of the lower eyelid. That is, the downward view determination unit 13 determines that the occupant is in the downward-viewing state when the opening degree of the upper eyelid is smaller than the opening threshold and the second distance is larger than a predetermined threshold (hereinafter referred to as the "distance threshold").
Hereinafter, the condition "the opening degree of the upper eyelid is smaller than the opening threshold and the second distance is larger than the distance threshold" may be referred to as the "downward viewing condition". The second distance may also be referred to as the "upper-lower eyelid distance".
The distance threshold may be, for example, a fixed value determined by the manufacturer of the alertness estimation device 10 as the minimum upper-lower eyelid distance required for downward viewing. Alternatively, if the alertness estimation device 10 can identify individual occupants, for example by face authentication using the face image, a distance threshold for each occupant may be registered in the alertness estimation device 10. Using a different distance threshold for each occupant suppresses the influence of individual differences in eye size during downward viewing on the result of the downward view determination.
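The downward viewing condition reduces to a simple predicate; the following Python sketch illustrates it. The threshold values are illustrative assumptions (per-occupant thresholds may be registered, as described above); the disclosure does not specify numeric values.

```python
def is_downward_viewing(opening_degree, second_distance,
                        opening_threshold=0.6, distance_threshold=5.0):
    """Downward view determination of the downward view determination
    unit 13, reduced to the "downward viewing condition".

    opening_degree: upper eyelid opening degree.
    second_distance: the upper-lower eyelid distance, i.e. the distance
    between the highest point of the upper eyelid and the specific point
    of the lower eyelid, in pixels.
    """
    # The determination is only meaningful when the eye is narrowed
    # (opening below the opening threshold); in that state the lower
    # eyelid contour, and hence second_distance, can be detected reliably.
    return opening_degree < opening_threshold and second_distance > distance_threshold
```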
The alertness estimation unit 14 estimates whether or not the alertness of the occupant has decreased, based on the opening degree of the occupant's upper eyelid calculated by the upper eyelid opening calculation unit 12 and the result of the downward view determination performed for that occupant by the downward view determination unit 13. Specifically, the alertness estimation unit 14 estimates that the alertness of the occupant has decreased when the opening degree of the occupant's upper eyelid falls below the opening threshold and the occupant is determined not to be in the downward-viewing state. As described above, the upper eyelid opening calculation unit 12 can calculate the opening degree of the upper eyelid with high accuracy, and the downward view determination unit 13 can perform the downward view determination with high accuracy, so the occupant alertness estimation by the alertness estimation unit 14 is also highly accurate.
Hereinafter, the condition "the opening degree of the occupant's upper eyelid is smaller than the opening threshold and the occupant is not in the downward-viewing state" may be referred to as the "alertness decrease condition".
As described above, according to the alertness estimation device 10 of the first embodiment, the alertness of an occupant can be estimated with high accuracy by taking into account whether or not the occupant is in the downward-viewing state. In addition, since the detection of the lower eyelid contour, which is a relatively difficult process, need only be performed when the downward view determination is performed (that is, when the opening degree of the upper eyelid is smaller than the opening threshold), the processing load of the alertness estimation device 10 can also be reduced.
FIG. 2 is a flowchart showing the operation of the alertness estimation device 10 according to the first embodiment. The operation of the alertness estimation device 10 will be described below with reference to FIG. 2.
When the ignition of the vehicle is turned on and the alertness estimation device 10 is activated, the face image acquisition unit 11 first acquires the occupant face image captured by the camera 21 (step S101). For simplicity of explanation, it is assumed here that the image captured by the camera 21 contains the face image of only one occupant. However, if the camera 21 is a wide-angle camera as in the present embodiment, face images of a plurality of occupants may be acquired in step S101. In that case, the following processing of steps S102 to S108 is performed on the face image of each occupant.
 Next, the upper eyelid opening calculation unit 12 detects, from the occupant's face image acquired in step S101, the position of the inner eye corner, the position of the outer eye corner, and the position of the highest point of the upper eyelid, and calculates the opening of the occupant's upper eyelid based on their positional relationship (step S102).
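The calculation in step S102 can be sketched from the three landmark positions alone, following the first-distance and flatness-ratio definitions given in claims 2 and 3. This is a minimal sketch, not the patented implementation; the function name and the per-occupant `reference_flatness` calibration value are assumptions for illustration:

```python
import math

def upper_eyelid_opening(inner, outer, top, reference_flatness):
    """Opening of the upper eyelid from three landmarks (sketch of step S102).

    inner, outer: (x, y) of the inner and outer eye corners
    top:          (x, y) of the highest point of the upper eyelid
    reference_flatness: per-occupant flatness measured with the eyes wide open
    """
    (x1, y1), (x2, y2), (xt, yt) = inner, outer, top
    eye_width = math.hypot(x2 - x1, y2 - y1)
    # First distance: perpendicular distance from the upper-eyelid peak
    # to the straight line through the two eye corners.
    first_distance = abs((x2 - x1) * (y1 - yt) - (x1 - xt) * (y2 - y1)) / eye_width
    flatness = first_distance / eye_width   # flatness ratio (claim 3)
    return flatness / reference_flatness    # ~1.0 when the eye is fully open
```

Normalizing by a per-occupant reference value (claim 4) absorbs individual differences in eye shape, so a single fixed opening threshold can be applied to every occupant.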
 When the opening of the occupant's upper eyelid calculated in step S102 is smaller than a predetermined opening threshold (YES in step S103), the downward-view determination unit 13 performs the downward-view determination, that is, determines whether the occupant is in the downward-looking state, based on the occupant's face image acquired in step S101 (step S104).
 When the downward-view determination in step S104 finds that the occupant is not in the downward-looking state (NO in step S105), the alertness estimation unit 14 estimates that the occupant's alertness has decreased (step S106).
 On the other hand, when the opening of the occupant's upper eyelid calculated in step S102 is larger than the opening threshold (NO in step S103), or when the occupant is determined in step S104 to be in the downward-looking state (YES in step S105), the alertness estimation unit 14 estimates that the occupant's alertness is high (step S107).
 The estimation result of the occupant's alertness obtained in step S106 or S107 is output to the warning device 22 (step S108). As a result, the warning device 22 issues a warning according to the estimation result of the occupant's alertness.
 The alertness estimation device 10 repeatedly executes the above flow. The flow may, for example, be executed continuously after the alertness estimation device 10 starts up (after the vehicle's ignition is turned on), or only while the vehicle is running (not while the vehicle is stopped). If the flow is executed continuously, the occupant's alertness is estimated even before the vehicle starts moving, which can contribute to improving vehicle safety. In general, however, there is little need to estimate the occupant's alertness while the vehicle is stopped, so the flow may instead be executed only while the vehicle is running in order to reduce the processing load and power consumption of the alertness estimation device 10.
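One pass of the decision branch S103–S107 in FIG. 2 reduces to two comparisons, using the "second distance" from the upper-eyelid peak down to the lower-eyelid contour described for the downward-view determination. The following is a sketch under that reading; the numeric threshold defaults are illustrative placeholders, not values from the patent:

```python
def is_downward_looking(second_distance, distance_threshold):
    # S104 (sketch): the eye counts as "looking down" when the exposed eyeball
    # height (upper-eyelid peak to lower-eyelid contour) stays large even
    # though the upper eyelid has come down.
    return second_distance > distance_threshold

def estimate_alertness(opening, second_distance,
                       opening_threshold=0.6, distance_threshold=3.0):
    """One pass of the FIG. 2 flow (S103-S107)."""
    if opening >= opening_threshold:      # S103: NO -> eye is open wide enough
        return "high"                     # S107
    if is_downward_looking(second_distance, distance_threshold):  # S104/S105
        return "high"                     # S107: narrowed eye, but looking down
    return "reduced"                      # S106: alertness reduction condition met
```

Note that `second_distance` only needs to be computed on the S103 YES branch, which is exactly why deferring lower-eyelid contour detection reduces the processing load.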
 [Modification]
 In the alertness estimation device 10 of the first embodiment, the alertness estimation unit 14 outputs the estimation result of the occupant's alertness to the warning device 22, but the output destination is not limited to the warning device 22 and may be any device. For example, as shown in FIG. 3, the estimation result of the occupant's alertness by the alertness estimation unit 14 may also be input to the automatic driving control device 23, which is an ECU (Electronic Control Unit) that performs automatic driving of the vehicle.
 As an operation of the automatic driving control device 23 in FIG. 3, the following is conceivable: when the alertness estimation unit 14 estimates that the alertness of the occupant in the driver's seat (the driver) has decreased and the driver's alertness remains low even though the warning device 22 has issued a warning, the automatic driving control device 23 automatically moves the vehicle to a safe place (for example, a wide road shoulder or a parking area) and stops it.
 In the first embodiment, the downward-view determination unit 13 determines that an occupant is in the downward-looking state as soon as the condition (the downward-view condition) that the opening of the occupant's upper eyelid is smaller than the opening threshold and the occupant's upper-to-lower eyelid distance (the second distance) is larger than the distance threshold is satisfied. However, the downward-view determination unit 13 may instead determine that the occupant is in the downward-looking state only after the occupant's eyes have satisfied the downward-view condition for a sufficiently long time. Specifically, it may determine that the occupant is in the downward-looking state when the proportion of a fixed period during which the downward-view condition is satisfied exceeds a predetermined threshold (a time threshold). This suppresses false detection of downward viewing caused, for example, by noise, and contributes to further improving the accuracy of alertness estimation.
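The time-ratio variant described above can be implemented as a small sliding-window filter over per-frame decisions. A sketch, assuming a fixed camera frame rate; the window length and ratio threshold are illustrative values, not from the patent:

```python
from collections import deque

class DownwardViewFilter:
    """Declare the downward-looking state only when the fraction of recent
    frames satisfying the downward-view condition exceeds a time threshold,
    suppressing spurious single-frame detections caused by noise."""

    def __init__(self, window_frames=30, ratio_threshold=0.7):
        self.history = deque(maxlen=window_frames)
        self.ratio_threshold = ratio_threshold

    def update(self, condition_met):
        """Feed one frame's downward-view condition; return the filtered state."""
        self.history.append(bool(condition_met))
        if len(self.history) < self.history.maxlen:
            return False   # not enough evidence in the window yet
        return sum(self.history) / len(self.history) > self.ratio_threshold
```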
 There is a correlation between the orientation of the occupant's face and the direction of the line of sight. Specifically, when an occupant looks downward, not only do the eyes and eyelids move, but the face also tends to turn toward the direction of the line of sight. Therefore, the downward-view determination unit 13 may determine whether the occupant is in the downward-looking state taking into account the orientation of the occupant's face detected from the face image. For example, the downward-view determination unit 13 may determine that the occupant is in the downward-looking state when the occupant's eyes satisfy the downward-view condition and, in addition, the occupant's face has turned downward (in particular, toward the in-vehicle meters or the screen of the navigation device).
 There is also a correlation between the orientation of the occupant's face and the alertness. Specifically, when an occupant's alertness decreases, the face moves less and its orientation tends to become nearly constant. Therefore, the alertness estimation unit 14 may estimate the occupant's alertness taking into account the orientation of the occupant's face detected from the face image. For example, the alertness estimation unit 14 may estimate that the occupant's alertness has decreased when the condition that the opening of the occupant's upper eyelid is smaller than the opening threshold and the occupant is not in the downward-looking state (the alertness reduction condition) is satisfied and, in addition, the variation in the orientation of the occupant's face is smaller than a predetermined threshold.
 Occupants often look downward when operating equipment mounted on the vehicle (in-vehicle devices), such as the shift lever or the navigation device. Therefore, the downward-view determination unit 13 may determine whether the occupant is in the downward-looking state taking into account the operation status of the in-vehicle devices. For example, the downward-view determination unit 13 may determine that the occupant is in the downward-looking state when the occupant's eyes satisfy the downward-view condition and, in addition, the occupant has operated an in-vehicle device located lower than the face. The information on the operation status of the in-vehicle devices can be acquired from, for example, the ECU of the vehicle.
 An occupant who is operating an in-vehicle device can also be considered highly alert. Therefore, the alertness estimation unit 14 may estimate the occupant's alertness taking into account the operation status of the in-vehicle devices. For example, the alertness estimation unit 14 may estimate that the occupant's alertness has decreased when the occupant's eyes satisfy the alertness reduction condition and, in addition, the occupant is not operating any in-vehicle device.
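Putting the modifications together, the extended alertness reduction condition is a conjunction of three checks. A trivial sketch, with all inputs assumed to be booleans precomputed by the respective units:

```python
def alertness_reduced(opening_below_threshold, looking_down, operating_device):
    # Alertness is estimated to have decreased only when the upper eyelid has
    # narrowed, the occupant is not merely looking down, and no in-vehicle
    # device is being operated (an operating occupant is assumed alert).
    return opening_below_threshold and not looking_down and not operating_device
```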
 [Hardware configuration example]
 FIGS. 4 and 5 are diagrams each showing an example of the hardware configuration of the alertness estimation device 10. Each function of the components of the alertness estimation device 10 shown in FIG. 1 is realized by, for example, the processing circuit 50 shown in FIG. 4. That is, the alertness estimation device 10 includes the processing circuit 50 for: acquiring a face image, which is a captured image of the face of an occupant of a vehicle; calculating the opening of the upper eyelid based on the positions of the inner eye corner, the outer eye corner, and the highest point of the upper eyelid detected from the face image; determining, when the opening of the upper eyelid is smaller than a predetermined opening threshold, whether the occupant is in the downward-looking state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of a vertical line drawn down from the highest point of the upper eyelid with the contour of the lower eyelid; and estimating that the occupant's alertness has decreased when the opening of the upper eyelid becomes smaller than the opening threshold and the occupant is determined not to be in the downward-looking state. The processing circuit 50 may be dedicated hardware, or may be configured using a processor that executes a program stored in a memory (also called a CPU (Central Processing Unit), a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor)).
 When the processing circuit 50 is dedicated hardware, the processing circuit 50 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these. The function of each component of the alertness estimation device 10 may be realized by an individual processing circuit, or those functions may be realized collectively by one processing circuit.
 FIG. 5 shows an example of the hardware configuration of the alertness estimation device 10 when the processing circuit 50 is configured using a processor 51 that executes a program. In this case, the functions of the components of the alertness estimation device 10 are realized by software or the like (software, firmware, or a combination of software and firmware). The software or the like is written as a program and stored in the memory 52. The processor 51 realizes the function of each unit by reading and executing the program stored in the memory 52. That is, the alertness estimation device 10 includes the memory 52 for storing a program that, when executed by the processor 51, results in the execution of: a process of acquiring a face image, which is a captured image of the face of an occupant of a vehicle; a process of calculating the opening of the upper eyelid based on the positions of the inner eye corner, the outer eye corner, and the highest point of the upper eyelid detected from the face image; a process of determining, when the opening of the upper eyelid is smaller than a predetermined opening threshold, whether the occupant is in the downward-looking state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of a vertical line drawn down from the highest point of the upper eyelid with the contour of the lower eyelid; and a process of estimating that the occupant's alertness has decreased when the opening of the upper eyelid becomes smaller than the opening threshold and the occupant is determined not to be in the downward-looking state. In other words, it can be said that this program causes a computer to execute the procedures and methods of the operations of the components of the alertness estimation device 10.
 Here, the memory 52 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc) and its drive device, and the like; or any storage medium to be used in the future.
 The above has described configurations in which the functions of the components of the alertness estimation device 10 are realized by either hardware or software or the like. However, the configuration is not limited to this; some components of the alertness estimation device 10 may be realized by dedicated hardware while other components are realized by software or the like. For example, the functions of some components can be realized by the processing circuit 50 as dedicated hardware, while the functions of the other components can be realized by the processing circuit 50 as the processor 51 reading and executing a program stored in the memory 52.
 As described above, the alertness estimation device 10 can realize each of the functions described above by hardware, software or the like, or a combination thereof.
 <Second Embodiment>
 The second embodiment shows an example in which the alertness estimation device 10 described in the first embodiment is applied to an automatic driving assistance device that supports automatic driving of a vehicle.
 Here, the definition of the automation level of automated driving of an automobile (automated driving level) will be explained. According to SAE (Society of Automotive Engineers) International J3016 (September 2016) and its Japanese reference translation JASO TP18004 (February 2018), the automated driving levels of an automated driving system are defined as follows.
 Level 0 (no driving automation): the driver performs part or all of the dynamic driving task.
 Level 1 (driver assistance): the system performs the subtask of either longitudinal or lateral vehicle motion control within a limited domain.
 Level 2 (partial driving automation): the system performs the subtasks of both longitudinal and lateral vehicle motion control within a limited domain.
 Level 3 (conditional driving automation): the system performs the entire dynamic driving task within a limited domain; however, when continued operation is difficult, the driver responds appropriately to requests from the system to intervene.
 Level 4 (high driving automation): the system performs the entire dynamic driving task, and the response when continued operation is difficult, within a limited domain.
 Level 5 (full driving automation): the system performs the entire dynamic driving task, and the response when continued operation is difficult, without limitation (that is, not within a limited domain).
 Here, the "dynamic driving task" refers to all of the operational and tactical functions (excluding strategic functions such as trip planning and selection of waypoints) that must be performed in real time when operating a vehicle in road traffic. The "limited domain" refers to the specific conditions (including geographic, roadway, environmental, traffic, speed, and temporal constraints) under which the system or its functions are designed to operate.
 FIG. 6 is a block diagram showing the configuration of the automatic driving assistance device 30 according to the second embodiment. In FIG. 6, elements that function in the same way as those shown in FIG. 1 are given the same reference numerals as in FIG. 1, and their description is omitted here.
 As shown in FIG. 6, the automatic driving assistance device 30 is connected to the camera 21, the automatic driving control device 23, and the notification device 24. The automatic driving control device 23 is an ECU that performs automatic driving of the vehicle and corresponds to the "system" described above. In the present embodiment, the automatic driving control device 23 performs automated driving of at least level 3. The notification device 24 issues notifications to the occupants of the vehicle and includes, for example, a speaker that emits notification sounds and spoken notification messages and a display that shows warning screens.
 The automatic driving assistance device 30 also includes the alertness estimation device 10 and an occupant selection unit 31. The alertness estimation device 10 estimates the alertness of the occupants of the vehicle in the same manner as described in the first embodiment and outputs the estimation results to the occupant selection unit 31.
 In response to an intervention request, the occupant selection unit 31 selects the occupant who is to intervene in driving the vehicle (the occupant who will become the driver). The intervention request is input to the occupant selection unit 31 from the automatic driving control device 23 when the automatic driving control device 23 is executing level 3 automated driving and it becomes difficult to continue the automated driving. Upon receiving the intervention request, the occupant selection unit 31 checks the estimation results of the occupants' alertness by the alertness estimation device 10 and selects an occupant estimated to be highly alert as the occupant to intervene in driving the vehicle.
 The selection result by the occupant selection unit 31 is input to the notification device 24. The notification device 24 notifies the occupant selected by the occupant selection unit 31 to instruct that occupant to intervene in driving the vehicle.
 When the occupant selection unit 31 selects the occupant in the driver's seat, the driving authority of the vehicle can be transferred to that occupant without stopping the vehicle. However, when the occupant selection unit 31 selects an occupant other than the one in the driver's seat, the automatic driving control device 23 should stop the vehicle in a safe place, and the driving authority should be transferred to the selected occupant after that occupant has moved to the driver's seat. When there are a plurality of occupants estimated to be highly alert, the occupant selection unit 31 may select any of them, but may preferentially select the occupant in the driver's seat.
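The selection behavior of occupant selection unit 31 described above can be sketched as follows; the seat labels and the two-level alertness values are illustrative assumptions, not identifiers from the patent:

```python
def select_intervening_occupant(alertness_by_seat, driver_seat="driver"):
    """Sketch of occupant selection unit 31 on an intervention request.

    alertness_by_seat: mapping of seat name -> "high" or "reduced".
    Returns the seat of the occupant who should take over, preferring the
    driver's seat (no stop needed), or None when nobody is alert (the
    automatic driving control device 23 would then stop the vehicle safely).
    """
    alert_seats = [seat for seat, level in alertness_by_seat.items()
                   if level == "high"]
    if not alert_seats:
        return None
    if driver_seat in alert_seats:
        return driver_seat          # hand over without stopping the vehicle
    return alert_seats[0]           # stop safely, then swap seats and hand over
```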
 In this way, the automatic driving assistance device 30 of the second embodiment supports the so-called handover process in which driving authority is handed over to an occupant, and can prevent the driving authority from being transferred to an occupant with low alertness.
 Within the scope of the present invention, the embodiments may be freely combined, and each embodiment may be modified or omitted as appropriate.
 Although the present invention has been described in detail, the above description is illustrative in all aspects, and the present invention is not limited thereto. It is understood that innumerable modifications not illustrated can be envisaged without departing from the scope of the invention.
 10 alertness estimation device, 11 face image acquisition unit, 12 upper eyelid opening calculation unit, 13 downward-view determination unit, 14 alertness estimation unit, 21 camera, 22 warning device, 23 automatic driving control device, 24 notification device, 30 automatic driving assistance device, 31 occupant selection unit, 50 processing circuit, 51 processor, 52 memory.

Claims (17)

  1.  An alertness estimation device comprising:
      a face image acquisition unit that acquires a face image, which is a captured image of the face of an occupant of a vehicle;
      an upper eyelid opening calculation unit that calculates an opening of the upper eyelid based on the position of the inner eye corner, the position of the outer eye corner, and the position of the highest point of the upper eyelid detected from the face image;
      a downward-view determination unit that, when the opening of the upper eyelid is smaller than a predetermined opening threshold, determines whether the occupant is in a downward-looking state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of a vertical line drawn down from the highest point of the upper eyelid with the contour of the lower eyelid; and
      an alertness estimation unit that estimates that the alertness of the occupant has decreased when the opening of the upper eyelid becomes smaller than the opening threshold and the occupant is determined not to be in the downward-looking state.
  2.  The alertness estimation device according to claim 1, wherein the upper eyelid opening calculation unit calculates the opening of the upper eyelid based on a first distance, which is the distance between the straight line connecting the inner eye corner and the outer eye corner and the highest point of the upper eyelid.
  3.  The alertness estimation device according to claim 2, wherein the upper eyelid opening calculation unit calculates, as the opening of the upper eyelid, a value obtained by dividing a flatness ratio, which is obtained by dividing the first distance by the distance between the inner eye corner and the outer eye corner, by a predetermined reference value.
  4.  The alertness estimation device according to claim 3, wherein the upper eyelid opening calculation unit uses a different value of the reference value for each occupant.
  5.  The alertness estimation device according to claim 1, wherein the downward-view determination unit determines whether the occupant is in the downward-looking state based on a second distance, which is the distance between the highest point of the upper eyelid and the intersection of a vertical line drawn down from the highest point of the upper eyelid with the contour of the lower eyelid.
  6.  The alertness estimation device according to claim 5, wherein the downward-view determination unit determines that the occupant is in the downward-looking state when the opening of the upper eyelid is smaller than the opening threshold and the second distance is larger than a predetermined distance threshold.
  7.  The alertness estimation device according to claim 6, wherein the downward-view determination unit uses a different value of the distance threshold for each occupant.
  8.  The alertness estimation device according to claim 5, wherein the downward-view determination unit determines that the occupant is in the downward-looking state when the proportion of a fixed period during which the opening of the upper eyelid is smaller than the opening threshold and the second distance is larger than a predetermined distance threshold exceeds a predetermined time threshold.
  9.  The alertness estimation device according to claim 8, wherein the downward-view determination unit uses a different value of the distance threshold for each occupant.
  10.  The alertness estimation device according to claim 1, wherein the downward-view determination unit determines whether the occupant is in the downward-looking state taking into account the orientation of the occupant's face detected from the face image.
  11.  The alertness estimation device according to claim 1, wherein the alertness estimation unit estimates the alertness of the occupant taking into account the orientation of the occupant's face detected from the face image.
  12.  The alertness estimation device according to claim 1, wherein the downward-view determination unit determines whether the occupant is in the downward-looking state taking into account the operation status of equipment mounted on the vehicle.
  13.  The alertness estimation device according to claim 1, wherein the alertness estimation unit estimates the alertness of the occupant taking into account the operation status of equipment mounted on the vehicle.
  14.  The arousal level estimation unit outputs the result of estimating the occupant's arousal level to a warning device that issues warnings to the occupants of the vehicle.
    The arousal level estimation device according to claim 1.
  15.  The arousal level estimation unit outputs the result of estimating the occupant's arousal level to an automatic driving control device of the vehicle.
    The arousal level estimation device according to claim 1.
  16.  An automatic driving assistance device that assists automatic driving of a vehicle, comprising:
     the arousal level estimation device according to claim 1; and
     an occupant selection unit that, when it becomes difficult to continue the automatic driving of the vehicle, selects an occupant to intervene in the driving of the vehicle based on the result of the arousal level estimation unit's estimation of the occupants' arousal levels.
  17.  An arousal level estimation method in which:
     a face image acquisition unit of an arousal level estimation device acquires a face image, which is an image of the face of an occupant of a vehicle;
     an upper eyelid opening calculation unit of the arousal level estimation device calculates the opening of the upper eyelid based on the positions of the inner eye corner, the outer eye corner, and the highest point of the upper eyelid detected from the face image;
     when the opening of the upper eyelid is smaller than a predetermined opening threshold, a downward-view determination unit of the arousal level estimation device determines whether or not the occupant is in a downward-looking state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of a vertical line dropped from that highest point with the contour of the lower eyelid; and
     when the opening of the upper eyelid becomes smaller than the opening threshold and the occupant is determined not to be in a downward-looking state, an arousal level estimation unit of the arousal level estimation device estimates that the occupant's arousal level has decreased.
PCT/JP2019/007500 2019-02-27 2019-02-27 Alertness level estimation device, automatic driving assistance device, and alertness level estimation method WO2020174601A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2021501448A JP7109649B2 (en) 2019-02-27 2019-02-27 Arousal level estimation device, automatic driving support device, and arousal level estimation method
PCT/JP2019/007500 WO2020174601A1 (en) 2019-02-27 2019-02-27 Alertness level estimation device, automatic driving assistance device, and alertness level estimation method
DE112019006953.5T DE112019006953T5 (en) 2019-02-27 2019-02-27 Alertness assessment device, automatic driving support device and alertness assessment method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/007500 WO2020174601A1 (en) 2019-02-27 2019-02-27 Alertness level estimation device, automatic driving assistance device, and alertness level estimation method

Publications (1)

Publication Number Publication Date
WO2020174601A1 true WO2020174601A1 (en) 2020-09-03

Family

ID=72239557

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/007500 WO2020174601A1 (en) 2019-02-27 2019-02-27 Alertness level estimation device, automatic driving assistance device, and alertness level estimation method

Country Status (3)

Country Link
JP (1) JP7109649B2 (en)
DE (1) DE112019006953T5 (en)
WO (1) WO2020174601A1 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008167806A (en) * 2007-01-09 2008-07-24 Denso Corp Sleepiness detector
JP2008210285A (en) * 2007-02-27 2008-09-11 Kyushu Univ Drowsy driving prevention device
JP2009003644A (en) * 2007-06-20 2009-01-08 Toyota Motor Corp Eye opening degree decision device
JP2011048531A (en) * 2009-08-26 2011-03-10 Aisin Seiki Co Ltd Drowsiness detection device, drowsiness detection method, and program
JP2018045450A (en) * 2016-09-14 2018-03-22 いすゞ自動車株式会社 Vehicle control apparatus
JP2018508870A (en) * 2015-01-19 2018-03-29 ローベルト ボッシュ ゲゼルシャフト ミット ベシュレンクテル ハフツング Method and apparatus for detecting instantaneous sleep of a vehicle driver


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6964174B1 (en) * 2020-11-30 2021-11-10 真旭 徳山 Information processing equipment, information processing methods, and programs
JP2022086525A (en) * 2020-11-30 2022-06-09 真旭 徳山 Information processing device, information processing method, and program
WO2024057356A1 (en) * 2022-09-12 2024-03-21 三菱電機株式会社 Level of eyelid opening detection device, level of eyelid opening detection method, and drowsiness assessment system

Also Published As

Publication number Publication date
DE112019006953T5 (en) 2021-12-16
JP7109649B2 (en) 2022-07-29
JPWO2020174601A1 (en) 2021-09-13

Similar Documents

Publication Publication Date Title
RU2678909C2 (en) Boundary detection system
CN112046500B (en) Automatic driving device and method
US9154923B2 (en) Systems and methods for vehicle-based mobile device screen projection
RU2720591C1 (en) Information displaying method and display control device
JP2010033106A (en) Driver support device, driver support method, and driver support processing program
CN112046502B (en) Automatic driving device and method
JP2019046277A (en) Image processing apparatus, image processing method, and program
WO2020174601A1 (en) Alertness level estimation device, automatic driving assistance device, and alertness level estimation method
WO2019188926A1 (en) Looking-away determining device, looking-away determining system, looking-away determining method, and storage medium
JP5942176B2 (en) In-vehicle display device, control method for in-vehicle display device, and program
US10083612B2 (en) Display device for vehicle
US11189048B2 (en) Information processing system, storing medium storing program, and information processing device controlling method for performing image processing on target region
US20200114932A1 (en) Vehicle and method of outputting information therefor
JP2009146153A (en) Moving object detection device, moving object detection method and moving object detection program
JP7175381B2 (en) Arousal level estimation device, automatic driving support device, and arousal level estimation method
JP6305199B2 (en) Communication control device and communication control method
JP7451423B2 (en) Image processing device, image processing method, and image processing system
WO2018163266A1 (en) Display control device and display control method
KR20200133860A (en) Autonomous driving apparatus and method
KR20200133445A (en) Autonomous driving apparatus and method
JP2021041884A (en) Vehicle control device
WO2021240768A1 (en) Driving inability determination device and driving inability determination method
JP7428076B2 (en) Operation method of server device, control device, vehicle, and information processing system
KR102648470B1 (en) Autonomous driving apparatus and method
WO2021140583A1 (en) Drowsiness estimation device and drowsiness estimation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19916667

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021501448

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 19916667

Country of ref document: EP

Kind code of ref document: A1