WO2020174601A1 - Wakefulness estimation device, automatic driving support device, and wakefulness estimation method - Google Patents

Wakefulness estimation device, automatic driving support device, and wakefulness estimation method

Info

Publication number
WO2020174601A1
Authority
WO
WIPO (PCT)
Prior art keywords
occupant
upper eyelid
downward
opening
estimation device
Prior art date
Application number
PCT/JP2019/007500
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
和樹 國廣
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2019/007500 (WO2020174601A1)
Priority to JP2021501448A (JP7109649B2)
Priority to DE112019006953.5T (DE112019006953T5)
Publication of WO2020174601A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4806 Sleep evaluation
    • A61B5/4809 Sleep detection, i.e. determining whether a subject is asleep or not
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models related to drivers or passengers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0818 Inactivity or incapacity of driver
    • B60W2040/0827 Inactivity or incapacity of driver due to sleepiness
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 Alarm means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/229 Attention level, e.g. attentive to driving, reading or sleeping

Definitions

  • The present invention relates to a technique for estimating the wakefulness of a vehicle occupant.
  • A wakefulness estimation device that estimates the wakefulness of a driver (presence or absence of drowsiness) based on the degree of eye opening obtained from an image of the driver's face captured by an in-vehicle camera is known (for example, Patent Document 1).
  • The wakefulness estimation device of Patent Document 1 estimates the wakefulness from the driver's degree of eye opening and also determines whether the driver is in a downward-looking state; when it determines that the driver is looking downward, it prevents erroneous determination by correcting the wakefulness estimated from the eye opening in the increasing direction.
  • In Patent Document 1, the driver's degree of eye opening is calculated based on the distance between the highest point of the contour line of the upper eyelid and the lowest point of the contour line of the lower eyelid, and whether the driver is in the downward-looking state is determined based on the shapes of the upper- and lower-eyelid contour lines.
  • However, the contour of the lower eyelid is less distinct than the contour of the upper eyelid and is difficult to detect with high accuracy. With the method of Patent Document 1, there is therefore a concern that the error in the calculated eye opening becomes large and, as a result, the reliability of the estimated wakefulness of the driver becomes low.
  • As a countermeasure, a light source that projects the shadow of the lower eyelid onto the driver's eyeball can be installed in the vehicle and the shadow photographed by a camera, thereby improving the detection accuracy of the lower-eyelid contour.
  • The present invention has been made to solve the above problems, and an object of the present invention is to provide a wakefulness estimation device capable of estimating the wakefulness of a vehicle occupant with high accuracy.
  • The wakefulness estimation device includes: a face image acquisition unit that acquires a face image, which is an image of the face of an occupant of a vehicle; an upper eyelid opening calculation unit that calculates the opening of the upper eyelid based on the positions of the inner and outer corners of the eye and the position of the highest point of the upper eyelid detected from the face image; a downward-view determination unit that, when the opening of the upper eyelid is smaller than a predetermined opening threshold, determines whether the occupant is in a downward-looking state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of a vertical line drawn down from that highest point with the contour line of the lower eyelid; and a wakefulness estimation unit that estimates that the wakefulness of the occupant has decreased when the opening of the upper eyelid is smaller than the opening threshold and the occupant is determined not to be in the downward-looking state.
  • Thus, the wakefulness of the vehicle occupant is estimated based on the opening of the occupant's upper eyelid and on the result of determining whether the occupant is in the downward-looking state (downward-view determination). Since the opening of the upper eyelid can be calculated without using the position of the lower eyelid, it can be calculated with high accuracy.
  • The position of the contour line of the lower eyelid is used for the downward-view determination, but that determination is performed while the upper eyelid opening is small, that is, while the lower eyelid bulges and its contour line is clear, so it too can be performed with high accuracy. As a result, the estimation accuracy of the occupant's wakefulness is also high.
  • FIG. 1 is a block diagram showing the configuration of the wakefulness estimation device according to the first embodiment.
  • FIG. 2 is a flowchart showing the operation of the wakefulness estimation device according to the first embodiment.
  • FIG. 3 is a block diagram showing a modified example of the wakefulness estimation device according to the first embodiment. FIGS. 4 and 5 are diagrams each showing a hardware configuration example of the wakefulness estimation device.
  • FIG. 6 is a block diagram showing the configuration of an automatic driving support device according to the second embodiment.
  • FIG. 1 is a block diagram showing the configuration of the wakefulness estimation device 10 according to the first embodiment.
  • The wakefulness estimation device 10 is installed in a vehicle.
  • The wakefulness estimation device 10 is not limited to one that is permanently installed in the vehicle; it may be built into a portable device that can be brought into the vehicle, such as a mobile phone, a smartphone, or a portable navigation device. Further, part or all of the wakefulness estimation device 10 may be built on a server that can communicate with the vehicle.
  • The wakefulness estimation device 10 is connected to a camera 21 and a warning device 22 provided in the vehicle.
  • The camera 21 photographs the inside of the vehicle and is installed at a position from which the face of a vehicle occupant can be photographed.
  • In this embodiment, the camera 21 is assumed to be a wide-angle camera whose shooting range covers the faces of the occupants in all seats of the vehicle.
  • Alternatively, the shooting range of the camera 21 may be wide enough to capture only the driver's seat.
  • The wakefulness estimation device 10 estimates the wakefulness (presence or absence of drowsiness) of an occupant based on the image of the occupant's face captured by the camera 21 (hereinafter referred to as the "face image") and outputs the estimation result to the warning device 22.
  • The wakefulness estimation device 10 estimates the wakefulness of all occupants captured by the camera 21.
  • The wakefulness estimation device 10 outputs information on the seat position of each occupant together with the estimation result of each occupant's wakefulness. That is, the wakefulness estimation device 10 outputs information such as which seat holds an occupant estimated to have high wakefulness and which seat holds an occupant estimated to have low wakefulness.
  • The warning device 22 issues a warning into the vehicle according to the occupant wakefulness estimated by the wakefulness estimation device 10, and includes, for example, a speaker for emitting a warning sound or warning message and a display for showing a warning screen.
  • In this embodiment, the warning device 22 gives a warning when it is estimated that the wakefulness of the occupant in the driver's seat (the driver) has decreased.
  • However, the operation of the warning device 22 is not limited to this; for example, a warning may also be issued when it is estimated that the wakefulness of an occupant other than the driver has decreased.
  • The wakefulness estimation device 10 includes a face image acquisition unit 11, an upper eyelid opening calculation unit 12, a downward-view determination unit 13, and a wakefulness estimation unit 14.
  • The face image acquisition unit 11 acquires the face image of the occupant captured by the camera 21.
  • When the image captured by the camera 21 contains the faces of a plurality of occupants, the face image acquisition unit 11 uses face recognition technology to extract a face image for each occupant from the in-vehicle image.
  • The upper eyelid opening calculation unit 12 detects the position of the inner corner of the eye, the position of the outer corner of the eye, and the position of the highest point of the upper eyelid of the occupant from the face image acquired by the face image acquisition unit 11, and calculates the degree of opening of the upper eyelid (hereinafter referred to as the "opening") based on their positional relationship. More specifically, the upper eyelid opening calculation unit 12 calculates the opening of the upper eyelid based on the first distance, which is the distance between the straight line connecting the inner and outer corners of the eye and the highest point of the upper eyelid. As described above, the contour of the lower eyelid is more difficult to detect with high accuracy than the contour of the upper eyelid.
  • Since the opening of the upper eyelid calculated by the upper eyelid opening calculation unit 12 can be obtained without using information on the contour line of the lower eyelid, its accuracy is high.
  • The highest point of the upper eyelid is the point (vertex) of the upper eyelid that is farthest from the straight line connecting the outer corner and the inner corner of the eye.
  • In this embodiment, the upper eyelid opening calculation unit 12 obtains the flatness of the eye by dividing the first distance by the distance between the inner and outer corners of the eye, and calculates the value obtained by dividing this flatness by a predetermined reference value as the opening of the upper eyelid.
  • The reference value of eye flatness may be, for example, a constant determined by the manufacturer of the wakefulness estimation device 10 as an average value of eye flatness.
  • Alternatively, if the wakefulness estimation device 10 can identify individual occupants, for example by face authentication using the face image, a reference value for each occupant may be registered in the wakefulness estimation device 10. Using a different reference value for each occupant suppresses the influence of individual differences in eye size on the calculated upper eyelid opening.
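As a concrete illustration, the flatness-based opening described above can be sketched as follows. This is a minimal sketch, not the patented implementation: the coordinate representation, the use of the perpendicular point-to-line distance as the "first distance", and the `reference_flatness` default are all assumptions made for illustration.

```python
import math

def upper_eyelid_opening(inner_corner, outer_corner, eyelid_peak,
                         reference_flatness=0.35):
    """Sketch of the opening calculation; points are (x, y) pixel coordinates
    detected from the face image, reference_flatness is a hypothetical preset."""
    (x1, y1), (x2, y2) = inner_corner, outer_corner
    px, py = eyelid_peak
    # Distance between the inner and outer corners of the eye.
    corner_distance = math.hypot(x2 - x1, y2 - y1)
    # First distance: perpendicular distance from the upper-eyelid peak to the
    # straight line joining the two eye corners.
    first_distance = abs((x2 - x1) * (y1 - py) - (x1 - px) * (y2 - y1)) / corner_distance
    # Flatness: first distance divided by the eye-corner distance.
    flatness = first_distance / corner_distance
    # Opening: flatness normalized by the reference value.
    return flatness / reference_flatness
```

For example, with eye corners at (0, 0) and (40, 0) and the eyelid peak at (20, 7), the flatness is 7/40 = 0.175, so with the assumed reference of 0.35 the opening is 0.5.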
  • The downward-view determination unit 13 performs the downward-view determination, which determines whether the occupant is in the downward-looking state, when the occupant's upper eyelid opening calculated by the upper eyelid opening calculation unit 12 is smaller than a predetermined threshold (hereinafter referred to as the "opening threshold"). Specifically, when the opening of the upper eyelid is smaller than the opening threshold, the downward-view determination unit 13 detects, from the face image acquired by the face image acquisition unit 11, the position of the highest point of the upper eyelid and the position of the intersection of a vertical line drawn down from that highest point with the contour line of the lower eyelid (hereinafter referred to as the "specific point of the lower eyelid"), and uses these positions for the determination.
  • The specific point of the lower eyelid may instead be the intersection of the contour line of the lower eyelid with a perpendicular dropped from the highest point of the upper eyelid toward the straight line connecting the inner and outer corners of the eye.
  • When the opening of the upper eyelid is small, the lower eyelid bulges and its contour becomes clear, so the contour line of the lower eyelid can be detected from the face image with high accuracy.
  • The position of the contour line of the lower eyelid is necessary for the downward-view determination performed by the downward-view determination unit 13, but since this determination is performed only when the opening of the upper eyelid is small, it uses a lower-eyelid contour position obtained with high accuracy. The downward-view determination performed by the downward-view determination unit 13 is therefore highly accurate.
  • The downward-view determination unit 13 determines whether the occupant is in the downward-looking state based on the second distance, which is the distance between the highest point of the upper eyelid and the specific point of the lower eyelid, when the opening of the upper eyelid is smaller than the opening threshold. That is, when the opening of the upper eyelid is smaller than the opening threshold and the second distance is larger than a predetermined threshold (hereinafter referred to as the "distance threshold"), the downward-view determination unit 13 determines that the occupant is in the downward-looking state.
  • Hereinafter, the condition "the upper eyelid opening is smaller than the opening threshold and the second distance is larger than the distance threshold" may be referred to as the "downward-view condition".
  • The second distance may also be referred to as the "upper-to-lower eyelid distance".
  • The distance threshold may be, for example, a constant determined by the manufacturer of the wakefulness estimation device 10 as the minimum distance between the upper and lower eyelids required for downward viewing.
  • Alternatively, a distance threshold for each occupant may be registered in the wakefulness estimation device 10.
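The downward-view condition defined above can be sketched as a simple predicate. The default threshold values below are hypothetical placeholders, not values given in the patent, and the second distance is assumed to be measured in pixels.

```python
def is_looking_down(opening, upper_lower_eyelid_distance,
                    opening_threshold=0.6, distance_threshold=4.0):
    """Sketch of the downward-view condition; thresholds are hypothetical."""
    # Downward-view condition: the upper eyelid opening is smaller than the
    # opening threshold AND the second (upper-to-lower eyelid) distance is
    # larger than the distance threshold.
    return (opening < opening_threshold
            and upper_lower_eyelid_distance > distance_threshold)
```

A small opening alone is not enough: if the upper and lower eyelids are close together, the eyes are closing rather than looking down, and the predicate returns False.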
  • The wakefulness estimation unit 14 estimates whether the occupant's wakefulness has decreased based on the occupant's upper eyelid opening calculated by the upper eyelid opening calculation unit 12 and the result of the downward-view determination performed by the downward-view determination unit 13. Specifically, the wakefulness estimation unit 14 estimates that the occupant's wakefulness has decreased when the occupant's upper eyelid opening is smaller than the opening threshold and it is determined that the occupant is not in the downward-looking state. As described above, the upper eyelid opening calculation unit 12 can calculate the occupant's upper eyelid opening with high accuracy and the downward-view determination unit 13 can perform the downward-view determination with high accuracy, so the estimation of the occupant's wakefulness by the wakefulness estimation unit 14 is also highly accurate.
  • Hereinafter, the condition "the occupant's upper eyelid opening is smaller than the opening threshold and the occupant is not in the downward-looking state" may be referred to as the "wakefulness-lowering condition".
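The wakefulness-lowering condition can likewise be expressed as a predicate over the calculated opening and the downward-view result; again, the default opening threshold is a hypothetical placeholder for illustration.

```python
def wakefulness_decreased(opening, looking_down, opening_threshold=0.6):
    """Sketch of the wakefulness-lowering condition; threshold is hypothetical."""
    # Wakefulness-lowering condition: small upper eyelid opening while the
    # occupant is NOT in the downward-looking state.
    return opening < opening_threshold and not looking_down
```

In other words, a small opening only counts as drowsiness when the downward-view determination has ruled out a deliberate downward glance.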
  • As described above, the wakefulness estimation device 10 according to the first embodiment can estimate the occupant's wakefulness with high accuracy while taking into account whether the occupant is in the downward-looking state. Furthermore, the detection of the lower-eyelid contour, which is a relatively difficult process, need only be performed when the downward-view determination is performed (that is, when the opening of the upper eyelid is smaller than the opening threshold), which also contributes to reducing the processing load of the wakefulness estimation device 10.
  • FIG. 2 is a flowchart showing the operation of the wakefulness estimation device 10 according to the first embodiment. The operation of the wakefulness estimation device 10 is described below with reference to FIG. 2.
  • First, the face image acquisition unit 11 acquires the face image of the occupant captured by the camera 21 (step S101).
  • For simplicity, it is assumed here that the image captured by the camera 21 contains the face of only one occupant.
  • If face images of a plurality of occupants are acquired in step S101, the following processes of steps S102 to S108 are performed on the face image of each occupant.
  • Next, the upper eyelid opening calculation unit 12 detects the position of the inner corner of the eye, the position of the outer corner of the eye, and the position of the highest point of the upper eyelid of the occupant from the face image acquired in step S101, and calculates the opening of the occupant's upper eyelid based on their positional relationship (step S102).
  • When the opening of the upper eyelid calculated in step S102 is smaller than the opening threshold (YES in step S103), the downward-view determination unit 13 performs the downward-view determination based on the occupant's face image acquired in step S101 to determine whether the occupant is in the downward-looking state (step S104).
  • When it is determined that the occupant is not in the downward-looking state (NO in step S105), the wakefulness estimation unit 14 estimates that the occupant's wakefulness has decreased (step S106).
  • Conversely, when the opening of the occupant's upper eyelid calculated in step S102 is equal to or larger than the opening threshold (NO in step S103), or when it is determined in step S104 that the occupant is in the downward-looking state (YES in step S105), the wakefulness estimation unit 14 estimates that the occupant's wakefulness is high (step S107).
  • Then, the estimation result of the occupant's wakefulness obtained in step S106 or S107 is output to the warning device 22 (step S108).
  • The warning device 22 issues a warning according to the estimation result of the occupant's wakefulness.
  • The wakefulness estimation device 10 repeatedly executes the above flow.
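The branching of steps S103 to S107 above can be sketched as a single decision function. The threshold default is a hypothetical placeholder, and `looking_down_fn` stands in for the downward-view determination of step S104; it is only called when the opening is below the threshold, mirroring the reduced processing load noted for the first embodiment.

```python
def estimate_wakefulness(opening, looking_down_fn, opening_threshold=0.6):
    """One pass of the decision flow; returns the estimated wakefulness."""
    # Step S103: compare the upper eyelid opening with the opening threshold.
    if opening >= opening_threshold:
        return "high"        # step S107: wakefulness estimated to be high
    # Steps S104/S105: downward-view determination, performed only when the
    # opening is small, so lower-eyelid contour detection stays infrequent.
    if looking_down_fn():
        return "high"        # step S107
    return "decreased"       # step S106: wakefulness estimated to have decreased
```

The result would then be forwarded to the warning device (step S108) by the caller.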
  • The flow may be executed constantly, for example, after the wakefulness estimation device 10 is activated (after the ignition of the vehicle is turned on), or it may be executed only while the vehicle is running (and not while the vehicle is stopped).
  • In the former case, the occupant's wakefulness is estimated even before the vehicle starts moving, which can contribute to improved vehicle safety.
  • In the above description, the wakefulness estimation unit 14 outputs the estimation result of the occupant's wakefulness to the warning device 22, but the output destination is not limited to the warning device 22 and may be any device.
  • For example, the estimation result of the occupant's wakefulness produced by the wakefulness estimation unit 14 may be input to an automatic driving control device 23, which is an ECU (Electronic Control Unit) that automatically drives the vehicle.
  • In that case, for example, when the wakefulness estimation unit 14 estimates that the wakefulness of the occupant in the driver's seat (the driver) has decreased and the driver's wakefulness remains low even after the warning device 22 gives a warning, the automatic driving control device 23 can automatically evacuate the vehicle to a safe place (for example, a wide road shoulder or a parking area) and stop it.
  • In the above description, the downward-view determination unit 13 determines that the occupant is in the downward-looking state as soon as the occupant's upper eyelid opening becomes smaller than the opening threshold and the occupant's upper-to-lower eyelid distance (second distance) becomes larger than the distance threshold.
  • Instead, the downward-view determination unit 13 may determine that the occupant is in the downward-looking state only when the occupant's eyes have satisfied the downward-view condition for a certain length of time. Specifically, it may determine that the occupant is in the downward-looking state when the ratio of the time during which the downward-view condition is satisfied within a fixed period exceeds a predetermined threshold (time threshold).
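The time-ratio variant described above can be sketched over a window of per-frame samples of the downward-view condition; the 0.7 default time threshold is a hypothetical placeholder.

```python
def looking_down_over_window(condition_samples, time_threshold=0.7):
    """condition_samples: booleans, one per frame in a fixed observation window,
    each True when the downward-view condition held in that frame."""
    # Ratio of the window during which the downward-view condition held.
    ratio = sum(condition_samples) / len(condition_samples)
    return ratio > time_threshold
```

Requiring the condition to persist over a window suppresses spurious single-frame detections, at the cost of a detection delay of up to one window length.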
  • Further, the downward-view determination unit 13 may determine whether the occupant is in the downward-looking state by also taking into account the orientation of the occupant's face detected from the face image. For example, the downward-view determination unit 13 may determine that the occupant is in the downward-looking state when the occupant's eyes satisfy the downward-view condition and the occupant's face is oriented downward (in particular, toward the position of the in-vehicle meter or the screen of the navigation device).
  • Similarly, the wakefulness estimation unit 14 may estimate the occupant's wakefulness by also taking into account the orientation of the occupant's face detected from the face image. For example, the wakefulness estimation unit 14 may estimate that the occupant's wakefulness has decreased when the wakefulness-lowering condition (the opening of the occupant's upper eyelid is smaller than the opening threshold and the occupant is not in the downward-looking state) is satisfied and the variation in the occupant's face orientation is smaller than a predetermined threshold.
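The face-orientation refinement above might be combined with the wakefulness-lowering condition as follows. Treating the "variation" as the standard deviation of the face pitch over recent frames, and both default thresholds, are assumptions made purely for illustration.

```python
import statistics

def wakefulness_decreased_with_face(opening, looking_down, recent_pitch_deg,
                                    opening_threshold=0.6,
                                    variation_threshold=2.0):
    """Sketch: wakefulness-lowering condition plus small face-orientation variation.

    recent_pitch_deg: face pitch angles (degrees) over recent frames; the
    variation is taken as their population standard deviation (an assumed measure).
    """
    variation = statistics.pstdev(recent_pitch_deg)
    # Drowsiness is estimated only when the eyes are nearly closed, the occupant
    # is not looking down, and the head has been nearly motionless.
    return (opening < opening_threshold
            and not looking_down
            and variation < variation_threshold)
```

The intuition is that an alert occupant whose eyes narrow momentarily usually also moves their head, whereas a drowsy occupant's head stays still.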
  • The downward-view determination unit 13 may also determine whether the occupant is in the downward-looking state by taking into account the operation status of vehicle-mounted devices. For example, the downward-view determination unit 13 may determine that the occupant is in the downward-looking state when the occupant's eyes satisfy the downward-view condition and the occupant is operating a vehicle-mounted device located at a position lower than the face.
  • Information on the operation status of vehicle-mounted devices can be acquired from, for example, the ECU of the vehicle.
  • Likewise, the wakefulness estimation unit 14 may estimate the occupant's wakefulness in consideration of the operation status of vehicle-mounted devices. For example, the wakefulness estimation unit 14 may estimate that the occupant's wakefulness has decreased when the occupant's eyes satisfy the wakefulness-lowering condition and the occupant is not operating an in-vehicle device.
  • FIGS. 4 and 5 are diagrams each showing an example of the hardware configuration of the wakefulness estimation device 10.
  • The functions of the components of the wakefulness estimation device 10 shown in FIG. 1 are realized, for example, by the processing circuit 50 shown in FIG. 4. That is, the wakefulness estimation device 10 includes the processing circuit 50 for acquiring a face image that is an image of the face of an occupant of the vehicle, calculating the opening of the upper eyelid based on the positions of the inner and outer corners of the eye and the position of the highest point of the upper eyelid detected from the face image, determining, when the opening of the upper eyelid is smaller than the predetermined opening threshold, whether the occupant is in the downward-looking state, and estimating that the occupant's wakefulness has decreased when the opening of the upper eyelid is smaller than the opening threshold and the occupant is determined not to be in the downward-looking state.
  • The processing circuit 50 may be dedicated hardware, or may be configured using a processor that executes a program stored in a memory (also called a CPU (Central Processing Unit), processing device, arithmetic device, microprocessor, microcomputer, or DSP (Digital Signal Processor)).
  • The processing circuit 50 as dedicated hardware corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • The functions of the components of the wakefulness estimation device 10 may each be realized by an individual processing circuit, or those functions may be collectively realized by a single processing circuit.
  • FIG. 5 shows an example of the hardware configuration of the wakefulness estimation device 10 when the processing circuit 50 is configured using a processor 51 that executes a program.
  • In this case, the functions of the components of the wakefulness estimation device 10 are realized by software or the like (software, firmware, or a combination of software and firmware).
  • The software or the like is written as a program and stored in a memory 52.
  • The processor 51 realizes the function of each unit by reading and executing the program stored in the memory 52.
  • In other words, the program stored in the memory 52, when executed by the processor 51, causes the alertness estimation device 10 to consequently execute: a process of acquiring a face image, which is an image of the face of an occupant of the vehicle; a process of calculating the opening of the upper eyelid based on the positions of the inner and outer corners of the eye detected from the face image and the position of the highest point of the upper eyelid; a process of determining, when the opening of the upper eyelid is smaller than a predetermined opening threshold, whether or not the occupant is in a downward-looking state based on the position of the highest point of the upper eyelid detected from the face image and the position of the intersection of a vertical line drawn from that highest point with the contour line of the lower eyelid; and a process of estimating that the occupant's alertness is low when the opening of the upper eyelid is smaller than the opening threshold and the occupant is not in the downward-looking state.
  • this program causes a computer to execute the procedure and method of the operation of the constituent elements of the alertness estimation device 10.
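  • As a rough illustration of the processes listed above, the opening calculation and the downward-look determination could be sketched as follows. This is a minimal sketch in Python; the point representation, the linear interpolation of the eye-corner line, and the gap criterion are assumptions made for illustration, not the exact geometry defined by the patent.

```python
def upper_eyelid_opening(inner_corner, outer_corner, upper_peak):
    """Opening of the upper eyelid: vertical distance from the line through
    the inner and outer eye corners to the highest point of the upper eyelid.
    Points are (x, y) in image coordinates, where y grows downward."""
    (x1, y1), (x2, y2), (px, py) = inner_corner, outer_corner, upper_peak
    t = (px - x1) / (x2 - x1)        # position of the peak along the corner line
    line_y = y1 + t * (y2 - y1)      # y of the corner line at the peak's x
    return line_y - py               # larger value = eye more open

def is_looking_down(upper_peak, lower_lid_intersection, gap_threshold):
    """Downward-look test applied only when the opening is below the opening
    threshold: compare the upper-lid peak with the point where a vertical line
    dropped from it meets the lower-eyelid contour. Here a remaining gap above
    gap_threshold is taken to mean 'looking down' rather than 'eyes closing';
    the criterion and threshold are illustrative assumptions."""
    gap = lower_lid_intersection[1] - upper_peak[1]
    return gap > gap_threshold
```

  • For example, with eye corners at (0, 10) and (10, 10) and the upper-lid peak at (5, 4), the opening evaluates to 6; a peak at (5, 8) with the lower-lid intersection at (5, 12) would be classified as looking down for a gap threshold of 3.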
  • The memory 52 is, for example, a nonvolatile or volatile semiconductor memory such as RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive); a magnetic disk, flexible disk, optical disc, compact disc, mini disc, or DVD (Digital Versatile Disc) and its drive device; or any storage medium to be used in the future.
  • The above description covers configurations in which the functions of the constituent elements of the alertness estimation device 10 are realized entirely by hardware or entirely by software.
  • The configuration is not limited to these, however: some of the constituent elements of the alertness estimation device 10 may be implemented by dedicated hardware while others are implemented by software or the like.
  • For example, for some constituent elements the function is realized by the processing circuit 50 as dedicated hardware, while for others the processing circuit 50 as the processor 51 realizes the function by reading and executing the program stored in the memory 52.
  • the arousal level estimation device 10 can realize each function described above by hardware, software, or a combination thereof.
  • The second embodiment shows an example in which the alertness estimation device 10 described in the first embodiment is applied to an automatic driving support device that supports automatic driving of a vehicle.
  • Level 0 (no driving automation): the driver performs all dynamic driving tasks
  • Level 1 (driver assistance): the system executes either the longitudinal or the lateral vehicle motion control subtask in a limited area
  • Level 2 (partial driving automation): the system executes both the longitudinal and the lateral vehicle motion control subtasks in a limited area
  • Level 3 (conditional driving automation): the system executes all dynamic driving tasks in a limited area; however, when it is difficult to continue operation, the driver responds appropriately to an intervention request from the system
  • Level 4 (high driving automation): the system executes all dynamic driving tasks, and the responses when it is difficult to continue operation, within a limited area
  • Level 5 (full driving automation): the system executes all dynamic driving tasks, and the responses when it is difficult to continue operation, without restriction to a limited area
  • Here, "dynamic driving task" means all of the operational and tactical functions (excluding strategic functions such as itinerary planning and selection of waypoints) that must be performed in real time when operating a vehicle in road traffic.
  • "Limited area" means the specific conditions (including geographical constraints, road constraints, environmental constraints, traffic constraints, speed constraints, time constraints, and the like) under which the system or its function is designed to operate.
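  • The level definitions above can be captured in a small enumeration. This is an illustrative sketch only: the class and function names are assumptions, and the helper merely encodes the premise of the second embodiment, namely that at level 3 and below a fallback-ready human is still required.

```python
from enum import IntEnum

class DrivingAutomation(IntEnum):
    """Driving automation levels as summarized in the text."""
    NO_AUTOMATION = 0           # driver performs all dynamic driving tasks
    DRIVER_ASSISTANCE = 1       # longitudinal OR lateral control, limited area
    PARTIAL_AUTOMATION = 2      # longitudinal AND lateral control, limited area
    CONDITIONAL_AUTOMATION = 3  # all tasks in area; driver answers intervention requests
    HIGH_AUTOMATION = 4         # all tasks plus fallback, within the limited area
    FULL_AUTOMATION = 5         # all tasks plus fallback, no area restriction

def fallback_ready_occupant_required(level: DrivingAutomation) -> bool:
    """Up to level 3 the system may still issue an intervention request,
    so an occupant able to take over driving is required."""
    return level <= DrivingAutomation.CONDITIONAL_AUTOMATION
```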
  • FIG. 6 is a block diagram showing the configuration of the automatic driving support device 30 according to the second embodiment.
  • elements having the same functions as those shown in FIG. 1 are designated by the same reference numerals as those in FIG. 1, and description thereof will be omitted here.
  • the automatic driving support device 30 is connected to the camera 21, the automatic driving control device 23, and the notification device 24.
  • the automatic driving control device 23 is an ECU that performs automatic driving of the vehicle, and corresponds to the above-mentioned "system". In the present embodiment, the automatic driving control device 23 performs at least level 3 automatic driving.
  • the notification device 24 notifies the occupant of the vehicle, and includes a speaker that emits a notification sound or a notification message sound, a display that displays an alarm screen, and the like.
  • the automatic driving support device 30 includes the awakening degree estimation device 10 and an occupant selection unit 31.
  • the arousal level estimation device 10 estimates the arousal level of the occupant of the vehicle as described in the first embodiment, and outputs the estimation result to the occupant selection unit 31.
  • The occupant selection unit 31 selects the occupant who is to intervene in driving the vehicle (the occupant who will become the driver) in response to an intervention request from the automatic driving control device 23.
  • The intervention request is input from the automatic driving control device 23 to the occupant selection unit 31 when the automatic driving control device 23 is performing level 3 automatic driving and it becomes difficult to continue the automatic driving.
  • On receiving the intervention request, the occupant selection unit 31 checks the estimation results of the occupants' alertness produced by the alertness estimation device 10 and selects an occupant estimated to have high alertness as the occupant to intervene in driving the vehicle.
  • The selection result of the occupant selection unit 31 is input to the notification device 24.
  • The notification device 24 notifies the occupant selected by the occupant selection unit 31, instructing that occupant to intervene in driving the vehicle.
  • If the selected occupant is seated in the driver's seat, the driving authority of the vehicle can be transferred to the occupant without stopping the vehicle.
  • If the selected occupant is not in the driver's seat, the automatic driving control device 23 stops the vehicle at a safe place and transfers the driving authority after the selected occupant has moved to the driver's seat.
  • When there is more than one occupant estimated to have high alertness, the occupant selection unit 31 may select any one of them, but it may preferentially select the occupant in the driver's seat.
  • In this way, the automatic driving support device 30 supports the so-called handover process of transferring driving authority to an occupant, and can prevent the driving authority from being transferred to an occupant with low alertness.
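  • The handover support described above can be sketched as a selection routine. The seat labels, the score scale, and the threshold below are illustrative assumptions; the patent does not prescribe a particular scoring or data layout.

```python
def select_intervention_occupant(occupants, wakefulness, threshold=0.5):
    """Pick the occupant who should take over driving on an intervention
    request. `occupants` is a list of (seat, occupant_id) pairs and
    `wakefulness` maps occupant_id to an estimated alertness score."""
    awake = [(seat, oid) for seat, oid in occupants
             if wakefulness.get(oid, 0.0) >= threshold]
    if not awake:
        return None  # nobody sufficiently alert: stop the vehicle safely instead
    # prefer the driver's seat so authority can be handed over without stopping
    for seat, oid in awake:
        if seat == "driver":
            return oid
    # otherwise choose the most alert occupant; the vehicle must stop
    # while that occupant moves to the driver's seat
    return max(awake, key=lambda so: wakefulness[so[1]])[1]
```

  • For example, if the driver's-seat occupant scores below the threshold while a rear-seat occupant is alert, the routine returns the rear-seat occupant, matching the behavior where the vehicle stops before the handover.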
  • 10 alertness estimation device, 11 face image acquisition unit, 12 upper eyelid opening calculation unit, 13 downward-looking determination unit, 14 alertness estimation unit, 21 camera, 22 warning device, 23 automatic driving control device, 24 notification device, 30 automatic driving support device, 31 occupant selection unit, 50 processing circuit, 51 processor, 52 memory.

PCT/JP2019/007500 2019-02-27 2019-02-27 Alertness estimation device, automatic driving support device, and alertness estimation method WO2020174601A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2019/007500 WO2020174601A1 (ja) 2019-02-27 2019-02-27 Alertness estimation device, automatic driving support device, and alertness estimation method
JP2021501448A JP7109649B2 (ja) 2019-02-27 2019-02-27 Alertness estimation device, automatic driving support device, and alertness estimation method
DE112019006953.5T DE112019006953T5 (de) 2019-02-27 2019-02-27 Alertness estimation device, automatic driving support device, and alertness estimation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/007500 WO2020174601A1 (ja) 2019-02-27 2019-02-27 Alertness estimation device, automatic driving support device, and alertness estimation method

Publications (1)

Publication Number Publication Date
WO2020174601A1 true WO2020174601A1 (ja) 2020-09-03

Family

ID=72239557

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/007500 WO2020174601A1 (ja) 2019-02-27 2019-02-27 Alertness estimation device, automatic driving support device, and alertness estimation method

Country Status (3)

Country Link
JP (1) JP7109649B2 (de)
DE (1) DE112019006953T5 (de)
WO (1) WO2020174601A1 (de)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6964174B1 (ja) * 2020-11-30 2021-11-10 真旭 徳山 Information processing device, information processing method, and program
WO2024057356A1 (ja) * 2022-09-12 2024-03-21 Mitsubishi Electric Corporation Eyelid opening degree detection device, eyelid opening degree detection method, and drowsiness determination system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008167806A (ja) * 2007-01-09 2008-07-24 Denso Corp Drowsiness detection device
JP2008210285A (ja) * 2007-02-27 2008-09-11 Kyushu Univ Drowsy driving prevention device
JP2009003644A (ja) * 2007-06-20 2009-01-08 Toyota Motor Corp Eye opening degree determination device
JP2011048531A (ja) * 2009-08-26 2011-03-10 Aisin Seiki Co Ltd Drowsiness detection device, drowsiness detection method, and program
JP2018045450A (ja) * 2016-09-14 2018-03-22 Isuzu Motors Ltd Vehicle control device
JP2018508870A (ja) * 2015-01-19 2018-03-29 Robert Bosch GmbH Method and device for detecting momentary sleep of a vehicle driver


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6964174B1 (ja) * 2020-11-30 2021-11-10 真旭 徳山 Information processing device, information processing method, and program
JP2022086525A (ja) * 2020-11-30 2022-06-09 真旭 徳山 Information processing device, information processing method, and program
WO2024057356A1 (ja) * 2022-09-12 2024-03-21 Mitsubishi Electric Corporation Eyelid opening degree detection device, eyelid opening degree detection method, and drowsiness determination system

Also Published As

Publication number Publication date
DE112019006953T5 (de) 2021-12-16
JP7109649B2 (ja) 2022-07-29
JPWO2020174601A1 (ja) 2021-09-13

Similar Documents

Publication Publication Date Title
CN112046500B (zh) 自动驾驶装置和方法
RU2678909C2 (ru) Система для отслеживания объектов вокруг транспортного средства
JP2010033106A (ja) 運転者支援装置、運転者支援方法および運転者支援処理プログラム
US11189048B2 (en) Information processing system, storing medium storing program, and information processing device controlling method for performing image processing on target region
CN112046502B (zh) 自动驾驶装置和方法
JP2019046277A (ja) 画像処理装置、および画像処理方法、並びにプログラム
WO2020174601A1 (ja) 覚醒度推定装置、自動運転支援装置および覚醒度推定方法
WO2019188926A1 (ja) 余所見判定装置、余所見判定システム、余所見判定方法、記憶媒体
JP5942176B2 (ja) 車載用表示装置、車載表示装置の制御方法、プログラム
US10083612B2 (en) Display device for vehicle
CN111045512A (zh) 车辆、输出车辆的信息的方法及计算机可读记录介质
JP2009146153A (ja) 移動体検出装置、移動体検出方法および移動体検出プログラム
JP7175381B2 (ja) 覚醒度推定装置、自動運転支援装置および覚醒度推定方法
WO2018163266A1 (ja) 表示制御装置および表示制御方法
WO2020122057A1 (ja) 画像処理装置、画像処理方法および画像処理システム
JP5424014B2 (ja) 衝突警戒車両検出システム
KR20200133860A (ko) 자율 주행 장치 및 방법
KR20200133445A (ko) 자율 주행 장치 및 방법
JP2021041884A (ja) 車両制御装置
KR20200082463A (ko) 영상 기록 장치 및 그 동작 방법
WO2021240768A1 (ja) 運転不能判定装置および運転不能判定方法
WO2024150369A1 (ja) 監視システム、情報処理装置、方法及び非一時的なコンピュータ記録媒体
JP7428076B2 (ja) サーバ装置、制御装置、車両、及び情報処理システムの動作方法
KR102648470B1 (ko) 자율 주행 장치 및 방법
WO2021140583A1 (ja) 眠気推定装置および眠気推定方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19916667; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021501448; Country of ref document: JP; Kind code of ref document: A)
122 Ep: pct application non-entry in european phase (Ref document number: 19916667; Country of ref document: EP; Kind code of ref document: A1)