WO2018225176A1 - State determination device and state determination method - Google Patents

State determination device and state determination method

Info

Publication number
WO2018225176A1
Authority
WO
WIPO (PCT)
Prior art keywords
driver
image
posture
determination
unit
Prior art date
Application number
PCT/JP2017/021121
Other languages
English (en)
Japanese (ja)
Inventor
翔悟 甫天
友美 保科
和樹 國廣
洸暉 安部
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2017/021121 (WO2018225176A1)
Priority to JP2019523262A (JP6960995B2)
Publication of WO2018225176A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00: Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04: Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • The present invention relates to a technique for determining the state of a driver who drives a vehicle.
  • Patent Document 1 discloses a driver inoperable-state detection device that sequentially detects the driver's head (the region above the neck) from captured images of the driver's seat taken by a vehicle-mounted camera while the vehicle is running, and detects that the driver is in an inoperable state, that is, that the driving posture has collapsed, when the head moves out of a predetermined range of the captured image.
  • However, if something other than the driver's head, such as the head of another occupant reflected in the captured image, part of the vehicle interior, or the scenery, is erroneously detected as the driver's head, there is a problem that a driver who is not in an inoperable state is erroneously determined to be in an inoperable state.
  • The present invention has been made to solve the above problem, and an object of the present invention is to suppress erroneous detection of the driver's inoperable state and improve the accuracy of determining the driver's state.
  • The state determination device includes: an image acquisition unit that acquires a captured image of the vehicle interior including the driver; a face detection unit that detects the driver's face position from the captured image acquired by the image acquisition unit; a posture determination unit that compares the driver's face position detected by the face detection unit with a driver reference point calculated in advance, and determines that the driver's posture has collapsed when a movement of the face position equal to or greater than a first posture-collapse determination amount is detected in a set movement direction; an image comparison unit that, when the posture determination unit determines that the driver's posture has collapsed, compares the image of a first region in the captured image acquired by the image acquisition unit with the image of the first region in a comparison image, and re-determines whether the driver's posture has collapsed; and a state determination unit that determines, based on the re-determination result of the image comparison unit, whether the driver is in an inoperable state.
  • According to the present invention, it is possible to suppress erroneous detection of the driver's inoperable state and improve the accuracy of determining the driver's state.
  • FIG. 1 is a block diagram illustrating the configuration of a state determination device according to Embodiment 1.
  • FIGS. 2A and 2B are diagrams illustrating an example of the first region set by the image comparison unit of the state determination device according to Embodiment 1.
  • FIGS. 3A and 3B are diagrams illustrating hardware configuration examples of the state determination device according to Embodiment 1.
  • FIG. 4 is a diagram illustrating the processing of the face detection unit and the reference point calculation unit of the state determination device according to Embodiment 1.
  • FIG. 5 is a diagram illustrating an example of the setting of the movement direction of the driver's head by the state determination device according to Embodiment 1.
  • FIGS. 6A, 6B, and 6C are diagrams illustrating the processing of the image comparison unit of the state determination device according to Embodiment 1.
  • FIGS. 7A, 7B, and 7C are diagrams illustrating the processing of the image comparison unit of the state determination device according to Embodiment 1.
  • A flowchart showing the operation of the reference point calculation process of the state determination device according to Embodiment 1.
  • A flowchart illustrating the operation of the state determination process of the state determination device according to Embodiment 1.
  • FIGS. 10A, 10B, and 10C are diagrams illustrating the determination of the displacement amount of the driver's face by the posture determination unit of the state determination device according to Embodiment 1.
  • A block diagram illustrating the configuration of a state determination device according to Embodiment 2.
  • FIGS. 13A and 13B are diagrams illustrating the detection of the head center axis by the axis detection unit of the state determination device according to Embodiment 2.
  • FIGS. 14A, 14B, and 14C are diagrams illustrating the determination of the inclination of the head center axis by the axis detection unit of the state determination device according to Embodiment 2.
  • A flowchart illustrating the operation of the state determination process of the state determination device according to Embodiment 2.
  • A block diagram illustrating the configuration of a state determination device according to Embodiment 3.
  • FIGS. 17A and 17B are diagrams illustrating a setting example of the second region set by the contour detection unit of the state determination device according to Embodiment 3.
  • A flowchart illustrating the operation of the state determination process of the state determination device according to Embodiment 3.
  • A diagram showing a display example based on the determination result of the state determination device according to Embodiments 1 to 3.
  • FIG. 1 is a block diagram illustrating the configuration of the state determination device 100 according to Embodiment 1.
  • The state determination device 100 first detects the driver's face, and determines whether the driver's posture has collapsed based on the detected movement amount and movement direction of the face.
  • When it is determined that the posture has collapsed, the state determination device 100 compares the first driver's-seat captured image, acquired from the imaging device in the vehicle interior, with a second driver's-seat image captured in advance by the same imaging device while the driver in a normal state is seated in the driver's seat.
  • The state determination device 100 determines the driver's posture collapse based on the comparison result between the first driver's-seat captured image and the second driver's-seat image, and determines whether the driver is in an inoperable state.
  • Alternatively, when it is determined that the posture has collapsed, the state determination device 100 compares the first driver's-seat captured image with a third driver's-seat image, captured in advance by the imaging device in the vehicle interior, of the driver's seat with no driver seated.
  • The state determination device 100 determines the driver's posture collapse based on the comparison result between the first driver's-seat captured image and the third driver's-seat image, and determines whether the driver is in an inoperable state.
  • The state determination device 100 calculates the degree of coincidence between the first driver's-seat captured image and the second driver's-seat captured image. Alternatively, the state determination device 100 calculates the degree of divergence between the first driver's-seat captured image and the third driver's-seat captured image. If the calculated degree of coincidence or divergence is small, the state determination device 100 determines that the driver is in a posture different from the normal posture, that is, that the posture has collapsed, and determines that the driver is in an inoperable state. On the other hand, if the calculated degree of coincidence or divergence is large, the state determination device 100 determines that the driver is in a posture equivalent to the normal posture, that is, that the posture has not collapsed, and determines that the driver is not in an inoperable state.
  • Even when the state determination device 100 erroneously detects the head of another occupant as the driver's face and determines from that detection result that the driver's posture has collapsed, the determination is checked by comparing the first driver's-seat captured image with the second driver's-seat captured image, or by comparing the first driver's-seat captured image with the third driver's-seat captured image. It is therefore possible to suppress erroneous detection that the driver is in an inoperable state.
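  The two-stage check described above (face-position determination, then an image comparison against a stored comparison image) can be sketched as follows. This is a minimal illustration, not the claimed implementation: the function names, the pixel-difference similarity measure, and the tolerance and threshold values are all assumptions introduced for this sketch.

```python
def posture_collapsed(face_pos, reference_point, threshold):
    """First stage: the face position has moved too far from the reference point."""
    dx = face_pos[0] - reference_point[0]
    dy = face_pos[1] - reference_point[1]
    return (dx * dx + dy * dy) ** 0.5 >= threshold

def recheck_with_comparison_image(region_now, region_ref, score_threshold):
    """Second stage: compare the first region against the comparison image.

    Both regions are 2D lists of pixel luminances; the score is the
    fraction of pixels whose luminance differs by less than a tolerance.
    """
    total = matched = 0
    for row_now, row_ref in zip(region_now, region_ref):
        for p_now, p_ref in zip(row_now, row_ref):
            total += 1
            if abs(p_now - p_ref) < 16:  # tolerance: assumed value
                matched += 1
    coincidence = matched / total
    # High coincidence with the normal-state image means the first-stage
    # detection was likely a false positive (e.g. another occupant's head).
    return coincidence < score_threshold  # True -> posture really collapsed
```

  The second stage only runs when the first stage fires, so a face-position false positive is filtered out rather than propagated to the inoperable-state decision.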
  • The state determination device 100 includes an image acquisition unit 101, a face detection unit 102, a reference point calculation unit 103, a posture determination unit 104, an image comparison unit 105, a comparison image storage unit 106, and a state determination unit 107.
  • The camera 200, the vehicle information recognition device 300, the warning device 400, and the vehicle control device 500 are connected to the state determination device 100.
  • The camera 200 is an imaging unit that captures an image of the interior of the vehicle on which the state determination device 100 is mounted.
  • The camera 200 is, for example, an infrared camera.
  • The camera 200 is installed at a position where at least the head of the driver seated in the driver's seat can be photographed.
  • The camera 200 is configured by one camera or a plurality of cameras.
  • The vehicle information recognition device 300 is a device that recognizes vehicle information of the vehicle on which the state determination device 100 is mounted.
  • The vehicle information recognition device 300 includes, for example, a vehicle speed sensor and a brake sensor.
  • The vehicle speed sensor is a sensor that acquires the traveling speed of the vehicle.
  • The brake sensor is a sensor that detects the operation amount of the brake pedal of the vehicle.
  • Based on the determination result of the state determination device 100, the warning device 400 generates information indicating an alert or information indicating a warning to be presented to the driver of the vehicle by sound, or by sound and display.
  • The warning device 400 controls the output of the generated information indicating an alert or a warning.
  • A speaker, or a speaker and a display, constituting the warning device 400 outputs, or outputs and displays, the information indicating an alert or a warning based on this output control.
  • The vehicle control device 500 controls the traveling of the vehicle based on the determination result of the state determination device 100.
  • The image acquisition unit 101 acquires a captured image captured by the camera 200.
  • The captured image is an image captured so that at least the head of the driver in the vehicle is included.
  • The image acquisition unit 101 outputs the acquired captured image to the face detection unit 102 and the image comparison unit 105.
  • The face detection unit 102 analyzes the captured image acquired by the image acquisition unit 101 to detect the driver's face position, and outputs the detected position to the reference point calculation unit 103 and the posture determination unit 104 described later.
  • The face position is represented by two-dimensional coordinates on the captured image. For example, the face detection unit 102 sets a frame in contact with the contour of the driver's head in the captured image, and takes the coordinates of the center point of the set frame as the driver's face position. Details of the processing of the face detection unit 102 will be described later.
  • The face position is not limited to this; for example, the face detection unit 102 may detect both eyes by analyzing the captured image and set the center of both eyes as the face position.
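  The two face-position conventions just described (the center of a frame bounding the head, or the midpoint of both eyes) can be written as follows; the (x, y, width, height) frame layout is an assumption made for this sketch.

```python
def face_position_from_frame(frame):
    """Center of a head-bounding frame given as (x, y, width, height)."""
    x, y, w, h = frame
    return (x + w / 2.0, y + h / 2.0)

def face_position_from_eyes(left_eye, right_eye):
    """Alternative: midpoint of the detected eye centers."""
    return ((left_eye[0] + right_eye[0]) / 2.0,
            (left_eye[1] + right_eye[1]) / 2.0)
```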
  • The reference point calculation unit 103 calculates the face position in the posture when the driver is driving in a normal state (hereinafter, the reference point) based on the face position acquired from the face detection unit 102.
  • The reference point calculation unit 103 outputs the calculated reference point to the posture determination unit 104. Details of the processing of the reference point calculation unit 103 will be described later.
  • The posture determination unit 104 calculates the movement amount and movement direction of the driver's head from the reference point, based on the reference point and the driver's face position acquired by the face detection unit 102.
  • When the calculated movement amount is equal to or greater than a preset first posture-collapse determination amount in the calculated movement direction, the posture determination unit 104 determines that the driver's posture has collapsed.
  • The posture determination unit 104 determines that the driver's posture has not collapsed when the calculated movement amount is less than the first posture-collapse determination amount in the calculated movement direction.
  • The posture determination unit 104 outputs the determination result to the image comparison unit 105.
  • The posture determination unit 104 calculates the distance between the reference point calculated by the reference point calculation unit 103 and the driver's face position acquired by the face detection unit 102. In Embodiment 1, the calculated distance is taken as the movement amount of the driver's head.
  • The posture determination unit 104 also divides the captured image into a plurality of regions around the reference point calculated by the reference point calculation unit 103, and determines to which region the driver's face position acquired by the face detection unit 102 belongs. In Embodiment 1, the region given by this determination is taken as the movement direction of the head.
  • Alternatively, the posture determination unit 104 may use, as the movement direction of the head, the angle formed between the X axis with the reference point as the origin and the straight line connecting the reference point and the driver's face position. Details of the processing of the posture determination unit 104 will be described later.
  • When the posture determination unit 104 determines that the driver's posture has collapsed, the image comparison unit 105 compares the image of a first region in the captured image used for that determination (hereinafter, the captured image at the time of posture determination) with the image of the first region in a comparison image captured in advance, and re-determines whether the driver's posture has collapsed.
  • The comparison image is a captured image captured while the driver in a normal state is seated in the driver's seat, or a captured image of the driver's seat with no driver seated.
  • The first region is a region determined in consideration of the front-rear adjustment range of the driver's seat, the adjustment range of the seat-back inclination, and the vertical adjustment range of the headrest of the driver's seat, and is a region that includes the entire headrest of the driver's seat.
  • The first region may also be determined in consideration of the average seat position of the driver's seat, the average sitting height of drivers, or the range imaged when a typical driver performs a normal driving operation.
  • The first region is a region set in advance for each vehicle type based on the position of the driver's seat and the position of the imaging device.
  • When the comparison image is a captured image of the driver seated in a normal state, the image comparison unit 105 calculates the degree of coincidence between the image of the first region in the captured image at the time of posture determination and the image of the first region in the comparison image.
  • When the comparison image is a captured image of the driver's seat with no driver seated, the image comparison unit 105 calculates the degree of divergence between the image of the first region in the captured image at the time of posture determination and the image of the first region in the comparison image. The image comparison unit 105 re-determines whether the driver's posture has collapsed based on the calculated degree of coincidence or divergence.
  • The image comparison unit 105 determines that the driver's posture has not collapsed when the degree of coincidence or divergence is equal to or greater than a threshold; that is, it determines that the posture determination unit 104 misjudged the posture based on a face position that the face detection unit 102 erroneously detected as the driver's face. On the other hand, when the degree of coincidence or divergence is less than the threshold, the image comparison unit 105 determines that the driver's posture has collapsed, that is, that the determination result of the posture determination unit 104 is correct. The image comparison unit 105 outputs the re-determination result to the state determination unit 107. Details of the degree of coincidence, the degree of divergence, and their calculation by the image comparison unit 105 will be described later.
  • FIG. 2 is a diagram illustrating an example of the first region used by the image comparison unit 105 of the state determination device 100 according to Embodiment 1.
  • FIG. 2A shows an example of the first region in a captured image of the driver's seat taken from the front.
  • FIG. 2B shows an example of the first region in a captured image of the driver's seat taken from diagonally forward.
  • The first regions Ea and Eb shown in FIGS. 2A and 2B are regions that include the entire headrest H of the driver's seat.
  • The first regions Ea and Eb are regions that take into consideration the seat position of the driver's seat, the sitting height of the driver X, and normal driving operation.
  • The first regions Ea and Eb are regions that include the entire headrest H of the driver's seat regardless of whether the seat position of the driver's seat is at the foremost position or the rearmost position.
  • The comparison image storage unit 106 stores the comparison image that the image comparison unit 105 refers to when comparing captured images.
  • The comparison image is a captured image obtained by capturing in advance the same region as the first region in the captured image at the time of posture determination.
  • The comparison image is a captured image captured while the driver in a normal state is seated in the driver's seat, or a captured image of the driver's seat with no driver seated.
  • The captured image in which the driver is seated in the driver's seat in a normal state is, for example, an image captured when calculating the reference point described later.
  • The comparison image storage unit 106 stores a comparison image for each driver.
  • The state determination unit 107 determines whether the driver is in an inoperable state based on the determination result of the image comparison unit 105. The state determination unit 107 determines that the driver is in an inoperable state when the posture-collapse state continues for a certain time or longer. More specifically, when the posture-collapse state continues for the time after which the driver is judged to require an alert (hereinafter, the alert determination time), the state determination unit 107 determines that the driver needs to be alerted (hereinafter, the alert state). When the alert state further continues for the time after which the driver is judged to be unable to drive (hereinafter, the inoperable determination time), the state determination unit 107 determines that the driver is in an inoperable state. When it determines that the driver is in the alert state, or that the driver is in an inoperable state, the state determination unit 107 outputs the determination result to the external warning device 400 or the vehicle control device 500.
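  The timing behavior described above (a posture-collapse state persisting for the alert determination time triggers the alert state, and persisting further for the inoperable determination time yields the inoperable determination) can be sketched as a small state machine. The class name and the concrete durations are assumptions; the patent does not specify numeric values.

```python
class StateDeterminationUnit:
    """Sketch of the timing logic of the state determination unit 107.

    ALERT_TIME and INOPERABLE_TIME are placeholder values chosen for
    illustration only.
    """
    ALERT_TIME = 3.0        # seconds of continuous posture collapse -> alert
    INOPERABLE_TIME = 10.0  # further seconds in the alert state -> inoperable

    def __init__(self):
        self.collapse_start = None  # time when the collapse state began

    def update(self, collapsed, now):
        """Return 'normal', 'alert', or 'inoperable' for time `now` (seconds)."""
        if not collapsed:
            self.collapse_start = None  # posture recovered, reset the timer
            return "normal"
        if self.collapse_start is None:
            self.collapse_start = now
        elapsed = now - self.collapse_start
        if elapsed >= self.ALERT_TIME + self.INOPERABLE_TIME:
            return "inoperable"
        if elapsed >= self.ALERT_TIME:
            return "alert"
        return "normal"
```

  A caller would forward the "alert" result to the warning device 400 and the "inoperable" result to the warning device 400 or the vehicle control device 500.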
  • FIGS. 3A and 3B are diagrams illustrating hardware configuration examples of the state determination device 100.
  • The functions of the image acquisition unit 101, the face detection unit 102, the reference point calculation unit 103, the posture determination unit 104, the image comparison unit 105, and the state determination unit 107 in the state determination device 100 are realized by a processing circuit. That is, the state determination device 100 includes a processing circuit for realizing the above functions.
  • The processing circuit may be a processing circuit 100a that is dedicated hardware as shown in FIG. 3A, or a processor 100b that executes a program stored in a memory 100c as shown in FIG. 3B.
  • When the processing circuit is dedicated hardware, the processing circuit 100a corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The functions of the image acquisition unit 101, the face detection unit 102, the reference point calculation unit 103, the posture determination unit 104, the image comparison unit 105, and the state determination unit 107 may each be realized by an individual processing circuit, or the functions of the units may be realized collectively by a single processing circuit.
  • When the processing circuit is the processor 100b shown in FIG. 3B, the function of each unit is realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is described as a program and stored in the memory 100c.
  • The processor 100b reads out and executes the program stored in the memory 100c, thereby realizing the functions of the image acquisition unit 101, the face detection unit 102, the reference point calculation unit 103, the posture determination unit 104, the image comparison unit 105, and the state determination unit 107.
  • That is, the state determination device 100 includes the memory 100c for storing programs by which each step shown in the flowcharts described later is executed as a result when run by the processor 100b.
  • It can also be said that these programs cause a computer to execute the procedures or methods of the image acquisition unit 101, the face detection unit 102, the reference point calculation unit 103, the posture determination unit 104, the image comparison unit 105, and the state determination unit 107.
  • The processor 100b is, for example, a CPU (Central Processing Unit), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • The memory 100c may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or may be a magnetic disk such as a hard disk or a flexible disk, or an optical disc such as a MiniDisc, a CD (Compact Disc), or a DVD (Digital Versatile Disc).
  • As described above, the processing circuit in the state determination device 100 can realize the above-described functions by hardware, software, firmware, or a combination thereof.
  • FIG. 4 is a diagram illustrating the processing of the face detection unit 102 and the reference point calculation unit 103 of the state determination device 100 according to Embodiment 1.
  • The face detection unit 102 acquires from the vehicle information recognition device 300 the vehicle speed at the time the captured image A was captured, and determines whether the acquired vehicle speed is equal to or greater than a preset threshold, for example 30 km/h.
  • The face detection unit 102 detects the face region B of the driver X from the captured image A when the vehicle speed is determined to be, for example, 30 km/h or higher.
  • The face detection unit 102 sets a frame C in contact with the contour of the head above the neck of the driver X in the face region B detected from the captured image A, and acquires the center point P of the set frame C.
  • The face detection unit 102 calculates the two-dimensional coordinates (px, py) of the acquired center point P on the captured image A, and acquires them as the face position of the driver X.
  • The face detection unit 102 outputs the calculated two-dimensional coordinates (px, py) of the center point P to the reference point calculation unit 103.
  • The reference point calculation unit 103 records the two-dimensional coordinates (px, py) of the center point P calculated by the face detection unit 102 in a recording area (not shown).
  • When the number of recorded coordinates reaches a preset number of samples, the reference point calculation unit 103 calculates the reference point by averaging the recorded two-dimensional coordinates.
  • The preset number of samples is set, for example, based on the number of coordinates obtained over an arbitrary fixed time. Specifically, when the imaging speed of the camera 200 is 10 fps, 30 samples, corresponding to a fixed time of 3 seconds, are set as the number of samples.
  • The reference point calculation unit 103 outputs the calculated point to the posture determination unit 104 as the reference point of the driver X.
  • The reference point of the driver X is held in the posture determination unit 104 until driving by the driver X ends.
  • When the reference point calculation unit 103 determines, by referring to information input from the vehicle information recognition device 300 such as vehicle speed information and brake information, that driving by the driver X has ended, it resets the information recorded in the recording area and the reference point set in the posture determination unit 104.
  • The face detection unit 102 does not calculate the two-dimensional coordinates of the center point when the vehicle speed is less than the preset threshold or when the driver's face region is not detected from the captured image. Further, the reference point calculation unit 103 does not calculate the reference point when the number of recorded two-dimensional coordinates of the center point is less than the preset number of samples.
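  The reference-point calculation described above (record face positions only while the vehicle speed is at or above the threshold, then average a fixed number of samples) might be sketched as follows; the function signature and data layout are assumptions made for illustration.

```python
def update_reference_point(samples, face_pos, vehicle_speed,
                           speed_threshold=30.0, num_samples=30):
    """Sketch of the reference point calculation unit 103.

    Face positions are recorded only while the vehicle speed is at or
    above the threshold (e.g. 30 km/h). Once `num_samples` coordinates
    have accumulated (e.g. 30 samples = 3 s at 10 fps), the reference
    point is their average. Returns the reference point, or None while
    there are not yet enough samples.
    """
    if vehicle_speed >= speed_threshold and face_pos is not None:
        samples.append(face_pos)
    if len(samples) < num_samples:
        return None  # not enough samples yet
    xs = [p[0] for p in samples]
    ys = [p[1] for p in samples]
    return (sum(xs) / len(samples), sum(ys) / len(samples))
```

  The caller would clear `samples` and the held reference point when driving by the driver ends, mirroring the reset behavior described above.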
  • The face detection unit 102 monitors the posture determination unit 104 and, when a reference point has already been set, acquires the face position for the posture determination process.
  • The face detection unit 102 acquires the face position for the posture determination process regardless of the vehicle speed.
  • The face detection unit 102 detects the face region B of the driver X from the captured image A acquired by the image acquisition unit 101 as shown in FIG. 4, sets the frame C in contact with the contour of the head included in the detected face region B, and acquires the center point P of the frame C.
  • The face detection unit 102 acquires the two-dimensional coordinates (px, py) of the center point P on the captured image A as the face position Pa of the driver X.
  • The face detection unit 102 outputs the acquired face position Pa of the driver to the posture determination unit 104.
  • The posture determination unit 104 holds the pre-calculated reference point Q, and calculates the movement direction and movement amount of the face position Pa of the driver X relative to the reference point Q.
  • The posture determination unit 104 determines whether the driver's posture has collapsed by determining whether the calculated movement amount of the face position Pa is equal to or greater than the first posture-collapse determination amount set in advance for the calculated movement direction of the face position Pa.
  • FIG. 5 is a diagram illustrating an example of the setting of the movement direction of the driver's head by the state determination device 100 according to Embodiment 1, and the first posture-collapse determination amount set for each movement direction.
  • In the example of FIG. 5, four areas, Area1, Area2, Area3, and Area4, are set around the reference point Q. Each area indicates a movement direction of the face position Pa.
  • In the example of FIG. 5, the posture determination unit 104 identifies the movement direction of the face position Pa as Area2.
  • A first posture-collapse determination amount is set for each area: m_thr1 for Area1, m_thr2 for Area2, m_thr3 for Area3, and m_thr4 for Area4. Since the face position Pa is located in Area2, the first posture-collapse determination amount applied to the face position Pa is m_thr2.
  • The posture determination unit 104 calculates the distance m between the face position Pa and the reference point Q, and takes it as the movement amount m of the face position Pa.
  • The posture determination unit 104 compares the calculated movement amount m of the face position Pa with the first posture-collapse determination amount m_thr2, and determines that the posture of the driver X has collapsed when the movement amount m is equal to or greater than m_thr2. On the other hand, when the movement amount m is less than m_thr2, the posture determination unit 104 determines that the posture of the driver X has not collapsed.
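  A minimal sketch of the determination just described, assuming the four areas are the quadrants around the reference point Q (the actual division in FIG. 5 may differ) and taking the per-area thresholds m_thr1 to m_thr4 as calibration inputs rather than fixed values:

```python
import math

def movement_area(reference_point, face_pos):
    """Classify the movement direction into Area1..Area4 (here: the
    quadrants around the reference point, an assumption for this sketch)."""
    dx = face_pos[0] - reference_point[0]
    dy = face_pos[1] - reference_point[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    return int(angle // 90) + 1  # 1..4

def posture_collapse(reference_point, face_pos, thresholds):
    """Compare the movement amount m with the first posture-collapse
    determination amount for the identified area.

    `thresholds` maps area number -> m_thr; the numeric values are
    per-direction calibration parameters, not specified in the patent.
    """
    area = movement_area(reference_point, face_pos)
    m = math.dist(reference_point, face_pos)  # movement amount m
    return m >= thresholds[area], area, m
```

  Using different thresholds per direction lets, for example, a sideways slump trigger at a smaller displacement than a forward lean.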
  • FIGS. 6 and 7 are diagrams illustrating the processing of the image comparison unit 105 of the state determination device 100 according to Embodiment 1.
  • FIG. 6 shows a case where the comparison is based on the degree of coincidence between the image of the first region in the captured image at the time of posture determination and the image of the first region in the comparison image.
  • FIG. 6A shows the first region Da of a comparison image captured while the driver X is seated in the driver's seat in a normal state.
  • FIGS. 6B and 6C show examples of the captured image at the time of posture determination and of the first region in that captured image.
  • The image of the first region Da of the comparison image in FIG. 6A and the images of the first regions Ea and Eb of the captured image A at the time of posture determination in FIGS. 6B and 6C are images of the same region.
  • the image comparison unit 105 calculates the degree of coincidence between the two images by comparing the image of the first region Ea of the captured image A with the image of the first region Da of the comparison image, or the image of the first region Eb of the captured image A with the image of the first region Da of the comparison image.
  • the image comparison unit 105 calculates the degree of coincidence between the image of the first area Da of the comparison image and the images of the first areas Ea and Eb of the captured image A based on the degree of coincidence of the areas in which the headrest is imaged, that is, the degree of coincidence between the area Ha and the area Hb, or between the area Ha and the area Hc.
  • the image comparison unit 105 calculates the degree of coincidence of the area in which the headrest is imaged using at least one of the number of pixels in which the headrest is imaged, the length of the visible contour of the headrest, and the brightness of the image in the first area.
  • when the image comparison unit 105 compares the image of the first area Da of the comparison image of FIG. 6A with the image of the first area Ea of the captured image A of FIG. 6B, it calculates the degree of coincidence between the area Ha in which the headrest is imaged in the image of the first area Da and the area Hb in which the headrest is imaged in the image of the first area Ea. In this case, the image comparison unit 105 calculates a high degree of coincidence between the area Ha and the area Hb.
  • when the image comparison unit 105 compares the image of the first area Da of the comparison image of FIG. 6A with the image of the first area Eb of FIG. 6C, it calculates the degree of coincidence between the area Ha in which the headrest is imaged in the image of the first area Da and the area Hc in which the headrest is imaged in the image of the first area Eb of the captured image A. In this case, the image comparison unit 105 calculates a low degree of coincidence between the area Ha and the area Hc.
  • the image comparison unit 105 determines that the driver's posture has not collapsed when the calculated degree of coincidence is equal to or greater than a preset threshold value. For example, in the comparison between the image of the first area Da of the comparison image of FIG. 6A and the image of the first area Ea of the captured image A of FIG. 6B, the image comparison unit 105 determines that the posture of the driver X has not collapsed because the degree of coincidence between the area Ha and the area Hb is high. On the other hand, the image comparison unit 105 determines that the driver's posture has collapsed when the calculated degree of coincidence is less than the preset threshold value. For example, in the comparison between the first area Da of the comparison image of FIG. 6A and the image of the first area Eb of the captured image A of FIG. 6C, the image comparison unit 105 determines that the posture of the driver X has collapsed because the degree of coincidence between the area Ha and the area Hc is low.
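The coincidence-based re-determination can be sketched as follows. Assumptions: the headrest area of each first region is available as a binary mask, the ratio-of-pixel-counts metric and the 0.8 threshold are illustrative choices (the text only requires at least one of pixel count, contour length, and brightness).

```python
def headrest_pixel_count(mask):
    """Number of pixels in which the headrest is imaged (mask: 2-D list of 0/1)."""
    return sum(sum(row) for row in mask)

def coincidence_degree(region_mask, comparison_mask):
    """Degree of coincidence between the headrest area of the captured
    image's first region and that of the comparison image, here the ratio
    of the smaller headrest pixel count to the larger (1.0 = identical)."""
    a = headrest_pixel_count(region_mask)
    b = headrest_pixel_count(comparison_mask)
    if max(a, b) == 0:
        return 1.0
    return min(a, b) / max(a, b)

def posture_not_collapsed(region_mask, comparison_mask, threshold=0.8):
    """Re-determination: the posture has not collapsed when the degree of
    coincidence is equal to or greater than the preset threshold."""
    return coincidence_degree(region_mask, comparison_mask) >= threshold
```

Intuitively: if the driver is still upright, the head hides the headrest in the captured image just as it does in the normal-state comparison image, so the two headrest areas coincide; if the posture has collapsed, much more of the headrest becomes visible and the coincidence drops.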
  • FIG. 7 shows a case where determination is made based on the degree of divergence between the image of the first region in the captured image at the time of posture determination and the image of the first region in the comparison image.
  • FIG. 7A shows the first region of a comparison image stored in the comparison image storage unit 106.
  • FIG. 7B and FIG. 7C show an example of a captured image at the time of posture determination and an image of the first region in the captured image.
  • FIG. 7A shows a first region Db of a comparative image obtained by imaging the seat of the driver's seat where the driver is not seated.
  • the image of the first region Db of the comparison image in FIG. 7A and the images of the first regions Fa and Fb of the captured image A at the time of posture determination in FIG. 7B or 7C are images obtained by capturing the same region.
  • the image comparison unit 105 calculates the degree of divergence between the two images by comparing the image of the first region Fa of the captured image A with the image of the first region Db of the comparison image, or the image of the first region Fb of the captured image A with the image of the first region Db of the comparison image.
  • the image comparison unit 105 calculates the degree of divergence between the image of the first area Db of the comparison image and the images of the first areas Fa and Fb of the captured image A based on the degree of divergence of the areas in which the headrest is imaged, that is, the degree of divergence between the area Hc and the area Hd, or between the area Hc and the area He.
  • the image comparison unit 105 calculates the degree of divergence of the area in which the headrest is imaged using at least one of the number of pixels in which the headrest is imaged, the length of the visible contour of the headrest, and the brightness of the image in the first area.
  • when the image comparison unit 105 compares the image of the first area Db of the comparison image of FIG. 7A with the image of the first area Fa of the captured image A of FIG. 7B, it calculates the degree of divergence between the area Hc in which the headrest is imaged in the image of the first area Db and the area Hd in which the headrest is imaged in the image of the first area Fa. In this case, the image comparison unit 105 calculates a high degree of divergence between the area Hc and the area Hd.
  • when the image comparison unit 105 compares the image of the first region Db of the comparison image of FIG. 7A with the image of the first region Fb of the captured image A of FIG. 7C, it calculates the degree of divergence between the area Hc in which the headrest is imaged in the image of the first region Db and the area He in which the headrest is imaged in the image of the first region Fb. In this case, the image comparison unit 105 calculates a low degree of divergence between the region Hc and the region He.
  • the image comparison unit 105 determines that the driver's posture has not collapsed when the calculated degree of divergence is equal to or greater than a preset threshold value. For example, in the comparison between the image of the first region Db of the comparison image of FIG. 7A and the image of the first region Fa of the captured image A of FIG. 7B, the image comparison unit 105 determines that the posture of the driver X has not collapsed because the degree of divergence between the region Hc and the region Hd is high. On the other hand, the image comparison unit 105 determines that the driver's posture has collapsed when the calculated degree of divergence is less than the preset threshold. For example, in the comparison between the first area Db of the comparison image of FIG. 7A and the image of the first area Fb of the captured image A of FIG. 7C, the image comparison unit 105 determines that the posture of the driver X has collapsed because the degree of divergence between the area Hc and the area He is low.
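The divergence variant, where the comparison image shows the unoccupied seat, can be sketched symmetrically. Assumptions as before: binary headrest masks, and an illustrative occluded-fraction metric with a placeholder 0.5 threshold.

```python
def headrest_pixels(mask):
    """Number of pixels in which the headrest is imaged (mask: 2-D list of 0/1)."""
    return sum(sum(row) for row in mask)

def divergence_degree(region_mask, empty_seat_mask):
    """Degree of divergence from the comparison image of the unoccupied
    seat: fraction of the empty-seat headrest pixels that are hidden in
    the captured image (1.0 = headrest fully occluded by the driver)."""
    empty = headrest_pixels(empty_seat_mask)
    if empty == 0:
        return 0.0
    return 1.0 - headrest_pixels(region_mask) / empty

def divergence_not_collapsed(region_mask, empty_seat_mask, threshold=0.5):
    """The posture has not collapsed when the divergence is at or above
    the threshold, i.e. the driver's head still hides the headrest."""
    return divergence_degree(region_mask, empty_seat_mask) >= threshold
```

Here the logic is inverted relative to the coincidence case: an upright driver makes the captured image diverge strongly from the empty-seat image, while a collapsed driver exposes the headrest and the divergence falls.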
  • FIG. 8 is a flowchart showing the operation of the reference point calculation process of the state determination apparatus 100 according to the first embodiment.
  • the face detection unit 102 determines whether the vehicle speed at the time the captured image was captured is equal to or higher than a preset threshold (step ST2). If the vehicle speed is less than the preset threshold value (step ST2; NO), the process ends.
  • in step ST2, when the vehicle speed is equal to or higher than the preset threshold (step ST2; YES), the face detection unit 102 determines whether or not a face region can be detected from the captured image input in step ST1 (step ST3). If the face region cannot be detected (step ST3; NO), the process is terminated. On the other hand, when the face region can be detected (step ST3; YES), the face detection unit 102 detects the face region from the captured image input in step ST1, acquires the face position from the detected face region, and outputs it to the reference point calculation unit 103 (step ST4).
  • the reference point calculation unit 103 records the face position acquired in step ST4 in a buffer or the like (step ST5).
  • the reference point calculation unit 103 determines whether or not the number of recorded face positions is equal to or greater than a preset threshold value (step ST6). If the number of recorded face positions is less than the preset threshold value (step ST6; NO), the process ends. On the other hand, if the number of recorded face positions is equal to or greater than the preset threshold value (step ST6; YES), a reference point is calculated from the recorded face positions (step ST7).
  • the reference point calculation unit 103 outputs the reference point calculated in step ST7 to the posture determination unit 104 (step ST8), and ends the process.
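The buffering logic of steps ST5 through ST8 can be sketched as follows. The sample-count threshold and the use of the mean as the reference point are assumptions; the text specifies only that a reference point is calculated once enough face positions have been recorded.

```python
FACE_POSITION_THRESHOLD = 5  # assumed number of buffered samples (not from the text)

class ReferencePointCalculator:
    """Records face positions while the vehicle is moving (step ST5) and,
    once enough samples are buffered (step ST6), emits a reference point
    (step ST7) — here simply the mean of the recorded positions."""
    def __init__(self):
        self.buffer = []

    def record(self, face_pos):
        """Record one detected face position (x, y) in the buffer."""
        self.buffer.append(face_pos)

    def reference_point(self):
        """Return the reference point Q, or None while too few samples exist."""
        if len(self.buffer) < FACE_POSITION_THRESHOLD:
            return None
        n = len(self.buffer)
        xs = [p[0] for p in self.buffer]
        ys = [p[1] for p in self.buffer]
        return (sum(xs) / n, sum(ys) / n)
```

Averaging over many frames captured while driving makes the reference point robust to single-frame detection noise, which matches the flowchart's insistence on a minimum number of recorded positions before step ST7 runs.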
  • FIG. 9 is a flowchart showing the operation of the state determination process of the state determination device 100 according to the first embodiment.
  • the image acquisition unit 101 always acquires a captured image and outputs it to the face detection unit 102 and the image comparison unit 105.
  • the face detection unit 102 determines whether the posture determination unit 104 has determined that the driver's posture has collapsed (step ST22).
  • if it has been determined that the driver's posture has collapsed (step ST22; YES), the flowchart proceeds to the process of step ST27.
  • otherwise, in step ST23, the face detection unit 102 determines whether or not a reference point is set in the posture determination unit 104.
  • the determination processing in step ST23 is performed by the face detection unit 102 monitoring the posture determination unit 104.
  • the face detection unit 102 acquires the driver's face position from the captured image input in step ST21 and outputs it to the posture determination unit 104 (step ST24).
  • the posture determination unit 104 compares the reference point set in advance by the reference point calculation unit 103 with the driver's face position acquired in step ST24, and calculates the movement direction and movement amount of the driver's face position (step ST25). The posture determination unit 104 determines whether or not the driver's posture has collapsed based on the movement direction and movement amount of the driver's face position calculated in step ST25 (step ST26). If the driver's posture has not collapsed (step ST26; NO), the process is terminated.
  • the image comparison unit 105 acquires an image of the first region in the captured image at the time of the posture determination acquired by the image acquisition unit 101 (step ST27).
  • the image comparison unit 105 compares the image of the first region of the captured image at the time of posture determination acquired in step ST27 with the image of the first region of the comparison image accumulated in the comparison image accumulation unit 106, and determines whether or not the degree of coincidence or the degree of divergence is equal to or greater than a preset threshold value (step ST28).
  • if the degree of coincidence or the degree of divergence is equal to or greater than the preset threshold value (step ST28; YES), the image comparison unit 105 determines that the posture determination unit 104 has made an erroneous determination. Based on the determination result of the image comparison unit 105, the state determination unit 107 cancels the determination by the posture determination unit 104 that the driver's posture has collapsed, resets the counter (step ST29), and ends the process.
  • on the other hand, when it is determined in step ST28 that the degree of coincidence or the degree of divergence is not equal to or greater than the preset threshold value (step ST28; NO), the flowchart proceeds to the process of step ST30.
  • the state determination unit 107 determines whether or not a counter (not shown) has been activated (step ST30). If the counter has been activated (step ST30; YES), the process proceeds to step ST32. On the other hand, when the counter is not activated (step ST30; NO), the state determination unit 107 starts counting the counter (step ST31), and counts up the counter (step ST32).
  • the state determination unit 107 refers to the count value of the counter and determines whether or not an alert determination time (for example, 3 seconds) has elapsed (step ST33). If the alert determination time has not elapsed (step ST33; NO), the process ends.
  • on the other hand, if the alert determination time has elapsed (step ST33; YES), the state determination unit 107 further refers to the count value of the counter and determines whether or not the driving impossibility determination time (for example, 10 seconds) has elapsed (step ST34).
  • if the driving impossibility determination time has not elapsed (step ST34; NO), the state determination unit 107 determines that the driver needs to be alerted (step ST35). The state determination unit 107 outputs the determination result to the external warning device 400 or the vehicle control device 500, and ends the process.
  • in step ST34, when the driving impossibility determination time has elapsed (step ST34; YES), the state determination unit 107 determines that the driver is in an inoperable state (step ST36). The state determination unit 107 outputs the determination result to the external warning device 400 or the vehicle control device 500, and ends the process.
  • in step ST35 of the flowchart of FIG. 9, if the state determination unit 107 determines that the driver needs to be alerted, the subsequent processing frequency of the image comparison unit 105 may be reduced. When the state determination unit 107 determines that the driver needs to be alerted, it outputs the determination result to the image comparison unit 105. The image comparison unit 105 reduces the frequency of the process of comparing the image of the first region with the comparison image based on the output determination result. Similarly, when the state determination unit 107 outputs a determination result indicating that the driver is unable to drive to the image comparison unit 105, the image comparison unit 105 may set the frequency of the process of comparing the image of the first region with the comparison image even lower. Thereby, the processing load of the image comparison process can be reduced.
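The counter logic of steps ST29 through ST36 can be sketched as a small state machine. The 3-second and 10-second times come from the flowchart; the per-cycle `dt` interface and the state names are assumptions made for illustration.

```python
ALERT_TIME = 3.0        # alert determination time (seconds), from step ST33
INOPERABLE_TIME = 10.0  # driving impossibility determination time, from step ST34

class StateDeterminer:
    """Counter-based escalation: while the posture-collapse determination
    keeps being confirmed, the counter counts up; a re-determination miss
    (erroneous determination) cancels it and resets the counter."""
    def __init__(self):
        self.elapsed = None  # None = counter not activated (step ST30)

    def reset(self):
        self.elapsed = None  # step ST29

    def update(self, collapse_confirmed, dt):
        """Feed one processing cycle of length dt seconds; returns the
        current state: 'normal', 'alert', or 'inoperable'."""
        if not collapse_confirmed:
            self.reset()  # cancel the determination and reset the counter
            return "normal"
        self.elapsed = (self.elapsed or 0.0) + dt  # steps ST31/ST32
        if self.elapsed >= INOPERABLE_TIME:
            return "inoperable"  # step ST36
        if self.elapsed >= ALERT_TIME:
            return "alert"       # step ST35
        return "normal"
```

The two thresholds give a graded response: a warning after a short sustained collapse, and a driving-impossible determination (handed to the warning device 400 or vehicle control device 500) only after the collapse persists much longer.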
  • in the above description, the posture determination unit 104 calculates the movement direction of the driver's face position and the movement amount of the face position relative to the reference point, and determines whether or not the driver's posture has collapsed by comparing the calculated movement amount of the face position with the first posture collapse determination amount set for the movement direction of the face position.
  • the posture determination unit 104 may be configured to determine whether or not the driver's posture is broken in consideration of the driver's face orientation in addition to the movement direction and movement amount of the driver's face position.
  • the driver's face orientation is calculated based on the amount of displacement of the driver's face in the left-right direction over a certain time and the amount of displacement of the driver's face in the up-down direction over a certain time.
  • FIG. 10 is a diagram illustrating calculation of the amount of displacement of the driver's face by the posture determination unit 104 of the state determination device 100 according to the first embodiment.
  • FIG. 10A is a diagram defining left and right directions and up and down directions of the driver's face.
  • the Yaw direction is the left-right direction facing the driver X
  • the Pitch direction is the up-down direction facing the driver X.
  • the posture determination unit 104 calculates a displacement amount in the Yaw direction and the Pitch direction with respect to the driver's front view from the captured image acquired by the image acquisition unit 101.
  • FIG. 10B shows a case where the posture determination unit 104 calculates the displacement “Y d ” in the Yaw direction as time elapses.
  • FIG. 10C shows a case where the posture determination unit 104 calculates the displacement “P d ” in the pitch direction with the passage of time.
  • in addition to comparing the movement amount of the face position with the first posture collapse determination amount, the posture determination unit 104 compares the displacement amount "Y d" calculated in FIG. 10B or the displacement amount "P d" calculated in FIG. 10C with a second posture collapse determination amount, which is a preset threshold value for the displacement amount of the face direction, to determine whether or not the driver's posture has collapsed.
  • the second posture collapse determination amount is configured by at least one of a face direction yaw direction determination amount and a face direction pitch direction determination amount.
  • FIG. 11 is a diagram illustrating an example of setting of the movement direction of the driver's head by the state determination device 100 according to the first embodiment and a second posture collapse determination amount set in each movement direction.
  • a second posture collapse determination amount is set in addition to the first posture collapse determination amount.
  • the first posture collapse determination amount in Area1 is m thr1
  • the face direction Yaw direction determination amount (second posture collapse determination amount) is y thr1
  • the face direction Pitch direction determination amount (second posture collapse determination amount) is set to P thr1.
  • the first posture collapse determination amount of the movement amount of the face position in Area3 is set to m thr3
  • the Pitch direction determination amount of the face direction is set to P thr3 .
  • the posture determination unit 104 calculates a distance m between the two points of the face position Pa and the reference point Q and sets it as the movement amount m of the face position Pa. Further, the posture determining unit 104 calculates the displacement amount Y d and the displacement amount P d of the driver's face direction at the face position Pa.
  • when the calculated movement amount m of the face position Pa is equal to or greater than the first posture collapse determination amount m thr2, the calculated face direction displacement amount Y d is equal to or greater than the Yaw direction determination amount Y thr2, and the displacement amount P d is equal to or greater than the Pitch direction determination amount P thr2, the posture determination unit 104 determines that the driver's posture has collapsed.
  • on the other hand, when the movement amount m is less than the first posture collapse determination amount m thr2, the face direction displacement amount Y d is less than the Yaw direction determination amount Y thr2, or the displacement amount P d is less than the Pitch direction determination amount P thr2, the posture determination unit 104 determines that the driver's posture has not collapsed.
  • the posture determination unit 104 determines whether or not the driver's posture has collapsed in consideration of the amount of displacement of the driver's face direction over a certain time, so the accuracy of determining the driver's driving state improves.
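Combining the first and second posture collapse determination amounts gives the following conjunctive check (a sketch; the parameter names mirror the symbols m, Y d, P d and their determination amounts from the text, and the numeric values in the usage example are placeholders):

```python
def posture_collapsed_with_face_direction(m, y_d, p_d, m_thr, y_thr, p_thr):
    """Collapse is determined only when the face-position movement amount m
    AND the Yaw displacement Y_d AND the Pitch displacement P_d each reach
    their respective determination amounts (m_thr, Y_thr, P_thr).
    If any one of them falls short, the posture is judged not collapsed."""
    return m >= m_thr and y_d >= y_thr and p_d >= p_thr
```

Requiring all three conditions is what suppresses false positives: a driver who merely shifts in the seat moves the face position but barely changes face direction, so the conjunction fails and no collapse is reported.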
  • as described above, according to the first embodiment, the state determination device includes: the image acquisition unit 101 that acquires a captured image of the vehicle interior including the driver; the face detection unit 102 that detects the driver's face position from the acquired captured image; the posture determination unit 104 that compares the detected face position with the driver's reference point calculated in advance and determines that the driver's posture has collapsed when it detects a movement of the face position equal to or greater than the first posture collapse determination amount set for the movement direction; the image comparison unit 105 that, when it is determined that the driver's posture has collapsed, compares the image of the first region in the acquired captured image with the image of the first region in the comparison image to re-determine whether the driver's posture has collapsed; and the state determination unit 107 that determines whether the driver is unable to drive based on the result of the re-determination. With this configuration, even if something other than the driver's head, such as the head of another occupant, the vehicle interior, or the scenery reflected in the captured image, is mistakenly detected as the driver's head, the driver can be prevented from being erroneously determined to be unable to drive. As a result, it is possible to accurately determine whether the driver is unable to drive.
  • when the posture determination unit 104 detects a displacement of the driver's face direction that is equal to or greater than the second posture collapse determination amount, it determines that the driver's posture has collapsed. With this configuration, the accuracy of determining the driver's state is improved.
  • when the state determination unit 107 determines that the driver needs to be alerted, the image comparison unit 105 reduces the processing frequency for comparing the image of the first region in the captured image with the image of the first region in the comparison image, so the processing load of the state determination device can be reduced.
  • FIG. 12 is a block diagram showing a configuration of state determination apparatus 100A according to the second embodiment.
  • the state determination device 100A according to the second embodiment is configured by adding an axis detection unit 108 to the state determination device 100 according to the first embodiment shown in FIG.
  • the same or corresponding parts as those of the state determination apparatus 100 according to the first embodiment are denoted by the same reference numerals as those used in the first embodiment, and the description thereof is omitted or simplified.
  • the axis detection unit 108 refers to the determination result of the posture determination unit 104, and when the determination result indicates that the driver's posture has collapsed, detects the central axis of the driver's head from the captured image at the time of posture determination acquired by the image acquisition unit 101.
  • the axis detector 108 calculates the inclination of the detected head central axis with respect to a preset axis.
  • the axis detection unit 108 determines whether or not the calculated inclination is within a preset threshold range.
  • the axis detection unit 108 cancels the determination that the posture of the driver is broken by the posture determination unit 104 when the calculated inclination is within a preset threshold range.
  • on the other hand, when the calculated inclination is outside the preset threshold range, the axis detection unit 108 maintains the determination by the posture determination unit 104 that the driver's posture has collapsed, and outputs the result to the image comparison unit 105.
  • FIG. 13 is a diagram illustrating detection of the head center axis by the axis detection unit 108 of the state determination device 100A according to the second embodiment.
  • FIG. 13A shows a case where the driver X is seated in the driver's seat in a normal state.
  • the lateral direction of the vehicle (not shown) in which the driver X is riding, horizontal to the road surface, is the x axis; the longitudinal direction of the vehicle, horizontal to the road surface, is the y axis; and the vertical direction of the vehicle, perpendicular to the x axis and the y axis, is the z axis.
  • FIG. 13B is a view of the head of the driver X as viewed from above.
  • the head center axis R is an axis passing through the center of a circle Xa obtained when the driver X's head is viewed from above.
  • the axis detection unit 108 detects the head center axis R based on the positions of both eyes, the position of the base of the nose, and the position of the top of the nose obtained from the captured image at the time of posture determination acquired by the image acquisition unit 101.
  • the axis detection unit 108 detects the inclination of the head center axis R with respect to the x axis based on the center position of both eyes and the position of the top of the nose.
  • the axis detection unit 108 detects the inclination of the head center axis R with respect to the y axis based on the distance between the top of the nose and the base of the nose.
  • FIG. 14 is a diagram illustrating determination of the inclination of the head center axis by the axis detection unit 108 of the state determination apparatus 100A according to the second embodiment.
  • 14A is a diagram illustrating the inclination of the head center axis R with respect to the x axis
  • FIG. 14B is a diagram illustrating the inclination of the head center axis R with respect to the y axis.
  • the axis detector 108 calculates an angle θ f-x formed by the head center axis R and the x axis.
  • the axis detector 108 calculates an angle θ f-y formed by the head center axis R and the y axis.
  • the axis detection unit 108 determines whether or not the calculated angle θ f-x and angle θ f-y are within the threshold range shown in FIG. 14C (the range from the angle θ thr1 to the angle θ thr2). If the angle θ f-x and the angle θ f-y are within the range from the angle θ thr1 to the angle θ thr2, the axis detection unit 108 cancels the determination by the posture determination unit 104 that the driver's posture has collapsed.
  • otherwise, the determination by the posture determination unit 104 is maintained.
  • the above-described threshold range is an example and can be set as appropriate. The threshold range of the angle θ f-x and the threshold range of the angle θ f-y may also be set separately.
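The tilt check of FIG. 14 can be sketched as follows. The numeric bounds of the threshold range [θ thr1, θ thr2] are placeholders, since the text leaves them configurable, and a single shared range is assumed even though separate per-axis ranges are equally possible.

```python
import math

ANGLE_THR1 = 70.0   # assumed lower bound θ_thr1 of the threshold range (degrees)
ANGLE_THR2 = 110.0  # assumed upper bound θ_thr2 (degrees)

def axis_angle_deg(axis_vec, ref_axis=(1.0, 0.0)):
    """Angle between the head center axis R (given as a 2-D direction
    vector) and a reference axis, in degrees."""
    dot = axis_vec[0] * ref_axis[0] + axis_vec[1] * ref_axis[1]
    norm = math.hypot(*axis_vec) * math.hypot(*ref_axis)
    return math.degrees(math.acos(dot / norm))

def cancel_collapse_determination(theta_fx, theta_fy):
    """Cancel the posture-collapse determination when both angles lie in
    [θ_thr1, θ_thr2]; otherwise the determination is maintained."""
    return (ANGLE_THR1 <= theta_fx <= ANGLE_THR2 and
            ANGLE_THR1 <= theta_fy <= ANGLE_THR2)
```

The rationale: an upright head axis stays roughly perpendicular to both horizontal axes, so angles near 90° indicate the driver is still sitting up even though the face position moved, and the earlier collapse determination is released as a false positive.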
  • the axis detection unit 108 in the state determination device 100A is implemented by the processing circuit 100a illustrated in FIG. 3A, or by the processor 100b executing a program stored in the memory 100c illustrated in FIG. 3B.
  • FIG. 15 is a flowchart showing the operation of the state determination process of the state determination device 100A according to the second embodiment.
  • the same steps as those of state determination apparatus 100 according to Embodiment 1 are denoted by the same reference numerals as those used in FIG. 9, and the description thereof is omitted or simplified.
  • the image acquisition unit 101 always acquires a captured image and outputs it to the face detection unit 102, the axis detection unit 108, and the image comparison unit 105.
  • the axis detection unit 108 detects the driver's head central axis from the captured image at the time of posture determination acquired by the image acquisition unit 101 (step ST41).
  • the axis detection unit 108 calculates the angle formed between the detected head center axis and the x axis, and the angle formed between the head center axis and the y axis (step ST42).
  • the axis detection unit 108 determines whether or not the calculated angle with the x axis and the angle with the y axis are within a preset threshold range (step ST43). If the calculated angle with the x-axis and the angle with the y-axis are within a preset threshold range (step ST43; YES), the axis detection unit 108 causes the posture determination unit 104 to lose the driver's posture. Is released (step ST44), and the process is terminated.
  • on the other hand, if the calculated angle with the x axis and the angle with the y axis are outside the preset threshold range (step ST43; NO), the axis detection unit 108 maintains the determination by the posture determination unit 104 that the driver's posture has collapsed, and notifies the image comparison unit 105 of that determination (step ST45).
  • the image comparison unit 105 acquires an image of the first region in the captured image at the time of posture determination acquired by the image acquisition unit 101 (step ST27). Thereafter, the flowchart performs the processing after step ST28.
  • as described above, according to the second embodiment, the state determination device includes the axis detection unit 108 that, when the posture determination unit 104 determines that the driver's posture has collapsed, detects the inclination of the central axis of the driver's head from the acquired captured image and cancels the determination that the driver's posture has collapsed when the detected inclination is within a preset threshold range. This makes it possible to prevent a driver who is driving in a posture different from the normal posture from being erroneously determined to be in a state requiring caution or unable to drive. Thereby, the accuracy of determining the driver's state is improved.
  • for example, erroneous determination of a state requiring caution or an inability to drive when the driver drives with the head leaning toward the front of the vehicle, or drives while peering toward the front of the vehicle, can be suppressed.
  • FIG. 16 is a block diagram showing a configuration of state determination apparatus 100B according to the third embodiment.
  • the state determination device 100B according to Embodiment 3 is configured by adding a contour detection unit 109 to the state determination device 100 according to Embodiment 1 shown in FIG.
  • the same or corresponding parts as those of the state determination apparatus 100 according to the first embodiment are denoted by the same reference numerals as those used in the first embodiment, and the description thereof is omitted or simplified.
  • the contour detection unit 109 refers to the determination result of the posture determination unit 104, and when the determination result indicates that the driver's posture has collapsed, acquires an image of the second region in the captured image at the time of posture determination acquired by the image acquisition unit 101.
  • the second area is an area including the entire headrest of the driver's seat. The second area is also set as a range that the bases of the driver's arms fall within when the driver is seated in the driver's seat in a normal state, and that both of the driver's elbows fall within when the driver spreads both arms.
  • FIG. 17 is a diagram illustrating an example of the second region acquired by the contour detection unit 109 of the state determination device 100B according to the third embodiment.
  • FIG. 17A shows an example of the second area set in the captured image obtained by imaging the driver's seat from the front
  • FIG. 17B shows an example of the second area acquired from the captured image obtained by imaging the driver's seat from diagonally forward.
  • the second areas Ec and Ed shown in FIGS. 17A and 17B are areas including the entire headrest H of the driver's seat.
  • the second regions Ec and Ed are regions in consideration of the position of the base Xb of the arm of the driver X and the positions of both elbows Xc when the driver X spreads both arms.
  • the second areas Ec and Ed are areas in which the seat position of the driver's seat, the seat height of the driver X, and normal driving operation are taken into consideration.
  • the second areas Ec and Ed are areas that include the entire headrest H of the driver's seat regardless of whether the seat position of the driver's seat is at the foremost position or the rearmost position.
  • the contour detection unit 109 performs edge detection on the acquired image of the second region and detects the contour of the driver.
  • the contour detection unit 109 determines that the driver is performing armrest driving when a predefined triangle shape exists in the contour around the driver's neck among the detected contours of the driver.
  • the contour detection unit 109 cancels the determination that the posture of the driver is broken by the posture determination unit 104 when it is determined that the driver is performing the armrest driving.
  • When no such shape is present in the contour around the driver's neck, the contour detection unit 109 maintains the posture determination unit 104's determination that the driver's posture has collapsed and outputs the determination to the image comparison unit 105.
  • For example, the contour detection unit 109 performs edge detection on the images of the second areas Ec and Ed shown in FIGS. 17A and 17B and, on detecting the contour of the driver X, determines that the triangular shapes Sa and Sb exist in the contour around the neck. The contour detection unit 109 therefore determines that the driver X is performing armrest driving in the captured images shown in FIGS. 17A and 17B.
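The triangular-shape test on a detected contour can be sketched as below. Here `approx_pts` stands for a polygonal approximation of a contour segment around the neck (for example, the output of a Douglas-Peucker simplification of the edge-detected contour); the three-vertex criterion and the minimum-area threshold are illustrative assumptions, not values given in the embodiment:

```python
def polygon_area(pts):
    """Area of a simple polygon via the shoelace formula."""
    n = len(pts)
    s = 0.0
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def looks_like_armrest_triangle(approx_pts, min_area=100.0):
    """True when a contour approximation has exactly 3 vertices and is large
    enough to be the elbow-to-neck triangle rather than edge noise."""
    return len(approx_pts) == 3 and polygon_area(approx_pts) >= min_area

print(looks_like_armrest_triangle([(0, 0), (40, 0), (20, 30)]))  # True
print(looks_like_armrest_triangle([(0, 0), (5, 0), (2, 3)]))     # False (too small)
```

Matching could equally be done by template comparison against the preset triangle; the vertex-count check above is simply the lightest-weight variant.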
  • The contour detection unit 109 in the state determination device 100B is implemented by the processing circuit 100a illustrated in FIG. 3A, or by the processor 100b executing a program stored in the memory 100c illustrated in FIG. 3B.
  • FIG. 18 is a flowchart showing the operation of the state determination process of the state determination device 100B according to the third embodiment.
  • In FIG. 18, the same steps as those of the state determination device 100 according to Embodiment 1 are denoted by the same reference numerals as those used in FIG. 9, and their description is omitted or simplified.
  • The image acquisition unit 101 constantly acquires captured images and outputs them to the face detection unit 102, the contour detection unit 109, and the image comparison unit 105.
  • When the posture determination unit 104 determines in step ST26 that the driver's posture has collapsed (step ST26; YES), the contour detection unit 109 acquires the image of the second region in the captured image at the time of posture determination acquired by the image acquisition unit 101 (step ST51). The contour detection unit 109 then detects the contour of the driver from the image of the second region acquired in step ST51 (step ST52).
  • The contour detection unit 109 determines whether any of the contours of the driver detected in step ST52 contains a shape that matches or resembles the triangular shape preset for the contour around the driver's neck (step ST53). If such a shape exists (step ST53; YES), the contour detection unit 109 determines that the driver is performing armrest driving (step ST54). The contour detection unit 109 then cancels the posture determination unit 104's determination that the driver's posture has collapsed (step ST55), and the process ends.
  • If no shape that matches or resembles the preset triangular shape exists in step ST53 (step ST53; NO), the contour detection unit 109 maintains the posture determination unit 104's determination that the driver's posture has collapsed and notifies the image comparison unit 105 of that determination (step ST56). Based on the determination notified in step ST56, the image comparison unit 105 acquires the image of the first region in the captured image at the time of posture determination acquired by the image acquisition unit 101 (step ST27). Thereafter, the processing from step ST28 onward is performed.
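The branch structure of steps ST26 and ST51 to ST56 condenses to a few lines. This is an illustrative sketch of the control flow only; the function name and return strings are placeholders, not identifiers from the embodiment:

```python
def resolve_posture_determination(posture_collapsed, contour_has_triangle):
    """Simplified mirror of steps ST26 and ST51-ST56: a posture-collapse
    verdict is cancelled when the second-region contour shows the armrest
    triangle; otherwise it is forwarded to the image comparison unit."""
    if not posture_collapsed:                 # step ST26; NO
        return "normal"
    if contour_has_triangle:                  # step ST53; YES -> ST54, ST55
        return "armrest driving (determination cancelled)"
    return "collapse maintained (notify image comparison unit)"  # ST56 -> ST27

print(resolve_posture_determination(False, False))
print(resolve_posture_determination(True, True))
print(resolve_posture_determination(True, False))
```

The key design point is that the contour check runs only after a collapse verdict, so the extra edge detection is paid only on frames already flagged as suspicious.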
  • As described above, according to the third embodiment, when the posture determination unit 104 determines that the driver's posture has collapsed, the contour detection unit 109 acquires the image of the second region in the captured image at the time of posture determination, detects the contour of the driver, and cancels the determination that the driver's posture has collapsed when the detected contour includes a triangular shape. This prevents a driver who is performing armrest driving with the head shifted from being erroneously determined to be in a state that requires alerting or in a state incapable of driving. The accuracy of the driver state determination is thereby improved.
  • In the third embodiment, the configuration in which the contour detection unit 109 is added to the state determination device 100 described in the first embodiment has been described, but the contour detection unit 109 may also be added to the state determination device 100A described in the second embodiment.
  • In that case, the contour detection unit 109 may perform its processing in parallel with the axis detection unit 108, or may execute it after the processing of the axis detection unit 108.
  • FIG. 19 is a diagram illustrating an example of notifying the driver of the determination results of the state determination devices 100, 100A, and 100B according to the first to third embodiments.
  • The display 401 displays a determination result 402 of the state determination device 100, 100A, or 100B and a button 403 for canceling the determination result.
  • The determination result 402 is displayed as, for example, "Determined to be incapable of driving".
  • The button 403 is labeled, for example, "Reset to normal state".
  • The driver refers to the determination result 402 displayed on the display 401. If the driver is not actually in a state incapable of driving, the driver can cause the state determination device 100, 100A, or 100B to cancel the determination result by pressing the button 403.
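The display-and-cancel interaction of FIG. 19 is essentially a one-field state machine. The sketch below illustrates it under assumed names (`StateDeterminationHMI`, `press_reset_button`, and the state strings are all hypothetical, not from the embodiment):

```python
class StateDeterminationHMI:
    """Minimal sketch of FIG. 19: the display shows the determination
    result (402) and a button (403) that resets it to normal."""

    def __init__(self):
        self.determination = "normal"

    def show_determination(self, result):
        # The state determination device pushes its verdict to the display.
        self.determination = result

    def press_reset_button(self):
        # Driver presses "Reset to normal state" (button 403), overriding a
        # determination the driver judges to be false.
        self.determination = "normal"

hmi = StateDeterminationHMI()
hmi.show_determination("incapable of driving")
hmi.press_reset_button()
print(hmi.determination)  # normal
```

The override is driver-initiated only; the devices themselves never consult the button state when forming a new determination.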
  • Note that the present invention may freely combine the embodiments, modify any component of each embodiment, or omit any component of each embodiment within the scope of the invention.
  • The state determination device according to the present invention is suitable for a driver monitoring system or the like that requires improved determination accuracy, and for determining the driving state based on a change in the driver's posture.
  • 100, 100A, 100B state determination device, 101 image acquisition unit, 102 face detection unit, 103 reference point calculation unit, 104 posture determination unit, 105 image comparison unit, 106 comparison image storage unit, 107 state determination unit, 108 axis detection unit, 109 contour detection unit.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The present invention relates to a state determination device comprising: a face detection unit (102) that detects the position of a driver's face from a captured image of the vehicle interior; a posture determination unit (104) that compares the detected position of the driver's face with a reference point precalculated for the driver, and determines that the driver's posture has collapsed if it detects that the displacement (if any) of the driver's face in a predefined displacement direction is equal to or greater than a first posture-collapse determination value; an image comparison unit (105) that, if it is determined that the driver's posture has collapsed, compares a first-region image included in the captured image with the corresponding first-region image included in a comparison image to determine again whether or not the driver's posture has collapsed; and a state determination unit (107) that determines whether or not the driver is in a state in which the driver is incapable of driving, based on the result of the re-determination.
PCT/JP2017/021121 2017-06-07 2017-06-07 State determination device and state determination method WO2018225176A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/021121 WO2018225176A1 (fr) 2017-06-07 2017-06-07 Dispositif de détermination d'état et procédé de détermination d'état
JP2019523262A JP6960995B2 (ja) 2017-06-07 2017-06-07 状態判定装置および状態判定方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/021121 WO2018225176A1 (fr) 2017-06-07 2017-06-07 Dispositif de détermination d'état et procédé de détermination d'état

Publications (1)

Publication Number Publication Date
WO2018225176A1 true WO2018225176A1 (fr) 2018-12-13

Family

ID=64566648

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/021121 WO2018225176A1 (fr) 2017-06-07 2017-06-07 Dispositif de détermination d'état et procédé de détermination d'état

Country Status (2)

Country Link
JP (1) JP6960995B2 (fr)
WO (1) WO2018225176A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110103816B (zh) * 2019-03-15 2022-04-19 河南理工大学 一种驾驶状态检测方法

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008137639A (ja) * 2006-11-06 2008-06-19 Quality Kk 車両制御装置および車両制御プログラム
JP2016009255A (ja) * 2014-06-23 2016-01-18 株式会社デンソー ドライバの運転不能状態検出装置

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020179656A1 (fr) * 2019-03-06 2020-09-10 オムロン株式会社 Dispositif de surveillance de conducteur
JP2022526932A (ja) * 2019-03-26 2022-05-27 ケンブリッジ モバイル テレマティクス,インク. 車両ユーザの安全性
WO2020258719A1 (fr) * 2019-06-28 2020-12-30 深圳市商汤科技有限公司 Procédé, appareil et dispositif de détection de l'état en service d'un conducteur, et support de stockage informatique
US11423676B2 (en) 2019-06-28 2022-08-23 Shenzhen Sensetime Technology Co., Ltd. Method and apparatus for detecting on-duty state of driver, device, and computer storage medium
WO2021001944A1 (fr) * 2019-07-02 2021-01-07 三菱電機株式会社 Dispositif de traitement d'image embarqué et procédé de traitement d'image embarqué
JPWO2021001944A1 (ja) * 2019-07-02 2021-11-04 三菱電機株式会社 車載用画像処理装置、および、車載用画像処理方法
JPWO2021001943A1 (ja) * 2019-07-02 2021-11-25 三菱電機株式会社 車載用画像処理装置、および、車載用画像処理方法
WO2021001943A1 (fr) * 2019-07-02 2021-01-07 三菱電機株式会社 Dispositif de traitement d'image embarqué et procédé de traitement d'image embarqué
JP7183420B2 (ja) 2019-07-02 2022-12-05 三菱電機株式会社 車載用画像処理装置、および、車載用画像処理方法
JP2021163124A (ja) * 2020-03-31 2021-10-11 いすゞ自動車株式会社 許否決定装置
WO2021200341A1 (fr) * 2020-03-31 2021-10-07 いすゞ自動車株式会社 Dispositif de détermination d'autorisation/d'interdiction
CN115398508A (zh) * 2020-03-31 2022-11-25 五十铃自动车株式会社 允许与否确定装置
JP7351253B2 (ja) 2020-03-31 2023-09-27 いすゞ自動車株式会社 許否決定装置
CN115398508B (zh) * 2020-03-31 2024-01-05 五十铃自动车株式会社 允许与否确定装置
WO2024069785A1 (fr) * 2022-09-28 2024-04-04 三菱電機株式会社 Dispositif de détermination d'état d'occupant, système de détermination d'état d'occupant, procédé de détermination d'état d'occupant, programme, et système de commande de véhicule

Also Published As

Publication number Publication date
JP6960995B2 (ja) 2021-11-05
JPWO2018225176A1 (ja) 2019-12-12

Similar Documents

Publication Publication Date Title
WO2018225176A1 (fr) Dispositif de détermination d'état et procédé de détermination d'état
US10796171B2 (en) Object recognition apparatus, object recognition method, and object recognition program
CN104573623B (zh) 人脸检测装置、方法
JP5867273B2 (ja) 接近物体検知装置、接近物体検知方法及び接近物体検知用コンピュータプログラム
JP6573193B2 (ja) 判定装置、判定方法、および判定プログラム
CN109997148B (zh) 信息处理装置、成像装置、设备控制系统、移动对象、信息处理方法和计算机可读记录介质
JP6775197B2 (ja) 表示装置及び表示方法
JP7290930B2 (ja) 乗員モデリング装置、乗員モデリング方法および乗員モデリングプログラム
JP2020056717A (ja) 位置検出装置
JP2005066023A (ja) 運転者状態検出装置
JP6594595B2 (ja) 運転不能状態判定装置および運転不能状態判定方法
JP2004334786A (ja) 状態検出装置及び状態検出システム
JP2017030578A (ja) 自動運転制御装置、自動運転制御方法
JP2005018655A (ja) 運転者行動推定装置
US11915495B2 (en) Information processing apparatus, and recording medium
WO2022113275A1 (fr) Dispositif de détection de sommeil et système de détection de sommeil
JP6711128B2 (ja) 画像処理装置、撮像装置、移動体機器制御システム、画像処理方法、及びプログラム
JP2009278185A (ja) 画像認識装置
WO2018097269A1 (fr) Dispositif de traitement d'informations, dispositif d'imagerie, système de commande d'équipement, objet mobile, procédé de traitement d'informations et support d'enregistrement lisible par ordinateur
WO2024079779A1 (fr) Dispositif de détermination d'état de passager, système de détermination d'état de passager, procédé de détermination d'état de passager et programme
JP2004334784A (ja) 確認動作検出装置及び警報システム
JP7127282B2 (ja) 運転状態判定装置及び運転状態判定方法
JP7258262B2 (ja) 調整装置、調整システム、表示装置、乗員監視装置、および、調整方法
US11355015B2 (en) Display device for vehicle, display method for vehicle, and storage medium
JP7342743B2 (ja) 安全運転判定装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17912476

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019523262

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17912476

Country of ref document: EP

Kind code of ref document: A1