WO2022113275A1 - Sleep detection device and sleep detection system - Google Patents
Sleep detection device and sleep detection system
- Publication number
- WO2022113275A1 (PCT/JP2020/044237)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- occupant
- unit
- posture
- blunting
- sleep
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/06—Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
- B60K28/066—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver actuating a signalling device
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/221—Physiology, e.g. weight, heartbeat, health or special needs
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/223—Posture, e.g. hand, foot, or seat position, turned or inclined
Definitions
- An occupant may fall asleep momentarily, entering so-called microsleep, and to prevent drowsy driving it is necessary to detect such momentary sleep.
- Momentary sleep cannot be detected merely by estimating which drowsiness stage the occupant is in, so the accuracy of detecting the occupant's sleep is poor.
- The present disclosure has been made to solve the above problems, and an object of the present disclosure is to provide a sleep detection device that detects an occupant's momentary sleep and improves the accuracy of detecting the occupant's sleep.
- The sleep detection device includes: an image acquisition unit that acquires, from an imaging device mounted on a vehicle to image an occupant inside the vehicle, a captured image of the occupant; a blunting determination unit that determines whether the perception of the occupant extracted from the captured image acquired by the image acquisition unit has become blunted; a posture determination unit that determines whether the posture of the occupant extracted from the captured image acquired by the image acquisition unit is abnormal; and a sleep detection unit that detects the occupant's sleep using the determination result of the blunting determination unit and the determination result of the posture determination unit. When the blunting determination unit determines that the occupant's perception has become blunted and the posture determination unit determines that the occupant's posture is abnormal, the occupant is regarded as having fallen asleep.
- The sleep detection system includes: an imaging device mounted on a vehicle to image an occupant inside the vehicle; an image acquisition unit that acquires a captured image of the occupant from the imaging device; a blunting determination unit that determines whether the perception of the occupant extracted from the captured image acquired by the image acquisition unit has become blunted; a posture determination unit that determines whether the posture of the occupant extracted from the captured image acquired by the image acquisition unit is abnormal; and a sleep detection unit that detects the occupant's sleep using the determination result of the blunting determination unit and the determination result of the posture determination unit. When the blunting determination unit determines that the occupant's perception has become blunted and the posture determination unit determines that the occupant's posture is abnormal, the occupant is regarded as having fallen asleep.
- FIG. 1 is a block diagram showing a configuration example of the sleep detection system according to Embodiment 1.
- FIG. 2 is an explanatory diagram showing the imaging range of the imaging device according to Embodiment 1.
- FIG. 3 is an explanatory diagram showing the detection process of the feature point detection unit of the sleep detection device according to Embodiment 1.
- FIG. 4 is an explanatory diagram showing the face orientation detection process of the face orientation detection unit of the sleep detection device according to Embodiment 1.
- FIG. 5 is an explanatory diagram showing sleep detection results of the sleep detection device according to Embodiment 1.
- FIG. 6 is a flowchart showing an operation example of the sleep detection device according to Embodiment 1.
- FIG. 7 is a flowchart showing an operation example of the sleep detection device according to Embodiment 1.
- FIG. 8 is a diagram showing a hardware configuration example of the sleep detection device according to Embodiment 1.
- FIG. 9 is a block diagram showing a configuration example of the sleep detection system according to Embodiment 2.
- FIG. 10 is a flowchart showing an operation example of the sleep detection device according to Embodiment 2.
- FIG. 1 is a block diagram showing a configuration example of the sleep detection system 100 according to the first embodiment.
- The sleep detection system 100 includes a sleep detection device 10 and an imaging device 20, both mounted on a vehicle.
- FIG. 2 is an explanatory diagram showing an imaging range of the imaging apparatus 20 according to the first embodiment.
- The imaging device 20 is, for example, a wide-angle camera or an infrared camera, and images the interior of the vehicle.
- The imaging device 20 may be a range image sensor, such as a TOF (time-of-flight) camera, capable of capturing an image that reflects the distance between the imaging device 20 and the subject.
- The imaging device 20 images the interior of the vehicle at a rate of, for example, 30 to 60 fps (frames per second), and outputs the captured images to the image acquisition unit 11 of the sleep detection device 10.
- the image captured by the image pickup apparatus 20 is referred to as a captured image.
- In FIG. 2, the imaging range of the imaging device 20 is shown as region A.
- One or more imaging devices 20 are arranged on, for example, the instrument panel, the steering column, or the rearview mirror, so that at least the occupant 211 seated in the driver's seat 201 and the occupant 212 seated in the passenger's seat 202 can be imaged simultaneously.
- The imaging range of the imaging device 20 may also include the rear seats (not shown).
- In the following, the occupants 211 and 212 imaged by the imaging device 20 are collectively referred to as "occupants"; that is, the occupants include the driver.
- The sleep detection device 10 includes: an image acquisition unit 11 that acquires a captured image of the occupant from the imaging device 20; a blunting determination unit 16 that determines whether the perception of the occupant extracted from the captured image has become blunted; a posture determination unit 17 that determines whether the posture of the occupant extracted from the captured image is abnormal; and a sleep detection unit 18 that detects the occupant's sleep using the determination results of the blunting determination unit 16 and the posture determination unit 17.
- The image acquisition unit 11 acquires a captured image of the occupant inside the vehicle from the imaging device 20. The sleep detection device 10 preferably performs the sleep detection process, described later, each time a captured image is acquired from the imaging device 20. The captured image acquired by the image acquisition unit 11 is then output to the feature point detection unit 12 described below.
- The feature point detection unit 12 extracts feature points related to the parts of the occupant's face from the captured image acquired by the image acquisition unit 11.
- Various known algorithms can be used in the feature point extraction process by the feature point detection unit 12, and detailed description of these algorithms will be omitted.
- For example, the feature point detection unit 12 detects the feature points corresponding to each of a plurality of face parts (for example, the right eye, the left eye, the right eyebrow, the left eyebrow, the nose, and the mouth).
- FIG. 3 is an explanatory diagram showing a detection process of the feature point detection unit 12 of the sleep detection device 10 according to the first embodiment.
- FIG. 3A is an explanatory diagram of the captured image 300, FIG. 3B is an explanatory diagram of the face region 301, and FIG. 3C is an explanatory diagram of the feature points.
- the feature point detection unit 12 acquires, for example, the captured image 300 shown in FIG. 3A from the image acquisition unit 11.
- The feature point detection unit 12 detects, in the captured image 300 acquired from the image acquisition unit 11, the face region 301, which is the region containing the face parts.
- For example, the feature point detection unit 12 detects the face region 301 shown in FIG. 3B in the captured image 300 of FIG. 3A.
- For example, as shown in FIG. 3C, the feature point detection unit 12 acquires, for the eyes in the face region 301, the position information of components such as both outer eye corners 311, both inner eye corners 312, both upper eyelids 313, and both lower eyelids 314. Likewise, for the nose in the face region 301, the feature point detection unit 12 acquires the position information of components such as the nasal root 315, the nose tip 316, the bridge of the nose, and the wings of the nose.
- Similarly, as shown in FIG. 3C, the feature point detection unit 12 acquires, for the mouth in the face region 301, the position information of components such as the upper lip 317, the lower lip 318, and the mouth corners 319.
- The position information of each facial component acquired by the feature point detection unit 12 is, for example, information indicating coordinates whose origin is a specific position O in the captured image 300 shown in FIG. 3A, or coordinates whose origin is the center of the captured image 300.
- the position information acquired by the feature point detection unit 12 is detected as a feature point.
- the feature points detected by the feature point detection unit 12 may be recorded in a storage unit (not shown) of the sleep detection device 10.
- the feature amount calculation unit 13 calculates the feature amount of the occupant from the feature points extracted by the feature point detection unit 12.
- Specifically, the feature amount of the occupant is information indicating the positional relationship between the occupant's feature points, such as the distance between the right eye and the left eye, or the position of the nose within the occupant's face.
- The feature amount calculation unit 13 may also acquire position information of the face region 301, such as its size and its coordinates in the captured image 300.
- Various known algorithms can also be used for calculating the feature amount.
- the feature amount calculated by the feature amount calculation unit 13 may also be recorded in the storage unit of the sleep detection device 10.
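As a rough illustration of the feature amounts described above, simple geometric relations can be computed directly from the landmark coordinates. The following Python sketch is not the patent's implementation; the landmark names and the choice of metrics are assumptions for illustration only:

```python
import math

def feature_amounts(landmarks):
    """Compute example feature amounts from facial landmark positions.

    landmarks: dict mapping a part name to its (x, y) position in the
    captured image, e.g. coordinates relative to the image origin O.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    inter_eye = dist(landmarks["right_eye"], landmarks["left_eye"])
    # midpoint of the eye line, used to locate the nose within the face
    eye_mid = ((landmarks["right_eye"][0] + landmarks["left_eye"][0]) / 2,
               (landmarks["right_eye"][1] + landmarks["left_eye"][1]) / 2)
    return {
        "inter_eye_distance": inter_eye,
        # vertical offset of the nose tip below the eye line
        "nose_drop": landmarks["nose_tip"][1] - eye_mid[1],
    }

amounts = feature_amounts({
    "right_eye": (120.0, 200.0),
    "left_eye": (220.0, 200.0),
    "nose_tip": (170.0, 260.0),
})
```

Such scalar relations are cheap to recompute every frame, which matters at the 30 to 60 fps rate mentioned above.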
- the face orientation detection unit 14 detects the face orientation of the occupant in the captured image acquired by the image acquisition unit 11 by using the feature amount calculated by the feature amount calculation unit 13.
- the face orientation detected by the face orientation detection unit 14 may also be recorded in the storage unit of the sleep detection device 10.
- FIG. 4 is an explanatory diagram showing a face orientation detection process of the face orientation detection unit 14 of the sleep detection device 10 according to the first embodiment.
- The face orientation detection unit 14 acquires the positional relationship between the nose tip 316 and both eyes, and detects the occupant's face orientation from this positional relationship.
- For example, the face orientation detection unit 14 detects that the occupant's face is oriented to the right when the nose tip 316 is located near the straight line Qa passing through the outer corner 311 of the left eye.
- The face orientation detection unit 14 detects that the occupant's face is oriented to the front when the nose tip 316 is located between the inner corners 312 of both eyes.
- When the nose tip 316 is located near the straight line Qb passing through the outer corner 311 of the right eye, the face orientation detection unit 14 detects that the occupant's face is oriented to the left. In this way, the face orientation detection unit 14 detects the left-right orientation of the occupant's face, that is, the yaw angle of the occupant's face.
- Similarly, the face orientation detection unit 14 detects the vertical orientation of the occupant's face, that is, the pitch angle of the occupant's face, using the straight line Qc passing through the point where the nose tip 316 is located.
- The straight line Qc may be set from, for example, a captured image in which the occupant faces the front in both the vertical and horizontal directions.
- Various known algorithms can be used for the face orientation detection process by the face orientation detection unit 14, including the detection of the roll angle of the occupant's face, and detailed description of these algorithms is omitted.
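The left-right (yaw) rule above compares the nose tip's horizontal position with the vertical lines Qa and Qb through the outer eye corners. A schematic Python version follows; the coordinate convention and the tolerance value are illustrative assumptions, not values from the disclosure:

```python
def yaw_direction(nose_x, left_outer_x, left_inner_x,
                  right_inner_x, right_outer_x, tol=5.0):
    """Classify coarse face yaw from landmark x-coordinates.

    Qa is the vertical line through the left eye's outer corner, and
    Qb the vertical line through the right eye's outer corner.
    """
    if abs(nose_x - left_outer_x) <= tol:
        return "right"   # nose tip near line Qa -> facing right
    if abs(nose_x - right_outer_x) <= tol:
        return "left"    # nose tip near line Qb -> facing left
    lo, hi = sorted((left_inner_x, right_inner_x))
    if lo <= nose_x <= hi:
        return "front"   # nose tip between the inner eye corners
    return "intermediate"

# Example landmark x-coordinates (pixels) for a frontal face
direction = yaw_direction(170, 100, 150, 190, 240)
```

A production system would instead fit a continuous yaw angle, but this threshold form mirrors the qualitative rule stated above.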
- It is preferable to use at least one of the roll angle and the pitch angle of the occupant's face in the posture determination process by the posture determination unit 17 described later. When the occupant falls into momentary sleep, it is accompanied by head nodding, slumping, or leaning, so at least one of the roll angle and the pitch angle of the occupant's face changes. Therefore, using at least one of the roll angle and the pitch angle in the posture determination process improves the reliability of the posture determination result.
- the face orientation change amount calculation unit 15 calculates the change amount of the occupant's face orientation using the occupant's face orientation detected by the face orientation detection unit 14.
- The amount of change in the occupant's face orientation is, for example, the difference between the pitch angle of the occupant's face relative to a reference position detected from the captured image of frame A and the pitch angle of the occupant's face relative to the reference position detected from the captured image of frame B.
- The amount of change in the occupant's face orientation includes not only changes in the angle of the occupant's face but also changes in the position of the occupant's face, and in the size and coordinates of the face region in the captured image.
- The amount of change is not limited to the difference between individual parameters; it may also be the deviation from an average value calculated from the parameters detected in a plurality of captured images, or a median value, a standard deviation, or the like calculated from those parameters.
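For a window of pitch-angle samples, the candidate change amounts listed above (frame-to-frame difference, deviation from the mean, median, standard deviation) could be computed as in the following sketch; which statistic the device actually uses is a design choice, and the sample values are invented:

```python
import statistics

def change_amounts(pitch_angles):
    """Candidate face-orientation change amounts over several frames.

    pitch_angles: pitch angle (degrees) of the occupant's face relative
    to a reference position, one value per captured image.
    """
    mean = sum(pitch_angles) / len(pitch_angles)
    return {
        # simple frame-to-frame difference (last two frames)
        "frame_diff": pitch_angles[-1] - pitch_angles[-2],
        # spread within the window (max - min)
        "range": max(pitch_angles) - min(pitch_angles),
        # deviation of each sample from the window average
        "mean_deviation": [a - mean for a in pitch_angles],
        "median": statistics.median(pitch_angles),
        "stdev": statistics.pstdev(pitch_angles),
    }

m = change_amounts([0.0, 2.0, -2.0, 4.0, -4.0])
```

Aggregates over several frames, such as the standard deviation, are less sensitive to single-frame detection noise than a raw frame-to-frame difference.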
- the blunting determination unit 16 determines whether or not the occupant extracted from the captured image has a blunting of perception.
- An occupant's perception being blunted means that the occupant shows little reaction to external stimuli. It has been found that when an occupant falls into momentary sleep, a blunting of perception precedes the momentary sleep. Therefore, to accurately detect the occupant's momentary sleep, it is necessary to determine whether the occupant's perception has become blunted.
- Here, an example is described in which the blunting determination unit 16 determines whether the occupant's perception is blunted using the occupant's face orientation. For convenience of explanation, the determination of whether the occupant's perception is blunted is also referred to as the blunting determination.
- When the state in which the amount of change in face orientation is less than a set threshold (hereinafter referred to as the blunting determination threshold) continues for a set time (hereinafter referred to as the blunting determination time), the blunting determination unit 16 regards the occupant's voluntary movements as having decreased and determines that the occupant's perception has become blunted.
- Conversely, while the occupant is moving voluntarily, the blunting determination unit 16 determines that the occupant's perception has not become blunted.
- Here, the blunted state refers to the state in which no voluntary movement of the occupant is observed.
- Even when the occupant is not in the blunted state, the change in face orientation may be small during driving. Therefore, the blunting determination result is reflected in the sleep detection process described later only when the blunted state continues for at least the set blunting determination time, which improves the accuracy of detecting the occupant's sleep.
- For example, the blunting determination unit 16 acquires from the face orientation change amount calculation unit 15, as the amount of change in face orientation over a set period, the standard deviation of the pitch angle of the occupant's face. If the acquired standard deviation is less than the blunting determination threshold and that state continues for at least the blunting determination time, the blunting determination unit 16 determines that the occupant's perception is blunted. On the other hand, if the acquired standard deviation is equal to or greater than the blunting determination threshold, the blunting determination unit 16 determines that the occupant's perception is not blunted.
- The blunting determination unit 16 also determines that the occupant's perception is not blunted when the period during which the acquired standard deviation is less than the blunting determination threshold is shorter than the blunting determination time. The blunting determination unit 16 outputs the determination result to the sleep detection unit 18 described later.
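The blunting determination can be viewed as a timer that runs while the change amount stays below the blunting determination threshold and that fires once the blunting determination time has elapsed. A minimal Python sketch follows; the threshold and time values are illustrative assumptions, not values from the disclosure:

```python
class BluntingDeterminer:
    """Determines perception blunting from a face-orientation change amount.

    threshold: blunting determination threshold (e.g. pitch-angle stdev, deg)
    hold_time: blunting determination time, in seconds
    """
    def __init__(self, threshold=1.0, hold_time=30.0):
        self.threshold = threshold
        self.hold_time = hold_time
        self._below_since = None   # start time of the low-movement state

    def update(self, change_amount, t):
        """Feed one change-amount sample at time t; return True if blunted."""
        if change_amount >= self.threshold:
            self._below_since = None   # voluntary movement observed: reset
            return False
        if self._below_since is None:
            self._below_since = t
        return (t - self._below_since) >= self.hold_time

det = BluntingDeterminer(threshold=1.0, hold_time=30.0)
# low movement at t = 0, 10, 20, 30 s; blunting holds only from t = 30 s
results = [det.update(0.2, t) for t in range(0, 40, 10)]
```

A single large movement resets the timer, matching the rule that blunting is not determined while voluntary movement is observed.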
- The posture determination unit 17 determines whether the posture of the occupant extracted from the captured image is abnormal. When the occupant falls into momentary sleep, it is accompanied by an abnormal posture such as head nodding or slumping. Therefore, the posture determination unit 17 determines whether the occupant is in an abnormal posture, and the determination result of the posture determination unit 17 is used in the sleep detection process described later.
- The posture determination unit 17 determines whether the posture of the occupant extracted from the captured image is abnormal using, for example, the amount of change in face orientation calculated by the face orientation change amount calculation unit 15, and outputs the determination result to the sleep detection unit 18. For convenience of explanation, the determination by the posture determination unit 17 of whether the occupant's posture is abnormal is also referred to as the posture determination.
- For example, the posture determination unit 17 acquires from the face orientation change amount calculation unit 15, as the amount of change in face orientation within a set time (hereinafter referred to as the posture determination time), the maximum and minimum values of the pitch angle of the occupant's face. The posture determination unit 17 then determines whether the occupant's posture is abnormal according to whether the amount of change in face orientation within the posture determination time is equal to or greater than a set threshold (hereinafter referred to as the posture determination threshold).
- Specifically, the posture determination unit 17 acquires from the face orientation change amount calculation unit 15 the maximum value and the minimum value of the pitch angle relative to the reference position of the occupant's face within the posture determination time. If the difference between the maximum value and the minimum value is equal to or greater than the posture determination threshold, the posture determination unit 17 determines that the occupant's posture is abnormal; if the difference is less than the posture determination threshold, it determines that the occupant's posture is not abnormal.
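The posture determination described above amounts to a sliding-window range test on the pitch angle. The following rough sketch uses illustrative values for the posture determination time and threshold, which the disclosure does not specify:

```python
from collections import deque

class PostureDeterminer:
    """Flags an abnormal posture when the pitch angle swings widely
    within the posture determination time.

    window: posture determination time, in seconds
    threshold: posture determination threshold, in degrees
    """
    def __init__(self, window=2.0, threshold=20.0):
        self.window = window
        self.threshold = threshold
        self._samples = deque()   # (time, pitch) pairs

    def update(self, pitch, t):
        self._samples.append((t, pitch))
        # keep only samples inside the posture determination time
        while self._samples and t - self._samples[0][0] > self.window:
            self._samples.popleft()
        pitches = [p for _, p in self._samples]
        # abnormal when (max - min) reaches the posture determination threshold
        return max(pitches) - min(pitches) >= self.threshold

post = PostureDeterminer(window=2.0, threshold=20.0)
# a rapid 25-degree nod at t = 1.0 s trips the test; a slow drift does not
flags = [post.update(p, t)
         for t, p in [(0.0, 0.0), (0.5, -5.0), (1.0, -25.0), (4.0, -26.0)]]
```

Because the range is measured only within the window, a slow change of the same total magnitude never exceeds the threshold, which is exactly the false-detection suppression discussed next.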
- Even when not falling into momentary sleep, the occupant may slowly change his or her face orientation, for example to secure the forward view.
- Since the posture determination unit 17 performs the posture determination using the amount of change in face orientation within the posture determination time as described above, it can suppress the false detection that the occupant has fallen asleep when the occupant's face orientation merely changes slowly without momentary sleep.
- the posture determination unit 17 may perform posture determination using the size of the face region.
- For example, the posture determination unit 17 acquires the size of the face region as a feature amount from the feature amount calculation unit 13; it may determine that the occupant's posture is abnormal if the size of the face region is outside a set range, and that the posture is not abnormal if the size is within the set range. In this case, the posture determination unit 17 may be connected to the feature amount calculation unit 13 instead of the face orientation change amount calculation unit 15.
- the posture determination unit 17 may perform posture determination using the distance between the image pickup device 20 and the occupant.
- When the imaging device 20 is capable of capturing an image that reflects the distance to the subject, such as a TOF camera, the posture determination unit 17 can determine whether the posture is abnormal using the distance data, included in the captured image, that indicates the distance between the occupant and the imaging device 20.
- For example, if the distance between the occupant's face and the imaging device 20 is outside a set range, the posture determination unit 17 determines that the occupant's posture is abnormal; if the distance is within the set range, it determines that the occupant's posture is not abnormal.
- In this case, the feature amount calculation unit 13 may be configured to calculate the distance between the occupant's face included in the captured image and the imaging device 20, and the posture determination unit 17 may be connected to the feature amount calculation unit 13.
- The posture determination unit 17 can also determine whether the posture is abnormal using distance data, acquired from a sensor capable of measuring the distance to the occupant, such as a depth sensor provided in the vehicle, that indicates the distance between the occupant and the sensor. In this sense, the TOF camera in the above example is also included among depth sensors.
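The face-size and distance variants of the posture determination both reduce to a range check: the posture is judged abnormal when the measured value leaves a set range. A sketch, where the range bounds and units are illustrative assumptions:

```python
def posture_abnormal_from_distance(distance_m, normal_range=(0.4, 0.9)):
    """Return True when the occupant-to-sensor distance leaves the set
    range, e.g. because the occupant has slumped toward or away from
    the camera.

    distance_m: distance from the depth sensor / TOF camera to the
    occupant's face, in metres (illustrative units and bounds).
    """
    lo, hi = normal_range
    return not (lo <= distance_m <= hi)
```

The same function shape applies to the face-region-size variant, with pixel-area bounds in place of distance bounds.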
- the sleep detection unit 18 detects the sleep of the occupant using the determination result of the blunting determination unit 16 and the determination result of the posture determination unit 17.
- When the blunting determination unit 16 determines that the occupant's perception has become blunted and the posture determination unit 17 determines that the occupant's posture is abnormal, the sleep detection unit 18 detects that the occupant has fallen asleep.
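The sleep detection condition is thus a simple conjunction of the two determination results. A minimal sketch, in which the per-frame result sequence is invented for illustration:

```python
def detect_sleep(perception_blunted, posture_abnormal):
    """Sleep is detected only when both determination results hold,
    mirroring the AND condition of the sleep detection unit."""
    return perception_blunted and posture_abnormal

# Simulated determination results per frame: (blunting, posture)
frames = [
    (False, False),  # awake, normal posture
    (False, True),   # face moved for another reason -> no false detection
    (True,  False),  # perception blunted but posture still normal
    (True,  True),   # momentary sleep: blunted AND abnormal posture
]
detections = [detect_sleep(b, p) for b, p in frames]
```

The second frame is the key case: posture alone would fire there, but the conjunction with the blunting result suppresses the false detection, which is the improvement FIG. 5 illustrates.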
- When the sleep detection unit 18 detects the occupant's sleep, it outputs the detection result to a vehicle control device (not shown) that controls the in-vehicle equipment mounted on the vehicle, and causes the notification unit 50 mounted on the vehicle to issue a notification.
- the notification unit 50 may be provided in the sleep detection device 10.
- the sleep detection device 10 and the vehicle control device are connected to each other.
- the notification unit 50 may be, for example, at least one of a speaker or a display mounted on the vehicle, or may be a mobile terminal or the like possessed by an occupant.
- the notification unit 50 notifies the occupant by issuing a warning sound or the like in order to prevent the occupant from falling asleep.
- the sleep detection device 10 may be connected to a communication unit (not shown) and communication may be performed between the communication unit and the mobile terminal.
- the vehicle control device may control an in-vehicle device such as an air conditioner so as to eliminate the drowsiness.
- FIG. 5 is an explanatory diagram showing a sleep detection result of the sleep detection device 10 according to the first embodiment.
- FIG. 5A is a diagram showing the sleep detection results in the comparative example
- FIG. 5B is a diagram showing the sleep detection results of the sleep detection device 10 according to the present embodiment.
- The vertical axes of FIGS. 5A and 5B show the detection result, and the horizontal axes show time. In FIGS. 5A and 5B, a plotted point indicates that the occupant was detected as having fallen asleep at that time.
- in the comparative example, the sleep of the occupant is detected using only the determination result of the posture determination unit 17.
- the occupant's face, however, changes due to various factors other than sleep.
- in this example, the occupant actually falls into a momentary sleep only at time t1.
- as shown in FIG. 5A, the comparative example detects that the occupant has fallen asleep at times other than t1 as well, so its accuracy in detecting the sleep of the occupant is poor.
- in contrast, the sleep detection device 10 detects that the occupant has fallen asleep only when the blunting determination unit 16 determines that the occupant's perception is blunted and the posture determination unit 17 determines that the occupant's posture is abnormal. Therefore, as shown in FIG. 5B, a mere change in the occupant's face that is not accompanied by a momentary sleep is not erroneously detected as sleep, and it can be detected that the occupant has fallen asleep only at time t1, when the occupant actually falls into a momentary sleep. As described above, by using both the determination result of the posture determination unit 17 and the determination result of the blunting determination unit 16 in the sleep detection process, the sleep detection accuracy for the occupant can be improved.
- the same result can be obtained when the posture determination unit 17 uses the size of the face region or the distance between the occupant and the sensor instead of the face orientation; as shown in FIG. 5B, the sleep of the occupant can still be detected with high accuracy.
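- The combination described above can be sketched in a few lines (an illustrative sketch only; the function name `sleep_times` and the boolean inputs are assumptions, not part of this disclosure): a time step is reported as sleep only when the blunting determination and the abnormal-posture determination both hold, which removes the posture-only false positives of FIG. 5A.

```python
def sleep_times(blunted, abnormal):
    """Return the indices (time steps) at which both determinations hold.

    blunted, abnormal: sequences of booleans, one entry per time step,
    giving the results of the blunting and posture determinations.
    """
    return [t for t, (b, a) in enumerate(zip(blunted, abnormal))
            if b and a]
```

With a posture-only detector, every True in `abnormal` would be reported; here only the steps where both inputs are True survive.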
- FIGS. 6 and 7 are flowcharts showing operation examples of the sleep detection device 10 according to the first embodiment, respectively.
- an example in which the sleep detection device 10 performs the blunting determination and the posture determination using the face orientation of the occupant will be described.
- the flowcharts of FIGS. 6 and 7 do not show the process of terminating the operation of the sleep detection device 10; the sleep detection device 10 ends its operation when it receives a command to that effect from the vehicle control device.
- the processes shown in FIGS. 6 and 7 are repeated at predetermined intervals, for example, with the power of the sleep detection device 10 turned on.
- the image acquisition unit 11 of the sleep detection device 10 acquires a captured image from the image pickup device 20 (ST1). Then, the image acquisition unit 11 outputs the acquired captured image to the feature point detection unit 12. Next, the feature point detection unit 12 detects the occupant's face parts in the captured image and extracts the feature points of the occupant's face (ST2). Then, the feature amount calculation unit 13 calculates the feature amount of the occupant extracted from the captured image by using the extracted feature points (ST3).
- hereinafter, the processes of ST1 to ST3 shown in FIG. 6 are collectively referred to as the feature amount calculation process ST101.
- the face orientation detection unit 14 acquires the feature amount and detects the face orientation of the occupant extracted from the captured image (ST102). Then, the face orientation detection unit 14 outputs the detected face orientation of the occupant to the face orientation change amount calculation unit 15. Next, the face orientation change amount calculation unit 15 calculates the change amount of the occupant's face orientation using the occupant's face orientation acquired from the face orientation detection unit 14 (ST103). Here, the amount of change in the face orientation may be recorded in the storage unit.
- the blunting determination unit 16 determines whether or not the occupant's perception is blunted (ST104).
- the blunting determination unit 16 acquires, for example, the standard deviation of the pitch angle indicating the face orientation of the occupant from the face orientation change amount calculation unit 15 as the amount of change in a set period. Then, when the acquired change amount is equal to or greater than the blunting determination threshold value, the blunting determination unit 16 determines that the occupant's perception has not been blunted (ST104; NO), and outputs the determination result to the sleep detection unit 18.
- the process of the sleep detection device 10 proceeds to ST101.
- otherwise, the blunting determination unit 16 determines that the occupant is in a state where the perception is blunted (ST104; YES), and the process proceeds to the next step, ST105.
- the blunting determination unit 16 determines whether or not the state in which the occupant's perception is blunted continues for the blunting determination time or longer (ST105).
- for example, when the acquired standard deviation remains less than the blunting determination threshold value for the blunting determination time or longer, the blunting determination unit 16 determines that the state in which the occupant's perception is blunted has continued for the blunting determination time or longer.
- if the blunting determination unit 16 determines that the state in which the occupant's perception is blunted does not continue for the blunting determination time or longer (ST105; NO), it determines that the occupant's perception is not blunted, and the process of the sleep detection device 10 proceeds to ST101. On the other hand, when the blunting determination unit 16 determines that the state in which the occupant's perception is blunted continues for the blunting determination time or longer (ST105; YES), it determines that the occupant's perception is blunted and outputs the determination result to the sleep detection unit 18. Then, the process of the sleep detection device 10 proceeds to ST106 described below.
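- The ST104/ST105 checks above can be illustrated as follows (a minimal sketch; the threshold value, the determination time, the frame period, and all names are assumptions chosen for illustration, not values from this disclosure): ST104 tests whether the standard deviation of the pitch angle over a window falls below the blunting threshold, and ST105 requires the blunted state to persist for the blunting determination time.

```python
from statistics import pstdev

BLUNTING_THRESHOLD_DEG = 2.0  # assumed std-dev threshold for the pitch angle
BLUNTING_TIME_S = 10.0        # assumed blunting determination time

def is_blunted(pitch_window_deg):
    """ST104: perception counts as blunted while face movement stays small."""
    return pstdev(pitch_window_deg) < BLUNTING_THRESHOLD_DEG

def blunting_persists(window_results, frame_period_s):
    """ST105: True only when the most recent consecutive run of blunted
    windows covers at least the blunting determination time."""
    run = 0
    for blunted in reversed(window_results):
        if not blunted:
            break
        run += 1
    return run * frame_period_s >= BLUNTING_TIME_S
```

Only when both functions return True would processing continue toward the posture determination of ST106.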
- the posture determination unit 17 of the sleep detection device 10 acquires the amount of change in the face orientation of the occupant from the face orientation change amount calculation unit 15 (ST106).
- the posture determination unit 17 acquires, for example, the amount of change in the face orientation from the face orientation change amount calculation unit 15, and determines whether or not the posture of the occupant is abnormal depending on whether or not the amount of change is equal to or greater than the posture determination threshold value (ST107).
- the posture determination unit 17 acquires, for example, the maximum value and the minimum value of the pitch angle with respect to the reference position of the occupant's face in a set period as the amount of change.
- the posture determination unit 17 determines that the posture of the occupant is not abnormal if the difference between the minimum value and the maximum value is less than the posture determination threshold value (ST107; NO), and the process of the sleep detection device 10 proceeds to ST101.
- on the other hand, if the difference is equal to or greater than the posture determination threshold value, the posture determination unit 17 determines that the posture of the occupant is abnormal (ST107; YES), and the process of the sleep detection device 10 proceeds to ST108 described below.
- next, the posture determination unit 17 determines whether or not the determination that the posture of the occupant is abnormal was made within the posture determination time (ST108). For example, the posture determination unit 17 refers to frame A, in which the captured image with the maximum pitch angle of the occupant's face was acquired, and frame B, in which the captured image with the minimum pitch angle was acquired, and calculates the elapsed time between frame A and frame B. If the calculated elapsed time is equal to or less than the posture determination time, it is determined that the amount of change in the face orientation of the occupant became equal to or greater than the posture determination threshold within the posture determination time. This processing can be omitted if the face orientation change amount calculation unit 15 calculates the minimum value and the maximum value of the pitch angle within the posture determination time.
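- The ST107/ST108 checks described above can be sketched as follows (an illustrative sketch; the threshold, the determination time, and the names are assumptions): frame A carries the maximum pitch angle, frame B the minimum, and the posture counts as abnormal only if the spread between them reaches the posture threshold within the posture determination time.

```python
POSTURE_THRESHOLD_DEG = 20.0  # assumed pitch-spread threshold
POSTURE_TIME_S = 2.0          # assumed posture determination time

def posture_abnormal(frames):
    """frames: (timestamp_s, pitch_deg) samples from a set period."""
    frame_a = max(frames, key=lambda f: f[1])   # frame with maximum pitch
    frame_b = min(frames, key=lambda f: f[1])   # frame with minimum pitch
    spread_ok = frame_a[1] - frame_b[1] >= POSTURE_THRESHOLD_DEG   # ST107
    elapsed_s = abs(frame_a[0] - frame_b[0])
    return spread_ok and elapsed_s <= POSTURE_TIME_S               # ST108
```

A large pitch swing spread over a long time (slowly stretching, for instance) is rejected by the elapsed-time check, matching the rejection path of ST108.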
- if the posture determination unit 17 determines that the determination that the posture of the occupant is abnormal was not made within the posture determination time (ST108; NO), the process of the sleep detection device 10 proceeds to ST101. That is, in this case the determination result of ST107 that the posture of the occupant is abnormal is rejected.
- on the other hand, when the posture determination unit 17 determines that the determination that the posture of the occupant is abnormal was made within the set posture determination time (ST108; YES), it determines that the posture of the occupant is abnormal and outputs the determination result to the sleep detection unit 18. Then, the sleep detection unit 18 detects that the occupant has fallen asleep, and the notification unit 50 notifies the occupant by issuing a warning sound or the like (ST109).
- FIG. 8 is a diagram showing a hardware configuration example of the sleep detection device 10 according to the first embodiment.
- the functions of the image acquisition unit 11, the feature point detection unit 12, the feature amount calculation unit 13, the face orientation detection unit 14, the face orientation change amount calculation unit 15, the blunting determination unit 16, the posture determination unit 17, and the sleep detection unit 18 of the sleep detection device 10 are realized by a processing circuit.
- the processing circuit may be a processing circuit 10a that is dedicated hardware as shown in FIG. 8A, or may be a processor 10b that executes a program stored in the memory 10c as shown in FIG. 8B.
- the processing circuit 10a corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
- the function of each unit may be realized by its own processing circuit, or the functions of the units may be collectively realized by one processing circuit.
- when the processing circuit is the processor 10b, the function of each unit is realized by software, firmware, or a combination of software and firmware. The software or firmware is described as a program and stored in the memory 10c. The processor 10b reads and executes the programs stored in the memory 10c, thereby realizing each function of the image acquisition unit 11, the feature point detection unit 12, the feature amount calculation unit 13, the face orientation detection unit 14, the face orientation change amount calculation unit 15, the blunting determination unit 16, the posture determination unit 17, and the sleep detection unit 18.
- that is, the sleep detection device 10 comprises the memory 10c for storing programs that, when executed by the processor 10b, result in the execution of each step shown in FIG. 4. It can also be said that these programs cause a computer to execute the procedures or methods of the image acquisition unit 11, the feature point detection unit 12, the feature amount calculation unit 13, the face orientation detection unit 14, the face orientation change amount calculation unit 15, the blunting determination unit 16, the posture determination unit 17, and the sleep detection unit 18.
- the processor 10b is, for example, a CPU (Central Processing Unit), a processing device, a computing device, a processor, a microprocessor, a microcomputer, a DSP (Digital Signal Processor), or the like.
- the memory 10c corresponds to, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM).
- alternatively, the memory 10c may be a magnetic disk such as a hard disk or a flexible disk, or an optical disk such as a mini disk, a CD (Compact Disc), or a DVD (Digital Versatile Disc).
- some of the functions of the above units may be realized by dedicated hardware, and the others may be realized by software or firmware.
- the processing circuit 10a in the sleep detection device 10 can realize each of the above-mentioned functions by hardware, software, firmware, or a combination thereof.
- Some functions may be executed by an external server.
- as described above, the sleep detection device 10 according to the first embodiment includes the blunting determination unit 16 and the posture determination unit 17, and the sleep detection unit 18 detects the sleep of the occupant using the determination result of the blunting determination unit 16 and the determination result of the posture determination unit 17.
- since it is detected that the occupant has fallen asleep when the posture determination unit 17 determines that the occupant's posture is abnormal and the blunting determination unit 16 determines that the occupant's perception is blunted, the sleep detection device 10 can detect the occupant's momentary sleep and improve the accuracy of occupant sleep detection.
- the face orientation detection unit 14 may acquire the position of the right eye, the position of the left eye, and the distance between the right eye and the left eye as the feature amount from the feature amount calculation unit 13, and may use them for detecting the face orientation.
- the face orientation detection unit 14 may also detect feature points related to parts of the body other than the face, such as the position of the occupant's shoulders.
- FIG. 9 is a block diagram showing a configuration example of the sleep detection system 101 according to the second embodiment. Similar to the first embodiment, the sleep detection device 30 according to the second embodiment includes an image acquisition unit 11 that acquires a captured image, a blunting determination unit 32 that determines whether or not the perception of the occupant extracted from the captured image is blunted, a posture determination unit 17 that determines whether or not the posture of the occupant extracted from the captured image is abnormal, and a sleep detection unit 18 that detects the sleep of the occupant using the determination result of the blunting determination unit 32 and the determination result of the posture determination unit 17.
- the present embodiment differs from the first embodiment in that the blunting determination unit 32 determines whether or not the occupant's perception is blunted by using the occupant's face information extracted from the captured image.
- the same components as those in the first embodiment are designated by the same reference numerals, and the description thereof will be omitted.
- the blunting determination can be performed not only by using the amount of change in the face orientation described in the first embodiment, but also by using the face information which is the information regarding the state of the occupant's face.
- the facial information is information on the state of the occupant's face, such as the degree of eye opening, the frequency of blinking, and the change in facial expression.
- the sleep detection device 30 includes a face information detection unit 31.
- the face information detection unit 31 detects the face information of the occupant by using the feature amount of the occupant acquired from the feature amount calculation unit 13.
- when acquiring the eye opening degree as face information, the face information detection unit 31 acquires, for example, the distance between the upper and lower eyelids as the feature amount from the feature amount calculation unit 13, and detects this distance as the face information.
- when acquiring the blinking speed as face information, the face information detection unit 31 acquires, for example, the distance between the upper and lower eyelids as the feature amount from the feature amount calculation unit 13. It then regards the eye as blinking when this distance is less than a set threshold value and as open when this distance is equal to or greater than the set threshold value, and detects the speed of the transition from the open-eye state to a blink as the face information.
- when acquiring the blink frequency as face information, the face information detection unit 31 acquires, for example, the distance between the upper and lower eyelids as the feature amount from the feature amount calculation unit 13. Then, regarding a blink as having occurred when this distance is less than the set threshold value, it acquires the time elapsed between one blink and the next as the face information.
- the face information detection process of the face information detection unit 31 is not limited to the above example, and various known algorithms can be used.
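- The three kinds of face information above can be sketched from the per-frame eyelid distance as follows (a hypothetical sketch; the threshold value and every name are assumptions, and a real implementation would use one of the known algorithms just mentioned):

```python
BLINK_THRESHOLD_MM = 3.0  # assumed eyelid distance below which the eye counts as closed

def eye_opening_degree(eyelid_distance_mm):
    """The eyelid distance itself serves as the eye opening degree."""
    return eyelid_distance_mm

def blink_events(distances):
    """Indices where an open eye (>= threshold) transitions to closed."""
    return [i for i in range(1, len(distances))
            if distances[i - 1] >= BLINK_THRESHOLD_MM > distances[i]]

def blink_frequency_hz(distances, frame_rate_hz):
    """Blinks per second over the observed frames."""
    duration_s = len(distances) / frame_rate_hz
    return len(blink_events(distances)) / duration_s
```

The inter-blink interval of the preceding paragraph is simply the time between consecutive entries of `blink_events`.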
- the blunting determination unit 32 acquires face information from the face information detection unit 31, and uses the face information to determine whether or not the occupant's perception is blunted. For example, when the blunting determination unit 32 acquires the eye opening degree as face information from the face information detection unit 31, it determines that the occupant's perception is blunted when the state in which the eye opening degree is less than the blunting determination threshold value continues for the blunting determination time or longer. On the other hand, the blunting determination unit 32 determines that the occupant's perception is not blunted when the eye opening degree is equal to or higher than the blunting determination threshold value, or when the state in which the eye opening degree is less than the blunting determination threshold value does not continue for the blunting determination time or longer.
- similarly, when the blunting determination unit 32 acquires the blinking speed as face information from the face information detection unit 31, it determines that the occupant's perception is blunted when the state in which the blinking speed is less than the blunting determination threshold value continues for the blunting determination time or longer.
- on the other hand, the blunting determination unit 32 determines that the occupant's perception is not blunted when the blinking speed is equal to or higher than the blunting determination threshold value, or when the state in which the blinking speed is less than the blunting determination threshold value does not continue for the blunting determination time or longer.
- likewise, when the blunting determination unit 32 acquires the blink frequency as face information from the face information detection unit 31, it determines that the occupant's perception is blunted when the state in which the blink frequency is less than the blunting determination threshold value continues for the blunting determination time or longer.
- on the other hand, the blunting determination unit 32 determines that the occupant's perception is not blunted when the blink frequency is equal to or higher than the blunting determination threshold value, or when the state in which the blink frequency is less than the blunting determination threshold value does not continue for the blunting determination time or longer.
- the blunting determination threshold here is different from the blunting determination threshold in the first embodiment. That is, the blunting determination threshold in the first embodiment is a threshold for the amount of change in the face orientation, whereas the blunting determination threshold in the second embodiment is a threshold for the face information.
- that is, the blunting determination threshold value compared with the eye opening degree is a threshold value for the eye opening degree, the blunting determination threshold value compared with the blinking speed is a threshold value for the blinking speed, and the blunting determination threshold value compared with the blink frequency is a threshold value for the blink frequency.
- the blunting determination time may be different for each determination.
- the blunting determination threshold value may be a predetermined value or a value set based on the face information acquired from the time of boarding to the lapse of a predetermined time. This is because it is considered that the occupant's arousal degree is high at the time of boarding, and the accuracy of the blunting determination can be improved by using the face information acquired when the arousal degree is high as a reference.
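- Setting the threshold from boarding-time face information, as suggested above, could look like this (a sketch under assumptions: the calibration samples, the 0.5 margin, and the names are invented for illustration):

```python
def calibrated_threshold(boarding_eye_openings, margin=0.5):
    """Blunting determination threshold derived as a fraction of the mean
    eye opening degree observed just after boarding, when the occupant's
    arousal is presumed high."""
    baseline = sum(boarding_eye_openings) / len(boarding_eye_openings)
    return baseline * margin
```

Calibrating against the alert-state baseline adapts the determination to each occupant's eye shape, which is the accuracy benefit the text describes.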
- FIG. 10 is a flowchart showing an operation example of the sleep detection device 30 according to the second embodiment.
- in the following description, the same steps as the processing of the sleep detection device 10 according to the first embodiment are designated by the same reference numerals as those shown in FIGS. 6 and 7, and the description thereof will be omitted or simplified.
- the face information detection unit 31 detects the face information using the feature amount acquired from the feature amount calculation unit 13 (ST201).
- here, an example will be described in which the face information detection unit 31 detects the eye opening degree as face information and the blunting determination unit 32 performs the blunting determination using the eye opening degree and the blunting determination threshold value for the eye opening degree.
- the blunting determination unit 32 determines whether or not the occupant's perception is blunted (ST202).
- the blunting determination unit 32 acquires, for example, the occupant's eye opening degree during a set period as face information from the face information detection unit 31. Then, when the acquired eye opening degree is equal to or higher than the blunting determination threshold value, the blunting determination unit 32 determines that the occupant's perception has not been blunted (ST202; NO), and outputs the determination result to the sleep detection unit 18.
- the process of the sleep detection device 30 proceeds to ST101.
- otherwise, the blunting determination unit 32 determines that the occupant is in a state where the perception is blunted (ST202; YES), and the process proceeds to ST105.
- the blunting determination unit 32 determines whether or not the state in which the occupant's perception is blunted continues for the blunting determination time or longer (ST105).
- when the blunting determination unit 32 determines that the state in which the occupant's perception is blunted does not continue for the blunting determination time or longer (ST105; NO), it determines that the occupant's perception is not blunted, and the process of the sleep detection device 30 proceeds to ST101.
- on the other hand, when the blunting determination unit 32 determines that the state in which the occupant's perception is blunted continues for the blunting determination time or longer (ST105; YES), it determines that the occupant's perception is blunted, outputs the determination result to the sleep detection unit 18, and the process of the sleep detection device 30 proceeds to ST102.
- next, the face orientation detection unit 14 acquires the feature amount and detects the face orientation of the occupant extracted from the captured image (ST102). Then, the posture determination unit 17 of the sleep detection device 30 acquires the amount of change in the face orientation of the occupant from the face orientation change amount calculation unit 15 (ST106). The posture determination unit 17 determines whether or not the posture of the occupant is abnormal depending on whether or not the acquired amount of change is equal to or greater than the posture determination threshold value (ST107). If the acquired amount of change is less than the posture determination threshold value, the posture determination unit 17 determines that the posture of the occupant is not abnormal (ST107; NO), and outputs the determination result to the sleep detection unit 18. Next, the process of the sleep detection device 30 proceeds to ST101. As in the first embodiment, the posture determination unit 17 may use the size of the face region or the distance between the occupant and the sensor instead of the face orientation for the posture determination.
- on the other hand, the posture determination unit 17 determines that the posture of the occupant is abnormal if, for example, the difference between the minimum value and the maximum value of the pitch angle is equal to or greater than the set threshold value (ST107; YES), and the process of the sleep detection device 30 proceeds to ST108 described below.
- next, the posture determination unit 17 determines whether or not the determination that the posture of the occupant is abnormal was made within the posture determination time (ST108). When the posture determination unit 17 determines that this determination was not made within the posture determination time (ST108; NO), it determines that the posture of the occupant is not abnormal, and the process of the sleep detection device 30 proceeds to ST101. That is, in this case the determination result of ST107 that the posture of the occupant is abnormal is rejected.
- the posture determination unit 17 determines that the determination that the posture of the occupant is abnormal is the determination within the posture determination time (ST108; YES)
- the determination result is output to the sleep detection unit 18.
- the sleep detection unit 18 detects that the occupant has fallen into sleep, and the notification unit 50 notifies the occupant (ST109).
- as described above, the sleep detection device 30 according to the second embodiment further includes the face information detection unit 31 that detects face information regarding the state of the occupant's face using the feature amount calculated by the feature amount calculation unit 13, and the blunting determination unit 32 performs the blunting determination using the face information.
- with this configuration, the blunted perception that appears before the occupant's momentary sleep can be determined from the state of the occupant's face, so the occupant's momentary sleep can be detected and the detection accuracy of the occupant's sleep can be improved.
- in the present embodiment, the example in which the blunting determination unit 32 acquires the eye opening degree as face information from the face information detection unit 31 and uses it for the blunting determination has been described, but the face information used for the blunting determination is not limited to the eye opening degree.
- for example, the blunting determination unit 32 may acquire the eye opening degree, the blinking speed, and the blink frequency as face information from the face information detection unit 31, and may determine that the occupant's perception is blunted if at least one of the acquired pieces of face information is less than the corresponding set threshold value.
- FIG. 11 is a block diagram showing a configuration example of the sleep detection system 102 according to the third embodiment. Similar to the first embodiment, the sleep detection device 40 according to the third embodiment includes an image acquisition unit 11 that acquires a captured image, a blunting determination unit 16 that determines whether or not the perception of the occupant extracted from the captured image is blunted, a posture determination unit 17 that determines whether or not the posture of the occupant extracted from the captured image is abnormal, and a sleep detection unit 42 that detects the sleep of the occupant using the determination result of the blunting determination unit 16 and the determination result of the posture determination unit 17.
- the present embodiment differs from the first embodiment in that the sleep detection unit 42 detects the sleep of the occupant using the drowsiness level calculated by the drowsiness level calculation unit 41 in addition to the determination results of the blunting determination unit 16 and the posture determination unit 17.
- the same components as those in the first embodiment are designated by the same reference numerals, and the description thereof will be omitted.
- the sleep detection device 40 of the present embodiment includes a face information detection unit 31 and a sleepiness level calculation unit 41 for calculating the sleepiness level of the occupant. Since the configuration of the face information detection unit 31 is the same as the configuration described in the second embodiment, the description thereof will be omitted.
- the drowsiness level is a scale of drowsiness divided in stages from a state with a high degree of arousal to a state with a low degree of arousal.
- for example, the state with the highest arousal level is defined as drowsiness level 1, the state with the lowest arousal level is defined as drowsiness level 9, and the drowsiness level is expressed in stages from level 1 to level 9.
- the sleep detection unit 42 can improve the reliability of the sleep detection result of the occupant by using the drowsiness level calculation result of the drowsiness level calculation unit 41 in addition to the determination results of the blunting determination unit 16 and the posture determination unit 17.
- the drowsiness level calculation unit 41 acquires face information from the face information detection unit 31 and calculates the drowsiness level of the occupant. For example, when the eye opening degree is acquired as face information, the drowsiness level calculation unit 41 acquires the eye opening degree from the face information detection unit 31 and calculates a higher drowsiness level for the occupant as the eye opening degree becomes smaller.
- a plurality of threshold values may be set for the degree of eye opening, and the drowsiness level may be calculated depending on whether or not the threshold values are less than the respective threshold values.
- similarly, the drowsiness level calculation unit 41 may acquire the blinking speed from the face information detection unit 31 and calculate a higher drowsiness level for the occupant as the blinking speed becomes smaller.
- a plurality of threshold values may be set for the blinking speed, and the drowsiness level may be calculated depending on whether or not the threshold value is less than each threshold value.
- likewise, the drowsiness level calculation unit 41 may acquire the blink frequency from the face information detection unit 31 and calculate a higher drowsiness level for the occupant as the blink frequency decreases.
- a plurality of threshold values may be set for the frequency of blinking, and the drowsiness level may be calculated depending on whether or not the threshold value is less than each threshold value.
- the drowsiness level calculation process by the drowsiness level calculation unit 41 is not limited to the above example, and various known algorithms can be used. Then, the calculation result by the drowsiness level calculation unit 41 is output to the sleep detection unit 42.
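- The multi-threshold mapping described above can be sketched as follows (illustrative only; the cut-off values and names are assumptions, and a real scale might use the nine levels of the earlier example):

```python
EYE_OPENING_CUTOFFS = [8.0, 6.0, 4.0, 2.0]  # assumed cut-offs, descending order

def drowsiness_level(eye_opening):
    """Level 1 = fully alert; each cut-off crossed raises the level by one,
    so a smaller eye opening degree yields a higher drowsiness level."""
    level = 1
    for cutoff in EYE_OPENING_CUTOFFS:
        if eye_opening < cutoff:
            level += 1
    return level
```

The same stepped mapping applies to blinking speed or blink frequency with their own cut-off lists.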
- the sleep detection unit 42 acquires the calculation result of the drowsiness level calculation unit 41, and when the drowsiness level of the occupant calculated by the drowsiness level calculation unit 41 is equal to or higher than the set drowsiness level, that is, the set threshold value, detects the sleep of the occupant using the determination result of the blunting determination unit 16 and the determination result of the posture determination unit 17.
- in other words, it is preferable that the sleep detection unit 42 detect the sleep of the occupant using the determination result of the blunting determination unit 16 and the determination result of the posture determination unit 17 only when the calculated drowsiness level is equal to or higher than the set threshold value.
- the blunting determination unit 16 determines that the occupant's perception is blunted, and the posture determination unit 17 determines that the occupant's posture is abnormal.
- the sleep detection unit 42 detects that the occupant has fallen asleep.
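The combined detection condition of the third embodiment — drowsiness level at or above the set threshold, perception blunted, and posture abnormal — can be expressed as a minimal sketch; the function name and signature are assumptions for illustration:

```python
def detect_sleep(drowsiness_level, set_level, perception_blunted, posture_abnormal):
    """Detect sleep only when the drowsiness level is at or above the set
    level AND both the blunting determination and the posture
    determination indicate sleep."""
    if drowsiness_level < set_level:
        # Below the set drowsiness level: skip the remaining determinations.
        return False
    return perception_blunted and posture_abnormal

print(detect_sleep(3, 2, True, True))   # True
print(detect_sleep(3, 2, True, False))  # False
```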
- FIG. 12 is a flowchart showing an operation example of the sleep detection device 40 according to the third embodiment.
- steps identical to the processing of the sleep detection device 10 according to the first embodiment are designated by the same reference numerals as those shown in FIGS. 6 and 7, and their description is omitted or simplified.
- the face information detection unit 31 detects face information using the feature amount acquired from the feature amount calculation unit 13 (ST301), and outputs the detected face information to the drowsiness level calculation unit 41.
- in the following, it is assumed that the face information detection unit 31 detects the occupant's eye opening degree as the face information, and that the drowsiness level calculation unit 41 calculates the occupant's drowsiness level using the eye opening degree.
- the drowsiness level calculation unit 41 calculates the drowsiness level of the occupant using the eye opening degree of the occupant acquired as the face information (ST302), and outputs the calculated drowsiness level of the occupant to the sleep detection unit 42.
- that is, the drowsiness level calculation unit 41 acquires the eye opening degree as the face information from the face information detection unit 31, and calculates a higher drowsiness level for the occupant as the eye opening degree decreases.
- the sleep detection unit 42 determines whether or not the acquired drowsiness level is equal to or higher than the set threshold value (ST303).
- when the sleep detection unit 42 determines that the drowsiness level is less than the set threshold value (ST303; NO), the process of the sleep detection device 40 proceeds to ST101.
- when the sleep detection unit 42 determines that the drowsiness level is equal to or higher than the set threshold value (ST303; YES), the process of the sleep detection device 40 proceeds to ST102.
- the face orientation detection unit 14 acquires the feature amount of the occupant and detects the occupant's face orientation in the captured image (ST102). The face orientation detection unit 14 then outputs the detected face orientation of the occupant to the face orientation change amount calculation unit 15. Next, the face orientation change amount calculation unit 15 calculates the amount of change in the occupant's face orientation using the occupant's face orientation acquired from the face orientation detection unit 14 (ST103).
- the blunting determination unit 16 determines whether or not the occupant's perception is blunted (ST104).
- the blunting determination unit 16 acquires, for example, the amount of change in the face orientation of the occupant from the face orientation change amount calculation unit 15. When the acquired amount of change in the face orientation is equal to or greater than the blunting determination threshold value, the blunting determination unit 16 determines that the occupant's perception is not blunted (ST104; NO), and outputs the determination result to the sleep detection unit 42. The process of the sleep detection device 40 then proceeds to ST101.
- when the acquired amount of change in the face orientation is less than the blunting determination threshold value, the blunting determination unit 16 determines that the occupant's perception is blunted (ST104; YES), and proceeds to the process of ST105.
- the blunting determination by the blunting determination unit 16 may be performed not only by using the face orientation but also by using the face information.
- the blunting determination unit 16 determines whether or not the state in which the occupant's perception is blunted continues for the blunting determination time or longer (ST105).
- when the blunting determination unit 16 determines that the state in which the occupant's perception is blunted has not continued for the blunting determination time or longer (ST105; NO), the process of the sleep detection device 40 proceeds to ST101.
- when the blunting determination unit 16 determines that the state in which the occupant's perception is blunted has continued for the blunting determination time or longer (ST105; YES), the determination result is output to the sleep detection unit 42, and the process of the sleep detection device 40 proceeds to ST106 described below.
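The blunting determination of ST104–ST105 — the face-orientation change staying below the blunting determination threshold continuously for at least the blunting determination time — can be sketched roughly as follows; the class name, threshold, and times are illustrative assumptions, not values from the specification:

```python
class BluntingDeterminer:
    """Judge that perception is blunted when the face-orientation change
    stays below a threshold continuously for at least a set duration."""

    def __init__(self, change_threshold, duration_s):
        self.change_threshold = change_threshold
        self.duration_s = duration_s
        self._below_since = None  # time the change first fell below the threshold

    def update(self, change, t):
        if change >= self.change_threshold:
            self._below_since = None  # movement observed: reset the timer
            return False
        if self._below_since is None:
            self._below_since = t
        return t - self._below_since >= self.duration_s

det = BluntingDeterminer(change_threshold=5.0, duration_s=3.0)
print([det.update(c, t) for t, c in [(0, 1.0), (2, 1.0), (3, 1.0), (4, 6.0)]])
# [False, False, True, False]
```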
- the posture determination unit 17 of the sleep detection device 40 acquires the amount of change in the face orientation of the occupant from the face orientation change amount calculation unit 15 (ST106).
- the posture determination unit 17 acquires the amount of change in the face orientation from the face orientation change amount calculation unit 15, and determines whether or not the posture of the occupant is abnormal depending on whether or not the amount of change in the face orientation is equal to or greater than the posture determination threshold (ST107). If the amount of change in the face orientation is less than the posture determination threshold value, the posture determination unit 17 determines that the posture of the occupant is not abnormal (ST107; NO), and outputs the determination result to the sleep detection unit 42. The process of the sleep detection device 40 then proceeds to ST101.
- if the amount of change in the face orientation is equal to or greater than the posture determination threshold value, the posture determination unit 17 determines that the posture of the occupant is abnormal (ST107; YES), and proceeds to the process of ST108 described below.
- the posture determination by the posture determination unit 17 may be performed not only by using the face orientation but also by using the size of the face region or the distance between the occupant and a sensor.
- the posture determination unit 17 determines whether or not the determination that the posture of the occupant is abnormal was made within the posture determination time (ST108). When the posture determination unit 17 determines that the determination was not made within the posture determination time (ST108; NO), it determines that the posture of the occupant is not abnormal, and the process of the sleep detection device 40 proceeds to ST101. That is, if it is determined in ST108 that the determination was not made within the posture determination time, the determination result in ST107 that the posture of the occupant is abnormal is rejected.
- when the posture determination unit 17 determines that the determination that the posture of the occupant is abnormal was made within the set posture determination time (ST108; YES), it determines that the posture of the occupant is abnormal, and outputs the determination result to the sleep detection unit 42. The sleep detection unit 42 then determines that the occupant has fallen asleep, and the notification unit 50 issues an alarm to the occupant (ST109).
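The posture determination of ST106–ST108 — the face-orientation change reaching the posture determination threshold at some point within the set posture determination time — might be sketched as follows, with the function name and the log format assumed for illustration:

```python
def posture_abnormal(change_log, threshold, window_s, t0):
    """Return True if the face-orientation change reaches the posture
    determination threshold at some time within the posture determination
    window [t0, t0 + window_s]. change_log is a list of (time, change)."""
    for t, change in change_log:
        if t0 <= t <= t0 + window_s and change >= threshold:
            return True
    # No sufficiently large change inside the window: determination rejected.
    return False

log = [(0.0, 1.0), (1.0, 8.0), (5.0, 9.0)]
print(posture_abnormal(log, threshold=7.0, window_s=2.0, t0=0.0))  # True
print(posture_abnormal(log, threshold=7.0, window_s=0.5, t0=0.0))  # False
```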
- the blunting determination by the blunting determination unit, the posture determination by the posture determination unit, and the calculation of the drowsiness level by the drowsiness level calculation unit have been described as being performed using the occupant's feature amount.
- the blunting determination, the posture determination, and the calculation of the drowsiness level are not limited to those performed using the occupant's feature amount.
- the blunting determination, the posture determination, and the calculation of the drowsiness level may be performed using, for example, machine learning.
- a learning device is provided in the blunting determination unit, and the learning device is trained by inputting, for example, images of occupants in which a blunting of perception is occurring and images of occupants in which it is not occurring. The learning device of the blunting determination unit may then acquire the captured image from the image acquisition unit and determine whether or not the perception of the occupant in the captured image is blunted.
- the posture determination unit may likewise be provided with a learning device, and the learning device is trained by inputting, for example, captured images of occupants whose posture is abnormal, such as slumping forward, nodding off, or falling down, and captured images of occupants whose posture is not abnormal. The learning device of the posture determination unit may then acquire the captured image from the image acquisition unit and determine whether or not the posture of the occupant in the captured image is abnormal.
- the drowsiness level calculation unit may also be provided with a learning device, and the learning device is trained by inputting, for example, images of occupants corresponding to each of the stepwise drowsiness levels. The learning device of the drowsiness level calculation unit may then acquire the captured image from the image acquisition unit and calculate the drowsiness level of the occupant in the captured image.
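As a rough illustration of replacing the threshold-based determinations with a learned model, the sketch below uses a trivial nearest-centroid classifier over feature vectors; an actual implementation would use a trained neural network on captured images, and all names, labels, and feature values here are illustrative assumptions:

```python
def train_centroids(samples):
    """samples: {label: [feature_vectors]} -> {label: centroid}.
    Stands in for the 'learning' step of the learning device."""
    centroids = {}
    for label, vecs in samples.items():
        n = len(vecs)
        centroids[label] = [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]
    return centroids

def classify(centroids, vec):
    """Return the label whose centroid is nearest to vec (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], vec))

c = train_centroids({"blunted": [[0.1, 0.1], [0.2, 0.0]],
                     "alert":   [[0.9, 1.0], [1.0, 0.8]]})
print(classify(c, [0.15, 0.05]))  # blunted
```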
- 10, 30, 40 sleep detection device, 10a processing circuit, 10b processor, 10c memory, 11 image acquisition unit, 12 feature point detection unit, 13 feature amount calculation unit, 14 face orientation detection unit, 15 face orientation change amount calculation unit, 16, 32 blunting determination unit, 17 posture determination unit, 18, 42 sleep detection unit, 20 imaging device, 31 face information detection unit, 41 drowsiness level calculation unit, 50 notification unit, 100, 101, 102 sleep detection system, 201 driver's seat, 202 passenger seat, 211, 212 occupants, 300 captured image, 301 face region, 311 outer corners of the eyes, 312 inner corners of the eyes, 313 upper eyelids, 314 lower eyelids, 315 nose root, 316 nose tip, 317 upper lip, 318 lower lip, 319 mouth corners.
Abstract
Description
FIG. 1 is a block diagram showing a configuration example of the sleep detection system 100 according to the first embodiment. The sleep detection system 100 includes the sleep detection device 10 and the imaging device 20, and the sleep detection device 10 and the imaging device 20 are each mounted on a vehicle.
FIG. 9 is a block diagram showing a configuration example of the sleep detection system 101 according to the second embodiment. As in the first embodiment, the sleep detection device 30 according to the second embodiment includes the image acquisition unit 11 that acquires a captured image, the blunting determination unit 32 that determines whether or not a blunting of perception is occurring in the occupant extracted from the captured image, the posture determination unit 17 that determines whether or not the posture of the occupant extracted from the captured image is abnormal, and the sleep detection unit 18 that detects sleep of the occupant using the determination result of the blunting determination unit 32 and the determination result of the posture determination unit 17. The present embodiment differs from the first embodiment in that the blunting determination unit 32 determines whether or not a blunting of perception is occurring in the occupant by using the face information of the occupant extracted from the captured image. Components identical to those of the first embodiment are given the same reference numerals, and their description is omitted.
FIG. 11 is a block diagram showing a configuration example of the sleep detection system 102 according to the third embodiment. As in the first embodiment, the sleep detection device 40 according to the third embodiment includes the image acquisition unit 11 that acquires a captured image, the blunting determination unit 16 that determines whether or not a blunting of perception is occurring in the occupant extracted from the captured image, the posture determination unit 17 that determines whether or not the posture of the occupant extracted from the captured image is abnormal, and the sleep detection unit 42 that detects sleep of the occupant using the determination result of the blunting determination unit 16 and the determination result of the posture determination unit 17. The present embodiment differs from the first embodiment in that the sleep detection unit 42 detects the sleep of the occupant by using the drowsiness level calculated by the drowsiness level calculation unit 41 in addition to the determination results of the blunting determination unit 16 and the posture determination unit 17. Components identical to those of the first embodiment are given the same reference numerals, and their description is omitted.
Claims (13)
- A sleep detection device comprising: an image acquisition unit to acquire, from an imaging device mounted on a vehicle to image an occupant inside the vehicle, a captured image in which the occupant is imaged; a blunting determination unit to determine whether or not a blunting of perception is occurring in the occupant extracted from the captured image acquired by the image acquisition unit; a posture determination unit to determine whether or not a posture of the occupant extracted from the captured image acquired by the image acquisition unit is abnormal; and a sleep detection unit to detect sleep of the occupant using a determination result of the blunting determination unit and a determination result of the posture determination unit, wherein the sleep detection unit determines that the occupant has fallen asleep when the blunting determination unit determines that a blunting of perception is occurring in the occupant and the posture determination unit determines that the posture of the occupant is abnormal.
- The sleep detection device according to claim 1, wherein the sleep detection unit determines that the occupant has fallen asleep when, after the blunting determination unit determines that a blunting of perception is occurring in the occupant, the posture determination unit determines that the posture of the occupant is abnormal.
- The sleep detection device according to claim 1 or 2, wherein the posture determination unit determines that the posture of the occupant is abnormal when the distance between the occupant and a sensor, acquired from the sensor that acquires the distance to the occupant, is outside a set range.
- The sleep detection device according to claim 1 or 2, wherein the posture determination unit determines that the posture of the occupant is abnormal when the size of a face region, which is a region containing the facial parts of the occupant extracted from the captured image, is outside a set range.
- The sleep detection device according to claim 1 or 2, further comprising: a face orientation detection unit to detect the face orientation of the occupant extracted from the captured image acquired by the image acquisition unit; and a face orientation change amount calculation unit to calculate the amount of change in the face orientation of the occupant detected by the face orientation detection unit, wherein the posture determination unit determines that the posture of the occupant is abnormal when, within a set posture determination time, the amount of change in the face orientation of the occupant calculated by the face orientation change amount calculation unit becomes equal to or greater than a set posture determination threshold.
- The sleep detection device according to any one of claims 1 to 4, further comprising: a face orientation detection unit to detect the face orientation of the occupant extracted from the captured image acquired by the image acquisition unit; and a face orientation change amount calculation unit to calculate the amount of change in the face orientation of the occupant detected by the face orientation detection unit, wherein the blunting determination unit determines that a blunting of perception is occurring in the occupant when a state in which the amount of change in the face orientation of the occupant calculated by the face orientation change amount calculation unit is less than a set blunting determination threshold continues for a set blunting determination time or longer.
- The sleep detection device according to claim 1 or 2, further comprising: a face orientation detection unit to detect the face orientation of the occupant extracted from the captured image acquired by the image acquisition unit; and a face orientation change amount calculation unit to calculate the amount of change in the face orientation of the occupant detected by the face orientation detection unit, wherein the blunting determination unit determines that a blunting of perception is occurring in the occupant when a state in which the amount of change in the face orientation of the occupant calculated by the face orientation change amount calculation unit is less than a set blunting determination threshold continues for a set blunting determination time or longer, and the posture determination unit determines that the posture of the occupant is abnormal when, within a set posture determination time, the amount of change in the face orientation of the occupant calculated by the face orientation change amount calculation unit becomes equal to or greater than a set posture determination threshold.
- The sleep detection device according to any one of claims 1 to 5, further comprising a face information detection unit to detect face information on the state of the face of the occupant extracted from the captured image acquired by the image acquisition unit, wherein the blunting determination unit determines, using the face information detected by the face information detection unit, whether or not a blunting of perception is occurring in the occupant.
- The sleep detection device according to claim 8, wherein the face information detection unit detects, as the face information, at least one of the eye opening degree, the blinking speed, and the blinking frequency of the occupant, and the blunting determination unit determines that a blunting of perception is occurring in the occupant when a state in which at least one of the eye opening degree, the blinking speed, and the blinking frequency of the occupant is less than a set blunting determination threshold continues for a set blunting determination time or longer.
- The sleep detection device according to claim 8 or 9, further comprising a drowsiness level calculation unit to calculate a drowsiness level of the occupant using the face information detected by the face information detection unit, wherein the sleep detection unit determines that the occupant has fallen asleep when the drowsiness level calculation unit determines that the drowsiness level of the occupant is equal to or higher than a set threshold, the blunting determination unit determines that a blunting of perception is occurring in the occupant, and the posture determination unit determines that the posture of the occupant is abnormal.
- The sleep detection device according to any one of claims 1 to 7, further comprising: a face information detection unit to detect face information on the state of the occupant extracted from the captured image acquired by the image acquisition unit; and a drowsiness level calculation unit to calculate a drowsiness level of the occupant using the face information detected by the face information detection unit, wherein, in a case where the drowsiness level calculation unit determines that the drowsiness level of the occupant is equal to or higher than a set threshold, the sleep detection unit determines that the occupant has fallen asleep if the blunting determination unit determines that a blunting of perception is occurring in the occupant and the posture determination unit determines that the posture of the occupant is abnormal.
- The sleep detection device according to any one of claims 1 to 11, further comprising a notification unit that is mounted on the vehicle and notifies the occupant when the sleep detection unit detects that the occupant has fallen asleep.
- A sleep detection system comprising: an imaging device mounted on a vehicle to image an occupant inside the vehicle; an image acquisition unit to acquire, from the imaging device, a captured image in which the occupant is imaged; a blunting determination unit to determine whether or not a blunting of perception is occurring in the occupant extracted from the captured image acquired by the image acquisition unit; a posture determination unit to determine whether or not a posture of the occupant extracted from the captured image acquired by the image acquisition unit is abnormal; and a sleep detection unit to detect sleep of the occupant using a determination result of the blunting determination unit and a determination result of the posture determination unit, wherein the sleep detection unit determines that the occupant has fallen asleep when the blunting determination unit determines that a blunting of perception is occurring in the occupant and the posture determination unit determines that the posture of the occupant is abnormal.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/044237 WO2022113275A1 (ja) | 2020-11-27 | 2020-11-27 | 睡眠検出装置及び睡眠検出システム |
JP2022564938A JPWO2022113275A1 (ja) | 2020-11-27 | 2020-11-27 | |
DE112020007803.5T DE112020007803T5 (de) | 2020-11-27 | 2020-11-27 | Schlafdetektionsvorrichtung und Schlafdetektionsverfahren |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/044237 WO2022113275A1 (ja) | 2020-11-27 | 2020-11-27 | 睡眠検出装置及び睡眠検出システム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022113275A1 true WO2022113275A1 (ja) | 2022-06-02 |
Family
ID=81755449
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/044237 WO2022113275A1 (ja) | 2020-11-27 | 2020-11-27 | 睡眠検出装置及び睡眠検出システム |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2022113275A1 (ja) |
DE (1) | DE112020007803T5 (ja) |
WO (1) | WO2022113275A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116077798A (zh) * | 2023-04-10 | 2023-05-09 | 南京信息工程大学 | 一种基于语音诱导与计算机视觉相结合的催眠方法 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0757172A (ja) * | 1993-08-16 | 1995-03-03 | Mitsubishi Electric Corp | 運転者異常事態発生警報装置 |
JP2010198313A (ja) * | 2009-02-25 | 2010-09-09 | Denso Corp | 開眼度特定装置 |
JP2014092965A (ja) * | 2012-11-05 | 2014-05-19 | Denso Corp | 乗員監視装置 |
JP2016539446A (ja) * | 2013-10-29 | 2016-12-15 | キム,ジェ−チョル | 動作と、顔面と、目及び口形状の認知を通じた2段階に亘っての居眠り運転の防止装置 |
JP2017049636A (ja) * | 2015-08-31 | 2017-03-09 | マツダ株式会社 | 運転者状態検出装置 |
JP2019148491A (ja) * | 2018-02-27 | 2019-09-05 | オムロン株式会社 | 乗員監視装置 |
JP2020086907A (ja) * | 2018-11-26 | 2020-06-04 | トヨタ自動車株式会社 | 漫然運転判定装置 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4259585B2 (ja) | 2007-03-08 | 2009-04-30 | 株式会社デンソー | 眠気判定装置,プログラムおよび眠気判定方法 |
-
2020
- 2020-11-27 WO PCT/JP2020/044237 patent/WO2022113275A1/ja active Application Filing
- 2020-11-27 DE DE112020007803.5T patent/DE112020007803T5/de active Pending
- 2020-11-27 JP JP2022564938A patent/JPWO2022113275A1/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
DE112020007803T5 (de) | 2023-11-16 |
JPWO2022113275A1 (ja) | 2022-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104573623B (zh) | 人脸检测装置、方法 | |
JP6573193B2 (ja) | 判定装置、判定方法、および判定プログラム | |
JP4915413B2 (ja) | 検出装置および方法、並びに、プログラム | |
JP6973928B2 (ja) | 瞼開閉判定装置および眠気検知装置 | |
JP7369184B2 (ja) | 運転者注意状態推定 | |
US11453401B2 (en) | Closed eye determination device | |
JP4776581B2 (ja) | 入眠状態検知方法、入眠状態検知装置、入眠状態検知システム及びコンピュータプログラム | |
US20240096116A1 (en) | Devices and methods for detecting drowsiness of drivers of vehicles | |
WO2022113275A1 (ja) | 睡眠検出装置及び睡眠検出システム | |
US11430231B2 (en) | Emotion estimation device and emotion estimation method | |
JP7024332B2 (ja) | ドライバモニタシステム | |
US11195108B2 (en) | Abnormality detection device and abnormality detection method for a user | |
US11161470B2 (en) | Occupant observation device | |
JP7046748B2 (ja) | 運転者状態判定装置および運転者状態判定方法 | |
JP2011125620A (ja) | 生体状態検出装置 | |
JP6594595B2 (ja) | 運転不能状態判定装置および運転不能状態判定方法 | |
JP6689470B1 (ja) | 情報処理装置、プログラム及び情報処理方法 | |
JP7019394B2 (ja) | 視認対象検知装置、視認対象検知方法、およびプログラム | |
JP7325687B2 (ja) | 眠気推定装置及び眠気推定システム | |
JP7420277B2 (ja) | 運転状態判定装置、方法、及びプログラム | |
WO2023017595A1 (ja) | 乗員状態判定装置、乗員状態判定方法、および、乗員状態判定システム | |
JP7374386B2 (ja) | 状態判定装置および状態判定方法 | |
JP2022161318A (ja) | 顔認識装置 | |
WO2022190413A1 (ja) | 目開閉判定装置および目開閉判定方法 | |
JP2019079285A (ja) | 安全運転促進装置及び安全運転促進方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20963539 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022564938 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112020007803 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20963539 Country of ref document: EP Kind code of ref document: A1 |