CN112183176A - Occupant observation device, occupant observation method, and storage medium - Google Patents
- Publication number
- CN112183176A (application CN202010594597.2A)
- Authority
- CN
- China
- Prior art keywords
- occupant
- eye
- unit
- index value
- degree
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
Abstract
The invention provides an occupant observation device, an occupant observation method, and a storage medium capable of accurately determining whether an occupant is dozing. The occupant observation device includes: an imaging unit that captures an image of the head of an occupant of a vehicle; an index value deriving unit that derives an index value indicating the degree of opening and closing of the eyes of the occupant based on the image generated by the imaging unit; a tilt estimation unit that estimates a degree of change in the inclination of the occupant's head relative to the waking state; and a determination unit that determines whether the occupant is dozing based on a result of determination using both the index value derived by the index value deriving unit and the degree of change estimated by the tilt estimation unit.
Description
Technical Field
The invention relates to an occupant observation device, an occupant observation method, and a storage medium.
Background
Conventionally, studies have been made on devices that determine that an occupant of a vehicle is dozing. An important element in determining that the occupant is drowsy is the state of the eyes. Devices that capture an image of the occupant with a camera, analyze the image, and observe the state of the eyes have therefore been put to practical use (see, for example, Japanese Patent Laid-Open No. 6-266981).
Problems to be solved by the invention
Here, drowsiness of the occupant may also be manifested in a region other than the eyes. However, the conventional technology cannot determine whether or not the occupant is drowsy based on factors other than the state of the eyes.
Disclosure of Invention
An object of the present invention is to provide an occupant observation device, an occupant observation method, and a storage medium that can accurately determine whether an occupant is dozing.
Means for solving the problems
The occupant observation device, the occupant observation method, and the storage medium according to the present invention adopt the following configurations.
(1) An occupant observation device according to an aspect of the present invention includes: an imaging unit that captures an image of the head of an occupant of a vehicle; an index value deriving unit that derives an index value indicating the degree of opening and closing of the eyes of the occupant based on the image generated by the imaging unit; a tilt estimation unit that estimates a degree of change in the inclination of the occupant's head relative to the waking state; and a determination unit that determines whether the occupant is dozing based on a result of determination using both the index value derived by the index value deriving unit and the degree of change estimated by the tilt estimation unit.
(2) The occupant observation device according to (1) above further includes an eye detection unit that detects at least a part of a contour of the eyes of the occupant based on the image generated by the imaging unit, and the index value derivation unit derives the index value based on a positional relationship of a plurality of feature points in the contour detected by the eye detection unit.
(3) In the occupant observation device according to the above aspect (1) or (2), the determination unit determines that the occupant is dozing when a state in which the index value indicates that the eye-open degree is less than a first threshold value, or a state in which the index value indicates that the eye-closing degree is equal to or greater than a second threshold value, continues for a first predetermined time or longer.
(4) In the occupant observation device according to any one of the above aspects (1) to (3), the determination unit determines that the occupant is dozing when a state in which the degree of change is equal to or greater than a third threshold value continues for a second predetermined time or longer.
(5) In the occupant observation method according to another aspect of the present invention, the computer performs: capturing a head of an occupant of a vehicle and generating an image; deriving an index value indicating the degree of opening and closing of the eyes of the occupant based on the generated image, and estimating the degree of change in the inclination of the head of the occupant with respect to the waking state; and determining whether the occupant is drowsy based on a result of determination based on both the derived index value and the estimated degree of change.
(6) A program stored in a storage medium according to another aspect of the present invention causes a computer to perform: capturing a head of an occupant of a vehicle and generating an image; deriving an index value indicating the degree of opening and closing of the eyes of the occupant based on the generated image, and estimating the degree of change in the inclination of the head of the occupant with respect to the waking state; and determining whether the occupant is drowsy based on a result of determination based on both the derived index value and the estimated degree of change.
Effects of the invention
According to (1) to (6), it is possible to accurately determine whether the occupant is drowsy.
According to (3) to (4), it is possible to determine with higher accuracy whether the occupant is dozing.
Drawings
Fig. 1 is a diagram illustrating an example of a structure and a use environment of an occupant observation apparatus.
Fig. 2 is a diagram illustrating a position where an image pickup unit is provided.
Fig. 3 is a diagram schematically showing the content of processing performed by the eye detection section.
Fig. 4 is a diagram (1) for explaining the processing of the eye-open ratio deriving section.
Fig. 5 is a diagram (2) for explaining the processing of the eye-open ratio deriving section.
Fig. 6 is a diagram schematically showing the contents of the processing performed by the inclination estimating unit.
Fig. 7 is a flowchart showing an example of the flow of processing executed by the image processing apparatus.
Description of the reference numerals
1 … occupant observation device, 10 … imaging unit, 20 … image processing device, 22 … eye detection unit, 24 … eye-open rate derivation unit, 26 … tilt estimation unit, 28 … determination unit, 100 … various in-vehicle devices, BN … nose bridge, CL1 … first line segment, CL2 … second line segment, CL2d … predetermined line segment, CT … face contour, ECT … eye contour, EG … edge, EW … eye detection window, NM … nose detection window, Th1 … first threshold, Th2 … second threshold, Th3 … third threshold, α … eye-open rate, β … eye-closing rate, θ1 … angle, θ2 … angle, θini … reference angle.
Detailed Description
Hereinafter, embodiments of an occupant observation device, an occupant observation method, and a storage medium according to the present invention will be described with reference to the drawings.
< embodiment >
Fig. 1 is a diagram illustrating an example of the structure and use environment of the occupant observation device 1. The occupant observation device 1 includes, for example, an imaging unit 10 and an image processing device 20. The image processing device 20 includes, for example, an eye detection unit 22, an eye-open rate derivation unit 24, a tilt estimation unit 26, and a determination unit 28. The occupant observation device 1 determines, for example, whether the occupant of the vehicle is dozing, and outputs the determination result to various in-vehicle devices 100. The occupants include at least the driver, and may also include a passenger in the front passenger seat. The various in-vehicle devices 100 are driving support devices, automatic driving control devices, intelligent devices, and other devices, and the occupant observation device 1 estimates and outputs the state of the occupant according to the type and purpose of the various in-vehicle devices 100.
Fig. 2 is a diagram illustrating the position at which the imaging unit 10 is provided. The imaging unit 10 is provided, for example, at the center portion of the instrument panel of the vehicle, and captures at least the head of an occupant of the vehicle to generate an image. The imaging unit 10 is provided at a position laterally offset from the positions directly facing the driver seat DS, where the steering wheel SW is provided (an example of a seating position), and the passenger seat AS (another example of a seating position). Therefore, the image generated by the imaging unit 10 is an image of the occupant's head captured from an oblique, lateral direction. In other words, at least the head of the occupant is not on the optical axis of the imaging unit 10.
Referring back to fig. 1, each part of the image processing device 20 will be described. The components of the image processing device 20 are realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuit units) such as an LSI (Large Scale Integration) circuit, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation between software and hardware. The program may be stored in advance in a storage device (a storage device including a non-transitory storage medium) such as an HDD (Hard Disk Drive) or a flash memory, or may be stored in a removable storage medium (a non-transitory storage medium) such as a DVD or a CD-ROM and installed by mounting the storage medium in a drive device.
The function of the eye detection unit 22 will be described taking as an example a method of detecting an eye after extracting edges. The eye detection unit 22 first extracts edges from the image generated by the imaging unit 10 (hereinafter referred to as the captured image). An edge is a pixel (or a group of pixels) whose difference in pixel value from its surrounding pixels is larger than a reference value, that is, a characteristic pixel. The eye detection unit 22 extracts edges using an edge extraction filter such as a Sobel filter. The Sobel filter is merely an example, and the eye detection unit 22 may extract edges using another filter or algorithm.
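As an illustration of the kind of edge extraction described, the following is a minimal Sobel sketch in pure Python on a list-of-lists grayscale image. The threshold value and function name are assumptions for illustration, not taken from the patent.

```python
# Sobel kernels for horizontal (gx) and vertical (gy) intensity gradients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_edges(image, threshold=100):
    """Return a binary edge map: True where the gradient magnitude at a
    pixel exceeds `threshold`. Border pixels are left as non-edges."""
    h, w = len(image), len(image[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * image[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * image[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            edges[y][x] = (gx * gx + gy * gy) ** 0.5 > threshold
    return edges
```

For example, an image whose left half is dark and right half is bright yields edge pixels along the vertical boundary between the halves.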
The eye detection unit 22 detects at least a part of the eyes of the occupant in the captured image based on, for example, the distribution of the edges extracted from the captured image. At this time, the eye detection unit 22 detects the eye of the occupant that is closer to the imaging unit 10: the left eye for an occupant seated in the driver seat DS, and the right eye for an occupant seated in the passenger seat AS. The eye detection unit 22 may also directly detect a part of the eye (or the feature points described later) by inputting the captured image into a learned model generated by a machine learning method such as deep learning, without extracting edges.
Fig. 3 is a diagram schematically showing the content of the processing performed by the eye detection unit 22. In the figure, IM denotes an image obtained by superimposing the edges EG on the captured image. In this figure, attention is focused exclusively on the occupant seated in the driver seat DS. As shown in the upper diagram of fig. 3, the eye detection unit 22 first extracts the contour CT of the face by fitting an elliptical or oval model or the like to the edges EG. Next, as shown in the middle diagram of fig. 3, the eye detection unit 22 sets a nose detection window NM with reference to the face contour CT, and detects within it the position of the nose bridge BN, a portion whose edges are easily and clearly extracted. Next, as shown in the lower diagram of fig. 3, the eye detection unit 22 sets an eye detection window EW of a predetermined size, with reference to the position of the nose bridge BN, on the right side of the nose bridge BN where the left eye of the occupant should be present, and detects at least a part of the eye within the eye detection window EW. Since the eye detection window EW is set at a position overlapping the left eye of the occupant seated in the driver seat DS, the left eye is detected in the eye detection window EW. The processing of "detecting at least a part of an eye" can be defined in various concrete ways, but in the following description it means "detecting at least a part of the contour of an eye". When detecting the contour, the eye detection unit 22 detects it by, for example, fitting a curve model to the distribution of the edges EG.
The eye-open rate derivation unit 24 derives the eye-open rate α of the occupant's eyes based on the positional relationship of a plurality of feature points on the eye contour detected by the eye detection unit 22. The plurality of feature points include, for example, a first feature point, which is the end of the eye contour on the side laterally closer to the imaging unit 10 (corresponding to the outer canthus), a second feature point, which is the upper end, and a third feature point, which is the lower end. Fig. 4 is a diagram (1) for explaining the processing of the eye-open rate derivation unit 24. In the figure, P1 is the first feature point, P2 is the second feature point, and P3 is the third feature point. The eye-open rate derivation unit 24 virtually moves, for example, a vertical line leftward from the right end of the eye detection window EW, and sets the intersection at which the vertical line first crosses the eye contour ECT as the first feature point P1. Similarly, it virtually moves a horizontal line downward from the upper end of the eye detection window EW and sets the intersection at which the horizontal line first crosses the eye contour ECT as the second feature point P2, and virtually moves a horizontal line upward from the lower end of the eye detection window EW and sets the intersection at which the horizontal line first crosses the eye contour ECT as the third feature point P3.
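The feature-point search just described can be sketched as follows. This is a hypothetical illustration: `contour` is assumed to be a binary mask of the eye contour ECT inside the eye detection window EW, and `find_feature_points` is a name introduced here, not from the patent.

```python
def find_feature_points(contour):
    """Scan a binary contour mask for the three feature points:
    P1 = first contour pixel met by a vertical line moving leftward from
    the right edge; P2 = first met by a horizontal line moving downward
    from the top; P3 = first met by a horizontal line moving upward from
    the bottom. Each point is returned as an (x, y) tuple or None."""
    h, w = len(contour), len(contour[0])
    p1 = p2 = p3 = None
    for x in range(w - 1, -1, -1):          # right -> left: outer canthus P1
        ys = [y for y in range(h) if contour[y][x]]
        if ys:
            p1 = (x, ys[0])
            break
    for y in range(h):                      # top -> bottom: upper end P2
        xs = [x for x in range(w) if contour[y][x]]
        if xs:
            p2 = (xs[0], y)
            break
    for y in range(h - 1, -1, -1):          # bottom -> top: lower end P3
        xs = [x for x in range(w) if contour[y][x]]
        if xs:
            p3 = (xs[0], y)
            break
    return p1, p2, p3
```

On a diamond-shaped contour, for instance, P1 is the rightmost vertex, P2 the topmost, and P3 the bottommost.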
Then, the eye-open rate derivation unit 24 derives the eye-open rate α of the occupant based on the angle formed by a first straight line connecting the first feature point P1 and the second feature point P2 and a second straight line connecting the first feature point P1 and the third feature point P3. Fig. 5 is a diagram (2) for explaining the processing of the eye-open rate derivation unit 24. In the figure, L1 is the first straight line, L2 is the second straight line, and θ1 is the angle formed by them. The eye-open rate derivation unit 24 defines, for example, a reference angle θini, obtained by averaging the angles derived from the captured images over the first several minutes after the occupant enters the vehicle, as an eye-open rate α of 100 [%], and thereafter derives the eye-open rate α by dividing the derived angle θ1 by the reference angle θini (see expression (1)). When occupant authentication is performed, the reference angle corresponding to 100 [%] may be stored in memory for each occupant, read out for each occupant, and used in the calculation. A predetermined value may also be set as the reference angle θini, or a predetermined value may be used initially and gradually adjusted toward the occupant's average angle.
α=MIN{θ1/θini,100[%]}…(1)
In the above description, the eye-open rate α is derived based on the angle θ1 on the image plane. However, by preparing a three-dimensional model of the eye in advance, two-dimensionally mapping the model rotated according to the face orientation angle estimated from the relationship between the face contour CT and the nose bridge BN, and then performing the above processing, the accuracy of estimating the eye-open rate α can be improved. The eye-open rate derivation unit 24 may instead derive the occupant's eye-closing rate as the index value indicating the degree of opening and closing of the eyes. In this case, the eye-open rate derivation unit 24 derives the value obtained by subtracting the eye-open rate α from 100 [%] as the eye-closing rate β. The eye-open rate α and the eye-closing rate β are examples of the "index value indicating the degree of opening and closing of the eyes", the eye-open rate derivation unit 24 is an example of the "index value derivation unit", the eye-open rate α is an example of the "eye-open degree", and the eye-closing rate β is an example of the "eye-closing degree".
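Expression (1), together with the angle θ1 computed from the three feature points, might be sketched as follows. The function names and the use of radians are assumptions for illustration.

```python
import math

def angle_at_p1(p1, p2, p3):
    """Angle theta1 between line P1-P2 and line P1-P3, in radians,
    each point given as an (x, y) tuple."""
    a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a2 = math.atan2(p3[1] - p1[1], p3[0] - p1[0])
    d = abs(a1 - a2)
    return min(d, 2 * math.pi - d)

def eye_open_rate(theta1, theta_ini):
    """alpha = MIN{theta1 / theta_ini, 100 [%]} per expression (1)."""
    return min(theta1 / theta_ini * 100.0, 100.0)

def eye_close_rate(alpha):
    """beta = 100 [%] minus the eye-open rate alpha."""
    return 100.0 - alpha
```

With a reference angle θini equal to the current θ1, the eye-open rate is 100 [%]; half the reference angle gives 50 [%].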
Fig. 6 is a diagram schematically showing the contents of the processing performed by the tilt estimation unit 26. In this figure as well, attention is focused exclusively on the occupant seated in the driver seat DS. First, using part of the processing procedure of the eye detection unit 22, the tilt estimation unit 26 extracts the contour CT of the face by fitting an elliptical or oval model or the like to the edges EG, as shown in the upper diagram of fig. 6. Next, as shown in the middle diagram of fig. 6, the tilt estimation unit 26 detects the position of a second line segment CL2 connecting the midpoint of a virtual first line segment CL1, which connects the left eye and the right eye detected by the eye detection unit 22, to the midpoint of the arc of the extracted face contour CT (i.e., the position of the chin). Instead of the above, the tilt estimation unit 26 may directly detect the second line segment CL2 by inputting the captured image into a learned model generated by a machine learning method such as deep learning.
Then, as shown in the lower diagram of fig. 6, the tilt estimation unit 26 estimates the degree of change of the second line segment CL2 with reference to a predetermined line segment CL2d. The predetermined line segment CL2d is a line segment detected when the occupant is awake, and serves as a reference for evaluating how much the second line segment CL2 is inclined compared with the waking state. For example, the predetermined line segment CL2d is detected at the timing when the occupant enters the vehicle. Information indicating the detected predetermined line segment CL2d is stored (and updated), for example, in a storage unit (not shown) provided in the occupant observation device 1. The tilt estimation unit 26 estimates, for example, the angle θ2 formed between the second line segment CL2 and the predetermined line segment CL2d when the lower end of the second line segment CL2 is superimposed on the lower end of the predetermined line segment CL2d, as the degree of change of the second line segment CL2 with reference to the predetermined line segment CL2d.
In the above description, the degree of change is derived based on the angle θ2 on the image plane. However, by preparing a three-dimensional model of the second line segment CL2 in advance, two-dimensionally mapping the model rotated according to the face orientation angle estimated from the relationship between the face contour CT and the nose bridge BN, and then performing the above processing, the accuracy of estimating the degree of change can be improved.
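The image-plane angle θ2 between the second line segment CL2 and the predetermined line segment CL2d might be computed as in this sketch. Representing each segment as a pair of (x, y) endpoints, and the function name, are assumptions made here for illustration.

```python
import math

def tilt_change_deg(cl2, cl2d):
    """Angle theta2 in degrees between segment CL2 and reference CL2d,
    each given as (lower_end, upper_end) with (x, y) points. Overlapping
    the lower ends means only the segment directions matter."""
    def direction(seg):
        (x0, y0), (x1, y1) = seg
        return math.atan2(y1 - y0, x1 - x0)
    d = abs(direction(cl2) - direction(cl2d))
    d = min(d, 2 * math.pi - d)
    return math.degrees(d)
```

For example, a head segment leaning 45 degrees away from a vertical waking-state reference yields θ2 = 45.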
The determination unit 28 determines whether the occupant is dozing based on, for example, the eye-open rate α derived by the eye-open rate derivation unit 24 and the angle θ2 estimated by the tilt estimation unit 26, and outputs the determination result to the various in-vehicle devices 100. For example, the determination unit 28 determines that the occupant's drowsiness is stronger the smaller the eye-open rate α is, and determines that the occupant's drowsiness is strong, or that the occupant is dozing, when the eye-open rate α is smaller than a first threshold value Th1. Likewise, it determines that the occupant's drowsiness is stronger the larger the eye-closing rate β is, and determines that the occupant's drowsiness is strong, or that the occupant is dozing, when the eye-closing rate β is equal to or greater than a second threshold value Th2. The first threshold value Th1 is, for example, a value of the eye-open rate α that can distinguish a state in which the occupant is not drowsy from a state in which the occupant is drowsy, and the second threshold value Th2 is likewise a value of the eye-closing rate β that can distinguish these two states.
The determination unit 28 determines that the occupant's drowsiness is stronger the larger the angle θ2 is, and determines that the occupant's drowsiness is strong, or that the occupant is dozing, when the angle θ2 is equal to or greater than a third threshold value Th3. The third threshold value Th3 is a value of the angle θ2 that can distinguish a state in which the occupant is not drowsy from a state in which the occupant is drowsy.
The first threshold Th1, the second threshold Th2, and the third threshold Th3 may be set in advance, may be determined for each occupant of the vehicle, or may be derived using a learning model trained by deep learning on images generated by the imaging unit 10.
The determination unit 28 determines that the occupant is drowsy when the state in which the eye-open rate α is smaller than the first threshold value Th1 continues for the first predetermined time or longer, or the state in which the eye-close rate β is equal to or greater than the second threshold value Th2 continues for the first predetermined time or longer. The determination unit 28 determines that the occupant is drowsy when the state in which the angle θ 2 is equal to or greater than the third threshold value Th3 continues for a second predetermined time or longer.
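The duration conditions above could be tracked per frame with a simple counter, as in this hypothetical sketch. Since the eye-closing rate β is 100 [%] minus the eye-open rate α, checking α against Th1 stands in for both eye conditions here; the class name and all threshold and frame-count values are illustrative assumptions.

```python
class DrowsinessJudge:
    """Per-frame tracker: a condition must hold for a minimum number of
    consecutive frames (standing in for the first and second predetermined
    times) before the occupant is judged to be dozing."""

    def __init__(self, th1=50.0, th3=20.0, eye_frames=30, tilt_frames=30):
        self.th1, self.th3 = th1, th3                 # Th1 [%], Th3 [deg]
        self.eye_frames, self.tilt_frames = eye_frames, tilt_frames
        self._eye_run = 0   # consecutive frames with alpha < Th1
        self._tilt_run = 0  # consecutive frames with theta2 >= Th3

    def update(self, alpha, theta2):
        """Feed one frame's eye-open rate alpha and tilt change theta2;
        return True once either condition has persisted long enough."""
        self._eye_run = self._eye_run + 1 if alpha < self.th1 else 0
        self._tilt_run = self._tilt_run + 1 if theta2 >= self.th3 else 0
        return (self._eye_run >= self.eye_frames
                or self._tilt_run >= self.tilt_frames)
```

A brief eye closure or head dip resets the corresponding counter, so only sustained states trigger the dozing judgment, matching the intent of the predetermined-time conditions.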
[ operation procedure ]
Fig. 7 is a flowchart showing an example of the flow of processing executed by the image processing apparatus 20. First, the image processing apparatus 20 acquires an image generated by the imaging unit 10 (step S100). Next, the eye detecting unit 22 extracts an edge from the acquired image (step S102).
The eye detection unit 22 detects at least a part of the eyes of the occupant in the captured image based on the distribution of the edges extracted from the captured image (step S104). The eye-open rate derivation unit 24 derives the eye-open rate α of the occupant's eyes based on the positional relationship of the plurality of feature points on the eye contour detected by the eye detection unit 22 (step S106). The eye-open rate derivation unit 24 may instead derive the eye-closing rate β of the occupant's eyes based on the positional relationship of the plurality of feature points.
The determination unit 28 determines whether the eye-open rate α derived by the eye-open rate derivation unit 24 is smaller than the first threshold value Th1, or whether the eye-closing rate β is equal to or greater than the second threshold value Th2 (step S108). When it determines that the eye-open rate α is equal to or greater than the first threshold value Th1, or that the eye-closing rate β is smaller than the second threshold value Th2, the determination unit 28 concludes that the occupant is not drowsy and ends the processing. When it determines that the eye-open rate α is smaller than the first threshold value Th1, or that the eye-closing rate β is equal to or greater than the second threshold value Th2, the determination unit 28 determines that the occupant feels drowsy, and further determines whether the state in which the eye-open rate α is smaller than the first threshold value Th1, or the state in which the eye-closing rate β is equal to or greater than the second threshold value Th2, has continued for the first predetermined time or longer (step S110). When it determines that such a state has continued for the first predetermined time or longer, the determination unit 28 proceeds to step S120 and determines that the occupant is dozing (step S120).
When the determination unit 28 determines that neither the state in which the eye-open rate α is smaller than the first threshold Th1 nor the state in which the eye-closing rate β is equal to or greater than the second threshold Th2 has continued for the first predetermined time or longer, the tilt estimation unit 26 detects the position of the second line segment CL2 based on the edges EG extracted by the eye detection unit 22 (step S112). The tilt estimation unit 26 then estimates the degree of change (i.e., the angle θ2) of the second line segment CL2 with reference to the predetermined line segment CL2d (step S114).
The determination unit 28 determines whether or not the angle θ 2 estimated by the inclination estimation unit 26 is equal to or greater than the third threshold Th3 (step S116). When determining that the angle θ 2 is smaller than the third threshold Th3, the determination unit 28 assumes that the occupant is not drowsy and ends the process. When determining that the angle θ 2 is equal to or greater than the third threshold value Th3, the determination unit 28 determines that the occupant feels drowsiness, and further determines whether or not the state where the angle θ 2 is equal to or greater than the third threshold value Th3 continues for a second predetermined time or longer (step S118).
When the determination unit 28 determines that neither the state in which the eye-open rate α is smaller than the first threshold value Th1 nor the state in which the eye-closing rate β is equal to or greater than the second threshold value Th2 has continued for the first predetermined time or longer, and that the state in which the angle θ2 is equal to or greater than the third threshold value Th3 has not continued for the second predetermined time or longer, it determines that the occupant is not dozing and ends the process. When it determines that the state in which the angle θ2 is equal to or greater than the third threshold value Th3 has continued for the second predetermined time or longer, the determination unit 28 determines that the occupant is dozing (step S120).
In the above description, the image processing device 20 performs the determination process for the angle θ2 (steps S112 to S118) after the determination process for the eye-open rate α or the eye-close rate β (steps S104 to S110), but the present invention is not limited thereto. For example, the image processing device 20 may perform the two determination processes in parallel, or may perform the determination process for the eye-open rate α or the eye-close rate β after the determination process for the angle θ2.
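The two-stage determination described above (steps S104 through S120) can be sketched as follows. This is an illustrative reading of the flow, not the patented implementation: the threshold values, the window durations, and all names (`Sample`, `is_drowsy`, `_held_for`) are hypothetical, since the patent fixes the symbols Th1, Th2, Th3 and the first/second predetermined times but not their values.

```python
from dataclasses import dataclass

# Hypothetical constants: the patent names Th1, Th2, Th3 and the
# first/second predetermined times but does not specify values.
TH1_EYE_OPEN = 0.3      # first threshold for eye-open rate alpha
TH2_EYE_CLOSE = 0.7     # second threshold for eye-close rate beta
TH3_TILT_DEG = 15.0     # third threshold for head-tilt angle theta2
T1_SECONDS = 2.0        # first predetermined time
T2_SECONDS = 3.0        # second predetermined time

@dataclass
class Sample:
    t: float            # timestamp in seconds
    alpha: float        # eye-open rate
    beta: float         # eye-close rate
    theta2: float       # head-tilt change vs. waking posture, in degrees

def _held_for(samples, predicate, duration):
    """True if `predicate` holds continuously over the trailing `duration`."""
    if not samples:
        return False
    end = samples[-1].t
    window = [s for s in samples if s.t >= end - duration]
    # The window must actually span the full duration and satisfy the
    # predicate at every observed sample within it.
    return bool(window) and all(predicate(s) for s in window) \
        and window[0].t <= end - duration + 1e-9

def is_drowsy(samples):
    # Steps S104-S110: eye-based check over the first predetermined time.
    eye_closed = lambda s: s.alpha < TH1_EYE_OPEN or s.beta >= TH2_EYE_CLOSE
    if _held_for(samples, eye_closed, T1_SECONDS):
        return True
    # Steps S112-S120: head-tilt check over the second predetermined time.
    tilted = lambda s: s.theta2 >= TH3_TILT_DEG
    return _held_for(samples, tilted, T2_SECONDS)
```

As the passage notes, the eye-based check and the tilt-based check are independent, so the sequential `if` above could equally be replaced by evaluating both predicates in parallel and combining the results.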
As described above, according to the occupant observation device 1 of the present embodiment, whether the occupant is drowsy can be determined more accurately than by using either indicator alone, because the determination is based on both the index value indicating the degree of opening and closing of the eyes, derived by the processing of the eye detection unit 22 and the eye-open rate derivation unit 24, and the degree of change in the inclination of the occupant's head relative to waking hours, estimated by the inclination estimation unit 26.
While the present invention has been described with reference to the embodiments, the present invention is not limited to the embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.
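Claim 2 below derives the index value from the positional relationship of feature points on the detected eye contour. The patent does not give a concrete formula, so the following is only one plausible instance: the widely used eye-aspect-ratio style measure over six contour points. The function name, the point ordering, and the six-point layout are all assumptions for illustration.

```python
import math

def eye_open_index(landmarks):
    """Illustrative eye-opening index from six contour feature points.

    Assumed ordering (as in the common eye-aspect-ratio layout):
    p1/p4 at the eye corners, p2/p3 on the upper lid, p6/p5 on the
    lower lid. Larger values indicate a more open eye.
    """
    p1, p2, p3, p4, p5, p6 = landmarks
    # Two vertical lid-to-lid distances, normalized by the corner-to-corner
    # width so the index is insensitive to how far the face is from the camera.
    vertical = math.dist(p2, p6) + math.dist(p3, p5)
    horizontal = math.dist(p1, p4)
    return vertical / (2.0 * horizontal)
```

An index of this form drops sharply when the lids close while the eye width stays constant, which is what lets it be compared against eye-open and eye-close thresholds such as Th1 and Th2.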
Claims (7)
1. An occupant observation device, wherein,
the occupant observation device includes:
an image pickup unit that picks up an image of a head of an occupant of a vehicle;
an index value deriving unit that derives an index value indicating an opening/closing degree of the eyes of the occupant based on the image generated by the imaging unit;
a tilt estimation unit that estimates a degree of change in the tilt of the head of the occupant with respect to a waking state; and
a determination unit that determines whether the occupant is dozing based on a result of determination based on both the index value derived by the index value derivation unit and the degree of change estimated by the inclination estimation unit.
2. The occupant observation device according to claim 1,
the occupant observation device further includes an eye detection unit that detects at least a part of a contour of the eyes of the occupant based on the image generated by the image pickup unit,
the index value deriving unit derives the index value based on a positional relationship of a plurality of feature points in the contour detected by the eye detecting unit.
3. The occupant observation device according to claim 1,
the determination unit determines that the occupant is drowsy when a state in which the index value indicates that the degree of eye opening is less than a first threshold, or a state in which the index value indicates that the degree of eye closing is equal to or greater than a second threshold, continues for a first predetermined time or longer.
4. The occupant observation device according to claim 2,
the determination unit determines that the occupant is drowsy when a state in which the index value indicates that the degree of eye opening is less than a first threshold, or a state in which the index value indicates that the degree of eye closing is equal to or greater than a second threshold, continues for a first predetermined time or longer.
5. The occupant observation device according to any one of claims 1 to 4,
the determination unit determines that the occupant is drowsy when a state in which the degree of change is equal to or greater than a third threshold continues for a second predetermined time or longer.
6. A method of observing an occupant, wherein,
the occupant observation method causes a computer to perform:
capturing a head of an occupant of a vehicle and generating an image;
deriving an index value indicating the degree of opening and closing of the eyes of the occupant based on the generated image, and estimating the degree of change in the inclination of the head of the occupant with respect to the waking state; and
determining whether the occupant is dozing based on a result of determination based on both the derived index value and the estimated degree of change.
7. A storage medium storing a program, wherein,
the program causes a computer to perform the following processing:
capturing a head of an occupant of a vehicle and generating an image;
deriving an index value indicating the degree of opening and closing of the eyes of the occupant based on the generated image, and estimating the degree of change in the inclination of the head of the occupant with respect to the waking state; and
determining whether the occupant is dozing based on a result of determination based on both the derived index value and the estimated degree of change.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019124391A JP2021007717A (en) | 2019-07-03 | 2019-07-03 | Occupant observation device, occupant observation method and program |
JP2019-124391 | 2019-07-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112183176A true CN112183176A (en) | 2021-01-05 |
Family
ID=73918816
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010594597.2A Pending CN112183176A (en) | 2019-07-03 | 2020-06-24 | Occupant observation device, occupant observation method, and storage medium |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2021007717A (en) |
CN (1) | CN112183176A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN204303129U (en) * | 2014-11-27 | 2015-04-29 | 程长明 | Anti-fatigue warning system and anti-fatigue eyeglasses |
CN105144199A (en) * | 2013-02-21 | 2015-12-09 | Iee国际电子工程股份公司 | Imaging device based occupant monitoring system supporting multiple functions |
CN105701445A (en) * | 2014-12-15 | 2016-06-22 | 爱信精机株式会社 | determination apparatus and determination method |
CN205881127U (en) * | 2016-07-27 | 2017-01-11 | 尤明洲 | Vehicle driver fatigue monitors alarm system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004314750A (en) * | 2003-04-15 | 2004-11-11 | Denso Corp | Vehicle instrument operation control device |
JP4458146B2 (en) * | 2007-02-16 | 2010-04-28 | 株式会社デンソー | Sleepiness determination apparatus and program |
FR3038770B1 (en) * | 2015-07-10 | 2021-03-19 | Innov Plus | OPERATOR VIGILANCE MONITORING SYSTEM |
US10793149B2 (en) * | 2015-09-30 | 2020-10-06 | Sony Corporation | Control apparatus, control method, and program |
2019
- 2019-07-03 JP JP2019124391A patent/JP2021007717A/en active Pending

2020
- 2020-06-24 CN CN202010594597.2A patent/CN112183176A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
JP2021007717A (en) | 2021-01-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4915413B2 (en) | Detection apparatus and method, and program | |
CN101523411B (en) | Eye opening detection system and method of detecting eye opening | |
US9928404B2 (en) | Determination device, determination method, and non-transitory storage medium | |
US9822576B2 (en) | Method for operating an activatable locking device for a door and/or a window, securing device for a vehicle, vehicle | |
US20140037144A1 (en) | Eyelid-detection device, eyelid-detection method, and recording medium | |
JP6584717B2 (en) | Face orientation estimation apparatus and face orientation estimation method | |
JP4992823B2 (en) | Face detection apparatus and face detection method | |
US11161470B2 (en) | Occupant observation device | |
WO2018167995A1 (en) | Driver state estimation device and driver state estimation method | |
JP5691834B2 (en) | Image identification apparatus and program | |
JP5349350B2 (en) | Eye opening degree determination device and eye opening degree determination method | |
CN112183176A (en) | Occupant observation device, occupant observation method, and storage medium | |
KR20170028631A (en) | Method and Apparatus for Detecting Carelessness of Driver Using Restoration of Front Face Image | |
US20220188992A1 (en) | Image processing apparatus, image processing method, and storage medium | |
CN111696312B (en) | Passenger observation device | |
JP5035139B2 (en) | Eye image processing device | |
TWI579173B (en) | An driver fatigue monitoring and detection method base on an ear-angle | |
JP2011086051A (en) | Eye position recognition device | |
WO2024075205A1 (en) | Occupant condition determination device, occupant condition determination system, occupant condition determination method, and program | |
JP2009003644A (en) | Eye opening degree decision device | |
JP7301256B2 (en) | Face direction determination device and face direction determination method | |
US11551446B2 (en) | Video detection device, and video detection method | |
JP7204068B2 (en) | Occupant temperature estimation device, occupant state detection device, occupant temperature estimation method, and occupant temperature estimation system | |
JP2009015656A (en) | Face image processor | |
JP2008276406A (en) | Face image processor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||