JP2012084068A - Image analyzer - Google Patents

Image analyzer Download PDF

Info

Publication number
JP2012084068A
Authority
JP
Japan
Prior art keywords
image
driver
degree
license
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2010231672A
Other languages
Japanese (ja)
Inventor
Kazumichi Hoshiya
一路 星屋
Original Assignee
Denso Corp
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp, 株式会社デンソー filed Critical Denso Corp
Priority to JP2010231672A
Publication of JP2012084068A
Legal status: Pending

Abstract

PROBLEM TO BE SOLVED: To provide an image analyzer that calculates the degree (presence/absence) of a driver's drowsiness and inattentive driving by acquiring, by some means, an image of the driver himself/herself, in particular a frontal face image taken while awake, and using it for image analysis.
SOLUTION: The face image stored in the driver's license carried by the driver is acquired by a card reader 6, and the driver's face is photographed by a near-infrared camera 2. A partial image is extracted as a template from the license face image, which is a frontal image of the driver while awake, and recognition processing is performed on the image captured by the camera 2. By comparing feature amounts between the license face image and the camera image, the driver's drowsiness degree and inattentive-driving degree are determined, and an alarm device 4 issues an alarm when drowsiness or inattentive driving is detected.

Description

  The present invention relates to an image analysis apparatus.

  To keep the driver of an automobile from dozing off or looking aside, systems have been proposed in which the driver's state is photographed and monitored by a camera and an alarm is issued when drowsy or inattentive driving is detected. For example, Patent Document 1 below proposes an apparatus capable of performing face recognition with high accuracy without requiring the subject to move toward, or turn to face, the image input apparatus that captures the face image.

JP 2001-243466 A

  However, the method of Patent Document 1 performs image analysis using an averaged-face template image, and therefore suffers from problems such as slow recognition and limited recognition accuracy. It would be desirable to have a system that obtains an image of the driver himself/herself by some means, ideally a frontal image taken while awake, and analyzes images against it to calculate the degree (presence/absence) of the driver's drowsiness and inattentive driving. No such system, however, has been proposed in the prior art.

  Therefore, in view of the above problems, an object of the present invention is to provide an image analysis apparatus that acquires, by some means, an image of the driver himself/herself, in particular a frontal image taken while awake, and uses it for image analysis to calculate the degree (presence/absence) of the driver's drowsiness and inattentive driving.

Means for Solving the Problems and Effects of the Invention

  In order to achieve the above object, an image analysis apparatus according to the present invention comprises: photographing means for photographing a current image, which is a face image of a driver seated in the driver's seat of a vehicle; acquisition means for acquiring an in-license image, which is a face image of the driver stored in the driver's license carried by the driver; recognition processing means for performing face recognition processing on the current image photographed by the photographing means, using the in-license image acquired by the acquisition means; first calculation means for calculating a predetermined feature amount in the current image recognized by the recognition processing means; and determination means for determining the degree to which the feature amount calculated by the first calculation means deviates from a reference value.

  Thus, in the image analysis apparatus according to the present invention, the face image stored in the license carried by the driver is acquired and used to recognize the photographed image of the driver, and a feature amount in the recognized image is calculated to determine how far it deviates from a reference value. Because the driver's own frontal image from the license is used for image recognition, recognition can be performed quickly and with high accuracy, for example in a system for suppressing drowsy or inattentive driving. An image analysis apparatus that can quickly and accurately determine the driver's drowsy or inattentive driving is thereby realized.

  The image analysis apparatus may further include extraction means for extracting a partial image of a predetermined part of the face from the in-license image acquired by the acquisition means, and the recognition processing means may recognize the predetermined part in the current image by performing matching processing between the partial image extracted by the extraction means and the current image.

  As a result, image recognition of the driver's photographed image is performed by matching with partial images taken from the in-license image, so the predetermined parts can be recognized quickly and with high accuracy using partial images of the driver's own frontal image. Therefore, for example in a system for suppressing drowsy or inattentive driving, an image analysis apparatus can be realized that recognizes images quickly and with high accuracy and determines the driver's drowsy or inattentive driving quickly and accurately.

  An image analysis apparatus according to the present invention also comprises: photographing means for photographing a current image, which is a face image of a driver seated in the driver's seat of a vehicle; acquisition means for acquiring an in-license image, which is a face image of the driver stored in the driver's license carried by the driver; second calculation means for calculating a predetermined feature amount in the current image photographed by the photographing means; third calculation means for calculating a predetermined feature amount in the in-license image acquired by the acquisition means; and determination means for determining the degree to which the feature amount calculated by the second calculation means deviates from a reference value determined according to the predetermined feature amount calculated by the third calculation means.

  As a result, the face image stored in the license carried by the driver is acquired, its feature amount is compared with the feature amount of the photographed image of the driver, and it is determined how far the feature amount of the photographed image deviates from the reference value determined by the feature amount of the in-license image. Because the feature amount of the driver's awake frontal image from the license serves as the comparison target, drowsiness or looking aside can be determined with high accuracy, for example in a system for suppressing drowsy or inattentive driving. An image analysis apparatus that can determine the driver's drowsy or inattentive driving with high accuracy is thereby realized.

  Further, the predetermined feature amount may include a numerical value indicating the position of a facial part that reflects the degree of drowsiness in the face image.

  By using the position of a part reflecting the degree of drowsiness in the face image as a feature amount, image analysis is performed against the driver's own awake frontal image from the license, and the driver's degree of drowsiness is determined quickly and with high accuracy. Therefore, an image analysis apparatus that can quickly and accurately detect the driver's drowsy driving can be realized.

  The numerical value indicating the position of the part reflecting the degree of drowsiness may be a numerical value correlated with the degree of eyelid opening in the face image.

  By using a numerical value correlated with the degree of eyelid opening in the face image as a feature amount, image analysis is performed against the driver's awake frontal image from the license, and the driver's degree of drowsiness can be determined quickly and with high accuracy. Therefore, an image analysis apparatus that can quickly and accurately detect the driver's drowsy driving can be realized.

  The predetermined feature amount may include a numerical value indicating the degree of looking aside in the face image.

  By using a numerical value indicating the degree of looking aside in the face image as a feature amount, image analysis is performed against the driver's awake frontal image from the license, and the degree to which the driver is looking aside can be determined quickly and with high accuracy. Accordingly, an image analysis apparatus that can quickly and accurately detect the driver's inattentive driving can be realized.

  Further, the numerical value indicating the degree of looking aside may be a numerical value correlated with the angle of the face orientation with respect to the front-rear direction of the vehicle.

  By using a numerical value correlated with the angle of the face orientation with respect to the front-rear direction of the vehicle as a feature amount, image analysis is performed against the driver's awake frontal image from the license, and the degree to which the driver is looking aside can be determined quickly and with high accuracy. Accordingly, an image analysis apparatus that can quickly and accurately detect the driver's inattentive driving can be realized.

FIG. 1 is a block diagram of the image analysis apparatus in an embodiment of the present invention. FIG. 2 is a flowchart showing the procedure of image processing for the in-license image. FIG. 3 is a flowchart showing the procedure of image processing for the image photographed by the near-infrared camera. FIG. 4 is a diagram showing an example of eye feature amounts related to the drowsiness degree. FIG. 5 is a diagram showing an example of face feature amounts related to the degree of looking aside. FIG. 6 is a diagram showing an example of the stages of the aside-look level.

  Embodiments of the present invention will be described below with reference to the drawings. First, FIG. 1 is a schematic diagram of the apparatus configuration of an image analysis system 1 (image analysis apparatus, hereinafter referred to as the system) according to the present invention. The system 1 is installed in a vehicle such as an automobile, for example.

  The system 1 includes a near-infrared camera 2, a driver monitor electronic control unit 3 (ECU: Electronic Control Unit), an alarm device 4, and a card reader 6, and uses the driver's license 5 (license) carried by the driver. The near-infrared camera 2, driver monitor ECU 3, alarm device 4, and card reader 6 are connected by, for example, in-vehicle communication and can exchange information.

  The near-infrared camera 2 is, for example, a CCD camera having sensitivity in the near-infrared region. It is installed in the passenger compartment, irradiates the driver with near-infrared light, receives the reflected light, and captures an image. The installation location may be any location from which a frontal face image of the driver can be taken, for example the steering column. In this system, the image photographed by the near-infrared camera 2 is treated as a monochrome grayscale image. By using the near-infrared camera 2, the driver's face image can be acquired even at night.

  The driver monitor ECU 3 has a structure similar to that of an ordinary computer, and includes a CPU 30 that executes information processing such as various calculations, a RAM 31 that is a temporary storage unit as a work area of the CPU 30, and a ROM 32 that stores various information. The ROM 32 may include an EEPROM whose stored contents can be rewritten.

  The alarm device 4 issues an alarm toward the passenger compartment or the driver when the image processing described later detects that the driver's drowsiness level is high or that the driver is looking aside. Various forms of alarm device 4 may be used to inform the driver (and other occupants) that the degree of drowsiness is high or that the driver is looking aside. For example, the notification may be a voice message stating that the driver is drowsy or looking aside, or a sound such as a buzzer. It may be a display of characters, figures, or patterns on the display device of the instrument panel. It may be cool air sent from the air conditioner toward the driver. A system in which a vibration device provided in the driver's seat or the steering wheel notifies the driver by vibration may also be used.

  The license 5 incorporates an IC chip 50. Various license information is stored in the IC chip 50, and this information includes a face image 51 of the driver (the license holder) facing the front. The IC chip 50 is a non-contact type with an antenna 52, and the information in the chip is read wirelessly. In this system, the in-license image is also handled as a monochrome grayscale image; if it is not already grayscale, it is converted to one.
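  For illustration only (the patent defines no code), a minimal Python sketch of this grayscale conversion; the use of OpenCV and the function name are assumptions:

    import cv2

    def to_grayscale(image):
        # Convert a color face image (e.g., the license photo 51) to a
        # single-channel grayscale image; grayscale input passes through.
        if image.ndim == 3:  # H x W x channels means a color image
            return cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        return image  # already single-channel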

  The card reader 6 is provided to acquire the information in the license and has a transmission unit 60 and a reception unit 61. When an electromagnetic wave is transmitted from the transmission unit 60 and the license 5 is within its reach, an electromotive force is generated in the IC chip 50 by the electromagnetic field. The electromotive force causes a current to flow in the IC chip 50, and a signal containing the various license information, including the driver face image 51, is transmitted from the antenna 52. The card reader 6 receives this signal with the reception unit 61 and extracts the driver face image 51 from it.

  Under the above configuration, the system 1 performs processing to calculate the driver's degree of drowsiness and the presence or absence (degree) of looking aside. The processing procedures are shown in FIGS. 2 and 3. FIG. 2 shows the procedure for extracting template images of each facial part and calculating various feature amounts and feature points from the face image 51 in the license. FIG. 3 shows the procedure for recognizing the driver's current face image captured by the near-infrared camera 2 using the template images obtained in FIG. 2, and for calculating whether (and to what degree) the driver is drowsy or looking aside. The procedures of FIGS. 2 and 3 may be programmed in advance and stored, for example, in the ROM 32, and executed automatically by the CPU 30.

  The process of FIG. 2 will now be described. First, in S10, the CPU 30 detects the opening of the driver's seat door. Next, in S20, the CPU 30 acquires the face image 51 in the license; this image is naturally a frontal face image of the driver himself/herself while awake. Since other occupants of the vehicle may also carry their own licenses, the system must be designed so that the card reader 6 acquires the in-license image only from the driver's license. For this purpose, the communication range of the transmission unit 60 and the reception unit 61 may be limited to the driver's seat, for example by mounting the card reader 6 on the driver's seat door. Alternatively, the vehicle interior may be equipped with a slot into which the license is inserted.

  In S30, the CPU 30 loads a template image of a part such as an eye or the nose from the ROM 32. The template may be one extracted from an arbitrary person's face image, or a standard template obtained by averaging the face images of a plurality of persons. Such eye and nose templates are stored in the ROM 32 in advance.

  In S40, the CPU 30 performs template matching between the eye and nose templates loaded in S30 and the in-license image acquired in S20. The position with the highest likelihood computed by the matching is thereby recognized as the driver's eyes and nose.
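  The text specifies only that the position of highest likelihood is taken; as an illustrative sketch, normalized cross-correlation via OpenCV can stand in for that likelihood (the function name and score measure below are assumptions):

    import cv2

    def locate_part(face_image, template):
        # Slide the template (eye or nose) over the grayscale face image
        # and return the best-matching position and its score, standing
        # in for the "highest likelihood" position of S40.
        result = cv2.matchTemplate(face_image, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        return max_loc, max_val  # (x, y) of the best match, score in [-1, 1]

    # Usage sketch for S40:
    # eye_pos, score = locate_part(license_image, generic_eye_template)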

  In S50, the CPU 30 extracts the portions recognized as the driver's eyes and nose in S40 from the in-license image 51 and stores them, for example in the ROM 32 (EEPROM), as template images of the driver's own eyes and nose. With the image recognition of the in-license image 51 completed through S50, in S60 to S80 various feature amounts, that is, quantities indicating various features of the face, are calculated from the in-license image 51 and stored. In S60 and S70, a numerical value correlated with the degree of eyelid opening is calculated as a numerical value indicating the driver's degree of drowsiness. Since the in-license image is naturally an image of the driver while awake, the numerical values calculated below serve as the reference values for calculating the drowsiness level.

  First, in S60, the CPU 30 calculates, as a feature amount in the eye template image stored in S50 (or in the portion of the in-license image 51 recognized as the eye in S40), the eyelid end curve, that is, a coefficient parameter defining the curve (a parameter for its curvature). The eyelid end curve is the lower-end curve A1 of the upper eyelid shown in FIG. 4.

  In general, the eyelid end curve approaches a straight line (for example, its curvature decreases) as the degree of drowsiness increases, so the eyelid end curve can be used for calculating the drowsiness level. Specifically, the calculation in S60 first detects the lower end of the upper eyelid. In general, the eyelid portion has high brightness and the eyeball has low brightness. Using this tendency, the brightness of each pixel in the region recognized as the eye in S50 is examined in the grayscale in-license image 51 (converted to grayscale if necessary), and the pixels at the lower end of the upper eyelid can be detected from the brightness distribution.

  The shape formed by the pixels at the lower end of the upper eyelid is then fitted to, for example, a quadratic curve. As the fitting method, the coefficient parameters that minimize the sum of squared differences between the quadratic curve and the positions of the lower-end pixels may be calculated by the least squares method.
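  A minimal sketch of this S60 calculation, assuming the bright-eyelid/dark-eyeball tendency described above; the brightness threshold is an illustrative assumption, since the patent gives no numeric values:

    import numpy as np

    def fit_eyelid_curve(eye_region, threshold=100):
        # For each pixel column of the grayscale eye region, take the
        # topmost dark (eyeball) pixel as the lower end of the upper eyelid.
        xs, ys = [], []
        for x in range(eye_region.shape[1]):
            dark = np.where(eye_region[:, x] < threshold)[0]
            if dark.size:
                xs.append(x)
                ys.append(dark[0])
        if len(xs) < 3:
            return None  # not enough eyelid points detected
        # Least-squares quadratic fit y = a*x^2 + b*x + c; np.polyfit
        # minimizes the sum of squared residuals, as described for S60.
        a, b, c = np.polyfit(xs, ys, 2)
        return a, b, c  # the coefficient a correlates with the curvature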

  Next, in S70, the CPU 30 calculates the eye vertical width as a feature amount in the eye template image stored in S50 (or in the portion of the in-license image 51 recognized as the eye in S40). As shown in FIG. 4, the eye vertical width A2 is the distance between the lower end of the upper eyelid and the upper end of the lower eyelid at the center of the eye in the left-right direction (or at the point where that distance is largest). In general, the eye vertical width decreases as the degree of drowsiness increases, so it can be used for calculating the drowsiness level.

  More specifically, the calculation in S70 first detects the pixels at the lower end of the upper eyelid and at the upper end of the lower eyelid. This detection may be performed in the same manner as the detection of the lower-end pixels of the upper eyelid in S60. The difference between the position of the lower-end pixel of the upper eyelid and that of the upper-end pixel of the lower eyelid is then calculated at each horizontal position, and the maximum value is taken.
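  A companion sketch for the S70 calculation, under the same brightness assumption as above:

    import numpy as np

    def eye_vertical_width(eye_region, threshold=100):
        # For each column, the span of dark (eyeball) pixels runs from the
        # lower end of the upper eyelid to the upper end of the lower
        # eyelid; A2 is the maximum span over all columns.
        spans = []
        for x in range(eye_region.shape[1]):
            dark = np.where(eye_region[:, x] < threshold)[0]
            if dark.size:
                spans.append(int(dark[-1] - dark[0]))
        return max(spans) if spans else 0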

  Next, in S80, the CPU 30 stores the positions of predetermined feature points as feature amounts in the in-license image 51. Here, a feature point is a point specified on the face in advance; in the following description, the feature points used as numerical values for calculating the driver's degree of looking aside are, for example, the center of the right eye, the center of the left eye, and the midpoint of the left and right nostrils. Since the in-license image clearly faces the front, the positions of the feature points in the in-license image are the feature-point positions in the front-facing state (hereinafter, the frontal feature-point positions).

  Since the eyes and nose were recognized in the in-license image 51 in S40, in S80 the center position of the right eye, the center position of the left eye, and the midpoint of the left and right nostrils are calculated within the recognized eye and nose regions. The center positions of the right and left eyes and the nostril midpoint in the awake frontal image of the driver himself/herself are thereby acquired. FIG. 5 shows an example of the right-eye center position B1, the left-eye center position B2, and the nostril midpoint B3. The above is the processing procedure of FIG. 2.

  Next, the processing procedure of FIG. 3 will be described. In the process of FIG. 3, the CPU 30 first photographs the driver's current face image (hereinafter, the current image) in S100. Next, in S110, the CPU 30 loads the template images of the driver's eyes and nose stored in S50.

  Next, in S120, the CPU 30 performs template matching between the eye and nose templates loaded in S110 and the driver's current image photographed in S100. This process recognizes the eyes and nose in the current image. Because templates taken from the driver's own frontal face image are used, the recognition is quick and highly accurate.

  Subsequently, in S130, the CPU 30 calculates and stores the eyelid end curve A1 in the driver's current face image photographed in S100, in which the eyes and nose were recognized in S120. The calculation of the eyelid end curve A1 may be the same as in S60 described above. That is, using the tendency that the eyelid has high brightness and the eyeball low brightness, the brightness of each pixel in the region recognized as the eye in S120 is examined in the grayscale current image (converted to grayscale if necessary), and the pixels at the lower end of the upper eyelid are detected from the distribution.

  The quadratic curve that best fits the shape of the lower-end pixels of the upper eyelid is then calculated by the least squares method, for example so that the sum of squared differences between the quadratic curve and the positions of those pixels is minimized.

  Next, in S140, the CPU 30 calculates the eye vertical width A2 in the driver's current image. The calculation may be the same as in S70 described above. That is, as in the detection of the upper-eyelid lower-end pixels in S130, the pixels at the lower end of the upper eyelid and at the upper end of the lower eyelid are detected in the region of the current image recognized as the eye in S120. The difference between their positions is then calculated at each horizontal position, and the maximum value is taken.

  In S150, the CPU 30 calculates the degree of drowsiness (drowsiness level) using the awake and current eyelid end curves calculated in S60 and S130 and the awake and current eye vertical widths calculated in S70 and S140. First, a drowsiness level is determined separately for the eyelid end curve and for the eye vertical width. The drowsiness level has, for example, n + 1 stages (n is an arbitrary integer): "drowsiness level 0 (no drowsiness)", "drowsiness level 1", ..., "drowsiness level n (fully asleep)".

  Specifically, the degree (difference value) by which the curvature of the current eyelid end curve deviates from the curvature of the awake eyelid end curve is determined, and the stage among the n + 1 stages to which this value belongs is identified; the threshold values dividing the n + 1 stages may be set in advance. Likewise, for the eye vertical width, the degree (difference value) by which the current width deviates from the awake width is calculated, and its stage among the n + 1 stages is identified, again using preset thresholds. Alternatively, the difference between the awake and current curvatures, or between the awake and current eye vertical widths, may itself be taken as the drowsiness value, so that the drowsiness level is continuous.

  A total drowsiness level is calculated by combining the above two drowsiness levels. The rule for calculating the total level from the two levels may be determined in advance (for example, averaging the two levels and rounding, or taking the lower (or higher) of the two as the overall level).
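  A sketch of how the staging and combination of S150 might look; the threshold lists and the rounded-average rule are assumptions drawn from the examples in the text:

    import numpy as np

    def drowsiness_level(awake_value, current_value, thresholds):
        # Map the deviation of a feature (curvature or eye vertical width)
        # from its awake reference onto the stages 0..n; 'thresholds' is a
        # preset ascending list of n boundary values.
        deviation = abs(awake_value - current_value)
        return int(np.digitize(deviation, thresholds))

    def total_drowsiness(level_curve, level_width):
        # One of the combination rules mentioned above: average the two
        # per-feature levels and round (taking the max is another option).
        return round((level_curve + level_width) / 2)

    # Usage sketch with illustrative thresholds:
    # lvl = total_drowsiness(drowsiness_level(a_awake, a_now, [0.01, 0.02, 0.05]),
    #                        drowsiness_level(w_awake, w_now, [2, 5, 10]))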

  In S160, the CPU 30 calculates the right-eye center position, the left-eye center position, and the nostril midpoint in the current image, in which the eye and nose regions were recognized in S120. The center positions of the right and left eyes and the nostril midpoint in the driver's current image are thereby acquired.

  Next, in S170, the CPU 30 calculates the current yaw angle (rotation angle in the yaw direction) of the driver's head. Here, rotation in the yaw direction is rotation about the axis of the driver's neck, and the yaw angle is the angle between the direction of the driver's face and the traveling direction of the vehicle. Specifically, this calculation is performed, for example, as follows. The right side of FIG. 5 shows an example of the right-eye center position B1, the left-eye center position B2, and the nostril midpoint B3 in a looking-aside state.

  As shown in the figure, when the driver is looking aside, the nostril midpoint B3 tends to deviate from the midpoint between the right-eye center B1 and the left-eye center B2. Let the line connecting B1 and B2 be divided internally at the horizontal position of B3, and let b1 and b2 be the distances from the internal division point to B1 and to B2, respectively. Then the shorter b1 is (the longer b2 is), the more the driver's face is turned to the right, and the longer b1 is (the shorter b2 is), the more the face is turned to the left.

  Using this tendency, in S170, b1 and b2 are obtained from the B1, B2, and B3 calculated in S160, and the current yaw angle of the driver's head is calculated from the ratio of b1 to b2. When the near-infrared camera 2 is installed at the center in the left-right direction as seen from the driver, b1 and b2 are equal while the driver looks at the front (the forward direction of the vehicle). However, the installation position of the near-infrared camera 2 is not necessarily limited to this.

  The near-infrared camera 2 may therefore be installed at a position shifted to the right or left as seen from the driver. In either case, the relationship (functional relationship) between the b1-to-b2 ratio and the yaw angle of the driver's head may be obtained in advance, stored, for example, in the ROM 32 (EEPROM), and used in S170.
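  A sketch of the b1/b2 computation of S170, assuming the eye centers are roughly level so that horizontal coordinates approximate distances along the B1-B2 line; the ratio-to-yaw map below stands in for the relationship stored in the ROM 32:

    def division_distances(B1, B2, B3):
        # Divide the line B1-B2 internally at the horizontal (x) position
        # of the nostril midpoint B3; b1 and b2 are the distances from the
        # division point to B1 and to B2.
        b1 = abs(B3[0] - B1[0])
        b2 = abs(B2[0] - B3[0])
        return b1, b2

    def yaw_from_ratio(b1, b2, ratio_to_yaw):
        # 'ratio_to_yaw' is a premeasured function (map) from the b1/b2
        # ratio to the head yaw angle; its exact form is not given.
        return ratio_to_yaw(b1 / max(b2, 1e-6))  # guard against b2 == 0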

  Alternatively, the yaw angle of the driver's head may be calculated by comparing the in-license image with the current image. In that case, the property that the distance between points B1 and B2 in FIG. 5 decreases as the face turns away from the front may be used: the distance between B1 and B2 in the in-license image is compared with the same distance in the current image, and the yaw angle is calculated from the comparison. Since the in-license image naturally faces the front (its yaw angle is zero), the B1-B2 distance in the in-license image serves as the reference value in the aside-look determination.

  The relationship (or a calculation formula) between the yaw angle and the ratio of the B1-B2 distance in the in-license image to the same distance in the current image may be obtained in advance. When comparing the in-license image with the current image, for example, part of the facial contour may be detected in both images and the images enlarged or reduced so that their sizes match. Instead of calculating the yaw angle itself, a numerical value correlated with the yaw angle (for example, the ratio between the B1-B2 distance in the in-license image and the same distance in the current image) may be calculated.
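  For illustration, if one further assumes an orthographic projection, in which the projected B1-B2 distance shrinks with the cosine of the yaw angle, the comparison can be sketched as follows (the cosine model is an assumption; the text states only that the distance decreases):

    import math

    def yaw_from_eye_distance(d_license, d_current):
        # d_license: B1-B2 distance in the frontal (yaw = 0) in-license
        # image; d_current: the same distance in the current image, after
        # the two images have been scaled to matching sizes.
        ratio = min(d_current / d_license, 1.0)  # clamp measurement noise
        return math.degrees(math.acos(ratio))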

  Subsequently, in S180, the CPU 30 determines the degree of looking aside (aside-look level) according to the yaw angle calculated in S170. An example is shown in FIG. 6. In the example of FIG. 6, the yaw angle of the driver's head is classified, according to its difference from the reference value of zero in both the left and right directions, into three stages: "no aside look (normal)", "mild aside look", and "severe aside look". The present invention is not limited to the example of FIG. 6; the degree of looking aside may be divided into any number of stages, such as two or four. Alternatively, the degree of looking aside may be a continuous value, with the yaw angle itself used as the aside-look value.
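  A sketch of the three-stage classification of FIG. 6; the boundary angles are illustrative assumptions, since the patent gives no numeric thresholds:

    def aside_look_stage(yaw_deg, mild=15.0, severe=30.0):
        # Classify by the absolute difference of the yaw angle from the
        # reference value of zero, symmetrically for left and right.
        a = abs(yaw_deg)
        if a < mild:
            return "no aside look (normal)"
        return "mild aside look" if a < severe else "severe aside look"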

  Next, in S190, the CPU 30 executes alarm processing for drowsy driving and inattentive driving. An alarm is issued when the drowsiness determination of S150 or the aside-look determination of S180 falls outside the normal range (no drowsiness, no aside look).

  For a drowsiness alarm, for example, the alarm may be issued according to the presence or absence of drowsiness. In this case, the drowsiness level may be set in two stages, "drowsy" and "not drowsy". When the level is "drowsy", the alarm is output by, for example, playing a voice message or a sound such as a buzzer indicating that the driver is drowsy, displaying characters, figures, or patterns to that effect on the display device of the instrument panel, sending cool air from the air conditioner toward the driver, or vibrating the seat and steering wheel (or a combination of these). When the level is "not drowsy", no such alarm is output.

  Alternatively, the strength of the alarm output may be changed according to the degree of drowsiness. That is, a plurality of drowsiness stages are defined, the stage to which the drowsiness level calculated in S150 belongs is determined, and for a stage of higher drowsiness a stronger alarm is output: a voice message indicating greater drowsiness, a louder buzzer or other sound, characters, figures, or patterns indicating greater drowsiness, stronger or cooler air from the air conditioner, or stronger vibration of the seat and steering wheel (or a combination of these).

  Similarly, for an aside-look alarm, for example, the alarm may be issued according to the presence or absence of an aside look. In this case, the aside-look level may be set in two stages, "looking aside" and "not looking aside". When the level is "looking aside", the alarm is output by, for example, playing a voice message or a sound such as a buzzer indicating that the driver is looking aside, displaying characters, figures, or patterns to that effect on the display device of the instrument panel, sending cool air from the air conditioner toward the driver, or vibrating the seat and steering wheel (or a combination of these). When the level is "not looking aside", no such alarm is output.

  Alternatively, the strength of the alarm output may be changed according to the degree of the aside look. That is, a plurality of aside-look stages are defined, the stage to which the aside-look level calculated in S180 (or the yaw angle calculated in S170) belongs is determined, and for a stage of a higher degree a stronger alarm is output: a voice message indicating a greater aside look, a louder buzzer or other sound, characters, figures, or patterns indicating a greater aside look, stronger or cooler air from the air conditioner, or stronger vibration of the seat and steering wheel (or a combination of these).

  Note that the alarm processing in S190 may be based not only on the current drowsiness and aside-look levels but also on the levels within a predetermined period including the present (for example, their average). For this purpose, the drowsiness level calculated in S150 and the aside-look level calculated in S180 may be stored, for example in the RAM 31, for a certain period. This improves the accuracy of the drowsiness and aside-look determinations. The above is the processing procedure of FIG. 3.
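  A sketch of storing the levels for a period and averaging them, as suggested above; the window size is an assumption:

    from collections import deque

    class LevelHistory:
        # Keeps the most recent levels (stored in the RAM 31 in the text)
        # so the alarm decision of S190 can use an average over a period
        # rather than a single frame.
        def __init__(self, size=30):
            self.buffer = deque(maxlen=size)

        def add(self, level):
            self.buffer.append(level)

        def average(self):
            return sum(self.buffer) / len(self.buffer) if self.buffer else 0.0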

  The above embodiment is merely one embodiment of the present invention and may be changed as appropriate without departing from the spirit of the present invention described in the claims. For example, the feature amounts need not be limited to those used in the above embodiment. Specifically, the distance between the left and right mouth corners, the distance between the eyebrows and the eyes, the distance between the midpoint of the eyebrows and the midpoint of the eyes, the distance between the brow heads, the distance between the midpoint of the lower end of the nose and the midpoint of the eyes, the distance between the midpoint of the mouth corners and the midpoint of the eyes, the distance between the lower end of the lower lip and the midpoint of the eyes, the distance between the upper and lower eyelids, the distance between the outer edges of the left and right nostrils, the distance between the upper end of the upper lip and the midpoint of the eyes, the forward-backward tilt angle of the head, the left-right tilt angle of the head, and the like may be used as feature amounts.

  These feature amounts may be used in calculating drowsiness. Drowsiness tends to be higher as the following values increase: the distance between the eyebrows and the eyes, the distance between the midpoint of the eyebrows and the midpoint of the eyes, the distance between the brow heads, the distance between the midpoint of the lower end of the nose and the midpoint of the eyes, the distance between the midpoint of the mouth corners and the midpoint of the eyes, the distance between the lower end of the lower lip and the midpoint of the eyes, the distance between the upper end of the upper lip and the midpoint of the eyes, the forward-backward tilt angle of the head, and the left-right tilt angle of the head. Drowsiness also tends to be higher as the following values decrease: the distance between the left and right mouth corners, the distance between the upper and lower eyelids, and the distance between the outer edges of the left and right nostrils (see JP 2009-45418 A for details).

  These feature amounts are calculated for both the in-license image acquired in S20 and the current image photographed in S100, and they can be used in the drowsiness calculation in S150 in the same way as the eyelid end curve A1 and the eye vertical width A2. That is, for each feature amount, the difference between its value in the in-license image and its value in the current image is calculated, n threshold values are set in advance, and the stage among the n + 1 drowsiness stages to which the feature amount belongs is determined. The overall drowsiness level is then determined by combining these results (for example, taking the lowest value, the highest value, or the rounded average).

  Further, the face image recognition described above may be combined with, for example, a method using an AAM (Active Appearance Model). An AAM uses a statistical model of the correlation between the distribution of luminance (brightness) values of an image and the three-dimensional shape, and thereby recovers the three-dimensional shape from the luminance distribution of an actually acquired image (details of AAM are described in, for example, T. F. Cootes et al.: Active appearance models, Proc. 5th European Conf. Computer Vision, vol. 2, pp. 484-498 (1998)). When an AAM is used for the image recognition, the in-license image and the current image taken by the near-infrared camera are each converted into a three-dimensional shape, feature amounts are calculated in those three-dimensional shapes, and the degrees of drowsiness and looking aside may be calculated using these feature amounts.

  In the above embodiment, the yaw angle (left-right rotation angle) of the driver's head is calculated; in addition, the pitch angle (forward-backward tilt angle) and the roll angle (left-right tilt angle) may be calculated. In this case, for example, a map or a calculation formula giving the pitch and roll angles from the length and inclination of the line segment between points B1 and B2 in FIG. 5 and from the distance between that segment and point B3 is obtained in advance. For example, when the roll angle or the pitch angle is large (or fluctuates greatly), the drowsiness level is determined to be high.

  In the above embodiment, the points B1, B2, and B3 shown in FIG. 5 are obtained for the aside-look determination, but the feature points need not be limited to these; other feature points whose positions change with the yaw angle may also be used. When multiple feature amounts are used to determine the degree of looking aside, the difference between each feature amount in the in-license image and in the current image is calculated in the same manner as for the drowsiness degree, a predetermined number (for example, m) of threshold values are set in advance, and the stage among the m + 1 aside-look stages to which each feature amount belongs is determined. These results are then combined (for example, by taking the lowest value, the highest value, or the rounded average, as described above) to determine the overall degree of looking aside.

  In the image recognition in the above embodiment, template images of the eyes and nose are used, but the templates need not be limited to the eyes and nose; recognition may also be performed using template images of other parts of the face.

1 Image analysis device, 2 Near-infrared camera, 3 Driver monitor ECU, 4 Alarm device, 5 Driver's license, 6 Card reader

Claims (7)

  1. Photographing means for photographing a current image that is a face image of a driver seated in the driver's seat of a vehicle;
    acquisition means for acquiring an in-license image that is a face image of the driver stored in the driver's license carried by the driver;
    recognition processing means for performing face recognition processing on the current image photographed by the photographing means, using the in-license image acquired by the acquisition means;
    first calculation means for calculating a predetermined feature amount in the current image recognized by the recognition processing means; and
    determination means for determining the degree to which the feature amount calculated by the first calculation means deviates from a reference value;
    an image analysis apparatus comprising the above.
  2. Extraction means for extracting a partial image of a predetermined part of the face from the in-license image acquired by the acquisition means;
    the image analysis apparatus according to claim 1, wherein the recognition processing means recognizes the predetermined part in the current image by performing matching processing between the partial image of the predetermined part extracted by the extraction means and the current image.
  3. Photographing means for photographing a current image that is a face image of a driver seated in the driver's seat of a vehicle;
    acquisition means for acquiring an in-license image that is a face image of the driver stored in the driver's license carried by the driver;
    second calculation means for calculating a predetermined feature amount in the current image photographed by the photographing means;
    third calculation means for calculating a predetermined feature amount in the in-license image acquired by the acquisition means; and
    determination means for determining the degree to which the feature amount calculated by the second calculation means deviates from a reference value determined according to the predetermined feature amount calculated by the third calculation means;
    an image analysis apparatus comprising the above.
  4.   The image analysis apparatus according to claim 1, wherein the predetermined feature amount includes a numerical value indicating the position of a part reflecting the degree of drowsiness in the face image.
  5.   The image analysis apparatus according to claim 4, wherein the numerical value indicating the position of the part reflecting the degree of drowsiness is a numerical value correlated with the degree of eyelid opening in the face image.
  6.   The image analysis apparatus according to claim 1, wherein the predetermined feature amount includes a numerical value indicating the degree of looking aside in the face image.
  7.   The image analysis apparatus according to claim 6, wherein the numerical value indicating the degree of looking aside is a numerical value correlated with the angle of the face orientation with respect to the front-rear direction of the vehicle.
JP2010231672A 2010-10-14 2010-10-14 Image analyzer Pending JP2012084068A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010231672A JP2012084068A (en) 2010-10-14 2010-10-14 Image analyzer

Publications (1)

Publication Number Publication Date
JP2012084068A true JP2012084068A (en) 2012-04-26

Family

ID=46242842

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010231672A Pending JP2012084068A (en) 2010-10-14 2010-10-14 Image analyzer

Country Status (1)

Country Link
JP (1) JP2012084068A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000301963A (en) * 1999-04-23 2000-10-31 Fujitsu Ltd Accident deterrent system and method
JP2001243466A (en) * 2000-03-02 2001-09-07 Honda Motor Co Ltd Device and method for face recognition
JP2008204107A (en) * 2007-02-19 2008-09-04 Toyota Motor Corp Carelessness warning device, vehicle equipment control method for the device and program for vehicle control device
JP2009090691A (en) * 2007-10-03 2009-04-30 Toshiba Corp Unlawful driving preventive device

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016502165A (en) * 2012-10-19 2016-01-21 オートリブ ディベロップメント エービー Driver attention detection method and apparatus
CN103310539A (en) * 2013-06-25 2013-09-18 宁夏新航信息科技有限公司 Automatic reminding system of ATM
JP2016539446A (en) * 2013-10-29 2016-12-15 キム,ジェ−チョル A device for preventing doze driving in two stages through recognition of movement, face, eyes and mouth shape
JP2017511165A (en) * 2014-02-24 2017-04-20 ソニー株式会社 Smart wearable device and method for customized haptic feedback
JP2018508870A (en) * 2015-01-19 2018-03-29 ローベルト ボッシュ ゲゼルシャフト ミット ベシュレンクテル ハフツング Method and apparatus for detecting instantaneous sleep of a vehicle driver
KR101737800B1 (en) * 2015-04-07 2017-05-29 엘지전자 주식회사 Vehicle terminal and control method thereof
JP6264492B1 (en) * 2017-03-14 2018-01-24 オムロン株式会社 Driver monitoring device, driver monitoring method, learning device, and learning method
JP6264495B1 (en) * 2017-03-14 2018-01-24 オムロン株式会社 Driver monitoring device, driver monitoring method, learning device, and learning method
JP6264494B1 (en) * 2017-03-14 2018-01-24 オムロン株式会社 Driver monitoring device, driver monitoring method, learning device, and learning method
WO2018167991A1 (en) * 2017-03-14 2018-09-20 オムロン株式会社 Driver monitoring device, driver monitoring method, learning device, and learning method

Legal Events

RD02 Notification of acceptance of power of attorney; JAPANESE INTERMEDIATE CODE: A7422; effective date: 20121013
A621 Written request for application examination; JAPANESE INTERMEDIATE CODE: A621; effective date: 20121109
A131 Notification of reasons for refusal; JAPANESE INTERMEDIATE CODE: A131; effective date: 20131015
A977 Report on retrieval; JAPANESE INTERMEDIATE CODE: A971007; effective date: 20131016
A521 Written amendment; JAPANESE INTERMEDIATE CODE: A523; effective date: 20131210
A02 Decision of refusal; JAPANESE INTERMEDIATE CODE: A02; effective date: 20140516