WO2013008305A1 - Eyelid detection device - Google Patents
Eyelid detection device
- Publication number
- WO2013008305A1 (PCT/JP2011/065825)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eyelid
- face
- range
- facial
- bounds
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- The present invention relates to an eyelid detection device that detects the position of the upper and lower eyelids from a face image.
- Patent Document 1 describes detecting the open/closed state of an eye based on a stored eyelid shape corresponding to a facial posture and a detected eyelid shape.
- Patent Document 1 has a problem that eyelids cannot be accurately detected under disturbance caused by the red-eye phenomenon or glasses. For example, since an edge is generated around a red eye, that edge may be erroneously detected as the position of the upper and lower eyelids. Further, if the edge of a spectacle frame appearing in the face image is strong, the spectacle frame may be erroneously detected as the position of the upper and lower eyelids.
- An object of the present invention is to provide an eyelid detection device that can detect the position of the upper and lower eyelids with high accuracy even under disturbance caused by the red-eye phenomenon or glasses.
- The eyelid detection apparatus according to the present invention is an eyelid detection apparatus that detects the position of the upper and lower eyelids from a face image, and detects the position of the upper and lower eyelids based on a face orientation estimated by fitting facial feature points detected from the face image to a three-dimensional face model.
- The face orientation can be estimated by fitting the facial feature points to the three-dimensional face model. Since the range in which the upper and lower eyelids can exist is limited according to the face orientation, detecting the position of the upper and lower eyelids based on the estimated face orientation restricts the search to that range. As a result, the influence of the red-eye phenomenon and of disturbance caused by glasses occurring in ranges where the upper and lower eyelids cannot exist can be eliminated, so the position of the upper and lower eyelids can be detected with high accuracy.
- During driving, the vertical angle range for detecting the upper and lower eyelids can be limited according to the face orientation. Since the driver gazes forward regardless of the face direction while driving, the range in which the upper and lower eyelids exist is well defined. Therefore, by restricting the vertical angle range for detecting the upper and lower eyelids according to the face orientation during driving, the effects of the red-eye phenomenon and of disturbance caused by glasses occurring in ranges where the upper and lower eyelids cannot exist can be appropriately eliminated.
- When the face direction is downward, it is preferable to raise the lower limit angle of the upper and lower eyelids compared to when the face direction is frontward. When the face is directed downward, the line of sight is likely directed upward relative to the face, so raising the lower limit angle of the upper and lower eyelids appropriately eliminates the influence of the red-eye phenomenon and of disturbance caused by glasses occurring in ranges where the eyelids cannot exist.
- According to the present invention, it is possible to detect the position of the upper and lower eyelids with high accuracy even under disturbance caused by the red-eye phenomenon or glasses.
- The eyelid detection device is mounted on, for example, a driving assistance control device that performs driving assistance control of a vehicle by estimating the driver's drowsiness level from the degree of eye opening calculated from the position of the upper and lower eyelids.
- the position of the upper and lower eyelids means the position of the upper eyelid and the position of the lower eyelid.
- FIG. 1 is a diagram showing a block configuration of the eyelid detection apparatus according to the embodiment.
- The eyelid detection device 1 includes an image sensor 10, a vehicle speed sensor 20, and an ECU (Electronic Control Unit) 30.
- the image sensor 10 is a sensor that images a driver's face.
- As the image sensor 10, for example, a CCD camera fixed to the steering column of the vehicle is used.
- An image (face image) captured by the image sensor 10 is composed of image information representing the position and color information of each pixel. Then, the image sensor 10 outputs image information of the captured image to the ECU 30.
- the vehicle speed sensor 20 is a sensor that measures the vehicle speed of the vehicle.
- the vehicle speed sensor 20 measures the vehicle speed of the vehicle, for example, by measuring the rotational speed of each wheel of the vehicle. Then, the vehicle speed sensor 20 outputs the measured vehicle speed to the ECU 30.
- The ECU 30 is a computer that performs electronic control of automotive devices, and includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and an input / output interface.
- The ECU 30 is connected to the image sensor 10 and the vehicle speed sensor 20, and includes a vehicle speed determination unit 31, a face position / face feature point detection unit 32, a face posture estimation unit 33, an eyelid range setting unit 34, and an eyelid detection unit 35.
- the vehicle speed determination unit 31 has a function of determining whether or not the vehicle is traveling. For example, when the vehicle speed output from the vehicle speed sensor 20 is higher than 0 km / h, the vehicle speed determination unit 31 determines that the vehicle is traveling.
- The face position / face feature point detection unit 32 has a function of detecting feature points of the driver's face from the image captured by the image sensor 10. More specifically, the face position / face feature point detection unit 32 first searches the entire image captured by the image sensor 10 and finds the face position by a statistical method such as a neural network or boosting. Then, the face position / face feature point detection unit 32 sets a face position region including the found face position, and detects face feature points from the set face position region by a statistical method such as a neural network or boosting. Examples of facial feature points include the outer and inner corners of the right eye, the outer and inner corners of the left eye, the center of the nasal cavity, and the left and right corners of the mouth. The detection of each feature point of the face is not limited to this method, and other known methods may be used.
- the face posture estimation unit 33 has a function of estimating the driver's face posture (face orientation) from the face feature points detected by the face position / face feature point detection unit 32. Specifically, the face posture estimation unit 33 first fits a 3D face model (3D face model) to the coordinate position of the face feature point detected by the face position / face feature point detection unit 32. Then, the face posture estimation unit 33 estimates the driver's face posture (face orientation) from the posture of the fitted 3D face model.
- the 3D face model includes a 3D eyeball model, and it is also possible to represent the line-of-sight direction of the 3D eyeball model, the position of the upper and lower eyelids covering the 3D eyeball model, and the like.
- The eyelid range setting unit 34 has a function of setting, based on the face posture estimated by the face posture estimation unit 33, an upper eyelid presence range, which is a range where the upper eyelid position can exist, and a lower eyelid presence range, which is a range where the lower eyelid position can exist.
- A driver who is driving is considered to be gazing forward, regardless of the vertical angle of the face. For this reason, during driving, the upper eyelid presence range and the lower eyelid presence range are well defined for each vertical face angle. Further, during driving, when the face is directed upward or downward, the angle at which the eyelids open is narrower than when the face is directed to the front.
- the eyelid range setting unit 34 sets the upper eyelid presence range and the lower eyelid presence range according to the face posture (face orientation) estimated by the face posture estimation unit 33.
- the upper eyelid presence range and the lower eyelid presence range are represented by angle ranges in the three-dimensional eyeball model.
- the eyelid range setting unit 34 converts the upper eyelid presence range and the lower eyelid presence range represented by the angle range in the three-dimensional eyeball model into the two-dimensional face position region set by the face position / face feature point detection unit 32. By projecting, the upper eyelid existing range and the lower eyelid existing range expressed in two dimensions are set as the face position region.
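The projection from an angular eyelid range on the eyeball model to a band of image rows can be sketched as follows. This is a minimal orthographic sketch: the eye-center row, the pixel radius of the eyeball, and the projection model are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

def angle_range_to_image_rows(theta_min_deg, theta_max_deg,
                              eye_center_y, eye_radius_px):
    """Project an eyelid angle range on the 3D eyeball model onto a
    vertical pixel band in the face image. Image y grows downward and
    positive eyelid angles lie above the eye center (assumptions)."""
    y_top = eye_center_y - eye_radius_px * np.sin(np.radians(theta_max_deg))
    y_bottom = eye_center_y - eye_radius_px * np.sin(np.radians(theta_min_deg))
    return y_top, y_bottom

# Front-facing upper-eyelid range (-45° to 55°) around an eye centered
# at row 100 with an assumed 20 px eyeball radius:
top, bottom = angle_range_to_image_rows(-45.0, 55.0, 100.0, 20.0)
```

Any candidate eyelid curve whose pixels fall outside `[top, bottom]` would then be excluded from matching.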
- The eyelid detection unit 35 has a function of detecting the position of the upper and lower eyelids within the upper eyelid presence range and the lower eyelid presence range set by the eyelid range setting unit 34. More specifically, the eyelid detection unit 35 applies, for example, a Sobel filter to the face position region set by the face position / face feature point detection unit 32 to generate an edge image, that is, an image with enhanced edges. Further, in the upper eyelid presence range and the lower eyelid presence range set by the eyelid range setting unit 34, the eyelid detection unit 35 projects onto the edge image a plurality of curves having the eye corner and inner eye corner feature points detected by the face position / face feature point detection unit 32 as start and end points.
- the eyelid detection unit 35 detects the position of the upper and lower eyelids from the intensity of the edge on the curve (pixel value of the edge image). In other words, the eyelid detection unit 35 detects the position of the upper and lower eyelids by collating a plurality of curves (upper and lower eyelid curve models) projected on the upper eyelid existence range and the lower eyelid existence range with the edge image.
- the detection of the position of the upper and lower eyelids is not limited to this method, and other known methods may be used.
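The edge-image step can be sketched with a plain 3×3 Sobel filter. This is a minimal NumPy illustration of the kind of filter named above, not the embodiment's implementation:

```python
import numpy as np

def sobel_edge_image(gray):
    """Apply 3x3 Sobel kernels to a grayscale image and return the
    gradient magnitude (the 'edge image' described above)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    padded = np.pad(gray.astype(float), 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):           # cross-correlate both kernels
        for j in range(3):
            window = padded[i:i + h, j:j + w]
            gx += kx[i, j] * window
            gy += ky[i, j] * window
    return np.hypot(gx, gy)

# A vertical step edge produces a strong response along the step only.
img = np.zeros((5, 6))
img[:, 3:] = 255.0
edges = sobel_edge_image(img)
```

In practice an optimized library routine would replace the explicit loops; the point is only the structure of the edge image that the curve matching operates on.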
- FIG. 2 is a flowchart illustrating the eyelid detection processing operation of the eyelid detection device according to the embodiment.
- the process shown in FIG. 2 is performed under the control of the ECU 30, and is repeatedly performed at a predetermined interval from the timing when the ignition is turned on until the ignition is turned off, for example.
- the ECU 30 inputs an image of the driver imaged by the image sensor 10 (step S1).
- In step S1, the image F1 shown in FIG. 3, captured by the image sensor 10, is input.
- FIG. 3 is an example of an image captured by the image sensor.
- the ECU 30 determines whether or not the vehicle is traveling (step S2).
- the vehicle speed determination unit 31 performs the process of step S2.
- The vehicle speed determination unit 31 determines that the vehicle is traveling when the vehicle speed output from the vehicle speed sensor 20 is greater than 0 km/h, and determines that the vehicle is not traveling when the vehicle speed output from the vehicle speed sensor 20 is 0 km/h.
- When it is determined that the vehicle is not traveling (step S2: NO), the eyelid detection process is terminated.
- When the ECU 30 determines that the vehicle is traveling (step S2: YES), the process proceeds to step S3.
- the ECU 30 detects a face position / face feature point (step S3).
- the face position / face feature point detection unit 32 performs the process of step S3.
- The face position / face feature point detection unit 32 uses the entire image F1 input in step S1 as a search range to find the face position by a statistical method such as a neural network or boosting.
- the face position / face feature point detector 32 sets a face position region G1.
- FIG. 4 is a schematic diagram for explaining a face feature point detection method, and shows a face position region G1.
- the face position area G1 is an area including the found face position, and is an area of the image F1.
- The face position / face feature point detection unit 32 uses the set face position region G1 as a search range and detects feature points such as the outer and inner corners of the right eye, the outer and inner corners of the left eye, the center of the nasal cavity, and the left and right corners of the mouth by a statistical method such as a neural network or boosting.
- Next, the ECU 30 estimates the driver's face posture (step S4). The process of step S4 is performed by the face posture estimation unit 33.
- the face posture estimation unit 33 first fits the 3D face model to the coordinate position of the face feature point detected by the face position / face feature point detection unit 32 in step S3.
- FIG. 5 is a schematic diagram showing an example of a 3D face model.
- In the 3D face model, the Ym direction is taken along the vertical direction of the face, the Xm direction along the horizontal direction of the face, and the Zm direction along the front-back direction of the face. Rotation around the Ym axis is the yaw, rotation around the Xm axis is the pitch, and rotation around the Zm axis is the roll.
- The 3D face model holds the distance from the head rotation center for each feature point. Therefore, the face posture estimation unit 33 fits this 3D face model to the detected feature points, and takes the position and rotation (yaw, pitch, roll) at the time of best fit as the face posture at that time.
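The pose recovered by the model fit can be illustrated by the rotation it parameterizes. A minimal sketch, assuming a Ry·Rx·Rz composition order and orthographic projection; the patent fixes neither, so both are illustrative choices:

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Rotation about Ym (yaw), Xm (pitch), and Zm (roll), in radians.
    The composition order Ry @ Rx @ Rz is an assumption."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return Ry @ Rx @ Rz

def project_model(points_3d, yaw, pitch, roll, translation):
    """Rotate model feature points, translate, and drop Zm for a
    simple orthographic projection onto the image plane."""
    rotated = points_3d @ rotation_matrix(yaw, pitch, roll).T + translation
    return rotated[:, :2]
```

Fitting then amounts to searching for the (yaw, pitch, roll, translation) that minimizes the distance between these projected model points and the detected feature points.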
- the method for estimating the face posture is not limited to this method, and other known methods may be used.
- the face posture estimation unit 33 estimates the driver's face posture (face orientation) from the posture of the fitted 3D face model.
- Next, the ECU 30 sets the eyelid presence ranges (step S5). The process of step S5 is performed by the eyelid range setting unit 34.
- The eyelid range setting unit 34 first sets the upper eyelid presence range and the lower eyelid presence range, represented by angle ranges in the three-dimensional eyeball model, according to the face orientation estimated by the face posture estimation unit 33 in step S4.
- FIG. 6 is a diagram showing a three-dimensional eyeball model when the face direction is the front direction.
- FIG. 7 is a diagram showing a three-dimensional eyeball model when the face is facing upward.
- FIG. 8 is a diagram showing a three-dimensional eyeball model when the face direction is downward.
- In FIGS. 6 to 8, O represents the eyeball center of the three-dimensional eyeball model, E_Upr represents the upper eyelid, E_Lwr represents the lower eyelid, θUpr represents the position of the upper eyelid, and θLwr represents the position of the lower eyelid.
- The upper eyelid presence range and the lower eyelid presence range described below are examples, and other values may be adopted.
- The eyelid range setting unit 34 first determines whether the face direction estimated by the face posture estimation unit 33 in step S4 is the front direction, upward, or downward. In this determination, with the vertical angle defined as 0° when the face is facing directly forward, the face direction is determined to be the front direction when its vertical angle is in the range of −10° to 10°, upward when the vertical angle is greater than 10°, and downward when the vertical angle is less than −10°.
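The threshold logic above can be sketched directly:

```python
def classify_face_direction(pitch_deg):
    """Classify the vertical face direction using the ±10° thresholds
    described above (0° = facing straight ahead)."""
    if pitch_deg > 10.0:
        return "up"
    if pitch_deg < -10.0:
        return "down"
    return "front"
```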
- When the eyelid range setting unit 34 determines that the face direction is the front direction, as shown in FIG. 6, the upper eyelid presence range in which the upper eyelid position θUpr can exist is set to −45° or more and 55° or less (−45° ≤ θUpr ≤ 55°), and the lower eyelid presence range in which the lower eyelid position θLwr can exist is set to −45° or more and −15° or less (−45° ≤ θLwr ≤ −15°).
- When the eyelid range setting unit 34 determines that the face direction is upward, as shown in FIG. 7, the upper eyelid presence range in which the upper eyelid position θUpr can exist is set to −45° or more and 30° or less (−45° ≤ θUpr ≤ 30°), and the lower eyelid presence range in which the lower eyelid position θLwr can exist is set to −45° or more and −15° or less (−45° ≤ θLwr ≤ −15°). That is, when the face direction is determined to be upward, the upper limit angle of the upper eyelid presence range is made 25° smaller than when the face direction is determined to be the front direction.
- When the eyelid range setting unit 34 determines that the face direction is downward, as shown in FIG. 8, the upper eyelid presence range in which the upper eyelid position θUpr can exist is set to −30° or more and 55° or less (−30° ≤ θUpr ≤ 55°), and the lower eyelid presence range in which the lower eyelid position θLwr can exist is set to −30° or more and −15° or less (−30° ≤ θLwr ≤ −15°). That is, when the face direction is determined to be downward, the lower limit angles of the upper eyelid presence range and the lower eyelid presence range are raised by 15° compared to when the face direction is determined to be the front direction.
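The three presence-range settings can be collected in a small lookup table. The values are taken from the ranges above, which the text itself notes are only examples:

```python
# Presence ranges in degrees on the 3D eyeball model, per face direction,
# as ((upper-lid min, max), (lower-lid min, max)).
EYELID_RANGES = {
    "front": ((-45.0, 55.0), (-45.0, -15.0)),
    "up":    ((-45.0, 30.0), (-45.0, -15.0)),  # upper limit lowered by 25 deg
    "down":  ((-30.0, 55.0), (-30.0, -15.0)),  # lower limits raised by 15 deg
}

def eyelid_presence_ranges(direction):
    """Look up the upper and lower eyelid presence ranges for a
    classified face direction ("front", "up", or "down")."""
    return EYELID_RANGES[direction]
```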
- The eyelid range setting unit 34 next projects the upper eyelid presence range and the lower eyelid presence range, represented by angle ranges in the three-dimensional eyeball model, onto the two-dimensional face position region set by the face position / face feature point detection unit 32 in step S3, thereby setting the upper eyelid presence range and the lower eyelid presence range in the face position region.
- Next, the ECU 30 detects the position of the upper eyelid and the position of the lower eyelid within the upper eyelid presence range and the lower eyelid presence range set by the eyelid range setting unit 34 in step S5 (step S6).
- The eyelid detection unit 35 performs the process of step S6.
- FIG. 9 is a schematic diagram for explaining a method for detecting upper and lower eyelids.
- The eyelid detection unit 35 applies, for example, a Sobel filter to the face position region G1 set by the face position / face feature point detection unit 32 in step S3, and generates an edge image G3 in which edges are emphasized.
- Next, in the upper eyelid presence range and the lower eyelid presence range set by the eyelid range setting unit 34 in step S5, the eyelid detection unit 35 projects a plurality of curves having the eye corner and inner eye corner feature points detected in step S3 as start and end points. For example, Bezier curves are used as the curves.
- The eyelid detection unit 35 projects curves as lower eyelid candidates only in the lower eyelid presence range set by the eyelid range setting unit 34, and projects curves as upper eyelid candidates only in the upper eyelid presence range set by the eyelid range setting unit 34. That is, the eyelid detection unit 35 does not project lower eyelid candidate curves outside the lower eyelid presence range, and does not project upper eyelid candidate curves outside the upper eyelid presence range.
- the curve q1 shown in FIG. 9 is not projected as a lower eyelid candidate because it is located above the lower eyelid existence range set by the eyelid range setting unit 34.
- the curve q2 shown in FIG. 9 is not projected as an upper eyelid and lower eyelid candidate because it is located below the upper eyelid existing range and the lower eyelid existing range set by the eyelid range setting unit 34.
- the curve q3 shown in FIG. 9 is positioned above the upper eyelid presence range and the lower eyelid presence range set by the eyelid range setting unit 34, and therefore is not projected as a candidate for the upper eyelid and the lower eyelid.
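Candidate generation restricted to the presence ranges can be sketched with quadratic Bezier curves. The patent names Bezier curves but not their degree, and filtering by the control point's height inside a projected pixel band is an illustrative assumption:

```python
import numpy as np

def quadratic_bezier(p0, p1, p2, n=20):
    """Sample a quadratic Bezier curve from eye corner p0 to inner
    corner p2 with control point p1."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def candidate_curves(corner, inner, control_ys, y_min, y_max):
    """Generate candidate eyelid curves, keeping only those whose
    control point lies inside the projected presence band [y_min, y_max]."""
    curves = []
    for cy in control_ys:
        if y_min <= cy <= y_max:   # skip candidates outside the presence range
            control = np.array([(corner[0] + inner[0]) / 2.0, cy])
            curves.append(quadratic_bezier(np.asarray(corner, float),
                                           control,
                                           np.asarray(inner, float)))
    return curves

# Eye corner at (40, 100) and inner corner at (80, 100); candidate control
# heights every 5 px, kept only inside an assumed presence band [80, 95].
curves = candidate_curves((40.0, 100.0), (80.0, 100.0),
                          control_ys=range(60, 121, 5),
                          y_min=80.0, y_max=95.0)
```

Curves like q1 to q3 in FIG. 9, whose sweep lies outside the band, would simply never be generated.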
- The eyelid detection unit 35 then calculates the edge strength on each curve (the pixel values of the edge image along the curve), and detects the curves with the strongest edge strength as the positions of the upper and lower eyelids.
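Scoring candidates by edge strength along the curve can be sketched as follows; nearest-pixel sampling and mean intensity are illustrative choices, not specified by the embodiment:

```python
import numpy as np

def curve_edge_strength(edge_image, curve_pts):
    """Mean edge-image intensity sampled along a curve of (x, y)
    points, using nearest-pixel lookup."""
    h, w = edge_image.shape
    xs = np.clip(np.rint(curve_pts[:, 0]).astype(int), 0, w - 1)
    ys = np.clip(np.rint(curve_pts[:, 1]).astype(int), 0, h - 1)
    return float(edge_image[ys, xs].mean())

def best_curve_index(edge_image, curves):
    """Index of the candidate with the strongest edge response."""
    scores = [curve_edge_strength(edge_image, c) for c in curves]
    return int(np.argmax(scores))

# Toy edge image with a strong horizontal edge on row 3; a candidate
# running along that row beats one running along a blank row.
edge = np.zeros((8, 10))
edge[3, :] = 100.0
flat = np.stack([np.arange(10, dtype=float), np.full(10, 6.0)], axis=1)
on_edge = np.stack([np.arange(10, dtype=float), np.full(10, 3.0)], axis=1)
idx = best_curve_index(edge, [flat, on_edge])
```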
- Then, the eyelid detection process ends.
- FIG. 10 is a schematic diagram for explaining erroneous detection of the position of the upper and lower eyelids.
- As shown in FIG. 10(a), when the red-eye phenomenon occurs at night, an unnecessary edge is generated near the red eye, and this unnecessary edge may be erroneously detected as the position of the upper and lower eyelids.
- As shown in FIGS. 10(b) and 10(c), when the driver wears spectacles, the edges of the spectacle frame become strong, and the spectacle frame may be erroneously detected as the position of the upper and lower eyelids.
- However, since the eyelid detection unit 35 detects the position of the upper and lower eyelids only within the upper eyelid presence range and the lower eyelid presence range set by the eyelid range setting unit 34, erroneous detections such as those shown in FIG. 10 can be prevented.
- The curve q1 shown in FIG. 9 is not projected as a lower eyelid candidate because it is located above the lower eyelid presence range set by the eyelid range setting unit 34. Therefore, as shown in FIG. 10(a), an unnecessary edge generated near the red eye is not erroneously detected as the position of the upper and lower eyelids. Further, the curves q2 and q3 shown in FIG. 9 are not projected as upper and lower eyelid candidates because they are located outside the upper and lower eyelid presence ranges, so the spectacle frame shown in FIGS. 10(b) and 10(c) is not erroneously detected as the position of the upper and lower eyelids.
- As described above, according to the eyelid detection device 1, during driving, the upper eyelid presence range and the lower eyelid presence range in which the positions of the upper and lower eyelids can exist are set according to the face orientation, and the positions of the upper and lower eyelids are detected within these set ranges. This eliminates the influence of the red-eye phenomenon and of disturbance caused by glasses occurring in ranges where the upper and lower eyelids cannot exist, so the position of the upper and lower eyelids can be detected with high accuracy.
- Further, when the face orientation is upward, the upper limit angle of the upper eyelid presence range is made lower than when the face orientation is frontward, and when the face orientation is downward, the lower limit angles of the upper and lower eyelid presence ranges are made higher than when the face orientation is frontward, so the position of the upper and lower eyelids can be detected appropriately.
- The present invention is not limited to the embodiment described above. For example, in the embodiment, the eyelid range setting unit 34 sets the upper eyelid presence range and the lower eyelid presence range, and the eyelid detection unit 35 detects the position of the upper and lower eyelids within those ranges; however, the position of the upper and lower eyelids may be detected by any means. For example, a range in which the upper and lower eyelids can exist in the edge image may be calculated from the face direction, and curves that are candidates for the upper and lower eyelids may be projected only in this range.
- Further, in the embodiment, the eyelid range setting unit 34 has been described as setting the upper eyelid presence range and the lower eyelid presence range represented by angle ranges in the three-dimensional eyeball model and then projecting them to set two-dimensional ranges in the face position region; however, a two-dimensional upper eyelid presence range and lower eyelid presence range may be set directly in the face position region.
- The present invention can be used as an eyelid detection device that detects the position of the upper and lower eyelids from a face image.
- DESCRIPTION OF SYMBOLS 1 ... Eyelid detection device, 10 ... Image sensor, 20 ... Vehicle speed sensor, 30 ... ECU, 31 ... Vehicle speed determination unit, 32 ... Face position / face feature point detection unit, 33 ... Face posture estimation unit, 34 ... Eyelid range setting unit, 35 ... Eyelid detection unit, F1 ... Image, G1 ... Face position region, G3 ... Edge image.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims (5)
- An eyelid detection device that detects the position of the upper and lower eyelids from a face image, the device detecting the position of the upper and lower eyelids based on a face orientation estimated by fitting facial feature points detected from the face image to a three-dimensional face model.
- The eyelid detection device according to claim 1, wherein the position of the upper and lower eyelids is detected by collating a curve model of the upper and lower eyelids estimated from the face orientation against an edge image in which edges of the face image are emphasized.
- The eyelid detection device according to claim 2, wherein, during driving, a vertical angle range for detecting the upper and lower eyelids is limited according to the face orientation.
- The eyelid detection device according to claim 3, wherein, when the face orientation is upward, an upper limit angle of the upper eyelid is set lower than when the face orientation is frontward.
- The eyelid detection device according to claim 3, wherein, when the face orientation is downward, a lower limit angle of the upper and lower eyelids is set higher than when the face orientation is frontward.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112011105441.6T DE112011105441B4 (de) | 2011-07-11 | 2011-07-11 | Augenliderfassungsvorrichtung |
PCT/JP2011/065825 WO2013008305A1 (ja) | 2011-07-11 | 2011-07-11 | 瞼検出装置 |
JP2013523729A JP5790762B2 (ja) | 2011-07-11 | 2011-07-11 | 瞼検出装置 |
US14/131,531 US9202106B2 (en) | 2011-07-11 | 2011-07-11 | Eyelid detection device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/065825 WO2013008305A1 (ja) | 2011-07-11 | 2011-07-11 | 瞼検出装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013008305A1 true WO2013008305A1 (ja) | 2013-01-17 |
Family
ID=47505624
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/065825 WO2013008305A1 (ja) | 2011-07-11 | 2011-07-11 | 瞼検出装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9202106B2 (ja) |
JP (1) | JP5790762B2 (ja) |
DE (1) | DE112011105441B4 (ja) |
WO (1) | WO2013008305A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107862732A (zh) * | 2017-11-08 | 2018-03-30 | 清华大学 | 实时的三维眼皮重建方法及装置 |
WO2021024905A1 (ja) * | 2019-08-02 | 2021-02-11 | オムロン株式会社 | 画像処理装置、モニタリング装置、制御システム、画像処理方法、コンピュータプログラム、及び記憶媒体 |
JP2021166107A (ja) * | 2015-08-21 | 2021-10-14 | マジック リープ, インコーポレイテッドMagic Leap, Inc. | 眼ポーズ測定を用いた眼瞼形状推定 |
JP2022513978A (ja) * | 2018-12-26 | 2022-02-09 | 巽騰(広東)科技有限公司 | 表情グループに基づく操作決定方法、装置及び電子機器 |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6264665B2 (ja) * | 2013-04-17 | 2018-01-24 | パナソニックIpマネジメント株式会社 | 画像処理方法および画像処理装置 |
KR101613091B1 (ko) * | 2014-04-24 | 2016-04-20 | 한국과학기술연구원 | 시선 추적 장치 및 방법 |
CN105701445A (zh) * | 2014-12-15 | 2016-06-22 | 爱信精机株式会社 | 判定装置及判定方法 |
EP3259734A4 (en) * | 2015-02-20 | 2019-02-20 | Seeing Machines Limited | GLARE REDUCTION |
CN105125174A (zh) * | 2015-08-03 | 2015-12-09 | 刘天键 | 一种可重构眼镜式疲劳检测设备及软件处理方法 |
AU2016340222B2 (en) | 2015-10-16 | 2021-07-01 | Magic Leap, Inc. | Eye pose identification using eye features |
CN105662407A (zh) * | 2015-12-31 | 2016-06-15 | 清华大学苏州汽车研究院(吴江) | 一种基于表面肌电技术的驾驶员疲劳检测系统 |
CN105726046B (zh) * | 2016-01-29 | 2018-06-19 | 西南交通大学 | 一种驾驶员警觉度状态检测方法 |
CN106446766A (zh) * | 2016-07-25 | 2017-02-22 | 浙江工业大学 | 一种视频中人脸特征点的稳定检测方法 |
CN111559382B (zh) * | 2020-05-09 | 2021-11-02 | Oppo广东移动通信有限公司 | 车辆行驶控制方法及装置 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011125620A (ja) * | 2009-12-21 | 2011-06-30 | Toyota Motor Corp | 生体状態検出装置 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3293308B2 (ja) * | 1994-03-10 | 2002-06-17 | 三菱電機株式会社 | 人物状態検出装置 |
JP4469476B2 (ja) * | 2000-08-09 | 2010-05-26 | パナソニック株式会社 | 眼位置検出方法および眼位置検出装置 |
US6664956B1 (en) * | 2000-10-12 | 2003-12-16 | Momentum Bilgisayar, Yazilim, Danismanlik, Ticaret A. S. | Method for generating a personalized 3-D face model |
JP4471607B2 (ja) | 2003-08-29 | 2010-06-02 | 富士通株式会社 | 眼の追跡装置、眼の状態判定装置及びコンピュータプログラム |
JP2006260397A (ja) * | 2005-03-18 | 2006-09-28 | Konica Minolta Holdings Inc | 開眼度推定装置 |
US20070127787A1 (en) * | 2005-10-24 | 2007-06-07 | Castleman Kenneth R | Face recognition system and method |
JP4137969B2 (ja) * | 2006-12-04 | 2008-08-20 | アイシン精機株式会社 | 眼部検出装置、眼部検出方法及びプログラム |
JP4895797B2 (ja) * | 2006-12-26 | 2012-03-14 | アイシン精機株式会社 | 瞼検出装置、瞼検出方法及びプログラム |
JP4895847B2 (ja) * | 2007-02-08 | 2012-03-14 | アイシン精機株式会社 | 瞼検出装置及びプログラム |
US8045766B2 (en) * | 2007-02-16 | 2011-10-25 | Denso Corporation | Device, program, and method for determining sleepiness |
JP4375420B2 (ja) * | 2007-03-26 | 2009-12-02 | 株式会社デンソー | 眠気警報装置、及びプログラム |
JP4966816B2 (ja) * | 2007-10-25 | 2012-07-04 | 株式会社日立製作所 | 視線方向計測方法および視線方向計測装置 |
JP2009245338A (ja) | 2008-03-31 | 2009-10-22 | Secom Co Ltd | 顔画像照合装置 |
JP2010033305A (ja) * | 2008-07-29 | 2010-02-12 | Hitachi Ltd | 画像情報処理方法、及び装置 |
EP2550918A1 (en) * | 2010-03-23 | 2013-01-30 | Aisin Seiki Kabushiki Kaisha | Alertness determination device, alertness determination method, and program |
JP4893862B1 (ja) * | 2011-03-11 | 2012-03-07 | オムロン株式会社 | 画像処理装置、および画像処理方法 |
CN103493097B (zh) * | 2011-04-15 | 2015-09-16 | 爱信精机株式会社 | 眼睑检测装置、眼睑检测方法 |
- 2011-07-11 DE DE112011105441.6T patent/DE112011105441B4/de active Active
- 2011-07-11 WO PCT/JP2011/065825 patent/WO2013008305A1/ja active Application Filing
- 2011-07-11 JP JP2013523729A patent/JP5790762B2/ja active Active
- 2011-07-11 US US14/131,531 patent/US9202106B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011125620A (ja) * | 2009-12-21 | 2011-06-30 | Toyota Motor Corp | 生体状態検出装置 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021166107A (ja) * | 2015-08-21 | 2021-10-14 | マジック リープ, インコーポレイテッドMagic Leap, Inc. | 眼ポーズ測定を用いた眼瞼形状推定 |
JP7231676B2 (ja) | 2015-08-21 | 2023-03-01 | マジック リープ, インコーポレイテッド | 眼ポーズ測定を用いた眼瞼形状推定 |
CN107862732A (zh) * | 2017-11-08 | 2018-03-30 | 清华大学 | 实时的三维眼皮重建方法及装置 |
CN107862732B (zh) * | 2017-11-08 | 2020-06-19 | 清华大学 | 实时的三维眼皮重建方法及装置 |
JP2022513978A (ja) * | 2018-12-26 | 2022-02-09 | 巽騰(広東)科技有限公司 | 表情グループに基づく操作決定方法、装置及び電子機器 |
WO2021024905A1 (ja) * | 2019-08-02 | 2021-02-11 | オムロン株式会社 | 画像処理装置、モニタリング装置、制御システム、画像処理方法、コンピュータプログラム、及び記憶媒体 |
JP2021026420A (ja) * | 2019-08-02 | 2021-02-22 | オムロン株式会社 | 画像処理装置、モニタリング装置、制御システム、画像処理方法、及びコンピュータプログラム |
Also Published As
Publication number | Publication date |
---|---|
DE112011105441T5 (de) | 2014-03-27 |
JPWO2013008305A1 (ja) | 2015-02-23 |
DE112011105441B4 (de) | 2019-11-14 |
US20140140577A1 (en) | 2014-05-22 |
JP5790762B2 (ja) | 2015-10-07 |
US9202106B2 (en) | 2015-12-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5790762B2 (ja) | 瞼検出装置 | |
JP6350145B2 (ja) | 顔向き検出装置及び車両用警告システム | |
JP5737400B2 (ja) | 赤目検出装置 | |
JP5184596B2 (ja) | 脇見判定装置 | |
EP3113073A1 (en) | Determination device, determination method, and non-transitory storage medium | |
JP5737401B2 (ja) | 瞼検出装置 | |
JP5737399B2 (ja) | 赤目判定装置 | |
US10664712B2 (en) | Eyelid opening/closing determination apparatus and drowsiness detection apparatus | |
JP2009294753A (ja) | 画像処理装置および画像処理方法 | |
WO2012140782A1 (ja) | 瞼検出装置、瞼検出方法及びプログラム | |
JP2009219555A (ja) | 眠気検知装置、運転支援装置、眠気検知方法 | |
JP4795281B2 (ja) | 車両の安全装置 | |
JP7240910B2 (ja) | 乗員観察装置 | |
JP4978574B2 (ja) | 眼検出装置 | |
JP2010262478A (ja) | 車両制御システム及び安全確認判定装置 | |
JP5035139B2 (ja) | 眼画像処理装置 | |
JP2009297321A (ja) | 視線方向認識エラー検出装置 | |
WO2020255238A1 (ja) | 情報処理装置、プログラム及び情報処理方法 | |
JP4623044B2 (ja) | 眼の開閉状態検出装置 | |
US20230394702A1 (en) | Device, method, and computer program for estimating seat position | |
JP2022161318A (ja) | 顔認識装置 | |
JP2022166702A (ja) | 顔認識装置 | |
JP2021152768A (ja) | 開眼度算出装置 | |
JP2009037560A (ja) | 顔画像処理装置 | |
JP2010009140A (ja) | 眼画像処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11869470 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013523729 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14131531 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120111054416 Country of ref document: DE Ref document number: 112011105441 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11869470 Country of ref document: EP Kind code of ref document: A1 |