WO2010113821A1 - Face feature point detection device and program - Google Patents
Face feature point detection device and program Download PDF Info
- Publication number
- WO2010113821A1 (PCT/JP2010/055454)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- parameter
- corner
- eyelid
- shape model
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
Definitions
- The present invention relates to a face feature point detection device and program, and more particularly to a face feature point detection device and program for detecting the position of the inner corner of the eye and the position of the outer corner of the eye as face feature points.
- In a known approach, a face included in the image to be processed is detected, the eyes constituting the face are detected using the face detection information, and the inner and outer corners of the eyes are detected using the eye detection information.
- There is also a face feature point detection method that detects the inner and outer eye corners using a learning discriminator generated in advance from a plurality of sample images (see, for example, Patent Document 1).
- However, although such a learning discriminator can judge how closely a candidate region resembles an inner or outer eye corner, its accuracy is insufficient for detecting the inner and outer eye corner positions as face feature points. Moreover, improving the detection accuracy requires generating and storing a high-definition learning discriminator, which increases both the computation time and the required memory.
- An object of the present invention is to provide a face feature point detection device and program that can accurately detect the positions of the inner and outer eye corners as face feature points even when these corners are hidden by noise.
- A face feature point detection device of the present invention includes an acquisition unit that acquires image data of an image including an eye region containing an inner eye corner and an outer eye corner, and a position detection unit that detects, by pattern matching, a first position of the inner eye corner and a first position of the outer eye corner from the image data acquired by the acquisition unit.
- It further includes a first calculation unit that considers a first eyelid shape model represented by an upper curve and a lower curve whose end points are the first positions of the inner and outer eye corners detected by the position detection unit,
- with the upper eyelid position defined on the upper curve as a first parameter and the lower eyelid position defined on the lower curve as a second parameter,
- and calculates, for each value of the first and second parameters, a first likelihood indicating the degree to which the first eyelid shape model matches the eyelid shape included in the image when those parameters are changed.
- It also includes a second calculation unit that takes the first eyelid shape model with the highest first likelihood calculated by the first calculation unit, treats the inner corner position as a third parameter and the outer corner position as a fourth parameter in a second eyelid shape model, and calculates, for each value of the third and fourth parameters, a second likelihood indicating the degree to which the second eyelid shape model matches the eyelid shape included in the image; a position determination unit then determines the positions indicated by the third and fourth parameters of the model with the highest second likelihood as the second positions of the inner and outer eye corners.
- The face feature point detection program of the present invention causes a computer to function as an acquisition means for acquiring image data of an image including an eye region containing an inner eye corner and an outer eye corner, and a position detection means for detecting, by pattern matching, the first position of the inner eye corner and the first position of the outer eye corner from the acquired image data.
- It further causes the computer to function as calculation means that fit a first eyelid shape model, represented by an upper curve and a lower curve whose end points are the detected first positions of the inner and outer eye corners,
- with the upper eyelid position defined on the upper curve as the first parameter and the lower eyelid position defined on the lower curve as the second parameter.
- The acquisition unit acquires image data of an image including an eye region containing the inner and outer eye corners, and the position detection unit detects, by pattern matching, the first position of the inner eye corner and the first position of the outer eye corner from the image data acquired by the acquisition unit.
- The first calculation unit considers the first eyelid shape model, which is represented by an upper curve and a lower curve whose end points are the first positions of the inner and outer eye corners detected by the position detection unit, and which takes the upper eyelid position defined on the upper curve as the first parameter and the lower eyelid position defined on the lower curve as the second parameter.
- While changing the first parameter and the second parameter,
- it calculates, for each value of the first and second parameters, a first likelihood indicating the degree of coincidence between the first eyelid shape model and the eyelid shape included in the image.
- The position indicated by the third parameter of the second eyelid shape model with the highest likelihood against the eyelid shape included in the image is determined as the second position of the inner eye corner, and the position indicated by the fourth parameter as the second position of the outer eye corner. Therefore, even when the inner or outer eye corners are hidden by noise such as reflections on glasses, their positions can be accurately detected as face feature points.
- The second calculation unit of the present invention can, when the matching degree of the pattern matching by the position detection unit is smaller than a predetermined threshold, calculate the second likelihood using only the portions of the eyelid shape included in the image near the upper and lower eyelid positions of the second eyelid shape model. When the matching degree is below the threshold, noise is likely to be present near the eye corners, so restricting the second-likelihood calculation to the vicinity of the upper and lower eyelid positions allows the likelihood to be calculated without the influence of that noise.
- When the matching degree of the pattern matching is smaller than a predetermined threshold, the position detection unit of the present invention can detect a plurality of inner-corner candidate points as the first position of the inner eye corner and a plurality of outer-corner candidate points as the first position of the outer eye corner.
- The first calculation unit then forms a first eyelid shape model whose end points are one candidate point selected from each set,
- and calculates the first likelihood for all combinations of inner-corner and outer-corner candidate points. This allows the first eyelid shape model to be set flexibly and parameters with a high likelihood to be determined.
- The second calculation unit of the present invention can also change the third and fourth parameters only within a predetermined range.
- When the eye opening is small, the inner and outer corner positions may otherwise be detected at positions shifted from the true corners owing to the influence of wrinkles and the like; changing the third and fourth parameters only within a predetermined range prevents the detected positions from deviating greatly.
- The face feature point detection device of the present invention may further include a control unit that, when at least one of the difference between the second position of the inner eye corner determined this time by the position determination unit and the first position of the inner eye corner, and the difference between the second position of the outer eye corner determined this time and the first position of the outer eye corner, is larger than a predetermined difference threshold, replaces the first positions of the inner and outer eye corners with the second positions determined this time
- and controls the device to determine the second position of the inner eye corner and the second position of the outer eye corner again. This improves detection accuracy.
- the face feature point detection program of the present invention is a program for causing a computer to function as each means constituting the face feature point detection apparatus of the present invention.
- As described above, according to the face feature point detection device and program of the present invention, the inner and outer eye corner positions are detected as the parameters of the eyelid shape model with the highest likelihood against the eyelid shape, so even when the inner or outer eye corners are hidden by noise, their positions can be accurately detected as face feature points.
- The face feature point detection apparatus 10 includes a camera 12 that captures an object to be imaged, a display device 14, and a computer 16.
- the display device 14 includes an LCD (Liquid Crystal Display) or a CRT (Cathode Ray Tube) that performs display based on input information.
- The computer 16 includes an I/O (input/output) port 16a, a ROM (Read Only Memory) 16b, an HDD (Hard Disk Drive) 16c, a CPU (Central Processing Unit) 16d, a RAM (Random Access Memory) 16e,
- and a bus 16f connecting the I/O port 16a, the ROM 16b, the HDD 16c, the CPU 16d, and the RAM 16e.
- The ROM 16b or the HDD 16c, serving as a storage medium, stores basic programs such as an OS, various programs such as the face feature point detection program for executing the processing routine of the face feature point detection process described later, and various data.
- The CPU 16d reads each program from the ROM 16b or the HDD 16c and executes it; various data are temporarily stored in the RAM 16e.
- the camera 12 and the display device 14 are connected to the I / O port 16a.
- this routine is executed by the CPU 16d of the computer 16 every predetermined time interval (for example, several tens of milliseconds) from the time when the switch (not shown) of the facial feature point detection apparatus 10 is turned on.
- step 100 the image data of the face image taken by the camera 12 is captured.
- In step 102, an inner/outer corner position detection process is executed to detect the first position of the inner eye corner and the first position of the outer eye corner from the image, based on the image data captured in step 100.
- In step 104, an upper/lower eyelid position detection process is executed to detect the upper eyelid position and the lower eyelid position based on the first positions of the inner and outer eye corners detected in step 102.
- In step 106, an inner/outer corner position determination process is executed to determine the second positions of the inner and outer eye corners based on the upper and lower eyelid positions detected in step 104.
- In step 108, the display device 14 is controlled to display the second positions determined in step 106 as the detection result of the inner and outer eye corner positions, and the process ends.
- In step 110, a rectangular face region 30 is detected from the image by a technique such as template matching, as shown in FIG. 4A, for example.
- an eye search range 32 is set for the face region 30 detected in step 110 as shown in FIG. 4B.
- The eye search range 32 consists of search ranges for the right eye and the left eye, set in the part of the face region 30 where the eyes are assumed to exist, according to the size of the detected face region 30 and the like.
- a rectangular eye region 34 is detected from the set eye search range 32 by a method such as template matching.
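The coarse-to-fine search above (face region, then eye search range, then eye region) rests on template matching. A minimal pure-Python sketch, using a sum-of-absolute-differences score as a stand-in for whatever similarity measure the patent assumes; function names and the toy data are illustrative:

```python
def match_score(image, template, top, left):
    """Sum of absolute differences at one offset (lower = better fit)."""
    th, tw = len(template), len(template[0])
    return sum(
        abs(image[top + r][left + c] - template[r][c])
        for r in range(th) for c in range(tw)
    )

def best_match(image, template):
    """Scan the template over the image; return (top, left) of the best fit."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    candidates = [
        (match_score(image, template, t, l), (t, l))
        for t in range(ih - th + 1) for l in range(iw - tw + 1)
    ]
    return min(candidates)[1]

# Toy 4x5 "luminance" image with an eye-like patch at row 1, column 1.
image = [
    [0, 0, 0, 0, 0],
    [0, 9, 8, 0, 0],
    [0, 7, 9, 0, 0],
    [0, 0, 0, 0, 0],
]
template = [[9, 8], [7, 9]]
print(best_match(image, template))  # (1, 1)
```

In practice the search would be restricted to the previously detected search range rather than the whole image, exactly as the text describes.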
- In step 114, an inner-corner search range 36 and an outer-corner search range 38 are set as shown in FIG.
- The inner-corner search range 36 can be set, for example, as the right-hand region and its surroundings when the eye region is divided into three in the horizontal direction. Note that the processing from this step onward is performed for each of the right eye and the left eye.
- In the following, the processing for the left eye is described; the processing for the right eye is the same, and its description is omitted.
- In step 116, as shown in FIG. 6, the inner-corner search range 36 set in step 114 is scanned with, for example, a rectangular search window 40 to detect the inner-corner region by template matching or the like,
- and an output value V_head indicating the degree of matching between the image in the window and the template is calculated.
- In step 118, it is determined whether the output value V_head calculated in step 116 is smaller than a predetermined threshold.
- The threshold is determined in advance as a value that can discriminate the presence or absence of noise. If V_head is smaller than the threshold, the routine proceeds to step 120, where the center of the inner-corner search range 36 is detected as the first position of the inner eye corner, as shown in FIG. On the other hand, if V_head is equal to or greater than the threshold, the routine proceeds to step 122, where the center of the detected inner-corner region (search window 40) is detected as the first position of the inner eye corner, as shown in FIG.
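The fallback logic of steps 118 to 122 can be sketched as follows; the function name, the tuple representation of positions, and the numeric values are illustrative assumptions:

```python
def first_corner_position(v_head, threshold, window_centre, range_centre):
    """Steps 118-122 sketch: a matching score below the noise threshold
    suggests glare or glasses near the corner, so fall back to the centre
    of the whole search range rather than trusting the matched window."""
    return window_centre if v_head >= threshold else range_centre

# Good match: trust the matched window centre.
print(first_corner_position(0.9, 0.5, (42, 30), (40, 28)))  # (42, 30)
# Poor match (likely noise): use the search-range centre instead.
print(first_corner_position(0.2, 0.5, (42, 30), (40, 28)))  # (40, 28)
```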
- The first position of the inner eye corner is specified by coordinates (x, y) corresponding to a pixel on the image; the same applies to the first position of the outer eye corner, the upper eyelid position, the lower eyelid position, and so on, described later.
- The output value V_tail is then calculated using the outer-corner search window 42 by the same processing as that used to detect the first position of the inner eye corner, the first position of the outer eye corner is detected, and the routine returns.
- step 104 of the processing routine of the face feature point detection process (FIG. 2) will be described with reference to FIG.
- In step 132, the first position of the inner eye corner detected in the corner position detection process is set as control point P3, the first position of the outer eye corner as control point P4, and a control point P1 corresponding to the upper eyelid position is set.
- Control points P3 and P4 are fixed.
- A Bezier curve determined by the control points P1, P3, and P4 is used as the upper eyelid shape model.
- As shown in FIG. 9B, the midpoint of the perpendicular drawn from the control point P1 to the line segment connecting the control points P3 and P4 is determined as the upper eyelid position candidate.
- In step 134, a fitting evaluation value λ is calculated by the following equation (1) while changing the control point P1:
- λ = (1/n) Σ_{i=1}^{n} (p_i · e_i) (1)
- where p_i is the normal vector at a point i on the Bezier curve, e_i is the luminance gradient vector of the image at the point i, and n is the number of points i on the Bezier curve.
- In other words, the likelihood that the Bezier curve matches the shape of the eyelid included in the image is calculated from the inner products of the normal vectors at the points i and the luminance gradient vectors of the image.
- The change range of the control point P1 may be, for example, the part of the eye search range 32 above the straight line passing through the control points P3 and P4.
- In step 136, it is determined whether the fitting evaluation value λ has been calculated with the control point P1 at every position in the change range. If not, the process returns to step 132, sets the control point P1 to the next position, and repeats the calculation of the fitting evaluation value λ. When the calculation has been completed for all positions, the process proceeds to step 138 and, as shown in FIG. 10, the upper eyelid position candidate determined by the control point P1 giving the maximum fitting evaluation value λ is detected as the upper eyelid position.
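A hedged sketch of the fitting evaluation: equation (1) is read here as the average inner product of the curve normal p_i and the image luminance gradient e_i over n sample points on a quadratic Bezier curve with endpoints P3, P4 and control point P1. The helper names, sampling scheme, and toy gradient field are assumptions of this sketch, not the patent's implementation:

```python
import math

def quad_bezier(p0, p1, p2, t):
    """Point on a quadratic Bezier with endpoints p0, p2 and control p1."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return x, y

def quad_bezier_normal(p0, p1, p2, t):
    """Unit normal to the curve: the tangent rotated by 90 degrees."""
    dx = 2 * (1 - t) * (p1[0] - p0[0]) + 2 * t * (p2[0] - p1[0])
    dy = 2 * (1 - t) * (p1[1] - p0[1]) + 2 * t * (p2[1] - p1[1])
    norm = math.hypot(dx, dy) or 1.0
    return -dy / norm, dx / norm

def fitting_evaluation(p3, p1, p4, gradient, n=20):
    """lambda = mean over n curve points of (normal . luminance gradient)."""
    total = 0.0
    for i in range(n):
        t = (i + 0.5) / n
        pt = quad_bezier(p3, p1, p4, t)
        nx, ny = quad_bezier_normal(p3, p1, p4, t)
        gx, gy = gradient(pt)          # e_i: image gradient at this point
        total += nx * gx + ny * gy     # p_i . e_i
    return total / n

# Toy field: a constant upward gradient, as along a dark-to-bright edge.
flat_gradient = lambda pt: (0.0, 1.0)
# A flat curve has normals (0, 1) everywhere, so lambda is exactly 1.
print(round(fitting_evaluation((0, 0), (5, 0), (10, 0), flat_gradient), 3))  # 1.0
```

The search in steps 134 to 138 then just evaluates this value for each candidate P1 and keeps the maximizer.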
- In steps 140 to 146, by processing similar to the upper eyelid position detection, the fitting evaluation value λ is calculated using a control point P2 corresponding to the lower eyelid position, the lower eyelid position candidate determined by the control point P2 giving the maximum fitting evaluation value λ is detected as the lower eyelid position, and the routine returns.
- Next, the inner/outer corner position determination process of step 106 of the face feature point detection processing routine (FIG. 2) is described.
- In step 150, the control point of the upper eyelid position detected in the upper/lower eyelid position detection process is set as P1,
- the control point of the lower eyelid position as P2,
- the first position of the outer eye corner detected in the corner position detection process as control point P4,
- and a candidate for the second position of the inner eye corner as control point P3.
- The upper eyelid position, the lower eyelid position, and P4 are fixed.
- A Bezier curve determined by the control points P1, P3, and P4 and a Bezier curve determined by the control points P2, P3, and P4 are used as the upper and lower eyelid shape models.
- In step 152, whether noise may exist near the inner eye corner is determined by checking whether the output value V_head calculated in step 116 of the corner position detection process (FIG. 3) is greater than or equal to the threshold. If it is, the routine proceeds to step 154, and the fitting evaluation value λ is calculated by equation (1) using the points i on the Bezier curves from the control point P3 to the control point P1 and from the control point P3 to the control point P2. Otherwise, the routine proceeds to step 156, and the fitting evaluation value λ is calculated using only the points i on the Bezier curves in the vicinity of the upper and lower eyelid positions, as shown in FIG. 12B.
- The vicinity of the upper eyelid position can be, for example, the 1/2 to 1/3 of the curve between the control point P3 and the upper eyelid position that lies closest to the upper eyelid position. The same applies to the vicinity of the lower eyelid position.
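The vicinity restriction can be sketched as a filter on the curve-sampling parameter t; treating the eyelid apex as t = 0.5 and making the kept fraction a tunable value are assumptions of this sketch:

```python
def vicinity_samples(n, fraction):
    """Keep only the curve samples near the eyelid apex (t = 0.5).

    When the matching score suggests noise near a corner, the fit is
    evaluated only on this central part of the curve, skipping the ends
    where glasses glare is most likely. `fraction` is the total share of
    the parameter range retained around the apex.
    """
    ts = [(i + 0.5) / n for i in range(n)]
    return [t for t in ts if abs(t - 0.5) <= fraction / 2]

# With 10 samples and 40% of the span kept, 4 central samples survive.
print(len(vicinity_samples(10, 0.4)))  # 4
```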
- In step 158, it is determined whether the fitting evaluation value λ has been calculated with the control point P3 at every position in the change range.
- The change range can be, for example, the inner-corner search range 36. If the calculation is not finished, the process returns to step 150, sets the control point P3 to the next position, and repeats the calculation of the fitting evaluation value λ. When the calculation has been completed for all positions, the process proceeds to step 160, and the position of the control point P3 giving the maximum fitting evaluation value λ is determined as the second position of the inner eye corner.
- By processing similar to that used to determine the second position of the inner eye corner, the control point of the upper eyelid position detected in the upper/lower eyelid position detection process is set as P1, the control point of the lower eyelid position as P2,
- the second position of the inner eye corner determined in step 160 as control point P3,
- and a candidate for the second position of the outer eye corner as control point P4;
- the upper eyelid position, the lower eyelid position, and P3 are fixed, and the second position of the outer eye corner is determined.
- As described above, the face feature point detection apparatus of the first embodiment fits the first eyelid shape model using the detected first positions of the inner and outer eye corners, then fits the second eyelid shape model using the upper and lower eyelid positions that maximize the fitting evaluation value, and determines the second positions of the inner and outer eye corners.
- Therefore, even when the inner or outer eye corners are hidden by noise such as reflections on glasses, their positions can be accurately detected as face feature points.
- In step 116, the inner-corner search range 36 is scanned with the search window to detect the inner-corner region by a method such as template matching,
- and an output value V_head indicating the degree of matching between the image in the region and the template is calculated.
- In step 118, it is determined whether V_head is smaller than the threshold. If it is, the routine proceeds to step 200, and a plurality of inner-corner position candidates are set within the inner-corner search range 36, as shown in FIG. Similarly, when the output value V_tail is smaller than the threshold, a plurality of outer-corner position candidates are set in the outer-corner search range 38 in step 202.
- If a plurality of inner-corner or outer-corner position candidates were set in the corner position detection process, one of each is selected in step 210. Then, in step 132, the selected inner-corner position candidate is fixed as the control point P3
- and the selected outer-corner position candidate as the control point P4, and in step 134 the fitting evaluation value λ is calculated.
- If it is determined in step 136 that the fitting has been completed at all positions within the change range, the process proceeds to step 212, where it is determined whether all combinations of inner-corner and outer-corner position candidates have been selected and processed. If not, the process returns to step 210 and the next inner-corner or outer-corner position candidate is selected. When all combinations have been processed, the upper eyelid position candidate determined by the control point P1 giving the maximum fitting evaluation value λ, over all combinations of one inner-corner candidate and one outer-corner candidate, is detected as the upper eyelid position in step 138.
- In step 214, candidates are likewise selected one by one from the inner-corner and outer-corner position candidates, and in step 216 it is determined whether all combinations have been selected and the processing is complete.
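The exhaustive pairing of candidates in steps 210 to 216 amounts to a maximization over the Cartesian product of the two candidate sets; `fit_score` below stands in for the fitting evaluation, and the coordinates and toy score are illustrative:

```python
from itertools import product

def best_corner_pair(inner_candidates, outer_candidates, fit_score):
    """Try every pairing of inner- and outer-corner candidates and keep
    the pair whose eyelid fit scores highest (second-embodiment sketch)."""
    return max(
        product(inner_candidates, outer_candidates),
        key=lambda pair: fit_score(*pair),
    )

inner = [(10, 20), (11, 22)]
outer = [(40, 24), (41, 19)]
# Toy score: prefer pairs whose y-coordinates are roughly level.
score = lambda a, b: -abs(a[1] - b[1])
print(best_corner_pair(inner, outer, score))  # ((10, 20), (41, 19))
```

With a few candidates per corner the quadratic cost of the full product is negligible, which matches the text's choice of evaluating all combinations.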
- As described above, in the face feature point detection apparatus of the second embodiment, when noise is present near the inner or outer eye corners during the corner position detection process, a plurality of candidate points is used for the first positions.
- This improves the detection accuracy of the upper and lower eyelid positions,
- and also improves the determination accuracy of the second positions of the inner and outer eye corners determined using those eyelid positions.
- The third embodiment differs from the first embodiment in that the change range of the control points for determining the second positions of the inner and outer eye corners is limited according to the degree of eye opening.
- The same parts as those of the first embodiment are denoted by the same reference numerals, and their description is omitted.
- In step 300, it is determined whether the degree of eye opening, represented by the distance between the upper and lower eyelid positions detected in the upper/lower eyelid position detection process, is equal to or smaller than an opening threshold.
- The opening threshold is set in advance to a value that can identify an eye opening small enough to make the inner and outer corner positions difficult to discriminate owing to the influence described above. If the eye opening is equal to or smaller than the opening threshold, the process proceeds to step 302; if it is larger, the process proceeds to step 150.
- In step 302, the change ranges of the control points P3 and P4 are limited.
- For example, the second positions of the inner and outer eye corners determined while the eye opening was equal to or greater than the opening threshold are stored in a predetermined storage area, and the change range can be limited to positions within a predetermined distance of those stored positions.
- For example, the change range can be limited to a distance corresponding to 20 pixels.
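The range limitation of step 302 can be sketched as clamping a candidate to within a fixed pixel distance of the last reliably determined position. The 20-pixel default mirrors the example in the text; clamping to the boundary rather than rejecting out-of-range candidates is an assumption of this sketch:

```python
def limit_change_range(candidate, anchor, max_dist=20):
    """Keep a corner candidate within max_dist pixels of the stored
    position so that wrinkles near a nearly closed eye cannot drag the
    fit far from the true corner (third-embodiment sketch)."""
    dx = candidate[0] - anchor[0]
    dy = candidate[1] - anchor[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= max_dist:
        return candidate
    # Project the candidate onto the boundary of the allowed disc.
    return (anchor[0] + dx * max_dist / dist,
            anchor[1] + dy * max_dist / dist)

print(limit_change_range((30, 0), (0, 0)))  # (20.0, 0.0)
print(limit_change_range((5, 5), (0, 0)))   # (5, 5)
```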
- In steps 158 and 170, by processing similar to the corner position determination process of the first embodiment, it is determined whether the calculation of the fitting evaluation value λ has been completed while changing the control point P3 or P4 over all positions within the change range limited in step 302.
- As described above, the face feature point detection device of the third embodiment prevents the positions of the inner and outer eye corners from being detected far from the true corners owing to the influence of wrinkles occurring near the eye corners, even when the degree of eye opening is small.
- The fourth embodiment differs from the first embodiment in that the upper/lower eyelid position detection process and the inner/outer corner position determination process are repeatedly executed.
- The same parts as those of the first embodiment are denoted by the same reference numerals, and their description is omitted.
- step 100 image data of a face image is captured.
- step 102 an eye-eye corner position detection process is performed to detect the first position of the eye and the first position of the eye corner.
- In step 104, the upper/lower eyelid position detection process for detecting the upper and lower eyelid positions is executed, and then, in step 106, the inner/outer corner position determination process for determining the second positions of the inner and outer eye corners is executed.
- In step 400, it is determined whether the difference between the second position of the inner eye corner determined in step 106 and its first position, and the difference between the second position of the outer eye corner and its first position, are within a predetermined range. If they are, the process proceeds to step 108 and the detection result is displayed; otherwise, the process proceeds to step 402.
- In step 402, the first position of the inner eye corner and the first position of the outer eye corner are replaced with the second positions determined in step 106, and the process returns to step 104.
- In step 104, the upper eyelid position and the lower eyelid position are detected again using the first positions of the inner and outer eye corners replaced in step 402.
- In step 106, the second position of the inner eye corner and the second position of the outer eye corner are determined again.
- As described above, according to the face feature point detection apparatus of the fourth embodiment, the upper and lower eyelid positions are detected again and the second positions of the inner and outer eye corners are determined again, repeating until the difference from the first positions falls within the predetermined range;
- the first positions detected by the corner position detection process are thus refined, and the second positions of the inner and outer eye corners can be determined with high accuracy.
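The feedback loop of the fourth embodiment can be sketched as iterating until the positions stop moving by more than the difference threshold; the callables, the iteration cap, and the flat (x_in, y_in, x_out, y_out) tuple layout are assumptions of this sketch:

```python
def refine_positions(first_pos, detect_eyelids, determine_corners,
                     diff_threshold, max_iters=10):
    """Fourth-embodiment sketch (steps 104-402): refit the eyelids and
    corners, feeding each result back as the new starting positions,
    until the update is no larger than diff_threshold."""
    current = first_pos
    for _ in range(max_iters):
        eyelids = detect_eyelids(current)      # step 104 stand-in
        new = determine_corners(eyelids)       # step 106 stand-in
        if all(abs(a - b) <= diff_threshold for a, b in zip(new, current)):
            return new                         # converged (step 400 passes)
        current = new                          # step 402: replace and retry
    return current

# Toy stand-ins: eyelid detection is the identity, and each corner
# determination moves halfway toward a fixed "true" answer.
target = (10.0, 10.0, 50.0, 10.0)
halfway = lambda eyelids: tuple((a + t) / 2 for a, t in zip(eyelids, target))
print(refine_positions((0.0, 0.0, 40.0, 10.0), lambda p: p, halfway, 0.5))
# (9.6875, 9.6875, 49.6875, 10.0)
```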
- Note that although the above embodiments use Bezier curves as the eyelid shape models, the present invention is not limited to this.
- For example, a combined window 50 in which three windows 50a, 50b, and 50c are joined may be used as the eyelid shape model.
- In that case, the intensity of the vertical edges appearing in the left and right windows 50a and 50c and the intensity of the horizontal edges appearing in the center window 50b are detected,
- and the fitting can be performed using the sum of these intensities.
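A sketch of this window-based alternative, using simple finite differences as a stand-in for the edge operator; the window layout, the (top, left, height, width) window encoding, and all names are illustrative:

```python
def vertical_edge_strength(img, top, left, h, w):
    """Sum of |d/dx| inside a window: responds to vertical edges."""
    return sum(
        abs(img[r][c + 1] - img[r][c])
        for r in range(top, top + h) for c in range(left, left + w - 1)
    )

def horizontal_edge_strength(img, top, left, h, w):
    """Sum of |d/dy| inside a window: responds to horizontal edges."""
    return sum(
        abs(img[r + 1][c] - img[r][c])
        for r in range(top, top + h - 1) for c in range(left, left + w)
    )

def three_window_fit(img, left_win, centre_win, right_win):
    """Combined response of the three-window model: vertical edges in the
    side windows (50a, 50c) plus horizontal edges in the centre (50b)."""
    return (vertical_edge_strength(img, *left_win)
            + horizontal_edge_strength(img, *centre_win)
            + vertical_edge_strength(img, *right_win))

# Toy image with a strong horizontal edge between rows 1 and 2.
img = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [9, 9, 9, 9, 9, 9],
    [9, 9, 9, 9, 9, 9],
]
print(three_window_fit(img, (1, 0, 2, 2), (1, 2, 2, 2), (1, 4, 2, 2)))  # 18
```

Sliding the three-window arrangement over candidate positions and keeping the maximizer would play the same role as the Bezier fitting evaluation in the main embodiments.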
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
Description
12 Camera
14 Display device
16 Computer
16d CPU
Claims (7)
- Acquisition means for acquiring image data of an image including an eye region having an inner eye corner and an outer eye corner;
position detection means for detecting, by pattern matching, a first position of the inner eye corner and a first position of the outer eye corner from the image data acquired by the acquisition means;
first calculation means for calculating, for each value of a first parameter and a second parameter, a first likelihood indicating the degree to which a first eyelid shape model, represented by an upper curve and a lower curve whose end points are the first positions of the inner and outer eye corners detected by the position detection means, and having an upper eyelid position defined on the upper curve as the first parameter and a lower eyelid position defined on the lower curve as the second parameter, matches the eyelid shape included in the image when the first and second parameters are changed;
second calculation means for calculating, for each value of a third parameter and a fourth parameter, a second likelihood indicating the degree to which a second eyelid shape model, being the first eyelid shape model with the highest first likelihood calculated by the first calculation means and having the inner corner position as the third parameter and the outer corner position as the fourth parameter, matches the eyelid shape included in the image when the third and fourth parameters are changed;
and position determination means for determining the position indicated by the third parameter of the second eyelid shape model with the highest second likelihood calculated by the second calculation means as a second position of the inner eye corner, and the position indicated by the fourth parameter as a second position of the outer eye corner;
- A face feature point detection device comprising the above means. - The face feature point detection device according to claim 1, wherein the second calculation means calculates the second likelihood using the portions of the eyelid shape included in the image in the vicinity of the upper eyelid position and the lower eyelid position of the second eyelid shape model when the matching degree of the pattern matching by the position detection means is smaller than a predetermined threshold.
- The face feature point detection device according to claim 1 or 2, wherein the position detection means detects a plurality of inner-corner candidate points as the first position of the inner eye corner and a plurality of outer-corner candidate points as the first position of the outer eye corner when the matching degree of the pattern matching is smaller than a predetermined threshold, and
the first calculation means calculates the first likelihood for every combination of an inner-corner candidate point and an outer-corner candidate point in a first eyelid shape model whose end points are one inner-corner candidate point selected from the plurality of inner-corner candidate points and one outer-corner candidate point selected from the plurality of outer-corner candidate points.
- The face feature point detection device according to any one of claims 1 to 3, wherein the second calculation means changes the third parameter and the fourth parameter within a predetermined range when the distance between the upper eyelid position and the lower eyelid position of the second eyelid shape model is smaller than a predetermined opening threshold.
- The face feature point detection device according to any one of claims 1 to 4, comprising control means for, when at least one of the difference between the second position of the inner eye corner determined this time by the position determination means and the first position of the inner eye corner, and the difference between the second position of the outer eye corner determined this time and the first position of the outer eye corner, is larger than a predetermined difference threshold, replacing the first positions of the inner and outer eye corners with the second positions determined this time and controlling the device to determine the second positions of the inner and outer eye corners again.
- A face feature point detection program for causing a computer to function as:
acquisition means for acquiring image data of an image including an eye region having an inner eye corner and an outer eye corner;
position detection means for detecting, by pattern matching, a first position of the inner eye corner and a first position of the outer eye corner from the image data acquired by the acquisition means;
first calculation means for calculating, for each value of a first parameter and a second parameter, a first likelihood indicating the degree to which a first eyelid shape model, represented by an upper curve and a lower curve whose end points are the detected first positions of the inner and outer eye corners, and having an upper eyelid position defined on the upper curve as the first parameter and a lower eyelid position defined on the lower curve as the second parameter, matches the eyelid shape included in the image when the first and second parameters are changed;
second calculation means for calculating, for each value of a third parameter and a fourth parameter, a second likelihood indicating the degree to which a second eyelid shape model, being the first eyelid shape model with the highest first likelihood and having the inner corner position as the third parameter and the outer corner position as the fourth parameter, matches the eyelid shape included in the image when the third and fourth parameters are changed; and
position determination means for determining the position indicated by the third parameter of the second eyelid shape model with the highest second likelihood as a second position of the inner eye corner and the position indicated by the fourth parameter as a second position of the outer eye corner. - A face feature point detection program for causing a computer to function as each means constituting the face feature point detection device according to any one of claims 1 to 5.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020117022723A KR101267205B1 (ko) | 2009-04-02 | 2010-03-26 | Face feature point detection device and program |
EP10758593.7A EP2416294B1 (en) | 2009-04-02 | 2010-03-26 | Face feature point detection device and program |
CN2010800132867A CN102362291B (zh) | 2009-04-02 | 2010-03-26 | Face feature point detection device and method |
US13/259,065 US8331630B2 (en) | 2009-04-02 | 2010-03-26 | Face feature point detection device and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-090065 | 2009-04-02 | ||
JP2009090065A JP5221436B2 (ja) | 2009-04-02 | 2009-04-02 | Face feature point detection device and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010113821A1 true WO2010113821A1 (ja) | 2010-10-07 |
Family
ID=42828112
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/055454 WO2010113821A1 (ja) | 2009-04-02 | 2010-03-26 | Face feature point detection device and program |
Country Status (6)
Country | Link |
---|---|
US (1) | US8331630B2 (ja) |
EP (1) | EP2416294B1 (ja) |
JP (1) | JP5221436B2 (ja) |
KR (1) | KR101267205B1 (ja) |
CN (1) | CN102362291B (ja) |
WO (1) | WO2010113821A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101141901B1 (ko) | 2011-12-13 | 2012-07-12 | Korea Institute of Oriental Medicine | Upper eyelid shape detection device and method |
KR101164769B1 (ko) | 2011-12-13 | 2012-07-12 | Korea Institute of Oriental Medicine | Eye feature point detection device and method |
EP2701122A4 (en) * | 2011-04-19 | 2016-03-16 | Aisin Seiki | EYE-LIGHT DETECTION DEVICE, EYE-LIGHT DETECTION METHOD AND PROGRAM |
EP2698762B1 (en) * | 2011-04-15 | 2017-01-04 | Aisin Seiki Kabushiki Kaisha | Eyelid-detection device, eyelid-detection method, and program |
CN111582270A (zh) * | 2020-04-24 | 2020-08-25 | Harbin Institute of Technology | High-precision recognition and tracking method for visual target feature points in bridge regions |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8824808B2 (en) * | 2011-08-19 | 2014-09-02 | Adobe Systems Incorporated | Methods and apparatus for automated facial feature localization |
JP5802524B2 (ja) * | 2011-11-21 | 2015-10-28 | PFU Limited | Image processing device, image processing method, and image processing program |
CN103208111B (zh) * | 2012-01-17 | 2015-10-07 | Fujitsu Limited | Method and device for correcting image corner points, and image processing apparatus |
US20130243270A1 (en) * | 2012-03-16 | 2013-09-19 | Gila Kamhi | System and method for dynamic adaption of media based on implicit user input and behavior |
WO2014016894A1 (ja) * | 2012-07-23 | 2014-01-30 | Fujitsu Limited | Shape data generation method and device |
KR102012254B1 (ko) * | 2013-04-23 | 2019-08-21 | Electronics and Telecommunications Research Institute | Method and device for tracking a user's gaze point using a mobile terminal |
US9122914B2 (en) * | 2013-05-09 | 2015-09-01 | Tencent Technology (Shenzhen) Co., Ltd. | Systems and methods for matching face shapes |
JP6227996B2 (ja) * | 2013-12-18 | 2017-11-08 | Hamamatsu Photonics K.K. | Measurement device and measurement method |
CN105279764B (zh) | 2014-05-27 | 2020-09-11 | Beijing Samsung Telecom R&D Center | Eye image processing device and method |
US10089525B1 (en) | 2014-12-31 | 2018-10-02 | Morphotrust Usa, Llc | Differentiating left and right eye images |
US9846807B1 (en) | 2014-12-31 | 2017-12-19 | Morphotrust Usa, Llc | Detecting eye corners |
US9710707B1 (en) | 2014-12-31 | 2017-07-18 | Morphotrust Usa, Llc | Detecting iris orientation |
US9747508B2 (en) * | 2015-07-24 | 2017-08-29 | Honda Motor Co., Ltd. | Surrounding environment recognition device |
CN112836664A (zh) * | 2015-08-21 | 2021-05-25 | Magic Leap, Inc. | Eyelid shape estimation using eye pose measurement |
CA3170014A1 (en) | 2015-10-16 | 2017-04-20 | Magic Leap, Inc. | Eye pose identification using eye features |
JP2019082743A (ja) * | 2016-03-18 | 2019-05-30 | Mitsubishi Electric Corporation | Information processing device and information processing method |
US10082866B2 (en) | 2016-04-12 | 2018-09-25 | International Business Machines Corporation | Gaze point detection using dynamic facial reference points under varying lighting conditions |
CN106203262A (zh) * | 2016-06-27 | 2016-12-07 | Liaoning Technical University | An eye type classification method based on eyelid curve similarity and eye type index |
JP6946831B2 (ja) * | 2017-08-01 | 2021-10-13 | Omron Corporation | Information processing device and estimation method for estimating a person's gaze direction, and learning device and learning method |
JP6698966B2 (ja) * | 2018-02-13 | 2020-05-27 | Mitsubishi Electric Corporation | False detection determination device and false detection determination method |
CN110634174B (zh) * | 2018-06-05 | 2023-10-10 | UBTECH Robotics Corp | Expression animation transition method and system, and intelligent terminal |
CN110956067B (zh) * | 2019-05-26 | 2022-05-17 | Momenta (Suzhou) Technology Co., Ltd. | Method and device for constructing a human eyelid curve |
CN113221599B (zh) * | 2020-01-21 | 2022-06-10 | Momenta (Suzhou) Technology Co., Ltd. | Method and device for constructing an eyelid curve |
SE2250299A1 (en) * | 2022-03-04 | 2023-09-05 | Tobii Ab | Eye openness |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007094906A | 2005-09-29 | 2007-04-12 | Toshiba Corp | Feature point detection device and method |
JP2007213377A | 2006-02-10 | 2007-08-23 | Fujifilm Corp | Face feature point detection method, device, and program |
JP2007265367A * | 2006-03-30 | 2007-10-11 | Fujifilm Corp | Gaze detection method, device, and program |
JP2009003644A * | 2007-06-20 | 2009-01-08 | Toyota Motor Corp | Eye opening degree determination device |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AUPQ896000A0 (en) * | 2000-07-24 | 2000-08-17 | Seeing Machines Pty Ltd | Facial image processing system |
CN100382751C (zh) * | 2005-05-08 | 2008-04-23 | Shanghai Jiao Tong University | Eye corner and pupil localization method based on VPF and improved SUSAN |
FR2907569B1 (fr) * | 2006-10-24 | 2009-05-29 | Jean Marc Robin | Method and device for virtual simulation of a video image sequence |
FR2911984B1 (fr) * | 2007-01-30 | 2009-02-27 | Siemens Vdo Automotive Sas | Method for identifying symbolic points on an image of a person's face |
JP4309928B2 (ja) * | 2007-03-15 | 2009-08-05 | Aisin Seiki Co., Ltd. | Eyelid detection device, eyelid detection method, and program |
CN100561503C (zh) * | 2007-12-28 | 2009-11-18 | Vimicro Corporation | Method and device for locating and tracking the eye corners and mouth corners of a human face |
JP2010033305A (ja) * | 2008-07-29 | 2010-02-12 | Hitachi Ltd | Image information processing method and device |
US8345922B2 (en) * | 2008-09-03 | 2013-01-01 | Denso Corporation | Apparatus for detecting a pupil, program for the same, and method for detecting a pupil |
FR2920938B1 (fr) * | 2008-09-30 | 2010-01-29 | Jean Marc Robin | Method and device for virtual simulation of an image |
JP4788786B2 (ja) * | 2009-02-09 | 2011-10-05 | Denso Corporation | Drowsiness detection device, program, and drowsiness detection method |
- 2009
  - 2009-04-02: JP JP2009090065A patent/JP5221436B2/ja active Active
- 2010
  - 2010-03-26: US US13/259,065 patent/US8331630B2/en active Active
  - 2010-03-26: KR KR1020117022723A patent/KR101267205B1/ko active IP Right Grant
  - 2010-03-26: WO PCT/JP2010/055454 patent/WO2010113821A1/ja active Application Filing
  - 2010-03-26: CN CN2010800132867A patent/CN102362291B/zh active Active
  - 2010-03-26: EP EP10758593.7A patent/EP2416294B1/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2416294A4 |
Also Published As
Publication number | Publication date |
---|---|
US8331630B2 (en) | 2012-12-11 |
EP2416294A1 (en) | 2012-02-08 |
CN102362291A (zh) | 2012-02-22 |
JP5221436B2 (ja) | 2013-06-26 |
US20120014610A1 (en) | 2012-01-19 |
JP2010244178A (ja) | 2010-10-28 |
CN102362291B (zh) | 2013-09-04 |
KR101267205B1 (ko) | 2013-05-24 |
EP2416294A4 (en) | 2012-04-11 |
EP2416294B1 (en) | 2013-05-29 |
KR20120006493A (ko) | 2012-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5221436B2 (ja) | Face feature point detection device and program | |
US11775056B2 (en) | System and method using machine learning for iris tracking, measurement, and simulation | |
CN109426835B (zh) | Information processing device, control method of information processing device, and storage medium | |
JP4895847B2 (ja) | Eyelid detection device and program | |
EP3767520B1 (en) | Method, device, equipment and medium for locating center of target object region | |
CN106598221A (zh) | 3D gaze direction estimation method based on eye key point detection | |
JP4680161B2 (ja) | Image evaluation device, method, and program | |
JP2007042072A (ja) | Tracking device | |
US9501689B2 (en) | Image processing apparatus and image processing method | |
JP2009240519A (ja) | Eye open/closed state determination device and program | |
JP4881199B2 (ja) | Image evaluation device, method, and program | |
JP2000137792A (ja) | Eye detection device | |
CN116051631A (zh) | Light spot annotation method and system | |
CN112560584A (zh) | Face detection method and device, storage medium, and terminal | |
CN111639582A (zh) | Liveness detection method and device | |
JP2007025902A (ja) | Image processing device and image processing method | |
JP2012068948A (ja) | Face attribute estimation device and method | |
JP5201184B2 (ja) | Image processing device and program | |
JP5035139B2 (ja) | Eye image processing device | |
KR101276792B1 (ko) | Eye detection device and method | |
WO2022242490A1 (en) | Method for detecting and removing personnel interference while measuring volume of an object | |
TW202203079A (zh) | Facial image positioning method for a smart mirror | |
JP2005215899(ja) | Object detection device and method | |
CN115620067A (zh) | Method, device, apparatus, and storage medium for determining COVID-19 antigen test results | |
JP2006092151A (ja) | Region detection device, region detection program, and region detection method |
Legal Events
Code | Title | Description
---|---|---
WWE | WIPO information: entry into national phase | Ref document number: 201080013286.7; Country of ref document: CN
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 10758593; Country of ref document: EP; Kind code of ref document: A1
WWE | WIPO information: entry into national phase | Ref document number: 13259065; Country of ref document: US
WWE | WIPO information: entry into national phase | Ref document number: 2010758593; Country of ref document: EP
ENP | Entry into the national phase | Ref document number: 20117022723; Country of ref document: KR; Kind code of ref document: A
NENP | Non-entry into the national phase | Ref country code: DE