JP2011138558A5 - - Google Patents
- Publication number
- JP2011138558A5 (application JP2011088669A)
- Authority
- JP
- Japan
- Prior art keywords
- eye
- image
- eye position
- candidate
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- detection method (claims: 20)
- pupil (claims: 4)
- corresponding (claims: 4)
- peripheral (claims: 3)
- eyebrows (claims: 2)
- layer (claims: 1)
- multiple layer (claims: 1)
Claims (10)
1. An eye position detection system comprising an acquisition unit for an original image containing a face image and an arithmetic unit, wherein the arithmetic unit has: a function of creating a grayscale scan-target image from the original image; a function of scanning a face-image-sized frame over the scan-target image and detecting frame regions whose peripheral portion is lower in lightness than their central portion; a function of setting each detected frame region as an eye-position detection target region; a function of creating, for each eye-position detection target region, multiple levels of grayscale eye-position detection images of varying lightness; a function of detecting the pixel-cluster regions that gradually appear in the eye-position detection images as the lightness changes stepwise from a faded-out, high-lightness grayscale image toward a low-lightness grayscale image; a function of selecting, from the detected pixel-cluster regions, those that appear in pairs as eye position candidates; and a function of identifying the eye position from the eye position candidates based on each candidate's appearance frequency across all of the eye-position detection images.
2. The eye position detection system according to claim 1, wherein, as the function of creating the scan-target image, the arithmetic unit has a function of forming an image faded in by 25-35% relative to the original image, where, for a 256-gradation original image, the 0% fade-in image has every pixel at value 255, and an x% fade-in image (where x > 0) keeps the original value for pixels whose original value is less than 2.55x and sets every pixel whose original value is 2.55x or more uniformly to 255.
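The fade-in transform described above can be sketched as follows. This is a minimal illustration, assuming the image is an 8-bit grayscale NumPy array; the helper name `fade_in` is ours, not from the patent:

```python
import numpy as np

def fade_in(gray: np.ndarray, x: float) -> np.ndarray:
    """Fade an 8-bit grayscale image in by x percent.

    Pixels darker than the threshold 2.55 * x keep their original
    value; all other pixels are set uniformly to 255, so at x = 0
    the result is an all-white (fully faded-out) image.
    """
    if x <= 0:
        return np.full_like(gray, 255)
    threshold = 2.55 * x
    return np.where(gray < threshold, gray, 255).astype(gray.dtype)

# The claim's scan-target image uses a fade-in of 25-35%, e.g.:
# scan_target = fade_in(original_gray, 30)
```

At a fade-in of 30%, only pixels darker than 76.5 survive, so dark features such as pupils and eyebrows remain while lighter skin areas are washed out to white.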
3. The eye position detection system according to claim 1 or 2, wherein, as the function of detecting frame regions whose peripheral portion is lower in lightness than their central portion within the face-image-sized frame, the arithmetic unit divides the frame into nine 3×3 regions and detects a frame region based on the following criteria (1) to (3):
(1) the proportion of pixels with values of 254 or less is higher in the eight surrounding regions than in the central region;
(2) the proportion of pixels with values of 254 or less is higher in the upper-central region than in the central region;
(3) the proportion of pixels with values of 254 or less is higher in the center-left region or the center-right region than in the central region.
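The 3×3 criteria above can be checked as in this sketch. It is our own illustration, not the patent's implementation: the grid indexing, the helper names, and the reading of criterion (1) as "each of the eight surrounding regions" (the claim leaves the aggregation ambiguous) are all assumptions:

```python
import numpy as np

def dark_ratio(region: np.ndarray) -> float:
    """Fraction of pixels with value 254 or less (i.e. not pure white)."""
    return float(np.mean(region <= 254))

def is_candidate_frame(frame: np.ndarray) -> bool:
    """Apply criteria (1)-(3) to a face-image-sized frame.

    The frame is split into a 3x3 grid; grid[1][1] is the central
    region, grid[0][1] the upper-central region, and grid[1][0] /
    grid[1][2] the center-left / center-right regions.
    """
    h, w = frame.shape
    rows = [0, h // 3, 2 * h // 3, h]
    cols = [0, w // 3, 2 * w // 3, w]
    grid = [[frame[rows[i]:rows[i + 1], cols[j]:cols[j + 1]]
             for j in range(3)] for i in range(3)]
    center = dark_ratio(grid[1][1])
    surrounding = [grid[i][j] for i in range(3) for j in range(3)
                   if (i, j) != (1, 1)]
    # (1) each of the eight surrounding regions is darker than the center
    c1 = all(dark_ratio(r) > center for r in surrounding)
    # (2) the upper-central region is darker than the center
    c2 = dark_ratio(grid[0][1]) > center
    # (3) the center-left or center-right region is darker than the center
    c3 = dark_ratio(grid[1][0]) > center or dark_ratio(grid[1][2]) > center
    return c1 and c2 and c3
```

On a faded-in scan-target image, a frame centered on a face tends to satisfy these criteria because hair, eyebrows, and facial contours darken the periphery while the nose and cheek area in the center stays light.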
4. The eye position detection system according to any one of claims 1 to 3, wherein the position of the eye position candidate with the second-highest count is identified as the eye position when all of the following conditions (a) to (c) are satisfied, or when condition (d) is satisfied:
(a) the second-ranked eye position candidate lies above the first-ranked eye position candidate;
(b) the center-to-center distance between the eyes of the second-ranked candidate is longer than that of the first-ranked candidate;
(c) the regions corresponding to the left and right pupils of the second-ranked candidate both lie outside the positions corresponding to the left and right pupils of the first-ranked candidate;
(d) the vertical distance between the first-ranked candidate and the second-ranked candidate is approximately the eye-to-eyebrow distance, and the first-ranked candidate lies above the second-ranked candidate.
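Conditions (a) to (d) above can be sketched as follows. The candidate representation, the image-coordinate convention (y grows downward), and the tolerance used to decide "approximately the eye-to-eyebrow distance" in (d) are our assumptions, not specified by the patent:

```python
from dataclasses import dataclass

@dataclass
class EyeCandidate:
    """A paired eye position candidate, in image coordinates
    (x grows rightward, y grows downward)."""
    left: tuple   # (x, y) center of the left pupil cluster
    right: tuple  # (x, y) center of the right pupil cluster
    count: int    # appearance frequency across the detection images

def center_y(c):
    return (c.left[1] + c.right[1]) / 2

def eye_distance(c):
    return c.right[0] - c.left[0]

def prefer_second(first: EyeCandidate, second: EyeCandidate,
                  eyebrow_gap: float, tol: float = 0.25) -> bool:
    """Decide whether the second-ranked candidate should be taken as
    the eye position: all of (a)-(c) hold, or (d) holds alone."""
    a = center_y(second) < center_y(first)          # (a) second is above first
    b = eye_distance(second) > eye_distance(first)  # (b) wider eye spacing
    c = (second.left[0] < first.left[0] and         # (c) both pupils outside
         second.right[0] > first.right[0])          #     the first pair
    gap = center_y(second) - center_y(first)        # (d) first is above second
    d = gap > 0 and abs(gap - eyebrow_gap) <= tol * eyebrow_gap
    return (a and b and c) or d
```

Conditions (a) to (c) catch the case where the top-ranked pair is actually the eyebrows (above, wider, and outside the true eyes); condition (d) catches the case where the top-ranked pair sits one eye-to-eyebrow distance above the runner-up.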
5. A method for detecting an eye position from an original image containing a face image, comprising:
A. creating a grayscale scan-target image from the original image, and scanning a face-image-sized frame over the scan-target image to detect, as eye-position detection target regions, regions whose peripheral portion is lower in lightness than their central portion within the frame; and
B. creating, for each eye-position detection target region, multiple levels of grayscale eye-position detection images of varying lightness; detecting the pixel-cluster regions that gradually appear in the eye-position detection images as the lightness changes stepwise from a faded-out, high-lightness grayscale image toward a low-lightness grayscale image; selecting, from the detected pixel-cluster regions, those that appear in pairs as eye position candidates; and identifying the eye position from the eye position candidates based on each candidate's appearance frequency across all of the eye-position detection images.
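The frequency-based ranking in step B can be sketched as follows, assuming cluster detection has already produced pair centers per fade level. The bucketing of nearby detections onto a coarse grid (so slightly shifted detections of the same pair count as one candidate) is our assumption; the patent only specifies counting appearances across all detection images:

```python
from collections import Counter

def rank_candidates(cluster_pairs_per_level, grid=4):
    """Count, for each paired cluster (eye position candidate), how many
    of the multi-level detection images it appears in, and rank the
    candidates by that appearance frequency.

    `cluster_pairs_per_level` is a list (one entry per fade level) of
    lists of ((xl, yl), (xr, yr)) pair centers.
    """
    counts = Counter()
    for pairs in cluster_pairs_per_level:
        seen = set()
        for (xl, yl), (xr, yr) in pairs:
            # Quantize coordinates so that the same pair detected at
            # slightly different positions maps to one candidate key.
            key = (round(xl / grid), round(yl / grid),
                   round(xr / grid), round(yr / grid))
            if key not in seen:  # count each candidate once per level
                seen.add(key)
                counts[key] += 1
    return counts.most_common()  # highest appearance frequency first
```

A stable pair such as the pupils tends to appear in many fade levels and accumulates a high count, while transient clusters (shadows, hair strands) appear in only a few levels and rank low.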
9. The method for detecting an eye position according to any one of claims 5 to 8, wherein, in step A, the face-image-sized frame used to scan the scan-target image is divided into nine 3×3 regions, and the eye-position detection target regions are detected based on the following criteria (1) to (3):
(1) the proportion of pixels with values of 254 or less is higher in the eight surrounding regions than in the central region;
(2) the proportion of pixels with values of 254 or less is higher in the upper-central region than in the central region;
(3) the proportion of pixels with values of 254 or less is higher in the center-left region or the center-right region than in the central region.
10. The method for detecting an eye position according to any one of claims 5 to 9, wherein the position of the eye position candidate with the second-highest count is identified as the eye position when all of the following conditions (a) to (c) are satisfied, or when condition (d) is satisfied:
(a) the second-ranked eye position candidate lies above the first-ranked eye position candidate;
(b) the center-to-center distance between the eyes of the second-ranked candidate is longer than that of the first-ranked candidate;
(c) the regions corresponding to the left and right pupils of the second-ranked candidate both lie outside the positions corresponding to the left and right pupils of the first-ranked candidate;
(d) the vertical distance between the first-ranked candidate and the second-ranked candidate is approximately the eye-to-eyebrow distance, and the first-ranked candidate lies above the second-ranked candidate.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011088669A JP5093540B2 (en) | 2011-04-12 | 2011-04-12 | Eye position detection method and detection system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011088669A JP5093540B2 (en) | 2011-04-12 | 2011-04-12 | Eye position detection method and detection system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2007176744A Division JP4831361B2 (en) | 2007-07-04 | 2007-07-04 | Eye position detection method and detection system |
Publications (3)
Publication Number | Publication Date |
---|---|
JP2011138558A JP2011138558A (en) | 2011-07-14 |
JP2011138558A5 true JP2011138558A5 (en) | 2012-01-19 |
JP5093540B2 JP5093540B2 (en) | 2012-12-12 |
Family
ID=44349834
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2011088669A Expired - Fee Related JP5093540B2 (en) | 2011-04-12 | 2011-04-12 | Eye position detection method and detection system |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP5093540B2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101614468B1 (en) | 2014-11-03 | 2016-04-21 | 백석대학교산학협력단 | Eye Detection and Its Opening and Closing State Recognition Method Using Block Contrast in Mobile Device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1313979C (en) * | 2002-05-03 | 2007-05-02 | 三星电子株式会社 | Apparatus and method for generating 3-D cartoon |
JP2005044330A (en) * | 2003-07-24 | 2005-02-17 | Univ Of California San Diego | Weak hypothesis generation device and method, learning device and method, detection device and method, expression learning device and method, expression recognition device and method, and robot device |
JP2005346474A (en) * | 2004-06-03 | 2005-12-15 | Canon Inc | Image processing method and image processor and program and storage medium |
JP2006004090A (en) * | 2004-06-16 | 2006-01-05 | Mitsubishi Electric Corp | Image normalization apparatus and image normalization program |
JP2007025900A (en) * | 2005-07-13 | 2007-02-01 | Canon Inc | Image processor and image processing method |
JP4799105B2 (en) * | 2005-09-26 | 2011-10-26 | キヤノン株式会社 | Information processing apparatus and control method therefor, computer program, and storage medium |
- 2011-04-12: JP application JP2011088669A granted as patent JP5093540B2 (status: not active, Expired - Fee Related)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101690297B1 (en) | Image converting device and three dimensional image display device including the same | |
US9584824B2 (en) | Method for motion vector estimation | |
EP2551796A3 (en) | Image processing device identifying attribute of region included in image | |
JP2015156607A (en) | Image processing method, image processing apparatus, and electronic device | |
JP2013101615A (en) | Method and system for describing image area on the basis of color histogram | |
JP2013500536A5 (en) | ||
JP2013030183A (en) | Environment recognition device, and program | |
JP2012221117A (en) | Image processing device and program | |
JP6002469B2 (en) | Image processing method and image processing system | |
JP2009157821A (en) | Range image generating device, environment recognition device, and program | |
JP2016526722A (en) | Method and apparatus for specific image detection | |
JP2018520531A5 (en) | ||
US20140314274A1 (en) | Method for optimizing size and position of a search window of a tracking system | |
CN106875371A (en) | Image interfusion method and image fusion device based on Bayer format | |
US20150110414A1 (en) | Image processing apparatus and method | |
JP4963297B2 (en) | Person counting device and person counting method | |
CN102693535A (en) | Method for detecting light bundling device area in DR image | |
EP3363193B1 (en) | Device and method for reducing the set of exposure times for high dynamic range video imaging | |
JP2011138558A5 (en) | ||
KR101373704B1 (en) | Video processing method using adaptive weighted prediction | |
US9818025B2 (en) | Discrimination container generation device and pattern detection device | |
WO2013161407A1 (en) | Object detection device and program | |
JP6708749B2 (en) | Image evaluation apparatus, image evaluation method, and image evaluation program | |
JP6591349B2 (en) | Motion detection system and motion detection method | |
CN106023191B (en) | A kind of optics delineation character edge extraction and edge fitting method based on structure feature |