CN102184401B - Facial feature extraction method - Google Patents

Facial feature extraction method

Info

Publication number
CN102184401B
CN102184401B
Authority
CN
China
Prior art keywords
integral projection
image
horizontal
face
vertical
Legal status
Expired - Fee Related
Application number
CN201110110982.6A
Other languages
Chinese (zh)
Other versions
CN102184401A (en)
Inventor
陈国庆
衡骏
Current Assignee
SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Original Assignee
SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Application filed by SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Priority to CN201110110982.6A
Publication of CN102184401A
Application granted
Publication of CN102184401B

Abstract

The invention discloses a facial feature extraction method, which comprises the following steps of: (1) splitting a facial image into its H, S and V (hue, saturation, value) components in the HSV color space and taking out the V component; (2) filtering the V-component image; (3) eroding and dilating the obtained image; (4) binarizing the obtained image to obtain a black-and-white image; (5) performing integral projection on the black-and-white image in the horizontal and vertical directions respectively, to obtain horizontal and vertical integral projection curves; (6) analyzing the obtained integral projection curves to determine feature values of the primary facial organs; and (7) locating the correct positions of the primary organs on the face according to the obtained feature values. The extraction method improves the accuracy of facial feature recognition.

Description

Facial feature extraction method
Technical field
The invention belongs to the field of image processing, and in particular relates to a facial feature extraction method.
Background technology
The facial feature extraction methods in common use today fall mainly into the following three classes:
1. Geometric feature extraction: using the structural features and prior knowledge of the human face, salient features of facial expression, shape and position change, such as the eyes, nose, eyebrows and mouth, are located and measured; their sizes, distances, shapes and mutual proportions are determined and used for recognition.
2. Statistical feature extraction: in contrast to geometric features, statistical features are based on the overall gray-level characteristics of the image and emphasize retaining as much of the information in the original facial image as possible. They are obtained by training on a large number of samples; the basic idea is to map facial images into a feature space, reduce the dimensionality of the image data, and then classify. Methods that extract statistical features are therefore essentially "subspace analysis" methods.
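By way of illustration only (this is not part of the patented method), a minimal eigenfaces-style sketch of such a subspace analysis in Python follows; it assumes a matrix of flattened grayscale face images and uses scikit-learn's PCA.

```python
import numpy as np
from sklearn.decomposition import PCA

def fit_face_subspace(face_rows, n_components=50):
    """Learn a low-dimensional subspace (eigenfaces) from flattened face images.

    face_rows: array of shape (n_samples, n_pixels), one flattened image per row.
    Requires n_samples >= n_components.  Projecting a face with pca.transform()
    yields the statistical feature vector that is then classified.
    """
    pca = PCA(n_components=n_components, whiten=True)
    pca.fit(face_rows)
    return pca

# Hypothetical usage (file name and data are assumptions):
# faces = np.load("training_faces.npy")             # (n_samples, n_pixels)
# pca = fit_face_subspace(faces)
# features = pca.transform(new_face.reshape(1, -1))
```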
3. Frequency-domain feature extraction: the Gabor wavelet kernel has the same characteristics as the two-dimensional receptive fields of simple cells in the human visual cortex, namely the ability to capture local structural information corresponding to spatial frequency, spatial position and orientation selectivity. Matching on the coefficient magnitudes of the two-dimensional Gabor wavelet transform therefore has a sound visual and biological basis and is widely used in face recognition and image processing.
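Again purely as an illustration of this class of methods, a minimal sketch of computing Gabor responses with OpenCV; the kernel parameters are assumptions chosen for readability, not values from the patent.

```python
import cv2
import numpy as np

def gabor_magnitude_features(gray, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Filter a grayscale face image with a small bank of Gabor kernels and
    return the absolute responses, one per orientation.  A full implementation
    would combine real and imaginary parts to obtain true magnitudes."""
    responses = []
    for theta in thetas:
        kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        filtered = cv2.filter2D(gray, cv2.CV_32F, kernel)
        responses.append(np.abs(filtered))
    return responses
```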
The geometric feature extraction method in particular is highly susceptible to insufficient illumination, and excessive interference on the face, such as noise, is also a major factor affecting extraction accuracy; there is considerable room for improvement.
In view of the above analysis, the inventors investigated the low accuracy of existing facial feature extraction methods in depth, and the present invention was developed as a result.
Summary of the invention
The technical problem to be solved by the present invention is to address the defects and deficiencies of the background art described above by providing a facial feature extraction method that can improve the accuracy of facial feature recognition.
To solve the above technical problem, the present invention adopts the following technical scheme:
A facial feature extraction method comprises the steps of:
(1) splitting a facial image into its H, S and V components in the HSV color space and taking out the V component;
(2) filtering the V-component image;
(3) eroding and then dilating the image obtained in the preceding step;
(4) binarizing the resulting image to obtain an image containing only black and white;
(5) performing integral projection on the black-and-white image in the horizontal and vertical directions respectively, to obtain horizontal and vertical integral projection curves;
(6) analyzing the obtained integral projection curves and determining from them the feature values of the major facial organs;
(7) locating the correct positions of the major organs on the face according to the obtained feature values.
In step (2) above, Gaussian filtering or median filtering is applied to the V-component image.
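A minimal sketch of this filtering step, assuming OpenCV; the kernel sizes are illustrative assumptions, not values specified by the patent.

```python
import cv2

def denoise_v_channel(v, use_median=False):
    """Smooth the V-component image as in step (2); either filter suppresses
    interference from non-feature regions before morphology and binarization."""
    if use_median:
        return cv2.medianBlur(v, 5)           # 5x5 median filter
    return cv2.GaussianBlur(v, (5, 5), 0)     # 5x5 Gaussian, sigma derived from the kernel size
```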
In step (6) above, the detailed process of determining the feature values is as follows:
Three peaks are found in the horizontal integral projection curve; the topmost peak is the horizontal center point of the eyes. Peak points are then found in the vertical integral projection curve within the first half and the second half of the abscissa; these are the vertical center points of the left eye and the right eye.
In the horizontal integral projection curve, the middle peak is the horizontal center point of the nose; then, in the vertical integral projection curve, the valley near the center of the abscissa is found, which is the vertical center point of the nose.
In the horizontal integral projection curve, the peak nearest the bottom is the horizontal center point of the mouth; then, in the vertical integral projection curve, the peak near the center of the abscissa is found, which is the vertical center point of the mouth.
In step (6) above, after the center point of a major organ has been determined in the horizontal integral projection curve, a critical value is set, and the number of ordinate values exceeding this critical value is accumulated; the total is the thickness of that organ.
In step (6) above, after the center point of a major organ has been determined in the vertical integral projection curve, a critical value is set, and the number of abscissa values exceeding this critical value is accumulated; the total is the width of that organ.
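For concreteness, a minimal sketch of steps (1) through (5) with OpenCV and NumPy is given below. The function name, kernel sizes and threshold parameters are illustrative assumptions rather than values fixed by the patent.

```python
import cv2
import numpy as np

def preprocess_and_project(bgr_face):
    """Steps (1)-(5): V channel -> filtering -> erosion/dilation -> binarization
    -> horizontal and vertical integral projections."""
    # (1) Convert to HSV and keep only the V (value) component.
    hsv = cv2.cvtColor(bgr_face, cv2.COLOR_BGR2HSV)
    _, _, v = cv2.split(hsv)

    # (2) Filtering (Gaussian here; median filtering is the stated alternative).
    v = cv2.GaussianBlur(v, (5, 5), 0)

    # (3) Erode, then dilate, to emphasize the major organs.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    v = cv2.erode(v, kernel)
    v = cv2.dilate(v, kernel)

    # (4) Binarize with an adaptive threshold; inverted so that the dark organ
    # regions (eyes, nostrils, mouth) become white and produce projection peaks.
    bw = cv2.adaptiveThreshold(v, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                               cv2.THRESH_BINARY_INV, 21, 5)

    # (5) Integral projections: sum of pixel values per row (horizontal
    # projection curve) and per column (vertical projection curve).
    h_proj = bw.sum(axis=1).astype(np.float64)   # one value per row
    v_proj = bw.sum(axis=0).astype(np.float64)   # one value per column
    return h_proj, v_proj, bw
```

The curves h_proj and v_proj are then analyzed as described in step (6).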
With the above scheme, the present invention improves on the existing integral projection method: combined with a skin color model of the face, the image on which the projection is performed is itself improved, thereby overcoming the effects of insufficient illumination and interference and raising the accuracy with which the major facial organs are located. Comparative tests show that the correct detection rate for the major organs (eyes, nose and mouth) improves by 20%-30%.
Description of drawings
Fig. 1 is a workflow diagram of the present invention;
Fig. 2 is a horizontal integral projection curve obtained with the method of the present invention;
Fig. 3 is a horizontal integral projection curve obtained with an existing method.
Embodiment
The implementation of the present invention is described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, the invention provides a facial feature extraction method comprising the following steps:
(1) Reading in a facial image: a standard face image is read in from a face image library;
(2) Extracting the V component in the HSV space: the facial image read in is split into its H, S and V components in the HSV color space, and the V component is taken out; extracting the V component in this way largely overcomes the effect of insufficient illumination (compare Fig. 2 with Fig. 3);
(3) Neighborhood processing: the V-component image obtained above is filtered, for example with Gaussian filtering or median filtering, which removes the interference introduced by non-feature regions well; other filtering schemes may of course be used, and the method is not limited to those listed here;
(4) Morphological processing: the image obtained in the preceding step is eroded and then dilated, which effectively enhances the appearance of the major organs such as the eyes and nose;
(5) Image binarization: the resulting image is binarized with an adaptive threshold, yielding an image containing only black and white;
(6) Integral projection: the black-and-white image is integrally projected in the horizontal and vertical directions respectively to obtain horizontal and vertical integral projection curves;
(7) Extracting the feature values: the obtained integral projection curves are analyzed to determine the feature values of the major organs such as the eyes, nose and mouth (a sketch of this analysis follows these steps); the detailed process is:
Peaks are found in the horizontal integral projection curve; there are three peaks in the figure, representing the eyes, the nose and the mouth respectively. The topmost peak is the horizontal center point of the eyes; a critical value may also be set here, and the number of points whose ordinate exceeds this critical value is accumulated, the total being the thickness of the eyes. Peak points are then found in the vertical integral projection curve within the first half and the second half of the abscissa; these are the vertical center points of the left eye and the right eye, which fixes the positions of the two eye centers. A critical value is likewise set, and the number of abscissa values exceeding it is accumulated on each side, giving the widths of the two eyes.
In the horizontal integral projection curve, the middle peak is the horizontal center point of the nose; a critical value is set, and the number of points whose ordinate exceeds it is accumulated, the total being the thickness of the nose. In the vertical integral projection curve, the valley near the center of the abscissa is found; this is the vertical center point of the nose. A critical value is also set, and the number of abscissa values exceeding it is accumulated, giving the width of the nose.
In the horizontal integral projection curve, the peak nearest the bottom is the horizontal center point of the mouth; a critical value is set, and the number of points whose ordinate exceeds it is accumulated, the total being the thickness of the mouth. In the vertical integral projection curve, the peak near the center of the abscissa is found; this is the vertical center point of the mouth. A critical value is also set, and the number of abscissa values exceeding it is accumulated, giving the width of the mouth.
(8) Locating the major organs: the major organs such as the eyes, nose and mouth are located at their correct positions on the face according to the obtained feature values.
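A minimal sketch of steps (7) and (8) follows, consuming the horizontal and vertical projection curves produced by the hypothetical preprocess_and_project function sketched earlier; the SciPy peak finder, window sizes and critical values are illustrative assumptions rather than values fixed by the patent.

```python
import numpy as np
from scipy.signal import find_peaks

def extent_above(projection, critical_value, lo=0, hi=None):
    """Count projection bins above the critical value within [lo, hi); this count
    is used as the 'thickness' (rows) or 'width' (columns) of an organ."""
    return int(np.count_nonzero(projection[lo:hi] > critical_value))

def extract_feature_values(h_proj, v_proj):
    """Step (7): organ rows from the three horizontal projection peaks, organ
    columns from peaks/valleys of the vertical projection.
    Assumes at least three peaks exist in h_proj."""
    # Three most prominent peaks of the horizontal projection, ordered top to bottom.
    peaks, props = find_peaks(h_proj, prominence=0.05 * h_proj.max())
    top3 = np.sort(peaks[np.argsort(props["prominences"])[-3:]])
    eye_row, nose_row, mouth_row = (int(r) for r in top3)

    # Left/right eye columns: peaks in the first and second halves of the abscissa.
    mid = len(v_proj) // 2
    left_eye_col = int(np.argmax(v_proj[:mid]))
    right_eye_col = mid + int(np.argmax(v_proj[mid:]))

    # Nose column: valley near the center of the abscissa;
    # mouth column: peak near the center (the window width is an assumption).
    win = len(v_proj) // 8
    nose_col = mid - win + int(np.argmin(v_proj[mid - win:mid + win]))
    mouth_col = mid - win + int(np.argmax(v_proj[mid - win:mid + win]))

    return {"eyes": (eye_row, left_eye_col, right_eye_col),
            "nose": (nose_row, nose_col),
            "mouth": (mouth_row, mouth_col)}

def locate_organ(h_proj, v_proj, row, col, critical_h, critical_v, window=40):
    """Step (8): combine a center point with the critical-value extents to get an
    approximate bounding box (x, y, width, thickness); counting the extents in a
    window around the center is an added assumption."""
    thickness = extent_above(h_proj, critical_h, max(row - window, 0), row + window)
    width = extent_above(v_proj, critical_v, max(col - window, 0), col + window)
    return (col - width // 2, row - thickness // 2, width, thickness)
```

For example, the nose could be boxed with locate_organ(h_proj, v_proj, *features["nose"], critical_h, critical_v), with the critical values chosen relative to the heights of the projection curves.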
The above embodiment merely illustrates the technical idea of the present invention and does not limit its scope of protection; any change made to the technical scheme on the basis of the technical idea proposed by the present invention falls within the scope of protection of the present invention.

Claims (4)

1. A facial feature extraction method, characterized by comprising the steps of:
(1) splitting a facial image into its H, S and V components in the HSV color space and taking out the V component;
(2) filtering the V-component image;
(3) eroding and then dilating the image obtained in the preceding step;
(4) binarizing the resulting image to obtain an image containing only black and white;
(5) performing integral projection on the black-and-white image in the horizontal and vertical directions respectively, to obtain horizontal and vertical integral projection curves;
(6) analyzing the obtained integral projection curves and determining from them the feature values of the major facial organs;
wherein the detailed process of determining the feature values of the major facial organs is:
three peaks are found in the horizontal integral projection curve, the topmost peak being the horizontal center point of the eyes; peak points are then found in the vertical integral projection curve within the first half and the second half of the abscissa, these being the vertical center points of the left eye and the right eye;
in the horizontal integral projection curve, the middle peak is the horizontal center point of the nose; then, in the vertical integral projection curve, the valley near the center of the abscissa is found, which is the vertical center point of the nose;
in the horizontal integral projection curve, the peak nearest the bottom is the horizontal center point of the mouth; then, in the vertical integral projection curve, the peak near the center of the abscissa is found, which is the vertical center point of the mouth;
(7) locating the correct positions of the major organs on the face according to the obtained feature values.
2. The facial feature extraction method according to claim 1, characterized in that in step (2), Gaussian filtering or median filtering is applied to the V-component image.
3. The facial feature extraction method according to claim 1, characterized in that in step (6), after the center point of a major organ has been determined in the horizontal integral projection curve, a critical value is set, and the number of ordinate values exceeding this critical value is accumulated, the total being the thickness of that major organ.
4. The facial feature extraction method according to claim 1, characterized in that in step (6), after the center point of a major organ has been determined in the vertical integral projection curve, a critical value is set, and the number of abscissa values exceeding this critical value is accumulated, the total being the width of that major organ.
CN201110110982.6A 2011-04-29 2011-04-29 Facial feature extraction method Expired - Fee Related CN102184401B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110110982.6A CN102184401B (en) 2011-04-29 2011-04-29 Facial feature extraction method

Publications (2)

Publication Number Publication Date
CN102184401A CN102184401A (en) 2011-09-14
CN102184401B (en) 2013-01-16

Family

ID=44570575

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110110982.6A Expired - Fee Related CN102184401B (en) 2011-04-29 2011-04-29 Facial feature extraction method

Country Status (1)

Country Link
CN (1) CN102184401B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102842033A (en) * 2012-08-17 2012-12-26 苏州两江科技有限公司 Human expression emotion semantic recognizing method based on face recognition
CN103112015B (en) * 2013-01-29 2015-03-04 山东电力集团公司电力科学研究院 Operating object position and posture recognition method applicable to industrial robot
CN106778676B (en) * 2016-12-31 2020-02-18 中南大学 Attention assessment method based on face recognition and image processing
CN108399598B (en) * 2018-01-24 2021-11-23 武汉智博创享科技股份有限公司 Panoramic image-based face blurring method and system
CN108596064A (en) * 2018-04-13 2018-09-28 长安大学 Driver based on Multi-information acquisition bows operating handset behavioral value method
CN109002801B (en) * 2018-07-20 2021-01-15 燕山大学 Face shielding detection method and system based on video monitoring
CN109543518A (en) * 2018-10-16 2019-03-29 天津大学 A kind of human face precise recognition method based on integral projection
CN110706415A (en) * 2019-10-14 2020-01-17 重庆国翰能源发展有限公司 Charging pile control system based on biological recognition and face recognition method thereof
CN112381065B (en) * 2020-12-07 2024-04-05 福建天创信息科技有限公司 Face positioning method and terminal
CN114529729B (en) * 2022-04-22 2022-08-23 珠海视熙科技有限公司 Strobe detection and elimination method, device, camera and storage medium
CN114549843B (en) * 2022-04-22 2022-08-23 珠海视熙科技有限公司 Stroboscopic stripe detection and elimination method and device, image pickup device and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050041867A1 (en) * 2002-03-27 2005-02-24 Gareth Loy Method and apparatus for the automatic detection of facial features
US20050063582A1 (en) * 2003-08-29 2005-03-24 Samsung Electronics Co., Ltd. Method and apparatus for image-based photorealistic 3D face modeling
CN101539992A (en) * 2008-03-20 2009-09-23 中国科学院自动化研究所 Multi-illumination face recognition method based on morphologic quotient images
CN101383001A (en) * 2008-10-17 2009-03-11 中山大学 Quick and precise front human face discriminating method

Also Published As

Publication number Publication date
CN102184401A (en) 2011-09-14

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130116

Termination date: 20150429

EXPY Termination of patent right or utility model