CN110991256A - System and method for carrying out age estimation and/or gender identification based on face features - Google Patents

System and method for carrying out age estimation and/or gender identification based on face features

Info

Publication number
CN110991256A
CN110991256A
Authority
CN
China
Prior art keywords
face
face features
age
features
gender
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911093609.7A
Other languages
Chinese (zh)
Inventor
吕楠
张丽秋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Huiyan Artificial Intelligence Technology Co Ltd
Original Assignee
Wuxi Huiyan Artificial Intelligence Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Huiyan Artificial Intelligence Technology Co Ltd
Priority to CN201911093609.7A
Publication of CN110991256A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168: Feature extraction; Face representation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/178: Human faces, e.g. facial parts, sketches or expressions; estimating age from face image; using age information for improving recognition

Abstract

The invention relates to a system and method for age estimation and/or gender identification based on face features, comprising the following steps: acquiring a picture captured by a front-end device as the input image; performing face detection and calibration on the input image through a face detection and calibration network, and applying white balance processing to the detected and calibrated image data; feeding the processed image data into a face feature extraction model to obtain the face features in the picture; mapping the obtained face features, together with an equal number of male and female face features, into the same multidimensional space, computing the distances between feature points, and comparing the sums of the distances to the male and female features to obtain the gender; likewise, mapping the obtained face features and the face features of each age group into the same multidimensional space, computing the distances between feature points, and comparing the distances to each age group to obtain the age group.

Description

System and method for carrying out age estimation and/or gender identification based on face features
Technical Field
The invention relates to the field of face recognition, in particular to a system and a method for carrying out age estimation and/or gender recognition based on face features.
Background
In conventional gender identification and age estimation techniques, a dedicated model is loaded, image data is fed in, and the result is produced by that model; this is still a commonly used approach. The invention patent with publication number CN108573209A, entitled "A face-based single-model multi-output age and gender identification method and system", discloses such a system: it extracts age and gender feature data through one network model, then uses different output layers to output the age and gender identification results in parallel. In practical applications, face attributes (such as gender and age) are often inseparable from the face itself. When face comparison is performed, such a method must additionally load a face comparison model to obtain the face features, which adds steps and causes long processing times and low efficiency.
In actual use, such multi-model methods suffer from difficult deployment, high memory occupancy, long loading times, and low efficiency caused by reading the original picture multiple times.
Disclosure of Invention
In view of the above problems in the prior art, an object of the present invention is to provide a system and a method for age estimation and/or gender identification based on human face features, which facilitate model deployment, reduce memory usage, and improve efficiency. The specific technical scheme is as follows:
a method for age estimation and/or gender identification based on human face features comprises the following steps:
(1) acquiring an input image;
(2) according to the input image, carrying out face detection to obtain image data;
(3) inputting image data into a human face feature model to obtain human face features in an input image;
(4) calculating and judging gender and/or age according to the human face characteristics.
Further, in the step (1), the input image is a face image captured by a front-end device.
Further, in the step (2), the face detection is performed through a face detection calibration network, which includes the following steps:
(2-1) performing face detection according to the input image;
(2-2) carrying out face calibration;
and (2-3) preprocessing the image data after detection and calibration to obtain processed image data.
Further, the preprocessing in the step (2-3) is white balance processing.
Further, the preprocessing in the step (2-3) comprises the following steps:
(2-3-1) unifying the color space of the images;
(2-3-2) carrying out face detection on the images with unified color space and returning the face position;
(2-3-3) cropping the face according to the returned face position and carrying out size normalization processing;
and (2-3-4) carrying out image data standardization processing on the image after size normalization processing.
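Steps (2-3-1) to (2-3-4) can be sketched as follows (a minimal NumPy-only sketch; the 160 × 160 target size is taken from the embodiments below, the nearest-neighbour resize and zero-mean/unit-variance standardization are assumptions, and `face_box` stands in for the face position returned by the detector):

```python
import numpy as np

def preprocess(image_bgr, face_box, size=160):
    """Sketch of steps (2-3-1)..(2-3-4)."""
    # (2-3-1) unify the color space: flip BGR channels to RGB
    rgb = image_bgr[..., ::-1]
    # (2-3-3) crop the face according to the returned face position
    x, y, w, h = face_box
    face = rgb[y:y + h, x:x + w].astype(np.float32)
    # size normalization: nearest-neighbour resize to size x size
    rows = (np.arange(size) * face.shape[0] / size).astype(int)
    cols = (np.arange(size) * face.shape[1] / size).astype(int)
    face = face[rows][:, cols]
    # (2-3-4) image data standardization: zero mean, unit variance
    return (face - face.mean()) / max(float(face.std()), 1e-6)
```

A production system would normally use an image library's resize in place of the nearest-neighbour indexing shown here.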
Further, in the step (3), the face feature model is used for extracting face features and obtaining a plurality of feature vectors.
Further, in step (4), the prediction of gender and/or age is obtained by calculating the Euclidean distance between the face feature vector and the pre-input gender and age sample feature.
Further, in the step (4), the face features and the gender face features are mapped into a first calculating/judging module; and/or the face features and the age face features are mapped into a second calculating/judging module.
Further, in the step (4), the gender facial features are the same number of male and female facial features; the age face features are face features of all age groups.
Further,
in the step (4), the method comprises the following steps:
(4-1) mapping the obtained face features and the same number of male and female face features into the same multidimensional space respectively;
(4-2) calculating the distance between the characteristic points;
(4-3) judging the magnitude of the sum of the distances between the male and female to obtain the sex;
the first calculation/judgment module comprises the multidimensional space in the step (4-1) and is used for the calculation and judgment of the steps (4-2) and (4-3);
and/or,
in the step (4), the method comprises the following steps:
(4-4) mapping the obtained face features and the face features of all age groups into the same multidimensional space;
(4-5) calculating the distance between the characteristic points;
(4-6) judging the distance to each age group to obtain the age group;
the second calculation/judgment module comprises the multidimensional space in the step (4-4) and is used for the calculation and judgment in the steps (4-5) and (4-6).
Further,
in the step (4-2), the calculation formula is as follows:
D = Σ_{i=1}^{N} d(f, m_i) - Σ_{i=1}^{N} d(f, w_i)
wherein f is the face feature of the current picture, Σ d(f, m_i) is the sum of the distances between the face features of the current picture and the male face features, and Σ d(f, w_i) is the sum of the distances between the face features of the current picture and the female face features.
When D is larger than 0, the gender of the person in the picture is female; when D is less than or equal to 0, the gender of the person in the picture is male;
and/or,
in the step (4-5), the calculation formula is as follows:
P = MIN(n_1, n_2, n_3, ..., n_i)
wherein n_i is the distance between the face features of the picture and the face features of each age group; the position of the minimum distance value points to the corresponding age group.
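Assuming Euclidean distance between feature points (the metric preferred in the embodiments below), the decision rules of steps (4-1) to (4-6) can be sketched as:

```python
import numpy as np

def euclidean(x, y):
    # d(x, y) = sqrt(sum_i (x_i - y_i)^2)
    return float(np.sqrt(np.sum((np.asarray(x) - np.asarray(y)) ** 2)))

def predict_gender(feat, male_feats, female_feats):
    """D = sum of distances to male samples minus sum of distances
    to female samples; D > 0 -> female, D <= 0 -> male."""
    d = sum(euclidean(feat, m) for m in male_feats) - \
        sum(euclidean(feat, w) for w in female_feats)
    return "female" if d > 0 else "male"

def predict_age_group(feat, age_group_feats):
    """P = MIN(n_1, ..., n_i): pick the age group whose sample
    features lie closest to the extracted features."""
    dists = {group: sum(euclidean(feat, s) for s in samples)
             for group, samples in age_group_feats.items()}
    return min(dists, key=dists.get)
```

The sample-feature dictionaries here are placeholders for the pre-input gender and age sample features described in the claims.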
A system for age estimation and/or gender identification based on facial features, comprising: a front-end device, a face detection calibration network, a face feature model, and a first and/or second calculation/judgment module, wherein the front-end device is used for acquiring an input image and is in communication connection with the face detection calibration network; the face detection calibration network is used for performing face detection to obtain image data and is in communication connection with the face feature model; the face feature model is used for obtaining the face features in the input image and is in communication connection with the first and/or second calculation/judgment module; and the first and/or second calculation/judgment module is used for calculating and judging the gender and/or the age.
The system further comprises an original picture preprocessing module and/or a white balance processing module, wherein the front end of the original picture preprocessing module is connected with the face detection calibration network and used for processing the image data of the face detection calibration network, and the rear end of the original picture preprocessing module is connected with the face feature model and used for sending the processed image data to the face feature model; and/or the front end of the white balance module is connected with the face detection calibration network and used for processing the image data thereof, and the rear end of the white balance module is connected with the face feature model and used for sending the processed image data thereto.
Compared with the prior art, the invention combines with the face feature model used for face comparison: the face features obtained by the face comparison model are reused for age estimation and gender identification, which effectively reduces the deployment complexity of the model, lowers the memory occupancy, and effectively improves efficiency. The combined size of a typical stand-alone age estimation model and gender identification model is close to 200 MB, and the scheme of the invention saves this memory occupation entirely.
Drawings
FIG. 1 is a schematic flow chart of an embodiment of an age estimation and gender identification method based on human face features;
FIG. 2 is a system diagram of an age estimation and gender identification system based on facial features;
fig. 3 is a schematic view of a face capture camera installation.
Detailed Description
The present invention is described in detail below with reference to the accompanying drawings and various embodiments.
In an alternative embodiment, a method for age estimation and gender identification based on facial features includes the steps of:
s1, acquiring a picture captured by the front-end device as an input image;
s2, according to the input image, carrying out face detection and calibration through a face detection calibration network, and carrying out white balance processing on the image data after detection and calibration;
s3, sending the processed image data into a model for extracting human face features to obtain the human face features in the picture;
s4, mapping the obtained face features and the same number of male and female face features into the same multidimensional space, calculating the distance between feature points, and judging the sum of the distances between the feature points and the male and female to obtain the gender; meanwhile, the obtained face features and the face features of all age groups are mapped into the same multidimensional space, the distance between feature points is calculated, and the distance between the feature points and all age groups is judged to obtain the age groups.
Referring to fig. 1, a flowchart of an embodiment of a method for age estimation and gender identification based on human face features is shown, in this optional embodiment, a method for age estimation and gender identification based on human face features includes the following steps:
s1, acquiring a picture captured by the front-end device as an input image;
s2, according to the input image, carrying out face detection and calibration through a face detection calibration network, and carrying out white balance processing on the image data after detection and calibration;
s3, sending the processed image data into a model for extracting human face features to obtain the human face features in the picture;
and S4, mapping the obtained face features and the same number of male and female face features into the same multidimensional space, calculating the distance between feature points, and judging the sum of the distances between the feature points and the male and female to obtain the gender.
The calculation formula is as follows:
D = Σ_{i=1}^{N} d(f, m_i) - Σ_{i=1}^{N} d(f, w_i)
wherein f is the face feature of the current picture, Σ d(f, m_i) is the sum of the distances between the face features of the current picture and the male face features, and Σ d(f, w_i) is the sum of the distances between the face features of the current picture and the female face features.
When D is larger than 0, the gender of the person in the picture is female; when D is less than or equal to 0, the gender of the person in the picture is male.
Meanwhile, the obtained face features and the face features of all age groups are mapped into the same multidimensional space, the distance between feature points is calculated, and the distance between the feature points and all age groups is judged to obtain the age groups.
The calculation formula is as follows:
P = MIN(n_1, n_2, n_3, ..., n_i)
wherein n_i is the distance between the face features of the picture and the face features of each age group. Finding the position of the smallest distance value gives the corresponding age bracket.
There are many ways to calculate the distance between feature points in the multidimensional space, and the method is not limited to the one used in this embodiment. In this embodiment, the Euclidean metric is used, with the formula:
d(x, y) = sqrt( Σ_{i=1}^{n} (x_i - y_i)^2 )
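As a quick check, the Euclidean metric above agrees with `numpy.linalg.norm` applied to the difference vector:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([4.0, 6.0, 2.0])
# d(x, y) = sqrt((1-4)^2 + (2-6)^2 + (2-2)^2) = sqrt(9 + 16) = 5
d_manual = float(np.sqrt(np.sum((x - y) ** 2)))
d_numpy = float(np.linalg.norm(x - y))
```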
in another alternative embodiment, the following scheme may be employed:
1. acquiring a face image;
2. unifying the color space of the image;
3. performing face detection on the unified picture and returning the face position;
4. cropping the face according to the returned face position and performing size normalization;
5. performing image data standardization processing on the size-normalized image;
6. extracting the face features and identifying the gender and the age.
For step 1: whether the front-end device returns the picture directly or a program captures frames from the video, the input image data is passed through an Inception-v4 convolutional neural network serving as the feature extractor to produce a 128-dimensional output vector, which is then normalized with L2 normalization to obtain the 128-dimensional face feature vector. If the dimension of the face feature vector is too small, the computation is light and fast, but the vector cannot adequately express the face features, making different pictures hard to distinguish; if the dimension is too large, the features suffice to distinguish different pictures, but the training model converges poorly, computation is slow, and the memory footprint is large. 128-dimensional face features are therefore preferred. For step 2: OpenCV reads image data in BGR order by default, while other software generally uses RGB order, so a uniform conversion is needed. For step 4: the face is cropped according to the returned face position and size-normalized; the pictures are unified to 160 × 160 through crop and resize operations. The picture size can be adjusted to the input requirements of the feature-extraction model used downstream.
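The L2 normalization step can be sketched as follows (a random 128-dimensional vector stands in for the Inception-v4 output, and the function name is illustrative):

```python
import numpy as np

def l2_normalize(embedding, eps=1e-10):
    """L2-normalize a raw network output into a unit-length
    face feature vector, as described for the 128-d embedding."""
    v = np.asarray(embedding, dtype=np.float64)
    return v / max(float(np.linalg.norm(v)), eps)

# A random 128-d vector stands in for the Inception-v4 output.
raw = np.random.randn(128)
feature = l2_normalize(raw)
```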
Preferably, face detection uses the multi-task cascaded CNN model, i.e. the MTCNN model, which combines face bounding-box detection with face keypoint detection. First, the face picture is scaled to different sizes according to different scaling ratios to form an image pyramid. It then passes through a three-stage network. P-Net: obtains candidate windows and bounding-box regression vectors for face regions, performs regression with the bounding boxes to calibrate the candidate windows, and merges highly overlapping candidates. R-Net: the P-Net candidate windows are run through the R-Net network, the candidate boxes are fine-tuned using the bounding-box regression values, and highly overlapping candidates are removed. O-Net: the candidates passing R-Net are run through the O-Net network, the boxes are fine-tuned with bounding-box regression, highly overlapping candidates are removed, and five facial keypoints are located at the same time.
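The "merging highly overlapping candidate windows" performed at each stage is typically non-maximum suppression; a minimal sketch of that merging step follows (the IoU threshold of 0.5 is an assumption, not a value given in the description):

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Keep the highest-scoring candidate windows, removing any
    window that overlaps a kept one by more than iou_threshold.
    Boxes are (x1, y1, x2, y2)."""
    boxes = np.asarray(boxes, dtype=np.float64)
    area = lambda b: (b[:, 2] - b[:, 0]) * (b[:, 3] - b[:, 1])
    order = np.argsort(scores)[::-1]   # best score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # intersection of box i with the remaining candidates
        x1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        y1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        x2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        y2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.maximum(0.0, x2 - x1) * np.maximum(0.0, y2 - y1)
        iou = inter / (area(boxes[i:i + 1])[0] + area(boxes[order[1:]]) - inter)
        order = order[1:][iou <= iou_threshold]
    return keep
```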
In addition to the aforementioned euclidean algorithm, the following algorithm may be employed:
1. KL divergence:
D_KL(p||q) = Σ_{i=1}^{n} p_i log(p_i / q_i)
wherein D_KL(p||q) represents the degree of difference between the distributions p and q over the event space; it is 0 for a perfect match, and larger values indicate greater difference between the p and q distributions.
p_i represents: the i-th distribution value in the p distribution.
q_i represents: the i-th distribution value in the q distribution.
n represents: there are n distribution values in each event distribution.
2. Cosine similarity:
(1) cos(θ) = Σ_{i=1}^{n} A_i B_i / ( sqrt(Σ_{i=1}^{n} A_i^2) · sqrt(Σ_{i=1}^{n} B_i^2) )
(2) dist = 1 - cos(θ)
wherein cos(θ) represents the cosine of the angle between the two vectors; the larger the value, the more similar they are. To stay consistent with the "smaller means more similar" convention of the Euclidean distance and KL divergence, the value can be converted through equation (2); after conversion, a smaller value indicates greater similarity.
A_i and B_i represent the components of vectors A and B, respectively.
n represents: the vector has n components.
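Both alternative measures can be sketched as follows (a stdlib-only sketch; the conversion dist = 1 - cos(θ) for equation (2) is an assumption consistent with the "smaller means more similar" convention stated above):

```python
import math

def kl_divergence(p, q):
    """D_KL(p||q) = sum_i p_i * log(p_i / q_i); 0 for a perfect match,
    larger for more different distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def cosine_distance(a, b):
    """Equation (1) gives cos(theta); the assumed equation (2)
    conversion 1 - cos makes smaller mean more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)
```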
Referring to the alternative embodiment shown in fig. 3, the first step of obtaining the input image uses a face snapshot camera; the captured image is transmitted directly to a designated directory via FTP. Other methods may also be used to acquire the face image. In particular, face age and gender prediction depends greatly on factors such as the installation position of the camera and the lighting of the scene (e.g. too dark or too bright). The installation of the face snapshot camera therefore has certain requirements, as follows:
1. The camera is installed directly in front of the passage so that faces are captured frontally; the horizontal deflection angle is at most 15 degrees, and the smaller the better.
2. The camera needs a certain downward tilt when installed, to prevent a face behind from being blocked when the person in front passes through the passage; the vertical depression angle α is 10 ± 3 degrees.
3. A face to be identified in the snapshot picture must cover at least 120 pixels (an interpupillary distance of at least 60 pixels); the actual width V of the face detection area is at most 2.5 meters for a 2-megapixel camera and at most 4.0 meters for a 6-megapixel camera.
Then, reading a face image captured by a face capturing camera from a specified directory, detecting a face and calibrating;
secondly, preprocessing the calibrated image data;
next, extracting the face features to obtain a plurality of feature vectors;
and finally, obtaining the prediction of the gender and the age by calculating the Euclidean distance between the face feature vector and the pre-input gender and age sample feature.
Of 2362 test pictures, gender identification accuracy was 98%, age identification accuracy was 84%, and single-process processing took 18 minutes; processing the same pictures with a Baidu single-process service took 52 minutes, with 93% gender identification accuracy and 85% age identification accuracy.
In another alternative embodiment, a method for age estimation and gender identification based on human face features includes the following steps: acquiring an input image; detecting a human face and calibrating; preprocessing the calibrated image data; extracting the face features to obtain a plurality of feature vectors; gender and age are identified.
The method for detecting the human face and calibrating the human face comprises the steps of detecting the human face and calibrating the input image through a human face detection calibration network.
The preprocessing method comprises the step of carrying out white balance processing on image data.
The method for extracting the human face features comprises the steps of adopting a convolutional neural network;
the number of feature vectors is preferably 128.
The method for identifying gender and age comprises the steps of identifying gender and age simultaneously, identifying gender and then age or identifying age and then gender; preferably, gender identification and age identification are performed simultaneously.
The method for identifying gender comprises the following steps: mapping the feature vectors of the extracted face features and the feature vectors of the face features of the same number of men and women into the same multidimensional space, wherein each feature vector corresponds to one dimension in the space, calculating and comparing the sum of the distances between the extracted feature vectors and the feature vectors of the corresponding men and women respectively, and if the sum of the distances between the extracted feature vectors and the feature vectors of the corresponding men is greater than the sum of the distances between the extracted feature vectors and the feature vectors of the corresponding women, identifying that the extracted face features are women; and if the sum of the distances between the extracted feature vector and the feature vector of the female corresponding to the extracted feature vector is greater than the sum of the distances between the extracted feature vector and the feature vector of the male corresponding to the extracted feature vector, identifying that the extracted face feature is male.
The distance between feature vectors is calculated with the Euclidean metric, with the specific formula:
d(x, y) = sqrt( Σ_{i=1}^{n} (x_i - y_i)^2 )
wherein d(x, y) represents the distance between point x and point y; point x is a point in n-dimensional space, which can be represented as (x_1, x_2, ..., x_n), where x_i (i = 1, 2, ..., n) is a real number called the i-th coordinate of x; point y is a point in n-dimensional space, which can be represented as (y_1, y_2, ..., y_n), where y_i (i = 1, 2, ..., n) is a real number called the i-th coordinate of y; i is a natural number, and n represents the spatial dimension;
preferably, the sum of the distances of the feature vectors is compared by the formula:
D = Σ_{i=1}^{N} d(f, m_i) - Σ_{i=1}^{N} d(f, w_i)
wherein f is the extracted feature vector of the face feature, Σ d(f, m_i) is the sum of the distances between the extracted feature vector and the corresponding male face feature vectors, and Σ d(f, w_i) is the sum of the distances between the extracted feature vector and the corresponding female face feature vectors;
when D is larger than 0, the extracted face features are female; when D is less than or equal to 0, the extracted face features are male.
The method for identifying the age comprises the following steps: and mapping the extracted feature vectors and the feature vectors of all age groups into the same multidimensional space, wherein each feature vector corresponds to one dimension in the space, calculating and comparing the sum of the distances between the extracted feature vectors and the feature vectors of all age groups respectively, and the age group corresponding to the feature vector of the age group with the minimum sum of the distances between the extracted feature vectors is the age group to which the extracted face features belong.
The distance between feature vectors is calculated with the Euclidean metric, with the specific formula:
d(x, y) = sqrt( Σ_{i=1}^{n} (x_i - y_i)^2 )
wherein d(x, y) represents the distance between x and y; point x is a point in n-dimensional space, (x_1, x_2, ..., x_n), where x_i (i = 1, 2, ..., n) is a real number called the i-th coordinate of x; point y is a point in n-dimensional space, (y_1, y_2, ..., y_n), where y_i (i = 1, 2, ..., n) is a real number called the i-th coordinate of y; n represents the spatial dimension. The sum of the distances is calculated when gender is compared, and n_i is the sum of the distances when age is compared.
The formula used to identify age is:
P = MIN(n_1, n_2, n_3, ..., n_i);
wherein n_i is the sum of the distances between the extracted feature vector and the feature vectors of the i-th age group, and i is a natural number.
The invention has been described in connection with the accompanying drawings; it is to be understood that the invention is not limited to the specific embodiments disclosed, but is intended to cover various modifications, adaptations, or uses of the invention, and all such modifications and variations are within the scope of the invention.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (13)

1. A method for age estimation and/or gender identification based on human face features is characterized by comprising the following steps:
(1) acquiring an input image;
(2) according to the input image, carrying out face detection to obtain image data;
(3) inputting image data into a human face feature model to obtain human face features in an input image;
(4) calculating and judging gender and/or age according to the human face characteristics.
2. The method for age estimation and/or gender identification based on human face features as claimed in claim 1, wherein in step (1), the input image is a human face image captured by a front-end device.
3. The method for age estimation and/or gender identification based on human face features as claimed in claim 1 or 2, wherein in the step (2), the human face detection is performed by a human face detection calibration network, comprising the steps of:
(2-1) performing face detection according to the input image;
(2-2) carrying out face calibration;
and (2-3) preprocessing the image data after detection and calibration to obtain processed image data.
4. The method for age estimation and/or gender identification based on human face features as claimed in claim 3, wherein the preprocessing in step (2-3) is white balance processing.
5. The method for age estimation and/or gender identification based on human face features as claimed in claim 3, wherein the preprocessing in step (2-3) comprises the steps of:
(2-3-1) unifying the color space of the images;
(2-3-2) carrying out face detection on the images with unified color space and returning the face position;
(2-3-3) cropping the face according to the returned face position and carrying out size normalization processing;
and (2-3-4) carrying out image data standardization processing on the image after size normalization processing.
6. The method for age estimation and/or gender identification based on human face features as claimed in any one of claims 1-5, wherein in step (3), the human face feature model is used to extract human face features and obtain a plurality of feature vectors.
7. The method for age estimation and/or gender identification based on human face features as claimed in claim 6, wherein in the step (4), the prediction of gender and/or age is obtained by calculating Euclidean distance between the human face feature vector and the pre-input gender and age sample features.
8. The method for age estimation and/or gender identification based on human face features as claimed in any of claims 1-7, wherein in step (4), the human face features and gender human face features are mapped into a first calculating/judging module; and/or mapping the face features and the age face features into a second calculating/judging module.
9. The method for age estimation and/or gender identification based on human face features as claimed in claim 8, wherein in the step (4), the gender face features are equal numbers of male and female face features; and the age face features are face features of all age groups.
10. The method for age estimation and/or gender identification based on human face features as claimed in claim 9, wherein the step (4) comprises the steps of:
(4-1) mapping the obtained face features, together with equal numbers of male and female face features, into the same multidimensional space;
(4-2) calculating the distances between the feature points;
(4-3) comparing the sum of the distances to the male features with the sum of the distances to the female features to obtain the gender;
the first calculation/judgment module comprises the multidimensional space in the step (4-1) and is used for the calculation and judgment of the steps (4-2) and (4-3);
and/or,
in the step (4), the method comprises the following steps:
(4-4) mapping the obtained face features and the face features of all age groups into the same multidimensional space;
(4-5) calculating the distances between the feature points;
(4-6) finding the age group at the minimum distance to obtain the age group;
the second calculation/judgment module comprises the multidimensional space in the step (4-4) and is used for the calculation and judgment in the steps (4-5) and (4-6).
11. The method for age estimation and/or gender identification based on human face features as claimed in claim 10, wherein in the step (4-2), the calculation formula is:
D = D_m - D_w

wherein D_m is the sum of the distances between the face features of the current picture and the male face features, and D_w is the sum of the distances between the face features of the current picture and the female face features.

When D is larger than 0, the gender of the person in the picture is female; when D is less than or equal to 0, the gender of the person in the picture is male;
and/or,
in the step (4-5), the calculation formula is as follows:
P = MIN(n_1, n_2, n_3, ..., n_i)

wherein n_i is the distance between the face features of the current picture and the face features of the i-th age group, and the position of the minimum distance value points to the corresponding age group.
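Both decision rules of claim 11 — gender from the sign of D (the sum of distances to male sample features minus the sum of distances to female sample features) and age group from the minimum distance P = MIN(n1, ..., ni) — can be sketched together. The toy 2-D and 1-D sample features and the labels below are made up for illustration, not taken from the patent:

```python
import numpy as np

def classify_gender(face, male_samples, female_samples):
    """Claim 11 gender rule: D = (sum of distances to male features)
    - (sum of distances to female features); D > 0 -> female, else male."""
    face = np.asarray(face, dtype=np.float64)
    d_male = sum(np.linalg.norm(face - np.asarray(m, dtype=np.float64))
                 for m in male_samples)
    d_female = sum(np.linalg.norm(face - np.asarray(f, dtype=np.float64))
                   for f in female_samples)
    return "female" if d_male - d_female > 0 else "male"

def predict_age_group(face, group_features, labels):
    """Claim 11 age rule: the age group at the minimum distance wins."""
    face = np.asarray(face, dtype=np.float64)
    distances = [np.linalg.norm(face - np.asarray(g, dtype=np.float64))
                 for g in group_features]
    return labels[int(np.argmin(distances))]

# Toy 2-D gender features: the query sits next to the female cluster,
# so its summed distance to the male features is larger and D > 0.
males = [(10.0, 10.0), (11.0, 10.0)]
females = [(0.0, 0.0), (1.0, 0.0)]
gender = classify_gender((0.5, 0.2), males, females)  # → "female"

# Hypothetical 1-D per-age-group features, purely illustrative.
groups = [(5.0,), (25.0,), (60.0,)]
labels = ["child", "adult", "senior"]
age = predict_age_group((27.0,), groups, labels)  # → "adult"
```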
12. A system for age estimation and/or gender identification based on facial features, comprising: a front-end device, a face detection calibration network, a face feature model, and a first and/or second calculation/judgment module, wherein,
the front-end equipment is used for acquiring an input image and is in communication connection with a face detection calibration network;
the face detection calibration network is used for carrying out face detection to obtain image data, and is in communication connection with the face feature model;
the face feature model is used for obtaining face features in an input image and is in communication connection with the first and/or second calculation/judgment modules;
the first and/or second calculating/judging module is used for calculating and judging the gender and/or the age.
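The communication chain of claim 12 (front-end device, face detection calibration network, face feature model, calculation/judgment module, in that order) can be wired as a simple sequential pipeline; every class, stage, and threshold below is an illustrative stand-in, not the patented implementation:

```python
from typing import Callable, List

class Pipeline:
    """Chains the system's modules: each stage's output feeds the next."""
    def __init__(self, stages: List[Callable]):
        self.stages = stages

    def run(self, image):
        data = image
        for stage in self.stages:
            data = stage(data)
        return data

# Hypothetical stand-in stages: each just transforms its input.
detect_and_calibrate = lambda img: {"face": img}               # detection calibration network
extract_features = lambda d: [1.0, 2.0, 3.0]                   # face feature model
judge = lambda feats: "female" if sum(feats) > 5 else "male"   # calculation/judgment module

pipeline = Pipeline([detect_and_calibrate, extract_features, judge])
result = pipeline.run("raw image")  # → "female"
```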
13. The system for age estimation and/or gender identification based on facial features of claim 12, further comprising a raw picture preprocessing module and/or a white balance processing module, wherein the front end of the raw picture preprocessing module is connected with the face detection calibration network to process its image data, and the back end is connected with the face feature model to send the processed image data thereto; and/or the front end of the white balance processing module is connected with the face detection calibration network to process its image data, and the back end is connected with the face feature model to send the processed image data thereto.
CN201911093609.7A 2019-11-11 2019-11-11 System and method for carrying out age estimation and/or gender identification based on face features Pending CN110991256A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911093609.7A CN110991256A (en) 2019-11-11 2019-11-11 System and method for carrying out age estimation and/or gender identification based on face features


Publications (1)

Publication Number Publication Date
CN110991256A true CN110991256A (en) 2020-04-10

Family

ID=70083729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911093609.7A Pending CN110991256A (en) 2019-11-11 2019-11-11 System and method for carrying out age estimation and/or gender identification based on face features

Country Status (1)

Country Link
CN (1) CN110991256A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005165447A (en) * 2003-11-28 2005-06-23 Toshiba Corp Age/sex discrimination device, age/sex discrimination method and person recognition device
CN105718873A (en) * 2016-01-18 2016-06-29 北京联合大学 People stream analysis method based on binocular vision
CN108765014A (en) * 2018-05-30 2018-11-06 中海云智慧(北京)物联网科技有限公司 A kind of intelligent advertisement put-on method based on access control system
CN109934047A (en) * 2017-12-15 2019-06-25 浙江舜宇智能光学技术有限公司 Face identification system and its face identification method based on deep learning
CN109948422A (en) * 2019-01-16 2019-06-28 深圳壹账通智能科技有限公司 A kind of indoor environment adjusting method, device, readable storage medium and terminal device
CN110188602A (en) * 2019-04-17 2019-08-30 深圳壹账通智能科技有限公司 Face identification method and device in video


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
万月亮: "Internet Image Processing and Filtering Technology" (互联网图像处理与过滤技术), Beijing: National Defense Industry Press, pages: 110 - 112 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111695415A (en) * 2020-04-28 2020-09-22 平安科技(深圳)有限公司 Construction method and identification method of image identification model and related equipment
CN111695415B (en) * 2020-04-28 2024-04-12 平安科技(深圳)有限公司 Image recognition method and related equipment
CN111626303A (en) * 2020-05-29 2020-09-04 南京甄视智能科技有限公司 Sex and age identification method, sex and age identification device, storage medium and server
CN111626303B (en) * 2020-05-29 2021-04-13 南京甄视智能科技有限公司 Sex and age identification method, sex and age identification device, storage medium and server
CN112836655A (en) * 2021-02-07 2021-05-25 上海卓繁信息技术股份有限公司 Method and device for identifying identity of illegal actor and electronic equipment
CN115063723A (en) * 2022-06-20 2022-09-16 无锡慧眼人工智能科技有限公司 Method for identifying defects of movement type obstacles based on human body posture estimation
CN115063723B (en) * 2022-06-20 2023-10-24 无锡慧眼人工智能科技有限公司 Movement type obstacle defect recognition method based on human body posture estimation

Similar Documents

Publication Publication Date Title
CN110991256A (en) System and method for carrying out age estimation and/or gender identification based on face features
WO2022111506A1 (en) Video action recognition method and apparatus, electronic device and storage medium
CN109284738B (en) Irregular face correction method and system
CN110532970B (en) Age and gender attribute analysis method, system, equipment and medium for 2D images of human faces
CN111027504A (en) Face key point detection method, device, equipment and storage medium
CN111476827B (en) Target tracking method, system, electronic device and storage medium
CN109740572B (en) Human face living body detection method based on local color texture features
CN111144366A (en) Strange face clustering method based on joint face quality assessment
CN112950667B (en) Video labeling method, device, equipment and computer readable storage medium
CN111401324A (en) Image quality evaluation method, device, storage medium and electronic equipment
CN111935479B (en) Target image determination method and device, computer equipment and storage medium
CN110287862B (en) Anti-candid detection method based on deep learning
CN107766864B (en) Method and device for extracting features and method and device for object recognition
CN110838119A (en) Human face image quality evaluation method, computer device and computer readable storage medium
CN114783003A (en) Pedestrian re-identification method and device based on local feature attention
CN112651381A (en) Method and device for identifying livestock in video image based on convolutional neural network
CN112200056A (en) Face living body detection method and device, electronic equipment and storage medium
CN109934129B (en) Face feature point positioning method, device, computer equipment and storage medium
KR20190071452A (en) Apparatus and method for object detection with shadow removed
CN112991159B (en) Face illumination quality evaluation method, system, server and computer readable medium
CN110647813A (en) Human face real-time detection and identification method based on unmanned aerial vehicle aerial photography
CN111104921A (en) Multi-mode pedestrian detection model and method based on Faster rcnn
CN210442821U (en) Face recognition device
CN114972246A (en) Die-cutting product surface defect detection method based on deep learning
CN114820707A (en) Calculation method for camera target automatic tracking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination