CN117036411A - Method, system and storage medium for tracking human face health characteristic object - Google Patents


Info

Publication number
CN117036411A
CN117036411A (Application CN202311058395.6A)
Authority
CN
China
Prior art keywords
face
feature object
health feature
health
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311058395.6A
Other languages
Chinese (zh)
Inventor
曾金龙
王博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Bode Ruijie Health Technology Co ltd
Original Assignee
Shenzhen Bode Ruijie Health Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Bode Ruijie Health Technology Co ltd filed Critical Shenzhen Bode Ruijie Health Technology Co ltd
Priority to CN202311058395.6A priority Critical patent/CN117036411A/en
Publication of CN117036411A publication Critical patent/CN117036411A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The application discloses a method, a system, and a storage medium for tracking a face health feature object. The tracking method comprises the following steps: acquiring a face picture; obtaining a health feature object and face key points based on the face picture; associating the health feature object with the face key points to obtain association number information; and inverted-indexing the health feature object based on the face key points and the association number information to obtain trend information of the health feature object, the trend information being used for tracking the health feature object. By attaching each health feature object to the nearest face key point and tracking the object's subtle changes based on that key point, the method can judge whether health feature objects in different face pictures of the same user are at the same position, thereby realizing long-term tracking of the health feature object at a given position and improving the accuracy and reliability of face health feature tracking.

Description

Method, system and storage medium for tracking human face health characteristic object
Technical Field
The present application relates to the field of image processing and skin detection, and in particular, to a method, a system, and a storage medium for tracking a face health feature object.
Background
Face health feature detection includes detecting health feature objects such as acne, spots, moles, lines, color patches, shine, blackheads, and pores on the face.
The same health feature object shows different appearances as its condition changes over a period. For example, acne readily grows beside the nose, and acne at that position may grow many times within one year, taking on different forms: the papule of the initial acne; a pustule on top when it worsens; possibly nodules or cysts of different sizes after continued development; and suppuration forming an abscess in severe cases. This constitutes the life cycle of the acne, including appearance, eruption, disappearance, and regrowth.
How to accurately track and record the change condition of a certain health feature object on different positions of a human face for health feature analysis becomes a problem to be solved.
Disclosure of Invention
The embodiment of the application provides a method, a system and a storage medium for tracking a human face health feature object, which are used for solving or partially solving the problem of how to accurately track and record the change condition of a certain health feature object on different positions of the human face for health feature analysis.
A tracking method of a face health feature object comprises the following steps:
acquiring a face picture;
based on the face picture, obtaining a health feature object and obtaining a face key point;
correlating the health feature object with the key points of the human face to obtain correlation number information;
and inverted-indexing the health feature object based on the association number information to obtain health feature object trend information, wherein the trend information is used for tracking the health feature object.
The present application may be further configured in a preferred example to: after obtaining the health feature object based on the face picture, the method comprises the following steps:
based on the health feature object, obtaining contour coordinates corresponding to the health feature object;
and calculating attribute information of the health feature object based on the contour coordinates, wherein the attribute information is used for tracking the health feature object.
The present application may be further configured in a preferred example to: before associating the health feature object with the face key point, comprising:
and labeling the face key points using a face key point algorithm to obtain key point positioning information, and associating the health feature object with the face key points.
The present application may be further configured in a preferred example to: correlating the health feature object with the face key points to obtain correlation number information, wherein the correlation number information comprises:
Obtaining the center point coordinates of the health feature object;
based on the key point positioning information and the center point coordinates, performing distance calculation on the face key points and the center point coordinates to obtain optimal key points;
and obtaining association number information based on the key point positioning information corresponding to the optimal key point.
The present application may be further configured in a preferred example to: the health feature object is inverted and indexed based on the association number information, and health feature object trend information is obtained, including:
comparing the health feature object with the association number information based on the association number information to obtain a health feature object confirmation result;
and recording the health feature object based on the health feature object confirmation result to obtain the trend information of the health feature object, wherein the trend information is used for analyzing the health condition of the health feature object.
The present application may be further configured in a preferred example to: marking the key points of the human face by adopting a key point algorithm of the human face to obtain key point positioning information, wherein the method comprises the following steps:
graying is carried out on the face picture, and a horizontal gray projection curve graph and a vertical gray projection curve graph are obtained;
and (3) establishing a human face coordinate system, and mapping human face key points to a target image based on a horizontal gray projection curve graph and a vertical gray projection curve graph for determining key point positioning information.
The present application may be further configured in a preferred example to: after the face picture is grayed, the method comprises the following steps:
and traversing the grayed face picture with an iterator, and performing inverted indexing on the health feature object.
The present application may be further configured in a preferred example to: after the face picture is acquired, the method comprises the following steps:
and processing the face picture, including face azimuth correction, edge clipping and/or enhancing the contrast of face skin color and background.
A second object of the present application is to provide a tracking system for the face health feature object.
The second object of the present application is achieved by the following technical solutions:
a tracking system for a face health feature object, comprising:
the face picture acquisition module is used for acquiring face pictures;
the feature object obtaining module is used for obtaining a health feature object based on the face picture and obtaining face key points;
the information obtaining module is used for associating the health feature object with the key points of the face to obtain association number information;
the health feature object tracking module is used for reversely indexing the health feature object based on the association number information, obtaining health feature object trend information and tracking the health feature object based on the health feature object trend information.
The third object of the present application is to provide a computer readable storage medium.
The third object of the present application is achieved by the following technical solutions:
a computer readable storage medium storing a computer program which when executed by a processor implements the method of tracking a face health feature object described above.
In summary, the application has the following beneficial technical effects:
according to the method for tracking the human face health feature object, the health feature object in the human face picture is detected, and the related attribute of the health feature object is calculated; and then, detecting key points of the face picture, and hanging the health feature object to the face key point closest to the face picture, namely taking the face key point as identity identification information, and carrying out inverted index based on the identity identification information of the face key point, so that the number of the health feature objects appearing on the face key point at the position can be known, and the micro-change of the face health feature object is tracked based on the number. The method enables the face pictures shot by the same user at different times, and even if the angles of the pictures are different, whether the health feature objects, such as acne, appear in the face pictures is the acne at the same position or not can be judged, so that the acne at the same position can be subjected to long-term change tracking, and the accuracy and the reliability of the tracking of the face health feature objects are improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments of the present application will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a method for tracking a face health feature object according to an embodiment of the application;
FIG. 2 is a face image of a face health feature object tracking method according to an embodiment of the application;
FIG. 3 is a diagram showing a distribution diagram of key points of a face in a face health feature object tracking method according to an embodiment of the application;
FIG. 4 is a flowchart illustrating a method for tracking a face health feature object according to an embodiment of the application;
FIG. 5 is a diagram showing the 478 face key points in a face health feature object tracking method according to an embodiment of the present application;
FIG. 6 is a diagram showing the number of health feature objects attached to each face key point in a face health feature object tracking method according to an embodiment of the application;
FIG. 7 is a schematic diagram of a tracking system for a face health feature object according to an embodiment of the application;
FIG. 8 is a schematic diagram of a tracking system for a face health feature object according to an embodiment of the application;
fig. 9 is a schematic diagram of an electronic device according to an embodiment of the application.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will become more readily apparent, a more particular description of the application will be rendered by reference to the appended drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may be embodied in many other forms than described herein and similarly modified by those skilled in the art without departing from the spirit of the application, whereby the application is not limited to the specific embodiments disclosed below.
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in this disclosure refers to and encompasses any and all possible combinations of one or more of the listed items.
For ease of understanding, related terms and related concepts related to the embodiments of the present application are described below.
Face key point detection refers to locating the key points of a face, including the eyebrows, eyes, nose, mouth, and facial contour areas, given a face image. Because of the influence of factors such as pose and occlusion, face key point detection is a challenging task. The face key points are the important feature points of each part of the face, namely contour points and corner points.
As the number of face key points grows and the points become more refined, the requirements on the basic skills of annotators and on the review capability of annotation teams become higher and higher; the quality of the annotation has a great effect on the accuracy of artificial intelligence face models.
The face contour lines include 13 in total: the facial outline, left eyebrow, right eyebrow, nose bridge, nose boundary, left upper eyelid, left lower eyelid, right upper eyelid, right lower eyelid, upper edge of the upper lip, lower edge of the upper lip, upper edge of the lower lip, and lower edge of the lower lip. An edge heat map can be generated from these contour lines. Specifically, the discrete key points are first interpolated to obtain dense lines and a corresponding binary boundary feature map; the Gaussian distance from surrounding pixels to the non-zero boundary pixels is then calculated to obtain a distance transform map; finally, the edge heat map is obtained according to a threshold.
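The edge heat map construction just described can be sketched as follows (an illustrative Python sketch, not the patented implementation; the grid size, sigma, and threshold values are assumptions):

```python
import math

def edge_heatmap(keypoints, h, w, sigma=1.5, thresh=0.05):
    """Illustrative sketch: interpolate discrete key points into a dense
    boundary line, compute each pixel's distance to the nearest boundary
    pixel (a brute-force distance transform), turn it into a Gaussian
    response, and zero out values below a threshold."""
    # 1. Interpolate consecutive key points into dense boundary pixels.
    boundary = set()
    for (x0, y0), (x1, y1) in zip(keypoints, keypoints[1:]):
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for s in range(steps + 1):
            t = s / steps
            boundary.add((round(x0 + t * (x1 - x0)), round(y0 + t * (y1 - y0))))
    # 2. Distance transform + Gaussian response, then 3. threshold.
    heat = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            d2 = min((x - bx) ** 2 + (y - by) ** 2 for bx, by in boundary)
            g = math.exp(-d2 / (2 * sigma ** 2))
            heat[y][x] = g if g >= thresh else 0.0
    return heat

heat = edge_heatmap([(2, 2), (7, 2), (7, 7)], h=10, w=10)
```

On this 10x10 grid the boundary pixels themselves receive the maximal response 1.0, while pixels far from the contour fall below the threshold and are zeroed.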
Embodiments of the application are described in further detail below with reference to the drawings.
Example 1
The embodiment of the application provides a tracking method of a human face health feature object, which mainly comprises the following steps:
referring to fig. 1, a face picture is acquired S10.
Specifically, in this embodiment, a photograph of the person's face is taken under a natural light source, preferably a clear photograph of the face without a hat or hair occlusion. The photograph is transmitted to a terminal and subjected to graying, obtaining the face picture shown in fig. 2 together with a horizontal gray projection and a vertical gray projection.
The step S10 has the effect of improving the accuracy and reliability of face picture acquisition.
S20, based on the face picture, obtaining a health feature object and obtaining a face key point.
Here, a health feature object specifically refers to a health-related feature on the face, such as acne, spots, lines, shape, and shine.
The face key point detection algorithm uses a deep learning technology, adopts a model such as a convolutional neural network (Convolutional Neural Networks, CNN) and the like, and obtains the common structure and the change rule of human faces with different complexion, different ethnicities, different ages and the like by learning a large amount of face image data, so that the automatic recognition of the face key points is realized. Face keypoints typically include the locations of eyes, nose, and mouth, as well as details of eyebrows, facial contours, and the like.
The process of the face key point detection algorithm mainly comprises the following steps:
1. Data preprocessing. The input image is preprocessed, including normalization, graying, image enhancement, and other operations, so as to improve the accuracy and stability of the algorithm.
2. Model establishment. A corresponding algorithm model is selected according to actual requirements and research purposes; for example, a deep convolutional neural network such as AlexNet or VGGNet is used to establish the face key point detection model.
3. Model training. The face key point detection model is trained using a large number of labeled data sets; model parameters are continuously adjusted during training to improve the accuracy and efficiency of the algorithm.
4. Model testing and optimization. The trained model is tested and evaluated using a test data set, the performance of the algorithm under different conditions is analyzed, and the algorithm is optimized and adjusted.
5. Application deployment. The trained and optimized model is applied in a specific scenario, realizing automated and intelligent face key point detection.
Specifically, in this embodiment, a deep learning model or a traditional vision algorithm is used to detect the obtained face picture, and a health feature object is obtained in combination with a health feature object instance segmentation model. Dense face key points are used, and the face key points are obtained according to a face key point detection algorithm; in this embodiment, there are 478 dense face key points, distributed as shown in fig. 3.
The step S20 locates the positions of key areas on the face, is not affected by face pose, occlusion, or lighting, and improves the accuracy and robustness of face key point acquisition.
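As a minimal sketch of how individual health feature objects might be recovered from a segmentation output (the binary mask and 4-connectivity here are illustrative assumptions, not the embodiment's instance segmentation model):

```python
from collections import deque

def extract_objects(mask):
    """Group foreground pixels of a binary mask (e.g. a hypothetical
    segmentation output) into 4-connected components, one component per
    health feature object; each component is a list of (x, y) pixels."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    objects = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                comp, q = [], deque([(x, y)])
                seen[y][x] = True
                while q:  # breadth-first flood fill
                    cx, cy = q.popleft()
                    comp.append((cx, cy))
                    for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                        if 0 <= nx < w and 0 <= ny < h and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((nx, ny))
                objects.append(comp)
    return objects

mask = [[0, 1, 1, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1]]
objs = extract_objects(mask)   # two separate health feature objects
```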
S30, associating the health feature object with the face key point to obtain association number information.
Specifically, in this embodiment, the coordinates of the center point of the health feature object are first obtained using OpenCV, and the face key points are labeled using a face key point algorithm to obtain key point positioning information. The distance between each face key point and the center point coordinates is then calculated from the center point coordinates and the key point positioning information; the optimal key point is obtained by comparing the calculated distances, and the association number information is obtained from the key point positioning information corresponding to the optimal key point.
The face key point positioning information includes the coordinates, areas, and the like of the face key points. The optimal key point is the face key point with the shortest distance to the center point coordinates, obtained by calculating the distance between each face key point and the center point coordinates from the key point positioning information and comparing the calculated distances.
The step S30 attaches the health feature object to the optimal key point, that is, numbers and indexes the health feature object by the optimal key point, improving the efficiency and reliability of face health feature object tracking.
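The association of step S30 can be sketched as follows (illustrative Python; the key point coordinates and contour are hypothetical stand-ins, and using the centroid of the contour as the center point is an assumption consistent with the description above):

```python
def associate(contour, keypoints):
    """Return the index of the nearest face key point (the 'optimal key
    point') for a health feature object, together with the object's
    center point, taken here as the mean of its contour coordinates."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    # Squared Euclidean distance suffices for an argmin comparison.
    best = min(range(len(keypoints)),
               key=lambda i: (keypoints[i][0] - cx) ** 2 + (keypoints[i][1] - cy) ** 2)
    return best, (cx, cy)

keypoints = [(10, 10), (50, 12), (30, 40)]          # stand-ins for the 478 points
contour = [(48, 10), (52, 10), (52, 14), (48, 14)]  # small acne outline
idx, center = associate(contour, keypoints)          # key point 1 is nearest
```

The returned index plays the role of the association number information: it is the identity under which this object is later indexed.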
S40, performing inverted indexing on the health feature object based on the association number information to obtain health feature object trend information, wherein the trend information is used for tracking the health feature object.
An inverted index is an indexing method used in full-text search to store, for a keyword, a mapping to its storage locations in a document or a group of documents. It is the most commonly used data structure in document retrieval systems: through the inverted index, the list of documents containing a keyword can be obtained quickly from the keyword. An inverted index mainly consists of two parts: the "keyword dictionary" and the "inverted file".
There are two different forms of inverted index: a record-level inverted index (or inverted file index) contains, for each keyword, a list of the documents that reference it; a word-level inverted index (or full inverted index) additionally contains the position of each keyword within a document. The latter form offers more functionality (such as phrase search) but requires more time and space to create.
Specifically, according to the face key points and the association number information, the inverted index is used to compare and look up their information, and a health feature object confirmation result, indicating that the health feature object matches the association number information, is obtained. According to the confirmation result, the health feature objects are then ordered in time to obtain trend information of changes in attribute information, such as the area and color depth of an acne. Finally, according to the health feature object trend information and skin-problem quantification standards derived from clinical experience, the health condition of the health feature object is analyzed and evaluated, realizing long-term tracking and observation of subtle changes of the health feature object.
The step S40 realizes a fast lookup of the mapping from face key points to health feature objects through the inverted index, improving the efficiency, reliability, and accuracy of face health feature object tracking.
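A minimal sketch of the inverted index used in step S40, assuming observations are stored as small records with a key point number, a day stamp, and an area attribute (all field names hypothetical):

```python
from collections import defaultdict

def build_inverted_index(observations):
    """The key point number plays the role of the keyword dictionary;
    the posting list stores every observation of a health feature object
    attached to that key point."""
    index = defaultdict(list)
    for obs in observations:
        index[obs["keypoint"]].append(obs)
    return index

def trend(index, keypoint):
    """Time-ordered observations at one key point = trend information."""
    return sorted(index.get(keypoint, []), key=lambda o: o["day"])

# Hypothetical observations of an acne beside the nose (key point 102).
observations = [
    {"keypoint": 102, "day": 9, "area": 6.5},
    {"keypoint": 102, "day": 1, "area": 2.0},
    {"keypoint": 330, "day": 1, "area": 1.1},
    {"keypoint": 102, "day": 4, "area": 4.2},
]
index = build_inverted_index(observations)
history = trend(index, 102)   # areas 2.0 -> 4.2 -> 6.5: the acne is growing
```

Querying key point 102 returns its observations in time order; the growing area values form exactly the kind of trend information described above.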
According to the tracking method of this embodiment, the health feature objects in the face picture are detected and their related attributes are calculated; key points of the face picture are then detected, and each health feature object is attached to the nearest face key point. That face key point serves as identity identification information, and inverted indexing based on this identity information shows how many health feature objects have appeared at the key point of that position, so that subtle changes of the face health feature object can be tracked from this record. For face pictures taken by the same user at different times, even at different angles, the method can judge whether a health feature object appearing in them, such as acne, is the acne at the same position; the acne at the same position can therefore be tracked over the long term, improving the accuracy and reliability of face health feature object tracking.
Example 2
Referring to fig. 4, in some possible embodiments, after step S10, i.e. after taking a face picture, it includes:
s11, processing the face picture, including face azimuth correction, edge clipping and/or enhancing the contrast between the face skin color and the background.
In this embodiment, the face azimuth correction is determined according to the angle between the line connecting the centers of the two eyes and the horizontal direction. The rotation of the image is achieved by interpolation; a change of the angle changes the image, so the eye-center positions of the rotated image change, and an inclined azimuth also affects eye positioning. Therefore, in this embodiment, face azimuth correction is achieved by repeated eye positioning and rotation until the deflection angle is zero.
Face azimuth correction process:
1. Position the two eyes to obtain the coordinate values (Lx, Ly) and (Rx, Ry) of the left and right eye centers;
2. After the eye centers are determined, if the two eyes are not on the same horizontal line, the line connecting the eye centers forms an angle θ with the horizontal direction, θ = arctan((Ry - Ly)/(Rx - Lx));
3. If θ = 0, the two eyes are already on a horizontal line and no rotation is performed; otherwise, the image is rotated. In this embodiment, linear interpolation is used for the image rotation.
After the rotation, the two eyes are positioned again on the rotated image to obtain their coordinate values, and the angle θ between the eye line and the horizontal direction is recalculated. If θ = 0, the two eyes are on the same horizontal line; otherwise, the above steps are repeated to rotate the image until θ = 0.
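The eye-angle computation and one correction step can be sketched as follows (illustrative; a real implementation would rotate the full image with linear interpolation as the embodiment states, while this sketch only rotates the eye coordinates to show that one step zeroes the angle):

```python
import math

def eye_angle(left, right):
    """Angle between the eye-center line and the horizontal direction."""
    return math.atan2(right[1] - left[1], right[0] - left[0])

def rotate_point(p, center, theta):
    """Rotate a point about `center` by -theta, i.e. undo the tilt."""
    x, y = p[0] - center[0], p[1] - center[1]
    c, s = math.cos(-theta), math.sin(-theta)
    return (center[0] + c * x - s * y, center[1] + s * x + c * y)

left, right = (100.0, 120.0), (160.0, 140.0)   # hypothetical tilted eye centers
theta = eye_angle(left, right)
center = ((left[0] + right[0]) / 2, (left[1] + right[1]) / 2)
left2 = rotate_point(left, center, theta)
right2 = rotate_point(right, center, theta)
# After one rotation the recomputed angle is (numerically) zero.
```

In practice the loop of positioning and rotating repeats, since interpolation can shift the detected eye centers slightly between iterations.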
Further, this embodiment adopts the Haar integral-image face classifier of OpenCV for edge clipping and/or for enhancing the contrast between face skin color and background. Specifically, Haar features are used for detection, and an integral image (Integral Image) is used to accelerate Haar feature evaluation; the AdaBoost algorithm is then used to train weak classifiers into strong classifiers with a stronger ability to distinguish faces from non-faces; finally, the strong classifiers are stacked and cascaded into a cascade classifier, thereby achieving edge clipping and/or enhancing the contrast between face skin color and background.
The step S11 improves the reliability and accuracy of the face picture by performing face azimuth correction, edge clipping, and/or enhancement of the contrast between face skin color and background, thereby further improving the accuracy and reliability of face health feature object tracking.
Example 3
In some possible embodiments, after step S20, i.e. after obtaining the health feature object based on the face picture, the method comprises:
s21, based on the health feature object, obtaining the contour coordinate corresponding to the health feature object.
S22, calculating attribute information of the health feature object based on the contour coordinates, wherein the attribute information is used for tracking the health feature object based on the attribute information.
In this embodiment, the frontal face detector get_frontal_face_detector() of Dlib is used to perform face detection, extract the rectangular frame around the face, and return faces as the array of recognized faces. A trained face feature detector is then used to extract the face contour features. Specifically, this embodiment first reads the face picture, converts it into a grayscale picture, and then detects the face in the grayscale picture with the face feature detector, from which the coordinates of the face and the contour coordinates corresponding to the health feature object can be obtained.
Further, the embodiment calculates attribute information of the health feature object according to the contour coordinates, wherein the attribute information includes an area and a degree of the health feature object, a mean value and a variance of colors, and the like. The present embodiment then uses these attribute information to track the health feature objects.
The steps S21 and S22 digitize the face key points through the contour coordinates, which facilitates the execution of subsequent steps; labeling the health feature objects with their attribute information improves the recognition success rate and facilitates subsequent tracking of the health feature objects.
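The area and color statistics of steps S21 and S22 can be sketched as follows (illustrative Python; the shoelace formula for the contour area and the sampled gray values are assumptions, since the embodiment does not fix the exact formulas):

```python
def shoelace_area(contour):
    """Polygon area of the contour via the shoelace formula."""
    n = len(contour)
    s = sum(contour[i][0] * contour[(i + 1) % n][1]
            - contour[(i + 1) % n][0] * contour[i][1] for i in range(n))
    return abs(s) / 2.0

def color_stats(pixels):
    """Mean and variance of the gray values sampled inside the object."""
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return mean, var

contour = [(0, 0), (4, 0), (4, 3), (0, 3)]   # 4x3 rectangle -> area 12
area = shoelace_area(contour)
mean, var = color_stats([120, 130, 140])      # hypothetical gray samples
```

These quantities (area, color mean and variance) are exactly the attribute information the embodiment uses to label and track health feature objects over time.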
Example 4
In some possible embodiments, before step S30, that is, before associating the health feature object with the face key point, the method includes:
S31, labeling the face key points using a face key point algorithm to obtain key point positioning information, which is used to associate the health feature object with the face key points.
In this embodiment, Dlib's frontal face detector get_frontal_face_detector() is used for face detection; the face key points are extracted and labeled to obtain key point positioning information. The face key point positioning information includes the coordinates, areas and the like of the face key points.
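The labeling of detected key points into positioning records can be sketched as follows (`label_keypoints` is a hypothetical helper; real coordinates would come from the landmark predictor, and the area field mentioned above is omitted for brevity):

```python
# hypothetical sketch of "labeling" landmarks: each face key point gets a
# number and a positioning record holding its coordinates
def label_keypoints(coords):
    return {n: {"x": float(x), "y": float(y)} for n, (x, y) in enumerate(coords)}

info = label_keypoints([(120.0, 88.0), (131.5, 90.2)])
```

The numbered records are what later steps look up when associating a health feature object with its nearest key point.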
Example 5
In some possible embodiments, step S31, namely labeling the face key points using a face key point algorithm to obtain key point positioning information, includes:
S311, graying the face picture to obtain a horizontal gray projection graph and a vertical gray projection graph.
S312, establishing a face coordinate system and, based on the horizontal gray projection graph and the vertical gray projection graph, mapping the face key points to the target image to determine the key point positioning information.
Specifically, this embodiment grays the face picture and performs edge enhancement processing on it. For the edge-enhanced image S_{i,j}, a binarization threshold t is searched for using the OTSU method (maximum between-class variance method), and the image is then binarized according to the rule: b_{i,j} = 1 if S_{i,j} ≥ t, and b_{i,j} = 0 otherwise.
A window is then established and slid from top to bottom along the central axis of the binary image. The number N(Y) of pixel points falling into the window is counted to obtain the vertical integral projection curve of the edge image, and the maximum point of this projection curve is taken: its position is approximately the position Y_appro of the corresponding face key point, satisfying N(Y_appro) = max_Y N(Y).
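The OTSU threshold search and the vertical integral projection can be sketched in NumPy (a hedged illustration assuming an 8-bit grayscale image; `otsu_threshold` and `vertical_projection_peak` are hypothetical helper names, and the window width is an illustrative choice):

```python
import numpy as np

def otsu_threshold(gray):
    # OTSU (maximum between-class variance) search for the binarization
    # threshold t over the 256 gray levels of an 8-bit image
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                  # probability of the "below t" class
    mu = np.cumsum(p * np.arange(256))    # cumulative mean of that class
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))

def vertical_projection_peak(binary, half_width=3):
    # slide a window down the central axis, count foreground pixels N(Y),
    # and return the Y maximizing the projection curve (Y_appro)
    h, w = binary.shape
    c = w // 2
    strip = binary[:, max(0, c - half_width):c + half_width + 1]
    return int(np.argmax(strip.sum(axis=1)))

img = np.zeros((10, 10), dtype=np.uint8)
img[6, :] = 200                           # one bright horizontal edge
t = otsu_threshold(img)
binary = (img > t).astype(np.uint8)
y_appro = vertical_projection_peak(binary)
```

On this toy image the projection curve peaks at the bright row, so Y_appro lands on row 6.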
Example 6
In some possible embodiments, after step S311, that is, after graying the face picture, the method includes:
S313, traversing the grayed face picture with an iterator, for inverted indexing of the health feature object.
An iterator is a special class dedicated to traversing a data set; its traversal hides the specific implementation of element iteration over the given collection. The iterator method is a safer way to traverse an image: first obtain the start of the image's data matrix, then move the data pointer forward by incrementing the iterator.
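The iterator-style traversal can be illustrated with NumPy's `np.nditer` (a sketch standing in for the pointer-based iteration described above; the brightness cutoff of 128 is an illustrative choice, not a value from the patent):

```python
import numpy as np

# traverse a grayed picture with an iterator rather than hand-written index
# arithmetic, collecting the positions of bright pixels on the fly
gray = np.zeros((3, 3), dtype=np.uint8)
gray[1, 2] = 255

hits = []
it = np.nditer(gray, flags=["multi_index"])
for px in it:
    if px > 128:
        hits.append(it.multi_index)
```

The iterator hides the row/column bookkeeping, which is the safety property the passage above refers to.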
Step S313 facilitates the traversal of the face picture and improves the efficiency of inverted indexing of the health feature object.
Example 7
In some possible embodiments, step S30, that is, associating the health feature object with the face key point to obtain association number information, includes:
S301, obtaining the center point coordinates of the health feature object.
S302, based on the key point positioning information and the center point coordinates, calculating the distances between the face key points and the center point coordinates to obtain the optimal key point.
S303, obtaining association number information based on the key point positioning information corresponding to the optimal key point.
In this embodiment, Dlib's frontal face detector get_frontal_face_detector() is used to perform face detection and extract the health feature object, from which the polygon information of the object's edge is obtained, and from that polygon information the center point coordinates of the health feature object are derived. Then, using the coordinates of the face key points in the key point positioning information together with the center point coordinates of the health feature object, the distances between the 478 face key points and the center point coordinates are calculated, and by comparing these distances the face key point with the shortest distance to the center point coordinates is obtained. In this embodiment, only key points within the key point radius of the center point are considered. The face key points are then sorted according to the area and other information corresponding to the optimal key point, so as to obtain the association number information.
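The shortest-distance search with a radius cutoff can be sketched as follows (a hedged NumPy illustration; `best_keypoint`, the stand-in landmark array, and the radius value are assumptions, not values from the patent):

```python
import numpy as np

def best_keypoint(center, keypoints, radius):
    # distance from the object's center point to every face key point;
    # return the nearest key point number, or None if even the nearest
    # one lies outside the key-point radius
    d = np.linalg.norm(keypoints - np.asarray(center, dtype=float), axis=1)
    nearest = int(np.argmin(d))
    return nearest if d[nearest] <= radius else None

kps = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 10.0]])  # stand-in landmarks
best = best_keypoint((48.0, 52.0), kps, radius=20.0)
```

In the described pipeline, `kps` would hold all 478 landmark coordinates from the key point positioning information.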
The association number information comprises the original attribute information of the health feature object and the key point positioning information. It acts as an identity card for the health feature object and the optimal key point: only when these identity cards match each other can the inverted index over the association number information be guaranteed to retrieve the correct and unique health feature object.
As shown in fig. 5, the center points of the two acne spots are marked by small feature circles in the figure. The distances from each small circle to the 478 face key points are calculated and compared to find the nearest key point, i.e. the optimal key point. For example, the center point of the acne on the nose tip is closest to key point 281, so that acne is hung on key point 281 in the association number information; similarly, the acne beside the nose is closest to key point 391 and is hung on key point 391. Thus 281 becomes the index of the acne on the nose tip, and 391 becomes the index of the acne beside the nose.
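The hanging of features on key points, i.e. the inverted index, can be sketched as follows (a hedged illustration; the record fields and helper name are hypothetical):

```python
from collections import defaultdict

# inverted index: key point number -> the health feature objects "hung" on it
index = defaultdict(list)

def hang(feature, keypoint_no):
    index[keypoint_no].append(feature)

hang({"type": "acne", "area": 12.0}, 281)   # acne on the nose tip
hang({"type": "acne", "area": 9.5}, 391)    # acne beside the nose

nose_tip_features = index[281]
```

Looking up a key point number then returns every feature indexed under it, which is the retrieval direction used in the following steps.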
Example 8
In some possible embodiments, step S40, that is, reversely indexing the health feature object based on the association number information to obtain health feature object trend information, includes:
S401, comparing the health feature object with the association number information based on the association number information to obtain a health feature object confirmation result.
S402, based on the health feature object confirmation result, recording the health feature object to obtain trend information of the health feature object, and analyzing the health condition of the health feature object.
Specifically, in this embodiment, the attribute information of the health feature object is compared against the original attribute information and the key point positioning information recorded in the association number information, so as to obtain a health feature object confirmation result: either the health feature object matches the association number information, or it does not. If the health feature object matches the association number information, its information is confirmed as valid and retained. If it does not match, its information is confirmed as invalid and removed. The embodiment then records the information of the valid health feature objects and generates trend information for each health feature object, so that the system can output the health condition of the corresponding health feature object to the user.
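The confirm-then-record loop can be sketched as follows (a hedged illustration; the record fields and the 30% area tolerance are illustrative assumptions, not values from the patent):

```python
# a newly detected feature is kept only if it matches the record stored
# under its association number; matches are appended to the trend history
def confirm(feature, record, area_tol=0.3):
    same_keypoint = feature["keypoint"] == record["keypoint"]
    close_area = abs(feature["area"] - record["area"]) <= area_tol * record["area"]
    return same_keypoint and close_area

record = {"keypoint": 281, "area": 10.0}
trend = []
for obs in ({"keypoint": 281, "area": 11.0},    # matches: retained
            {"keypoint": 281, "area": 30.0}):   # mismatch: removed
    if confirm(obs, record):
        trend.append(obs)
```

Accumulating such validated observations over many photos is what produces the per-feature trend information described above.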
As shown in fig. 6, the figure is a graph of the inverted-index statistics of the key points obtained by processing 230 photos of the same user: the front column is the key point number and the rear column is the number of health feature objects hung on that key point. It can be seen that 83 acne occurrences appear at key point 391 and 101 at key point 281; these two key points correspond to the positions beside the nose wing and at the nose tip, respectively. Note that only the counts are shown here; the positions, coordinates and other information of the acne occurrences are not described in detail in this embodiment.
The effect of step S401 and step S402 is to improve the continuity and reliability of health tracking of the health feature object.
According to the face health feature object tracking method, as shown in fig. 3, face orientation correction, edge cropping and/or enhancement of the contrast between the facial skin color and the background are performed on the face picture, improving the reliability and accuracy of the face picture and thus the accuracy and reliability of face health feature object tracking. The face key points are digitized through the contour coordinates, which facilitates the subsequent steps, and the health feature object is labeled with its attribute information, which improves the recognition success rate and facilitates the subsequent tracking of the health feature object.
The application discloses a tracking system of a face health feature object.
Referring to fig. 7, the tracking system of the face health feature object includes:
the face picture acquisition module 10 is configured to acquire a face picture.
The feature object and key point obtaining module 20 is configured to obtain a health feature object based on the face picture and to obtain face key points.
The association number information obtaining module 30 is configured to associate the health feature object with the face key point, and obtain association number information.
The health feature object tracking module 40 is configured to reversely index the health feature object based on the association number information to obtain health feature object trend information, and to track the health feature object based on the health feature object trend information.
Further, as shown in fig. 8, the tracking system of the face health feature object further includes:
the face picture processing module 11 is configured to process a face picture, including face orientation correction, edge clipping, and/or enhancing a contrast ratio between a skin color and a background.
Further, as shown in fig. 8, the tracking system of the face health feature object further includes:
the contour coordinate obtaining module 21 is configured to obtain contour coordinates corresponding to the health feature object based on the health feature object.
The attribute information calculating module 22 is configured to calculate attribute information of the health feature object based on the contour coordinates, and track the health feature object based on the attribute information.
Further, as shown in fig. 8, the tracking system of the face health feature object further includes:
the key point positioning information obtaining module 31 is configured to label the key points of the face by using a key point algorithm of the face, and obtain key point positioning information for associating the health feature object with the key points of the face.
Further, as shown in fig. 8, the obtaining the key point positioning information module 31 includes:
the graying face picture submodule 311 is used for graying the face picture to obtain a horizontal gray projection graph and a vertical gray projection graph.
The determine key point positioning information sub-module 312 is configured to establish a face coordinate system, map the face key points to the target image based on the horizontal gray scale projection and the vertical gray scale projection graph, and determine key point positioning information.
Further, as shown in fig. 8, the obtaining the key point positioning information module 31 includes:
the traversing face picture sub-module 313 is configured to traverse the grayed face picture by using an iterator, and is configured to index the health feature object in an inverted manner.
Further, as shown in fig. 8, the obtaining association number information module 30 includes:
the center point coordinate obtaining sub-module 301 is configured to obtain center point coordinates of the health feature object.
The optimal key point obtaining sub-module 302 is configured to perform distance calculation on the face key point and the center point coordinate based on the key point positioning information and the center point coordinate, so as to obtain an optimal key point.
The obtained association number information sub-module 303 is configured to obtain association number information based on the key point positioning information corresponding to the optimal key point.
Further, as shown in fig. 8, the tracking health feature object module 40 includes:
the get confirmation result sub-module 401 is configured to compare the health feature object with the association number information based on the association number information, and get a health feature object confirmation result.
The sub-module 402 for obtaining trend information of the health feature object is configured to record the health feature object based on the confirmation result of the health feature object, obtain trend information of the health feature object, and analyze the health condition of the health feature object.
The tracking system of the face health feature object provided in this embodiment can achieve the same technical effects as the foregoing embodiments due to the functions of each module and the logic connection between each module, and the principle analysis can refer to the relevant description of the steps of the foregoing tracking method of the face health feature object, which is not repeated here.
For specific limitations on the tracking system of the face health feature object, reference may be made to the above limitation on the tracking method of the face health feature object, which is not described herein. The modules in the tracking system of the face health feature object can be all or partially implemented by software, hardware and a combination thereof. The above modules may be embedded in hardware or independent of a processor in the device, or may be stored in software in a memory in the device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, an electronic device is provided. The electronic device may include a processor, an external memory interface, an internal memory, a universal serial bus (universal serial bus, USB) interface, a charge management module, a power management module, a battery, an antenna, a wireless communication module, an audio module, a speaker, a receiver, a microphone, an earphone interface, a sensor module, keys, an indicator, a camera, a display screen, and the like. Wherein the sensor module comprises an ambient light sensor. In addition, the sensor module may further include a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, a bone conduction sensor, and the like. In other embodiments, the electronic device in the embodiments of the present application may further include a mobile communication module, a subscriber identity module (subscriber identification module, SIM) card interface, and the like. The function of the above modules or devices is prior art and will not be described here in detail.
Applications supported by the electronic device in the embodiment of the present application may include applications of photographing class, such as a camera.
Applications supported by the electronic device in the embodiment of the application can also include an application for tracking the face health feature object. This application detects health feature objects on the user's face, namely acne, spots, lines, shapes, colors and the like, from the captured face picture, and provides the user with a trend analysis report of the health feature objects.
The application for tracking the face health feature object in the embodiment of the present application may detect the face health feature object condition by using the method for tracking the face health feature object provided in other embodiments of the present application.
In this embodiment, the electronic device is taken as a mobile phone by way of example; the specific operation is shown in fig. 8.
As shown in A of fig. 9, the electronic device detects a click operation on the skin detection icon, and in response the electronic device displays the user interface of the skin detection application on the display screen, as shown in B of fig. 9. This interface includes a camera icon.
The electronic device detects the operation on the camera icon, and calls a camera application on the electronic device to acquire a face picture to be detected in response to the operation on the camera icon. Of course, the user may also select a picture containing a face stored in the internal memory as a picture to be detected.
After the application for skin detection receives the input picture to be detected, the face health feature object condition can be detected by adopting the tracking method of the face health feature object provided by other embodiments of the present application.
In an embodiment, a computer readable storage medium is provided, and a computer program is stored on the computer readable storage medium, where the computer program when executed by a processor implements the method for tracking a face health feature object in the foregoing embodiment, or where the computer program when executed by the processor implements functions of each module/unit in the system for tracking a face health feature object in the foregoing system embodiment. To avoid repetition, no further description is provided here.
It will be apparent to those skilled in the art that embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions described above may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example and not limitation: computer-readable media can include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
Furthermore, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. As used in the embodiments of the present application, disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the system is divided into different functional units or modules to perform all or part of the above-described functions.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. The method for tracking the face health feature object is characterized by comprising the following steps:
acquiring a face picture;
based on the face picture, obtaining a health feature object and obtaining a face key point;
correlating the health feature object with the face key points to obtain the correlation number information;
and reversely indexing the health feature object based on the association number information to obtain health feature object trend information, wherein the health feature object trend information is used for tracking the health feature object based on the health feature object trend information.
2. The method for tracking a face health feature object according to claim 1, comprising, after obtaining the health feature object based on the face picture:
based on the health feature object, obtaining contour coordinates corresponding to the health feature object;
and calculating attribute information of the health feature object based on the contour coordinates, wherein the attribute information is used for tracking the health feature object based on the attribute information.
3. The method of claim 1, comprising, prior to said associating said health feature object with said face keypoints:
and marking the face key points by adopting a face key point algorithm to obtain key point positioning information, wherein the key point positioning information is used for associating the health feature object with the face key points.
4. The method for tracking a face health feature object according to claim 1, wherein associating the health feature object with the face key point to obtain the association number information includes:
obtaining the center point coordinates of the health feature object;
based on the key point positioning information and the center point coordinates, performing distance calculation on the face key points and the center point coordinates to obtain optimal key points;
and acquiring the association number information based on the key point positioning information corresponding to the optimal key point.
5. The method for tracking a face health feature object according to claim 1, wherein the reversely indexing the health feature object based on the association number information to obtain health feature object trend information comprises:
comparing the health feature object with the association number information based on the association number information to obtain a health feature object confirmation result;
and recording the health feature object based on the health feature object confirmation result to obtain health feature object trend information, wherein the health feature object trend information is used for analyzing the health condition of the health feature object.
6. A method for tracking a face health feature object according to claim 3, wherein the labeling the face key points by using a face key point algorithm to obtain key point positioning information comprises:
graying the face picture to obtain a horizontal gray scale projection curve graph and a vertical gray scale projection curve graph;
and establishing a human face coordinate system, and mapping the human face key points to a target image based on the horizontal gray scale projection curve graph and the vertical gray scale projection curve graph to determine the key point positioning information.
7. The method for tracking a face health feature object according to claim 6, comprising, after said graying said face picture:
and traversing the grey-scaled face picture by adopting an iterator, and performing inverted indexing on the health feature object.
8. The method for tracking a face health feature object according to claim 1, comprising, after the acquiring of the face picture:
and processing the face picture, including face azimuth correction, edge clipping and/or enhancing the contrast of face complexion and background.
9. A tracking system for a face health feature object, comprising:
the face picture acquisition module is used for acquiring face pictures;
the feature object obtaining module is used for obtaining a health feature object based on the face picture and obtaining a face key point;
the information obtaining module is used for associating the health feature object with the face key points to obtain the association number information;
and the health feature object tracking module is used for reversely indexing the health feature object based on the association number information to obtain health feature object trend information and tracking the health feature object based on the health feature object trend information.
10. A computer readable storage medium storing a computer program, wherein the computer program when executed by a processor implements a method of tracking a face health feature object according to any one of claims 1 to 7.
CN202311058395.6A 2023-08-21 2023-08-21 Method, system and storage medium for tracking human face health characteristic object Pending CN117036411A (en)

Publications (1)

Publication Number Publication Date
CN117036411A true CN117036411A (en) 2023-11-10



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination