CN110874567A - Color value determination method, device, electronic device and storage medium - Google Patents


Info

Publication number: CN110874567A (granted publication: CN110874567B)
Application number: CN201910901612.0A
Authority: CN (China)
Prior art keywords: color value, feature, features, grade, face
Legal status: Granted; Active
Other languages: Chinese (zh)
Inventors: 孙汀娟, 黄竹梅, 周雅君, 赵星, 李恒
Assignee (current and original): Ping An Technology Shenzhen Co Ltd
Application filed by Ping An Technology Shenzhen Co Ltd
Related international application: PCT/CN2020/093341 (WO2021057063A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; face representation
    • G06V40/171 Local features and components; facial parts; occluding parts, e.g. glasses; geometrical relationships
    • G06V40/172 Classification, e.g. identification
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; extraction of features in feature space; blind source separation
    • G06F18/214 Generating training patterns; bootstrap methods, e.g. bagging or boosting


Abstract

A method of color value determination, the method comprising: acquiring a face image to be judged, which needs to be subjected to color value judgment; extracting a plurality of first characteristic points in a face image to be judged by using a face dotting technology; calculating a plurality of first features according to the coordinates of the plurality of first feature points; determining a first feature matched with a preset feature type as a first standard feature from the plurality of first features; according to the first standard feature, conducting standardization processing on the significant features in the first features to obtain a plurality of first feature ratios, wherein the significant features are features with high color value discrimination; judging the plurality of first characteristic ratios by using a pre-trained color value judging model to obtain a first color value judging result of the face image to be judged; and outputting a first color value judgment result. The invention also provides a color value judging device, an electronic device and a storage medium. The invention can improve the accuracy of color value judgment.

Description

Color value determination method, device, electronic device and storage medium
Technical Field
The present invention relates to the field of machine learning technologies, and in particular, to a method and an apparatus for determining a color value, an electronic device, and a storage medium.
Background
"color value" is a popular web word in recent years, and news about color value is now visible on the respective large internet almost every day. The simplest interpretation of the color value is the facies, which is a determination of how good or bad the appearance characteristics are. "color value" also has a measure that can be measured and compared, and the measures of color value include: the expressions "low color value", "high color value", "color value acting" and "color value explosion table" are used. Wherein, high color value and color value are used as the color value, and low color value is not good.
At present, color value determination mainly relies on how closely the positions and proportions of the facial features match the "golden ratio": the closer a face is to the golden ratio, the higher its color value score. However, different people hold different aesthetic standards, and aesthetic preferences change as times change, so judging color value by the golden ratio alone is not very accurate.
Therefore, how to more accurately determine the color value of the user is a technical problem to be solved.
Disclosure of Invention
In view of the above, it is desirable to provide a color value determination method, device, electronic device, and storage medium, which can improve the accuracy of color value determination.
A first aspect of the present invention provides a color value determination method, the method including:
acquiring a face image to be judged, which needs to be subjected to color value judgment;
extracting a plurality of first characteristic points in the face image to be judged by using a face dotting technology;
calculating a plurality of first features according to the coordinates of the plurality of first feature points, wherein the first features comprise the length of each part of the face in the face image to be judged and the distance between certain two parts;
determining the first feature matched with a preset feature type as a first standard feature from the plurality of first features;
according to the first standard feature, performing standardization processing on the significant features in the first features to obtain a plurality of first feature ratios, wherein the significant features are features with high color value discrimination;
judging the first feature ratios by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged;
and outputting the first color value judgment result.
In a possible implementation manner, the determining, by using a pre-trained color value determination model, a plurality of first feature ratio values, and obtaining a first color value determination result of the face image to be determined includes:
for each first feature ratio, acquiring a first feature type of the first feature ratio;
for each preset color value grade, judging whether the first characteristic ratios belong to the characteristic ratio range matched with the first characteristic type in the color value grade by using a pre-trained color value judgment model;
if the first feature ratios belong to the feature ratio range matched with the first feature type in the color value grade, determining the color value grade to be subjected to grade determination;
and if the grade of the to-be-fixed color value is one, determining that the grade of the to-be-fixed color value is a first color value judgment result of the face image to be judged.
In one possible implementation, the method further includes:
if the number of the to-be-fixed color value grades is multiple, sequencing the multiple to-be-fixed color value grades from high to low according to the color value grades to obtain a color value grade sequencing queue;
and determining any one to-be-fixed color value grade at the middle position in the color value grade sorting queue as a first color value judgment result of the to-be-judged face image.
In one possible implementation, the method further includes:
if the plurality of first characteristic ratios do not belong to the characteristic ratio range matched with the first characteristic type in the color value grades, determining that the plurality of first characteristic ratios belong to preset color value grades;
and determining the grade of the preset color value as a first color value judgment result of the face image to be judged.
In one possible implementation, the method further includes:
acquiring a plurality of face sample images needing to be trained;
extracting a plurality of second feature points in the face sample image aiming at each face sample image;
calculating a plurality of second features according to the coordinates of the plurality of second feature points, wherein the second features comprise the length of each part of the face in the face sample image and the distance between two parts;
determining the second feature matched with the preset feature type as a second standard feature from the plurality of second features;
according to the second standard features, carrying out standardization processing on the plurality of second features to obtain a plurality of second feature ratios;
selecting a significant feature with high color value discrimination from the plurality of second features according to the distribution condition of the plurality of second feature ratios;
and constructing a color value judgment model according to the remarkable features in the plurality of second features.
In a possible implementation manner, the constructing a color value determination model according to the significant feature of the plurality of second features includes:
learning salient features of the plurality of second features;
determining a plurality of color value grades corresponding to the significant features in the plurality of second features and a feature ratio range corresponding to each color value grade;
judging whether the characteristic ratio ranges of different color value grades accord with extreme value consistency or not;
and if the characteristic ratio ranges of different color value grades accord with the consistency of extreme values, generating a color value judgment model according to the significant characteristics in the second characteristics, the color value grades and the characteristic ratio range corresponding to each color value grade.
In a possible implementation manner, the color value grades corresponding to the color value determination model include a plurality of color value grades divided according to color value from high to low, each color value grade includes all feature types, and each feature type has a different feature ratio range in different color value grades.
A second aspect of the present invention provides a color value determination device, the device including:
the acquisition module is used for acquiring a face image to be judged, which needs to be subjected to color value judgment;
the extraction module is used for extracting a plurality of first characteristic points in the face image to be judged by using a face dotting technology;
the calculation module is used for calculating a plurality of first features according to the coordinates of the plurality of first feature points, wherein the first features comprise the length of each part of the face in the face image to be judged and the distance between two parts;
the determining module is used for determining the first feature matched with a preset feature type from a plurality of first features as a first standard feature;
the processing module is used for carrying out standardization processing on the salient features in the first features according to the first standard features to obtain a plurality of first feature ratios, wherein the salient features are features with high color value discrimination;
the judging module is used for judging the first characteristic ratios by using a pre-trained color value judging model to obtain a first color value judging result of the face image to be judged;
and the output module is used for outputting the first color value judgment result.
A third aspect of the present invention provides an electronic device comprising a processor and a memory, the processor being configured to implement the color value determination method when executing a computer program stored in the memory.
A fourth aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the color value determination method.
By the technical scheme, the face image to be judged, which needs to be subjected to color value judgment, can be obtained; extracting a plurality of first characteristic points in the face image to be judged by using a face dotting technology; calculating a plurality of first features according to the coordinates of the plurality of first feature points, wherein the first features comprise the length of each part of the face in the face image to be judged and the distance between some two parts; determining the first feature matched with a preset feature type as a first standard feature from a plurality of first features; according to the first standard feature, conducting standardization processing on a significant feature in the first features to obtain a plurality of first feature ratios, wherein the significant feature is a feature with a large color value discrimination; judging the first feature ratios by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged; and outputting the first color value judgment result. Therefore, in the invention, the first color value judgment result obtained by judging the significant features can represent the color value of the face image to be judged better, so that the accuracy of color value judgment can be improved.
Drawings
FIG. 1 is a flow chart of a preferred embodiment of a color value determination method disclosed in the present invention.
Fig. 2 is an exemplary diagram of a face image disclosed in the present invention.
Fig. 3 is a functional block diagram of a preferred embodiment of a color value determination apparatus according to the present disclosure.
Fig. 4 is a schematic structural diagram of an electronic device implementing a color value determination method according to a preferred embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
The color value determination method of the embodiment of the invention is applied to the electronic equipment, and can also be applied to a hardware environment formed by the electronic equipment and a server connected with the electronic equipment through a network, and the server and the electronic equipment are jointly executed. Networks include, but are not limited to: a wide area network, a metropolitan area network, or a local area network.
A server may refer to a computer system that provides services to other devices (e.g., electronic devices) in a network. A personal computer may also be called a server if it can externally provide a File Transfer Protocol (FTP) service. In a narrow sense, a server refers to a high-performance computer, which can provide services to the outside through a network, and compared with a common personal computer, the server has higher requirements on stability, security, performance and the like, and therefore, hardware such as a CPU, a chipset, a memory, a disk system, a network and the like is different from that of the common personal computer.
The electronic device is a device capable of automatically performing numerical calculation and/or information processing according to a preset or stored instruction, and the hardware thereof includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like. The electronic device may also include a network device and/or a user device. The network device includes, but is not limited to, a single network server, a server group consisting of a plurality of network servers, or a Cloud Computing (Cloud Computing) based Cloud consisting of a large number of hosts or network servers. The user device includes, but is not limited to, any electronic product that can interact with a user through a keyboard, a mouse, a remote controller, a touch pad, or a voice control device, for example, a personal computer, a tablet computer, a smart phone, a Personal Digital Assistant (PDA), or the like.
FIG. 1 is a flow chart of a preferred embodiment of a color value determination method disclosed in the present invention. The order of the steps in the flowchart may be changed, and some steps may be omitted.
And S11, the electronic equipment acquires the face image to be judged, which needs to be subjected to color value judgment.
The face image to be judged refers to a face front image.
As an optional implementation manner, before step S11, the method further includes:
acquiring a plurality of face sample images needing to be trained;
extracting a plurality of second feature points in the face sample image aiming at each face sample image;
calculating a plurality of second features according to the coordinates of the plurality of second feature points, wherein the second features comprise the length of each part of the face in the face sample image and the distance between two parts;
determining the second feature matched with the preset feature type as a second standard feature from the plurality of second features;
according to the second standard features, carrying out standardization processing on the plurality of second features to obtain a plurality of second feature ratios;
selecting a significant feature with high color value discrimination from the plurality of second features according to the distribution condition of the plurality of second feature ratios;
and constructing a color value judgment model according to the remarkable features in the plurality of second features.
The face sample images are face images prepared in advance; each face sample image carries color value grade information, which is labeled in advance according to popular aesthetic standards.
The second feature points refer to points marked on the outer contour of the face and the edges of organs, and a plurality of second feature points and coordinates thereof in the face sample image can be acquired by using a pre-trained face dotting technology.
Wherein the second feature includes the length of each part of the face in the face sample image and the distance between two parts, such as the length of the eye: 2cm, distance from the bottom of the nose to the mouth: 2 cm.
Wherein the second feature ratio is a result of a ratio of the second feature to the second standard feature.
In this optional embodiment, feature points are extracted from the prepared face sample images, the length of each part and the distance between two parts in each face sample image are calculated according to the coordinates of the feature points, and the lengths and/or distances are then normalized. The normalization converts every length and/or distance into a ratio to the standard feature (i.e., a feature ratio), so that all faces are scaled to a scale in which the standard feature is 1 unit; different faces can then be compared with each other, and features of the same face can also be compared with each other. In addition, since the differences between face measurements are small, the data precision needs to be at least 6 digits after the decimal point. For example, in the embodiment of the present invention, the face width along the straight line through the two eyes is selected as the standard feature; in a face sample image, if the face width is 10cm and the eye length is 2cm, the feature ratio obtained after normalizing the eye length feature is 0.200000.
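As a minimal illustration of this normalization, the following Python sketch converts raw lengths and distances into feature ratios relative to a designated standard feature; the dictionary keys and sample values are hypothetical and only mirror the 10cm/2cm example above, not the patented implementation.

# Minimal sketch of the normalization described above (not the patented code).
# Assumption: all raw features are lengths/distances in the same unit (e.g. cm)
# and "face_width_at_eyes" is the designated standard feature.

def normalize_features(raw_features, standard_key="face_width_at_eyes"):
    """Convert raw lengths/distances into feature ratios relative to the
    standard feature, rounded to six decimal places as suggested above."""
    standard = raw_features[standard_key]
    return {name: round(value / standard, 6) for name, value in raw_features.items()}

raw = {"face_width_at_eyes": 10.0, "eye_length": 2.0, "nose_bottom_to_mouth": 2.0}
print(normalize_features(raw))
# {'face_width_at_eyes': 1.0, 'eye_length': 0.2, 'nose_bottom_to_mouth': 0.2}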
After the feature ratios of the face sample images are obtained, the distribution of each feature ratio across different color value grades is observed, and the feature types whose feature ratios best discriminate between different color value grades are identified. The embodiment of the invention generates box plots of the various feature ratios at different color value grades according to the feature ratios and color value grades of the face sample images, and determines from the box plots a plurality of significant features with large discrimination between color value grades: the middle-court length (the middle third of the face), the distance between the two eyes, the eye length, the nose width, the eye width and the like. The significant features are then trained to obtain the color value determination model. A box plot is composed of rectangular boxes generated from the feature ratios at each color value grade.
Fig. 2 is an exemplary diagram of a face image disclosed in the present invention. As shown in fig. 2, a face dotting technique may be used to extract 71 points from the face image shown in fig. 2, and the position coordinates of each feature point may be recorded at the same time. The distances and lengths of the "three courts and five eyes" (the classical rule dividing the face into three vertical sections and five eye-widths) can then be calculated; for example, the length of "five-eye segment 1" is the distance from point 13 to point 17 in the face image shown in fig. 2. Distances that are generally judged to have a large influence on the color value, such as the eye length, eye width, nose width, mouth width and middle-court length, are calculated in the same way.
Among these, 9 significant features may be selected, such as the vertical distance from point 6 to the line through points 51-52 (i.e., the lower-court length), the vertical distance from the line through points 26-39 to the line through points 51-52 (i.e., the middle-court length), the length from point 30 to point 17 (i.e., the distance between the eyes), the length from point 10 to point 2 (i.e., the face width), the length from point 11 to point 1 (i.e., the face width), the vertical distance from the line through points 58-62 to the line through points 51-52 (i.e., the distance between the mouth and the nose), the length from point 50 to point 53 (i.e., the width of the nose), the length from point 15 to point 19 (i.e., the width of the eyes), and the length from point 30 to point 34 (i.e., the length of the eyes).
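To make these landmark-based measurements concrete, the following Python sketch computes a few of the distances from a dictionary of landmark coordinates. The point indices simply follow the examples in the preceding paragraph and are assumptions about the dotting scheme, and vertical distances to a line through a point pair are approximated here via the pair's midpoint.

import math

def dist(p, q):
    """Euclidean distance between two landmark points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def midpoint(p, q):
    return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

def significant_lengths(pts):
    """pts: dict mapping landmark index -> (x, y) coordinate.
    The index pairs mirror the illustrative examples above and are not
    normative; real dotting schemes may number points differently."""
    return {
        "eye_length": dist(pts[30], pts[34]),
        "eye_width": dist(pts[15], pts[19]),
        "nose_width": dist(pts[50], pts[53]),
        "inter_eye_distance": dist(pts[30], pts[17]),
        "face_width": dist(pts[10], pts[2]),
        # Approximate the "vertical distance to the 51-52 line" with the
        # distance between the relevant midpoints.
        "mouth_to_nose": dist(midpoint(pts[58], pts[62]),
                              midpoint(pts[51], pts[52])),
    }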
Specifically, the constructing a color value determination model according to the significant features of the plurality of second features includes:
learning salient features of the plurality of second features;
determining a plurality of color value grades corresponding to the significant features in the plurality of second features and a feature ratio range corresponding to each color value grade;
judging whether the characteristic ratio ranges of different color value grades accord with extreme value consistency or not;
and if the characteristic ratio ranges of different color value grades accord with the consistency of extreme values, generating a color value judgment model according to the significant characteristics in the second characteristics, the color value grades and the characteristic ratio range corresponding to each color value grade.
The color value grades are divided in advance, and the type and number of the color value grades are predefined. For example, the color values may be divided into 5 grades, namely A, B, C, D and E from high to low color value; more or fewer grades may also be used. The characters A, B, C, D and E are merely predefined identifiers for different color value grades, and other characters may also be used to identify different color value grades, which is not specifically limited in the embodiment of the present invention.
In the embodiment of the present invention, when the significant features are trained, the feature ratio ranges corresponding to each significant feature at different color value grades need to be determined according to the maximum and minimum values of the box plots corresponding to that significant feature at the different color value grades. After the feature ratio ranges corresponding to the significant features at different color value grades are determined, it is necessary to judge whether the feature ratio ranges conform to extremum consistency. For example, suppose the feature ratio ranges of one significant feature at the five color value grades are [a1, b1], [a2, b2], [a3, b3], [a4, b4] and [a5, b5], and the color value grade is monotonically increasing on this significant feature, that is, the higher the color value grade, the larger the maximum and minimum values of the feature ratio corresponding to the significant feature. If the feature ratio ranges satisfy a1 ≤ a2 ≤ a3 ≤ a4 ≤ a5 and b1 ≤ b2 ≤ b3 ≤ b4 ≤ b5, the feature ratio ranges of the different color value grades can be determined to conform to extremum consistency, and the color value determination model is generated according to the significant features in the second features, the color value grades and the feature ratio range corresponding to each color value grade.
Alternatively, if the feature ratio ranges of the different color value grades do not conform to extremum consistency, the offending feature ratio range needs to be changed. For example, with the feature ratio ranges [a1, b1], [a2, b2], [a3, b3], [a4, b4], [a5, b5] of the five color value grades for a significant feature on which the color value grade increases monotonically, a bound that violates a1 ≤ a2 ≤ a3 ≤ a4 ≤ a5 or b1 ≤ b2 ≤ b3 ≤ b4 ≤ b5 is replaced by the corresponding bound of the next grade; for example, if a1 > a2, the value of a1 is changed to a2 so that a1 ≤ a2 ≤ a3 ≤ a4 ≤ a5 holds.
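The extremum consistency check and the bound adjustment described above can be sketched in Python as follows. This is only an illustrative reading of the a1 > a2 example, assuming the per-grade bounds are listed in the order in which they should be non-decreasing; function and variable names are invented for the sketch.

def is_extremum_consistent(mins, maxs):
    """True if both the per-grade minima and maxima are non-decreasing."""
    return (all(x <= y for x, y in zip(mins, mins[1:])) and
            all(x <= y for x, y in zip(maxs, maxs[1:])))

def enforce_extremum_consistency(mins, maxs):
    """Replace any bound that exceeds the next grade's bound with that next
    bound (e.g. if a1 > a2, set a1 = a2), walking from the end so each fix is
    made against an already consistent tail."""
    mins, maxs = list(mins), list(maxs)
    for bounds in (mins, maxs):
        for i in range(len(bounds) - 2, -1, -1):
            if bounds[i] > bounds[i + 1]:
                bounds[i] = bounds[i + 1]
    return mins, maxs

# Example: a1 = 0.30 violates a1 <= a2 and is pulled down to a2 = 0.25.
print(enforce_extremum_consistency([0.30, 0.25, 0.28, 0.31, 0.35],
                                   [0.27, 0.29, 0.32, 0.36, 0.40]))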
The color value grades corresponding to the color value determination model include a plurality of color value grades divided according to color value from high to low; each color value grade includes all feature types, and each feature type has a different feature ratio range in different color value grades.
In the color value determination model, each feature type has a corresponding feature ratio range (a value range of a feature ratio), and each feature type has a feature ratio range matched with the feature type in different color value grades.
Wherein the feature types are, for example, eye length, eye width, nose length, nose width, mouth length, etc.
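Conceptually, the trained model can therefore be viewed as a mapping from each color value grade to a feature ratio range per feature type. The Python sketch below only illustrates that structure; the letter grades follow the A-E example used in this description, and the numeric bounds are invented placeholders rather than trained values.

# Illustrative structure only: grade -> {feature type: (low, high) feature ratio range}.
# The numeric bounds are placeholders, not values learned from real sample images.
COLOR_VALUE_MODEL = {
    "A": {"eye_length": (0.23, 0.27), "nose_width": (0.24, 0.28)},
    "B": {"eye_length": (0.21, 0.25), "nose_width": (0.23, 0.27)},
    "C": {"eye_length": (0.19, 0.23), "nose_width": (0.22, 0.26)},
    "D": {"eye_length": (0.17, 0.21), "nose_width": (0.20, 0.24)},
    "E": {"eye_length": (0.15, 0.19), "nose_width": (0.18, 0.22)},
}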
S12, the electronic equipment extracts a plurality of first feature points in the face image to be judged by using a face dotting technology.
The first feature points refer to points marked on the outer contour of the face and the edges of organs, and a plurality of first feature points and coordinates thereof in the face sample image can be acquired by using a face dotting technology.
And S13, the electronic equipment calculates a plurality of first features according to the coordinates of the plurality of first feature points, wherein the first features comprise the lengths of all parts of the face in the face image to be judged and the distance between some two parts.
Wherein, the first characteristic refers to the length of a certain part or the distance between two parts, such as the length of the eye: 2cm, distance from the bottom of the nose to the mouth: 2 cm. A plurality of the first features may be calculated from the coordinates of the first feature points.
S14, the electronic equipment determines the first feature matched with a preset feature type as a first standard feature from the plurality of first features.
The first standard feature refers to a predetermined feature. The embodiment of the invention selects the face width of the straight line position of the two eyes as the first standard characteristic. Among them, preset feature types such as eye length, eye width, nose length, nose width, mouth length, and the like.
And S15, the electronic device standardizes the salient features in the first features according to the first standard features to obtain a plurality of first feature ratios, wherein the salient features are features with high color discrimination.
Wherein the first feature ratio is a result of a ratio of the significant feature to the first standard feature.
Specifically, reference may be made to the processing method adopted in the model training, and details are not described here.
S16, the electronic device judges the first feature ratios by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged.
The color value judgment result is the color value grade obtained after the first characteristic ratio of the face image to be judged is judged by using the color value judgment model.
Specifically, the determining the plurality of first feature ratios by using a pre-trained color value determination model to obtain a first color value determination result of the face image to be determined includes:
for each first feature ratio, acquiring a first feature type of the first feature ratio;
for each preset color value grade, judging whether the first characteristic ratios belong to the characteristic ratio range matched with the first characteristic type in the color value grade by using a pre-trained color value judgment model;
if the first feature ratios belong to the feature ratio range matched with the first feature type in the color value grade, determining the color value grade to be subjected to grade determination;
and if the grade of the to-be-fixed color value is one, determining that the grade of the to-be-fixed color value is a first color value judgment result of the face image to be judged.
In this optional embodiment, when all the first feature ratios of the facial image to be determined belong to the feature ratio ranges corresponding to the same color value class, the color value class may be determined as the color value class of the facial image to be determined. For example, when a face image is subjected to color value determination, the feature X and the feature Y are selected for determination, and if the determination result is the color value grade a, the following conditions are satisfied: the characteristic ratio of the characteristic X belongs to the range of the characteristic ratio of the characteristic X in the color value grade A; the characteristic ratio of the characteristic Y belongs to the range of the characteristic ratio of the characteristic Y in the color class a. Therefore, it is necessary to determine each first feature ratio of the face image to be determined, and determine whether all the first feature ratios belong to various corresponding feature ratio ranges in the same color value grade. For each first feature ratio, obtaining a feature type of the first feature ratio, namely the first feature type, and then judging whether the plurality of first feature ratios all belong to a feature ratio range matched with the first feature type in each color value grade by using the pre-trained color value judgment model; and if the grade of the color value meeting the requirements is one, determining the grade of the color value to be fixed as a first color value judgment result of the face image to be judged.
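A small Python sketch of this candidate-grade test is given below, under the assumption that the model is organized as a grade-to-ranges mapping like the one sketched earlier; the function and variable names are illustrative, not taken from the patent.

def candidate_grades(feature_ratios, model):
    """feature_ratios: dict feature type -> first feature ratio of the image.
    model: dict grade -> {feature type: (low, high)}; each grade is assumed to
    contain a range for every feature type being tested.
    A grade becomes a to-be-fixed (candidate) grade only if every first
    feature ratio falls inside that grade's matching range."""
    candidates = []
    for grade, ranges in model.items():
        if all(ranges[name][0] <= ratio <= ranges[name][1]
               for name, ratio in feature_ratios.items()):
            candidates.append(grade)
    return candidates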
As an optional implementation, the method further comprises:
if the number of the to-be-fixed color value grades is multiple, sequencing the multiple to-be-fixed color value grades from high to low according to the color value grades to obtain a color value grade sequencing queue;
and determining any one to-be-fixed color value grade at the middle position in the color value grade sorting queue as a first color value judgment result of the to-be-judged face image.
If there are multiple to-be-fixed color value grades, that is, there are preliminarily multiple first color value judgment results, the multiple to-be-fixed color value grades are sorted from high to low according to color value grade to obtain a color value grade sorting queue, and a to-be-fixed color value grade at the middle position of the color value grade sorting queue is determined as the first color value judgment result. If there is only one to-be-fixed color value grade at the middle position, that grade is determined as the first color value judgment result; if there are 2 to-be-fixed color value grades at the middle position, one of them is taken as the first color value judgment result according to a preset rule. For example, in the embodiment of the invention the color value grades are divided into A, B, C, D and E from high to low; if the to-be-fixed color value grades preliminarily obtained by the model are the four grades A, B, C and D, the first color value judgment result is determined to be color value grade B according to the preset rule.
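The selection among multiple candidate grades, together with the preset fallback grade described in the next embodiment, could then look like the following sketch. The A-E ordering, the choice of B for the even ABCD case and the C fallback all come from the examples in this description, while the function itself is only an assumed reading of the "preset rule".

GRADE_ORDER = ["A", "B", "C", "D", "E"]    # color value grades from high to low
DEFAULT_GRADE = "C"                        # preset fallback grade from the example below

def decide_grade(candidates):
    """Return the single candidate if there is one; otherwise sort the
    candidates from high to low and take the middle grade, picking the higher
    of the two middle grades for an even count (e.g. A, B, C, D -> B).
    With no candidates, fall back to the preset default grade."""
    if not candidates:
        return DEFAULT_GRADE
    ordered = sorted(candidates, key=GRADE_ORDER.index)
    return ordered[(len(ordered) - 1) // 2]

print(decide_grade(["A", "B", "C", "D"]))  # B
print(decide_grade(["C"]))                 # C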
As an optional implementation, the method further comprises:
if the plurality of first characteristic ratios do not belong to the characteristic ratio range matched with the first characteristic type in the color value grades, determining that the plurality of first characteristic ratios belong to preset color value grades;
and determining the grade of the preset color value as a first color value judgment result of the face image to be judged.
In this optional embodiment, if none of the first feature ratios falls within the range of the feature ratio matched with the first feature type in the color value class, a preset color value class may be determined as the first color value determination result. For example, in the embodiment of the present invention, the color value ranks are classified into A, B, C, D, E ranks from high to low, and if the determination result for the face image does not satisfy any one of ABCDE ranks, it is determined as the C rank.
And S17, the electronic equipment outputs the first color value judgment result.
In the embodiment of the present invention, after the result of determining the color value of the face image to be determined is obtained, that is, after the first result of determining the color value is obtained, the first result of determining the color value may be output to an interface/page interacting with a user.
In the method flow described in fig. 1, a face image to be determined, which needs to be subjected to color value determination, may be obtained; extracting a plurality of first characteristic points in the face image to be judged by using a face dotting technology; calculating a plurality of first features according to the coordinates of the plurality of first feature points, wherein the first features comprise the length of each part of the face in the face image to be judged and the distance between some two parts; determining the first feature matched with a preset feature type as a first standard feature from a plurality of first features; according to the first standard feature, conducting standardization processing on a significant feature in the first features to obtain a plurality of first feature ratios, wherein the significant feature is a feature with a large color value discrimination; judging the first feature ratios by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged; and outputting the first color value judgment result. Therefore, the first color value judgment result obtained by judging the significant features can better represent the color value of the face image to be judged, and the accuracy of color value judgment can be improved.
The above description is only a specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and it will be apparent to those skilled in the art that modifications may be made without departing from the inventive concept of the present invention, and these modifications are within the scope of the present invention.
Fig. 3 is a functional block diagram of a preferred embodiment of a color value determination apparatus according to the present disclosure.
In some embodiments, the color value determination apparatus operates in an electronic device. The color value determination means may comprise a plurality of functional modules consisting of program code segments. The program code of each program segment in the color determination device may be stored in a memory and executed by at least one processor to perform some or all of the steps of the color determination method described in fig. 1.
In this embodiment, the color value determination device may be divided into a plurality of functional modules according to the functions performed by the color value determination device. The functional module may include: the device comprises an acquisition module 201, an extraction module 202, a calculation module 203, a determination module 204, a processing module 205, a determination module 206 and an output module 207. The module referred to herein is a series of computer program segments capable of being executed by at least one processor and capable of performing a fixed function and is stored in memory. In some embodiments, the functionality of the modules will be described in greater detail in subsequent embodiments.
The acquiring module 201 is configured to acquire a face image to be determined, which needs to be subjected to color value determination.
The face image to be judged refers to a face front image.
The extracting module 202 is configured to extract a plurality of first feature points in the face image to be determined by using a face dotting technique.
The first feature points refer to points marked on the outer contour of the face and the edges of organs, and a plurality of first feature points and coordinates thereof in the face sample image can be acquired by using a face dotting technology.
A calculating module 203, configured to calculate a plurality of first features according to the coordinates of the plurality of first feature points, where the first features include lengths of respective portions of a face in the face image to be determined and a distance between two portions.
Wherein, the first characteristic refers to the length of a certain part or the distance between two parts, such as the length of the eye: 2cm, distance from the bottom of the nose to the mouth: 2 cm. A plurality of the first features may be calculated from the coordinates of the first feature points.
A determining module 204, configured to determine, from the plurality of first features, the first feature that matches a preset feature type as a first standard feature.
The first standard feature refers to a predetermined feature. The embodiment of the invention selects the face width of the straight line position of the two eyes as the first standard characteristic. Among them, preset feature types such as eye length, eye width, nose length, nose width, mouth length, and the like.
The processing module 205 is configured to perform normalization processing on a significant feature in the plurality of first features according to the first standard feature to obtain a plurality of first feature ratios, where the significant feature is a feature with a high color discrimination.
Wherein the first feature ratio is a result of a ratio of the significant feature to the first standard feature.
Specifically, reference may be made to the processing method adopted in the model training, and details are not described here.
The determining module 206 is configured to determine the plurality of first feature ratios by using a pre-trained color value determination model, so as to obtain a first color value determination result of the face image to be determined.
The color value judgment result is the color value grade obtained after the first characteristic ratio of the face image to be judged is judged by using the color value judgment model.
And the output module 207 is used for outputting the first color value judgment result.
In the embodiment of the present invention, after the result of determining the color value of the face image to be determined is obtained, that is, after the first result of determining the color value is obtained, the first result of determining the color value may be output to an interface/page interacting with a user.
As an optional implementation, the determining module 206 includes:
the obtaining submodule is used for obtaining a first feature type of the first feature ratio aiming at each first feature ratio;
the judging submodule is used for judging whether the first characteristic ratios belong to the characteristic ratio range matched with the first characteristic type in the color value grade or not by using a pre-trained color value judging model aiming at each preset color value grade;
the determining submodule is used for determining that the color value grade is the color value grade to be determined if the plurality of first characteristic ratios all belong to the characteristic ratio range matched with the first characteristic type in the color value grade;
the determining submodule is further configured to determine that the grade of the to-be-fixed color value is a first color value determination result of the to-be-determined face image if the grade of the to-be-fixed color value is one.
In this optional embodiment, when all the first feature ratios of the facial image to be determined belong to the feature ratio ranges corresponding to the same color value class, the color value class may be determined as the color value class of the facial image to be determined. For example, when a face image is subjected to color value determination, the feature X and the feature Y are selected for determination, and if the determination result is the color value grade a, the following conditions are satisfied: the characteristic ratio of the characteristic X belongs to the range of the characteristic ratio of the characteristic X in the color value grade A; the characteristic ratio of the characteristic Y belongs to the range of the characteristic ratio of the characteristic Y in the color class a. Therefore, it is necessary to determine each first feature ratio of the face image to be determined, and determine whether all the first feature ratios belong to various corresponding feature ratio ranges in the same color value grade. For each first feature ratio, obtaining a feature type of the first feature ratio, namely the first feature type, and then judging whether the plurality of first feature ratios all belong to a feature ratio range matched with the first feature type in each color value grade by using the pre-trained color value judgment model; and if the grade of the color value meeting the requirements is one, determining the grade of the color value to be fixed as a first color value judgment result of the face image to be judged.
As an optional implementation manner, the determining sub-module is further configured to, if the to-be-fixed color value grades are multiple, sort the multiple to-be-fixed color value grades from high to low according to the color value grades, and obtain a color value grade sorting queue;
the determining submodule is further configured to determine any one of the to-be-determined color value grades at the middle position in the color value grade sorting queue as a first color value determination result of the to-be-determined face image.
If there are multiple to-be-fixed color value grades, that is, there are preliminarily multiple first color value judgment results, the multiple to-be-fixed color value grades are sorted from high to low according to color value grade to obtain a color value grade sorting queue, and a to-be-fixed color value grade at the middle position of the color value grade sorting queue is determined as the first color value judgment result. If there is only one to-be-fixed color value grade at the middle position, that grade is determined as the first color value judgment result; if there are 2 to-be-fixed color value grades at the middle position, one of them is taken as the first color value judgment result according to a preset rule. For example, in the embodiment of the invention the color value grades are divided into A, B, C, D and E from high to low; if the to-be-fixed color value grades preliminarily obtained by the model are the four grades A, B, C and D, the first color value judgment result is determined to be color value grade B according to the preset rule.
As an optional implementation manner, the determining sub-module is further configured to determine that the plurality of first feature ratios belong to a preset color value grade if none of the plurality of first feature ratios belongs to a feature ratio range in the color value grade that matches the first feature type;
the determining submodule is further configured to determine the preset color value grade as a first color value determination result of the face image to be determined.
In this optional embodiment, if none of the first feature ratios falls within the range of the feature ratio matched with the first feature type in the color value class, a preset color value class may be determined as the first color value determination result. For example, in the embodiment of the present invention, the color value ranks are classified into A, B, C, D, E ranks from high to low, and if the determination result for the face image does not satisfy any one of ABCDE ranks, it is determined as the C rank.
As an optional implementation manner, the obtaining module 201 is further configured to obtain a plurality of face sample images that need to be trained;
the extracting module 202 is further configured to extract, for each face sample image, a plurality of second feature points in the face sample image;
the calculating module 203 is further configured to calculate a plurality of second features according to the coordinates of the plurality of second feature points, where the second features include lengths of respective parts of a face in the face sample image and a distance between two parts;
the determining module 204 is further configured to determine, from the plurality of second features, the second feature that matches the preset feature type as a second standard feature;
the processing module 205 is further configured to perform normalization processing on the plurality of second features according to the second standard features to obtain a plurality of second feature ratios;
the color value determination device may further include:
the selection module is used for selecting the significant features with high color value discrimination from the second features according to the distribution condition of the second feature ratios;
and the construction module is used for constructing a color value judgment model according to the remarkable features in the plurality of second features.
The face sample images are face images prepared in advance; each face sample image carries color value grade information, which is labeled in advance according to popular aesthetic standards.
The second feature points refer to points marked on the outer contour of the face and the edges of organs, and a plurality of second feature points and coordinates thereof in the face sample image can be acquired by using a pre-trained face dotting technology.
Wherein the second feature includes the length of each part of the face in the face sample image and the distance between two parts, such as the length of the eye: 2cm, distance from the bottom of the nose to the mouth: 2 cm.
Wherein the second feature ratio is a result of a ratio of the second feature to the second standard feature.
In this optional embodiment, feature points are extracted from the prepared face sample images, the length of each part and the distance between two parts in each face sample image are calculated according to the coordinates of the feature points, and the lengths and/or distances are then normalized. The normalization converts every length and/or distance into a ratio to the standard feature (i.e., a feature ratio), so that all faces are scaled to a scale in which the standard feature is 1 unit; different faces can then be compared with each other, and features of the same face can also be compared with each other. In addition, since the differences between face measurements are small, the data precision needs to be at least 6 digits after the decimal point. For example, in the embodiment of the present invention, the face width along the straight line through the two eyes is selected as the standard feature; in a face sample image, if the face width is 10cm and the eye length is 2cm, the feature ratio obtained after normalizing the eye length feature is 0.200000.
After the feature ratios of the face sample images are obtained, the distribution of each feature ratio across different color value grades is observed, and the feature types whose feature ratios best discriminate between different color value grades are identified. The embodiment of the invention generates box plots of the various feature ratios at different color value grades according to the feature ratios and color value grades of the face sample images, and determines from the box plots a plurality of significant features with large discrimination between color value grades: the middle-court length (the middle third of the face), the distance between the two eyes, the eye length, the nose width, the eye width and the like. The significant features are then trained to obtain the color value determination model. A box plot is composed of rectangular boxes generated from the feature ratios at each color value grade.
Fig. 2 is an exemplary diagram of a face image disclosed in the present invention. As shown in fig. 2, a face dotting technique may be used to extract 71 points from the face image shown in fig. 2, and the position coordinates of each feature point may be recorded at the same time. The distances and lengths of the "three courts and five eyes" (the classical rule dividing the face into three vertical sections and five eye-widths) can then be calculated; for example, the length of "five-eye segment 1" is the distance from point 13 to point 17 in the face image shown in fig. 2. Distances that are generally judged to have a large influence on the color value, such as the eye length, eye width, nose width, mouth width and middle-court length, are calculated in the same way.
Among these, 9 significant features may be selected, such as the vertical distance from point 6 to the line through points 51-52 (i.e., the lower-court length), the vertical distance from the line through points 26-39 to the line through points 51-52 (i.e., the middle-court length), the length from point 30 to point 17 (i.e., the distance between the eyes), the length from point 10 to point 2 (i.e., the face width), the length from point 11 to point 1 (i.e., the face width), the vertical distance from the line through points 58-62 to the line through points 51-52 (i.e., the distance between the mouth and the nose), the length from point 50 to point 53 (i.e., the width of the nose), the length from point 15 to point 19 (i.e., the width of the eyes), and the length from point 30 to point 34 (i.e., the length of the eyes).
As an optional implementation manner, the constructing module constructs, according to the significant feature in the plurality of second features, a color value determination model in a specific manner as follows:
learning salient features of the plurality of second features;
determining a plurality of color value grades corresponding to the significant features in the plurality of second features and a feature ratio range corresponding to each color value grade;
judging whether the characteristic ratio ranges of different color value grades accord with extreme value consistency or not;
and if the characteristic ratio ranges of different color value grades accord with the consistency of extreme values, generating a color value judgment model according to the significant characteristics in the second characteristics, the color value grades and the characteristic ratio range corresponding to each color value grade.
The color value grades are pre-divided into a plurality of grades, the type and number of the color value grades are pre-defined, for example, the color value grades may be divided into 5 grades, wherein the color value grades may be divided into A, B, C, D, E five grades according to the color value from high to low, wherein the color value grades may be divided into more grades or less grades according to the color value from high to low, A, B, C, D, E characters are only pre-defined to identify different color value grades, and other characters may also be used to identify different color value grades, which is not specifically limited in the embodiment of the present invention.
In the embodiment of the present invention, when the salient features are trained, the feature ratio ranges corresponding to the salient features in different color value grades need to be determined according to the maximum value and the minimum value of the box type graphs corresponding to the salient features in different color value grades. After determining the feature ratio ranges corresponding to the salient features at different color grades, it is necessary to determine whether the feature ratio ranges meet the extremum consistency, for example, the feature ratio ranges corresponding to one salient feature at five color grades are [ a1, b1], [ a2, b2], [ a3, b3], [ a4, b4], [ a5, b5], if the color grade is monotonically increasing on the salient feature, that is, if the color grade is higher, the maximum value and the minimum value of the feature ratio corresponding to the salient feature are larger, and if the feature ratio ranges meet a1<, a2<, > a3<, > a4<, a5, b1<, b2<, b3<, b4<, > b 5. At this time, the feature ratio ranges of different color value grades can be determined to accord with the extreme value consistency. And generating a color value judgment model according to the significant features in the second features, the color value grades and the feature ratio range corresponding to each color value grade.
Alternatively, if the feature ratio ranges of the different color value grades do not meet extremum consistency, the offending feature ratio range needs to be adjusted. For example, with the five feature ratio ranges [a1, b1] to [a5, b5] of the above example and the color value grade monotonically increasing on the salient feature, if a bound does not satisfy a1 ≤ a2 ≤ a3 ≤ a4 ≤ a5 and b1 ≤ b2 ≤ b3 ≤ b4 ≤ b5, that bound needs to be changed to the corresponding bound of the next grade; for example, if a1 > a2, the value of a1 needs to be changed to a2, so that a1 ≤ a2 ≤ a3 ≤ a4 ≤ a5 holds.
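A minimal sketch of this training step is given below, assuming that the feature ratio range at each grade is taken from the box-plot whisker extremes of that grade's samples and that a bound violating extremum consistency is replaced by the corresponding bound of the next grade, as in the example above. The function names, the 1.5 × IQR whisker rule and the grade ordering are illustrative assumptions, not details fixed by the patent; for a feature on which the grade decreases monotonically, the same check would be applied with the inequalities reversed.

import numpy as np

def box_plot_range(values):
    """Minimum and maximum of a box plot (whisker ends under the 1.5 * IQR rule)."""
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    lo = min(v for v in values if v >= q1 - 1.5 * iqr)
    hi = max(v for v in values if v <= q3 + 1.5 * iqr)
    return lo, hi

def ranges_per_grade(ratios_by_grade, grades_low_to_high):
    """Feature ratio range of one salient feature at each grade, lowest grade first."""
    return [list(box_plot_range(ratios_by_grade[g])) for g in grades_low_to_high]

def enforce_extremum_consistency(ranges):
    """Make a1 <= a2 <= ... and b1 <= b2 <= ... hold for a feature on which the
    color value grade is monotonically increasing, by changing any offending
    bound to the corresponding bound of the next grade."""
    for i in reversed(range(len(ranges) - 1)):
        ranges[i][0] = min(ranges[i][0], ranges[i + 1][0])  # e.g. if a1 > a2, set a1 = a2
        ranges[i][1] = min(ranges[i][1], ranges[i + 1][1])
    return [tuple(r) for r in ranges]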
As an optional implementation manner, the color value determination model includes a plurality of color value grades divided according to the level of color value; each color value grade covers all feature types, and the feature ratio range corresponding to a given feature type differs between different color value grades.
In the color value determination model, each feature type has a corresponding feature ratio range (a value range of the feature ratio), and at each color value grade each feature type has a feature ratio range matched with it.
Wherein the feature types are, for example, eye length, eye width, nose length, nose width, mouth length, etc.
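The structure described above can be pictured as a nested mapping from color value grade to feature type to feature ratio range. The grade labels, feature type names and numeric bounds in the sketch below are made-up placeholders for illustration, not values taken from the patent.

# Illustrative model structure: grade -> feature type -> (lower bound, upper bound).
# Grades D and E are omitted for brevity; all values are placeholders.
COLOR_VALUE_MODEL = {
    "A": {"eye_length": (0.32, 0.40), "nose_width": (0.24, 0.30), "mouth_length": (0.45, 0.55)},
    "B": {"eye_length": (0.30, 0.38), "nose_width": (0.23, 0.29), "mouth_length": (0.43, 0.53)},
    "C": {"eye_length": (0.28, 0.36), "nose_width": (0.22, 0.28), "mouth_length": (0.41, 0.51)},
}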
In the color value determination apparatus depicted in fig. 3, a face image to be judged, which needs to be subjected to color value judgment, may be acquired; a plurality of first feature points in the face image to be judged are extracted by using a face dotting technology; a plurality of first features are calculated according to the coordinates of the plurality of first feature points, wherein the first features include the length of each part of the face in the face image to be judged and the distance between two parts; the first features matched with preset feature types are determined from the plurality of first features as first standard features; the salient features among the first features are standardized according to the first standard features to obtain a plurality of first feature ratios, wherein a salient feature is a feature with high color value discrimination; the plurality of first feature ratios are judged by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged; and the first color value judgment result is output. Therefore, the first color value judgment result obtained by judging the salient features can better represent the color value of the face image to be judged, and the accuracy of color value judgment can be improved.
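The following sketch illustrates how the judgment stage could be carried out against a model of the kind shown earlier: the salient measurements are first normalized by the first standard feature to obtain feature ratios, every grade whose matched ratio ranges contain the ratios is collected as a candidate, and the result is resolved as in the description above (a single candidate is returned directly, a candidate at the middle position of the high-to-low ordering is returned when several match, and a preset grade is returned when none match). All function and variable names, and the preset fallback grade, are assumptions for illustration.

def normalize(salient_values, standard_value):
    """Divide each salient measurement by the first standard feature to get feature ratios."""
    return {name: value / standard_value for name, value in salient_values.items()}

def judge_color_value(feature_ratios, model, grades_high_to_low, preset_grade="C"):
    """Return the first color value judgment result for one face image."""
    candidates = []
    for grade in grades_high_to_low:
        ranges = model[grade]
        # A grade becomes a candidate only if every first feature ratio lies inside
        # the feature ratio range matched with its feature type at that grade.
        if all(ranges[name][0] <= ratio <= ranges[name][1]
               for name, ratio in feature_ratios.items()):
            candidates.append(grade)
    if not candidates:
        return preset_grade                       # no grade matched: fall back to the preset grade
    if len(candidates) == 1:
        return candidates[0]                      # exactly one candidate grade
    ordered = sorted(candidates, key=grades_high_to_low.index)
    return ordered[len(ordered) // 2]             # a candidate at the middle position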
Fig. 4 is a schematic structural diagram of an electronic device implementing a color value determination method according to a preferred embodiment of the invention. The electronic device 3 comprises a memory 31, at least one processor 32, a computer program 33 stored in the memory 31 and executable on the at least one processor 32, and at least one communication bus 34.
Those skilled in the art will appreciate that the schematic diagram shown in fig. 4 is merely an example of the electronic device 3 and does not constitute a limitation of the electronic device 3, which may include more or fewer components than those shown, combine some components, or have different components; for example, the electronic device 3 may further include an input/output device, a network access device, and the like.
The at least one processor 32 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like. The processor 32 may be a microprocessor or any other conventional processor, and is the control center of the electronic device 3, connecting the various parts of the whole electronic device 3 through various interfaces and lines.
The memory 31 may be used to store the computer program 33 and/or the modules/units, and the processor 32 may implement various functions of the electronic device 3 by running or executing the computer program and/or modules/units stored in the memory 31 and calling data stored in the memory 31. The memory 31 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data) created according to the use of the electronic device 3, and the like. Further, the memory 31 may include a non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
In conjunction with fig. 1, the memory 31 in the electronic device 3 stores a plurality of instructions to implement a color value determination method, and the processor 32 executes the plurality of instructions to implement:
acquiring a face image to be judged, which needs to be subjected to color value judgment;
extracting a plurality of first characteristic points in the face image to be judged by using a face dotting technology;
calculating a plurality of first features according to the coordinates of the plurality of first feature points, wherein the first features comprise the length of each part of the face in the face image to be judged and the distance between two parts;
determining the first feature matched with a preset feature type as a first standard feature from a plurality of first features;
according to the first standard feature, conducting standardization processing on a salient feature in the first features to obtain a plurality of first feature ratios, wherein the salient feature is a feature with high color value discrimination;
judging the first feature ratios by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged;
and outputting the first color value judgment result.
Specifically, the processor 32 may refer to the description of the relevant steps in the embodiment corresponding to fig. 1 for a specific implementation method of the instruction, which is not described herein again.
In the electronic device 3 depicted in fig. 4, a face image to be judged, which needs to be subjected to color value judgment, may be acquired; a plurality of first feature points in the face image to be judged are extracted by using a face dotting technology; a plurality of first features are calculated according to the coordinates of the plurality of first feature points, wherein the first features include the length of each part of the face in the face image to be judged and the distance between two parts; the first features matched with preset feature types are determined from the plurality of first features as first standard features; the salient features among the first features are standardized according to the first standard features to obtain a plurality of first feature ratios, wherein a salient feature is a feature with high color value discrimination; the plurality of first feature ratios are judged by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged; and the first color value judgment result is output. Therefore, the first color value judgment result obtained by judging the salient features can better represent the color value of the face image to be judged, and the accuracy of color value judgment can be improved.
The integrated modules/units of the electronic device 3 may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying said computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, and a Read-Only Memory (ROM).
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the system claims may also be implemented by one unit or means through software or hardware. The terms first, second, and the like are used to denote names and do not denote any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. A color value determination method, the method comprising:
acquiring a face image to be judged, which needs to be subjected to color value judgment;
extracting a plurality of first characteristic points in the face image to be judged by using a face dotting technology;
calculating a plurality of first features according to the coordinates of the plurality of first feature points, wherein the first features comprise the length of each part of the face in the face image to be judged and the distance between two parts;
determining the first feature matched with a preset feature type as a first standard feature from a plurality of first features;
according to the first standard feature, conducting standardization processing on a salient feature in the first features to obtain a plurality of first feature ratios, wherein the salient feature is a feature with high color value discrimination;
judging the first feature ratios by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged;
and outputting the first color value judgment result.
2. The method according to claim 1, wherein the judging the plurality of first feature ratios by using a pre-trained color value judgment model to obtain a first color value judgment result of the face image to be judged comprises:
for each first feature ratio, acquiring a first feature type of the first feature ratio;
for each preset color value grade, judging, by using the pre-trained color value judgment model, whether the plurality of first feature ratios belong to the feature ratio ranges matched with the first feature types at the color value grade;
if the plurality of first feature ratios belong to the feature ratio ranges matched with the first feature types at the color value grade, determining the color value grade as a candidate color value grade;
and if there is one candidate color value grade, determining that candidate color value grade as the first color value judgment result of the face image to be judged.
3. The method of claim 2, further comprising:
if there are a plurality of candidate color value grades, sorting the plurality of candidate color value grades from high to low according to color value grade to obtain a color value grade sorting queue;
and determining any one candidate color value grade at the middle position of the color value grade sorting queue as the first color value judgment result of the face image to be judged.
4. The method of claim 2, further comprising:
if the plurality of first feature ratios do not belong to the feature ratio ranges matched with the first feature types at any of the color value grades, determining that the plurality of first feature ratios belong to a preset color value grade;
and determining the preset color value grade as the first color value judgment result of the face image to be judged.
5. The method according to any one of claims 1 to 4, further comprising:
acquiring a plurality of face sample images required for training;
extracting a plurality of second feature points in the face sample image aiming at each face sample image;
calculating a plurality of second features according to the coordinates of the plurality of second feature points, wherein the second features comprise the length of each part of the face in the face sample image and the distance between two parts;
determining the second feature matched with the preset feature type as a second standard feature from the plurality of second features;
according to the second standard features, carrying out standardization processing on the plurality of second features to obtain a plurality of second feature ratios;
selecting a salient feature with high color value discrimination from the plurality of second features according to the distribution condition of the plurality of second feature ratios;
and constructing a color value judgment model according to the salient features in the plurality of second features.
6. The method of claim 5, wherein the constructing a color value judgment model according to the salient features in the plurality of second features comprises:
learning the salient features in the plurality of second features;
determining a plurality of color value grades corresponding to the salient features in the plurality of second features and a feature ratio range corresponding to each color value grade;
judging whether the feature ratio ranges of different color value grades meet extremum consistency;
and if the feature ratio ranges of different color value grades meet extremum consistency, generating the color value judgment model according to the salient features in the second features, the color value grades and the feature ratio range corresponding to each color value grade.
7. The method according to claim 6, wherein the color value judgment model includes a plurality of color value grades divided according to color value, each color value grade includes all feature types, and the feature ratio range corresponding to a given feature type differs between different color value grades.
8. A color value determination device, characterized by comprising:
the acquisition module is used for acquiring a face image to be judged, which needs to be subjected to color value judgment;
the extraction module is used for extracting a plurality of first characteristic points in the face image to be judged by using a face dotting technology;
the calculation module is used for calculating a plurality of first features according to the coordinates of the plurality of first feature points, wherein the first features comprise the length of each part of the face in the face image to be judged and the distance between two parts;
the determining module is used for determining the first feature matched with a preset feature type from a plurality of first features as a first standard feature;
the processing module is used for carrying out standardization processing on the salient features in the first features according to the first standard features to obtain a plurality of first feature ratios, wherein the salient features are features with high color value discrimination;
the judging module is used for judging the first characteristic ratios by using a pre-trained color value judging model to obtain a first color value judging result of the face image to be judged;
and the output module is used for outputting the first color value judgment result.
9. An electronic device, characterized in that the electronic device comprises a processor and a memory, the processor being configured to execute a computer program stored in the memory to implement the color value determination method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing at least one instruction which, when executed by a processor, implements a color value determination method according to any one of claims 1 to 7.
CN201910901612.0A 2019-09-23 2019-09-23 Color value judging method and device, electronic equipment and storage medium Active CN110874567B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910901612.0A CN110874567B (en) 2019-09-23 2019-09-23 Color value judging method and device, electronic equipment and storage medium
PCT/CN2020/093341 WO2021057063A1 (en) 2019-09-23 2020-05-29 Facial attractiveness determining method and apparatus, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910901612.0A CN110874567B (en) 2019-09-23 2019-09-23 Color value judging method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110874567A true CN110874567A (en) 2020-03-10
CN110874567B CN110874567B (en) 2024-01-09

Family

ID=69718056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910901612.0A Active CN110874567B (en) 2019-09-23 2019-09-23 Color value judging method and device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN110874567B (en)
WO (1) WO2021057063A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021057063A1 (en) * 2019-09-23 2021-04-01 平安科技(深圳)有限公司 Facial attractiveness determining method and apparatus, electronic device, and storage medium


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101604377A (en) * 2009-07-10 2009-12-16 华南理工大学 A kind of facial beauty classification method that adopts computing machine to carry out woman image
US9626597B2 (en) * 2013-05-09 2017-04-18 Tencent Technology (Shenzhen) Company Limited Systems and methods for facial age identification
CN108764334A (en) * 2018-05-28 2018-11-06 北京达佳互联信息技术有限公司 Facial image face value judgment method, device, computer equipment and storage medium
CN110874567B (en) * 2019-09-23 2024-01-09 平安科技(深圳)有限公司 Color value judging method and device, electronic equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107169408A (en) * 2017-03-31 2017-09-15 北京奇艺世纪科技有限公司 A kind of face value decision method and device
WO2019061203A1 (en) * 2017-09-28 2019-04-04 深圳传音通讯有限公司 Method for acquiring change in facial attractiveness score, and terminal
CN108629336A (en) * 2018-06-05 2018-10-09 北京千搜科技有限公司 Face value calculating method based on human face characteristic point identification
CN109657539A (en) * 2018-11-05 2019-04-19 深圳前海达闼云端智能科技有限公司 Face value evaluation method and device, readable storage medium and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN LIANGREN: "Research on Facial Attractiveness Computation Based on Deep Convolutional Neural Networks", China Master's Theses Full-text Database, Information Science and Technology, No. 4, page 31 *


Also Published As

Publication number Publication date
WO2021057063A1 (en) 2021-04-01
CN110874567B (en) 2024-01-09


Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40017502; Country of ref document: HK)
SE01 Entry into force of request for substantive examination
GR01 Patent grant