CN116580435A - Data processing method and device for face value scoring - Google Patents

Data processing method and device for face value scoring

Info

Publication number
CN116580435A
CN116580435A
Authority
CN
China
Prior art keywords
face
data
image
social
scoring
Prior art date
Legal status
Pending
Application number
CN202310468458.9A
Other languages
Chinese (zh)
Inventor
杨扬
Current Assignee
Shenzhen Aibili Technology Co ltd
Original Assignee
Shenzhen Aibili Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Aibili Technology Co ltd
Priority to CN202310468458.9A
Publication of CN116580435A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/26 - Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; proximity measures in feature spaces
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; localisation; normalisation
    • G06V40/168 - Feature extraction; face representation
    • G06V40/172 - Classification, e.g. identification
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT]
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a data processing method and device for face value (facial attractiveness) scoring. The method comprises: acquiring an image of a user in a social platform, and performing image extraction processing based on face extraction on the image to be processed to obtain a process face image; performing feature extraction processing on the process face image to obtain face feature data; and performing evaluation processing on the face feature data based on a preset face value scoring model to obtain target face value scoring data, wherein the target face value scoring data represent the face value score of the social user. In the application, the face image is extracted from the social user's image, facial features are extracted from the extracted face image, and the features are scored by a face value scoring model to obtain face value scoring data, thereby realizing face value scoring of social users. This solves the technical problem that social matching recommendation in the prior art lacks face value recommendation, and achieves the technical effect of diversifying social matching recommendation.

Description

Data processing method and device for face value scoring
Technical Field
The application relates to the field of computers, and in particular to a data processing method and device for face value scoring.
Background
With the continuous development of information technology, Internet social recommendation has diversified. Existing Internet social recommendation is mainly based on the social attributes of social users. As social users increasingly demand face value (appearance) based recommendation, platforms provide users' face images on the social platform, but users have to select among the face images themselves: there is no scheme for computing a face value score from a face image, and having users select images to match makes the platform's social matching recommendation inefficient.
Therefore, aiming at the problem that social matching recommendation in the prior art lacks face value recommendation, the application provides a data processing method for face value scoring.
Disclosure of Invention
The application mainly aims to provide a data processing method and device for face value scoring, so as to solve the technical problem that social matching recommendation in the prior art lacks face value recommendation, and to achieve the technical effect of diversifying social matching recommendation.
In order to achieve the above object, according to a first aspect of the present application, there is provided a data processing method for face value scoring, applied to a social platform so as to score face values in social matching calculation, the data processing method comprising:
acquiring an image to be processed, wherein the image to be processed is an image of a user in a social platform;
carrying out image extraction processing based on face extraction on the image to be processed to obtain a process face image;
performing feature extraction processing on the process face image to obtain face feature data, wherein the face feature data is data for representing feature points of the process face image;
and carrying out evaluation processing on the face characteristic data based on a preset face value scoring model to obtain target face value scoring data, wherein the target face value scoring data is data for representing the face value scoring of the social user.
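The four steps of the first aspect can be sketched as a simple pipeline. Each stage below is a hypothetical stub standing in for the processing the application describes; none of the function names or field names come from the application itself:

```python
def acquire_image(user_id):
    """Stand-in for fetching the user's image from the social platform."""
    return {"user": user_id, "pixels": "..."}

def extract_face(image):
    """Stand-in for image extraction processing based on face extraction."""
    return {"user": image["user"], "face": True}

def extract_features(face_image):
    """Stand-in for feature extraction from the process face image."""
    return {"user": face_image["user"], "features": [0.1, 0.2]}

def score_face(features):
    """Stand-in for evaluation with a preset face value scoring model."""
    return {"user": features["user"], "face_score": 80.0}

# The four steps composed end to end for one user.
result = score_face(extract_features(extract_face(acquire_image("user-001"))))
```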
Optionally, performing evaluation processing on the face feature data based on the preset face value scoring model to obtain the target face value scoring data includes:
performing recognition processing on the face feature data based on a plurality of preset face value decision features to obtain a plurality of process face value decision feature data, wherein the plurality of process face value decision feature data represent the plurality of preset face value decision features of the social user;
performing face value scoring processing on each process face value decision feature datum based on its corresponding preset face value decision feature to obtain a plurality of process face value scoring data, wherein the plurality of process face value scoring data represent face value scores for the plurality of preset face value decision features of the social user;
and determining the target face value scoring data according to the plurality of process face value scoring data.
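The per-feature scoring and aggregation described in the optional steps above can be sketched in miniature. The decision features and the unweighted-mean combination rule below are illustrative assumptions; the application does not specify a particular combination function:

```python
def aggregate_face_score(feature_scores, weights=None):
    """Combine per-decision-feature scores into a single target
    face value score; defaults to an unweighted mean."""
    names = list(feature_scores)
    if weights is None:
        weights = {name: 1.0 for name in names}
    total_weight = sum(weights[n] for n in names)
    return sum(feature_scores[n] * weights[n] for n in names) / total_weight

# Hypothetical decision features of a social user's face image.
scores = {"symmetry": 80.0, "skin": 70.0, "proportion": 90.0}
target = aggregate_face_score(scores)  # -> 80.0
```

A weighted variant simply passes a `weights` mapping, which is one natural way a preset scoring model could prioritize some decision features over others.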
Optionally, performing evaluation processing on the face feature data based on the preset face value scoring model to obtain the target face value scoring data includes:
performing recognition processing on the face feature data based on a preset face value decision feature to obtain process face value decision feature data, wherein the process face value decision feature data represent the preset face value decision feature corresponding to the face image of the social user;
matching a face value scoring model corresponding to the preset face value decision feature in a preset face value scoring model database to obtain a process face value scoring model, wherein the process face value scoring model is the face value scoring model corresponding to the preset face value decision feature;
performing face value scoring processing on the process face value decision feature data based on the process face value scoring model to obtain process face value scoring data, wherein the process face value scoring data represent the face value score of the social user for the corresponding preset face value decision feature;
and determining the target face value scoring data according to the process face value scoring data.
Optionally, performing feature extraction processing on the process face image to obtain the face feature data includes:
performing face feature point extraction on the process face image to obtain face feature point data, wherein the face feature point data comprise data representing the face feature points of the social user and the connection relations between those feature points;
performing normalization processing on the face feature point data based on preset feature classes to obtain process face feature point data, wherein the process face feature point data are the face feature point data after normalization over the preset feature classes;
and performing data correction processing on the process face feature point data to obtain the face feature data.
Optionally, performing image extraction processing based on face extraction on the image to be processed to obtain the process face image includes:
performing grayscale preprocessing on the image to be processed to obtain a preprocessed image, wherein the preprocessed image is the image after grayscale processing;
and performing face extraction processing on the preprocessed image based on a face classifier model to obtain the process face image, wherein the process face image is an image representing the face of the social user.
Optionally, performing face extraction processing based on a face classifier model on the preprocessed image, and obtaining the process face image includes:
performing first image screening processing on the preprocessed image based on a first preset face classification rule to obtain a first face image, wherein the first face image is the preprocessed image meeting the first preset face classification rule;
performing second image screening processing on the first face image based on a second preset face classification rule to obtain a second face image, wherein the second face image is a first face image meeting the second preset face classification rule;
and carrying out face image segmentation processing on the second face image to obtain the process face image.
Optionally, after performing an evaluation process based on a preset face value scoring model on the face feature data to obtain target face value scoring data, the data processing method further includes:
acquiring social user data, wherein the social user data is social data of users in a social platform;
performing feature extraction processing based on social labels on the social user data to obtain social label data, wherein the social label data is data for representing social labels of social users;
and updating a social request database with the target face value scoring data and the social label data, keyed by identity feature data, so as to enable matching processing of social request data.
According to a second aspect of the present application, there is provided a data processing apparatus for face value scoring, applied to a social platform so as to score face values in social matching calculation, the data processing apparatus comprising:
the data acquisition module is used for acquiring an image to be processed, wherein the image to be processed is an image of a user in the social platform;
the image extraction module is used for carrying out image extraction processing based on face extraction on the image to be processed to obtain a process face image;
the feature extraction module is used for carrying out feature extraction processing on the process face image to obtain face feature data, wherein the face feature data is data for representing feature points of the process face image;
and the scoring module is used for performing evaluation processing on the face characteristic data based on a preset face value scoring model to obtain target face value scoring data, wherein the target face value scoring data is data for representing the face value scoring of the social user.
According to a third aspect of the present application, there is provided a computer-readable storage medium storing computer instructions for causing a computer to execute the above data processing method for face value scoring.
According to a fourth aspect of the present application, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor to cause the at least one processor to perform the above data processing method for face value scoring.
The technical scheme provided by the embodiment of the application can comprise the following beneficial effects:
according to the method, the image of the user in the social platform is obtained, and the image to be processed is subjected to image extraction processing based on face extraction, so that a process face image is obtained; performing feature extraction processing on the process face image to obtain face feature data, wherein the face feature data is data for representing feature points of the process face image; and carrying out evaluation processing on the face characteristic data based on a preset face value scoring model to obtain target face value scoring data, wherein the target face value scoring data is data for representing the face value scoring of the social user. According to the application, the face image is extracted from the image of the social user, the face feature is extracted from the extracted face image, the face feature is subjected to face scoring processing through a face scoring model, so that face scoring data is obtained, and the face scoring of the social user is realized. The technical problem that the social matching recommendation lacks color value recommendation in the prior art is solved, and the technical effect of improving social matching recommendation diversification is achieved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application, are incorporated in and constitute a part of this specification. The drawings and their description are illustrative of the application and are not to be construed as unduly limiting the application. In the drawings:
FIG. 1 is a flowchart of a data processing method for face value scoring according to the present application;
FIG. 2 is a flowchart of a data processing method for face value scoring according to the present application;
FIG. 3 is a flowchart of a data processing method for face value scoring according to the present application;
FIG. 4 is a schematic diagram of a data processing apparatus for face value scoring according to the present application.
Detailed Description
In order that those skilled in the art will better understand the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe the embodiments of the application herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the present application, the terms "upper", "lower", "left", "right", "front", "rear", "top", "bottom", "inner", "outer", "middle", "vertical", "horizontal", "lateral", "longitudinal" and the like indicate an azimuth or a positional relationship based on that shown in the drawings. These terms are only used to better describe the present application and its embodiments and are not intended to limit the scope of the indicated devices, elements or components to the particular orientations or to configure and operate in the particular orientations.
Also, some of the terms described above may be used to indicate other meanings in addition to orientation or positional relationships, for example, the term "upper" may also be used to indicate some sort of attachment or connection in some cases. The specific meaning of these terms in the present application will be understood by those of ordinary skill in the art according to the specific circumstances.
Furthermore, the terms "mounted," "configured," "provided," "connected," "coupled," and "sleeved" are to be construed broadly. For example, "connected" may be in a fixed connection, a removable connection, or a unitary construction; may be a mechanical connection, or an electrical connection; may be directly connected, or indirectly connected through intervening media, or may be in internal communication between two devices, elements, or components. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
In existing social platforms, social matching recommendation is performed on users' social attribute data, such as age, height, occupation and hobbies, to realize Internet social contact. As social users increasingly demand appearance-based matching, they need diversified social matching recommendation. In the prior art, social users mainly select from user images provided by the platform and then socialize; the platform lacks a scheme for computing face value scores from social users' face images, and its social matching recommendation efficiency is low.
Aiming at the problem that social matching recommendation in the prior art lacks face value recommendation, the embodiment of the application provides a data processing method for face value scoring.
In an alternative embodiment of the present application, a data processing method for face value scoring is provided. Fig. 1 is a flowchart of the data processing method for face value scoring provided in the present application; as shown in fig. 1, the method includes the following steps:
s101: acquiring an image to be processed;
The image to be processed is an image of a user in the social platform. The social platform stores images uploaded by users, and a social user's image is acquired for face value scoring processing.
S102: carrying out image extraction processing based on face extraction on an image to be processed to obtain a process face image;
In another alternative embodiment of the present application, a data processing method for face value scoring is provided. Fig. 2 is a flowchart of the data processing method for face value scoring provided in the present application; as shown in fig. 2, the method includes the following steps:
S201: performing grayscale preprocessing on the image to be processed to obtain a preprocessed image;
The preprocessed image is the image after grayscale processing;
s202: and carrying out face extraction processing on the preprocessed image based on a face classifier model to obtain a process face image.
The process face image is an image representing the face of the social user.
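The grayscale preprocessing in S201 can be illustrated as follows. This sketch assumes the common BT.601 luminance weights; the application does not mandate a particular conversion formula, and in practice a library routine (for example OpenCV's color conversion) would typically be used:

```python
def to_grayscale(image_rgb):
    """Convert an H x W image of (R, G, B) tuples to an H x W
    grayscale image using BT.601 luminance weights (an assumption;
    the application does not specify a formula)."""
    return [
        [round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
        for row in image_rgb
    ]

# A 1 x 2 test image: one white pixel, one black pixel.
image = [[(255, 255, 255), (0, 0, 0)]]
gray = to_grayscale(image)  # -> [[255, 0]]
```

Reducing the image to a single channel before face detection cuts the data volume the classifier must scan, which matches the efficiency motivation stated below.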
According to another alternative embodiment of the present application, there is provided a data processing method for face value scoring, comprising:
performing first image screening processing on the preprocessed image based on a first preset face classification rule to obtain a first face image, wherein the first face image is a preprocessed image satisfying the first preset face classification rule; performing second image screening processing on the first face image based on a second preset face classification rule to obtain a second face image, wherein the second face image is a first face image satisfying the second preset face classification rule; and performing face image segmentation processing on the second face image to obtain the process face image.
For example, suppose the first preset face classification rule is whether any face exists in the preprocessed image, and the second preset face classification rule is whether exactly one face exists. The preprocessed image is first recognized under the first rule: if a face exists, the image proceeds to screening under the second rule; if no face exists, the image is not processed further. The image containing a face is then screened under the second rule: if a single face exists, the process face image is obtained; otherwise the image is not processed further.
In another alternative embodiment of the application, the basic face image is obtained with a Haar cascade classifier. As a cascade of strong classifiers built on Haar-like features, the Haar classifier can quickly and effectively delineate a face image. The preprocessed image undergoes a first classification stage; if it does not satisfy the first image classification rule, it is rejected. If it satisfies the first rule, a second classification stage is applied; if it does not satisfy the second rule, it is rejected. If it satisfies the second rule, further classification stages are applied in turn until the classifier's target image is obtained.
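The stage-by-stage rejection behavior of such a cascade (and of the two preset face classification rules above) can be sketched in miniature. The stage predicates below are stand-ins for learned Haar stages, not the real classifier:

```python
def cascade_filter(image, stages):
    """Pass an image through classifier stages in order; reject
    (return None) at the first stage that fails, and return the
    image only if every stage accepts it. This mirrors how a Haar
    cascade discards non-face windows cheaply at early stages."""
    for stage in stages:
        if not stage(image):
            return None
    return image

# Hypothetical stage predicates standing in for learned Haar stages:
# first rule: at least one face present; second rule: exactly one face.
has_face = lambda img: img.get("faces", 0) >= 1
single_face = lambda img: img.get("faces", 0) == 1

accepted = cascade_filter({"faces": 1}, [has_face, single_face])  # passes both stages
rejected = cascade_filter({"faces": 2}, [has_face, single_face])  # fails second stage
```

In a real implementation the stages would be the trained stages of, for example, an OpenCV `CascadeClassifier`, but the early-exit control flow is the same.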
In the embodiment of the application, extracting the face image from the user image reduces the volume of image data to be processed in the face value scoring process, thereby achieving the technical effect of improving face value scoring efficiency.
S103: carrying out feature extraction processing on the process face image to obtain face feature data;
the face characteristic data are data used for representing the characteristic points of the face image in the process;
in another alternative embodiment of the present application, there is provided a data processing method for color value scoring, including:
extracting and processing the process face image based on face feature points to obtain face feature point data, wherein the face feature point data comprises data used for representing the connection relation between the face feature points of the social user and the feature points; carrying out normalization processing on the process face feature point data based on preset feature classes to obtain process face feature point data, wherein the process feature point data is face feature point data subjected to normalization processing of the preset feature classes; and carrying out data correction processing on the face characteristic point data to obtain the face characteristic data.
In another optional embodiment of the application, the ASM-68 algorithm based on the point distribution model is used to intercept the face and mark and connect 68 feature points on the image, thereby extracting the face feature points of the face image; feature vectors of the face image are extracted through HOG (histogram of oriented gradients), capturing the connection relations of feature points in the process face image; feature normalization is performed with the L2-norm function to reduce the influence of external factors such as contrast and local shadow on the result; and data correction processing is performed on the process feature point data through a triplet loss function to obtain the face feature data.
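The L2 normalization and triplet-loss correction can be illustrated in isolation. This is a simplified stand-in for the ASM/HOG pipeline; the feature vectors and the margin value below are made up:

```python
import math

def l2_normalize(v):
    """Scale a feature vector to unit L2 norm, reducing the effect
    of global factors such as contrast on downstream comparisons."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v] if norm > 0 else v

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss: zero when the anchor is already closer
    to the positive than to the negative by at least `margin`."""
    d = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return max(0.0, d(anchor, positive) - d(anchor, negative) + margin)

a = l2_normalize([3.0, 4.0])    # -> [0.6, 0.8]
p = l2_normalize([3.1, 3.9])    # a nearby (same-face) feature vector
n = l2_normalize([-4.0, 3.0])   # a distant (different-face) feature vector
loss = triplet_loss(a, p, n)    # well separated, so the loss is zero
```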
In the embodiment of the application, feature point extraction and feature normalization are performed on the process face image: features are extracted from the face image of the social user and the process image is optimized, which reduces the influence of factors such as ambient light in the image on face feature extraction and improves the accuracy of data processing.
S104: and carrying out evaluation processing on the face characteristic data based on a preset face value scoring model to obtain target face value scoring data.
The target face value score data is data for representing social user face value scores.
In another alternative embodiment of the present application, a data processing method for face value scoring is provided. Fig. 3 is a flowchart of the data processing method for face value scoring provided in the present application; as shown in fig. 3, the method includes the following steps:
s301: carrying out recognition processing based on a plurality of preset face value decision features on the face feature data to obtain a plurality of process face value decision feature data;
The plurality of process face value decision feature data represent the plurality of preset face value decision features of the social user;
S302: performing face value scoring processing on the plurality of process face value decision feature data based on their corresponding preset face value decision features to obtain a plurality of process face value scoring data;
The plurality of process face value scoring data represent face value scores for the plurality of preset face value decision features of the social user;
s303: target face value scoring data is determined from the plurality of process face value scoring data.
In another alternative embodiment of the present application, there is provided a data processing method for face value scoring, comprising:
performing recognition processing on the face feature data based on a preset face value decision feature to obtain process face value decision feature data, wherein the process face value decision feature data represent the preset face value decision feature corresponding to the face image of the social user;
matching a face value scoring model corresponding to the preset face value decision feature in a preset face value scoring model database to obtain a process face value scoring model, wherein the process face value scoring model is the face value scoring model corresponding to the preset face value decision feature;
performing face value scoring processing on the process face value decision feature data based on the process face value scoring model to obtain process face value scoring data, wherein the process face value scoring data represent the face value score of the social user for the corresponding preset face value decision feature;
and determining the target face value scoring data according to the process face value scoring data.
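Matching a scoring model per decision feature against a model database can be sketched as a lookup table of scoring callables. The feature names and model formulas here are purely illustrative assumptions, not taken from the application:

```python
# Hypothetical per-feature scoring models keyed by decision feature.
model_db = {
    "symmetry": lambda feat: 100.0 - abs(feat) * 10.0,   # less asymmetry -> higher score
    "proportion": lambda feat: min(100.0, feat * 50.0),  # capped linear score
}

def score_feature(feature_name, feature_value, db):
    """Look up the scoring model matched to one decision feature
    in the model database and apply it to the feature value."""
    model = db.get(feature_name)
    if model is None:
        raise KeyError(f"no scoring model for feature {feature_name!r}")
    return model(feature_value)

s = score_feature("symmetry", 1.5, model_db)  # -> 85.0
```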
In another optional embodiment of the present application, after the evaluation processing of the face feature data based on the preset face value scoring model to obtain the target face value scoring data, the data processing method further includes:
acquiring social user data, wherein the social user data are social data of users in the social platform;
performing feature extraction processing on the social user data based on social labels to obtain social label data, wherein the social label data represent the social labels of the social users;
and updating a social request database with the target face value scoring data and the social label data, keyed by identity feature data, so as to enable matching processing of social request data.
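The final update step can be sketched by keying the score and label data on the user's identity in the social-request store. A dictionary stands in for the database, and the field names are illustrative:

```python
def update_social_request_db(db, user_id, face_score, social_labels):
    """Merge face value score and social label data into the record
    keyed by the user's identity, so that subsequent social matching
    can consult both appearance and social attributes."""
    record = db.setdefault(user_id, {})
    record["face_score"] = face_score
    record["labels"] = list(social_labels)
    return db

db = {}
update_social_request_db(db, "user-001", 85.0, ["hiking", "music"])
```

Keying on identity means repeated scoring runs overwrite the same record rather than duplicating it.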
In another alternative embodiment of the present application, a data processing apparatus for face value scoring is provided, applied to a social platform to score face values in social matching calculation. Fig. 4 is a schematic diagram of the data processing apparatus for face value scoring according to the present application; as shown in fig. 4, the data processing apparatus includes:
the data acquisition module 41, configured to acquire an image to be processed, where the image to be processed is an image of a user in the social platform;
the image extraction module 42, configured to perform image extraction processing based on face extraction on the image to be processed to obtain a process face image;
the feature extraction module 43, configured to perform feature extraction processing on the process face image to obtain face feature data, where the face feature data is data representing feature points of the process face image;
and the scoring module 44, configured to perform evaluation processing based on a preset face value scoring model on the face feature data to obtain target face value scoring data, where the target face value scoring data is data representing the face value score of the social user.
The specific manner in which the units in the above embodiment perform their operations has been described in detail in the method embodiments and will not be repeated here.
In summary, in the present application, an image of a user in the social platform is acquired, and image extraction processing based on face extraction is performed on the image to be processed to obtain a process face image. Feature extraction processing is performed on the process face image to obtain face feature data, that is, data representing the feature points of the process face image. Evaluation processing based on a preset face value scoring model is then performed on the face feature data to obtain target face value scoring data, that is, data representing the face value score of the social user. In other words, the present application extracts the face image from the social user's image, extracts face features from the extracted face image, and performs face value scoring on those features through a face value scoring model to obtain face value scoring data, thereby realizing face value scoring of social users. This solves the technical problem that social matching recommendation in the prior art lacks face value-based recommendation, and achieves the technical effect of diversifying social matching recommendations.
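The three-stage pipeline summarized above can be sketched end to end, with each stage as a stub so the data flow is explicit. The function bodies are illustrative placeholders only (e.g. "take the central region" as face extraction, a mean-based toy scoring model), not the face classifier or scoring model the application actually uses.

```python
def extract_face(image: list) -> list:
    """Face extraction stand-in: pretend the face is the central image region."""
    n = len(image)
    return image[n // 4 : n - n // 4]

def extract_features(face_image: list) -> list:
    """Feature extraction stand-in: normalize values to [0, 1] as toy features."""
    peak = max(face_image) or 1   # avoid division by zero on an all-zero region
    return [v / peak for v in face_image]

def score_features(features: list) -> float:
    """Preset scoring model stand-in: mean feature value scaled to 0-100."""
    return 100.0 * sum(features) / len(features)

def face_value_pipeline(image: list) -> float:
    face = extract_face(image)       # image extraction based on face extraction
    feats = extract_features(face)   # feature extraction processing
    return score_features(feats)     # evaluation with the scoring model
```

In a real deployment each stub would be replaced by the corresponding component, for example a cascade face classifier for `extract_face` and a trained model for `score_features`.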
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system as a set of computer-executable instructions, and that although a logical order is shown in the flowcharts, in some cases the steps may be performed in an order different from that shown here.
It will be apparent to those skilled in the art that the modules or steps of the application described above may be implemented on a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of computing devices, and may optionally be implemented as program code executable by computing devices, so that they can be stored in a storage device and executed by those devices. The modules or steps may also be fabricated separately as individual integrated circuit modules, or multiple of them may be fabricated as a single integrated circuit module. Thus, the present application is not limited to any specific combination of hardware and software.
The above description covers only the preferred embodiments of the present application and is not intended to limit it; those skilled in the art may make various modifications and variations. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application shall be included in its protection scope.

Claims (10)

1. A data processing method for face value scoring, applied in a social platform to score face values in a social matching calculation, the data processing method comprising:
acquiring an image to be processed, wherein the image to be processed is an image of a user in a social platform;
carrying out image extraction processing based on face extraction on the image to be processed to obtain a process face image;
performing feature extraction processing on the process face image to obtain face feature data, wherein the face feature data is data for representing feature points of the process face image;
and carrying out evaluation processing on the face characteristic data based on a preset face value scoring model to obtain target face value scoring data, wherein the target face value scoring data is data for representing the face value scoring of the social user.
2. The data processing method according to claim 1, wherein performing evaluation processing based on a preset face value scoring model on the face feature data to obtain target face value scoring data includes:
performing recognition processing based on a plurality of preset face value decision features on the face feature data to obtain a plurality of process face value decision feature data, wherein the plurality of process face value decision feature data are data for representing a plurality of preset face value decision features of social users;
performing face value scoring processing on each of the process face value decision feature data with the face value scoring model corresponding to the respective preset face value decision feature, to obtain a plurality of process face value scoring data, wherein the process face value scoring data are scoring data representing the preset face value decision features of the social user;
and determining the target face value scoring data according to the plurality of process face value scoring data.
3. The data processing method according to claim 1, wherein performing evaluation processing based on a preset face value scoring model on the face feature data to obtain target face value scoring data includes:
performing recognition processing based on preset face value decision features on the face feature data to obtain process face value decision feature data, wherein the process face value decision feature data are data used for representing the corresponding preset face value decision features of the face image of the social user;
matching a face value scoring model corresponding to the preset face value decision feature in a preset face value scoring model database to obtain a process face value scoring model, wherein the process face value scoring model is a face value scoring model corresponding to the preset face value decision feature;
performing face value scoring processing on the process face value decision feature data with the process face value scoring model to obtain process face value scoring data, wherein the process face value scoring data are scoring data representing the corresponding preset face value decision feature of the social user;
and determining the target face value scoring data according to the process face value scoring data.
4. The data processing method according to claim 1, wherein performing feature extraction processing on the process face image to obtain face feature data includes:
performing face feature point extraction on the process face image to obtain process face feature point data, wherein the process face feature point data comprises data representing the face feature points of the social user and the connection relations among the feature points;
performing normalization processing based on preset feature classes on the process face feature point data to obtain normalized face feature point data;
and performing data correction processing on the normalized face feature point data to obtain the face feature data.
5. The data processing method according to claim 1, wherein performing image extraction processing based on face extraction on the image to be processed to obtain the process face image includes:
performing grayscale preprocessing on the image to be processed to obtain a preprocessed image, wherein the preprocessed image is an image subjected to grayscale processing;
and performing face extraction processing based on a face classifier model on the preprocessed image to obtain the process face image, wherein the process face image is an image representing the face of the social user.
6. The data processing method according to claim 5, wherein performing face extraction processing based on a face classifier model on the preprocessed image to obtain the process face image includes:
performing first image screening processing on the preprocessed image based on a first preset face classification rule to obtain a first face image, wherein the first face image is the preprocessed image meeting the first preset face classification rule;
performing second image screening processing on the first face image based on a second preset face classification rule to obtain a second face image, wherein the second face image is a first face image meeting the second preset face classification rule;
and carrying out face image segmentation processing on the second face image to obtain the process face image.
7. The data processing method according to claim 1, wherein after performing evaluation processing based on a preset face value scoring model on the face feature data to obtain target face value scoring data, the data processing method further comprises:
acquiring social user data, wherein the social user data is social data of users in a social platform;
performing feature extraction processing based on social labels on the social user data to obtain social label data, wherein the social label data is data for representing social labels of social users;
and updating the social request database with the target face value scoring data and the social label data, keyed by the identity feature data, so as to enable matching processing of social request data.
8. A data processing apparatus for face value scoring, applied in a social platform to score face values in a social matching calculation, the data processing apparatus comprising:
the data acquisition module is used for acquiring an image to be processed, wherein the image to be processed is an image of a user in the social platform;
the image extraction module is used for carrying out image extraction processing based on face extraction on the image to be processed to obtain a process face image;
the feature extraction module is used for carrying out feature extraction processing on the process face image to obtain face feature data, wherein the face feature data is data for representing feature points of the process face image;
and the scoring module is used for performing evaluation processing on the face characteristic data based on a preset face value scoring model to obtain target face value scoring data, wherein the target face value scoring data is data for representing the face value scoring of the social user.
9. A computer-readable storage medium storing computer instructions, wherein the computer instructions are configured to cause a computer to execute the data processing method for face value scoring according to any one of claims 1 to 7.
10. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor to cause the at least one processor to perform the data processing method for face value scoring according to any one of claims 1 to 7.
CN202310468458.9A 2023-04-23 2023-04-23 Data processing method and device for color value scoring Pending CN116580435A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310468458.9A CN116580435A (en) 2023-04-23 2023-04-23 Data processing method and device for color value scoring


Publications (1)

Publication Number Publication Date
CN116580435A true CN116580435A (en) 2023-08-11

Family

ID=87542404

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310468458.9A Pending CN116580435A (en) 2023-04-23 2023-04-23 Data processing method and device for color value scoring

Country Status (1)

Country Link
CN (1) CN116580435A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103678323A (en) * 2012-09-03 2014-03-26 上海唐里信息技术有限公司 Friend recommendation method and system in SNS network
CN106815557A (en) * 2016-12-20 2017-06-09 北京奇虎科技有限公司 A kind of evaluation method of face features, device and mobile terminal
CN109284778A (en) * 2018-09-07 2019-01-29 北京相貌空间科技有限公司 Face face value calculating method, computing device and electronic equipment
CN113537398A (en) * 2021-08-11 2021-10-22 腾讯音乐娱乐科技(深圳)有限公司 Color value evaluation model training method and component, and color value evaluation method and component
CN115827995A (en) * 2022-12-13 2023-03-21 深圳市爱聊科技有限公司 Social matching method based on big data analysis


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
王昆翔 et al.: 《智能理论与警用智能技术(第二版)》 [Intelligence Theory and Police Intelligence Technology (2nd ed.)], China People's Public Security University Press, 31 May 2009, pages 442-443 *
言有三: 《深度学习之人脸图像处理核心算法与案例实战》 [Core Algorithms and Practical Cases of Deep Learning for Face Image Processing], China Machine Press, 31 July 2020, pages 212-214 *

Similar Documents

Publication Publication Date Title
CN103927387B (en) Image indexing system and its correlation technique and device
CN110555372A (en) Data entry method, device, equipment and storage medium
US9613296B1 (en) Selecting a set of exemplar images for use in an automated image object recognition system
CN111626371B (en) Image classification method, device, equipment and readable storage medium
CN109117857B (en) Biological attribute identification method, device and equipment
WO2019033525A1 (en) Au feature recognition method, device and storage medium
CN111475613A (en) Case classification method and device, computer equipment and storage medium
CN111814810A (en) Image recognition method and device, electronic equipment and storage medium
CN110188829B (en) Neural network training method, target recognition method and related products
CN107679448A (en) Eyeball action-analysing method, device and storage medium
CN113449704B (en) Face recognition model training method and device, electronic equipment and storage medium
CN111178195A (en) Facial expression recognition method and device and computer readable storage medium
CN111695458A (en) Video image frame processing method and device
CN105956631A (en) On-line progressive image classification method facing electronic image base
CN111340213B (en) Neural network training method, electronic device, and storage medium
CN113792659B (en) Document identification method and device and electronic equipment
CN117197904B (en) Training method of human face living body detection model, human face living body detection method and human face living body detection device
CN115827995A (en) Social matching method based on big data analysis
CN111382791B (en) Deep learning task processing method, image recognition task processing method and device
CN111126254A (en) Image recognition method, device, equipment and storage medium
CN116994021A (en) Image detection method, device, computer readable medium and electronic equipment
CN108052918A (en) A kind of person's handwriting Compare System and method
CN113673528B (en) Text processing method, text processing device, electronic equipment and readable storage medium
CN115600013B (en) Data processing method and device for matching recommendation among multiple subjects
CN117689884A (en) Method for generating medical image segmentation model and medical image segmentation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination