CN111784163A - Data evaluation method, device, equipment and storage medium - Google Patents

Data evaluation method, device, equipment and storage medium

Info

Publication number
CN111784163A
CN111784163A
Authority
CN
China
Prior art keywords
evaluation
data
target user
evaluated
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010625201.6A
Other languages
Chinese (zh)
Inventor
于夕畔
杨海军
徐倩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WeBank Co Ltd
Original Assignee
WeBank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WeBank Co Ltd filed Critical WeBank Co Ltd
Priority to CN202010625201.6A priority Critical patent/CN111784163A/en
Publication of CN111784163A publication Critical patent/CN111784163A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • General Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Multimedia (AREA)
  • Game Theory and Decision Science (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses a data evaluation method, device, equipment and storage medium. The method includes: when a display instruction for artwork data to be evaluated is detected, determining whether the target user to which the display instruction points is in an evaluation state for the artwork data to be evaluated; if the target user to which the display instruction points is in the evaluation state, acquiring evaluation information of the target user; and determining an evaluation result of the artwork data to be evaluated based on the evaluation information. The application aims to solve the technical problem of low data evaluation accuracy in the prior art.

Description

Data evaluation method, device, equipment and storage medium
Technical Field
The present application relates to the field of artificial intelligence technology for financial technology (Fintech), and in particular, to a data evaluation method, apparatus, device, and storage medium.
Background
With the continuous development of financial technology, especially internet finance, more and more technologies are applied in the financial field. At the same time, the financial industry places higher requirements on these technologies, including higher requirements on data evaluation such as the evaluation of artwork data.
Currently, art data such as AI (artificial intelligence) compositions and AI drawings are usually evaluated in a subjective manner. For example, the compositions generated by an AI composition algorithm are played one by one to an appraisal panel, the panel evaluates each piece, and the quality of the art data, such as the AI composition or AI drawing, is then determined based on the panel's evaluation results. Because such a panel is limited in size and its judgments are subjective, the accuracy of this kind of evaluation is limited.
Disclosure of Invention
The main purpose of the application is to provide a data evaluation method, device, equipment and storage medium, aiming to solve the technical problem of low art data evaluation accuracy in the prior art.
In order to achieve the above object, the present application provides a data evaluation method, including:
when a display instruction of the artwork data to be evaluated is detected, determining whether a target user pointed by the display instruction is in an evaluation state of the artwork data to be evaluated;
if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, acquiring evaluation information of the target user;
and determining the evaluation result of the artwork data to be evaluated based on the evaluation information.
Optionally, when the display instruction of the to-be-evaluated artwork data is detected, the step of determining whether the target user pointed by the display instruction is in the evaluation state of the to-be-evaluated artwork data includes:
when a display instruction of the artwork data to be evaluated is detected, determining whether the acquired ambiguity index of the target user is smaller than a preset ambiguity threshold value;
if the ambiguity index is smaller than a preset ambiguity threshold, determining whether the face area index of the target user is smaller than a preset face area threshold;
if the face area index is smaller than a preset face area threshold, determining whether the side face angle index of the target user is smaller than a preset side face angle threshold;
and if the side face angle index of the target user is smaller than a preset side face angle threshold value, determining that the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated.
Optionally, the step of obtaining the evaluation information of the target user when the target user to which the display instruction points is in the evaluation state of the artwork data to be evaluated includes:
if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, acquiring an image of the target user through a camera;
acquiring the face information of the target user based on the image of the target user, and acquiring the emotion information of the target user based on the image of the target user;
determining the evaluation duration of the target user based on the face information, and acquiring the sub-duration of each emotion type determined by the emotion information in the evaluation duration;
and determining the evaluation information of the target user based on the sub-duration of each emotion type determined by the emotion information in the evaluation duration.
Optionally, the step of obtaining emotion information of the target user based on the image of the target user includes:
inputting the image of the target user into a preset emotion recognition model;
performing emotion type recognition processing on the image of the target user based on the preset emotion recognition model to obtain emotion information of the target user;
the preset emotion recognition model is obtained by performing iterative training on a preset basic model through preset emotion training image data.
Optionally, the step of determining an evaluation result of the to-be-evaluated artwork data based on the evaluation information includes:
obtaining the expression type of the artwork data to be evaluated;
and determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the expression type.
Optionally, the step of determining an evaluation result of the to-be-evaluated artwork data based on the evaluation information includes:
acquiring an evaluation result of the evaluation group of the artwork data to be evaluated;
and determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the evaluation result of the appraisal group.
Optionally, the step of determining an evaluation result of the to-be-evaluated artwork data based on the evaluation information includes:
acquiring the number of the target users;
if the number of the target users is larger than the preset number, obtaining a mean evaluation score of the to-be-evaluated artwork data based on the evaluation information of all the target users;
and setting the average evaluation score as an evaluation result of the artwork data to be evaluated.
The present application further provides a data evaluation device, the data evaluation device includes:
the system comprises a first determination module, a second determination module and a third determination module, wherein the first determination module is used for determining whether a target user pointed by a display instruction is in an evaluation state of the to-be-evaluated artwork data when the display instruction of the to-be-evaluated artwork data is detected;
the acquisition module is used for acquiring the evaluation information of the target user if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated;
and the second determination module is used for determining the evaluation result of the artwork data to be evaluated based on the evaluation information.
Optionally, the first determining module includes:
the first determining unit is used for determining whether the acquired ambiguity index of the target user is smaller than a preset ambiguity threshold value or not when a display instruction of the artwork data to be evaluated is detected;
a second determining unit, configured to determine whether the face area index of the target user is smaller than a preset face area threshold if the ambiguity index is smaller than the preset ambiguity threshold;
a third determining unit, configured to determine whether the side face angle index of the target user is smaller than a preset side face angle threshold if the face area index is smaller than the preset face area threshold;
and the fourth determining unit is used for determining that the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated if the side face angle index of the target user is smaller than a preset side face angle threshold.
Optionally, the obtaining module includes:
the first obtaining unit is used for obtaining an image of a target user through a camera if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated;
the second acquisition unit is used for acquiring the face information of the target user based on the image of the target user and acquiring the emotion information of the target user based on the image of the target user;
a third obtaining unit, configured to determine an evaluation duration of the target user based on the face information, and obtain a sub-duration of each emotion type determined by the emotion information within the evaluation duration;
and the fifth determining unit is used for determining the evaluation information of the target user based on the sub-duration of each emotion type determined by the emotion information in the evaluation duration.
Optionally, the second obtaining unit includes:
the setting subunit is used for inputting the image of the target user into a preset emotion recognition model;
the recognition processing subunit is used for performing emotion type recognition processing on the image of the target user based on the preset emotion recognition model to obtain emotion information of the target user;
the preset emotion recognition model is obtained by performing iterative training on a preset basic model through preset emotion training image data.
Optionally, the second determining module includes:
the fourth acquisition unit is used for acquiring the expression type of the to-be-evaluated artwork data;
and the sixth determining unit is used for determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the expression type.
Optionally, the second determining module further includes:
the fifth acquisition unit is used for acquiring the evaluation result of the evaluation group of the artwork data to be evaluated;
and the seventh determining unit is used for determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the evaluation result of the evaluation group.
Optionally, the second determining module further includes:
a sixth acquiring unit configured to acquire the number of the target users;
a seventh obtaining unit, configured to, if it is detected that the number of the target users is greater than a preset number, obtain a mean evaluation score of the to-be-evaluated artwork data based on the evaluation information of all the target users;
and the setting unit is used for setting the mean evaluation score as the evaluation result of the to-be-evaluated artwork data.
The present application further provides a data evaluation device. The data evaluation device is an entity device and includes: a memory, a processor, and a program of the data evaluation method stored on the memory and executable on the processor, where the program, when executed by the processor, implements the steps of the data evaluation method described above.
The present application also provides a storage medium having stored thereon a program for implementing the above-described data evaluation method, the program for implementing the data evaluation method implementing the steps of the above-described data evaluation method when executed by a processor.
When a display instruction for the artwork data to be evaluated is detected, whether the target user to which the display instruction points is in an evaluation state for the artwork data to be evaluated is determined; if the target user is in that evaluation state, evaluation information of the target user is acquired; and an evaluation result of the artwork data to be evaluated is determined based on the evaluation information. Compared with the prior art, in which artwork data to be evaluated are assessed solely through appraisal-group scoring, the application determines the evaluation result from evaluation information obtained while target users who are actually in the evaluation state view the artwork data to be evaluated. Because such target users are numerous, the richness of the evaluation data is improved, the technical defect of low data evaluation accuracy in the prior art is overcome, and the evaluation accuracy is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below; those skilled in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic flow chart of a first embodiment of the data evaluation method of the present application;
FIG. 2 is a schematic flow chart illustrating a refining step of the countermeasure data for generating the first data in the data evaluation method of the present application;
fig. 3 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present application.
The objectives, features, and advantages of the present application will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In a first embodiment of the data evaluation method of the present application, referring to fig. 1, the data evaluation method includes:
step S10, when a display instruction of the artwork data to be evaluated is detected, determining whether a target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated;
step S20, if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, obtaining evaluation information of the target user;
and step S30, determining the evaluation result of the artwork data to be evaluated based on the evaluation information.
The method comprises the following specific steps:
step S10, when a display instruction of the artwork data to be evaluated is detected, determining whether a target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated;
in this embodiment, it should be noted that the method and the system can be applied to a data evaluation system, the data evaluation system belongs to a data evaluation device, and for the data evaluation system, the data evaluation system is in communication connection with an organization such as each website, so that as long as a user has an evaluation behavior of the artwork data to be evaluated on each website, the data evaluation system can obtain evaluation information corresponding to the evaluation behavior, and further, comprehensively obtain an evaluation result of the artwork data to be evaluated.
It should be noted that the artwork data to be evaluated may include non-AI (artificial intelligence) content, i.e. a manually produced song to be evaluated or a manually produced drawing to be evaluated, and may also include an AI-synthesized song to be evaluated or an AI-synthesized drawing to be evaluated; the data evaluation method of this embodiment applies regardless of whether the artwork is AI-synthesized or manually produced.
The artwork data to be evaluated is displayed on the network; for example, a composition or song to be evaluated is displayed on QQ Music or Kugou Music, and a painting to be evaluated is displayed in the painting area of website A or website B.
When a display instruction for the artwork data to be evaluated is detected, it is determined whether the target user to which the display instruction points is in an evaluation state for the artwork data to be evaluated. For example, if a play instruction for a composition or song to be evaluated is detected, a display instruction for artwork data to be evaluated (the composition or song) is considered detected; if a viewing instruction for a painting to be evaluated is detected, a display instruction for artwork data to be evaluated (the painting) is considered detected. The target user to which the display instruction points may be the user identified by the ID information associated with the display instruction; for example, if user A triggers the play instruction for a song after logging in to QQ Music, user A is the target user. Whether the target user is in the evaluation state is determined from a user image collected by a camera, and the evaluation state can be judged from indicators such as the blurriness or sharpness of the user's face, the area of the user's face, and the angle by which the user's face deviates from a preset horizontal plane.
When a display instruction of the artwork data to be evaluated is detected, the step of determining whether a target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated comprises the following steps:
step S11, when a display instruction of the artwork data to be evaluated is detected, determining whether the acquired ambiguity index of the target user is smaller than a preset ambiguity threshold value;
in this embodiment, whether the user is in the evaluation state is determined by the camera and the face recognition technology, and specifically, when a display instruction of the artwork data to be evaluated is detected, acquiring a face image of a target user through a camera, determining a fuzziness index of the target user based on the acquisition resolution of the face image, after the ambiguity index is determined, judging whether the ambiguity index is smaller than a preset ambiguity threshold value or not, when the ambiguity index is greater than or equal to a preset ambiguity threshold value, the target user is not in the evaluation state of the artwork data to be evaluated, this is because, when the ambiguity index is equal to or greater than the preset ambiguity threshold, it is difficult to clearly recognize whether the user is in the evaluation state, and therefore, in order to prevent misjudgment, the default user is not in the evaluation state, and if the ambiguity index is smaller than a preset ambiguity threshold value, the first round of judgment that the target user passes the evaluation state is determined.
Step S12, if the ambiguity index is smaller than a preset ambiguity threshold, determining whether the face area index of the target user is smaller than a preset face area threshold;
if the ambiguity index is smaller than a preset ambiguity threshold, further determining whether the face area index of the target user is smaller than a preset face area threshold based on the acquired face image, if the face area index of the target user is determined to be larger than or equal to the preset face area threshold, determining that the target user does not pass the judgment of the evaluation state in the second round, if the face area index of the target user is determined to be smaller than the preset face area threshold, determining that the target user passes the judgment of the evaluation state in the second round, and when the evaluation state is judged through the face area index, because: and determining the evaluation concentration degree of the target user.
Step S13, if the face area index is smaller than a preset face area threshold, determining whether the side face angle index of the target user is smaller than a preset side face angle threshold;
if when the face area index is less than the preset face area threshold, determining whether the side face angle index of the target user is less than the preset side face angle threshold through the collected face image, specifically, determining the side face angle index of the target user through the collected face image at first, wherein the side face angle index is calculated by using a preset calculation formula on the basis of a plane where the image is located, comparing the side face angle index with the preset side face angle threshold after the side face angle index is obtained, if the side face angle index of the target user is greater than or equal to the preset side face angle threshold, determining that the target user does not pass the judgment of the evaluation state in the third round, and if the side face angle index of the target user is less than the preset side face angle threshold, determining that the target user passes the judgment of the evaluation state in the third round.
Step S14, if the side face angle index of the target user is smaller than a preset side face angle threshold, determining that the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated.
And if the side face angle index of the target user is smaller than a preset side face angle threshold value, determining that the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated.
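The three-round check above can be summarised in a short sketch. This is only an illustrative reading of the described flow, assuming the three indices have already been computed from the captured face image; the threshold values and function name are placeholders, not values taken from the application.

```python
# Illustrative sketch of the three-round evaluation-state check (thresholds are placeholders).

def is_in_evaluation_state(ambiguity_index: float,
                           face_area_index: float,
                           side_face_angle_index: float,
                           ambiguity_threshold: float = 0.4,
                           face_area_threshold: float = 0.6,
                           side_face_angle_threshold: float = 30.0) -> bool:
    """Return True only if every index is below its preset threshold."""
    if ambiguity_index >= ambiguity_threshold:
        return False  # first round fails: face image too blurry to judge reliably
    if face_area_index >= face_area_threshold:
        return False  # second round fails: face area check not passed
    if side_face_angle_index >= side_face_angle_threshold:
        return False  # third round fails: head turned too far to the side
    return True

# Example: a sharp, frontal face passes all three rounds.
print(is_in_evaluation_state(0.1, 0.3, 10.0))  # True
```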
Step S20, if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, obtaining evaluation information of the target user;
and if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, acquiring the evaluation information of the target user, specifically, acquiring the evaluation information of the target user through a camera, wherein the evaluation information comprises an evaluation category and an evaluation score, and the evaluation category, the evaluation score and the evaluation duration are associated with the evaluation emotion.
If the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, the step of obtaining the evaluation information of the target user comprises the following steps:
step S21, if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, acquiring the image of the target user through a camera;
if the target user pointed by the display instruction is in the evaluation state of the to-be-evaluated artwork data, acquiring an image of the target user through a camera, wherein the camera comprises a high-definition camera or a super-definition camera or a high-definition slow-lens camera, acquiring the image of the target user through the camera, specifically acquiring the image of the target user at preset time intervals, and acquiring one image of the target user every 0.5 second.
Step S22, acquiring the face information of the target user based on the image of the target user, and acquiring the emotion information of the target user based on the image of the target user;
the method comprises the steps of obtaining face information and emotion information of a target user based on an image of the target user, specifically obtaining the face information and emotion information of the target user based on each image of the target user, obtaining the face information of the target user from each image of the target user based on a preset face recognition technology, and obtaining the emotion information of the target user from each image of the target user based on a preset emotion information recognition technology.
Wherein the step of obtaining emotion information of the target user based on the image of the target user includes:
step S221, inputting the image of the target user into a preset emotion recognition model;
in this embodiment, after obtaining the image of the target user, the image of the target user is input into a preset emotion recognition model, and specifically, after obtaining each image of the target user, the image of the target user is input into a preset emotion recognition model.
Step S222, performing emotion type recognition processing on the image of the target user based on the preset emotion recognition model to obtain emotion information of the target user;
the preset emotion recognition model is obtained by performing iterative training on a preset basic model through preset emotion training image data.
The preset emotion recognition model is obtained by iteratively training a preset basic model with preset emotion training image data. Because the model has been trained to accurately recognize user emotion information from images, emotion type recognition can be performed accurately on the images of the target user to obtain the emotion information of the target user.
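The application does not specify a model architecture for the preset emotion recognition model, so the sketch below only illustrates the inference step, with a trivial stub standing in for the trained model; the class and function names are hypothetical.

```python
import random

# Emotion types mirror those listed later in the description
# ("very happy", "happy", "general", "unhappy").
EMOTION_TYPES = ["very happy", "happy", "general", "unhappy"]

class StubEmotionModel:
    """Placeholder for the preset emotion recognition model obtained by iterative training."""
    def predict(self, image) -> str:
        # A real model would run a forward pass on the face image here.
        return random.choice(EMOTION_TYPES)

def recognize_emotions(images, model=None):
    """Return one emotion label per captured frame of the target user."""
    model = model or StubEmotionModel()
    return [model.predict(img) for img in images]
```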
Step S23, determining the evaluation duration of the target user based on the face information, and acquiring the type of the preset emotion determined by the emotion information in the evaluation duration and the sub-duration of each type of preset emotion;
and determining the evaluation duration of the target user based on the face information, specifically, after the face images of the target user are acquired at intervals of a preset time period, identifying each face image, determining that the evaluation is stopped if a certain target user image is not in the evaluation state, so as to obtain the evaluation duration of the target user, and storing the evaluation duration in association with the ID number of the target user, so as to correspondingly increase the evaluation duration of the user when the target user is detected to be in the evaluation state subsequently. It should be noted that the evaluation duration of a plurality of target users can be synchronously counted, in the process of obtaining the evaluation duration, the types of preset emotions determined by the emotion information in the evaluation duration are obtained through a preset emotion recognition model based on the images of the target users, and the sub-duration of each type of preset emotion is correspondingly determined. The preset emotion types include emotion information such as very happy, general, happy, unhappy and the like, and each emotion preset emotion type corresponds to a score.
And step S24, determining the evaluation information of the target user based on the type of the preset emotion determined by the emotion information and the sub-duration of each type of preset emotion.
The evaluation information of the target user is determined based on the types of preset emotion determined by the emotion information and the sub-duration of each type of preset emotion. For example, if the preset emotion types correspond to scores of 100, 75, 50, 20 and 0 points respectively, and the sub-durations are 3 seconds of happy, 5 seconds of very happy and 2 seconds of general, the evaluation information of the target user is obtained by weighted averaging, and the final score for the target user is (3 × 75 + 5 × 100 + 2 × 50)/(3 + 5 + 2) = 82.5.
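A sketch reproducing the weighted-average calculation from the example above; the mapping from emotion types to the listed scores is an assumption (the description lists five scores but names four emotion types), and the function name is illustrative.

```python
# Scores (in points) per preset emotion type, following the example in the description.
EMOTION_SCORES = {"very happy": 100, "happy": 75, "general": 50, "unhappy": 20, "other": 0}

def user_evaluation_score(sub_durations: dict) -> float:
    """Weighted average of emotion scores by sub-duration (seconds) within the evaluation duration."""
    total_time = sum(sub_durations.values())
    weighted = sum(EMOTION_SCORES[emotion] * t for emotion, t in sub_durations.items())
    return weighted / total_time

# Example from the description: 3 s happy, 5 s very happy, 2 s general.
print(user_evaluation_score({"happy": 3, "very happy": 5, "general": 2}))  # 82.5
```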
And step S30, determining the evaluation result of the artwork data to be evaluated based on the evaluation information.
And determining an evaluation result of the to-be-evaluated artwork data based on the evaluation information, specifically, determining the evaluation result of the to-be-evaluated artwork data based on a plurality of evaluation information of a plurality of target users.
The step of determining the evaluation result of the to-be-evaluated artwork data based on the evaluation information comprises the following steps of:
step S31, acquiring the number of the target users;
step S32, if the number of the target users is larger than the preset number, obtaining the mean evaluation score of the artwork data to be evaluated based on the evaluation information of all the target users;
in order to avoid misjudgment, in this embodiment, the number of the target users is obtained, and the evaluation result of the to-be-evaluated artwork data is obtained only when the number group is large enough, specifically, if it is detected that the number of the target users is larger than a preset number, the average evaluation score of the to-be-evaluated artwork data is obtained based on the evaluation information of all the target users. For example, if there are 4 users evaluating a song (the sample size is greater than the preset number, which is only an example), and the scores are 80, 85, 90, and 95, respectively, the total score of the song is (80+85+90+95)/4, which is 87.5.
And step S33, setting the average evaluation score as the evaluation result of the artwork data to be evaluated.
And setting the average evaluation score as an evaluation result of the artwork data to be evaluated.
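A sketch of steps S31 to S33; the preset minimum number of target users is an assumed placeholder, and the four scores reproduce the example above.

```python
def mean_evaluation_result(user_scores, preset_number: int = 3):
    """Average the per-user evaluation scores once enough target users have evaluated."""
    if len(user_scores) <= preset_number:
        return None  # not enough evaluators yet; no result is produced
    return sum(user_scores) / len(user_scores)

# Example from the description: four users scoring 80, 85, 90 and 95.
print(mean_evaluation_result([80, 85, 90, 95]))  # 87.5
```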
In this embodiment, it should also be noted that the category of the artwork data to be evaluated is obtained; if the artwork data to be evaluated belongs to the AI-synthesized category, the evaluation result of the AI-synthesized artwork data is used to score the AI algorithm that generated it.
When a display instruction for the artwork data to be evaluated is detected, whether the target user to which the display instruction points is in an evaluation state for the artwork data to be evaluated is determined; if the target user is in that evaluation state, evaluation information of the target user is acquired; and an evaluation result of the artwork data to be evaluated is determined based on the evaluation information. Compared with the prior art, in which artwork data to be evaluated are assessed solely through appraisal-group scoring, the application determines the evaluation result from evaluation information obtained while target users who are actually in the evaluation state view the artwork data to be evaluated. Because such target users are numerous, the richness of the evaluation data is improved, the technical defect of low data evaluation accuracy in the prior art is overcome, and the evaluation accuracy is improved.
Further, according to the first embodiment of the present application, there is provided another embodiment of the present application, in which the step of determining the evaluation result of the to-be-evaluated artwork data based on the evaluation information includes:
step S33, obtaining the expression type of the artwork data to be evaluated;
in this embodiment, in order to accurately determine the evaluation result of the to-be-evaluated artwork data, the expression type of the to-be-evaluated artwork data is further obtained, specifically, the expression type of the to-be-evaluated artwork data may be preset, for example, the expression type of the to-be-evaluated artwork data may be sadness, happiness, lightness, and the like.
And step S34, determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the expression type.
The evaluation result of the artwork data to be evaluated is determined based on the evaluation information and the expression type; specifically, it is determined according to a positive or negative mapping relationship between the evaluation information and the expression type. If the expression type is sad, a lower evaluation information score indicates a better evaluation result of the artwork data to be evaluated; if the expression type is happy, a higher evaluation information score indicates a better evaluation result.
If the expression type is flat, a higher evaluation information score likewise indicates a better evaluation result of the artwork data to be evaluated.
Specifically, in this embodiment, the mapping relationship between the evaluation information and the expression types may also be adjusted according to the actual situation, so as to improve the accuracy of the evaluation result.
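One possible reading of the positive/negative mapping between the expression type of the artwork and the evaluation score is sketched below; the direction table and the 0-100 score scale are assumptions consistent with the examples above.

```python
# +1: a higher user evaluation score means a better result;
# -1: a lower user evaluation score means a better result (e.g. a sad piece
#     is judged successful when viewers do not look happy).
EXPRESSION_DIRECTION = {"happy": +1, "flat": +1, "sad": -1}

def expression_adjusted_result(evaluation_score: float, expression_type: str) -> float:
    """Fold the expression type of the artwork into the evaluation result (0-100 scale assumed)."""
    if EXPRESSION_DIRECTION[expression_type] > 0:
        return evaluation_score          # higher score, better result
    return 100.0 - evaluation_score      # inverted for sad works
```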
In this embodiment, the expression type of the artwork data to be evaluated is obtained, and the evaluation result of the artwork data to be evaluated is determined based on both the evaluation information and the expression type, thereby avoiding misjudgment of the artwork.
In another embodiment of the data evaluation method, the step of determining the evaluation result of the to-be-evaluated artwork data based on the evaluation information includes:
a1, obtaining the evaluation result of the appraiser group of the artwork data to be evaluated;
and A2, determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the evaluation result of the evaluation group.
In this embodiment, the evaluation result of the appraisal group for the artwork data to be evaluated is also obtained, so that the evaluation result of the artwork data to be evaluated can be determined based on both the evaluation information and the appraisal group's evaluation result. Specifically, a first percentage is obtained for the evaluation information and a second percentage for the appraisal group's evaluation result, and the evaluation result of the artwork data to be evaluated is determined from the evaluation information, the appraisal group's evaluation result, the first percentage and the second percentage.
For example, if the first percentage is 70% and the second percentage is 30%, the evaluation result of the artwork data to be evaluated is 70% × the evaluation information score + 30% × the appraisal group's evaluation result.
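A sketch of the weighted combination in the example above; the 70%/30% split comes from the description, and both inputs are assumed to lie on the same 0-100 scale.

```python
def combined_result(user_score: float, panel_score: float,
                    user_weight: float = 0.7, panel_weight: float = 0.3) -> float:
    """Combine the target users' evaluation information with the appraisal group's score."""
    return user_weight * user_score + panel_weight * panel_score

print(combined_result(82.5, 90.0))  # 84.75 (up to floating-point rounding)
```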
In this embodiment, the evaluation result of the appraisal group for the artwork data to be evaluated is obtained, and the evaluation result of the artwork data to be evaluated is determined based on the evaluation information and the appraisal group's evaluation result. By combining the evaluations of the target users and of the appraisal group, the evaluation accuracy is improved.
Referring to fig. 3, fig. 3 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present application.
As shown in fig. 3, the data evaluation apparatus may include: a processor 1001, such as a CPU, a memory 1005, and a communication bus 1002. The communication bus 1002 is used for realizing connection communication between the processor 1001 and the memory 1005. The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a memory device separate from the processor 1001 described above.
Optionally, the data evaluation device may further include a rectangular user interface, a network interface, a camera, RF (radio frequency) circuits, a sensor, an audio circuit, a WiFi module, and the like. The rectangular user interface may comprise a Display screen (Display), an input sub-module such as a Keyboard (Keyboard), and the optional rectangular user interface may also comprise a standard wired interface, a wireless interface. The network interface may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface).
Those skilled in the art will appreciate that the data evaluation device configuration shown in FIG. 3 does not constitute a limitation of the data evaluation device and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 3, a memory 1005, which is a storage medium, may include therein an operating system, a network communication module, and a data evaluation program. The operating system is a program that manages and controls the hardware and software resources of the data evaluation device, supporting the operation of the data evaluation program as well as other software and/or programs. The network communication module is used for realizing communication among the components in the memory 1005 and communication with other hardware and software in the data evaluation system.
In the data evaluation apparatus shown in fig. 3, the processor 1001 is configured to execute a data evaluation program stored in the memory 1005, and implement the steps of the data evaluation method according to any one of the above.
The specific implementation of the data evaluation device of the present application is substantially the same as that of each embodiment of the data evaluation method, and is not described herein again.
The present application further provides a data evaluation device, the data evaluation device includes:
the system comprises a first determination module, a second determination module and a third determination module, wherein the first determination module is used for determining whether a target user pointed by a display instruction is in an evaluation state of the to-be-evaluated artwork data when the display instruction of the to-be-evaluated artwork data is detected;
the acquisition module is used for acquiring the evaluation information of the target user if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated;
and the second determination module is used for determining the evaluation result of the artwork data to be evaluated based on the evaluation information.
Optionally, the first determining module includes:
the first determining unit is used for determining whether the acquired ambiguity index of the target user is smaller than a preset ambiguity threshold value or not when a display instruction of the artwork data to be evaluated is detected;
a second determining unit, configured to determine whether the face area index of the target user is smaller than a preset face area threshold if the ambiguity index is smaller than the preset ambiguity threshold;
a third determining unit, configured to determine whether the side face angle index of the target user is smaller than a preset side face angle threshold if the face area index is smaller than the preset face area threshold;
and the fourth determining unit is used for determining that the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated if the side face angle index of the target user is smaller than a preset side face angle threshold.
Optionally, the obtaining module includes:
the first obtaining unit is used for obtaining an image of a target user through a camera if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated;
the second acquisition unit is used for acquiring the face information of the target user based on the image of the target user and acquiring the emotion information of the target user based on the image of the target user;
a third obtaining unit, configured to determine an evaluation duration of the target user based on the face information, and obtain a sub-duration of each emotion type determined by the emotion information within the evaluation duration;
and the fifth determining unit is used for determining the evaluation information of the target user based on the sub-duration of each emotion type determined by the emotion information in the evaluation duration.
Optionally, the second obtaining unit includes:
the setting subunit is used for inputting the image of the target user into a preset emotion recognition model;
the recognition processing subunit is used for performing emotion type recognition processing on the image of the target user based on the preset emotion recognition model to obtain emotion information of the target user;
the preset emotion recognition model is obtained by performing iterative training on a preset basic model through preset emotion training image data.
Optionally, the second determining module includes:
the fourth acquisition unit is used for acquiring the expression type of the to-be-evaluated artwork data;
and the sixth determining unit is used for determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the expression type.
Optionally, the second determining module further includes:
the fifth acquisition unit is used for acquiring the evaluation result of the evaluation group of the artwork data to be evaluated;
and the seventh determining unit is used for determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the evaluation result of the evaluation group.
Optionally, the second determining module further includes:
a sixth acquiring unit configured to acquire the number of the target users;
a seventh obtaining unit, configured to, if it is detected that the number of the target users is greater than a preset number, obtain a mean evaluation score of the to-be-evaluated artwork data based on the evaluation information of all the target users;
and the setting unit is used for setting the mean evaluation score as the evaluation result of the to-be-evaluated artwork data.
The specific implementation of the data evaluation device of the present application is substantially the same as that of each embodiment of the data evaluation method, and is not described herein again.
The embodiment of the application provides a storage medium, and the storage medium stores one or more programs, and the one or more programs can be further executed by one or more processors to realize the steps of the data evaluation method of any one of the above.
The specific implementation of the storage medium of the present application is substantially the same as that of each embodiment of the data evaluation method, and is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A data evaluation method is characterized by comprising the following steps:
when a display instruction of the artwork data to be evaluated is detected, determining whether a target user pointed by the display instruction is in an evaluation state of the artwork data to be evaluated;
if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, acquiring evaluation information of the target user;
and determining the evaluation result of the artwork data to be evaluated based on the evaluation information.
2. The data evaluation method according to claim 1, wherein the step of determining whether a target user to which a display instruction is directed is in an evaluation state of the art data to be evaluated when the display instruction of the art data to be evaluated is detected comprises:
when a display instruction of the artwork data to be evaluated is detected, determining whether the acquired ambiguity index of the target user is smaller than a preset ambiguity threshold value;
if the ambiguity index is smaller than a preset ambiguity threshold, determining whether the face area index of the target user is smaller than a preset face area threshold;
if the face area index is smaller than a preset face area threshold, determining whether the side face angle index of the target user is smaller than a preset side face angle threshold;
and if the side face angle index of the target user is smaller than a preset side face angle threshold value, determining that the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated.
3. The data evaluation method of claim 1, wherein the step of obtaining the evaluation information of the target user if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated comprises:
if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated, acquiring an image of the target user through a camera;
acquiring the face information of the target user based on the image of the target user, and acquiring the emotion information of the target user based on the image of the target user;
determining the evaluation duration of the target user based on the face information, and acquiring the sub-duration of each emotion type determined by the emotion information in the evaluation duration;
and determining the evaluation information of the target user based on the sub-duration of each emotion type determined by the emotion information in the evaluation duration.
4. The data evaluation method of claim 3, wherein the step of obtaining emotional information of the target user based on the image of the target user comprises:
inputting the image of the target user into a preset emotion recognition model;
performing emotion type recognition processing on the image of the target user based on the preset emotion recognition model to obtain emotion information of the target user;
the preset emotion recognition model is obtained by performing iterative training on a preset basic model through preset emotion training image data.
5. The data evaluation method of claim 1, wherein the step of determining the evaluation result of the artwork data to be evaluated based on the evaluation information comprises:
obtaining the expression type of the artwork data to be evaluated;
and determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the expression type.
6. The data evaluation method of claim 1, wherein the step of determining the evaluation result of the artwork data to be evaluated based on the evaluation information comprises:
acquiring an evaluation result of the evaluation group of the artwork data to be evaluated;
and determining the evaluation result of the artwork data to be evaluated based on the evaluation information and the evaluation result of the appraisal group.
7. The data evaluation method of claim 1, wherein the step of determining the evaluation result of the artwork data to be evaluated based on the evaluation information comprises:
acquiring the number of the target users;
if the number of the target users is larger than the preset number, obtaining a mean evaluation score of the to-be-evaluated artwork data based on the evaluation information of all the target users;
and setting the average evaluation score as an evaluation result of the artwork data to be evaluated.
8. A data evaluation device characterized by comprising:
the first determination module is used for, when a display instruction of the to-be-evaluated artwork data is detected, determining whether a target user pointed to by the display instruction is in an evaluation state of the to-be-evaluated artwork data;
the acquisition module is used for acquiring the evaluation information of the target user if the target user pointed by the display instruction is in the evaluation state of the artwork data to be evaluated;
and the second determination module is used for determining the evaluation result of the artwork data to be evaluated based on the evaluation information.
9. A data evaluation apparatus characterized by comprising: a memory, a processor, and a program stored on the memory for implementing the data evaluation method,
the memory is used for storing a program for realizing the data evaluation method;
the processor is configured to execute a program for implementing the data evaluation method to implement the steps of the data evaluation method according to any one of claims 1 to 7.
10. A storage medium having stored thereon a program for implementing a data evaluation method, the program being executed by a processor to implement the steps of the data evaluation method according to any one of claims 1 to 7.
CN202010625201.6A 2020-07-01 2020-07-01 Data evaluation method, device, equipment and storage medium Pending CN111784163A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010625201.6A CN111784163A (en) 2020-07-01 2020-07-01 Data evaluation method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010625201.6A CN111784163A (en) 2020-07-01 2020-07-01 Data evaluation method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111784163A true CN111784163A (en) 2020-10-16

Family

ID=72757771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010625201.6A Pending CN111784163A (en) 2020-07-01 2020-07-01 Data evaluation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111784163A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101836219A (en) * 2007-11-01 2010-09-15 索尼爱立信移动通讯有限公司 Generating music playlist based on facial expression
US20140324885A1 (en) * 2013-04-25 2014-10-30 Trent R. McKenzie Color-based rating system
CN104434140A (en) * 2013-09-13 2015-03-25 Nhn娱乐公司 Content evaluation system and content evaluation method using the system
CN107463876A (en) * 2017-07-03 2017-12-12 珠海市魅族科技有限公司 Information processing method and device, computer installation and storage medium
CN107679504A (en) * 2017-10-13 2018-02-09 北京奇虎科技有限公司 Face identification method, device, equipment and storage medium based on camera scene
CN108090698A (en) * 2018-01-08 2018-05-29 聚影汇(北京)影视文化有限公司 A kind of film test and appraisal service system and method
CN108197595A (en) * 2018-01-23 2018-06-22 京东方科技集团股份有限公司 A kind of method, apparatus, storage medium and computer for obtaining evaluation information
CN108848416A (en) * 2018-06-21 2018-11-20 北京密境和风科技有限公司 The evaluation method and device of audio-video frequency content
CN109447729A (en) * 2018-09-17 2019-03-08 平安科技(深圳)有限公司 A kind of recommended method of product, terminal device and computer readable storage medium
CN109766770A (en) * 2018-12-18 2019-05-17 深圳壹账通智能科技有限公司 QoS evaluating method, device, computer equipment and storage medium
CN109905595A (en) * 2018-06-20 2019-06-18 成都市喜爱科技有限公司 A kind of method, apparatus, equipment and medium shot and play
CN110888997A (en) * 2018-09-10 2020-03-17 北京京东尚科信息技术有限公司 Content evaluation method and system and electronic equipment
CN111339358A (en) * 2020-02-28 2020-06-26 杭州市第一人民医院 Movie recommendation method and device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
CN106293074B (en) Emotion recognition method and mobile terminal
CN109345553B (en) Palm and key point detection method and device thereof, and terminal equipment
CN109446961B (en) Gesture detection method, device, equipment and storage medium
CN111193965B (en) Video playing method, video processing method and device
KR20140091555A (en) Measuring web page rendering time
CN110264093B (en) Credit model establishing method, device, equipment and readable storage medium
CN109739768A (en) Search engine evaluating method, device, equipment and readable storage medium storing program for executing
CN112732974A (en) Data processing method, electronic equipment and storage medium
CN114638777A (en) Image defect detection method, device, electronic equipment and medium
CN113962965A (en) Image quality evaluation method, device, equipment and storage medium
CN109840212B (en) Function test method, device and equipment of application program and readable storage medium
CN112835807B (en) Interface identification method and device, electronic equipment and storage medium
CN111640421B (en) Speech comparison method, device, equipment and computer readable storage medium
CN117871545A (en) Method and device for detecting defects of circuit board components, terminal and storage medium
CN113051235A (en) Document loading method and device, terminal and storage medium
CN111784163A (en) Data evaluation method, device, equipment and storage medium
CN116912911A (en) Satisfaction data screening method and device, electronic equipment and storage medium
CN111401465A (en) Training sample optimization method, device, equipment and storage medium
CN111507139A (en) Image effect generation method and device and electronic equipment
CN113420809A (en) Video quality evaluation method and device and electronic equipment
CN114153954A (en) Test case recommendation method and device, electronic equipment and storage medium
CN114298137A (en) Tiny target detection system based on countermeasure generation network
CN114355175A (en) Chip performance evaluation method and device, storage medium and computer equipment
CN112581001A (en) Device evaluation method and device, electronic device and readable storage medium
CN111062377A (en) Question number detection method, system, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination