CN114416503A - Data processing method and device, electronic equipment and storage medium - Google Patents

Data processing method and device, electronic equipment and storage medium

Info

Publication number
CN114416503A
CN114416503A
Authority
CN
China
Prior art keywords
user
control
emotion
historical
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111670958.8A
Other languages
Chinese (zh)
Inventor
刘孟
彭飞
邓竹立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 58 Information Technology Co Ltd
Original Assignee
Beijing 58 Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 58 Information Technology Co Ltd filed Critical Beijing 58 Information Technology Co Ltd
Priority to CN202111670958.8A priority Critical patent/CN114416503A/en
Publication of CN114416503A publication Critical patent/CN114416503A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention provides a data processing method, a data processing device, electronic equipment, and a storage medium. The method includes: acquiring, for the manipulation behaviors of each manipulation type supported by a current object the user is manipulating, current manipulation behavior data of the user on the current object; acquiring, for each historical object the user has manipulated in the historical process and for the manipulation behaviors of each manipulation type supported by that historical object, historical manipulation behavior data of the user on the historical object; and obtaining the user's emotion classification for the current object according to the current manipulation behavior data and the historical manipulation behavior data. By taking the user's historical manipulation behavior data on historical objects as a reference and combining it with the user's current manipulation behavior data on the current object, the emotion classification for the current object can be obtained according to the user's actual situation (such as behavior habits), which improves the accuracy of the obtained emotion classification of the user for the current object.

Description

Data processing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a data processing method and apparatus, an electronic device, and a storage medium.
Background
In a personalized recommendation system, emotion classification of a user on an object is often obtained through feedback of the user on the object, preference information of the user is updated according to the emotion classification of the user on the object and the attribute of the object, and then the object meeting the preference of the user is recommended to the user according to the updated preference information of the user.
The user's feedback is often a manipulation behavior performed by the user on the object, for example liking, disliking, purchasing, or sharing.
In this way, the user's emotion classification for an object can be judged from the manipulation behaviors the user performs on it; however, the inventors found that the emotion classification obtained in this way may be inaccurate.
Disclosure of Invention
The application discloses a data processing method, a data processing device, electronic equipment and a storage medium.
In a first aspect, the present application shows a data processing method applied to a terminal, where the method includes:
when a user manipulates a current object, acquiring, for the manipulation behaviors of each manipulation type supported by the current object, current manipulation behavior data of the user on the current object, where the current manipulation behavior data includes the manipulation types of the manipulation behaviors the user actually performed on the current object and the manipulation types of the manipulation behaviors the user did not perform;
for each historical object the user has manipulated in the historical process, acquiring, for the manipulation behaviors of each manipulation type supported by the historical object, historical manipulation behavior data of the user on the historical object, where the historical manipulation behavior data includes the manipulation types of the manipulation behaviors the user actually performed on the historical object and the manipulation types of the manipulation behaviors the user did not perform;
and obtaining the user's emotion classification for the current object according to the current manipulation behavior data and the historical manipulation behavior data.
In an optional implementation manner, the obtaining, according to the current manipulation behavior data and the historical manipulation behavior data, an emotion classification of the user on the current object includes:
obtaining an emotion classification obtaining model, where the emotion classification obtaining model is trained on a plurality of training data sets, and each training data set includes: first sample manipulation behavior data of a sample user on a first sample object, second sample manipulation behavior data of the sample user on a second sample object, and a labeled emotion classification of the sample user for the second sample object;
inputting the current control behavior data and the historical control behavior data into the emotion classification acquisition model so that the emotion classification acquisition model processes the current control behavior data and the historical control behavior data to obtain the emotion classification of the current object by the user, and outputting the emotion classification of the current object by the user;
and acquiring the emotion classification of the current object output by the emotion classification acquisition model.
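As an illustrative sketch only (the application does not specify a model architecture), the model-based flow above can be mimicked with a toy nearest-neighbor classifier over labeled training samples; the feature layout, training data, and labels here are all hypothetical:

```python
# Toy stand-in for the emotion classification obtaining model: a nearest-
# neighbor lookup over labeled samples. Features are the user's historical
# per-type execution rates concatenated with the current executed/not flags.

def featurize(historical_rates, current_flags):
    """Concatenate historical execution rates with current behavior flags."""
    return list(historical_rates) + [float(f) for f in current_flags]

# Each entry: (features, labeled emotion classification).
# Rates and flags are ordered as [dislike, like].
TRAIN = [
    (featurize([0.9, 0.1], [0, 1]), "positive"),  # habitual disliker who liked
    (featurize([0.9, 0.1], [1, 0]), "negative"),
    (featurize([0.1, 0.8], [0, 1]), "positive"),
    (featurize([0.1, 0.8], [0, 0]), "negative"),  # habitual liker who withheld the like
]

def predict(historical_rates, current_flags):
    """Return the label of the nearest training sample (squared-L2 distance)."""
    x = featurize(historical_rates, current_flags)
    sqdist = lambda a: sum((p - q) ** 2 for p, q in zip(a, x))
    return min(TRAIN, key=lambda sample: sqdist(sample[0]))[1]

print(predict([0.9, 0.1], [0, 1]))  # "positive"
```

In a real system the nearest-neighbor lookup would be replaced by whatever trained model the implementation uses; the sketch only shows the input/output shape of the inference step.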
In an optional implementation manner, the obtaining of the emotion classification obtaining model includes:
and acquiring an emotion classification acquisition model suitable for the user from the trained emotion classification acquisition models according to the historical control behavior data.
In an optional implementation manner, the obtaining, according to the historical manipulation behavior data, an emotion classification obtaining model applicable to the user from a plurality of trained emotion classification obtaining models includes:
determining a user set to which the user belongs in a plurality of user sets according to the historical manipulation behavior data;
and selecting the emotion classification acquisition models suitable for the user set to which the user belongs from the emotion classification acquisition models respectively suitable for the user sets.
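A minimal sketch of selecting the model applicable to the user's set, assuming each user set is summarized by a centroid of historical execution rates; the cohort names, centroids, and model handles are hypothetical:

```python
# Pick the model of the user set whose centroid of historical execution
# rates is nearest to this user's rates (all names and data illustrative).

def select_model(historical_rates, cohort_models, cohort_centroids):
    """Return the model of the cohort whose centroid is nearest the user."""
    sqdist = lambda c: sum((p - q) ** 2
                           for p, q in zip(cohort_centroids[c], historical_rates))
    cohort = min(cohort_centroids, key=sqdist)
    return cohort_models[cohort]

# Rates ordered as [dislike rate, like rate].
centroids = {"habitual_dislikers": [0.9, 0.1], "habitual_likers": [0.1, 0.9]}
models = {"habitual_dislikers": "model_A", "habitual_likers": "model_B"}
print(select_model([0.85, 0.2], models, centroids))  # "model_A"
```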
In an optional implementation manner, the obtaining, according to the current manipulation behavior data and the historical manipulation behavior data, an emotion classification of the user on the current object includes:
for any control behavior of a control type supported by the current object, determining a current execution situation of the user on the control behavior of the current object related to the control type according to the current control behavior data, determining a historical execution situation of the user on the control behavior of the historical object related to the control type according to the historical control behavior data, and acquiring an emotion score of the user on the control behavior of the current object related to the control type according to the current execution situation and the historical execution situation;
and acquiring the emotion classification of the current object by the user according to the emotion scores of the control behaviors of the current object on each control type respectively.
In an optional implementation manner, the obtaining, according to the current execution condition and the historical execution condition, an emotion score of a manipulation behavior of the user on the current object with respect to the manipulation type includes:
acquiring basic emotion classification of the control behaviors of the control types;
counting, according to the historical execution condition, a statistical result of whether the user has an execution tendency to perform the manipulation behavior of the manipulation type on the historical objects, and determining, according to the current execution condition, an execution result of whether the user performed the manipulation behavior of the manipulation type on the current object;
and acquiring the emotion score of the user on the control behavior of the current object relative to the control type according to the basic emotion classification of the control behavior of the control type, the statistical result and the execution result.
In an optional implementation manner, the counting, according to the historical execution condition, of a statistical result of whether the user has an execution tendency to perform the manipulation behavior of the manipulation type on the historical objects includes:
acquiring the historical quantity of the historical objects which are controlled by the user in the historical process, and acquiring the execution times of the control behaviors of the control types which are executed on the historical objects by the user in the process of controlling the historical objects in the historical process;
and counting the statistical result of whether the user has the execution tendency of executing the control behavior of the control type on the history object according to the history number and the execution times.
In an optional implementation manner, the counting, according to the history number and the execution times, a result of counting whether the user has an execution tendency to execute the manipulation behavior of the manipulation type on the history object includes:
calculating a ratio between the number of executions and the historical number;
determining that the user has an execution tendency for executing the manipulation behavior of the manipulation type on the historical object when the ratio is greater than or equal to a preset ratio;
or,
determining that the user does not have an execution tendency to execute the manipulation behavior of the manipulation type on the history object in the case that the ratio is smaller than a preset ratio.
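The ratio test described above can be sketched as follows; the preset ratio value is purely illustrative, as the application leaves it to be configured:

```python
def has_execution_tendency(execution_count, history_count, preset_ratio=0.5):
    """True if the user tends to perform this manipulation type: the share of
    historical objects on which the behavior was executed reaches the preset
    ratio (the 0.5 default is illustrative, not prescribed by the text)."""
    if history_count == 0:
        return False  # no history yet: assume no tendency
    return execution_count / history_count >= preset_ratio

# e.g. a user who disliked 90 of 100 browsed short videos has the tendency
print(has_execution_tendency(90, 100))  # True
print(has_execution_tendency(3, 100))   # False
```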
In an optional implementation manner, the obtaining, according to the basic emotion classification, the statistical result, and the execution result of the control behavior of the control type, an emotion score of the user on the control behavior of the current object with respect to the control type includes:
and searching for the emotion score that simultaneously corresponds to the basic emotion classification of the manipulation behavior of the manipulation type, the statistical result, and the execution result, and taking it as the user's emotion score for the manipulation behavior of the manipulation type on the current object.
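One way to realize the lookup described above is a table keyed jointly by the basic emotion classification, the tendency statistic, and the execution result. The numeric scores below are hypothetical; they only illustrate how a habitual non-execution can flip the sign of the score, as in the short-video examples later in this application:

```python
# Hypothetical lookup keyed by (basic emotion classification of the
# manipulation type, whether the user has the execution tendency, whether
# the behavior was executed on the current object) -> emotion score.
SCORE_TABLE = {
    ("negative", True,  True):  -0.2,  # habitual disliker disliked: weak signal
    ("negative", True,  False):  1.0,  # habitual disliker held back: implicit like
    ("negative", False, True):  -1.0,  # rare disliker disliked: strong negative
    ("negative", False, False):  0.0,
    ("positive", True,  True):   0.2,
    ("positive", True,  False): -1.0,  # habitual liker withheld the like: implicit dislike
    ("positive", False, True):   1.0,
    ("positive", False, False):  0.0,
}

def emotion_score(base_emotion, has_tendency, executed):
    """Look up the score that corresponds to all three factors at once."""
    return SCORE_TABLE[(base_emotion, has_tendency, executed)]

print(emotion_score("negative", True, False))  # 1.0
```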
In an optional implementation manner, the obtaining, according to emotion scores of the control behaviors of the user on the current object respectively related to each control type, an emotion classification of the user on the current object includes:
weighting and summing the emotion scores of the current object about the control behaviors of the control types by the user to obtain the total emotion value of the current object by the user;
and obtaining the emotion classification of the current object by the user according to the total emotion value of the current object by the user.
In an optional implementation manner, the obtaining of the emotion classification of the user for the current object according to the total value of the emotion of the user for the current object includes:
under the condition that the total value of the user's emotion on the current object is greater than a preset emotion threshold value, determining that the user's emotion on the current object is classified as a positive emotion;
or,
under the condition that the total value of the user's emotion to the current object is equal to a preset emotion threshold value, determining that the user's emotion to the current object is classified as neutral emotion;
or,
and under the condition that the total value of the user's emotion to the current object is smaller than a preset emotion threshold value, determining that the user's emotion to the current object is classified as negative emotion.
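The weighted summation and threshold comparison above can be sketched as follows; the weights and threshold are illustrative, as the application only fixes the three-way comparison against a preset emotion threshold:

```python
def classify_emotion(scores, weights, threshold=0.0):
    """Weighted-sum the per-manipulation-type emotion scores and compare the
    total with a preset emotion threshold (weights/threshold illustrative)."""
    total = sum(score * weights.get(kind, 1.0) for kind, score in scores.items())
    if total > threshold:
        return "positive"
    if total == threshold:
        return "neutral"
    return "negative"

scores = {"like": 1.0, "dislike": 0.0, "share": 0.5}
weights = {"like": 2.0, "dislike": 3.0, "share": 1.0}
print(classify_emotion(scores, weights))  # "positive" (total = 2.5 > 0)
```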
In a second aspect, the present application shows a data processing apparatus applied to a terminal, the apparatus comprising:
a first obtaining module, configured to, when a user operates a current object, obtain, for control behaviors of each control type supported by the current object, current control behavior data of the current object by the user, where the current control behavior data includes a control type of a control behavior actually executed by the user on the current object and a control type of a control behavior not executed;
a second obtaining module, configured to obtain, for each historical object that is controlled by the user in a historical process, historical control behavior data of the historical object by the user for control behaviors of each control type supported by the historical object, where the historical control behavior data includes a control type of a control behavior actually executed by the user on the historical object and a control type of a control behavior not executed by the user;
and the third obtaining module is used for obtaining the emotion classification of the current object by the user according to the current control behavior data and the historical control behavior data.
In an optional implementation manner, the third obtaining module includes:
a first obtaining unit, configured to obtain an emotion classification obtaining model, where the emotion classification obtaining model is trained on a plurality of training data sets, and each training data set includes: first sample manipulation behavior data of a sample user on a first sample object, second sample manipulation behavior data of the sample user on a second sample object, and a labeled emotion classification of the sample user for the second sample object;
the input unit is used for inputting the current control behavior data and the historical control behavior data into the emotion classification acquisition model so that the emotion classification acquisition model processes the current control behavior data and the historical control behavior data to obtain the emotion classification of the current object by the user and output the emotion classification of the current object by the user;
and the second acquisition unit is used for acquiring the emotion classification of the current object, which is output by the emotion classification acquisition model, of the user.
In an optional implementation manner, the first obtaining unit includes:
and the first obtaining subunit is used for obtaining the emotion classification obtaining model suitable for the user from the trained emotion classification obtaining models according to the historical control behavior data.
In an optional implementation manner, the first obtaining subunit is specifically configured to: determining a user set to which the user belongs in a plurality of user sets according to the historical manipulation behavior data; and selecting the emotion classification acquisition models suitable for the user set to which the user belongs from the emotion classification acquisition models respectively suitable for the user sets.
In an optional implementation manner, the third obtaining module includes:
the device comprises a first determining unit, a second determining unit and a third acquiring unit, wherein the first determining unit is used for determining the current execution situation of the user on the control behavior of the current object relative to the control type according to the current control behavior data for the control behavior of any control type supported by the current object, the second determining unit is used for determining the historical execution situation of the user on the control behavior of the historical object relative to the control type according to the historical control behavior data, and the third acquiring unit is used for acquiring the emotion score of the user on the control behavior of the current object relative to the control type according to the current execution situation and the historical execution situation;
and the fourth acquisition unit is used for acquiring the emotion classification of the current object by the user according to the emotion scores of the control behaviors of the current object on each control type respectively.
In an optional implementation manner, the third obtaining unit includes:
the second obtaining subunit is used for obtaining the basic emotion classification of the control behavior of the control type;
a statistic subunit, configured to count, according to the historical execution condition, a statistic result of whether the user has an execution tendency to execute the manipulation behavior of the manipulation type on the historical object, and a determination subunit, configured to determine, according to the current execution condition, an execution result of whether the user has executed the manipulation behavior of the manipulation type on the current object;
and the third obtaining subunit is configured to obtain, according to the basic emotion classification of the control behavior of the control type, the statistical result, and the execution result, an emotion score of the control behavior of the control type of the current object by the user.
In an optional implementation manner, the statistics subunit is specifically configured to: acquiring the historical quantity of the historical objects which are controlled by the user in the historical process, and acquiring the execution times of the control behaviors of the control types which are executed on the historical objects by the user in the process of controlling the historical objects in the historical process; and counting the statistical result of whether the user has the execution tendency of executing the control behavior of the control type on the history object according to the history number and the execution times.
In an optional implementation manner, the statistics subunit is specifically configured to: calculating a ratio between the number of executions and the historical number; determining that the user has an execution tendency for executing the manipulation behavior of the manipulation type on the historical object when the ratio is greater than or equal to a preset ratio; or, in a case that the ratio is smaller than a preset ratio, it is determined that the user does not have an execution tendency to execute the manipulation behavior of the manipulation type on the history object.
In an optional implementation manner, the third obtaining subunit is specifically configured to: search for the emotion score that simultaneously corresponds to the basic emotion classification of the manipulation behavior of the manipulation type, the statistical result, and the execution result, and take it as the user's emotion score for the manipulation behavior of the manipulation type on the current object.
In an optional implementation manner, the fourth obtaining unit includes:
the summation subunit is used for weighting and summing the emotion scores of the control behaviors of the current object respectively related to each control type by the user to obtain the total emotion value of the current object by the user;
and the fourth obtaining subunit is configured to obtain the emotion classification of the current object by the user according to the total value of the emotion of the current object by the user.
In an optional implementation manner, the fourth obtaining subunit is specifically configured to: determine that the user's emotion on the current object is classified as a positive emotion when the total emotion value of the user for the current object is greater than a preset emotion threshold; or determine that the user's emotion on the current object is classified as a neutral emotion when the total emotion value equals the preset emotion threshold; or determine that the user's emotion on the current object is classified as a negative emotion when the total emotion value is less than the preset emotion threshold.
In a third aspect, the present application shows an electronic device comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the data processing method of the first aspect.
In a fourth aspect, the present application shows a non-transitory computer readable storage medium having instructions which, when executed by a processor of an electronic device, enable the electronic device to perform the data processing method of the first aspect.
In a fifth aspect, the present application shows a computer program product, in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform the data processing method according to the first aspect.
The technical scheme provided by the application can comprise the following beneficial effects:
the analysis of the inventor shows that: the emotion classification of the user on the current object is judged only according to the control behavior executed by the user on the current object, that is, the emotion classification of the user on the current object is judged absolutely by using the control behavior executed by the user on the current object, which may possibly cause the acquired emotion classification of the user on the current object to be inaccurate.
For example, the basic emotion classification of the "dislike" manipulation behavior is a negative emotion: if the user dislikes an object, this indicates that the user does not like the object, while if the user does not dislike it, this indicates neither that the user likes it nor that the user dislikes it, and based solely on the state "the user did not dislike the object" the user's emotion toward the object is usually determined to be neutral. Suppose that, while browsing short videos one by one in a short-video application, a user dislikes almost every short video browsed; disliking a short video usually indicates that the user does not like it. If the user then does not dislike a certain short video while browsing it, then, combined with the user's habit, the user probably likes that short video (because, according to the user's habit, the user would have disliked it if the user did not like it), and "the user did not dislike the short video" can be regarded as implicit feedback of positive emotion. However, based only on "the user did not dislike the short video", the user's emotion classification for the short video is usually determined to be neutral, which is inconsistent with the positive emotion the user actually holds toward it, so the obtained emotion classification of the user for the current object is inaccurate.
For another example, the basic emotion classification of the "like" manipulation behavior is a positive emotion: if the user likes an object, this indicates that the user likes the object, while if the user does not like it, this indicates neither that the user dislikes it nor that the user likes it, and based solely on the state "the user did not like the object" the user's emotion toward the object is usually determined to be neutral. Suppose that, while browsing short videos one by one in a short-video application, a user likes almost every short video browsed; liking a short video usually indicates that the user likes it. If the user then does not like a certain short video while browsing it, then, combined with the user's habit, the user probably does not like that short video (because, according to the user's habit, the user would have liked it if the user liked it), and "the user did not like the short video" can be regarded as implicit feedback of negative emotion. However, based only on "the user did not like the short video", the user's emotion classification for the short video is usually determined to be neutral, which is inconsistent with the negative emotion the user actually holds toward it, so the obtained emotion classification of the user for the current object is inaccurate.
Secondly, historical control behavior data of the user on the historical object in the historical process can reflect historical behavior habits of the user in the historical process, and current control behavior data of the user on the current object can reflect a current control state of the user.
If the difference between the current control state and the historical behavior habit of the user in the historical process is large, the emotion classification of the user on the current object, which can be reflected by the current control behavior data of the user on the current object, is usually negative emotion or positive emotion.
In view of this, in the present application, when a user manipulates a current object, current manipulation behavior data of the user on the current object is acquired for manipulation behaviors of each manipulation type supported by the current object, where the current manipulation behavior data of the user on the current object includes a manipulation type of a manipulation behavior actually performed on the current object by the user and a manipulation type of a manipulation behavior not performed on the current object. For each historical object which is controlled by a user in the historical process, historical control behavior data of the user on the historical object is obtained according to control behaviors of each control type supported by the historical object, wherein the historical control behavior data of the user on the historical object comprises the control type of the control behavior actually executed by the user on the historical object and the control type of the control behavior not executed. And obtaining the emotion classification of the current object by the user according to the current control behavior data of the current object by the user and the historical control behavior data of the historical object by the user.
By the method and the device, the historical control behavior data of the user on the historical object in the historical process is taken as a reference, namely the historical behavior habit of the user in the historical process is taken as a reference, and the emotion classification of the user on the current object is obtained by combining the current control behavior data of the user on the current object, so that the emotion classification of the user on the current object can be obtained according to the actual conditions (such as behavior habit and the like) of the user, and the emotion classification accuracy of the obtained user on the current object can be improved.
Drawings
FIG. 1 is a flow chart of the steps of a data processing method of the present application.
FIG. 2 is a flow chart of the steps of a data processing method of the present application.
FIG. 3 is a flow chart of the steps of a data processing method of the present application.
FIG. 4 is a flow chart of steps of a data processing method of the present application.
Fig. 5 is a block diagram of a data processing apparatus according to the present application.
FIG. 6 is a block diagram of an electronic device of the present application.
FIG. 7 is a block diagram of an electronic device of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flowchart illustrating steps of a data processing method according to the present application is shown, where the method is applied to a terminal, and the method may specifically include the following steps:
in step S101, when the user manipulates the current object, current manipulation behavior data of the user on the current object is acquired for the manipulation behaviors of each manipulation type supported by the current object. The current manipulation behavior data includes the manipulation types of manipulation behaviors the user actually performed on the current object and the manipulation types of manipulation behaviors the user did not perform.
The manipulation behaviors supported by the current object may include: browsing for different durations (such as short-duration browsing, medium-duration browsing, and long-duration browsing, where the duration ranges may be determined according to actual conditions, for example 0-20 seconds, 20-40 seconds, and 40-60 seconds), liking, disliking, reporting, complaining, sharing, favoriting, adding to a shopping cart, purchasing, signing up, downloading, forwarding, and the like. These are manipulation behaviors of different manipulation types.
The current manipulation behavior data of the user on the current object indicates, for the manipulation behavior of each manipulation type supported by the current object, whether the user has performed that manipulation behavior.
For example, whether the user performed a browsing behavior of a certain duration, a liking behavior, a disliking behavior, a reporting behavior, a complaining behavior, a sharing behavior, a favoriting behavior, an adding-to-shopping-cart behavior, a purchasing behavior, a signing-up behavior, a downloading behavior, a forwarding behavior, and the like on the current object.
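One possible way to represent such manipulation behavior data is a 0/1 vector over the supported manipulation types, where a 0 entry records an implicit "not performed" feedback. The type names and the dictionary-free encoding below are illustrative assumptions, not part of the claimed method:

```python
# Hypothetical encoding of manipulation behavior data: for each manipulation
# type supported by the object, record 1 if the user actually performed it
# and 0 if not (the 0 entries are the implicit feedback described above).
MANIPULATION_TYPES = [
    "short_browse", "medium_browse", "long_browse", "like", "dislike",
    "report", "complain", "share", "favorite", "add_to_cart",
    "purchase", "sign_up", "download", "forward",
]

def encode_behavior(performed_types):
    """Return a 0/1 vector over all supported manipulation types."""
    performed = set(performed_types)
    return [1 if t in performed else 0 for t in MANIPULATION_TYPES]

current_data = encode_behavior({"long_browse", "like", "share"})
```

The same encoding could be reused for the historical manipulation behavior data, one vector per history object.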
The current manipulation behavior data of the user on the current object may be collected in real time while the user manipulates the current object.
While the user browses the current object, if, among the manipulation behaviors of each manipulation type supported by the current object, the user does not perform a manipulation behavior of a certain manipulation type, then "not performing the manipulation behavior of that manipulation type" may be regarded as implicit feedback of the user on the current object.
In other words, the manipulation behaviors of the manipulation types the user did not perform on the current object may be regarded as implicit feedback of the user on the current object.
In step S102, for each history object manipulated by the user in the historical process, historical manipulation behavior data of the user on the history object is acquired for the manipulation behaviors of each manipulation type supported by the history object. The historical manipulation behavior data includes the manipulation types of manipulation behaviors the user actually performed on the history object and the manipulation types of manipulation behaviors the user did not perform.
When the user manipulated a history object in the historical process, the historical manipulation behavior data of the user on that history object was automatically recorded, so the historical manipulation behavior data can be acquired from the recorded content.
The history objects manipulated by the user in the historical process may include each history object manipulated by the user within a period close to the current moment, so as to improve timeliness.
The manipulation behaviors supported by the history object may include: browsing for different durations (such as short-duration browsing, medium-duration browsing, and long-duration browsing, where the duration ranges may be determined according to actual conditions, for example 0-20 seconds, 20-40 seconds, and 40-60 seconds), liking, disliking, reporting, complaining, sharing, favoriting, adding to a shopping cart, purchasing, signing up, downloading, forwarding, and the like. These are manipulation behaviors of different manipulation types.
The historical manipulation behavior data of the user on the history object indicates, for the manipulation behavior of each manipulation type supported by the history object, whether the user has performed that manipulation behavior.
For example, whether the user performed a browsing behavior of a certain duration, a liking behavior, a disliking behavior, a reporting behavior, a complaining behavior, a sharing behavior, a favoriting behavior, an adding-to-shopping-cart behavior, a purchasing behavior, a signing-up behavior, a downloading behavior, a forwarding behavior, and the like on the history object.
In step S103, the emotion classification of the current object by the user is obtained according to the current manipulation behavior data of the user on the current object and the historical manipulation behavior data of the user on the history objects.
The emotion classifications include positive emotion, neutral emotion, negative emotion, and the like.
When the emotion of the user toward the current object is classified as a positive emotion, the user tends to like the current object. When it is classified as a negative emotion, the user tends to dislike the current object. When it is classified as a neutral emotion, the user neither clearly likes nor clearly dislikes the current object.
Of course, finer-grained emotion classifications may also be defined.
For example, positive emotions may be divided into multiple levels, where a higher-level positive emotion represents a greater degree of liking than a lower-level positive emotion.
Similarly, negative emotions may be divided into multiple levels, where a higher-level negative emotion represents a greater degree of dislike than a lower-level negative emotion.
The step may specifically refer to the embodiment shown in fig. 2 or fig. 3, and will not be described in detail here.
The inventor's analysis shows that: judging the emotion classification of the current object by the user only according to the manipulation behaviors the user performed on the current object, that is, treating the performed manipulation behaviors as an absolute indicator, may cause the obtained emotion classification to be inaccurate.
For example, the basic emotion classification of the "dislike" manipulation behavior is a negative emotion: if the user dislikes an object, it indicates that the user does not like it, but if the user does not dislike it, that by itself indicates neither liking nor disliking, so based only on the state "the user did not dislike the object", the emotion of the user toward the object would usually be judged as neutral. Suppose that when a user browses short videos one by one in a short-video application, the user dislikes almost every short video browsed; disliking a short video usually indicates that the user does not like it. If the user then browses a short video without disliking it, then, in light of the user's habit (by habit, the user would dislike a short video the user does not like), the user probably likes this short video, and "not disliking the short video" can be regarded as implicit feedback of a positive emotion. However, judging only from "the user did not dislike the short video", the emotion of the user toward the short video would be classified as neutral, which is inconsistent with the positive emotion the user actually holds, so the obtained emotion classification of the current object by the user is inaccurate.
For another example, the basic emotion classification of the "like" manipulation behavior is a positive emotion: if the user likes an object, it indicates that the user likes it, but if the user does not like it, that by itself indicates neither disliking nor liking, so based only on the state "the user did not like the object", the emotion of the user toward the object would usually be judged as neutral. Suppose that when a user browses short videos one by one in a short-video application, the user likes almost every short video browsed; liking a short video usually indicates that the user likes it. If the user then browses a short video without liking it, then, in light of the user's habit (by habit, the user would like a short video the user likes), the user probably does not like this short video, and "not liking the short video" can be regarded as implicit feedback of a negative emotion. However, judging only from "the user did not like the short video", the emotion of the user toward the short video would be classified as neutral, which is inconsistent with the negative emotion the user actually holds, so the obtained emotion classification of the current object by the user is inaccurate.
Secondly, the historical manipulation behavior data of the user on history objects in the historical process can reflect the user's historical behavior habits, and the current manipulation behavior data of the user on the current object can reflect the user's current manipulation state.
If the current manipulation state differs greatly from the user's historical behavior habits, the emotion classification reflected by the current manipulation behavior data is usually a negative emotion or a positive emotion.
In view of this, in the present application, when a user manipulates a current object, current manipulation behavior data of the user on the current object is acquired for the manipulation behaviors of each manipulation type supported by the current object. The current manipulation behavior data includes the manipulation types of manipulation behaviors the user actually performed on the current object and the manipulation types of manipulation behaviors the user did not perform. For each history object manipulated by the user in the historical process, historical manipulation behavior data of the user on the history object is likewise acquired for the manipulation behaviors of each manipulation type supported by the history object; it includes the manipulation types of manipulation behaviors the user actually performed on the history object and the manipulation types of manipulation behaviors the user did not perform. The emotion classification of the current object by the user is then obtained according to the current manipulation behavior data and the historical manipulation behavior data.
With the method and device of the present application, the historical manipulation behavior data of the user on history objects in the historical process, that is, the user's historical behavior habits, is taken as a reference and combined with the current manipulation behavior data of the user on the current object to obtain the emotion classification of the current object by the user. Because the classification takes the user's actual circumstances (such as behavior habits) into account, the accuracy of the obtained emotion classification of the current object by the user can be improved.
In one embodiment of the present application, referring to fig. 2, step S103 includes:
in step S201, an emotion classification acquisition model is acquired. The emotion classification acquisition model is trained on a plurality of training data sets, each of which includes: first sample manipulation behavior data of a sample user on a first sample object, second sample manipulation behavior data of the sample user on a second sample object, and a labeled emotion classification of the second sample object by the sample user.
The emotion classification acquisition model may be trained in advance.
For example, a plurality of training data sets may be acquired, each including: first sample manipulation behavior data of a sample user on a first sample object, second sample manipulation behavior data of the sample user on a second sample object, and a labeled emotion classification of the second sample object by the sample user.
The first sample manipulation behavior data of the sample user on the first sample object includes the manipulation types of manipulation behaviors the sample user actually performed on the first sample object and the manipulation types of manipulation behaviors not performed.
The second sample manipulation behavior data of the sample user on the second sample object includes the manipulation types of manipulation behaviors the sample user actually performed on the second sample object and the manipulation types of manipulation behaviors not performed.
A model may then be trained using the plurality of training data sets until the parameters in the model converge, thereby obtaining the emotion classification acquisition model.
The model may include a convolutional neural network, a recurrent neural network, or the like.
The trained emotion classification acquisition model may be deployed in the terminal, so that the terminal may directly acquire the emotion classification acquisition model deployed in the terminal, and then may execute step S202.
In step S202, the current manipulation behavior data of the user on the current object and the historical manipulation behavior data of the user on the history objects are input into the emotion classification acquisition model. The emotion classification acquisition model processes these data to obtain the emotion classification of the current object by the user and outputs it.
In step S203, the emotion classification of the current object by the user, as output by the emotion classification acquisition model, is acquired.
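The data flow of steps S201 to S203 can be sketched as follows. The application specifies a trained neural network as the model; the stub below is purely illustrative of the input/output contract, and all names, the toy scoring rule, and the 0.5 cutoffs are assumptions:

```python
# Sketch of the inference contract in steps S201-S203. A stub stands in for
# the trained model so the data flow is concrete.
def emotion_classification_model(current_data, historical_data):
    """Stub emotion classification acquisition model.

    current_data: 0/1 vector of the user's current manipulation behaviors.
    historical_data: list of 0/1 vectors, one per history object.
    Returns one of "positive", "neutral", "negative".
    """
    n = len(historical_data)
    # Habit: fraction of history objects on which each behavior was performed.
    habit = [sum(h[i] for h in historical_data) / n for i in range(len(current_data))]
    # Toy rule (illustrative only): deviation from habit drives the label.
    score = sum(c - p for c, p in zip(current_data, habit))
    if score > 0.5:
        return "positive"
    if score < -0.5:
        return "negative"
    return "neutral"

label = emotion_classification_model(
    current_data=[1, 0, 1],
    historical_data=[[1, 1, 0], [1, 1, 0], [1, 0, 0]],
)
```

The stub only illustrates that both the current and the historical behavior data enter the model together; the real model learns this mapping from the training data sets.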
Further, in an embodiment of the present application, a plurality of emotion classification acquisition models may be trained in advance, each applicable to a subset of users, where the subsets of users applicable to different models do not overlap.
Therefore, when the emotion classification acquisition model is acquired, the model applicable to the user may be selected from the trained emotion classification acquisition models according to the historical manipulation behavior data of the user on the history objects.
In this way, when obtaining the emotion classification of the current object by the user according to the current and historical manipulation behavior data, an emotion classification acquisition model tailored to the user is used instead of a generic one, which can further improve the accuracy of the obtained emotion classification.
To select the applicable model, the user set to which the user belongs may first be determined among a plurality of user sets according to the historical manipulation behavior data of the user on the history objects, and then the emotion classification acquisition model applicable to that user set may be selected from the models respectively applicable to the user sets.
The emotion classification acquisition models applicable to different user sets are different.
The historical control behavior data of the users in one user set on the historical object are similar, and the historical control behavior data of the users in different user sets on the historical object are dissimilar.
In this way, the user set to which the user belongs can be determined among the plurality of user sets according to the historical manipulation behavior data of the user on the history objects.
For example, for any user set, statistical data of the historical manipulation behavior data of each user in the user set on history objects is obtained: the number of times the users in the user set performed the manipulation behavior of each manipulation type on history objects is obtained, these execution counts are summed to obtain a total execution count, and the ratio of each manipulation type's execution count to the total is calculated to obtain the execution proportion of each manipulation type within the user set. Such statistical data can reflect the manipulation characteristics (historical behavior habits and the like) of the users in the user set on history objects.
The same is true for each of the other user sets.
Then, statistical data of the historical manipulation behavior data of the user on history objects is obtained in the same way: the number of times the user performed the manipulation behavior of each manipulation type on history objects is obtained, the counts are summed to obtain the user's total execution count, and the ratio of each manipulation type's execution count to the total is calculated to obtain the user's execution proportion of each manipulation type. This statistical data can reflect the user's manipulation characteristics (historical behavior habits and the like) on history objects.
The similarity between the user's statistical data and each user set's statistical data can then be calculated. For example, for any user set and any manipulation type, the square of the difference between the user's execution proportion of that manipulation type and the user set's execution proportion of that manipulation type is calculated; the square root of the sum of these squares over all manipulation types is then taken as the distance between the user's statistical data and the user set's statistical data, where a smaller distance indicates a higher similarity. The same is done for each of the other user sets.
The user set whose statistical data has the highest similarity to the user's statistical data (that is, the smallest distance) is then taken as the user set to which the user belongs.
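The user-set selection described above can be sketched as follows; the function names and sample data are illustrative assumptions:

```python
import math

def execution_proportions(counts):
    """counts: {manipulation_type: execution count} -> execution proportions."""
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()}

def distance(props_a, props_b):
    """Square root of the summed squared proportion differences
    (a Euclidean distance; smaller means more similar)."""
    types = set(props_a) | set(props_b)
    return math.sqrt(sum((props_a.get(t, 0.0) - props_b.get(t, 0.0)) ** 2 for t in types))

def assign_user_set(user_counts, set_counts_by_id):
    """Pick the user set whose execution proportions are closest to the user's."""
    user_props = execution_proportions(user_counts)
    return min(
        set_counts_by_id,
        key=lambda sid: distance(user_props, execution_proportions(set_counts_by_id[sid])),
    )

best = assign_user_set(
    {"like": 8, "dislike": 2},
    {"set_a": {"like": 90, "dislike": 10}, "set_b": {"like": 20, "dislike": 80}},
)
```

Here the user who likes 80% of objects lands in the set whose members also mostly like, matching the habit-similarity criterion above.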
In another embodiment of the present application, a training method for emotion classification acquisition models applicable to different user sets includes:
and acquiring first sample manipulation behavior data of each sample user in the plurality of sample users on the first sample object aiming at the manipulation behaviors of each manipulation type supported by the first sample object. The first sample manipulation behavior data includes manipulation types of manipulation behaviors that are actually performed on the first sample object by the sample user and manipulation types of manipulation behaviors that are not performed.
The plurality of sample users are then classified according to each sample user's first sample manipulation behavior data on the first sample object, obtaining at least two sample user sets.
For example, the plurality of sample users may be classified in an existing manner (e.g., a K-nearest-neighbor algorithm) according to their first sample manipulation behavior data on the first sample object; the sample users in different sample user sets do not overlap.
The historical manipulation behavior data on history objects of the users within one sample user set are similar, while those of users in different sample user sets are dissimilar.
For any sample user set, the emotion classification acquisition model applicable to the sample user set can be obtained in the following manner, and the same is true for each other sample user set.
For any first sample user in the sample user set, second sample manipulation behavior data of the first sample user on the second sample object is acquired for the manipulation behaviors of each manipulation type supported by the second sample object; the second sample manipulation behavior data includes the manipulation types of manipulation behaviors the first sample user actually performed on the second sample object and the manipulation types of manipulation behaviors not performed. The labeled emotion classification of the second sample object by the first sample user is also acquired. The first sample manipulation behavior data of the first sample user on the first sample object, the second sample manipulation behavior data of the first sample user on the second sample object, and the labeled emotion classification of the second sample object by the first sample user are combined into a training data set of the first sample user. The same is done for every sample user in the sample user set, yielding the training data sets of the sample users in the set.
The model is then trained on the training data sets of the sample users in the sample user set until the parameters in the model converge, thereby obtaining the emotion classification acquisition model applicable to that sample user set.
The model may include a convolutional neural network or a recurrent neural network.
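The application names a convolutional or recurrent neural network as the model. As a lightweight stand-in for illustration only, the sketch below fits a logistic-regression-style classifier by gradient descent on training data sets of the shape described above; the feature layout (first-sample vector concatenated with the second-sample vector), the binary positive/negative label, and all names are assumptions:

```python
import math

def train_emotion_model(training_sets, lr=0.5, epochs=200):
    """training_sets: list of (first_sample_vec, second_sample_vec, label),
    with label 1 for a positive labeled emotion and 0 for a negative one."""
    dim = len(training_sets[0][0]) + len(training_sets[0][1])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for first, second, label in training_sets:
            x = first + second                     # concatenate the two sample vectors
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1.0 / (1.0 + math.exp(-z))      # sigmoid
            err = pred - label                     # gradient of the log-loss w.r.t. z
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(model, first, second):
    w, b = model
    z = sum(wi * xi for wi, xi in zip(w, first + second)) + b
    return 1 if z > 0 else 0

model = train_emotion_model([
    ([1, 0], [1, 1], 1),   # habits plus performed behaviors -> positive label
    ([1, 0], [0, 0], 0),   # same habits, behaviors not performed -> negative label
])
```

Training one such model per sample user set, each on that set's training data only, mirrors the per-set training described above.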
In one embodiment of the present application, referring to fig. 3, step S103 includes:
in step S301, for any manipulation type supported by the current object: the current execution situation of the manipulation behavior of that manipulation type on the current object is determined according to the current manipulation behavior data of the user on the current object; the historical execution situation of the manipulation behavior of that manipulation type on history objects is determined according to the historical manipulation behavior data of the user on the history objects; and the emotion score of the user's manipulation behavior of that manipulation type on the current object is obtained according to the current execution situation and the historical execution situation.
The current execution situation indicates whether the user performed the manipulation behavior of that manipulation type on the current object.
The historical execution situation indicates whether the user performed the manipulation behavior of that manipulation type on the history objects.
For the specific manner of obtaining the emotion score of the manipulation behavior of that manipulation type on the current object according to the current execution situation and the historical execution situation, reference may be made to the embodiment shown in fig. 4, which is not described in detail here.
In step S302, the emotion classification of the current object by the user is obtained according to the emotion scores of the manipulation behaviors of each manipulation type on the current object.
In this step, the emotion scores of the manipulation behaviors of each manipulation type on the current object may be weighted and summed to obtain the user's total emotion value for the current object, and the emotion classification of the current object by the user may then be obtained according to this total emotion value.
For example, when the user's total emotion value for the current object is greater than a preset emotion threshold, the emotion of the user toward the current object is classified as a positive emotion. When the total emotion value equals the preset emotion threshold, it is classified as a neutral emotion. When the total emotion value is less than the preset emotion threshold, it is classified as a negative emotion.
The preset emotion threshold may be determined according to actual conditions, and this is not limited in this application.
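The weighted summation and threshold comparison of step S302 can be sketched as follows; the weight values, score values, and the default threshold of 0.0 are illustrative assumptions:

```python
# Sketch of step S302: weight and sum the per-manipulation-type emotion
# scores, then compare the total with a preset emotion threshold.
def classify_emotion(scores, weights, threshold=0.0):
    """scores/weights: {manipulation_type: value}. Returns the emotion class."""
    total = sum(scores[t] * weights.get(t, 1.0) for t in scores)
    if total > threshold:
        return "positive"
    if total < threshold:
        return "negative"
    return "neutral"

label = classify_emotion(
    scores={"like": 1.0, "dislike": 0.0, "share": 1.0},
    weights={"like": 2.0, "dislike": 1.5, "share": 1.0},
)
```

A missing weight defaults to 1.0 here, which is one possible convention; in practice the weights would be set per manipulation type according to actual conditions.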
In one embodiment of the present application, referring to fig. 4, step S301 includes:
in step S401, a basic emotion classification of the manipulation behavior of the manipulation type is obtained.
In the present application, the manipulation behavior of each manipulation type supported by the current object has its own basic emotion classification, which is either a positive emotion or a negative emotion.
For example, the basic emotion classification of liking is a positive emotion, that of disliking is a negative emotion, that of reporting is a negative emotion, that of complaining is a negative emotion, that of sharing is a positive emotion, that of favoriting is a positive emotion, that of adding to a shopping cart is a positive emotion, that of purchasing is a positive emotion, that of signing up is a positive emotion, that of downloading is a positive emotion, that of forwarding is a positive emotion, and so on.
The basic emotion classification of the control behavior of each of the control behaviors supported by the current object may be manually set in advance according to experience and actual conditions.
In step S402, a statistical result indicating whether the user has a tendency to perform the manipulation behavior of that manipulation type on history objects is computed according to the historical execution situation, and an execution result indicating whether the user performed the manipulation behavior of that manipulation type on the current object is determined according to the current execution situation.
The statistical result of whether the user has a tendency to perform the manipulation behavior of that manipulation type on history objects may be computed from the historical execution situation through the following process:
4021. The history number of history objects manipulated by the user in the historical process is acquired, together with the number of times the user performed the manipulation behavior of that manipulation type on the history objects.
4022. The statistical result of whether the user has a tendency to perform the manipulation behavior of that manipulation type on history objects is computed from the history number and the execution count.
For example, the ratio of the execution count to the history number may be calculated. When the ratio is greater than or equal to a preset ratio, it is determined that the user has a tendency to perform the manipulation behavior of that manipulation type on history objects; when the ratio is less than the preset ratio, it is determined that the user does not.
The preset ratio may be determined according to actual conditions, and the application is not limited thereto.
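The tendency statistic of steps 4021 and 4022 reduces to a single ratio comparison; the default preset ratio of 0.5 below is an illustrative assumption:

```python
# Sketch of steps 4021-4022: the ratio of the execution count to the number
# of history objects is compared with a preset ratio.
def has_execution_tendency(execution_count, history_count, preset_ratio=0.5):
    """True if the user tends to perform this manipulation type on history objects."""
    return execution_count / history_count >= preset_ratio

# E.g., a user who disliked 95 of the last 100 history objects has a
# tendency to perform the dislike behavior.
tends_to_dislike = has_execution_tendency(execution_count=95, history_count=100)
```
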
In step S403, the emotion score of the user's manipulation behavior of that manipulation type on the current object is obtained according to the basic emotion classification of the manipulation behavior, the statistical result, and the execution result.
In one example, when the basic emotion of the control behavior of the control type is classified as a positive emotion, if the statistical result indicates that the user has a tendency to execute the control behavior of the control type on the historical objects, and the execution result indicates that the user did not execute the control behavior of the control type on the current object, the emotion score of the user on the current object with respect to the control behavior of the control type is set to a first negative emotion score.
Or, when the basic emotion of the control behavior of the control type is classified as a positive emotion, if the statistical result indicates that the user has a tendency to execute the control behavior of the control type on the historical objects, and the execution result indicates that the user executed the control behavior of the control type on the current object, the emotion score of the user on the current object with respect to the control behavior of the control type is set to a neutral emotion score. The neutral emotion score is greater than the first negative emotion score.
Or, when the basic emotion of the control behavior of the control type is classified as a positive emotion, if the statistical result indicates that the user does not have a tendency to execute the control behavior of the control type on the historical objects, and the execution result indicates that the user did not execute the control behavior of the control type on the current object, the emotion score of the user on the current object with respect to the control behavior of the control type is set to a neutral emotion score.
Or, when the basic emotion of the control behavior of the control type is classified as a positive emotion, if the statistical result indicates that the user does not have a tendency to execute the control behavior of the control type on the historical objects, and the execution result indicates that the user executed the control behavior of the control type on the current object, the emotion score of the user on the current object with respect to the control behavior of the control type is set to a first positive emotion score. The first positive emotion score is greater than the neutral emotion score.
Or, when the basic emotion of the control behavior of the control type is classified as a negative emotion, if the statistical result indicates that the user has a tendency to execute the control behavior of the control type on the historical objects, and the execution result indicates that the user did not execute the control behavior of the control type on the current object, the emotion score of the user on the current object with respect to the control behavior of the control type is set to a second positive emotion score.
Or, when the basic emotion of the control behavior of the control type is classified as a negative emotion, if the statistical result indicates that the user has a tendency to execute the control behavior of the control type on the historical objects, and the execution result indicates that the user executed the control behavior of the control type on the current object, the emotion score of the user on the current object with respect to the control behavior of the control type is set to a neutral emotion score. The neutral emotion score is less than the second positive emotion score.
Or, when the basic emotion of the control behavior of the control type is classified as a negative emotion, if the statistical result indicates that the user does not have a tendency to execute the control behavior of the control type on the historical objects, and the execution result indicates that the user did not execute the control behavior of the control type on the current object, the emotion score of the user on the current object with respect to the control behavior of the control type is set to a neutral emotion score.
Or, when the basic emotion of the control behavior of the control type is classified as a negative emotion, if the statistical result indicates that the user does not have a tendency to execute the control behavior of the control type on the historical objects, and the execution result indicates that the user executed the control behavior of the control type on the current object, the emotion score of the user on the current object with respect to the control behavior of the control type is set to a second negative emotion score. The second negative emotion score is less than the neutral emotion score.
Thus, a correspondence among four items can be set in advance: the basic emotion classification of a control behavior, the statistical result of the tendency to execute the control behavior, the execution result of the control behavior, and the user's emotion score regarding the control behavior. The correspondence includes entries pairing different basic emotion classifications, different tendency statistics, and different execution results with the corresponding emotion scores of the user. The emotion score that simultaneously corresponds to the basic emotion classification, the statistical result, and the execution result of the control behavior of the control type can then be looked up in this correspondence and used as the emotion score of the user on the current object with respect to the control behavior of the control type.
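The eight cases above can be illustrated with a minimal lookup-table sketch. This is an illustrative assumption in Python: the score constants, key tuples, and function name do not appear in the application, which only fixes the relative ordering of the scores.

```python
# Illustrative emotion score constants; the application only fixes their
# ordering (first negative < neutral < first positive, and
# second negative < neutral < second positive), not their exact values.
FIRST_NEGATIVE_SCORE = -1.0
SECOND_NEGATIVE_SCORE = -1.0
NEUTRAL_SCORE = 0.0
FIRST_POSITIVE_SCORE = 1.0
SECOND_POSITIVE_SCORE = 1.0

# Key: (basic emotion, has execution tendency, executed on current object).
SCORE_TABLE = {
    ("positive", True,  False): FIRST_NEGATIVE_SCORE,   # habitual act skipped
    ("positive", True,  True):  NEUTRAL_SCORE,          # habit merely followed
    ("positive", False, False): NEUTRAL_SCORE,          # no signal either way
    ("positive", False, True):  FIRST_POSITIVE_SCORE,   # unusual positive act
    ("negative", True,  False): SECOND_POSITIVE_SCORE,  # habitual dislike skipped
    ("negative", True,  True):  NEUTRAL_SCORE,          # habit merely followed
    ("negative", False, False): NEUTRAL_SCORE,          # no signal either way
    ("negative", False, True):  SECOND_NEGATIVE_SCORE,  # unusual negative act
}

def emotion_score(basic_emotion: str, has_tendency: bool, executed: bool) -> float:
    """Look up the user's emotion score for one control type."""
    return SCORE_TABLE[(basic_emotion, has_tendency, executed)]
```

The table makes the underlying idea explicit: a behavior that merely follows (or merely omits) the user's habit is neutral, while a behavior that breaks the habit carries the signal.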
It is noted that, for simplicity of description, the method embodiments are described as a series of acts or combinations of acts, but those skilled in the art will appreciate that the present application is not limited by the order of the acts described, as some steps may, in accordance with the present application, be performed in other orders or concurrently. Further, those skilled in the art will also appreciate that the embodiments described in the specification are exemplary and that the acts involved are not necessarily required by the present application.
Referring to fig. 5, a block diagram of a data processing apparatus of the present application is shown, applied to a terminal, the apparatus including:
a first obtaining module 11, configured to, when a user operates a current object, obtain, for the control behaviors of each control type supported by the current object, current control behavior data of the user on the current object, where the current control behavior data includes the control types of control behaviors actually executed by the user on the current object and the control types of control behaviors not executed by the user;
a second obtaining module 12, configured to obtain, for each historical object controlled by the user in the historical process and for the control behaviors of each control type supported by the historical object, historical control behavior data of the user on the historical object, where the historical control behavior data includes the control types of control behaviors actually executed by the user on the historical object and the control types of control behaviors not executed by the user;
a third obtaining module 13, configured to obtain the emotion classification of the current object by the user according to the current control behavior data and the historical control behavior data.
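As a hedged illustration of the data these modules exchange, the control behavior data for one object could be represented as follows. This is a sketch in Python under stated assumptions: the class and field names are invented for illustration and do not come from the application.

```python
from dataclasses import dataclass, field

@dataclass
class ControlBehaviorData:
    """Control behavior data for one object: among the control types the
    object supports, which the user actually executed and which not."""
    object_id: str
    executed_types: set[str] = field(default_factory=set)
    unexecuted_types: set[str] = field(default_factory=set)

# Example: on the current short video the user executed "like" but did not
# execute "dislike" or "favorite" (type names are illustrative).
current = ControlBehaviorData("video_42", {"like"}, {"dislike", "favorite"})
```

The same structure would serve for each historical object, giving the third obtaining module a uniform input for both current and historical data.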
In an optional implementation manner, the third obtaining module includes:
a first obtaining unit, configured to obtain an emotion classification acquisition model, where the emotion classification acquisition model is trained on a plurality of training data sets, and each training data set includes: first sample control behavior data of a sample user on a first sample object, second sample control behavior data of the sample user on a second sample object, and a labeled emotion classification of the second sample object by the sample user;
an input unit, configured to input the current control behavior data and the historical control behavior data into the emotion classification acquisition model, so that the emotion classification acquisition model processes the current control behavior data and the historical control behavior data to obtain the emotion classification of the current object by the user, and outputs the emotion classification;
and a second obtaining unit, configured to obtain the emotion classification of the current object by the user, as output by the emotion classification acquisition model.
In an optional implementation manner, the first obtaining unit includes:
a first obtaining subunit, configured to select, according to the historical control behavior data, an emotion classification acquisition model suitable for the user from a plurality of trained emotion classification acquisition models.
In an optional implementation manner, the first obtaining subunit is specifically configured to: determine, according to the historical control behavior data, the user set to which the user belongs among a plurality of user sets; and select the emotion classification acquisition model suitable for that user set from the emotion classification acquisition models respectively suitable for the user sets.
In an optional implementation manner, the third obtaining module includes:
the device comprises a first determining unit, a second determining unit and a third acquiring unit, wherein the first determining unit is used for determining the current execution situation of the user on the control behavior of the current object relative to the control type according to the current control behavior data for the control behavior of any control type supported by the current object, the second determining unit is used for determining the historical execution situation of the user on the control behavior of the historical object relative to the control type according to the historical control behavior data, and the third acquiring unit is used for acquiring the emotion score of the user on the control behavior of the current object relative to the control type according to the current execution situation and the historical execution situation;
and the fourth acquisition unit is used for acquiring the emotion classification of the current object by the user according to the emotion scores of the control behaviors of the current object on each control type respectively.
In an optional implementation manner, the third obtaining unit includes:
the second obtaining subunit is used for obtaining the basic emotion classification of the control behavior of the control type;
a statistic subunit, configured to count, according to the historical execution condition, a statistic result of whether the user has an execution tendency to execute the manipulation behavior of the manipulation type on the historical object, and a determination subunit, configured to determine, according to the current execution condition, an execution result of whether the user has executed the manipulation behavior of the manipulation type on the current object;
and the third obtaining subunit is configured to obtain, according to the basic emotion classification of the control behavior of the control type, the statistical result, and the execution result, an emotion score of the control behavior of the control type of the current object by the user.
In an optional implementation manner, the statistics subunit is specifically configured to: obtain the historical number of historical objects controlled by the user in the historical process, and obtain the number of times the user executed the control behavior of the control type on the historical objects while controlling them; and count, according to the historical number and the number of executions, the statistical result of whether the user has a tendency to execute the control behavior of the control type on the historical objects.
In an optional implementation manner, the statistics subunit is specifically configured to: calculate the ratio between the number of executions and the historical number; determine that the user has a tendency to execute the control behavior of the control type on the historical objects when the ratio is greater than or equal to a preset ratio; or determine that the user does not have such a tendency when the ratio is smaller than the preset ratio.
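The ratio-based tendency statistic can be sketched as follows. The 0.5 default ratio and the zero-history fallback are illustrative assumptions; the application leaves the preset ratio unspecified.

```python
def has_execution_tendency(execution_count: int, history_count: int,
                           preset_ratio: float = 0.5) -> bool:
    """Decide whether the user tends to execute a given control type on
    historical objects: the ratio of executions to historical objects must
    reach the preset ratio.  The 0.5 default and the zero-history fallback
    are assumptions made for this sketch."""
    if history_count == 0:
        return False  # no historical objects controlled: treat as no tendency
    return execution_count / history_count >= preset_ratio
```

For example, a user who liked 9 of 10 browsed videos would be counted as having a tendency to execute "like", while a user who liked 1 of 10 would not.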
In an optional implementation manner, the third obtaining subunit is specifically configured to: search the preset correspondence for the emotion score of the user regarding the control behavior that simultaneously corresponds to the basic emotion classification of the control behavior of the control type, the statistical result, and the execution result, and use it as the emotion score of the user on the current object with respect to the control behavior of the control type.
In an optional implementation manner, the fourth obtaining unit includes:
the summation subunit is used for weighting and summing the emotion scores of the control behaviors of the current object respectively related to each control type by the user to obtain the total emotion value of the current object by the user;
and the fourth obtaining subunit is configured to obtain the emotion classification of the current object by the user according to the total value of the emotion of the current object by the user.
In an optional implementation manner, the fourth obtaining subunit is specifically configured to: determine that the user's emotion on the current object is classified as a positive emotion when the total emotion value of the user for the current object is greater than a preset emotion threshold; or determine that it is classified as a neutral emotion when the total emotion value is equal to the preset emotion threshold; or determine that it is classified as a negative emotion when the total emotion value is smaller than the preset emotion threshold.
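The weighted sum and threshold comparison can be sketched together. The default weight of 1.0, the threshold of 0.0, and the label strings are assumptions for illustration; the application does not fix concrete weights or a threshold value.

```python
def classify_emotion(type_scores: dict[str, float],
                     type_weights: dict[str, float],
                     emotion_threshold: float = 0.0) -> str:
    """Weight and sum the per-control-type emotion scores, then compare the
    total emotion value with a preset threshold.  Unlisted control types
    receive an assumed default weight of 1.0."""
    total = sum(type_weights.get(control_type, 1.0) * score
                for control_type, score in type_scores.items())
    if total > emotion_threshold:
        return "positive"
    if total == emotion_threshold:
        return "neutral"
    return "negative"
```

With per-type weights, control behaviors that carry stronger signals (for example, favoriting versus merely finishing playback) can contribute more to the total emotion value.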
Analysis by the inventors shows that: in the related art, the user's emotion classification of the current object is judged only according to the control behavior the user executed on the current object, that is, the executed control behavior is treated as an absolute indicator of the user's emotion, which may make the obtained emotion classification of the current object inaccurate.
For example, the basic emotion classification of the "dislike" control behavior is a negative emotion: if the user dislikes an object, it indicates that the user does not like the object; if the user does not dislike the object, this indicates neither that the user likes it nor that the user does not like it, so, based only on the state "the user did not dislike the object", the user's emotion on the object is usually judged as neutral, and whether the user actually likes or dislikes the object usually cannot be determined. When a user browses short videos one by one in a short-video application, if the user dislikes almost every short video browsed, the dislikes usually indicate that the user does not like those short videos. If the user then does not dislike a particular short video while browsing it, then, combined with the user's habit, the user often actually likes that short video (because, by habit, the user would dislike any short video the user does not like), and "the user did not dislike the short video" can be regarded as implicit feedback of a positive emotion. However, in the related art, the user's emotion classification of that short video would be determined as neutral merely because the user did not dislike it, which is inconsistent with the user's actual positive emotion toward the short video, so the obtained emotion classification of the current object is inaccurate.
For another example, the basic emotion of the "like" control behavior is classified as a positive emotion: if the user likes an object, it indicates that the user likes the object; if the user does not like the object, this indicates neither that the user dislikes it nor that the user likes it, so, based only on the state "the user did not like the object", the user's emotion on the object is usually judged as neutral. When a user browses short videos one by one in a short-video application, if the user likes almost every short video browsed, the likes usually indicate that the user likes those short videos. If the user then does not like a particular short video while browsing it, then, combined with the user's habit, the user often actually does not like that short video (because, by habit, the user would like any short video the user likes), and "the user did not like the short video" can be regarded as implicit feedback of a negative emotion. However, in the related art, the user's emotion classification of that short video would be determined as neutral merely because the user did not like it, which is inconsistent with the user's actual negative emotion toward the short video, so the obtained emotion classification of the current object is inaccurate.
Secondly, the historical control behavior data of the user on the historical objects reflects the user's historical behavior habits, while the current control behavior data of the user on the current object reflects the user's current control state.
If the current control state differs greatly from the user's historical behavior habits, the emotion classification reflected by the current control behavior data is usually a negative emotion or a positive emotion rather than a neutral one.
In view of this, in the present application, when a user controls a current object, current control behavior data of the user on the current object is acquired for the control behaviors of each control type supported by the current object, where the current control behavior data includes the control types of control behaviors actually executed by the user on the current object and the control types of control behaviors not executed. For each historical object controlled by the user in the historical process, historical control behavior data of the user on the historical object is acquired for the control behaviors of each control type supported by the historical object, where the historical control behavior data includes the control types of control behaviors actually executed by the user on the historical object and the control types of control behaviors not executed. The emotion classification of the current object by the user is then obtained according to the current control behavior data and the historical control behavior data.
In the present application, the historical control behavior data of the user on the historical objects, that is, the user's historical behavior habits, is taken as a reference and combined with the current control behavior data of the user on the current object to obtain the user's emotion classification of the current object. The emotion classification is therefore obtained in accordance with the user's actual situation (such as behavior habits), which improves the accuracy of the obtained emotion classification of the current object.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
Optionally, an embodiment of the present invention further provides an electronic device, including: a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the processes of the data processing method embodiments described above and can achieve the same technical effects; to avoid repetition, details are not described here.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the processes of the data processing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Fig. 6 is a block diagram of an electronic device 800 shown in the present application. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 6, electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation at the device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, images, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the device 800, the relative positioning of components, such as a display and keypad of the electronic device 800, the sensor assembly 814 may also detect a change in the position of the electronic device 800 or a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, a carrier network (such as 2G, 3G, 4G, or 5G), or a combination thereof. In an exemplary embodiment, the communication component 816 receives broadcast signals or broadcast operation information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the electronic device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 7 is a block diagram of an electronic device 1900 shown in the present application. For example, the electronic device 1900 may be provided as a server.
Referring to fig. 7, electronic device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto; any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (14)

1. A data processing method, applied to a terminal, the method comprising:
when a user manipulates a current object, acquiring current manipulation behavior data of the user on the current object with respect to manipulation behaviors of each manipulation type supported by the current object, wherein the current manipulation behavior data comprises the manipulation types of the manipulation behaviors actually executed by the user on the current object and the manipulation types of the manipulation behaviors not executed by the user;
for each historical object manipulated by the user in a historical process, acquiring historical manipulation behavior data of the user on the historical object with respect to manipulation behaviors of each manipulation type supported by the historical object, wherein the historical manipulation behavior data comprises the manipulation types of the manipulation behaviors actually executed by the user on the historical object and the manipulation types of the manipulation behaviors not executed by the user;
and acquiring an emotion classification of the current object by the user according to the current manipulation behavior data and the historical manipulation behavior data.
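Outside the claim language, the per-object manipulation behavior data described in claim 1 can be sketched as a simple record. The names below (`ManipulationRecord`, `collect_record`, the example manipulation types) are illustrative assumptions, not part of the claimed method:

```python
from dataclasses import dataclass

@dataclass
class ManipulationRecord:
    """Per-object manipulation behavior data: which supported manipulation
    types the user actually executed, and which were left unexecuted."""
    object_id: str
    executed_types: frozenset
    unexecuted_types: frozenset

def collect_record(object_id, supported_types, executed_types):
    # Keep only executed types that the object actually supports; every
    # supported type the user did not touch goes into the "not executed" set.
    executed = frozenset(executed_types & supported_types)
    return ManipulationRecord(object_id, executed,
                              frozenset(supported_types) - executed)

# Example: the user only clicked a listing that also supports favoriting and sharing.
current = collect_record("listing-42", {"click", "favorite", "share"}, {"click"})
```

The same record shape would serve for both the current object and each historical object, since claim 1 describes the two data sets symmetrically.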
2. The method of claim 1, wherein the acquiring an emotion classification of the current object by the user according to the current manipulation behavior data and the historical manipulation behavior data comprises:
acquiring an emotion classification acquisition model, wherein the emotion classification acquisition model is trained on a plurality of training data sets, each training data set comprising: first sample manipulation behavior data of a sample user on a first sample object, second sample manipulation behavior data of the sample user on a second sample object, and a labeled emotion classification of the second sample object by the sample user;
inputting the current manipulation behavior data and the historical manipulation behavior data into the emotion classification acquisition model, so that the emotion classification acquisition model processes the current manipulation behavior data and the historical manipulation behavior data to obtain, and then outputs, the emotion classification of the current object by the user;
and acquiring the emotion classification of the current object by the user as output by the emotion classification acquisition model.
3. The method of claim 2, wherein the acquiring an emotion classification acquisition model comprises:
acquiring an emotion classification acquisition model applicable to the user from a plurality of trained emotion classification acquisition models according to the historical manipulation behavior data.
4. The method of claim 3, wherein the acquiring an emotion classification acquisition model applicable to the user from a plurality of trained emotion classification acquisition models according to the historical manipulation behavior data comprises:
determining, according to the historical manipulation behavior data, the user set to which the user belongs among a plurality of user sets;
and selecting the emotion classification acquisition model applicable to the user set to which the user belongs from the emotion classification acquisition models respectively applicable to the plurality of user sets.
5. The method of claim 1, wherein the acquiring an emotion classification of the current object by the user according to the current manipulation behavior data and the historical manipulation behavior data comprises:
for the manipulation behavior of any manipulation type supported by the current object, determining, according to the current manipulation behavior data, a current execution situation of the manipulation behavior of the manipulation type by the user on the current object; determining, according to the historical manipulation behavior data, a historical execution situation of the manipulation behavior of the manipulation type by the user on the historical objects; and acquiring, according to the current execution situation and the historical execution situation, an emotion score of the user for the manipulation behavior of the manipulation type on the current object;
and acquiring the emotion classification of the current object by the user according to the user's emotion scores for the manipulation behaviors of the respective manipulation types on the current object.
6. The method according to claim 5, wherein the acquiring, according to the current execution situation and the historical execution situation, an emotion score of the user for the manipulation behavior of the manipulation type on the current object comprises:
acquiring a basic emotion classification of the manipulation behavior of the manipulation type;
counting, according to the historical execution situation, a statistical result of whether the user has an execution tendency to execute the manipulation behavior of the manipulation type on the historical objects, and determining, according to the current execution situation, an execution result of whether the user executes the manipulation behavior of the manipulation type on the current object;
and acquiring the emotion score of the user for the manipulation behavior of the manipulation type on the current object according to the basic emotion classification of the manipulation behavior of the manipulation type, the statistical result and the execution result.
7. The method according to claim 6, wherein the counting, according to the historical execution situation, a statistical result of whether the user has an execution tendency to execute the manipulation behavior of the manipulation type on the historical objects comprises:
acquiring the historical number of the historical objects manipulated by the user in the historical process, and acquiring the number of times the user executed the manipulation behavior of the manipulation type on the historical objects while manipulating them in the historical process;
and counting, according to the historical number and the number of executions, the statistical result of whether the user has an execution tendency to execute the manipulation behavior of the manipulation type on the historical objects.
8. The method according to claim 7, wherein the counting, according to the historical number and the number of executions, the statistical result of whether the user has an execution tendency to execute the manipulation behavior of the manipulation type on the historical objects comprises:
calculating the ratio between the number of executions and the historical number;
determining that the user has an execution tendency to execute the manipulation behavior of the manipulation type on the historical objects when the ratio is greater than or equal to a preset ratio;
or,
determining that the user does not have an execution tendency to execute the manipulation behavior of the manipulation type on the historical objects when the ratio is smaller than the preset ratio.
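Claims 7 and 8 reduce to a simple ratio test. A minimal sketch follows; the function name and the example threshold of 0.5 are assumptions, since the claims leave the preset ratio open:

```python
def has_execution_tendency(execution_count, historical_count, preset_ratio=0.5):
    """Claim 8 as a ratio test: the user has a tendency to execute this
    manipulation type iff executions / historical objects >= preset ratio."""
    if historical_count == 0:
        return False  # no history to judge from; this edge case is outside the claim
    return execution_count / historical_count >= preset_ratio

# A user who favorited 8 of 10 historical listings shows a "favorite" tendency.
```

Because the ratio is computed per manipulation type, a user can simultaneously show a tendency for one type (e.g. clicking) and none for another (e.g. sharing).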
9. The method according to claim 6, wherein the acquiring the emotion score of the user for the manipulation behavior of the manipulation type on the current object according to the basic emotion classification of the manipulation behavior of the manipulation type, the statistical result and the execution result comprises:
looking up the emotion score that simultaneously corresponds to the basic emotion classification of the manipulation behavior of the manipulation type, the statistical result and the execution result, as the emotion score of the user for the manipulation behavior of the manipulation type on the current object.
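Claim 9 describes a table lookup keyed jointly by the base emotion classification, the tendency statistic, and the execution result. The concrete score values below are invented for illustration only, as the claim does not fix any numbers:

```python
# Hypothetical score table keyed by (base emotion, has tendency, executed);
# all values are illustrative assumptions, not taken from the patent.
SCORE_TABLE = {
    ("positive", True,  True):   1.0,  # habitual positive action, performed again
    ("positive", True,  False): -1.0,  # habitual positive action, skipped this time
    ("positive", False, True):   2.0,  # unusual positive action, performed
    ("positive", False, False):  0.0,  # unusual positive action, not performed
    ("negative", True,  True):  -1.0,  # habitual negative action, performed again
    ("negative", True,  False):  0.5,  # habitual negative action, skipped this time
    ("negative", False, True):  -2.0,  # unusual negative action, performed
    ("negative", False, False):  0.0,  # unusual negative action, not performed
}

def emotion_score(base_emotion, has_tendency, executed):
    # Claim 9: find the score that simultaneously matches all three keys.
    return SCORE_TABLE[(base_emotion, has_tendency, executed)]
```

The intuition the table encodes is that deviating from one's own habit carries more signal than repeating it, which is why the per-user tendency statistic participates in the lookup at all.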
10. The method according to claim 5, wherein the acquiring the emotion classification of the current object by the user according to the user's emotion scores for the manipulation behaviors of the respective manipulation types on the current object comprises:
performing a weighted summation of the user's emotion scores for the manipulation behaviors of the respective manipulation types on the current object to obtain a total emotion value of the user for the current object;
and acquiring the emotion classification of the current object by the user according to the total emotion value of the user for the current object.
11. The method of claim 10, wherein the acquiring the emotion classification of the current object by the user according to the total emotion value of the user for the current object comprises:
determining that the emotion classification of the current object by the user is a positive emotion when the total emotion value of the user for the current object is greater than a preset emotion threshold;
or,
determining that the emotion classification of the current object by the user is a neutral emotion when the total emotion value of the user for the current object is equal to the preset emotion threshold;
or,
determining that the emotion classification of the current object by the user is a negative emotion when the total emotion value of the user for the current object is smaller than the preset emotion threshold.
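Claims 10 and 11 combine the per-type scores by weighted summation and then threshold the total. A sketch under assumed weights and a zero threshold, both of which the claims leave open:

```python
def classify_by_total(scores, weights, threshold=0.0):
    """Claim 10: weighted sum of per-manipulation-type emotion scores;
    claim 11: compare the total against a preset emotion threshold."""
    # Unlisted types default to weight 1.0 (an assumption for this sketch).
    total = sum(weights.get(t, 1.0) * s for t, s in scores.items())
    if total > threshold:
        return "positive"
    if total == threshold:
        return "neutral"
    return "negative"

scores = {"click": 1.0, "favorite": 2.0, "close_page": -2.0}
weights = {"click": 0.2, "favorite": 1.0, "close_page": 1.0}
# 0.2*1.0 + 1.0*2.0 + 1.0*(-2.0) = 0.2 > 0, so this user reads as positive.
```

Down-weighting cheap, high-frequency actions (clicks) relative to deliberate ones (favoriting, closing the page) is one plausible reason for the weighting step, though the claims do not prescribe any particular weights.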
12. A data processing apparatus, applied to a terminal, the apparatus comprising:
a first acquisition module, configured to, when a user manipulates a current object, acquire current manipulation behavior data of the user on the current object with respect to manipulation behaviors of each manipulation type supported by the current object, wherein the current manipulation behavior data comprises the manipulation types of the manipulation behaviors actually executed by the user on the current object and the manipulation types of the manipulation behaviors not executed by the user;
a second acquisition module, configured to, for each historical object manipulated by the user in a historical process, acquire historical manipulation behavior data of the user on the historical object with respect to manipulation behaviors of each manipulation type supported by the historical object, wherein the historical manipulation behavior data comprises the manipulation types of the manipulation behaviors actually executed by the user on the historical object and the manipulation types of the manipulation behaviors not executed by the user;
and a third acquisition module, configured to acquire an emotion classification of the current object by the user according to the current manipulation behavior data and the historical manipulation behavior data.
13. An electronic device, comprising: a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the data processing method according to any one of claims 1 to 11.
14. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, implements the steps of the data processing method according to any one of claims 1 to 11.
CN202111670958.8A 2021-12-31 2021-12-31 Data processing method and device, electronic equipment and storage medium Pending CN114416503A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111670958.8A CN114416503A (en) 2021-12-31 2021-12-31 Data processing method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN114416503A 2022-04-29

Family

ID=81271141

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111670958.8A Pending CN114416503A (en) 2021-12-31 2021-12-31 Data processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114416503A (en)

Similar Documents

Publication Publication Date Title
US20170193399A1 (en) Method and device for conducting classification model training
CN111859020B (en) Recommendation method, recommendation device, electronic equipment and computer readable storage medium
CN109978606B (en) Method and device for processing advertisement click rate data and computer readable storage medium
CN110688527A (en) Video recommendation method and device, storage medium and electronic equipment
CN111898018B (en) Virtual resource sending method and device, electronic equipment and storage medium
CN112445970A (en) Information recommendation method and device, electronic equipment and storage medium
CN108230094B (en) Vehicle recommendation method and device
CN112131466A (en) Group display method, device, system and storage medium
CN113031837B (en) Content sharing method and device, storage medium, terminal and server
CN111859097B (en) Data processing method, device, electronic equipment and storage medium
CN112784151B (en) Method and related device for determining recommended information
CN117453933A (en) Multimedia data recommendation method and device, electronic equipment and storage medium
CN114647774A (en) Pushing method and device, electronic equipment and storage medium
CN112685641A (en) Information processing method and device
US20220277204A1 (en) Model training method and apparatus for information recommendation, electronic device and medium
CN105554080A (en) Information pushing method and information pushing device
CN111898019B (en) Information pushing method and device
CN114416503A (en) Data processing method and device, electronic equipment and storage medium
CN114189719A (en) Video information extraction method and device, electronic equipment and storage medium
CN111179011A (en) Insurance product recommendation method and device
CN113190725B (en) Object recommendation and model training method and device, equipment, medium and product
CN114416246B (en) Data processing method and device, electronic equipment and storage medium
CN114722238B (en) Video recommendation method and device, electronic equipment, storage medium and program product
CN117350824B (en) Electronic element information uploading and displaying method, device, medium and equipment
CN115225702B (en) Information pushing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination