Disclosure of Invention
In order to solve the above problems in the prior art, that is, to improve the accuracy of item recommendation, the present application provides an item recommendation method, apparatus, device, and storage medium.
In a first aspect, the present application provides an item recommendation method, including:
clustering a plurality of users to obtain a plurality of clustered user groups;
performing, for the user group, local tensor decomposition on the user, the items searched by the user, and the context information of the items, to obtain a local prediction score of the user on the items under the context information;
performing, according to the local prediction score, global tensor decomposition on the user, the items searched by the user, and the context information of the items, to obtain a global prediction score of the user on the items under the context information;
and recommending items to the user according to the global prediction score.
In a possible implementation manner, the performing, for the user group, local tensor decomposition on the user, the items searched by the user, and the context information of the items, to obtain a local prediction score of the user on the items under the context information includes:
performing, for the user group, local tensor decomposition modeling on the user, the items searched by the user, and the context information of the items to obtain a local scoring model;
and optimizing the local scoring model to obtain the local prediction score.
In a possible implementation manner, the optimizing the local scoring model to obtain the local prediction score includes:
constructing a first target loss function according to the local scoring model;
optimizing the local scoring model according to the real score of the user on the item under the context information and the first target loss function, to obtain a first latent semantic vector corresponding to the user, a first latent semantic vector corresponding to the item, and a first latent semantic vector corresponding to the context information;
and inputting the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item, and the first latent semantic vector corresponding to the context information into the local scoring model, to obtain the local prediction score, output by the local scoring model, of the user on the item under the context information.
In a possible implementation manner, the performing, according to the local prediction score, global tensor decomposition on the user, the item searched by the user, and context information of the item, to obtain a global prediction score of the user on the item under the context information includes:
performing global tensor decomposition modeling on the users in the user groups, the items searched by the users, and the context information of the items to obtain a global scoring model;
combining the local scoring model and the global scoring model in a weighted summation manner and according to the local prediction score, to obtain a corrected global scoring model;
and optimizing the corrected global scoring model to obtain the global prediction score.
In a possible implementation manner, the combining, in the weighted summation manner and according to the local prediction score, the local scoring model and the global scoring model to obtain the corrected global scoring model includes:
weighting the local prediction score and the output of the global scoring model according to a first weight corresponding to the local prediction score and a second weight corresponding to the global scoring model;
constructing the corrected global scoring model according to model parameters of the global scoring model and the weighted result of the local prediction score and the global scoring model;
wherein the model parameters comprise bias parameters respectively corresponding to the user, the item, and the context information.
In a possible implementation manner, the optimizing the corrected global scoring model to obtain the global prediction score includes:
constructing a second target loss function according to the corrected global scoring model;
optimizing the corrected global scoring model according to the real score of the user on the item under the context information and the second target loss function, to obtain a second latent semantic vector corresponding to the user, a second latent semantic vector corresponding to the item, and a second latent semantic vector corresponding to the context information;
and inputting the second latent semantic vector corresponding to the user, the second latent semantic vector corresponding to the item, and the second latent semantic vector corresponding to the context information into the optimized global scoring model, to obtain the global prediction score, output by the optimized global scoring model, of the user on the item under the context information.
In a second aspect, the present application provides an item recommendation device comprising:
the clustering module is used for clustering a plurality of users to obtain a plurality of clustered user groups;
the first prediction module is used for performing, for the user group, local tensor decomposition on the user, the items searched by the user, and the context information of the items, to obtain a local prediction score of the user on the items under the context information;
the second prediction module is used for performing, according to the local prediction score, global tensor decomposition on the user, the items searched by the user, and the context information of the items, to obtain a global prediction score of the user on the items under the context information;
and the recommending module is used for recommending items to the user according to the global prediction score.
In a third aspect, the present application provides an electronic device, comprising:
a processor and a memory;
the memory stores a computer program;
the processor, when executing the computer program stored in the memory, implements the item recommendation method provided in the first aspect or any of the possible implementation manners of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, in which computer-executable instructions are stored, and the computer-executable instructions are executed by a processor to implement the item recommendation method provided in the first aspect or any one of the possible implementation manners of the first aspect.
In a fifth aspect, the present application provides a chip comprising:
a processor and a memory;
the memory stores a computer program;
the processor, when executing the computer program stored in the memory, implements the item recommendation method provided in the first aspect or any of the possible implementation manners of the first aspect.
In a sixth aspect, the present application provides a computer program product comprising a computer program, which, when executed by a processor, implements the item recommendation method provided in the first aspect or any one of the possible implementation manners of the first aspect.
Those skilled in the art can understand that, in the present application, users are clustered, and local tensor decomposition and global tensor decomposition are performed on the users, the items searched by the users, and the context information of the items, so that the data sparsity problem is alleviated, the mutual influence of individual preferences and group preferences is captured, and the accuracy of item recommendation is effectively improved.
Detailed Description
First, it should be understood by those skilled in the art that these embodiments are merely for explaining the technical principles of the present application and are not intended to limit the scope of the present application. Those skilled in the art can modify them as needed to suit particular applications.
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the embodiments of the present application, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship.
The word "if", as used herein, may be interpreted as "when" or "upon" or "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrases "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined" or "in response to determining" or "when (a stated condition or event) is detected" or "in response to detecting (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a product or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such product or system. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other like elements in a product or system that includes the element.
Typically, items are recommended to a user based on the items that the user has searched for or purchased within a recent period of time. When the data sets of the user's search records and purchase records are sparse, the item recommendation accuracy of this method is low. Alternatively, the individual preferences of users are determined and aggregated to obtain group preferences, and items are recommended based on the group preferences. However, in a real social environment there is interaction between groups and individuals, whereas the aggregation from individual preferences to group preferences is a one-way process that cannot capture the influence of group preferences on individual preferences, so the accuracy of item recommendation in this method also needs to be improved.
In order to improve the accuracy of item recommendation, an embodiment of the present application provides an item recommendation method. The method improves the accuracy of item recommendation in the following aspects: on one hand, attention is paid not only to the interaction between the user and the items but also to the context information of the items, which enriches the data on which item recommendation depends; on the other hand, tensor decomposition is applied to item recommendation, and its strengths in filling sparse data and mining implicit relations further alleviate the data sparsity problem; in yet another aspect, combining local tensor decomposition with global tensor decomposition in item recommendation captures the mutual influence between individual preferences and group preferences.
Fig. 1 is an exemplary diagram of an application scenario provided in an embodiment of the present application. As shown in fig. 1, the application scenario includes a terminal 101 and a server 102, wherein the terminal 101 and the server 102 communicate, for example, over a network. When recommending articles, the server 102 predicts articles in which the user is interested according to the collected user behaviors, and recommends articles for the user on the terminal 101 according to the prediction result. Alternatively, the prediction of the item of interest to the user may be performed on the terminal 101.
Among them, the terminal 101 is, for example, a handheld device (e.g., a smart phone and a tablet), a computing device (e.g., a Personal Computer (PC)), a wearable device (e.g., a smart watch and a smart band), a smart home device (e.g., a smart display device), and the like, which have a wireless communication function. The server 102 may be a single server or a server cluster of multiple servers, such as a distributed server, a centralized server, and a cloud server.
Fig. 2 is a flowchart illustrating an item recommendation method according to an embodiment of the present application, where the executing entity of the method is an electronic device. As shown in fig. 2, the method includes:
S201, clustering a plurality of users to obtain a plurality of clustered user groups.
Specifically, user portrait information of a plurality of users is collected in advance, and the similarity between users is determined according to the user portrait information. The users are then clustered according to the similarity between them to obtain a plurality of clustered user groups. The user portrait information of a user includes a plurality of pieces of label information of the user, that is, the values of the user under a plurality of labels, or the attribute values of the user under a plurality of attributes, such as the user's gender, age, occupation, hobbies, and family information. In this way, the similarity of item preferences between similar users is fully considered: through user clustering, users with high similarity are divided into the same user group and users with low similarity are divided into different user groups, which facilitates subsequently capturing the mutual influence of group preferences and individual preferences.
S202, performing, for a user group, local tensor decomposition on the user, the items searched by the user, and the context information of the items to obtain a local prediction score of the user on the items under the context information.
As the functions or characteristics provided by items of the same kind become more refined, a user may search for an item having a certain function or characteristic. For example, when searching for a washing machine, a user may search for "a washing machine with a drying function", "a washing machine with a sterilization function", "a washing machine for children's clothes", and the like; when searching for an air conditioner, the user may search for "an air conditioner with a dehumidification function", "a low-power-consumption air conditioner", and the like. When historical information such as the user's search records, browsing records, and purchase records is collected, the items searched by the user and the context information of the items at the time of searching can be extracted through keywords. Therefore, item recommendation pays attention not only to the items searched by the user but also to the detailed information of those items that the user is interested in, so that items of more specific categories and functions are recommended to the user more accurately. The context information of an item includes the attribute description or requirement description related to the item when the user searches for the item within a preset time period, for example, within 1 minute or within 5 minutes.
Specifically, after a plurality of user groups are obtained through clustering, for each user group, the users, the items searched by the users, and the context information of the items are obtained from a database storing historical data; within the user group, tensor decomposition is performed on the users, the items searched by the users, and the context information of the items through a tensor decomposition algorithm, so as to obtain the prediction score of each user in the user group on the items under the context information. Since this tensor decomposition is performed within the range of each user group, it is referred to as a local tensor decomposition, and the prediction score is referred to as a local prediction score. The process of the local tensor decomposition reflects the influence of the users' individual preferences (the items searched by the users and the context information of the items) on the group preference of the user group (the local prediction scores of the users in the user group on the items under the context information).
S203, performing, according to the local prediction score of the user on the items under the context information, global tensor decomposition on the user, the items searched by the user, and the context information of the items to obtain a global prediction score of the user on the items under the context information.
Specifically, after the local prediction scores of the users in each user group on the items under the context information are obtained, tensor decomposition is performed on the users in all the user groups, the items searched by the users, and the context information of the items, and the local prediction scores of the users in each user group are integrated into the tensor decomposition process, so as to obtain the prediction scores of the users on the items under the context information. Since this tensor decomposition is performed within the range of all user groups, it is referred to as a global tensor decomposition, and the prediction score is referred to as a global prediction score. By integrating the local prediction scores of the users in each user group into the global tensor decomposition process, the influence of the group preferences (the local prediction scores of the users in the user group on the items under the context information) on the predicted individual preferences of the users (the global prediction scores of the users on the items under the context information) is reflected.
S204, recommending items to the user according to the global prediction score.
Specifically, after the global prediction scores of the respective users on the items under the context information are obtained, the global prediction scores may be ranked in descending order for each user; for example, the global prediction score of the user a1 on the item c1 under the context information b1, the global prediction score of the user a1 on the item c2 under the context information b2, the global prediction score of the user a1 on the item c3 under the context information b3, and so on, are ranked.
According to the ranking result, the context information and items corresponding to the top preset number of global prediction scores are selected, and items are recommended accordingly. Alternatively, the context information and items corresponding to the global prediction scores greater than a preset threshold are determined according to the ranking result, and items are recommended accordingly. For example, if the global prediction score of the user on an air conditioner under the context information "dehumidification" is high and greater than the preset threshold, a dehumidifying air conditioner may be recommended to the user.
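The ranking-and-selection step above can be sketched as follows; this is a minimal illustration, and the function name, data layout, and scores are hypothetical:

```python
def recommend(global_scores, top_n=None, threshold=None):
    """Select (user, context, item) triples to recommend from global prediction
    scores: keep the top_n highest-scoring triples, or all triples whose score
    exceeds threshold."""
    ranked = sorted(global_scores.items(), key=lambda kv: kv[1], reverse=True)
    if top_n is not None:
        return [triple for triple, _ in ranked[:top_n]]
    return [triple for triple, score in ranked if score > threshold]

scores = {  # (user, context, item) -> predicted score, as in the a1/b1/c1 example
    ("a1", "b1", "c1"): 4.6,
    ("a1", "b2", "c2"): 3.2,
    ("a1", "b3", "c3"): 4.9,
}
top2 = recommend(scores, top_n=2)         # [('a1', 'b3', 'c3'), ('a1', 'b1', 'c1')]
above = recommend(scores, threshold=4.0)  # same two triples
```

Either selection rule yields the (context, item) pairs used for the final recommendation.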
In the embodiment of the present application, by clustering the users and performing local tensor decomposition and global tensor decomposition on the users, the items searched by the users, and the context information of the items, the data sparsity problem is alleviated to a certain extent, the mutual influence of individual preferences and group preferences is taken into account, and the accuracy of item recommendation is effectively improved.
When users are clustered, the modified cosine similarity may be used to determine the similarity between users. Based on this, fig. 3 is a schematic flow diagram of an item recommendation method according to another embodiment of the present application. As shown in fig. 3, the method includes:
s301, according to the user portrait information of a plurality of users, the similarity between different users is determined by adopting the modified cosine similarity.
Specifically, the user portrait information of a user includes the values of the user under a plurality of labels. When determining the similarity between different users, the cosine similarity between different users under each label is determined according to the values of the users under that label and the calculation formula of the modified cosine similarity, and the similarity between different users is then determined according to the cosine similarities under the respective labels. The similarity between different users is the similarity of item preferences between them.
Optionally, considering that different labels influence the similarity of users' item preferences to different degrees (for example, gender and age have a high influence on the similarity of item preferences between different users, while occupation has a low influence), different weights may be set for different labels. The cosine similarities between different users under the respective labels are then weighted and summed according to the weights corresponding to the labels to obtain the similarity between the users.
Further, the similarity between different users is calculated through the modified cosine similarity and the values of the different users under the plurality of labels as follows:

sim(U_i, U_j) = Σ_{q=1}^{Q} W_q · cos(label_q(U_i), label_q(U_j))

wherein sim(U_i, U_j) represents the similarity between the user i and the user j, label_q(U_i) represents the value of the user i under the q-th label, W_q represents the weight of the q-th label, Q is the total number of labels, and cos(·, ·) is the modified cosine similarity between the values of the two users under the q-th label.
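The weighted, per-label similarity described above can be sketched as follows. For brevity this sketch uses the plain cosine in place of the mean-centered ("modified") variant, and the vector encoding of label values is an assumption made for illustration:

```python
import math

def cosine(a, b):
    # plain cosine similarity between two equal-length vectors
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def user_similarity(profile_i, profile_j, weights):
    """Weighted sum, over labels q, of the per-label similarity of two users.

    profile_*: dict mapping label -> feature vector for that label;
    weights: dict mapping label -> weight W_q.
    """
    return sum(w * cosine(profile_i[q], profile_j[q]) for q, w in weights.items())

# hypothetical encoded profiles for two users
alice = {"gender": [1.0, 0.0], "age": [0.3, 0.7]}
bob = {"gender": [1.0, 0.0], "age": [0.3, 0.7]}
weights = {"gender": 0.6, "age": 0.4}
sim = user_similarity(alice, bob, weights)  # identical profiles -> 1.0
```

With weights summing to 1, identical profiles give a similarity of 1.0.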
S302, clustering the plurality of users according to the similarity between different users to obtain a plurality of clustered user groups.
Specifically, after the similarity between different users is obtained, the plurality of users can be clustered according to the clustering algorithm and the similarity between different users, so as to obtain a plurality of user groups.
Optionally, the clustering algorithm is a K-means clustering algorithm, so that the clustering effect of user clustering is improved through the K-means clustering algorithm.
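A minimal K-means sketch over user profile vectors is given below; it is stdlib-only, and a production system would typically rely on an existing library implementation (the data points are illustrative):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal K-means over user profile vectors (lists of floats)."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(points, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest center
            idx = min(range(k),
                      key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[idx].append(p)
        # recompute each center as the mean of its cluster (keep old center if empty)
        centers = [[sum(dim) / len(cl) for dim in zip(*cl)] if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return clusters

# two clearly separated groups of "user profiles"
points = [[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]]
groups = kmeans(points, k=2)
```

On this toy data the two tight groups of profiles end up in separate user groups.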
S303, performing, for the user group, local tensor decomposition on the user, the items searched by the user, and the context information of the items to obtain the local prediction score of the user on the items under the context information.
S304, performing, according to the local prediction score of the user on the items under the context information, global tensor decomposition on the user, the items searched by the user, and the context information of the items to obtain the global prediction score of the user on the items under the context information.
S305, recommending items to the user according to the global prediction score.
S303 to S305 can refer to the description of the foregoing embodiments, and are not repeated.
In the embodiment of the present application, the similarity between users is determined according to the users' portrait information and the modified cosine similarity, which improves the accuracy of the similarity between users; the users are then clustered by the clustering algorithm based on this similarity, which improves the accuracy of user clustering. This in turn improves the accuracy of the global prediction scores subsequently obtained based on the user groups, the local tensor decomposition, and the global tensor decomposition, and thus the accuracy of item recommendation.
Fig. 4 is a flowchart illustrating an item recommendation method according to another embodiment of the present application. As shown in fig. 4, the method includes:
S401, clustering the plurality of users to obtain a plurality of clustered user groups.
S401 may refer to the description of the foregoing embodiments, and is not repeated herein.
S402, performing, for each user group, local tensor decomposition modeling on the users in the user group, the items searched by the users, and the context information of the items to obtain a local scoring model.
Specifically, after the plurality of user groups are obtained through clustering, for each user group, local tensor decomposition is performed on the users in the user group, the items searched by the users, and the context information of the items, to obtain a latent semantic vector corresponding to each user, a latent semantic vector corresponding to each item, and a latent semantic vector corresponding to each piece of context information, so that the latent features of the users, of the items searched by the users, and of the context information of the items are stored in the corresponding latent semantic vectors. For ease of distinction, the latent semantic vectors obtained in the local tensor decomposition are subsequently referred to as first latent semantic vectors. Scoring modeling is performed according to the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item, and the first latent semantic vector corresponding to the context information, so as to construct a local scoring model of the users in each user group scoring the items under the context information.
Alternatively, the tensor decomposition may employ a CP (CANDECOMP/PARAFAC) decomposition algorithm.
Alternatively, the local scoring model may be expressed as:

ŷ_uik^local = Σ_{d=1}^{D} U_{u,d}^local · V_{i,d}^local · C_{k,d}^local

wherein U_u^local represents the first latent semantic vector corresponding to the user u, m is the total number of users in all user groups, D is the dimension of the first latent semantic vector, and U_{u,d}^local represents the d-th element of the first latent semantic vector of the user u; V_i^local represents the first latent semantic vector corresponding to the item i, n is the total number of all items, and V_{i,d}^local represents the d-th element of the first latent semantic vector of the item i; C_k^local represents the first latent semantic vector corresponding to the context information k, K is the total number of all context information, and C_{k,d}^local represents the d-th element of the first latent semantic vector of the context information k.
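The local score is a trilinear product of the three first latent semantic vectors, summed over the latent dimension; a minimal sketch with illustrative vector values:

```python
def cp_score(u_vec, v_vec, c_vec):
    """CP-style trilinear score: sum over the latent dimension d of
    u_d * v_d * c_d, for one (user, item, context) triple."""
    return sum(u * v * c for u, v, c in zip(u_vec, v_vec, c_vec))

# D = 2 latent dimensions; the vector values are illustrative
score = cp_score([1.0, 2.0], [0.5, 1.0], [2.0, 0.5])  # 1*0.5*2 + 2*1*0.5 = 2.0
```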
S403, optimizing the local scoring model to obtain the local prediction score of the user on the items under the context information.
Specifically, for each user group, each user in the user group is traversed, and the local scoring model corresponding to the user group is optimized according to the difference between (i) the local prediction score calculated in the local scoring model based on the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item searched by the user, and the first latent semantic vector corresponding to the context information, and (ii) the real score of the user on the item under the context information. That is, the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item, and the first latent semantic vector corresponding to the context information are optimized. Finally, the optimized first latent semantic vectors corresponding to the users, the items, and the context information in the user group are determined.
Optionally, the process of optimizing the local scoring model to obtain the local prediction score of the user on the item under the context information is as follows: for each user group, a first target loss function is constructed according to the local scoring model; the local scoring model is optimized according to the real score of the user on the item under the context information and the first target loss function, to obtain the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item, and the first latent semantic vector corresponding to the context information; and the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item, and the first latent semantic vector corresponding to the context information are input into the local scoring model, to obtain the local prediction score, output by the local scoring model, of the user on the item under the context information.
The first target loss function reflects the difference between the local prediction score, predicted based on the local scoring model, of the user on the item under the context information and the real score of the user on the item under the context information. The real scores of the users on the items under the context information come from pre-collected score vectors of the users on the items under the context information; for example, the users' scores on items under different context information may be collected by means of a questionnaire.
Specifically, for each user group, a first target loss function reflecting the difference between the local prediction score and the real score is constructed according to the local scoring model. For each user group, each user is traversed, the local prediction score output by the local scoring model is determined according to the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item, and the first latent semantic vector corresponding to the context information, and the function value of the first target loss function is determined according to the local prediction score and the real score of the user on the item under the context information. The local scoring model is optimized according to the function value of the first target loss function and an optimization algorithm, that is, the first latent semantic vectors corresponding to the user, the item, and the context information in the local scoring model are optimized. The above process is executed in a loop until the function value of the first target loss function is less than or equal to a preset threshold, at which point the training of the local scoring model is completed, and the first latent semantic vectors corresponding to the users, the items, and the context information in each user group are obtained. The first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item, and the first latent semantic vector corresponding to the context information are then input into the local scoring model to obtain the final local prediction score of the user on the item under the context information.
Optionally, to avoid the over-fitting problem, an L2 regularization term may be added to the first target loss function.
Further, the first target loss function is:

$$L_{local} = \sum_{u \in Y}\sum_{i}\sum_{k}\left(y_{uik} - \hat{y}_{uik}\right)^2 + \lambda\left(\|U_{local}\|_2^2 + \|V_{local}\|_2^2 + \|C_{local}\|_2^2\right)$$

wherein $Y$ represents a user group, $y_{uik}$ represents user u's real score of item i under context k, $\hat{y}_{uik}$ represents the local prediction score, output by the local scoring model, of user u for item i under context k, and $\lambda$ denotes the regularization parameter of the local tensor decomposition. For each user group, each user is traversed, and the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the item and the first latent semantic vector corresponding to the context information are input into the first target loss function to obtain a function value. The local scoring model is optimized based on the function value, that is, the latent semantic vectors are optimized. Finally, the optimized first latent semantic vector $U_{local}$ corresponding to the users, the first latent semantic vector $V_{local}$ corresponding to the items, and the first latent semantic vector $C_{local}$ corresponding to the context information in each user group are obtained.
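Evaluating the function value of the first target loss function, a squared-error term over the observed scores plus an L2 regularization term, can be sketched as follows (illustrative only; the function name and data layout are assumptions):

```python
import numpy as np

def local_loss(ratings, U, V, C, lam):
    """Function value of the first target loss: squared error between the
    real scores and the local prediction scores, plus the L2 term."""
    sq = sum((y - np.sum(U[u] * V[i] * C[k])) ** 2 for u, i, k, y in ratings)
    reg = lam * (np.sum(U ** 2) + np.sum(V ** 2) + np.sum(C ** 2))
    return sq + reg
```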
Optionally, the optimization algorithm is a Stochastic Gradient Descent (SGD) algorithm, so as to improve the prediction effect of the local scoring model through the optimization algorithm.
S404, performing global tensor decomposition modeling on the user group, the items searched by the user and the context information of the items to obtain a global scoring model.
Specifically, global tensor decomposition is performed on the users in the user groups, the items searched by the users and the context information of the items to obtain latent semantic vectors corresponding to the users, latent semantic vectors corresponding to the items and latent semantic vectors corresponding to the context information. For ease of distinction, the latent semantic vectors obtained in the global tensor decomposition are subsequently referred to as second latent semantic vectors. Score modeling is then performed according to the second latent semantic vector corresponding to the user, the second latent semantic vector corresponding to the item and the second latent semantic vector corresponding to the context information, so as to construct a global scoring model.
Alternatively, the tensor decomposition may employ a CP decomposition algorithm.
Alternatively, the global scoring model may be expressed as:

$$\hat{y}_{uik} = \sum_{d=1}^{D} u_d \cdot v_d \cdot c_d$$

wherein $u_d$ represents the d-th element in the second latent semantic vector corresponding to the user, $v_d$ represents the d-th element in the second latent semantic vector corresponding to the item, $c_d$ represents the d-th element in the second latent semantic vector corresponding to the context information, and $D$ is the dimension of the latent semantic vectors.
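A CP-style global score of this kind, summing over dimensions the products of the corresponding elements of the three second latent semantic vectors, can be sketched as follows (illustrative; the function name is an assumption):

```python
import numpy as np

def cp_score(u_vec, v_vec, c_vec):
    """Global scoring model sketch: rank-D CP prediction, summing over d
    the products of the d-th elements of the user, item, and context
    second latent semantic vectors."""
    return float(np.sum(u_vec * v_vec * c_vec))
```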
S405, combining the local scoring model and the global scoring model, by means of weighted summation with the local prediction score, to obtain a modified global scoring model.
Specifically, after the global scoring model is obtained, weighted summation is performed on the local prediction score and the global scoring model to obtain a modified global scoring model. In this way, the local scoring model and the global scoring model are combined, the local scoring model is integrated into the global scoring model, the mutual influence of group preference and individual preference can be captured, and the accuracy of the subsequent global prediction score is improved.
Optionally, combining the local scoring model and the global scoring model by weighted summation with the local prediction score to obtain the modified global scoring model includes: weighting the local prediction score and the global scoring model according to a first weight corresponding to the local prediction score and a second weight corresponding to the global scoring model; and constructing the modified global scoring model according to the model parameters of the global scoring model and the weighting results of the local prediction score and the global scoring model. The model parameters of the global scoring model comprise a bias parameter corresponding to the user, a bias parameter corresponding to the item and a bias parameter corresponding to the context information. In this way, the local prediction score, that is, the local scoring model, is introduced into the global scoring model, and the proportion of the local scoring model in the global scoring model can be adjusted by adjusting the first weight and the second weight; in addition, a plurality of bias parameters are introduced into the global scoring model, so that the accuracy of the global score prediction of the global scoring model is improved.
Wherein the sum of the first weight and the second weight is 1.
Further, the modified global scoring model may be expressed as:

$$f_{uik} = \eta \cdot \hat{y}_{uik}^{\,local} + (1 - \eta)\left(\mu + b_u + b_i + b_k + \sum_{d=1}^{D} u_d \cdot v_d \cdot c_d\right)$$

wherein $\hat{y}_{uik}^{\,local}$ is the local prediction score, $\mu$ is the global mean score (which can be preset), $b_u$ is the bias parameter corresponding to the user, $b_i$ is the bias parameter corresponding to the item, $b_k$ is the bias parameter corresponding to the context information, $\eta$ is the proportion of the local prediction score, i.e. the first weight, and $1 - \eta$ is the second weight.
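The weighted summation of the local prediction score and the biased global CP score can be sketched as follows (illustrative only; the function name, argument names, and example values are assumptions):

```python
import numpy as np

def modified_global_score(local_pred, u_vec, v_vec, c_vec,
                          mu, b_u, b_i, b_k, eta):
    """Modified global scoring model: a weighted sum of the local
    prediction score (first weight eta) and the global CP score with
    bias parameters (second weight 1 - eta)."""
    cp = float(np.sum(u_vec * v_vec * c_vec))            # CP term
    return eta * local_pred + (1.0 - eta) * (mu + b_u + b_i + b_k + cp)
```

Setting `eta` to 1 recovers the pure local prediction, while `eta = 0` falls back to the global model with biases, matching the role of the first and second weights.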
And S406, optimizing the modified global scoring model to obtain a global prediction score.
Specifically, for each user, a global prediction score of the user for the item under the context information is calculated according to the modified global scoring model, and the global scoring model is optimized according to the difference between the global prediction score of the user for the item under the context information and the real score of the user for the item under the context information, that is, the second latent semantic vector corresponding to the user, the second latent semantic vector corresponding to the item and the second latent semantic vector corresponding to the context information are optimized. Finally, the optimized second latent semantic vector corresponding to the user, second latent semantic vector corresponding to the item and second latent semantic vector corresponding to the context information are obtained.
Optionally, optimizing the modified global scoring model to obtain the global prediction score includes: constructing a second target loss function according to the modified global scoring model; optimizing the modified global scoring model according to the real score of the user for the item under the context information and the second target loss function, to obtain a second latent semantic vector corresponding to the user, a second latent semantic vector corresponding to the item and a second latent semantic vector corresponding to the context information; and inputting the second latent semantic vector corresponding to the user, the second latent semantic vector corresponding to the item and the second latent semantic vector corresponding to the context information into the global scoring model to obtain the global prediction score, output by the global scoring model, of the user for the item under the context information.
And the second target loss function is used for reflecting the difference between the global prediction score of the user for the item under the context information and the real score of the user for the item under the context information, which are predicted based on the global score model.
Specifically, a second target loss function reflecting the difference between the global prediction score and the real score is constructed according to the modified global scoring model. For each user, the global prediction score output by the global scoring model is determined according to the second latent semantic vector corresponding to the user, the second latent semantic vector corresponding to the item and the second latent semantic vector corresponding to the context information, and the function value of the second target loss function is determined according to the global prediction score and the real score of the user for the item under the context information. The global scoring model is then optimized according to the function value of the second target loss function and the optimization algorithm, that is, the second latent semantic vector corresponding to the user, the second latent semantic vector corresponding to the item and the second latent semantic vector corresponding to the context information in the global scoring model are optimized. The above process is executed cyclically until the function value of the second target loss function is less than or equal to a preset threshold value, at which point the training of the global scoring model is finished and the second latent semantic vector corresponding to the user, the second latent semantic vector corresponding to the item and the second latent semantic vector corresponding to the context information are obtained. These second latent semantic vectors are then input into the global scoring model to obtain the final global prediction score of the user for the item under the context information.
Optionally, to avoid the over-fitting problem, an L2-norm regularization term may be added to the second target loss function.
Further, the second target loss function is:

$$L_{global} = \sum_{u}\sum_{i}\sum_{k}\left(y_{uik} - f_{uik}\right)^2 + \lambda_2\left(b_u^2 + b_i^2 + b_k^2 + \|U\|_2^2 + \|V\|_2^2 + \|C\|_2^2\right)$$

wherein $f_{uik}$ is the global prediction score output by the modified global scoring model and $\lambda_2$ is the regularization parameter of the global tensor decomposition.
Optionally, the optimization algorithm is a Stochastic Gradient Descent (SGD) algorithm, so as to improve the prediction effect of the global scoring model through the optimization algorithm.
Optionally, the iterative update formula of each bias parameter in the global scoring model is as follows:

$$b_u \leftarrow b_u + \alpha_2 \cdot (y_{uik} - f_{uik} - \lambda_2 \cdot b_u);$$
$$b_i \leftarrow b_i + \alpha_2 \cdot (y_{uik} - f_{uik} - \lambda_2 \cdot b_i);$$
$$b_k \leftarrow b_k + \alpha_2 \cdot (y_{uik} - f_{uik} - \lambda_2 \cdot b_k);$$

wherein $\alpha_2$ denotes the learning rate and $\lambda_2$ denotes the regularization parameter, which is obtained through cross validation.
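One SGD step on the bias parameters following these update formulas can be sketched as follows (illustrative; the function name and argument names are assumptions):

```python
def update_biases(y, f, b_u, b_i, b_k, alpha2, lam2):
    """One SGD step on the bias parameters of the modified global
    scoring model: b <- b + alpha2 * (y - f - lam2 * b)."""
    err = y - f                       # real score minus global prediction
    b_u += alpha2 * (err - lam2 * b_u)
    b_i += alpha2 * (err - lam2 * b_i)
    b_k += alpha2 * (err - lam2 * b_k)
    return b_u, b_i, b_k
```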
And S407, recommending the articles to the user according to the global prediction score.
In S407, reference may be made to the description of the foregoing embodiments, which is not repeated.
In the embodiment of the application, a plurality of users are clustered to obtain a plurality of clustered user groups; for each user group, local tensor decomposition is performed on the users, the items searched by the users and the context information of the items to obtain local prediction scores; global tensor decomposition is then performed according to the local prediction scores to obtain global prediction scores; and items are recommended to the users according to the global prediction scores. In this way, both group preference and individual preference are captured, and the accuracy and convenience of item recommendation are improved.
Fig. 5 is a schematic structural diagram of an article recommendation device according to an embodiment of the present application. As shown in fig. 5, the item recommendation apparatus includes:
the clustering module 501 is configured to cluster a plurality of users to obtain a plurality of clustered user groups;
a first prediction module 502, configured to perform, for a user group, local tensor decomposition on a user, an item searched by the user, and context information of the item, to obtain a local prediction score of the user on the item under the context information;
the second prediction module 503 is configured to perform global tensor decomposition on the user, the item searched by the user, and context information of the item according to the local prediction score, so as to obtain a global prediction score of the user on the item under the context information;
and the recommending module 504 is used for recommending the articles to the user according to the global prediction score.
In a possible implementation, the first prediction module 502 is specifically configured to:
performing local tensor decomposition modeling on a user group, items searched by the user and context information of the items to obtain a local scoring model;
and optimizing the local score model to obtain a local prediction score.
In a possible implementation, the first prediction module 502 is specifically configured to: constructing a first target loss function according to the local grading model; optimizing the local scoring model according to the real score of the user on the article under the context information and a first target loss function to obtain a first latent semantic vector corresponding to the user, a first latent semantic vector corresponding to the article and a first latent semantic vector corresponding to the context information; and inputting the first latent semantic vector corresponding to the user, the first latent semantic vector corresponding to the article and the first latent semantic vector corresponding to the context information into a local scoring model to obtain a local prediction score of the user on the article under the context information, which is output by the local scoring model.
In a possible implementation manner, the second prediction module 503 is specifically configured to: carrying out global tensor decomposition modeling on the user, the articles searched by the user and the context information of the articles to obtain a global scoring model; combining the local scoring model and the global scoring model through a weighted summation mode and local prediction scoring to obtain a corrected global scoring model; and optimizing the corrected global scoring model to obtain a global prediction score.
In a possible implementation manner, the second prediction module 503 is specifically configured to: weighting the local prediction scores and the global scoring model according to first weights corresponding to the local prediction scores and second weights corresponding to the global scoring model; constructing a modified global scoring model according to the model parameters of the global scoring model and the weighting results of the local prediction scoring and the global scoring model; the model parameters comprise bias parameters corresponding to the user, the article and the context information respectively.
In a possible implementation manner, the second prediction module 503 is specifically configured to: constructing a second target loss function according to the modified global scoring model; optimizing the corrected global scoring model according to the real score of the user on the article under the context information and a second target loss function to obtain a second latent semantic vector corresponding to the user, a second latent semantic vector corresponding to the article and a second latent semantic vector corresponding to the context information; and inputting the second latent semantic vector corresponding to the user, the second latent semantic vector corresponding to the article and the second latent semantic vector corresponding to the context information into the optimized global scoring model, and obtaining the global prediction score of the user on the article under the context information, which is output by the optimized global scoring model.
The article recommendation device provided in fig. 5 may perform the corresponding method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application, and as shown in fig. 6, the electronic device includes: a processor 601 and a memory 602; the memory 602 stores a computer program; the processor 601 executes the computer program stored in the memory to implement the steps of the item recommendation method in the above-mentioned embodiments of the methods.
In the electronic device, the memory 602 and the processor 601 are electrically connected, directly or indirectly, to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines, such as a bus. The memory 602 stores computer-executable instructions for implementing the item recommendation method, including at least one software functional module that can be stored in the memory 602 in the form of software or firmware, and the processor 601 executes various functional applications and data processing by running the software programs and modules stored in the memory 602.
The Memory 602 may be, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Read-Only Memory (EPROM), an electrically Erasable Read-Only Memory (EEPROM), and the like. The memory 602 is used for storing programs, and the processor 601 executes the programs after receiving the execution instructions. Further, the software programs and modules within the memory 602 may also include an operating system, which may include various software components and/or drivers for managing system tasks (e.g., memory management, storage device control, power management, etc.), and may communicate with various hardware or software components to provide an operating environment for other software components.
The processor 601 may be an integrated circuit chip having signal processing capabilities. The Processor 601 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
An embodiment of the present application further provides a chip, including: a processor and a memory; the storage is stored with a computer program, and the processor executes the computer program stored in the storage to realize the steps of the item recommendation method in the above method embodiments.
An embodiment of the present application further provides a computer-readable storage medium, in which computer-executable instructions are stored, and when the computer-executable instructions are executed by a processor, the steps of the item recommendation method in the above-mentioned method embodiments are implemented.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, the computer program can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
So far, the technical solutions of the present application have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present application is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the present application, and the technical scheme after the changes or substitutions will fall into the protection scope of the present application.