CN112488355A - Method and device for predicting user rating based on graph neural network


Info

Publication number
CN112488355A
Authority
CN
China
Prior art keywords
commodity, user, adjacent, information, feature
Legal status
Pending
Application number
CN202011176586.9A
Other languages
Chinese (zh)
Inventor
赵煜
杨斯雯
许强
李维
何彦杉
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202011176586.9A
Publication of CN112488355A

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N3/00 Computing arrangements based on biological models
            • G06N3/02 Neural networks
              • G06N3/04 Architecture, e.g. interconnection topology
                • G06N3/045 Combinations of networks
              • G06N3/08 Learning methods
        • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
          • G06Q10/00 Administration; Management
            • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
            • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
              • G06Q10/063 Operations research, analysis or management
                • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
                  • G06Q10/06393 Score-carding, benchmarking or key performance indicator [KPI] analysis
          • G06Q30/00 Commerce
            • G06Q30/02 Marketing; Price estimation or determination; Fundraising
              • G06Q30/0201 Market modelling; Market analysis; Collecting market data
                • G06Q30/0203 Market surveys; Market polls


Abstract

The application discloses a method and a device for predicting a user rating based on a graph neural network, in the field of artificial intelligence. In the technical solution, a user-commodity-commodity feature three-part graph is constructed from the historical records of commodities purchased by users and the commodity features in the users' comments on the commodities. According to the constructed three-part graph, the characterization information of users, commodities and commodity features is learned using a neural network: the characterization information of a user is learned from the characterization information of the user's adjacent commodity features and adjacent commodities, the characterization information of a commodity is learned from the characterization information of the commodity's adjacent commodity features and adjacent users, and the characterization information of a commodity feature is learned from the characterization information of the commodity feature's adjacent users and adjacent commodities. The user's rating of a commodity is then predicted by the neural network from the learned characterization information of the user, the commodity and the commodity features. This technical solution can improve the accuracy of the predicted user rating.

Description

Method and device for predicting user rating based on graph neural network
Technical Field
The application relates to the field of artificial intelligence, in particular to a method and a device for predicting user rating based on a graph neural network.
Background
Artificial Intelligence (AI) is a theory, method, technique and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge and use that knowledge to obtain the best results. In other words, artificial intelligence is a branch of computer science that attempts to understand the essence of intelligence and to produce new intelligent machines that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines can perceive, reason and make decisions. Research in the field of artificial intelligence includes robotics, natural language processing, computer vision, decision and reasoning, human-computer interaction, recommendation and search, basic AI theory, and the like.
Currently, AI has been widely applied in many areas of daily life; for example, it can be applied to user rating prediction for an e-commerce website. With the popularity of the Internet, users carry out many activities online, such as purchasing products and reserving restaurants on e-commerce websites. Generally, when performing such an online activity, a user has to select a preferred product from a large number of similar products. To reduce the cognitive burden on the user when selecting commodities, the recommendation system plays an important role: it can predict the user's rating of a commodity from the user's historical purchasing habits, so as to recommend commodities to the user.
When the user rating of user u for commodity i needs to be predicted, the characterization information of user u and the characterization information of commodity i can be learned by a deep learning network from, respectively, the user comments written by user u and the user comments received by commodity i. The learned representative embedded vector of user u and the learned representative embedded vector of commodity i are then combined, and the user rating of commodity i by user u is finally predicted based on the combined representative embedded vector.
However, ratings predicted with the above technique are often inaccurate; for example, the user may be dissatisfied with the goods recommended by the website.
Disclosure of Invention
The application provides a method and a related device for predicting a user rating based on a graph neural network, which can improve the accuracy of the predicted user rating.
In a first aspect, the present application provides a method of predicting a user rating based on a neural network. The method comprises the following steps: generating target characterization information by using a neural network based on the characterization information of the adjacent commodity features of each of a plurality of users, the characterization information of the adjacent commodities of each user, the characterization information of the adjacent commodity features of each of a plurality of commodities, the characterization information of the adjacent users of each commodity, the characterization information of the adjacent users and adjacent commodities of each commodity feature among the adjacent commodity features of each user, and the characterization information of the adjacent users and adjacent commodities of each commodity feature among the adjacent commodity features of each commodity, wherein the target characterization information comprises the characterization information of a user to be rated, the characterization information of a commodity to be rated, the characterization information of the adjacent commodity features of the commodity to be rated and the characterization information of the adjacent commodity features of the user to be rated; and obtaining, by using the neural network and based on the target characterization information, first rating information, wherein the first rating information represents the user rating of the commodity to be rated by the user to be rated.
Here, the plurality of users comprise the users corresponding to the user nodes connected, through commodity nodes and/or commodity feature nodes, in the three-part graph; the adjacent commodity features of each user comprise the commodity features corresponding to the commodity feature nodes connected with the user node corresponding to that user in the three-part graph; the adjacent commodities of each user comprise the commodities corresponding to the commodity nodes connected with the user node corresponding to that user in the three-part graph; the plurality of commodities comprise the commodities corresponding to the commodity nodes connected, through user nodes and/or commodity feature nodes, in the three-part graph; the adjacent commodity features of each commodity comprise the commodity features corresponding to the commodity feature nodes connected with the commodity node corresponding to that commodity in the three-part graph; and the adjacent users of each commodity comprise the users corresponding to the user nodes connected with the commodity node corresponding to that commodity in the three-part graph.
The three-part graph comprises M1 user nodes, M2 commodity nodes and M3 commodity feature nodes, where the M1 user nodes correspond one-to-one to M1 users, the M2 commodity nodes correspond one-to-one to M2 commodities, and the M3 commodity feature nodes correspond one-to-one to M3 commodity features. The user node corresponding to the u-th of the M1 users is connected with the commodity nodes corresponding to the commodities commented on and/or purchased by the u-th user, and with the commodity feature nodes corresponding to the commodity features of the commodities commented on by the u-th user; the commodity node corresponding to the i-th of the M2 commodities is connected with the commodity feature nodes corresponding to the commented-on commodity features of the i-th commodity. M1, M2 and M3 are each integers greater than 1, and u and i are positive integers with u ≤ M1 and i ≤ M2.
According to the method, a user-commodity-commodity feature three-part graph is constructed from the historical records of commodities purchased by users and the commodity features in the users' comments on the commodities. According to the constructed three-part graph, the characterization information of users, commodities and commodity features is learned using a neural network: the characterization information of a user is learned from the characterization information of the user's adjacent commodity features and adjacent commodities, the characterization information of a commodity is learned from the characterization information of the commodity's adjacent commodity features and adjacent users, and the characterization information of a commodity feature is learned from the characterization information of the commodity feature's adjacent users and adjacent commodities. The user's rating of the commodity is then predicted by the neural network from the learned characterization information of the user, the commodity and the commodity features. This technical solution can improve the accuracy of the predicted user rating.
With reference to the first aspect, in a first possible implementation manner, the neural network is a graph neural network, and the characterization information of each user, the characterization information of each commodity, the characterization information of each adjacent commodity feature of each user, and the characterization information of each adjacent commodity feature of each commodity are initialized.
Accordingly, generating the target characterization information based on the characterization information listed above includes: step one, updating, by using an embedded update layer of the graph neural network, the characterization information of each user based on the characterization information of that user, the characterization information of that user's adjacent commodity features and the characterization information of that user's adjacent commodities; step two, updating, by using the embedded update layer of the graph neural network, the characterization information of each commodity based on the characterization information of that commodity, the characterization information of that commodity's adjacent users and the characterization information of that commodity's adjacent commodity features; and step three, updating, by using the embedded update layer of the graph neural network, the characterization information of each commodity feature of the plurality of commodity features based on the characterization information of that commodity feature, the characterization information of the first adjacent commodity of that commodity feature and the characterization information of the first adjacent user of that commodity feature.
In the implementation mode, the embedded updating layer of the graph neural network is used for updating the representation information of each user, the representation information of each commodity and the representation information of each commodity feature, and the representation information of all nodes in the three-part graph can be updated by executing one-time embedded updating, so that the learning efficiency is improved.
With reference to the first possible implementation manner, in a second possible implementation manner, the graph neural network includes L embedded update layers, where L is equal to the connection relationship length between the user to be rated and a first user among the multiple users, the first user being the user whose connection relationship length to the user to be rated is larger than that of any other user. The connection relationship length between the user to be rated and any user is the number of commodity nodes, commodity feature nodes and user nodes contained in the shortest path in the three-part graph from the user node corresponding to the user to be rated to the user node corresponding to that user, minus 1. L is a positive integer.
In this implementation manner, the number of nodes contained in the shortest path between the user to be rated and the farthest user node is used as the number of embedded update layers in the graph neural network, so that the user node corresponding to the user to be rated can learn the characterization information of every node in the three-part graph that has a connection relationship with it. This improves the accuracy of the characterization information of the user to be rated, and in turn the accuracy of the predicted rating.
With reference to the first possible implementation manner or the second possible implementation manner, in a third possible implementation manner, the updating, by using the embedded update layer of the graph neural network, the characterization information of each user based on the characterization information of each user, the characterization information of the adjacent commodity features of each user, and the characterization information of the adjacent commodities of each user includes: using the attention network in the embedded update layer to obtain, based on the characterization information of each of the n1 adjacent commodity features of each user, an attention value of that user for each of the n1 adjacent commodity features; using the embedded update layer to merge the characterization information of the n1 adjacent commodity features according to the attention value of each of the n1 adjacent commodity features, obtaining first commodity feature information; and using the embedded update layer to merge the characterization information of each user, the first commodity feature information and the characterization information of the adjacent commodities of each user, obtaining user characterization information, wherein the characterization information of each user is updated to the user characterization information.
In this implementation manner, the attention network is used as the aggregation function to aggregate the characterization information of each user's adjacent commodity features, so that the attention value of each adjacent commodity feature of the user, that is, the importance of each adjacent commodity feature to the user, can be obtained, which improves the accuracy of the aggregated characterization information of the user's adjacent commodity features.
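As a concrete illustration only, the Python sketch below shows one way such attention-based aggregation could look. The dot-product scoring, the function names and the use of plain lists are assumptions made for readability; the text above only specifies that an attention network yields one attention value per adjacent commodity feature, that the feature characterizations are merged according to those values, and that the result is merged with the user's own characterization and the aggregated adjacent-commodity characterization.

```python
import math

def softmax(scores):
    # Normalize raw attention scores into weights that sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_aggregate(user_vec, feature_vecs):
    """Aggregate a user's adjacent commodity-feature vectors with attention.

    The dot-product scoring is an assumption: the text only says an attention
    network yields one attention value per adjacent commodity feature.
    """
    scores = [sum(u * f for u, f in zip(user_vec, fv)) for fv in feature_vecs]
    weights = softmax(scores)
    dim = len(user_vec)
    # Attention-weighted sum plays the role of the "first commodity feature information".
    return [sum(w * fv[d] for w, fv in zip(weights, feature_vecs)) for d in range(dim)]

def update_user(user_vec, feature_vecs, neighbor_item_vec):
    # Merge (here: concatenate) the user's current vector, the attention-aggregated
    # adjacent-feature vector and the aggregated adjacent-commodity vector.
    return user_vec + attention_aggregate(user_vec, feature_vecs) + neighbor_item_vec
```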
With reference to any one of the foregoing possible implementation manners, in a fourth possible implementation manner, the updating, by using the embedded update layer of the graph neural network, the characterization information of each commodity based on the characterization information of each commodity, the characterization information of the adjacent users of each commodity, and the characterization information of the adjacent commodity features of each commodity includes: using the attention network in the embedded update layer to obtain, based on the characterization information of each of the n2 adjacent commodity features of each commodity, an attention value of that commodity for each of the n2 adjacent commodity features; using the embedded update layer to fuse the characterization information of the n2 adjacent commodity features according to the attention value of each of the n2 adjacent commodity features, obtaining second commodity feature information; and using the embedded update layer to merge the characterization information of each commodity, the second commodity feature information and the characterization information of the adjacent users of each commodity, obtaining commodity characterization information, wherein the characterization information of each commodity is updated to the commodity characterization information.
In this implementation manner, the attention network is used as the aggregation function to aggregate the characterization information of each commodity's adjacent commodity features, so that the attention value of each adjacent commodity feature of the commodity, that is, the quality of each adjacent commodity feature as exhibited by the commodity, can be obtained, which improves the accuracy of the aggregated characterization information of the commodity's adjacent commodity features.
With reference to any one of the foregoing possible implementation manners, in a fifth possible implementation manner, the first adjacent user is the user to be rated, and the first adjacent commodity is the commodity to be rated.
With reference to the first aspect or any one of the foregoing possible implementation manners, in a sixth possible implementation manner, the obtaining, by using the neural network, first rating information based on the target characterization information includes: merging the target characterization information by using the neural network to obtain merged information; and predicting, by using the neural network, the first rating information based on the merged information.
In this implementation manner, the neural network merges the obtained characterization information of the commodity features into one merged commodity feature characterization, and the predicted user rating of the commodity to be rated by the user to be rated is obtained based on the characterization information of the user to be rated, the characterization information of the commodity to be rated and the merged commodity feature characterization, which improves the accuracy of the predicted user rating.
With reference to the first aspect or any one of the foregoing possible implementation manners, in a seventh possible implementation manner, the neural network is trained based on the first rating information and the real rating information of the to-be-rated user on the to-be-rated commodity.
In this implementation manner, the neural network model is trained according to the predicted user rating and the real rating given by the user to be rated to the commodity to be rated, so that the neural network model can predict user ratings more accurately.
In a second aspect, the present application provides an apparatus for predicting user ratings based on a neural network, which may include one or more functional modules for implementing the method of the first aspect, each of which may be implemented by software and/or hardware. For example, the apparatus may include a processing module and a rating module.
In a third aspect, the present application provides an apparatus for predicting a user rating based on a neural network. The apparatus may include a processor coupled with a memory. Wherein the memory is configured to store program code and the processor is configured to execute the program code in the memory to implement the method of the first aspect or any one of the implementations.
Optionally, the apparatus may further comprise the memory.
In a fourth aspect, the present application provides a chip comprising at least one processor and a communication interface, the communication interface and the at least one processor are interconnected by a line, and the at least one processor is configured to execute a computer program or instructions to perform the method according to the first aspect or any one of the possible implementations thereof.
In a fifth aspect, the present application provides a computer readable medium storing program code for execution by a device, the program code comprising instructions for performing the method according to the first aspect or any one of its possible implementations.
In a sixth aspect, the present application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method according to the first aspect or any one of its possible implementations.
In a seventh aspect, the present application provides a computing device comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the communication interface being in communication with a target system, the at least one processor being configured to execute a computer program or instructions to perform the method according to the first aspect or any one of the possible implementations.
In an eighth aspect, the present application provides a computing system comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the communication interface being in communication with a target system, the at least one processor being configured to execute a computer program or instructions to perform the method according to the first aspect or any one of the possible implementations thereof.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic diagram of a three-part diagram provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of an application scenario according to an embodiment of the present application;
FIG. 3 is a schematic flow chart diagram of a training method for a neural network for predicting user ratings according to one embodiment of the present application;
FIG. 4 is a schematic flow chart diagram of a training method for a graph neural network for predicting user ratings according to the present application;
FIG. 5 is a graphical representation of the ranking results of the neural network model and the 8 baseline models provided herein;
FIG. 6 is a schematic view of the attention distribution of the merchandise features in the user network provided by the present application;
FIG. 7 is a schematic view of attention distribution of commodity features in a commodity network provided by the present application;
FIG. 8 is a schematic flow chart diagram of a method of predicting user ratings based on a neural network in one embodiment of the present application;
FIG. 9 is a system architecture diagram of one embodiment of the present application;
FIG. 10 is a schematic block diagram of an apparatus for predicting user ratings based on a neural network according to an embodiment of the present application;
fig. 11 is a schematic block diagram of an apparatus for predicting a user rating based on a neural network according to another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
For ease of understanding, some concepts related to the embodiments of the present application will be described first.
E-commerce website: a website that sells goods to users online. The e-commerce websites to which the technical solution of the embodiments of the present application can be applied generally satisfy the following: 1. there are user ratings given by users to purchased or used commodities, indicating the users' preference for the commodities; 2. there are user comments written by users on purchased or used commodities.
User comment: a description, written by the user in natural language, of the user's experience with the commodity, used to explain the user rating given.
User rating: the score a user gives a commodity after purchasing or using it on the e-commerce website. The user rating represents the user's preference for the commodity; a higher user rating indicates that the user likes the commodity more. For example, on an e-commerce website a user may rate commodities from 1 to 5 points, where 1 point means the user dislikes the commodity and 5 points means the user likes it. User ratings are often accompanied by user comments that explain why the rating was given.
Commodity feature: a salient feature describing a commodity that appears in user comments in the form of a noun or noun phrase. For example, commodity features may be price, service, taste, cleanliness, and the like.
Commodity feature extraction: extracting the salient features of a commodity from user comments. Specifically, given user comments as input, the commodity features described by nouns or noun phrases are output, for example "price" and "service". Commodity features can be extracted in various ways, including manual extraction and extraction by unsupervised machine learning; a variety of unsupervised machine learning methods for extracting commodity features have been proposed.
One method for extracting commodity features assumes that each commodity feature has a corresponding emotional expression in the user comments, and that commodity features and emotional expressions appear in specific grammatical patterns. For example, in "the service is also very enthusiastic", "service" is a commodity feature and "enthusiastic" is the emotional expression corresponding to it, directly modifying "service". In this method, an emotional-expression lexicon is first created manually, for example "enthusiastic, good, cheap, attentive"; commodity features corresponding to the emotional expressions in the lexicon are then found according to the created lexicon and the specified grammar; new emotional expressions are found according to the found commodity features and the specified grammar, and the lexicon is updated; and new commodity features and emotional expressions are found iteratively in this way until no more are found (saturation).
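Purely as an illustration of the iterative procedure above, the sketch below implements the bootstrapping loop in Python. The single adjacency "grammar rule" (an opinion word directly following the feature noun, and vice versa), the tokenized input and the function name are simplifications and assumptions, not the grammar any particular extraction method prescribes.

```python
def extract_commodity_features(reviews, seed_opinions):
    """Bootstrapping sketch of the feature/opinion co-extraction described above.

    `reviews` is a list of token lists. The dependency grammar is reduced to a
    single toy rule, which is an assumption made only to keep the sketch short.
    """
    opinions = set(seed_opinions)
    features = set()
    while True:
        # Toy rule: the word right before a known opinion word is a candidate feature.
        new_features = {toks[i - 1] for toks in reviews
                        for i, t in enumerate(toks) if i > 0 and t in opinions}
        new_features -= features
        # Toy rule: the word right after a known feature is a candidate opinion word.
        new_opinions = {toks[i + 1] for toks in reviews
                        for i, t in enumerate(toks)
                        if t in features | new_features and i + 1 < len(toks)}
        new_opinions -= opinions
        if not new_features and not new_opinions:   # saturation: nothing new found
            return features
        features |= new_features
        opinions |= new_opinions
```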
Embedded vector: a real-valued vector to which an object (such as a user, a commodity or a commodity feature) is mapped in order to characterize the attributes of the object. Embedded vectors are widely used in machine learning for prediction and reasoning; for example, the user rating of user u for commodity i can be predicted from the user embedded vector of user u and the commodity embedded vector of commodity i. Embedded vectors are continuously optimized and learned during model training. If the attributes of two objects are similar, their learned embedded vectors will have a high degree of similarity.
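A minimal sketch of how such similarity between embedded vectors could be measured is given below; the use of cosine similarity and the toy three-dimensional vectors are assumptions for illustration only, since the text above only states that similar objects end up with highly similar embedded vectors.

```python
import math

def cosine_similarity(a, b):
    # Similar objects should be mapped to embedded vectors with high similarity.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy, made-up vectors: two users with similar purchase habits and one unrelated user.
user_a, user_b, user_c = [0.9, 0.8, 0.1], [0.85, 0.75, 0.15], [0.1, 0.2, 0.9]
print(cosine_similarity(user_a, user_b))   # close to 1: similar attributes
print(cosine_similarity(user_a, user_c))   # much lower
```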
Three-part graph: a graph showing the interaction relations between a user and the user's adjacent commodities, between a user and the user's adjacent commodity features, between a commodity and the commodity's adjacent users, between a commodity and the commodity's adjacent commodity features, between a commodity feature and the commodity feature's adjacent users, and between a commodity feature and the commodity feature's adjacent commodities.
For example, a three-part graph may include M1 user nodes, M2 commodity nodes and M3 commodity feature nodes, where the M1 user nodes correspond one-to-one to M1 users, the M2 commodity nodes correspond one-to-one to M2 commodities, and the M3 commodity feature nodes correspond one-to-one to M3 commodity features. A user node represents the corresponding user, a commodity node represents the corresponding commodity, and a commodity feature node represents the corresponding commodity feature.
The user node corresponding to the u-th of the M1 users is connected with the commodity nodes corresponding to the adjacent commodities of the u-th user, and with the commodity feature nodes corresponding to the adjacent commodity features of the u-th user.
The commodity node corresponding to the i-th of the M2 commodities is connected with the commodity feature nodes corresponding to the adjacent commodity features of the i-th commodity, and with the user nodes corresponding to the adjacent users of the i-th commodity.
The commodity feature node corresponding to the a-th of the M3 commodity features is connected with the user nodes corresponding to the adjacent users of the a-th commodity feature, and with the commodity nodes corresponding to the adjacent commodities of the a-th commodity feature.
The three-part graph contains the following three interaction relations: 1) the user-commodity interaction relation, from which the adjacent commodities of a user and the adjacent users of a commodity can be obtained; 2) the commodity-commodity feature interaction relation, from which the adjacent commodity features of a commodity and the adjacent commodities of a commodity feature can be obtained; 3) the user-commodity feature interaction relation, from which the adjacent commodity features of a user and the adjacent users of a commodity feature can be obtained.
Adjacent users of a commodity: the users who have purchased and/or commented on the commodity. For example, the adjacent users of commodity i include the users who purchased and/or commented on commodity i.
Adjacent commodity features of a commodity: the commodity features of the commodity that have been commented on. For example, the adjacent commodity features of commodity i include the commented-on commodity features of commodity i.
Adjacent commodities of a user: the commodities that the user has purchased and/or commented on. For example, the adjacent commodities of user u include the commodities purchased and/or commented on by user u.
Adjacent commodity features of a user: the commodity features that the user has commented on. For example, the adjacent commodity features of user u include the commodity features that user u has commented on.
Adjacent users of a commodity feature: the users who have commented on the commodity feature. For example, the adjacent users of commodity feature a include the users who have commented on commodity feature a.
Adjacent commodities of a commodity feature: the commodities for which the commodity feature has been commented on. For example, the adjacent commodities of commodity feature a include the commodities for which commodity feature a has been commented on.
Here, M1, M2 and M3 are each integers greater than 1, and u, i and a are positive integers with u ≤ M1, i ≤ M2 and a ≤ M3.
Connection relationship length: for any node and any other node in the three-part graph, the number of all nodes contained in the shortest path between them, minus 1. For example, the connection relationship length between user u1 and user u2 equals the number of all nodes contained in the shortest path in the three-part graph from the user node corresponding to u1 to the user node corresponding to u2, minus 1, where the nodes on the shortest path include the start node and the end node.
To facilitate understanding of the concepts of the three-part graph, adjacent users, adjacent commodities, adjacent commodity features, connection relationship length, and the like, the three-part graph shown in FIG. 1 is used as an example.
As shown in FIG. 1, commodity 1 is an adjacent commodity of user 1, commodity feature 1 is an adjacent commodity feature of user 1, user 2 is an adjacent user of commodity 1, commodity feature 2 is an adjacent commodity feature of commodity 1 and of user 3, user 3 is an adjacent user of commodity 1 and of commodity feature 1, and commodity 2 is an adjacent commodity of commodity feature 1 and of user 3. The connection relationship length between user 1 and user 2 is 2.
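To make the FIG. 1 example concrete, the sketch below writes the three-part graph as a plain adjacency list and computes the connection relationship length with a breadth-first search. The node names and the use of BFS are illustrative assumptions; the text only defines the length as the node count of the shortest path minus 1.

```python
from collections import deque

# Three-part graph of FIG. 1 as an undirected adjacency list.
# Node names: u = user, i = commodity, f = commodity feature.
graph = {
    "u1": ["i1", "f1"],
    "u2": ["i1"],
    "u3": ["i1", "f1", "f2", "i2"],
    "i1": ["u1", "u2", "u3", "f2"],
    "i2": ["f1", "u3"],
    "f1": ["u1", "u3", "i2"],
    "f2": ["i1", "u3"],
}

def connection_length(src, dst):
    """Connection relationship length: nodes on the shortest path minus 1 (BFS)."""
    visited = {src: 1}                 # node -> number of nodes on the path so far
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return visited[node] - 1
        for nxt in graph[node]:
            if nxt not in visited:
                visited[nxt] = visited[node] + 1
                queue.append(nxt)
    return None

print(connection_length("u1", "u2"))   # 2, matching the example above
```

Under this reading, the number L of embedded update layers discussed below would be the largest such length between the user to be rated and any other user.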
Graph neural network: a neural network comprising a plurality of embedded update layers and a prediction layer.
Typically, an embedded update layer performs two operations to update the embedded vector of a node in the three-part graph: aggregation and combination.
The aggregation operation aggregates the embedded vectors of the neighbor nodes of the current node using an aggregation function, yielding an aggregated neighbor embedded vector. For example, in FIG. 1, to update the embedded vector of the user 1 node, an aggregation operation may be used to aggregate the embedded vectors of the commodity 1 node and the commodity feature 1 node, resulting in an aggregated neighbor embedded vector. Common aggregation methods include the mean, max pooling, long short-term memory (LSTM) and attention networks.
The combination operation combines the aggregated neighbor embedded vector with the current embedded vector of the current node to generate the updated embedded vector of the current node. A common combination method is concatenation.
An embedded update layer can update the embedded vectors of all nodes in the three-part graph simultaneously. The graph neural network includes L embedded update layers, where L equals the maximum connection relationship length between the node corresponding to the user to be rated in the three-part graph and the nodes used to update the characterization information of the user to be rated. When the model is applied, the number of embedded update layers of the graph neural network needs to be set. As an example, the number of layers may be set according to the accuracy requirement of the predicted rating: when the accuracy requirement is high, different numbers of embedded update layers may be tested and the number giving the highest accuracy selected. Alternatively, the number of embedded update layers may be set empirically.
For example, assume that the embedded vector of the user 1 node is updated with two embedded update layers. In the first embedded update layer, to update the embedded vector of the user 1 node, the embedded vectors of the commodity 1 node and the commodity feature 1 node are aggregated and then combined with the pre-update embedded vector of the user 1 node. Meanwhile, to update the embedded vector of the commodity 1 node, the embedded vectors of the user 2 node, the commodity feature 2 node and the user 3 node are aggregated and then combined with the pre-update embedded vector of the commodity 1 node. In the second embedded update layer, the embedded vector of the user 1 node is updated by aggregating the embedded vectors of the commodity 1 node and the commodity feature 1 node obtained from the first embedded update layer. Since in the first embedded update layer commodity 1 and commodity feature 1 already aggregated the embedded vectors of user 2, commodity feature 2, user 3 and commodity 2, after the two embedded update layers the embedded vector of the user 1 node combines the characteristics of the commodity 1 node, the commodity feature 1 node, the user 2 node, the commodity feature 2 node, the user 3 node and the commodity 2 node in the graph.
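The two operations and the two-layer propagation just described can be sketched as follows. The mean aggregator and the concatenation combine are simply the options named above; the toy one-dimensional initial embeddings are assumptions, and a real implementation would also interleave learned transformations, as in the formulas later in this description.

```python
def mean_aggregate(vectors):
    # Aggregation: element-wise mean of the neighbours' embedded vectors.
    dim = len(vectors[0])
    return [sum(v[d] for v in vectors) / len(vectors) for d in range(dim)]

def embedded_update_layer(graph, embeddings):
    """One embedded update layer: every node aggregates its neighbours'
    vectors and combines (here: concatenates) the result with its own."""
    updated = {}
    for node, neighbours in graph.items():
        neighbour_vec = mean_aggregate([embeddings[n] for n in neighbours])
        updated[node] = embeddings[node] + neighbour_vec   # list concatenation
    return updated

# The FIG. 1 adjacency list from the earlier sketch, repeated for self-containment.
graph = {"u1": ["i1", "f1"], "u2": ["i1"], "u3": ["i1", "f1", "f2", "i2"],
         "i1": ["u1", "u2", "u3", "f2"], "i2": ["f1", "u3"],
         "f1": ["u1", "u3", "i2"], "f2": ["i1", "u3"]}

emb = {n: [float(k)] for k, n in enumerate(graph)}   # toy 1-d initial embeddings
for _ in range(2):          # after two layers, u1's vector reflects 2-hop nodes
    emb = embedded_update_layer(graph, emb)
```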
After the L embedded update layers, to obtain the predicted user rating of the commodity to be rated by the user to be rated, the prediction layer combines the updated embedded vector of the user to be rated, the updated embedded vector of the commodity to be rated, and the embedded vectors of the adjacent commodity features of the user to be rated and of the commodity to be rated, and generates the predicted user rating using a neural network.
FIG. 2 is a schematic diagram of an application scenario according to an embodiment of the present application. In the scenario shown in FIG. 2, a user shops on an e-commerce website using a user terminal 201. First, the user enters a commodity text query on the e-commerce website page displayed by the user terminal 201; the user terminal 201 sends the commodity text query to the e-commerce website server 202; the e-commerce website server 202 finds a list of related commodities according to the degree of text match between the query and the commodity information, and predicts the user's rating for the commodities in the list; the e-commerce website server 202 sorts the commodities in the list in descending order of predicted user rating and sends the sorted result to the user terminal 201; and the user terminal 201 presents the sorted result to the user. The user terminal may be a mobile phone, a computer, a tablet, or the like.
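The server-side flow of FIG. 2 can be summarized by the sketch below; every callable here (the text matcher and the rating predictor) is a placeholder standing in for components the scenario only describes, not an API defined by the application.

```python
def recommend(query, user_id, catalog, matches, predict_rating):
    """FIG. 2 flow: match the text query against the catalog, predict the
    user's rating for each hit, and return the hits sorted by predicted
    rating in descending order."""
    hits = [item for item in catalog if matches(query, item)]
    return sorted(hits, key=lambda item: predict_rating(user_id, item), reverse=True)
```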
It should be understood that the scenario of FIG. 2 is only an example; the technical solution of the present application can also be applied to other scenarios, as long as the scenario involves recommending commodities to a user according to the user's historical purchase information. The historical purchase information of a user may include the commodities purchased by the user, the user's ratings of the purchased commodities, and/or the user's comments on the purchased commodities. For example, the technical solution of the present application can also be applied to a review website, recommending, for example, new restaurants to the user according to the restaurants the user has visited.
The technical solution shown in the present application is described below. It is to be understood that the following embodiments may be implemented individually or in combination with each other, and the description of the same or similar contents will not be repeated in different embodiments.
FIG. 3 is a flow chart illustrating a training method of a neural network for predicting user ratings according to an embodiment of the present application. As shown in fig. 3, the method may include S301, S302, and S303.
S301, generating target characterization information by using a neural network based on the characterization information of the adjacent commodity features of each of a plurality of users, the characterization information of the adjacent commodities of each user, the characterization information of the adjacent commodity features of each of a plurality of commodities, the characterization information of the adjacent users of each commodity, and the characterization information of the adjacent users and adjacent commodities of each commodity feature among the adjacent commodity features of each user and of each commodity. The target characterization information comprises the characterization information of the user to be rated, the characterization information of the commodity to be rated, the characterization information of the adjacent commodity features of the commodity to be rated and the characterization information of the adjacent commodity features of the user to be rated. The plurality of users includes the user to be rated, and the plurality of commodities includes the commodity to be rated.
The characterization information of a commodity feature is used to represent the characteristics of that commodity feature; for example, it may be the word corresponding to the commodity feature. The characterization information of a user is used to represent the characteristics of the user; for example, it may include the characteristics of the commodities purchased and/or commented on by the user. The characterization information of a commodity is used to represent the characteristics of the commodity; for example, it may include characteristics of the users who purchased and/or commented on the commodity.
As an example, the characterization information of the user, the characterization information of the merchandise, and the characterization information of the merchandise feature may be represented by an embedded vector. In one possible implementation manner, generating the characterization information of the user to be ranked based on the characterization information of the adjacent commodity feature of each of the plurality of users, the characterization information of the adjacent commodity feature of each of the plurality of commodities, and the characterization information of the adjacent user of each of the commodities may include: and aggregating and/or combining the characterization information of all or part of the adjacent commodity characteristics of the user to be rated and the characteristics of all or part of the adjacent commodity characteristics of the user to be rated, and taking the obtained information as the characterization information of the user to be rated.
As an example, when the characterization information of users, commodities and commodity features is represented by embedded vectors, the attention value of each adjacent commodity feature of the user may be obtained by attention-network aggregation, and the embedded vector of the user's adjacent commodity features is computed from the characterization information of each adjacent commodity feature and the corresponding attention value; the characterization information of the user's adjacent commodities is aggregated to obtain the embedded vector of the user's adjacent commodities; and finally, the user's adjacent-commodity-feature embedded vector, the user's adjacent-commodity embedded vector and the user embedded vector obtained in the previous update are combined to obtain the user's updated embedded vector.
As another example, when the characterization information of users, commodities and commodity features is represented by embedded vectors, the characterization information of the adjacent commodity features of the user to be rated may be averaged, the characterization information of the adjacent commodities of the user to be rated may be averaged, and the two averages may then be combined to obtain the characterization information of the user to be rated.
For example, the characterization information of the adjacent commodity features of the user to be rated is first averaged to obtain the common characterization information of all adjacent commodity features, calculated as

$$\bar{x}_{F(u)} = \frac{1}{n_a} \sum_{a \in F(u)} x_a$$

where $\bar{x}_{F(u)}$ denotes the common characterization information, $F(u)$ denotes the set of adjacent commodity features of the user to be rated, $n_a$ denotes the number of adjacent commodity features of the user to be rated, and $x_a$ denotes the characterization information of each of the adjacent commodity features.
Then, the common characterization information of all adjacent commodities of the user to be rated can be calculated in a similar way, and the common characterization information of the adjacent commodity features of the user to be rated and the common characterization information of the adjacent commodities of the user to be rated are spliced together to obtain the characterization information of the user to be rated.
In one possible implementation manner, generating the characterization information of the commodity to be rated from the characterization information listed above may include: aggregating and/or combining the characterization information of all or part of the adjacent commodity features of the commodity to be rated and the characterization information of all or part of the adjacent users of the commodity to be rated, and taking the obtained information as the characterization information of the commodity to be rated.
The obtaining mode of the characterization information of the commodity to be evaluated can refer to the obtaining mode of the characterization information of the user to be evaluated, and details are not repeated here.
It can be understood that the above implementation manner for generating the characterization information of the user to be evaluated and the characterization information of the commodity to be evaluated is only an example, and the implementation manner for generating the characterization information of the user to be evaluated and the characterization information of the commodity to be evaluated in the embodiment of the present application is not limited thereto.
S302, obtaining, by using the neural network and based on the target characterization information, first rating information of the user to be rated for the commodity to be rated. The first rating information is used to represent or indicate the rating of the commodity to be rated by the user to be rated.
When the neural network is used to generate, based on the target characterization information, the first rating information of the commodity to be rated by the user to be rated, in some implementation manners the information included in the target characterization information (for example, the characterization information of the user to be rated and the characterization information of the commodity to be rated) may be merged to obtain merged information, and the merged information is then input to the prediction layer of the neural network to obtain the first rating information.
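A minimal sketch of this merge-then-predict step is shown below. The text only states that the merged information is fed into the prediction layer; the concatenation, the single-hidden-layer shape, the parameter layout and the function names are all assumptions made for illustration.

```python
def relu(x):
    return x if x > 0.0 else 0.0

def prediction_layer(merged, w1, b1, w2, b2):
    """Toy prediction layer: one hidden ReLU layer followed by a linear output.
    The two-layer shape is an assumption, not the patented architecture."""
    hidden = [relu(sum(w * x for w, x in zip(row, merged)) + b)
              for row, b in zip(w1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

def predict_first_rating(user_vec, item_vec, merged_feature_vec, params):
    # "Merging" is rendered here as concatenation of the target characterization pieces.
    merged = user_vec + item_vec + merged_feature_vec
    return prediction_layer(merged, *params)
```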
And S303, training the neural network according to the first rating information and the real rating information of the commodity to be rated of the user to be rated.
In some possible implementations, the neural network in this embodiment may be a graph neural network, and the characterization information of the user, the commodity, and the commodity feature may be an embedded vector.
The training method of the graph neural network for predicting user ratings is described below in conjunction with fig. 4.
As shown in FIG. 4, the graph neural network of this embodiment may include L embedded update layers. Each embedded update layer includes a user network, a commodity network and a commodity feature network: the user network is used to update the embedded vectors of users, the commodity network is used to update the embedded vectors of commodities, and the commodity feature network is used to update the embedded vectors of commodity features.
S401, initializing the representation information of each user, the representation information of each commodity, the representation information of each adjacent commodity feature of each user and the representation information of each adjacent commodity feature of each commodity.
For example, each user u and each commodity i are randomly assigned embedded vectors $p_u^0$ and $q_i^0$. The embedded vectors $p_u^0$ and $q_i^0$ may be randomly initialized with a uniform distribution and are trained as parameters in the training process.
To capture the semantics of the commodity features, the commodity features may be initialized as word vectors. In a feasible implementation manner, the user comments of each user and the user comments of each commodity can be trained to obtain a word vector v of each word in the user comments of each user and the user comments of each commodityi. The word vector contains the semantics of the corresponding word, so that two similar words correspond to a word vector with a very high degree of similarity. For example, word vectors corresponding to "king" and "queen" have very high similarity.
The adjacent commodity features of each user and of each commodity may contain multiple words; for example, "staff service" contains the two words "staff" and "service". Each word is mapped to its corresponding word vector, and the average of all the word vectors is calculated to obtain the initialized embedded vector $e_a^0$ of the commodity feature $a$.
In order to preserve the semantics of each adjacent commodity feature of each user and of each commodity, the initialized commodity feature embedded vector $e_a^0$ is kept unchanged during the model training process.
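A minimal sketch of this initialization, assuming the pre-trained word vectors are available as a plain Python dict and assuming a uniform initialization range of plus or minus 0.1 (the range is not specified in this embodiment):

```python
import torch
import torch.nn as nn

def init_feature_embedding(feature: str, word_vectors: dict, dim: int = 300) -> torch.Tensor:
    """Initialize a commodity-feature embedding as the average of its word vectors,
    e.g. 'staff service' -> mean of the vectors for 'staff' and 'service'."""
    vecs = [torch.as_tensor(word_vectors[w], dtype=torch.float32)
            for w in feature.split() if w in word_vectors]
    if not vecs:                      # no known words: fall back to zeros
        return torch.zeros(dim)
    return torch.stack(vecs).mean(dim=0)

# User and commodity embeddings: randomly initialized with a uniform distribution
# and trained as parameters; feature embeddings: word-vector averages, kept fixed.
num_users, num_items, dim = 1000, 500, 300
user_emb = nn.Parameter(torch.empty(num_users, dim).uniform_(-0.1, 0.1))
item_emb = nn.Parameter(torch.empty(num_items, dim).uniform_(-0.1, 0.1))

word_vectors = {"staff": torch.randn(dim), "service": torch.randn(dim)}  # stand-in vectors
feat_emb = init_feature_embedding("staff service", word_vectors, dim)    # plain tensor,
                                                                         # so it stays unchanged
```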
S402, using a user network embedded in an updating layer in the graph neural network, and updating the characterization information of each user based on the characterization information of each user, the characterization information of the adjacent commodity features of each user and the characterization information of the adjacent commodities of each user.
For example, the characterizing information of all the adjacent product features of each user and the characterizing information of all the adjacent products of the user may be aggregated, and then the characterizing information of the user may be updated according to the product feature characterizing information obtained by aggregation and the product characterizing information obtained by aggregation.
The following describes an implementation of aggregating the characterization information of all the neighboring products of each user.
Illustratively, for a user and the user's adjacent commodities, embedded transformation matrices $V_u^l$ and $V_i^l$ may be used to transform the embedded vector $e_u^{l-1}$ of the user and the embedded vector $e_i^{l-1}$ of each adjacent commodity of the user according to the following relations, to ensure that the characterization information of the user to be updated and the characterization information of the commodities adjacent to the user are in the same state space:

$$h_u^l = \sigma(V_u^l e_u^{l-1}), \qquad h_i^l = \sigma(V_i^l e_i^{l-1})$$

where $\sigma$ is a linear rectification function (ReLU), and $V_u^l$ and $V_i^l$ are embedded transformation matrices that are trained in the model training process.
The embedded vectors $h_i^l$ of the adjacent commodities converted by the above formula are aggregated by using the following aggregation function to obtain the aggregated embedded vector of the adjacent commodities of the user, that is, the characterization information $e_{N(u)}^l$ of the adjacent commodities of the user:

$$e_{N(u)}^l = \text{Aggregate}\big(\{\, h_i^l \mid i \in N(u) \,\}\big)$$

where Aggregate represents the aggregation function, $N(u)$ represents all the adjacent commodity nodes of each user, and $h_i^l$ represents the transformed embedded vectors of the adjacent commodities of each user.
It will be appreciated that the aggregation function may be a mean aggregation function, a maximum aggregation function, an LSTM aggregation function, or an attention network. Compared with the maximum aggregation function, the LSTM aggregation function, or the attention network, using the mean aggregation function to obtain the characterization information of the adjacent commodities of the user helps to avoid overfitting of the graph neural network.
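A minimal sketch of this transform-then-aggregate step with the mean aggregation function; the bias-free linear layer and the tensor shapes are assumptions for illustration:

```python
import torch
import torch.nn as nn

class MeanNeighborAggregator(nn.Module):
    """Transforms the neighbor embeddings into the same state space and
    aggregates them with a mean aggregation function."""

    def __init__(self, dim: int = 300):
        super().__init__()
        self.transform = nn.Linear(dim, dim, bias=False)  # embedded transformation matrix

    def forward(self, neighbor_emb: torch.Tensor) -> torch.Tensor:
        # neighbor_emb: (num_neighbors, dim) embeddings of the adjacent commodities N(u)
        h = torch.relu(self.transform(neighbor_emb))      # sigma(V^l e^{l-1})
        return h.mean(dim=0)                              # mean Aggregate over N(u)

agg = MeanNeighborAggregator(dim=300)
e_Nu = agg(torch.randn(10, 300))   # 10 sampled adjacent commodities -> aggregated vector
```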
The following describes an implementation of aggregating the characterization information of all the neighboring merchandise features of the user.
First, an embedded transformation matrix $V_a^l$ may be used to transform the embedded vector $e_a^{l-1}$ of each adjacent commodity feature of the user according to the following relation, to ensure that the characterization information of the user and the characterization information of the adjacent commodity features of the user are in the same state space:

$$h_a^l = \sigma(V_a^l e_a^{l-1})$$

where $\sigma$ is the ReLU function, and $V_a^l$ is an embedded transformation matrix that is trained in the model training process.
The embedded vectors $h_a^l$ of the adjacent commodity features converted by the above formula can be aggregated by using the aggregation function to obtain the aggregated embedded vector of the adjacent commodity features of the user, that is, the characterization information $e_{A_u}^l$ of the adjacent commodity features of the user:

$$e_{A_u}^l = \text{Aggregate}\big(\{\, h_a^l \mid a \in A_u \,\}\big)$$

where Aggregate represents the aggregation function, $A_u$ represents all the adjacent commodity feature nodes of each user, and $h_a^l$ represents the transformed embedded vectors of the adjacent commodity features of each user.
It will be appreciated that the aggregation function may be a maximum aggregation function, an LSTM aggregation function, a mean aggregation function, or an attention network. Compared with using a maximum aggregation function, an LSTM aggregation function, or a mean aggregation function, using the attention network as the aggregation function to aggregate the characterization information of the adjacent commodity features of the user can capture the importance of each adjacent commodity feature to the user, so that the obtained characterization information of the adjacent commodity feature nodes of each user is more accurate.
An exemplary implementation of aggregating user's neighboring merchandise characteristic information using an attention network as an aggregation function is described below.
First, a scaled dot-product operation is applied to the user characterization information $e_u^{l-1}$ and the commodity feature characterization information $e_{a_i}^{l-1}$ to obtain the coefficient $c_{u,a_i}$ of the commodity feature $a_i$ with respect to the user $u$. The specific operational formula is shown below:

$$c_{u,a_i} = \frac{(e_u^{l-1})^{\top} e_{a_i}^{l-1}}{\sqrt{d}}$$

where $d$ is the dimensionality of the embedded vectors.

Second, the attention value of the commodity feature $a_i$ is calculated. Specifically, the coefficient $c_{u,a_i}$ of the commodity feature $a_i$ is normalized over the adjacent commodity features of the user $u$ by using a softmax function:

$$\alpha_{u,a_i} = \operatorname{softmax}(c_{u,a_i}) = \frac{\exp(c_{u,a_i})}{\sum_{a' \in A_u} \exp(c_{u,a'})}$$

Third, the obtained commodity feature attention values are used to aggregate the adjacent commodity features to obtain the aggregated commodity feature embedded vector $e_{A_u}^l$:

$$e_{A_u}^l = \sum_{a_i \in A_u} \alpha_{u,a_i} \, V_a^l e_{a_i}^{l-1}$$

where $V_a^l$ is an embedded transformation matrix that is trained in the model training process.
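A minimal sketch of the attention aggregation described above (scaled dot-product coefficients, softmax normalization, weighted sum of transformed feature embeddings); where exactly the transformation matrix is applied is an assumption:

```python
import math
import torch
import torch.nn as nn

class FeatureAttentionAggregator(nn.Module):
    """Aggregates the adjacent commodity-feature embeddings of a user with
    scaled dot-product attention."""

    def __init__(self, dim: int = 300):
        super().__init__()
        self.transform = nn.Linear(dim, dim, bias=False)  # embedded transformation matrix

    def forward(self, e_user: torch.Tensor, feat_emb: torch.Tensor) -> torch.Tensor:
        # e_user: (dim,) user embedding; feat_emb: (num_feats, dim) adjacent feature embeddings
        coeff = feat_emb @ e_user / math.sqrt(e_user.shape[-1])  # coefficients c_{u,a}
        attn = torch.softmax(coeff, dim=0)                       # attention values alpha_{u,a}
        h = self.transform(feat_emb)                             # transformed feature embeddings
        return (attn.unsqueeze(-1) * h).sum(dim=0)               # aggregated vector

agg = FeatureAttentionAggregator(dim=300)
e_Au = agg(torch.randn(300), torch.randn(6, 300))  # 6 adjacent commodity features
```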
The following describes an implementation manner of updating the characterization information of the user according to the commodity feature characterization information obtained by aggregation and the commodity characterization information obtained by aggregation.
The characterization information $e_{N(u)}^l$ of the adjacent commodities of the user, the characterization information $e_{A_u}^l$ of the adjacent commodity features, and the characterization information $e_u^{l-1}$ obtained by the user at the previous embedded update layer are merged to obtain the updated characterization information $e_u^l$ of the user at the current embedded update layer.

Illustratively, the characterization information $e_{N(u)}^l$ of the adjacent commodities of the user, the characterization information $e_{A_u}^l$ of the adjacent commodity features, and the characterization information $e_u^{l-1}$ obtained by the user at the previous embedded update layer may be combined by averaging, embedded vector concatenation, or learning-parameter aggregation using a neural network, etc., thereby obtaining the updated characterization information $e_u^l$ of the user at the current embedded update layer.
An expression of the updated characterization information $e_u^l$ of the user at the current embedded update layer is as follows:

$$e_u^l = \sigma\big(W^l \cdot \operatorname{Concat}(e_{N(u)}^l, e_{A_u}^l, e_u^{l-1})\big)$$

where $e_u^l$ represents the updated embedded vector of the user at the current embedded update layer, that is, the updated characterization information of the user, $\sigma$ represents the ReLU function, $W^l$ represents an update weight matrix that is continuously trained in the model training process, Concat represents a merging operation function, $e_{N(u)}^l$ represents the aggregated embedded vector of the adjacent commodities of the user, $e_{A_u}^l$ represents the aggregated embedded vector of the adjacent commodity features of the user, and $e_u^{l-1}$ represents the updated characterization information of the user at the previous embedded update layer.
After the characterization information $e_{N(u)}^l$ of the adjacent commodities of the user, the characterization information $e_{A_u}^l$ of the adjacent commodity features, and the characterization information $e_u^{l-1}$ obtained by the user at the previous embedded update layer are merged by a vector concatenation method, a one-dimensional embedded vector can be obtained, that is, the current embedded update layer can obtain a one-dimensional embedded vector for characterizing the user.
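A minimal sketch of one such embedded update for the user network, assuming the concatenation-based merge described above:

```python
import torch
import torch.nn as nn

class UserEmbeddingUpdate(nn.Module):
    """Concat the aggregated neighbor-commodity vector, the aggregated
    neighbor-feature vector and the previous-layer user vector, then apply
    the update weight matrix and a ReLU."""

    def __init__(self, dim: int = 300):
        super().__init__()
        self.W = nn.Linear(3 * dim, dim)  # update weight matrix W^l

    def forward(self, e_Nu, e_Au, e_u_prev):
        merged = torch.cat([e_Nu, e_Au, e_u_prev], dim=-1)  # vector concatenation (Concat)
        return torch.relu(self.W(merged))                   # updated user embedding

layer = UserEmbeddingUpdate(dim=300)
e_u_new = layer(torch.randn(300), torch.randn(300), torch.randn(300))
```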
And S403, updating the representation information of each commodity based on the representation information of each commodity, the representation information of adjacent users of each commodity and the representation information of adjacent commodity features of each commodity by using a commodity network embedded in an updating layer in the graph neural network.
For example, the characterizing information of all adjacent product features of each product and the characterizing information of all adjacent users of the product may be aggregated, and then the characterizing information of the product may be updated according to the product feature characterizing information obtained by aggregation and the user characterizing information obtained by aggregation.
The implementation of aggregating the characterization information of all neighboring users for each commodity is described first below.
Illustratively, for a commodity, an embedded transformation matrix $V_u^l$ may be used to transform the embedded vector $e_u^{l-1}$ of each adjacent user of the commodity according to the following relation, to ensure that the characterization information of the commodity to be updated and the characterization information of the adjacent users of the commodity are in the same state space:

$$h_u^l = \sigma(V_u^l e_u^{l-1})$$

where $\sigma$ is the ReLU function, and $V_u^l$ is an embedded transformation matrix that is trained in the model training process.
The embedded vectors $h_u^l$ of the adjacent users converted by the above formula are aggregated by using the following aggregation function to obtain the aggregated embedded vector of the adjacent users of the commodity, that is, the characterization information $e_{N(i)}^l$ of the adjacent users of the commodity:

$$e_{N(i)}^l = \text{Aggregate}\big(\{\, h_u^l \mid u \in N(i) \,\}\big)$$

where Aggregate represents the aggregation function, $N(i)$ represents all the adjacent user nodes of each commodity, and $h_u^l$ represents the transformed embedded vectors of the adjacent users of each commodity.
It will be appreciated that the aggregation function may be a mean aggregation function, a maximum aggregation function, an LSTM aggregation function, or an attention network. Compared with the maximum aggregation function, the LSTM aggregation function, or the attention network, using the mean aggregation function to obtain the characterization information of the adjacent users of the commodity helps to avoid overfitting of the graph neural network.
The implementation manner of aggregating the representation information of all the adjacent product features of the product may refer to the implementation manner of aggregating the representation information of all the adjacent product features of the user, and is not described herein again.
In a similar way, the attention network is used as an aggregation function to aggregate the characterization information of the adjacent commodity features of the commodity, so that the importance of each adjacent commodity feature to the commodity can be obtained, and the obtained characterization information of the adjacent commodity feature node of each commodity is more accurate.
The implementation manner of updating the characterization information of the commodity according to the commodity characteristic characterization information obtained by aggregation and the user characterization information obtained by aggregation may refer to the implementation manner of updating the characterization information of the user according to the commodity characteristic characterization information obtained by aggregation and the commodity characterization information obtained by aggregation, and details are not repeated here.
Similarly, after the characterization information of the adjacent users of the commodity, the characterization information of the adjacent commodity features, and the characterization information obtained by the commodity at the previous embedded update layer are combined by a vector concatenation method, a one-dimensional commodity embedded vector can be obtained at the current embedded update layer.
S404, using the commodity feature network embedded in the updating layer in the graph neural network, and updating the characterization information of each commodity feature based on the characterization information of each commodity feature, the characterization information of the adjacent commodities of each commodity feature and the characterization information of the adjacent users of each commodity feature.
In a feasible implementation manner, the characterizing information of all adjacent commodities of each commodity feature and the characterizing information of all adjacent users of the commodity feature may be aggregated, and then the characterizing information of the commodity feature may be updated according to the commodity characterizing information obtained by aggregation and the user characterizing information obtained by aggregation.
Exemplarily, for a commodity feature $a_i$, an embedded transformation matrix $V_a^l$ is used to transform the characterization information $e_{a_i}^{l-1}$ of the commodity feature obtained by the previous update, as shown below:

$$h_{a_i}^l = \sigma(V_a^l e_{a_i}^{l-1})$$
for each adjacent user of the merchandise feature, the following relationship may be used based on the embedded transformation matrix
Figure BDA0002748841360000126
Embedding vectors to neighboring users
Figure BDA0002748841360000127
Performing a transition to ensure that the characterizing information of the merchandise feature to be updated and the characterizing information of the neighboring user of the merchandise feature are in the same state space:
Figure BDA0002748841360000128
where, σ is the ReLU function,
Figure BDA0002748841360000129
to embed the transformation matrix. Embedding a transformation matrix
Figure BDA00027488413600001210
Are trained in a model training process.
The embedded vectors $h_u^l$ of the adjacent users converted by the above formula are aggregated by using the following aggregation function to obtain the aggregated embedded vector of the adjacent users of the commodity feature, that is, the characterization information $e_{A(u)}^l$ of the adjacent users of the commodity feature:

$$e_{A(u)}^l = \text{Aggregate}\big(\{\, h_u^l \mid u \in A(u) \,\}\big)$$

where Aggregate represents the aggregation function, $A(u)$ represents all the adjacent user nodes of each commodity feature, and $h_u^l$ represents the transformed embedded vectors of the adjacent users of each commodity feature.
It will be appreciated that the aggregation function may include a maximum aggregation function, an LSTM aggregation function, or an attention network.
When aggregating the characterization information of all the adjacent commodities of the commodity feature, first, the embedded transformation matrix $V_i^l$ may be used to transform the embedded vector $e_i^{l-1}$ of each adjacent commodity of the commodity feature according to the following relation, to ensure that the characterization information of the commodity feature and the characterization information of the adjacent commodities of the commodity feature are in the same state space:

$$h_i^l = \sigma(V_i^l e_i^{l-1})$$

where $\sigma$ is the ReLU function, and $V_i^l$ is an embedded transformation matrix that is trained in the model training process.
The embedded vectors $h_i^l$ of the adjacent commodities converted by the above formula are aggregated by using the following aggregation function to obtain the aggregated embedded vector of the adjacent commodities of the commodity feature, that is, the characterization information $e_{A(i)}^l$ of the adjacent commodities of the commodity feature:

$$e_{A(i)}^l = \text{Aggregate}\big(\{\, h_i^l \mid i \in A(i) \,\}\big)$$

where Aggregate represents the aggregation function, $A(i)$ represents all the adjacent commodity nodes of each commodity feature, and $h_i^l$ represents the transformed embedded vectors of the adjacent commodities of each commodity feature.
It will be appreciated that the aggregation function may include a maximum aggregation function, an LSTM aggregation function, or an attention network.
After the above aggregation, the characterization information $e_{A(u)}^l$ of the adjacent users of the commodity feature, the characterization information $e_{A(i)}^l$ of the adjacent commodities, and the transformed characterization information $h_{a_i}^l$ of the commodity feature obtained from the previous embedded update layer can be merged to obtain the updated characterization information $e_{a_i}^l$ of the commodity feature at the current embedded update layer.
Illustratively, the characterization information $e_{A(u)}^l$ of the adjacent users of the commodity feature, the characterization information $e_{A(i)}^l$ of the adjacent commodities, and the transformed characterization information $h_{a_i}^l$ of the commodity feature obtained from the previous embedded update layer may be combined by averaging, embedded vector concatenation, or learning-parameter aggregation using a neural network, etc., thereby obtaining the updated characterization information $e_{a_i}^l$ of the commodity feature. An expression of the updated characterization information $e_{a_i}^l$ of the commodity feature is as follows:
$$e_{a_i}^l = \sigma\big(W^l \cdot \operatorname{Concat}(e_{A(u)}^l, e_{A(i)}^l, h_{a_i}^l)\big)$$

where $e_{a_i}^l$ represents the updated embedded vector of the commodity feature, that is, the updated characterization information of the commodity feature, $\sigma$ represents the ReLU function, $W^l$ represents an update weight matrix that is continuously trained in the model training process, Concat represents a merging operation function, $e_{A(u)}^l$ represents the aggregated embedded vector of the adjacent users of the commodity feature, $e_{A(i)}^l$ represents the aggregated embedded vector of the adjacent commodities of the commodity feature, and $h_{a_i}^l$ represents the transformed characterization information of the commodity feature obtained from the previous embedded update layer.
After the characterization information $e_{A(u)}^l$ of the adjacent users of the commodity feature, the characterization information $e_{A(i)}^l$ of the adjacent commodities, and the characterization information of the commodity feature obtained at the previous embedded update layer are merged by a vector concatenation method, the current embedded update layer can obtain a one-dimensional commodity feature embedded vector.
For a pair of a user $u$ and a commodity $i$, the user and the commodity are each likely to have many adjacent commodity features, that is, $a \in A_u \cup A_i$, where $a$ represents an adjacent commodity feature of the user $u$ and the commodity $i$, $A_u$ represents all the adjacent commodity features of the user $u$, and $A_i$ represents all the adjacent commodity features of the commodity $i$.

In each embedded update layer, assuming that all the adjacent commodity features of the user $u$ and the commodity $i$ are updated, the commodity feature network has to update $n_a = |A_u| + |A_i|$ embedded vectors of commodity features, which is different from the user network, which only updates the embedded vector of the user $u$, and the commodity network, which only updates the embedded vector of the commodity $i$.
Ideally, updating the embedded vector of each commodity feature requires aggregating all of its sampled adjacent users and adjacent commodities. However, compared with the user network and the commodity network, updating $n_a$ commodity features requires $O(n_a)$ times the computing resources. In order to make the computing resource consumption of the commodity feature network similar to that of the user network and the commodity network, for each commodity feature, a representative embedded vector of an adjacent user and a representative embedded vector of an adjacent commodity can be selected for aggregation.
As an example, a user u and a commodity i, which currently need to be predicted in user rating, may be selected as representative adjacent users and adjacent commodities, respectively.
For example, the description will be given by taking a representative adjacent user of each of the plurality of product features as a user to be evaluated and a representative adjacent product of each of the product features as a product to be evaluated.
The embedded vector of each commodity feature, that is, the characterization information $e_{a_i}^l$ of each commodity feature, is updated by using the following formula:

$$e_{a_i}^l = \sigma\big(W^l \cdot \operatorname{Concat}(h_i^l, h_u^l, h_{a_i}^l)\big)$$

where $e_{a_i}^l$ represents the updated embedded vector of each commodity feature, that is, the updated characterization information of each commodity feature, $\sigma$ represents the ReLU function, $W^l$ represents an update weight matrix that is continuously trained in the model training process, Concat represents a merging operation function, $h_i^l$ represents the aggregated embedded vector of the commodity to be evaluated for each commodity feature, that is, the characterization information of the commodity to be evaluated of each commodity feature, and $h_u^l$ represents the aggregated embedded vector of the user to be evaluated for each commodity feature, that is, the characterization information of the user to be evaluated of each commodity feature.
That is, the characterization information $h_u^l$ of the representative adjacent user of the commodity feature (i.e., the characterization information of the user to be evaluated), the characterization information $h_i^l$ of the representative adjacent commodity (i.e., the characterization information of the commodity to be evaluated), and the transformed characterization information $h_{a_i}^l$ of the commodity feature obtained from the previous update are merged to obtain the updated characterization information $e_{a_i}^l$ of the commodity feature.
In this implementation manner, only the representative adjacent user and the representative adjacent commodity of each commodity feature are selected for aggregation, so that computing resources can be saved and the calculation time can be reduced.
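A minimal sketch of the representative-neighbor update of a commodity feature, assuming the transformed embeddings of the user to be evaluated, the commodity to be evaluated, and the feature itself are already available:

```python
import torch
import torch.nn as nn

class FeatureUpdateWithRepresentatives(nn.Module):
    """Updates a commodity-feature embedding using only one representative
    adjacent user (the user to be rated) and one representative adjacent
    commodity (the commodity to be rated) instead of all sampled neighbors."""

    def __init__(self, dim: int = 300):
        super().__init__()
        self.W = nn.Linear(3 * dim, dim)  # update weight matrix W^l

    def forward(self, h_item, h_user, h_feat_prev):
        merged = torch.cat([h_item, h_user, h_feat_prev], dim=-1)
        return torch.relu(self.W(merged))  # updated feature embedding

# Updating the n_a = |A_u| + |A_i| features now costs one cheap layer call each,
# instead of a full neighbor aggregation per feature.
upd = FeatureUpdateWithRepresentatives(dim=300)
e_a_new = upd(torch.randn(300), torch.randn(300), torch.randn(300))
```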
In this embodiment, S402 to S404 may be repeatedly executed L times, where L is a positive integer. Generally, L may be set in advance, and may be set according to the accuracy of the prediction rating, for example. Wherein, each time S402 to S404 are executed, it can be understood that one embedded update layer executes a related operation.
S405, predicting first rating information of the commodity to be rated of the user to be rated based on the characterization information output by the L-layer embedded updating layer by using the prediction layer of the graph neural network.
When the prediction layer of the graph neural network is used for generating the first rating information of the commodity to be rated of the user to be rated based on the characterization information output by the embedded updating layer, in some implementation manners, the characterization information output by the embedded updating layer can be merged to obtain merged information, and then the merged information is input into the prediction layer of the graph neural network to obtain the first rating information.
As an example, in order to predict the rating of the commodity $i$ to be rated by the user $u$ to be rated, the characterization information of the $n_a$ adjacent commodity features is first averaged to obtain the common characterization information of all the adjacent commodity features. The calculation formula is as follows:

$$\bar{e}_a^L = \frac{1}{n_a} \sum_{a \in A_u \cup A_i} e_a^L$$

where $\bar{e}_a^L$ represents the common characterization information, $n_a$ represents the number of adjacent commodity features of the user to be rated and the commodity to be rated, and $e_a^L$ represents the characterization information of each commodity feature among the adjacent commodity features of the user to be rated and the commodity to be rated. There are various ways to combine the commodity features, such as taking an average, weighted averaging, or learning-parameter aggregation using a neural network. The learned characterization information $e_u^L$ of the user to be rated, the characterization information $e_i^L$ of the commodity to be rated, and the common characterization information $\bar{e}_a^L$ of the adjacent commodity features are then merged. Similarly, there are a number of ways to merge the embedded vectors, such as averaging, embedded vector concatenation, or learning-parameter aggregation using a neural network. As an example, the following formula is used to generate the prediction of the first rating information:

$$\hat{r}_{u,i} = f_{\text{pred}}\big(\operatorname{Concat}(e_u^L, e_i^L, \bar{e}_a^L)\big)$$

where $f_{\text{pred}}$ denotes the prediction layer of the graph neural network and $\hat{r}_{u,i}$ is the predicted rating.
S406, training the graph neural network according to the first rating information and the real rating information of the user to be rated for the commodity to be rated.
For example, a loss value between the first rating information and the real rating information of the commodity to be rated by the user to be rated is calculated by using a loss function, and the weight parameter in the embedded updating layer and/or the prediction layer in the graph neural network is adjusted based on the loss value.
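A minimal training-step sketch; the embodiment does not name a specific loss function, so a squared-error loss is assumed here, and `model` stands for the graph neural network (embedded update layers plus prediction layer) described above:

```python
import torch
import torch.nn.functional as F

def training_step(model, optimizer, users, items, true_ratings):
    """One step: predict ratings, compute the loss between the predicted (first)
    rating information and the real rating information, and adjust the weight
    parameters of the embedded update layers and the prediction layer."""
    optimizer.zero_grad()
    pred_ratings = model(users, items)             # first rating information
    loss = F.mse_loss(pred_ratings, true_ratings)  # assumed squared-error loss
    loss.backward()                                # gradients w.r.t. all trainable weights
    optimizer.step()                               # adjust the weight parameters
    return loss.item()

# Example optimizer, matching the experimental setup described below:
# optimizer = torch.optim.Adagrad(model.parameters(), lr=0.003)
```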
The application further provides a method for predicting the user rating based on the neural network in one embodiment. Operations similar to S301 and S302 in fig. 3 may be included in this embodiment, or operations similar to S401 to S405 in fig. 4 may be referred to in this embodiment.
According to the technical scheme of the embodiment, a user-commodity-commodity feature three-part graph is constructed according to the commodity features in the historical records of commodities purchased by users and the user comments of the users on the commodities. According to the constructed three-part graph, the characterization information of the users, the commodities, and the commodity features is learned by using the neural network. Specifically, the characterization information of each user is learned from the characterization information of its adjacent commodity features and the characterization information of its adjacent commodities; the characterization information of each commodity is learned from the characterization information of its adjacent commodity features and the characterization information of its adjacent users; and the characterization information of each adjacent commodity feature is learned from the characterization information of its adjacent users and its adjacent commodities. The rating information of the user for the commodity is then predicted by the neural network from the learned user characterization information, commodity characterization information, and adjacent commodity feature characterization information. The rating information is used for representing the rating of the commodity to be rated by the user to be rated. The graph neural network is trained according to the predicted rating information and the real rating information of the commodity to be rated by the user to be rated. According to this technical scheme, the accuracy of user rating prediction can be improved.
To validate the effectiveness of the foregoing graph neural network, the applicant conducted extensive experiments on 25 public datasets from the Amazon and Yelp e-commerce websites. Each data set contains user reviews and user ratings from one domain (e.g., makeup or dining). The applicant picked 8 models that predict user ratings as baseline models (baselines) for comparison with the graph neural network model (ContGraph) in the present application.
These 8 models include: a PMF (Mnih,2008) model, an AARM (Guan,2019) model, an ALFM (Cheng,2018) model, an ANR (chi, 2018) model, a NARRE (Chen,2018) model, an NGCF (Wang,2019) model, a GC-MC (Berg,2017) model, and a GC-MC + Feat (Berg,2017) model.
The following describes the processing of data sets and model evaluation methods.
1) Processing of the data set.
TABLE 1 statistical description of the data set
Table 1 shows the statistics of the 25 data sets. The fifth column of Table 1 (the number of commodity features) shows the number of commodity features extracted from the user comments using the foregoing method. The 25 data sets contain a total of over 22,000,000 user comments. For each data set, the user ratings were randomly divided into a training set, a validation set, and a test set at a ratio of 80:10:10. When training the model using the training set, an adaptive gradient optimizer (Adagrad) is employed to optimize the objective function.
The batch size M is empirically set to 512, the initial learning rate of Adagrad is set to 0.003, and the dimensionality of the initialized embedded vectors of users, commodities, and commodity features is set to 300. At each embedded update layer, 10 neighbor nodes are randomly sampled, that is, $S_l = 10$.
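The experimental setup above can be summarized as a small configuration sketch; the 80:10:10 split helper is illustrative only and not part of the described embodiment:

```python
import random

BATCH_SIZE = 512             # batch size M
LEARNING_RATE = 0.003        # initial learning rate of Adagrad
EMBED_DIM = 300              # dimensionality of user / commodity / feature embeddings
SAMPLED_NEIGHBORS = 10       # S_l: neighbors randomly sampled per embedded update layer

def split_ratings(ratings, seed=0):
    """Randomly split the user ratings of one data set into training,
    validation and test sets at a ratio of 80:10:10."""
    rng = random.Random(seed)
    ratings = list(ratings)
    rng.shuffle(ratings)
    n = len(ratings)
    n_train, n_val = int(0.8 * n), int(0.1 * n)
    return ratings[:n_train], ratings[n_train:n_train + n_val], ratings[n_train + n_val:]

# optimizer = torch.optim.Adagrad(model.parameters(), lr=LEARNING_RATE)
```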
Based on the above data, the aforementioned 8 baseline models and the graph neural network model of the present application were each trained using the training sets, and the 9 trained models were then evaluated using the test sets.
The purpose of the test set is to evaluate the accuracy of the model user rating predictions. To verify the accuracy of the predicted user rating of the model, the present embodiment uses a widely used Mean Absolute Error (MAE) index, i.e., an average of absolute differences between the true user rating and the predicted user rating, to measure the prediction accuracy.
In the following relation, N denotes the number of data items in the test set; the smaller the MAE value, the better the model:

$$\text{MAE} = \frac{1}{N} \sum_{u,i} \left| r_{u,i} - \hat{r}_{u,i} \right|$$

where $r_{u,i}$ represents the true user rating, $\hat{r}_{u,i}$ represents the predicted rating, and MAE is the mean absolute error.
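A minimal sketch of the MAE metric defined above:

```python
import numpy as np

def mean_absolute_error(true_ratings, predicted_ratings):
    """MAE = (1 / N) * sum over the test set of |r_ui - predicted r_ui|."""
    true_ratings = np.asarray(true_ratings, dtype=float)
    predicted_ratings = np.asarray(predicted_ratings, dtype=float)
    return float(np.mean(np.abs(true_ratings - predicted_ratings)))

# Example: lower is better.
print(mean_absolute_error([5, 3, 4], [4.5, 3.2, 4.1]))  # ~0.267
```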
The results of comparing the predicted results of the neural network model of the graph of the present application and the 8 baseline models after the test set test are shown in table 2.
TABLE 2 comparison of neural network model and 8 baseline models in the present application
In Table 2, the second column represents the average MAE value obtained by the corresponding model over the 25 data sets. The experiments show that for each data set, the graph neural network model provided by the present application obtains the best MAE value compared with the other 8 baseline models. Compared with the best baseline model, namely GC-MC + Feat, the graph neural network model provided by the present application improves the user rating prediction accuracy by 7.3% on average.
In order to verify the commodity ranking effect of the graph neural network model provided by the present application, the present embodiment also compares the ranking effect of the graph neural network model provided by the embodiment of the present application and the 8 baseline models on the 25 data sets. The evaluation indexes used in the comparison include the precision and the recall of the top-K ranked commodities, denoted precision@K and recall@K, respectively. As an example, K may be selected from {5, 10, 15, 20}. The larger the values of precision@K and recall@K, the better the ranking effect of the model.
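A minimal sketch of precision@K and recall@K for a single user; treating the commodities the user actually interacted with as the relevant set is an assumption made for the example:

```python
def precision_recall_at_k(ranked_items, relevant_items, k):
    """precision@K and recall@K for one user: ranked_items is the model's ranking,
    relevant_items is the set of commodities considered relevant for the user."""
    top_k = ranked_items[:k]
    hits = len(set(top_k) & set(relevant_items))
    precision = hits / k
    recall = hits / len(relevant_items) if relevant_items else 0.0
    return precision, recall

# Example with K = 5 (K is chosen from {5, 10, 15, 20} in the experiments).
p, r = precision_recall_at_k(["i3", "i7", "i1", "i9", "i4"], {"i1", "i4", "i8"}, k=5)
print(p, r)  # 0.4 0.666...
```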
A comparison of precision@K and recall@K obtained by ranking the commodities based on the prediction results of the graph neural network model provided by the embodiment of the present application and the 8 baseline models is shown in fig. 5. The abscissa of the left graph in fig. 5 represents K, and the ordinate represents the recall; the abscissa of the right graph in fig. 5 represents K, and the ordinate represents the precision. As can be seen from fig. 5, the graph neural network model proposed in the present application achieves the best ranking results.
The embodiment of the present application uses an attention network in the graph neural network model to aggregate the commodity features in user comments so as to learn the embedded vectors of users and commodities. In the process of model learning (training), the characterization information of the users, the commodities, and the commodity features is continuously learned and updated according to the interaction relationships between the users and the user comments and between the commodities and the user comments, so that the attention network can learn which commodity features a user pays attention to and which commodity features of a commodity are of good quality.
In one embodiment of the present application, a clothing, shoes, hats, and jewelry data set is used to verify the validity of the learned attention values. The trained graph neural network model of the embodiment of the present application is run on this data set, so that the attention values of different commodity features in the user network and the commodity network can be obtained.
Taking the graph neural network model of the embodiment of the present application, which randomly samples 10 neighbor nodes in each embedded update layer, as an example, an attention value greater than 0.1 indicates that the corresponding commodity feature is important to the user or of good quality in the commodity, and an attention value less than 0.1 indicates that the corresponding commodity feature is not important to the user or of poor quality in the commodity. This embodiment performed two experiments to evaluate the validity of the obtained attention values.
The top 10 commodity features mentioned by the users and the commodities are summarized according to the number of times each commodity feature appears. For each commodity feature, the distribution of attention values in the user network and in the commodity network is shown using the box plots in fig. 6 and 7. In fig. 6 and 7, the abscissa represents the commodity features, including comfort (fit), size (size), durability (wear), appearance (look), price (price), color (color), quality (quality), footwear (hose), pair (pair), buy (buy), and material (material); the ordinate represents the attention values.
Fig. 6 and 7 show the attention value distributions of the top 10 commodity features in the user network and the commodity network. Fig. 6 illustrates that the same commodity feature has different importance to different users, because the same commodity feature may receive both a low attention value (i.e., an attention value less than 0.1) and a high attention value (i.e., an attention value greater than 0.1) from different users. Judging from the average attention values, users generally care more about the material, quality, and price of a commodity.
Fig. 7 shows that in the commodity network, most commodity features receive attention values greater than 0.1, except for the commodity feature "pair". In user reviews, "pair" is a neutral word that describes an item, such as "a pair of jeans", so users may remain neutral with respect to this commodity feature. The experimental results indicate that frequently mentioned commodity features may not achieve high attention values.
In order to further verify the effectiveness of the attention values obtained by the graph neural network model of the embodiment of the present application, a manual analysis was performed: 377 attention values were randomly selected from the obtained attention values according to a statistical method, and two evaluators independently verified the correctness of the obtained attention values by manually reading the user comments.
An attention value is considered correct if it is greater than 0.1 and its corresponding merchandise feature gets a positive user emotion in the user's comment, i.e., the user likes the merchandise feature's performance in the merchandise, or the attention value is less than 0.1 and its corresponding merchandise feature gets a neutral or negative emotion in the user's comment, i.e., the user does not like the merchandise feature's performance in the merchandise.
The experimental results show that 75% of the learned attention values are correct. Therefore, the graph neural network model of the embodiments of the present application can learn representative attention values to aggregate the commodity feature embedding vectors.
After the neural network is trained by the method shown in fig. 3 or fig. 4, the neural network can be used to predict the user rating of the commodity by the user.
A schematic flow chart diagram of a method of predicting user ratings based on a neural network of one embodiment of the present application is shown in fig. 8. The method may include S801 and S802.
S801, generating target characterization information by using a neural network based on the characterization information of the adjacent commodity features of each user in a plurality of users, the characterization information of the adjacent commodities of each user, the characterization information of the adjacent commodity features of each commodity in a plurality of commodities, the characterization information of the adjacent users of each commodity, and the characterization information of the adjacent users and the adjacent commodities of each commodity feature in the adjacent commodity features of each user and of each commodity, wherein the target characterization information comprises the characterization information of the user to be evaluated, the characterization information of the commodity to be evaluated, the characterization information of the adjacent commodity features of the commodity to be evaluated, and the characterization information of the adjacent commodity features of the user to be evaluated.
This step may refer to S301 in fig. 3, or S401 to S404 in fig. 4, which is not described herein again.
S802, obtaining, by using the neural network and based on the target characterization information, first rating information of the user to be rated for the commodity to be rated. The first rating information is used for representing or indicating the rating of the commodity to be rated by the user to be rated.
This step may refer to S302 in fig. 3 or S405 in fig. 4, which is not described herein again.
As shown in fig. 9, the present embodiment provides a system architecture 900. The data collection device 960 is configured to collect user data and commodity data and store the user data and commodity data in the database 930, and the training device 920 generates the target model/rule 901 based on the user data and the commodity data maintained in the database 930. The user data can comprise historical shopping information of the user and historical comments of the user on the commodity, and the commodity data comprises historical shopping information of the purchased commodity and historical user comments of the commodity.
The training device 920 may perform the method shown in fig. 3 or fig. 4, so as to train the target model/rule 901 for predicting the rating information of the user on the commodity.
The goal models/rules obtained by the training device 920 may be applied in different systems or devices. In fig. 9, the execution device 910 is configured with an I/O interface 912 for data interaction with an external device, and a "user" can input user information and inquired commodity information to the I/O interface 912 through a client device 940.
The execution device 910 may call data, code, etc. from the data storage system 950 and may store data, instructions, etc. in the data storage system 950.
The calculation module 911 processes the input user information and commodity information using the target model/rule 901, thereby obtaining user rating information of the commodity by the user. For example, the calculation module 911 may execute S301 and S302 in fig. 3, or execute S401 to S405 in fig. 4, thereby obtaining the user rating information.
Finally, I/O interface 912 returns the results of the processing to client device 940 for presentation to the user.
In the case shown in FIG. 9, the user may manually specify the data to be input into the execution device 910, for example, by operating in an interface provided by the I/O interface 912. Alternatively, the client device 940 may automatically input data to the I/O interface 912 and obtain the results; if the client device 940 is required to obtain the user's authorization for automatically inputting data, the user may set the corresponding permissions in the client device 940. The user may view the results output by the execution device 910 at the client device 940, and the specific presentation form may be a display, a sound, an action, and the like. The client device 940 may also be used as a data collection end to store the collected user information and the information of the goods to be queried in the database 930.
It should be noted that fig. 9 is only a schematic diagram of a system architecture provided in an embodiment of the present application, and the position relationship between the devices, modules, and the like shown in the diagram does not constitute any limitation, for example, in fig. 9, the data storage system 950 is an external memory with respect to the execution device 910, and in other cases, the data storage system 950 may also be disposed in the execution device 910.
Fig. 10 and 11 are schematic structural diagrams of possible devices provided by embodiments of the present application. These devices can be used to implement the above-described method and thus also achieve the beneficial effects of the above-described method embodiments.
In the embodiment of the present application, the apparatus shown in fig. 10 or fig. 11 may be the training device 920 or the execution device 910 shown in fig. 9, and may also be a module (e.g., a chip) applied to the training device or the execution device.
The apparatus 1000 may include a processing module 1001 and a rating module 1002. The processing module 1001 may also be referred to as a processing unit 1001 and the rating module 1002 may also be referred to as a rating unit 1002.
In one implementation, the apparatus 1000 may be used to implement the method illustrated in FIG. 8 described above. For example, the processing module 1001 is for implementing S801 and the rating module 1002 is for implementing S802.
In another implementation, the apparatus 1000 may further include a training module. The apparatus 1000 in this implementation may be used to implement the method illustrated in fig. 3 described above. For example, the processing module 1001 is used to implement S301, the rating module 1002 is used to implement S302, and the training module may be used to implement S303.
In yet another implementation, the apparatus 1000 may further include a training module. The apparatus 1000 in this implementation may be used to implement the method illustrated in fig. 4 described above. For example, the processing module 1001 is configured to implement S401 to S404, the rating module 1002 is configured to implement S405, and the training module is configured to implement S406.
Fig. 11 is a schematic structural diagram of an apparatus according to another embodiment of the present application. The apparatus 1100 shown in fig. 11 may be used to perform the method described in any of the previous embodiments.
As shown in fig. 11, the apparatus 1100 of the present embodiment includes: memory 1101, processor 1102, communication interface 1103, and bus 1104. The memory 1101, the processor 1102 and the communication interface 1103 are communicatively connected to each other through a bus 1104.
The memory 1101 may be a Read Only Memory (ROM), a static memory device, a dynamic memory device, or a Random Access Memory (RAM). The memory 1101 may store programs and when the programs stored in the memory 1101 are executed by the processor 1102, the processor 1102 may be configured to perform the steps of the methods illustrated in fig. 3, 4 or 8.
The processor 1102 may be a general Central Processing Unit (CPU), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits, configured to execute related programs to implement the method for predicting user ratings or the neural network training method according to the embodiments of the present application.
The processor 1102 may also be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the method of the embodiments of the present application may be implemented by integrated logic circuits of hardware or instructions in the form of software in the processor 1102.
The processor 1102 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in the memory 1101, and the processor 1102 reads the information in the memory 1101 and completes, in combination with its hardware, the functions required to be performed by each method in the embodiments of the present application; for example, each step/function of the embodiments shown in fig. 3, fig. 4, or fig. 8 may be performed.
The communication interface 1103 may enable communication between the apparatus 1100 and other devices or communication networks using, but not limited to, transceiver devices.
Bus 1104 may include a path that conveys information between various components of apparatus 1100 (e.g., memory 1101, processor 1102, communication interface 1103).
It should be understood that the apparatus 1100 shown in the embodiments of the present application may be an electronic device, or may also be a chip configured in an electronic device.
It should be understood that the processor in the embodiments of the present application may be a Central Processing Unit (CPU), and the processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It will also be appreciated that the memory in the embodiments of the subject application can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The non-volatile memory may be a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an electrically Erasable EPROM (EEPROM), or a flash memory. Volatile memory can be Random Access Memory (RAM), which acts as external cache memory. By way of example, but not limitation, many forms of Random Access Memory (RAM) are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct bus RAM (DR RAM).
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. The procedures or functions according to the embodiments of the present application are wholly or partially generated when the computer instructions or the computer program are loaded or executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on a computer readable storage medium or transmitted from one computer readable storage medium to another computer readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains one or more collections of available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium. The semiconductor medium may be a solid state disk.
It should be understood that the term "and/or" herein is merely one type of association relationship that describes an associated object, meaning that three relationships may exist, e.g., a and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. In addition, the "/" in this document generally indicates that the former and latter associated objects are in an "or" relationship, but may also indicate an "and/or" relationship, which may be understood with particular reference to the former and latter text.
In the present application, "at least one" means one or more, "a plurality" means two or more. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: u disk, removable hard disk, read only memory, random access memory, magnetic or optical disk, etc. for storing program codes.
The foregoing descriptions are merely specific embodiments of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (19)

1. A method for predicting user ratings based on a neural network, comprising:
generating target characterization information by using a neural network based on: characterization information of adjacent commodity features of each user of a plurality of users, characterization information of adjacent commodities of each user, characterization information of adjacent commodity features of each commodity of a plurality of commodities, characterization information of adjacent users of each commodity, characterization information of the adjacent users and adjacent commodities of each commodity feature among the adjacent commodity features of each user, and characterization information of the adjacent users and adjacent commodities of each commodity feature among the adjacent commodity features of each commodity, wherein the target characterization information comprises characterization information of a user to be rated, characterization information of a commodity to be rated, characterization information of adjacent commodity features of the commodity to be rated, and characterization information of adjacent commodity features of the user to be rated;
obtaining first rating information based on the target characterization information by using the neural network, wherein the first rating information represents a user rating of the commodity to be rated by the user to be rated;
wherein the plurality of users comprise users corresponding to a plurality of user nodes that are connected through commodity nodes and/or commodity feature nodes in a tripartite graph; the adjacent commodity features of each user comprise commodity features corresponding to the commodity feature nodes connected with the user node corresponding to the user in the tripartite graph; the adjacent commodities of each user comprise commodities corresponding to the commodity nodes connected with the user node corresponding to the user in the tripartite graph; the plurality of commodities comprise commodities corresponding to a plurality of commodity nodes that are connected through user nodes and/or commodity feature nodes in the tripartite graph; the adjacent commodity features of each commodity comprise commodity features corresponding to the commodity feature nodes connected with the commodity node corresponding to the commodity in the tripartite graph; and the adjacent users of each commodity comprise users corresponding to the user nodes connected with the commodity node corresponding to the commodity in the tripartite graph;
the tripartite graph comprises M1 user nodes, M2 commodity nodes and M3 commodity feature nodes; the M1 user nodes correspond one-to-one to M1 users, the M2 commodity nodes correspond one-to-one to M2 commodities, and the M3 commodity feature nodes correspond one-to-one to M3 commodity features; the user node corresponding to a u-th user among the M1 users is connected with the commodity nodes corresponding to the commodities reviewed and/or purchased by the u-th user, and the user node corresponding to the u-th user is connected with the commodity feature nodes corresponding to the commodity features reviewed by the u-th user; the commodity node corresponding to an i-th commodity among the M2 commodities is connected with the commodity feature nodes corresponding to the reviewed commodity features of the i-th commodity; and M1, M2 and M3 are each an integer greater than 1, u is a positive integer less than or equal to M1, and i is a positive integer less than or equal to M2.
2. The method of claim 1, wherein the neural network is a graph neural network, and wherein the method further comprises:
initializing the characterization information of each user, the characterization information of each commodity, the characterization information of the adjacent commodity features of each user, and the characterization information of the adjacent commodity features of each commodity;
correspondingly, the generating of the target characterization information based on the characterization information of the adjacent commodity features of each user of the plurality of users, the characterization information of the adjacent commodities of each user, the characterization information of the adjacent commodity features of each commodity of the plurality of commodities, the characterization information of the adjacent users of each commodity, and the characterization information of the adjacent users and adjacent commodities of each commodity feature among the adjacent commodity features of each user and among the adjacent commodity features of each commodity comprises:
step one, updating, by using an embedded update layer of the graph neural network, the characterization information of each user based on the characterization information of each user, the characterization information of the adjacent commodity features of each user, and the characterization information of the adjacent commodities of each user;
step two, updating, by using the embedded update layer of the graph neural network, the characterization information of each commodity based on the characterization information of each commodity, the characterization information of the adjacent users of each commodity, and the characterization information of the adjacent commodity features of each commodity;
and step three, updating, by using the embedded update layer of the graph neural network, the characterization information of each commodity feature of a plurality of commodity features based on the characterization information of each commodity feature, the characterization information of the first adjacent commodity of each commodity feature, and the characterization information of the first adjacent user of each commodity feature.
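For orientation only, the three update steps of claim 2 can be sketched as a single pass over the three node types. The numpy code below is a simplified illustration under an assumed mean-aggregation rule; the embedded update layer actually recited uses the attention-weighted fusion of claims 4 and 5, and every function and variable name here is hypothetical.

import numpy as np

def aggregate(emb, neighbor_ids, dim):
    # mean of the neighbours' characterization vectors (zero vector if none)
    if not neighbor_ids:
        return np.zeros(dim)
    return np.mean([emb[n] for n in neighbor_ids], axis=0)

def embedded_update_layer(user_emb, item_emb, feat_emb,
                          user_feats, user_items,
                          item_users, item_feats,
                          feat_users, feat_items, dim):
    # step one: each user from its own vector, its adjacent commodity features
    # and its adjacent commodities
    new_user = {u: (v + aggregate(feat_emb, user_feats.get(u, []), dim)
                      + aggregate(item_emb, user_items.get(u, []), dim)) / 3.0
                for u, v in user_emb.items()}
    # step two: each commodity from its own vector, its adjacent users and its
    # adjacent commodity features
    new_item = {i: (v + aggregate(user_emb, item_users.get(i, []), dim)
                      + aggregate(feat_emb, item_feats.get(i, []), dim)) / 3.0
                for i, v in item_emb.items()}
    # step three: each commodity feature from its first adjacent commodities
    # and first adjacent users
    new_feat = {f: (v + aggregate(item_emb, feat_items.get(f, []), dim)
                      + aggregate(user_emb, feat_users.get(f, []), dim)) / 3.0
                for f, v in feat_emb.items()}
    return new_user, new_item, new_feat

dim = 4
out = embedded_update_layer({"u1": np.ones(dim)}, {"i1": np.zeros(dim)}, {"f1": np.full(dim, 2.0)},
                            user_feats={"u1": ["f1"]}, user_items={"u1": ["i1"]},
                            item_users={"i1": ["u1"]}, item_feats={"i1": ["f1"]},
                            feat_users={"f1": ["u1"]}, feat_items={"f1": ["i1"]}, dim=dim)
print(out[0]["u1"])  # updated user characterization for u1

Stacking L such passes would correspond to the L embedded update layers of claim 3, with L matching the maximum connection relationship length used to update the user to be rated.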
3. The method according to claim 2, wherein the graph neural network comprises L embedded update layers, and L is equal to the maximum connection relationship length, in the tripartite graph, between the node corresponding to the user to be rated and the nodes used for updating the characterization information of the user to be rated.
4. The method of claim 2 or 3, wherein the updating, by using the embedded update layer of the graph neural network, the characterization information of each user based on the characterization information of each user, the characterization information of the adjacent commodity features of each user and the characterization information of the adjacent commodities of each user comprises:
obtaining, by using an attention network in the embedded update layer and based on the characterization information of each of n1 adjacent commodity features of each user, an attention value of each user for each of the n1 adjacent commodity features;
fusing, by using the embedded update layer and according to the attention value of each of the n1 adjacent commodity features, the characterization information of the n1 adjacent commodity features to obtain first commodity feature information;
and merging, by using the embedded update layer, the characterization information of each user, the first commodity feature information and the characterization information of the adjacent commodities of each user to obtain user characterization information, wherein the characterization information of each user is updated to the user characterization information.
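The attention step of claim 4 can be read, for one user, as scoring each of its n1 adjacent commodity features, normalizing the scores, and fusing the feature vectors with those weights before merging. The sketch below assumes a single-head additive attention network and a concatenation-based merge; the weight shapes and all names are illustrative assumptions, not the parameterization disclosed here.

import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def update_user(user_vec, neigh_feat_vecs, neigh_item_vecs, W_att, v_att, W_merge):
    # attention value of this user for each of its n1 adjacent commodity features
    scores = np.array([v_att @ np.tanh(W_att @ np.concatenate([user_vec, f]))
                       for f in neigh_feat_vecs])
    alpha = softmax(scores)
    # fuse the n1 adjacent commodity feature vectors into first commodity feature information
    first_feat_info = (alpha[:, None] * np.array(neigh_feat_vecs)).sum(axis=0)
    # merge the user vector, the fused feature information and the adjacent
    # commodities into the updated user characterization information
    neigh_item_mean = np.mean(neigh_item_vecs, axis=0)
    merged = np.concatenate([user_vec, first_feat_info, neigh_item_mean])
    return np.tanh(W_merge @ merged)

d = 8
rng = np.random.default_rng(0)
user_vec = rng.normal(size=d)
feats = [rng.normal(size=d) for _ in range(3)]   # n1 = 3 adjacent commodity features
items = [rng.normal(size=d) for _ in range(2)]   # adjacent commodities
W_att, v_att = rng.normal(size=(d, 2 * d)), rng.normal(size=d)
W_merge = rng.normal(size=(d, 3 * d))
print(update_user(user_vec, feats, items, W_att, v_att, W_merge).shape)  # (8,)

Claim 5 mirrors this step on the commodity side: each commodity attends over its n2 adjacent commodity features and merges in its adjacent users instead of its adjacent commodities.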
5. The method of any one of claims 2 to 4, wherein the updating, by using the embedded update layer of the graph neural network, the characterization information of each commodity based on the characterization information of each commodity, the characterization information of the adjacent users of each commodity and the characterization information of the adjacent commodity features of each commodity comprises:
obtaining, by using the attention network in the embedded update layer and based on the characterization information of each of n2 adjacent commodity features of each commodity, an attention value of each commodity for each of the n2 adjacent commodity features;
fusing, by using the embedded update layer and according to the attention value of each of the n2 adjacent commodity features, the characterization information of the n2 adjacent commodity features to obtain second commodity feature information;
and merging, by using the embedded update layer, the characterization information of each commodity, the second commodity feature information and the characterization information of the adjacent users of each commodity to obtain commodity characterization information, wherein the characterization information of each commodity is updated to the commodity characterization information.
6. The method according to any one of claims 2 to 5, wherein the first adjacent user is the user to be rated and the first adjacent commodity is the commodity to be rated.
7. The method of any one of claims 1 to 6, wherein the obtaining, by using the neural network, the first rating information based on the target characterization information comprises:
merging the target characterization information by using the neural network to obtain merged information;
predicting, by using the neural network, the first rating information based on the merged information.
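As a rough sketch of claim 7, the merging can be read as concatenating the four pieces of target characterization information and the prediction as passing the merged vector through a small feed-forward network. The layer sizes, ReLU activation and names below are assumptions for illustration, not details disclosed in the claims.

import numpy as np

def predict_rating(user_vec, item_vec, item_feat_vec, user_feat_vec,
                   W1, b1, w2, b2):
    # merge the four pieces of target characterization information
    merged = np.concatenate([user_vec, item_vec, item_feat_vec, user_feat_vec])
    hidden = np.maximum(0.0, W1 @ merged + b1)   # ReLU hidden layer (assumed)
    return float(w2 @ hidden + b2)               # first rating information (a scalar score)

# example shapes: four 8-dimensional vectors, a 16-unit hidden layer
d, h = 8, 16
rng = np.random.default_rng(1)
vecs = [rng.normal(size=d) for _ in range(4)]
W1, b1 = rng.normal(size=(h, 4 * d)), np.zeros(h)
w2, b2 = rng.normal(size=h), 0.0
print(predict_rating(*vecs, W1, b1, w2, b2))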
8. The method according to any one of claims 1 to 7, further comprising:
training the neural network based on the first rating information and real rating information of the user to be rated for the commodity to be rated.
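Claim 8 compares the predicted first rating information with the real rating. A minimal sketch of one training step, assuming a squared-error loss and plain gradient descent on a linear output layer (the loss, learning rate and names are assumptions, not the training objective of this application):

import numpy as np

def train_step(merged, real_rating, w, b, lr=0.01):
    # merged: merged target characterization information for one
    # (user to be rated, commodity to be rated) pair
    pred = float(w @ merged + b)      # first rating information
    err = pred - real_rating
    loss = 0.5 * err ** 2             # squared-error loss (assumed objective)
    w = w - lr * err * merged         # gradient step on the output weights
    b = b - lr * err                  # gradient step on the bias
    return loss, w, b

rng = np.random.default_rng(2)
merged = rng.normal(size=32)
w, b = rng.normal(size=32), 0.0
for _ in range(5):
    loss, w, b = train_step(merged, real_rating=4.0, w=w, b=b)
print(round(loss, 4))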
9. An apparatus for predicting user ratings based on a neural network, comprising:
a processing module, configured to generate target characterization information by using a neural network based on: characterization information of adjacent commodity features of each user of a plurality of users, characterization information of adjacent commodities of each user, characterization information of adjacent commodity features of each commodity of a plurality of commodities, characterization information of adjacent users of each commodity, characterization information of the adjacent users and adjacent commodities of each commodity feature among the adjacent commodity features of each user, and characterization information of the adjacent users and adjacent commodities of each commodity feature among the adjacent commodity features of each commodity, wherein the target characterization information comprises characterization information of a user to be rated, characterization information of a commodity to be rated, characterization information of adjacent commodity features of the commodity to be rated, and characterization information of adjacent commodity features of the user to be rated;
a rating module, configured to obtain first rating information based on the target characterization information by using the neural network, wherein the first rating information represents a user rating of the commodity to be rated by the user to be rated;
wherein the plurality of users comprise users corresponding to a plurality of user nodes that are connected through commodity nodes and/or commodity feature nodes in a tripartite graph; the adjacent commodity features of each user comprise commodity features corresponding to the commodity feature nodes connected with the user node corresponding to the user in the tripartite graph; the adjacent commodities of each user comprise commodities corresponding to the commodity nodes connected with the user node corresponding to the user in the tripartite graph; the plurality of commodities comprise commodities corresponding to a plurality of commodity nodes that are connected through user nodes and/or commodity feature nodes in the tripartite graph; the adjacent commodity features of each commodity comprise commodity features corresponding to the commodity feature nodes connected with the commodity node corresponding to the commodity in the tripartite graph; and the adjacent users of each commodity comprise users corresponding to the user nodes connected with the commodity node corresponding to the commodity in the tripartite graph;
the tripartite graph comprises M1 user nodes, M2 commodity nodes and M3 commodity feature nodes; the M1 user nodes correspond one-to-one to M1 users, the M2 commodity nodes correspond one-to-one to M2 commodities, and the M3 commodity feature nodes correspond one-to-one to M3 commodity features; the user node corresponding to a u-th user among the M1 users is connected with the commodity nodes corresponding to the commodities reviewed and/or purchased by the u-th user, and the user node corresponding to the u-th user is connected with the commodity feature nodes corresponding to the commodity features reviewed by the u-th user; the commodity node corresponding to an i-th commodity among the M2 commodities is connected with the commodity feature nodes corresponding to the reviewed commodity features of the i-th commodity; and M1, M2 and M3 are each an integer greater than 1, u is a positive integer less than or equal to M1, and i is a positive integer less than or equal to M2.
10. The apparatus of claim 9, wherein the neural network is a graph neural network, wherein the apparatus further comprises an initialization module;
the initialization module is configured to initialize the characterization information of each user, the characterization information of each commodity, the characterization information of the adjacent commodity features of each user, and the characterization information of the adjacent commodity features of each commodity;
correspondingly, the processing module is specifically configured to perform the following steps:
step one, updating, by using an embedded update layer of the graph neural network, the characterization information of each user based on the characterization information of each user, the characterization information of the adjacent commodity features of each user, and the characterization information of the adjacent commodities of each user;
step two, updating, by using the embedded update layer of the graph neural network, the characterization information of each commodity based on the characterization information of each commodity, the characterization information of the adjacent users of each commodity, and the characterization information of the adjacent commodity features of each commodity;
and step three, updating, by using the embedded update layer of the graph neural network, the characterization information of each commodity feature of a plurality of commodity features based on the characterization information of each commodity feature, the characterization information of the first adjacent commodity of each commodity feature, and the characterization information of the first adjacent user of each commodity feature.
11. The apparatus according to claim 10, wherein the graph neural network comprises L embedded update layers, and L is equal to the maximum connection relationship length, in the tripartite graph, between the node corresponding to the user to be rated and the nodes used for updating the characterization information of the user to be rated.
12. The apparatus according to claim 10 or 11, wherein the processing module is specifically configured to:
obtaining, by using an attention network in the embedded update layer and based on the characterization information of each of n1 adjacent commodity features of each user, an attention value of each user for each of the n1 adjacent commodity features;
fusing, by using the embedded update layer and according to the attention value of each of the n1 adjacent commodity features, the characterization information of the n1 adjacent commodity features to obtain first commodity feature information;
and merging, by using the embedded update layer, the characterization information of each user, the first commodity feature information and the characterization information of the adjacent commodities of each user to obtain user characterization information, wherein the characterization information of each user is updated to the user characterization information.
13. The apparatus according to any one of claims 10 to 12, wherein the processing module is specifically configured to:
obtaining, by using the attention network in the embedded update layer and based on the characterization information of each of n2 adjacent commodity features of each commodity, an attention value of each commodity for each of the n2 adjacent commodity features;
fusing, by using the embedded update layer and according to the attention value of each of the n2 adjacent commodity features, the characterization information of the n2 adjacent commodity features to obtain second commodity feature information;
and merging, by using the embedded update layer, the characterization information of each commodity, the second commodity feature information and the characterization information of the adjacent users of each commodity to obtain commodity characterization information, wherein the characterization information of each commodity is updated to the commodity characterization information.
14. The apparatus according to any one of claims 10 to 13, wherein the first adjacent user is the user to be rated and the first adjacent commodity is the commodity to be rated.
15. The apparatus according to any one of claims 9 to 14, wherein the rating module is specifically configured to:
merging the target characterization information by using the neural network to obtain merged information;
predicting, by using the neural network, the first rating information based on the merged information.
16. The apparatus of any one of claims 9 to 15, further comprising a training module, configured to:
train the neural network based on the first rating information and real rating information of the user to be rated for the commodity to be rated.
17. An apparatus for predicting user ratings based on a neural network, comprising: a memory and a processor;
the memory is to store program instructions;
the processor is configured to invoke program instructions in the memory to perform the method of any of claims 1 to 8.
18. A chip comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the at least one processor being configured to execute a computer program or instructions to perform the method of any one of claims 1 to 8.
19. A computer-readable medium, characterized in that the computer-readable medium stores program code for computer execution, the program code comprising instructions for performing the method of any of claims 1 to 8.
CN202011176586.9A 2020-10-28 2020-10-28 Method and device for predicting user rating based on graph neural network Pending CN112488355A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011176586.9A CN112488355A (en) 2020-10-28 2020-10-28 Method and device for predicting user rating based on graph neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011176586.9A CN112488355A (en) 2020-10-28 2020-10-28 Method and device for predicting user rating based on graph neural network

Publications (1)

Publication Number Publication Date
CN112488355A true CN112488355A (en) 2021-03-12

Family

ID=74927275

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011176586.9A Pending CN112488355A (en) 2020-10-28 2020-10-28 Method and device for predicting user rating based on graph neural network

Country Status (1)

Country Link
CN (1) CN112488355A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110956492A (en) * 2019-11-13 2020-04-03 福建省建瓯第一中学 Commodity pushing method based on big data science and dynamic weight adjustment
CN111523047A (en) * 2020-04-13 2020-08-11 中南大学 Multi-relation collaborative filtering algorithm based on graph neural network
CN111681067A (en) * 2020-04-17 2020-09-18 清华大学 Long-tail commodity recommendation method and system based on graph attention network
CN111695719A (en) * 2020-04-20 2020-09-22 清华大学 User value prediction method and system
CN111695965A (en) * 2020-04-26 2020-09-22 清华大学 Product screening method, system and equipment based on graph neural network
CN111753207A (en) * 2020-06-29 2020-10-09 华东师范大学 Collaborative filtering model of neural map based on comments

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113869992A (en) * 2021-12-03 2021-12-31 平安科技(深圳)有限公司 Artificial intelligence based product recommendation method and device, electronic equipment and medium
CN113869992B (en) * 2021-12-03 2022-03-18 平安科技(深圳)有限公司 Artificial intelligence based product recommendation method and device, electronic equipment and medium

Similar Documents

Publication Publication Date Title
Chaudhuri et al. On the platform but will they buy? Predicting customers' purchase behavior using deep learning
CN111784455B (en) Article recommendation method and recommendation equipment
Nilashi et al. Analysis of travellers’ online reviews in social networking sites using fuzzy logic approach
Bali et al. R: Unleash machine learning techniques
EP4181026A1 (en) Recommendation model training method and apparatus, recommendation method and apparatus, and computer-readable medium
WO2023011382A1 (en) Recommendation method, recommendation model training method, and related product
CN113256367B (en) Commodity recommendation method, system, equipment and medium for user behavior history data
CN110008397B (en) Recommendation model training method and device
Wang et al. Perceiving the next choice with comprehensive transaction embeddings for online recommendation
CN109034960B (en) Multi-attribute inference method based on user node embedding
CN112308650B (en) Recommendation reason generation method, device, equipment and storage medium
CN112488863B (en) Dangerous seed recommendation method and related equipment in user cold start scene
CN109584006B (en) Cross-platform commodity matching method based on deep matching model
CN110321473B (en) Multi-modal attention-based diversity preference information pushing method, system, medium and device
Hong et al. Development of a new knowledge-based fabric recommendation system by integrating the collaborative design process and multi-criteria decision support
CN113918832A (en) Graph convolution collaborative filtering recommendation system based on social relationship
CN111967924A (en) Commodity recommendation method, commodity recommendation device, computer device, and medium
CN113918834A (en) Graph convolution collaborative filtering recommendation method fusing social relations
Sengupta et al. Simple surveys: Response retrieval inspired by recommendation systems
Zeng et al. Collaborative filtering via heterogeneous neural networks
CN112488355A (en) Method and device for predicting user rating based on graph neural network
Zhang et al. Garment recommendation in an e-shopping environment by using a Markov Chain and Complex Network integrated method
CN114429384B (en) Intelligent product recommendation method and system based on e-commerce platform
CN113656589B (en) Object attribute determining method, device, computer equipment and storage medium
CN110020195A (en) Article recommended method and device, storage medium, electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination