CN116712736A - User recommendation method and related device - Google Patents

User recommendation method and related device

Info

Publication number
CN116712736A
CN116712736A (application CN202210192040.5A)
Authority
CN
China
Prior art keywords
user
node
user characteristics
neighbor
target node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210192040.5A
Other languages
Chinese (zh)
Inventor
林文清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202210192040.5A
Publication of CN116712736A
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70 Game security or game management aspects
    • A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/795 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for finding other players; for building a team; for providing a buddy list
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/08 Learning methods

Abstract

The application discloses a user recommendation method suitable for friend recommendation within applications, such as user recommendation in game software. The method can be applied to various scenarios such as cloud technology, artificial intelligence, intelligent transportation, maps, and assisted driving. The method comprises the following steps: processing the user characteristics of a first user and the user characteristics of a second user with a graph neural network model to obtain a friend probability, where the friend probability indicates the probability that the second user and the first user will form a friend relationship. A target user to recommend to the first user is then determined based on the friend probability, the target user being one of the second users. Because the training samples used to train the graph neural network model include the user characteristics of multiple users that have friend relationships, using the model effectively improves the accuracy of the obtained friend probabilities, which makes user recommendation more targeted and improves the user experience.

Description

User recommendation method and related device
Technical Field
The application relates to the field of Internet technologies, and in particular to a user recommendation method and a related device.
Background
With the continuous development of the Internet, more and more users communicate or interact in game software, and different players (i.e., users) are connected by adding each other as game friends.
At present, a neural network model is usually trained on the user characteristics of each game player, and game friends are then recommended to a target user with the trained model.
Because only the user characteristic information of individual game players is considered, players who are unlikely to become friends may be recommended to the target user, so the recommendations are poorly targeted and the user experience suffers.
Disclosure of Invention
In view of this, the present application provides, in one aspect, a method for user recommendation, including:
training the graph neural network model using training samples, where the training samples of the graph neural network model comprise: the user characteristics of the users corresponding to a first neighbor node set, the first neighbor node set comprising one or more neighbor nodes, where the users corresponding to the neighbor nodes in the first neighbor node set have a friend relationship with the user corresponding to a target node;
acquiring the user characteristics of a first user and the user characteristics of a second user, where the user characteristics comprise one or more of the following: user activity features, user gaming features, user payment features, or user social features;
processing the user characteristics of the first user and the user characteristics of the second user with the graph neural network model to obtain a friend probability, where the friend probability indicates the probability that the second user and the first user will form a friend relationship;
and determining a target user to recommend to the first user according to one or more friend probabilities, where the target user belongs to the second users.
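As an illustrative aid only (not part of the claims), the scoring-and-ranking step above can be sketched as follows. A dot product passed through a sigmoid is one common way to turn two users' representations into a link probability; the scoring function, embedding values, and all names here are assumptions, since the claims do not fix how the graph neural network computes the friend probability.

```python
import math

def friend_probability(first_emb, second_emb):
    # Dot-product score passed through a sigmoid: an illustrative way to
    # map two node embeddings to a probability in (0, 1).
    score = sum(a * b for a, b in zip(first_emb, second_emb))
    return 1.0 / (1.0 + math.exp(-score))

def recommend(first_emb, candidates, top_k=2):
    # Rank candidate second users by friend probability, return the top-k.
    ranked = sorted(candidates.items(),
                    key=lambda kv: friend_probability(first_emb, kv[1]),
                    reverse=True)
    return [user for user, _ in ranked[:top_k]]

candidates = {"u2": [0.9, 0.1], "u3": [0.1, 0.9], "u4": [0.5, 0.5]}
print(recommend([1.0, 0.0], candidates, top_k=2))  # → ['u2', 'u4']
```

The `top_k` cutoff stands in for whatever selection rule the implementation applies to the friend probabilities when choosing the target user.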
In the embodiments of the present application, the user characteristics of the first user and the user characteristics of the second user are processed with the graph neural network model to obtain a friend probability, where the friend probability indicates the probability that the second user and the first user will form a friend relationship. A target user to recommend to the first user is then determined based on the friend probability, the target user being one of the second users. Because the training samples used to train the graph neural network model include the user characteristics of multiple users that have friend relationships, using the model effectively improves the accuracy of the obtained friend probabilities, which makes user recommendation more targeted and improves the user experience.
Another aspect of the present application provides a user recommendation apparatus, including:
a transceiver module, configured to acquire the user characteristics of a first user and the user characteristics of a second user, where the user characteristics comprise one or more of the following: user activity features, user gaming features, user payment features, or user social features;
a processing module, configured to process the user characteristics of the first user and the user characteristics of the second user with the graph neural network model to obtain a friend probability, where the friend probability indicates the probability that the second user and the first user will form a friend relationship, and the training samples of the graph neural network model comprise: the user characteristics of the users corresponding to a first neighbor node set, the first neighbor node set comprising one or more neighbor nodes, where the users corresponding to the neighbor nodes in the first neighbor node set have a friend relationship with the user corresponding to a target node;
the processing module being further configured to determine a second user to recommend to the first user according to the plurality of friend probabilities.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
the training samples of the graph neural network model further comprise: the user characteristics of the users corresponding to a second neighbor node set, the second neighbor node set comprising one or more neighbor nodes, where the users corresponding to the neighbor nodes in the second neighbor node set have a friend relationship with the users corresponding to the first neighbor nodes.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
the second neighbor node set comprises a first neighbor node subset; the users corresponding to the neighbor nodes in the first neighbor node subset do not have a friend relationship with the user corresponding to the target node, and the user characteristics of the users corresponding to the neighbor nodes in the first neighbor node subset are used as negative training samples for the graph neural network.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
the second neighbor node set comprises a second neighbor node subset, where the users corresponding to the neighbor nodes in the second neighbor node subset have a friend relationship with the user corresponding to the target node;
the positive training samples of the graph neural network include: the user characteristics of the users corresponding to the neighbor nodes in the first neighbor node set, and the user characteristics of the users corresponding to the neighbor nodes in the second neighbor node subset.
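To illustrate how positive and negative samples could be drawn from the neighbor sets described here, the following is a minimal sketch: one-hop neighbors of a target node are friends and yield positive pairs, while two-hop neighbors that are not already friends of the target play the role of the first neighbor node subset and yield negative pairs. The graph encoding and all function names are assumptions for the sketch, not taken from the application.

```python
def build_training_pairs(friends):
    """friends: dict mapping a node to the set of its direct friends
    (undirected friend graph).
    Positive pairs: (target, one-hop neighbor), i.e. actual friends.
    Negative pairs: (target, two-hop neighbor that is not a friend),
    mirroring the claim that non-friend nodes in the second neighbor
    set serve as negative samples."""
    positives, negatives = [], []
    for target, one_hop in friends.items():
        for n in one_hop:
            positives.append((target, n))
        # collect friends-of-friends
        two_hop = set()
        for n in one_hop:
            two_hop |= friends.get(n, set())
        # keep only nodes that are neither the target nor its friends
        for m in two_hop - one_hop - {target}:
            negatives.append((target, m))
    return positives, negatives
```

On the small graph `{"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}` this yields `("a", "b")` as a positive pair and `("a", "c")` as a negative pair, since `c` is a friend of `a`'s friend but not of `a`.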
In one possible design, in another implementation of another aspect of the embodiments of the present application,
the processing module is further configured to determine, in the training samples of the graph neural network model, the user characteristics of a first target node and the user characteristics of the neighbor nodes corresponding to the first target node;
the processing module is further configured to aggregate the user characteristics of the first target node with the user characteristics of its corresponding neighbor nodes to obtain a first aggregate vector;
the processing module is further configured to determine, in the training samples of the graph neural network model, the user characteristics of a second target node and the user characteristics of the neighbor nodes corresponding to the second target node;
the processing module is further configured to aggregate the user characteristics of the second target node with the user characteristics of its corresponding neighbor nodes to obtain a second aggregate vector;
the processing module is further configured to merge the first aggregate vector and the second aggregate vector to obtain a first merged vector;
the processing module is further configured to determine a target loss function according to the first merged vector;
and the processing module is further configured to optimize the parameters of the graph neural network to be trained until the target loss function converges, obtaining the trained graph neural network.
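The aggregate-merge-loss pipeline the processing module performs can be sketched as follows. Mean pooling for aggregation, an element-wise product for merging, and binary cross-entropy as the target loss are illustrative choices only; the claims do not mandate specific operators.

```python
import math

def aggregate(node_feat, neighbor_feats):
    # Mean-aggregate a node's feature vector with its neighbors' vectors
    # (mean pooling is one common GNN aggregator; illustrative here).
    feats = [node_feat] + neighbor_feats
    dim = len(node_feat)
    return [sum(f[i] for f in feats) / len(feats) for i in range(dim)]

def merge(vec_a, vec_b):
    # Element-wise product as a simple merge of the two aggregate vectors
    # (concatenation plus a dense layer would be another option).
    return [a * b for a, b in zip(vec_a, vec_b)]

def pair_loss(merged, label):
    # Binary cross-entropy on the summed merged vector: label 1 for a
    # positive (friend) pair, 0 for a negative pair.
    p = 1.0 / (1.0 + math.exp(-sum(merged)))
    return -(label * math.log(p) + (1 - label) * math.log(1 - p))

agg1 = aggregate([1.0, 0.0], [[0.8, 0.2]])   # first target node + neighbor
agg2 = aggregate([0.9, 0.1], [[1.0, 0.0]])   # second target node + neighbor
loss = pair_loss(merge(agg1, agg2), label=1)  # loss ≈ 0.35 for this pair
```

A real implementation would repeat this forward pass over many positive and negative pairs and update the network parameters until the loss converges, as the claim describes.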
In one possible design, in another implementation of another aspect of the embodiments of the present application,
the processing module is further configured to normalize the user characteristics of the first target node and the user characteristics of its corresponding neighbor nodes, obtaining the normalized user characteristics of the first target node and of its corresponding neighbor nodes;
the processing module is further configured to aggregate the normalized user characteristics of the first target node with the normalized user characteristics of its corresponding neighbor nodes to obtain the first aggregate vector;
the processing module is further configured to normalize the user characteristics of the second target node and the user characteristics of its corresponding neighbor nodes, obtaining the normalized user characteristics of the second target node and of its corresponding neighbor nodes;
and the processing module is further configured to aggregate the normalized user characteristics of the second target node with the normalized user characteristics of its corresponding neighbor nodes to obtain the second aggregate vector.
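A minimal sketch of the normalization step, assuming min-max normalization applied per feature dimension across a batch of nodes (an assumption: the claims state only that normalization is performed, not which scheme). Normalizing per dimension keeps large-magnitude features, such as payment amounts, from dominating small ones, such as win rates, during aggregation.

```python
def normalize_columns(rows):
    """Min-max normalize each feature column across a batch of node
    feature vectors, mapping every column into [0, 1]. A constant
    column is mapped to 0.0 to avoid division by zero."""
    dims = len(rows[0])
    out = [[0.0] * dims for _ in rows]
    for j in range(dims):
        col = [r[j] for r in rows]
        lo, hi = min(col), max(col)
        for i, r in enumerate(rows):
            out[i][j] = 0.0 if hi == lo else (r[j] - lo) / (hi - lo)
    return out
```

For example, `normalize_columns([[0.0, 100.0], [10.0, 300.0], [5.0, 200.0]])` scales both the first column (raw range 0 to 10) and the second (raw range 100 to 300) into [0, 1].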
In one possible design, in another implementation of another aspect of the embodiments of the present application, the user features further include:
user gaming features, including one or more of the following: game level, game rank, number of games played, game win rate, or game scene preference.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the user activity features include one or more of the following: number of logins, login time, number of check-ins, or online duration.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the user payment features include one or more of the following: number of payments, total amount paid, amount paid within a period of time, or maximum payment amount.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the user social features include one or more of the following: number of chats, chat duration, number of likes, team-up duration, number of gifts, or gift amount.
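The four feature groups enumerated above could be gathered into a single per-user record like the following sketch. The field and key names are illustrative groupings chosen for this example, not taken from the application, which leaves the concrete encoding open.

```python
from dataclasses import dataclass, field

@dataclass
class UserFeatures:
    # Illustrative grouping of the four claimed feature classes.
    active: dict = field(default_factory=dict)   # logins, check-ins, online time
    gaming: dict = field(default_factory=dict)   # level, rank, play count, win rate
    payment: dict = field(default_factory=dict)  # payment count, totals, max amount
    social: dict = field(default_factory=dict)   # chats, likes, team-ups, gifts

    def as_vector(self, keys):
        # Flatten the selected raw features into a numeric vector,
        # defaulting missing entries to 0, for use as a node feature.
        merged = {**self.active, **self.gaming, **self.payment, **self.social}
        return [float(merged.get(k, 0)) for k in keys]
```

A vector produced by `as_vector` is the kind of node feature the aggregation and normalization steps described elsewhere in this document would consume.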
In one possible design, in another implementation of another aspect of the embodiments of the present application,
the training samples of the graph neural network model further comprise: the user characteristics of the users corresponding to a third neighbor node set, the third neighbor node set comprising one or more neighbor nodes, where the users corresponding to the neighbor nodes in the third neighbor node set have a friend relationship with the users corresponding to the second neighbor nodes.
Another aspect of the present application provides a computer apparatus comprising: a memory, a processor, and a bus system;
wherein the memory is used for storing programs;
the processor is configured to execute the program in the memory, performing the method of the above aspects according to the instructions in the program code;
and the bus system is configured to connect the memory and the processor so that the memory and the processor can communicate.
Another aspect of the application provides a computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the methods of the above aspects.
In another aspect of the application, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the methods provided in the above aspects.
From the above technical solutions, the embodiment of the present application has the following advantages:
The user characteristics of the first user and the user characteristics of the second user are processed with the graph neural network model to obtain a friend probability, where the friend probability indicates the probability that the second user and the first user will form a friend relationship. A target user to recommend to the first user is then determined based on the friend probability, the target user being one of the second users. Because the training samples used to train the graph neural network model include the user characteristics of multiple users that have friend relationships, using the model effectively improves the accuracy of the obtained friend probabilities, which makes user recommendation more targeted and improves the user experience.
Drawings
Fig. 1 is an application scenario schematic diagram of a method for user recommendation provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of the acquisition of the graph neural network according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for user recommendation according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a node corresponding to a friend relationship in an embodiment of the present application;
FIG. 5 is a schematic diagram of an operation interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of another operation interface according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the training of the graph neural network according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a user recommendation device according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a server structure according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "includes" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application. The step numbers in the following embodiments are set for convenience of illustration only, and the order between the steps is not limited in any way, and the execution order of the steps in the embodiments may be adaptively adjusted according to the understanding of those skilled in the art.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Before describing embodiments of the present application in further detail, the terms and terminology involved in the embodiments of the present application will be described first, and the terms and terminology involved in the embodiments of the present application are applicable to the following explanation.
Graph: a data form composed of a plurality of nodes (also called vertices) connected to one another, where a node may be an entity such as a person or an institution, and a connection between nodes (called an edge) represents some relationship between them (such as a friend relationship or a subordinate relationship); a graph may have a single type of node and edge (referred to as a homogeneous graph) or multiple types of nodes or edges (referred to as a heterogeneous graph), and an edge may be directed (a directed graph) or undirected (an undirected graph).
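A friend graph in the sense defined above (users as nodes, undirected friend edges) can be represented with a simple adjacency map, for example:

```python
from collections import defaultdict

def make_friend_graph(edges):
    # Build an undirected friend graph as an adjacency map: each node
    # maps to the set of users it shares a friend edge with. Since the
    # friend relationship is mutual, every edge is recorded both ways.
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

g = make_friend_graph([("alice", "bob"), ("bob", "carol")])
```

Here `g["bob"]` is `{"alice", "carol"}`: the one-hop neighborhood that the training-sample construction described later in this document reads from.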
Graph neural network (Graph Neural Networks, GNN): a graph-based machine learning method whose input can be graph-structured data or node feature data and whose output is a characterization vector for each node or for the graph as a whole.
Generalization performance (generalization ability): the capability of a machine learning algorithm to handle input samples it has not seen before.
Source domain: in transfer learning, the knowledge domain from which knowledge is transferred; it contains a large amount of common knowledge usable for transfer learning.
Target domain: in transfer learning, the knowledge domain to which knowledge is transferred, i.e., the domain of the task in the machine learning application.
The graph neural network pre-training method provided by the embodiments of the present application can be applied within artificial intelligence technology. Artificial intelligence (AI) is the theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain optimal results. Artificial intelligence is a comprehensive discipline covering a wide range of fields, involving both hardware-level and software-level technologies. Basic artificial intelligence technologies generally include sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing, operation/interaction systems, and mechatronics; artificial intelligence software technologies mainly include computer vision, speech processing, natural language processing, and machine learning/deep learning.
The embodiments of the present application can be applied to various scenarios such as cloud technology, artificial intelligence, intelligent transportation, and assisted driving. The user recommendation method can be used in a user recommendation apparatus, which may be integrated in a computer device; the computer device may be a server or a terminal. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDN), big data, and artificial intelligence platforms. The server may also be a node in a blockchain. The terminal may be a mobile phone, a tablet computer, a notebook computer, a smart television, a wearable smart device, a personal computer (PC), a smart voice interaction device, a smart home appliance, a vehicle-mounted terminal, or other such device.
The intelligent traffic system (ITS), also called the intelligent transportation system, is a comprehensive transportation system that applies advanced science and technology (information technology, computer technology, data communication technology, sensor technology, electronic control technology, automatic control theory, operations research, artificial intelligence, etc.) effectively and comprehensively to transportation, service control, and vehicle manufacturing, strengthening the connections among vehicles, roads, and users, thereby guaranteeing safety, improving efficiency, improving the environment, and saving energy. Alternatively:
The intelligent vehicle-infrastructure cooperative system (IVICS), called the vehicle-road cooperative system for short, is one development direction of the intelligent transportation system (ITS). The vehicle-road cooperative system adopts technologies such as advanced wireless communication and the new generation of the Internet to carry out dynamic, real-time vehicle-vehicle and vehicle-road information interaction in all directions, and develops active vehicle safety control and cooperative road management on the basis of full space-time dynamic traffic information collection and fusion, fully realizing effective cooperation among people, vehicles, and roads, ensuring traffic safety, improving traffic efficiency, and thereby forming a safe, efficient, and environmentally friendly road traffic system.
Specifically, the user recommendation method provided by the embodiments of the present application can be applied to various application scenarios in the artificial intelligence field. For example, in a social network scenario where other users a person may know need to be recommended to that person, a prediction model for interpersonal relationships can be built with the graph neural network obtained in the embodiments of the present application; the model then predicts the relationships between the user and other users and returns information about other users the user may know. In a medical research scenario where the properties of chemical molecules with different structures need to be analyzed, a prediction model for molecular properties can be built with the graph neural network obtained in the embodiments of the present application; the model then predicts the chemical properties of molecules, facilitating drug screening. It should be understood that these application scenarios are merely exemplary and do not limit how the user recommendation method in the embodiments of the present application may be implemented. In such scenarios, an artificial intelligence system can take the graph neural network trained by the user recommendation method, fine-tune it with a small amount of training data for the designated task, and obtain the prediction model needed to execute that task. Based on the graph neural network obtained by the pre-training method provided by the embodiments of the present application, only a few parameter updates are needed in these application scenarios to achieve the desired effect.
In the embodiment of the application, the artificial intelligence technology is mainly machine learning.
Machine learning (ML) is a multi-domain interdisciplinary subject involving probability theory, statistics, approximation theory, convex analysis, algorithm complexity theory, and other disciplines. It studies how a computer can simulate or implement human learning behavior to acquire new knowledge or skills and reorganize existing knowledge structures to continuously improve its own performance. Machine learning is the core of artificial intelligence and the fundamental way to make computers intelligent, and its algorithms are applied throughout the various fields of artificial intelligence. Machine learning can be classified into supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. By function, its algorithms can be divided into regression, classification, clustering, dimensionality reduction, and ensemble algorithms, among others.
With the development of artificial intelligence technology, neural networks have gradually risen and been applied across industries. The data used in traditional machine learning is typically data in Euclidean space. Such data has a regular spatial structure: pictures are regular square grids, speech is a one-dimensional sequence, and the data can be represented by one- or two-dimensional matrices, making processing relatively simple. Moreover, such data points are typically independent of one another. However, in some practical application scenarios the data has an irregular spatial structure, that is, it lives in non-Euclidean space, such as the abstract graphs of electronic transactions, recommendation systems, and social networks, where each node connects irregularly to other nodes and the nodes may carry mutually related information. The graph neural network is a model designed specifically to process this kind of graph data; it can model data in non-Euclidean space and capture the internal dependencies between data points, and can therefore better generate characterization vectors for nodes or for the whole graph.
Blockchains are novel application modes of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, encryption algorithms, and the like. The Blockchain (Blockchain), which is essentially a decentralised database, is a string of data blocks that are generated by cryptographic means in association, each data block containing a batch of information of network transactions for verifying the validity of the information (anti-counterfeiting) and generating the next block. The blockchain may include a blockchain underlying platform, a platform product services layer, and an application services layer.
The blockchain underlying platform may include processing modules for user management, basic services, smart contracts, and operation monitoring. The user management module is responsible for the identity information management of all blockchain participants, including maintaining the generation of public and private keys (account management), key management, and the correspondence between users' real identities and blockchain addresses (permission management), and, where authorized, supervising and auditing the transactions of certain real identities and providing risk-control rule configuration (risk-control auditing). The basic service module is deployed on all blockchain node devices to verify the validity of service requests and record valid requests to storage after consensus; for a new service request, the basic service first performs interface adaptation analysis and authentication, encrypts the service information using a consensus algorithm (consensus management), transmits it completely and consistently to the shared ledger (network communication), and records and stores it. The smart contract module is responsible for contract registration and issuance, contract triggering, and contract execution; a developer can define contract logic in a programming language and publish it to the blockchain (contract registration), and the contract logic is executed when triggered by a key or another event according to the logic of its clauses, while a function for contract upgrading is also provided. The operation monitoring module is mainly responsible for deployment during product release, configuration modification, contract settings, cloud adaptation, and the visual output of real-time states during product operation, for example: alarms, monitoring network conditions, and monitoring node device health status.
The platform product service layer provides the basic capabilities and implementation frameworks of typical applications; developers can complete the blockchain implementation of their business logic based on these basic capabilities combined with the characteristics of their business. The application service layer provides blockchain-based application services for business participants to use.
To facilitate understanding of the technical solution provided by the embodiments of the present application, the application scenario of the artificial-intelligence-based user recommendation method is described below, taking as an example a scenario in which the method provided by the embodiments of the present application is applied to interaction between a terminal device and a server.
Referring to fig. 1, fig. 1 is an application scenario schematic diagram of a method for user recommendation according to an embodiment of the present application. As shown in fig. 1, the application scenario includes a terminal device 110 and a server 120, where the terminal device 110 and the server 120 communicate through a network. The terminal device 110 is configured to provide the server 120 with the basic information required for user recommendation, such as user characteristics (including but not limited to user active features, user game features, user payment features, or user social features). The server 120 is configured to execute the user recommendation method according to the embodiment of the present application and, based on the user characteristics provided by the terminal device 110, recommend a target user to the first user, where the target user has a high probability of being in a friend relationship with the first user.
In a specific implementation, after the terminal device 110 transmits the user characteristics to the server 120 through the network, the server 120 may first invoke the pre-trained graph neural network model 121 to process the user characteristics of the first user and the user characteristics of the second user, so as to obtain the friend probability of the first user and the second user, where the friend probability indicates the probability that the second user is in a friend relationship with the first user. The server 120 may then determine the target user recommended to the first user based on the obtained friend probabilities. Further, the server 120 may transmit the friend link of the target user to the terminal device 110 via the network.
In practical application, when the graph neural network model 121 obtains the friend probability, it may use the user active feature, the user game feature, the user payment feature, or the user social feature, and may also use features such as user registration address information, user login address information, user gender information, or user age information.
It should be understood that the user features provided by the terminal device 110 to the server 120 above are only examples. In practical applications, the terminal device 110 may provide fewer or more user features to the server 120; for example, the terminal device 110 may provide only the user active features and the user game features, with the server 120 itself determining any other information required to compute the friend probabilities based on those features. No limitation is placed on the user features provided by the terminal device 110 to the server 120.
It should be understood that the application scenario shown in fig. 1 is only an example. In practical application, the user recommendation method provided in the embodiment of the present application may be executed independently by a terminal device, independently by a server, or cooperatively by the terminal device and the server; no limitation is placed on the application scenario of the method.
The terminal device 110 may be referred to as a user terminal including, but not limited to, a cell phone, a computer, an intelligent voice interaction device, an intelligent home appliance, a vehicle terminal, an aircraft, etc.
Graph neural network: refers to a neural network comprising a plurality of embedded update layers (embedding updating layer) and a prediction layer (prediction layer).
Typically, an embedding update layer performs two operations to update the embedding vector of a node in the graph. These two operations are aggregation and combination.
An aggregation operation aggregates the embedding vectors of the neighbor nodes of the current node using an aggregation function to derive an aggregate embedding vector of the neighbors. Common aggregation methods include mean, max-pooling, long short-term memory (LSTM), and attention networks.
A combination operation combines the aggregated neighbor embedding vector with the current embedding vector of the current node to generate an updated embedding vector for the current node. A common combination method is concatenation.
An embedding update layer may update the embedding vectors of all nodes in the graph simultaneously. The number of embedding update layers in the graph neural network is usually set when the model is applied. As an example, the number of layers may be set according to the accuracy requirement of the prediction: when the accuracy requirement is high, different numbers of embedding update layers may be tested and the number with the highest accuracy selected. Alternatively, the number of embedding update layers may be set empirically.
Referring to fig. 2, fig. 2 is a schematic diagram of a graph neural network according to an embodiment of the application. The application adopts the graph neural network model to process the user characteristics of the users and obtain the friend probability. Each user acts as a node, and the user characteristics of the user act as the initial feature vector of the node. Message propagation is then performed through the social relationship network structure, thereby updating the feature vector of the node, also referred to as the model characterization of the user. Finally, the target user to be recommended is determined by nearest neighbor search (Nearest Neighbor Search) over the model characterizations of the users.
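The final step above, ranking candidates by nearest neighbor search over the users' model characterizations, can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: the user names and two-dimensional characterizations are hypothetical, and cosine similarity is one common choice of similarity measure.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest_neighbors(query, candidates, k):
    # Rank candidate users by similarity of their model characterizations
    # to the query user's characterization and keep the top-k.
    ranked = sorted(candidates.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# Hypothetical 2-d characterizations produced by message propagation.
characterizations = {
    "user_b": [0.9, 0.1],
    "user_d": [0.8, 0.3],
    "user_g": [0.1, 0.9],
    "user_h": [0.7, 0.2],
}
top2 = nearest_neighbors([1.0, 0.0], characterizations, k=2)
```

In practice, such a search over many users would use an approximate nearest neighbor index rather than an exhaustive sort.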
The method for recommending users based on artificial intelligence provided by the application is described in detail below by way of embodiments. Referring to fig. 3, fig. 3 is a flowchart of an embodiment of a method for user recommendation according to an embodiment of the present application, where the method for user recommendation provided by the embodiment of the present application includes:
301. Acquire the user characteristics of the first user and the user characteristics of the second user.
In this embodiment, when the user recommending apparatus needs to recommend friends to the first user, the user recommending apparatus obtains the user characteristics of the first user and the user characteristics of the second user. The second user is another user that may be in a buddy relationship with the first user, the second user including one or more users.
In the embodiment of the application, the user characteristics of a user can be understood as the user feature vector of the user. A user in the embodiments of the present application may also be understood as a game player. The user characteristics include one or more of the following: a user active feature, a user payment feature, or a user social feature. When the embodiment of the application is applied to friend recommendation in a game, the user characteristics may also include user game features. It can be appreciated that the user recommendation method provided by the embodiment of the application can also be applied to friend recommendation in social software; the embodiment of the application is not limited in this respect.
The following are respectively illustrated:
A. user active characteristics refer to user login status information or presence status information over a period of time. Illustratively, the user-active features include, but are not limited to: the number of logins of the user, the login time of the user (for example, login days), the number of check-ins of the user, or the online time of the user, etc.
B. The user game feature refers to feature information of the user related to the game. Illustratively, the user gaming features include, but are not limited to: the game level of the user, the game segment of the user, the game play number of the user, the game play win ratio of the user, the game scene preference of the user and the like.
C. The user payment feature refers to the user's feature information related to payment in the game. Illustratively, the user payment features include, but are not limited to: the number of payments by the user, the total amount paid by the user, the amount paid by the user over a period of time, the maximum amount paid by the user, etc.
D. The user social feature refers to the user's social interaction feature information with other users in the game. Illustratively, the user social features include, but are not limited to: the number of chats of the user, the chat duration of the user, the number of likes given by the user, the number of times the user teams up with other users, the duration of the user teaming up with other users, the number of gifts sent by the user, the amount the user spends on gifts, etc.
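The four feature groups A through D above can be concatenated into one flat feature vector per user, which then serves as the node's initial features. A minimal sketch follows; the specific feature values and their ordering are hypothetical.

```python
def build_user_feature_vector(active, game, payment, social):
    # Concatenate the four feature groups (A-D above) into one flat
    # feature vector used as the node's initial features.
    return list(active) + list(game) + list(payment) + list(social)

vector = build_user_feature_vector(
    active=[30, 120.0],   # login count, online hours (hypothetical)
    game=[52, 0.48],      # game level, win ratio (hypothetical)
    payment=[3, 99.0],    # payment count, total amount (hypothetical)
    social=[15, 4],       # chat count, team-up count (hypothetical)
)
```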
One possible implementation scenario is as follows. Referring to fig. 5, fig. 5 is a schematic diagram of an operation interface according to an embodiment of the application. The first user clicks the relevant "Add friends" control in the game application interface. In response to the click operation, the user recommendation device acquires the user characteristics of the first user. The user recommendation device determines second users among the users other than the first user, and then acquires the user characteristics of the second users; for example, it may screen users whose login geographic locations are close to the first user's as second users. Then, in step 302, the user recommendation device processes the user characteristics of the first user and the user characteristics of the second user with the graph neural network model to obtain friend probabilities. Next, in step 303, the user recommendation device determines, according to the one or more friend probabilities, the target users to recommend to the first user, and the relevant information of the target users and links to add friends are shown in a "people who may be interested" window of the game application interface. For example, referring to fig. 6, a schematic diagram of another operation interface, the target users shown in the "people who may be interested" window include: user B, user D, user G, and user H.
Illustratively, the first user clicks into the "Add friends" interface through the "Add friends" control of the game interface. The user recommendation device obtains the user characteristics of the first user, for example: "gender: male", "usual period: 22:00-24:00", "segment score: 500", "geographic location: Shenzhen", and "preference mode: ranked games". The user recommendation device determines, from the second users, the target users recommended to the first user through the graph neural network model according to the first user's characteristics. Specifically, in the "Add friends" interface, the target users recommended to the first user are shown in the "people who may be interested" sub-window, for example: user B, user D, user G, and user H. The user characteristics of user B include: "gender: male", "usual period: 22:00-24:00", "segment score: 400", "geographic location: Shenzhen", and "preference mode: ranked games". The user characteristics of user D include: "gender: female", "usual period: 23:00-01:00", "segment score: 540", "geographic location: Shenzhen", and "preference mode: ranked games". The user characteristics of user G include: "gender: male", "usual period: 23:00-01:00", "segment score: 520", "geographic location: Guangzhou", and "preference mode: ranked games". The user characteristics of user H include: "gender: male", "usual period: 22:00-01:00", "segment score: 520", "geographic location: Wheatstone", and "preference mode: ranked games". As can be seen from these user characteristics, the preference modes, segment scores, geographic locations, or usual periods of user B, user D, user G, and user H are similar to those of the first user. Therefore, the likelihood that user B, user D, user G, and user H are in a friend relationship with the first user is high.
302. Process the user characteristics of the first user and the user characteristics of the second user with the graph neural network model to obtain the friend probability.
In this embodiment, the user recommendation device processes the user characteristics of the first user and the user characteristics of the second user by using the graph neural network model, and obtains the probability of friends.
One possible implementation is as follows. Let the node corresponding to the first user be v and the node corresponding to the second user be u; the friend probability of the first user and the second user is obtained by:

θ(v, u) = σ(W · g(v, u) + b), where g(v, u) = (f′(v), f′(u))

where θ(v, u) is the friend probability of the first user and the second user, and b is the bias parameter in the logistic regression model. σ is the Sigmoid function, for example:

σ(x) = 1 / (1 + e^(−x))
f′(v) = (1/2) · ( f(v) + (1/|N_v|) · Σ_{p∈N_v} w(v, p) · f(p) )

where node p is a neighbor node of node v, the user corresponding to node p has a friend relationship with the user corresponding to node v, and N_v comprises the one or more nodes having a neighbor relation to node v; N_v can be regarded as the set of neighbor nodes of node v, and node p belongs to N_v. f(v) is the user characteristic of the user corresponding to node v (i.e., the user characteristic of the first user), f(p) is the user characteristic of the user corresponding to node p, and w(v, p) is the first parameter in the graph neural network. f′(v) is the aggregate vector obtained by aggregating the user characteristics of the first user (node v) and the user characteristics of the other users (nodes p) having a friend relationship with the first user.
f′(u) = (1/2) · ( f(u) + (1/|N_u|) · Σ_{p′∈N_u} w(u, p′) · f(p′) )

where u is the node corresponding to the second user, p′ is a neighbor node of node u, the user corresponding to node p′ has a friend relationship with the user corresponding to node u, and N_u comprises the one or more nodes having a neighbor relation to node u; N_u can be regarded as the set of neighbor nodes of node u, and node p′ belongs to N_u. f(u) is the user characteristic of the user corresponding to node u (i.e., the user characteristic of the second user), and f(p′) is the user characteristic of the user corresponding to node p′. f′(u) is the aggregate vector obtained by aggregating the user characteristics of the second user (node u) and the user characteristics of the other users (nodes p′) having a friend relationship with the second user. w(u, p′) is consistent with w(v, p): both are the first parameter in the graph neural network, obtained during the training of the graph neural network.
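The inference described above can be sketched as follows. This is a simplified illustration: the per-edge learned weights w(v, p) are collapsed to a single scalar w, and W and b are placeholder parameters rather than trained values.

```python
import math

def sigmoid(x):
    # Sigmoid: 1 / (1 + e^(-x)).
    return 1.0 / (1.0 + math.exp(-x))

def aggregate(f_self, neighbor_feats, w):
    # f'(v): average the weighted neighbor features, then average the
    # result with the node's own features (scalar w stands in for w(v, p)).
    n = len(neighbor_feats)
    neigh_mean = [w * sum(f[i] for f in neighbor_feats) / n
                  for i in range(len(f_self))]
    return [(s + m) / 2.0 for s, m in zip(f_self, neigh_mean)]

def friend_probability(f_v, neigh_v, f_u, neigh_u, w, W, b):
    # theta(v, u) = sigmoid(W . (f'(v), f'(u)) + b).
    g = aggregate(f_v, neigh_v, w) + aggregate(f_u, neigh_u, w)
    return sigmoid(sum(wi * gi for wi, gi in zip(W, g)) + b)

# Hypothetical 2-d features and parameters for illustration.
prob = friend_probability(
    f_v=[1.0, 0.0], neigh_v=[[0.5, 0.5]],
    f_u=[0.8, 0.2], neigh_u=[[0.6, 0.4]],
    w=1.0, W=[0.5, 0.5, 0.5, 0.5], b=0.0)
```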
303. Determine the target user recommended to the first user according to the one or more friend probabilities.
In this embodiment, the user recommendation device obtains one or more friend probabilities (each friend probability corresponding to one second user) based on the user characteristics of the first user and the user characteristics of the second users. The user recommendation device then sorts the one or more friend probabilities and selects the H users with the highest friend probabilities as target users, where H is a positive integer. The user recommendation device then recommends the target users to the first user, for example by displaying the relevant information of the target users and links for adding friends in the "people who may be interested" window of the game application interface. For example, as shown in table 1, table 1 illustrates the probabilities that second users are in a friend relationship with the first user. The user recommendation device selects the 5 second users with the highest probabilities (users a to e) as the target users recommended to the first user.
TABLE 1

Second user    Friend probability
User a         0.36
User b         0.32
User c         0.30
User d         0.25
User e         0.21
User f         0.15
User g         0.11
In the embodiment of the application, the user recommendation device processes the user characteristics of the first user and the user characteristics of the second user by using the graph neural network model to obtain the friend probability, where the friend probability indicates the probability that the second user and the first user form a friend relationship. Then, the target user recommended to the first user is determined based on the friend probability, the target user belonging to the second users. Because the training samples used in training the graph neural network model include the user characteristics of multiple users with friend relationships, using the graph neural network model can effectively improve the accuracy of the obtained friend probabilities. In a possible simulation experiment scenario, compared with the traditional XGBoost algorithm, using the graph neural network model of this scheme to obtain friend probabilities and recommend friends improved the hit rate by 2.98%. The hit rate refers to the probability that a recommended target user and the first user eventually form a friend relationship.
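The top-H selection described in step 303 (here H = 5, using the probabilities of table 1) can be sketched as:

```python
def top_h_users(friend_probs, h):
    # Sort second users by friend probability (descending) and keep
    # the H highest-scoring users as recommendation targets.
    ranked = sorted(friend_probs.items(), key=lambda item: item[1],
                    reverse=True)
    return [user for user, _ in ranked[:h]]

# Friend probabilities from table 1.
friend_probs = {"user a": 0.36, "user b": 0.32, "user c": 0.30,
                "user d": 0.25, "user e": 0.21, "user f": 0.15,
                "user g": 0.11}
targets = top_h_users(friend_probs, h=5)
```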
In combination with the foregoing embodiment, the training samples of the graph neural network model further include: the user characteristics of the users corresponding to a second neighbor node set, where the second neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the second neighbor node set have friend relationships with the users corresponding to the first neighbor nodes. Specifically, taking fig. 4 as an example, if the target node is node A, the first neighbor node set is nodes B, C, D, and E, and the second neighbor node set is nodes E and F.
In a possible implementation, the first neighbor node set may be a first-order neighbor node set of the target node. For example, let the target node be node u and the neighbor node set of node u be N_u. Sample α·|N_u| nodes from N_u as the first neighbor node set, denoted N′_u, where 0 < α < 1 and α is a custom hyper-parameter, for example α = 0.5.
For each node v in N′_u (v ∈ N′_u), take the neighbor node set N_v of node v, remove node u from N_v, and then sample nodes from N_v. The nodes sampled for each node v together form the second-order neighbor node set of u, denoted N²_u, and N²_u serves as the second neighbor node set.
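A minimal sketch of this two-stage neighbor sampling, using the friend graph of fig. 4; the fixed random seed and α = 0.5 are illustrative choices, not prescribed by the embodiment.

```python
import random

def sample_first_order(neighbors, alpha, rng):
    # Sample alpha * |N_u| first-order neighbors of the target node u.
    k = max(1, int(alpha * len(neighbors)))
    return rng.sample(sorted(neighbors), k)

def sample_second_order(u, adjacency, first_order, alpha, rng):
    # For each sampled first-order neighbor v, remove u from N_v and
    # sample from the remainder to build the second-order neighbor set.
    second = set()
    for v in first_order:
        pool = adjacency[v] - {u}
        if pool:
            k = max(1, int(alpha * len(pool)))
            second.update(rng.sample(sorted(pool), k))
    return second

# Friend graph of fig. 4 (undirected adjacency).
adjacency = {
    "A": {"B", "C", "D", "E"},
    "B": {"A", "E", "F"},
    "C": {"A"},
    "D": {"A", "F"},
    "E": {"A", "B"},
    "F": {"B", "D"},
}
rng = random.Random(0)
first = sample_first_order(adjacency["A"], alpha=0.5, rng=rng)
second = sample_second_order("A", adjacency, first, alpha=0.5, rng=rng)
```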
By the method, training samples of the graph neural network model can be effectively expanded, and accuracy of the graph neural network model in acquiring friend probability is improved.
In combination with the foregoing embodiment, the user recommendation device trains the graph neural network model by using the training sample of the graph neural network model, and specifically includes:
In this embodiment, the user recommendation device trains the graph neural network model using the training samples of the graph neural network model. Specifically, the training samples of the graph neural network model include: the user characteristics of the users corresponding to a first neighbor node set, where the first neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the first neighbor node set have friend relationships with the user corresponding to the target node.
It should be noted that, in the embodiment of the present application, two nodes have a neighbor relationship, or one node is a neighbor node of another node, which means that users corresponding to the two nodes have a friend relationship (or form a friend relationship).
For easy understanding, please refer to fig. 4, fig. 4 is a schematic diagram of a node corresponding to a friend relationship in an embodiment of the present application. Node a, node B, node C, node D, node E, and node F are included in fig. 4. Where each node corresponds to a user (which can also be understood as a player for each node). And a connection line exists between the two nodes to indicate that the users corresponding to the two nodes have friend relations. For example, in fig. 4, the user corresponding to node a has a friend relationship with the user corresponding to node B, the user corresponding to node a has a friend relationship with the user corresponding to node D, the user corresponding to node a has a friend relationship with the user corresponding to node C, and the user corresponding to node a has a friend relationship with the user corresponding to node E. The users corresponding to node B have friend relations with the users corresponding to node E and node F. The user corresponding to node D has a friend relationship with the user corresponding to node F.
And taking the node A as a target node, wherein the first neighbor node set corresponding to the node A comprises a node B, a node D and a node C. When the graph neural network model is trained, the training samples used include: the user characteristics of the corresponding user of the node A, the user characteristics of the corresponding user of the node B, the user characteristics of the corresponding user of the node C and the user characteristics of the corresponding user of the node D.
For the various features described above, the user recommendation device concatenates them to form the user characteristics of the user. The user characteristics are then normalized, and the normalized user characteristics serve as the training samples of the graph neural network model.
Specifically, since the value ranges of the user characteristics differ across multiple dimensions, the data in different dimensions must be normalized. Illustratively, for data x, the maximum and the minimum of x in a certain dimension can be obtained; the maximum is denoted x_max and the minimum x_min. The normalized data, denoted x′, is then:

x′ = (x − x_min) / (x_max − x_min)
Optionally, if the maximum and the minimum of data x in a certain dimension are equal, the data of that dimension is removed. The reason is that data in that dimension has no discriminative power and does not help model training.
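The min-max normalization above, including the removal of constant dimensions, can be sketched as follows; the raw feature values are hypothetical.

```python
def normalize_features(rows):
    # Min-max normalize each dimension across all users; dimensions
    # whose maximum equals their minimum are dropped, since they carry
    # no discriminative information for training.
    dims = len(rows[0])
    mins = [min(row[d] for row in rows) for d in range(dims)]
    maxs = [max(row[d] for row in rows) for d in range(dims)]
    kept = [d for d in range(dims) if maxs[d] != mins[d]]
    return [[(row[d] - mins[d]) / (maxs[d] - mins[d]) for d in kept]
            for row in rows]

# Hypothetical raw features; the middle dimension is constant.
raw = [[10, 5, 1.0], [20, 5, 3.0], [30, 5, 2.0]]
normalized = normalize_features(raw)
```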
It is understood that the user recommendation device may train the graph neural network model online and use it to obtain the friend probability, or may use an offline-trained graph neural network model to obtain the friend probability; the embodiment of the present application does not limit this.
In combination with the foregoing embodiment, the training samples of the graph neural network model further include: the user characteristics of the users corresponding to a third neighbor node set, where the third neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the third neighbor node set have friend relationships with the users corresponding to the second neighbor nodes.
In a possible implementation, for each node v in N²_u (v ∈ N²_u), take the neighbor node set N_v of node v, remove node u from N_v, and then sample nodes from N_v. The nodes sampled for each node v together form the third-order neighbor node set of u, denoted N³_u, and N³_u serves as the third neighbor node set.
It will be appreciated that in the embodiment of the present application, the training of the graph neural network model may use the user features of the higher-order neighbor node set, which is not limited in the embodiment of the present application.
By the method, training samples of the graph neural network model can be effectively expanded, and accuracy of the graph neural network model in acquiring friend probability is improved.
In combination with the foregoing embodiment, the second neighbor node set includes a first neighbor node subset; the users corresponding to the neighbor nodes in the first neighbor node subset do not have a friend relationship with the user corresponding to the target node, and the user characteristics of the users corresponding to the neighbor nodes in the first neighbor node subset are used as negative training samples of the graph neural network.
For ease of understanding, in fig. 4, because the second neighboring node set includes node E and node F, and the user corresponding to node E has a friend relationship with the user corresponding to node a, the user corresponding to node F does not have a friend relationship with the user corresponding to node a, and therefore, the first neighboring node subset corresponding to the target node includes node F.
By taking the user characteristics of the users which do not have friend relation with the target users as negative samples in the training samples, the distinguishing property of the graph neural network model can be effectively improved.
In combination with the foregoing embodiment, the second neighbor node set includes a second neighbor node subset, and the users corresponding to the neighbor nodes in the second neighbor node subset have friend relationships with the user corresponding to the target node; the positive training samples of the graph neural network include: the user characteristics of the users corresponding to the neighbor nodes in the first neighbor node set, and the user characteristics of the users corresponding to the neighbor nodes in the second neighbor node subset.
For ease of understanding, in fig. 4, because the second neighboring node set includes node E and node F, and the user corresponding to node E has a friend relationship with the user corresponding to node a, the user corresponding to node F does not have a friend relationship with the user corresponding to node a, and therefore, the second neighboring node subset corresponding to the target node includes node E.
By taking the user characteristics of the user with the friend relation with the target user as the positive sample in the training samples, the distinguishing property of the graph neural network model can be effectively improved.
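The splitting of the second neighbor node set into positive and negative samples can be sketched as follows, using the fig. 4 example where the second neighbor node set of target node A is {E, F}:

```python
def split_second_order(target, second_order, adjacency):
    # Second-order neighbors that are friends of the target node yield
    # positive samples; the rest yield negative samples.
    positives = {n for n in second_order if n in adjacency[target]}
    negatives = set(second_order) - positives
    return positives, negatives

# Fig. 4: target node A's friends and its second neighbor node set.
adjacency = {"A": {"B", "C", "D", "E"}}
positives, negatives = split_second_order("A", {"E", "F"}, adjacency)
```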
In connection with the foregoing embodiments, a detailed description of how the neural network is trained is provided below. For ease of understanding, please refer to fig. 7, fig. 7 is a training schematic diagram of a neural network according to an embodiment of the present application.
Specifically, the user recommendation device trains the graph neural network model by using a training sample of the graph neural network model, including:
In the training samples of the graph neural network model, the user recommendation device determines the user characteristics of the first target node (namely, the user characteristics of the user corresponding to the first target node) and the user characteristics of the neighbor nodes corresponding to the first target node (namely, the user characteristics of the users corresponding to those neighbor nodes). Specifically, the target node includes the first target node, and the neighbor nodes corresponding to the first target node belong to the first neighbor node set, the second neighbor node set, or the third neighbor node set corresponding to the first target node;
and the user recommending device carries out aggregation processing on the user characteristics of the first target node and the user characteristics of the neighbor nodes corresponding to the first target node to obtain a first aggregation vector. The neighbor node corresponding to the first target node may be all neighbor nodes corresponding to the first target node, or may be part of neighbor nodes corresponding to the first target node, which is not limited in the embodiment of the present application.
One possible implementation is: adding the user characteristics of the neighbor nodes corresponding to the first target node, then executing average processing, adding the result and the user characteristics of the first target node, and then executing average processing to obtain a first aggregate vector;
the user recommending device determines the user characteristics of the second target node and the user characteristics of the neighbor nodes corresponding to the second target node in the training sample of the graph neural network model. Specifically, the target node includes a second target node, and the neighbor node corresponding to the second target node belongs to a first neighbor node set, a second neighbor node set, or a third neighbor node set corresponding to the second target node. The neighbor node corresponding to the second target node may be all neighbor nodes corresponding to the second target node, or may be part of neighbor nodes corresponding to the second target node, which is not limited in the embodiment of the present application;
and the user recommending device carries out aggregation processing on the user characteristics of the second target node and the user characteristics of the neighbor nodes corresponding to the second target node to obtain a second aggregation vector.
One possible implementation is: adding the user characteristics of the neighbor nodes corresponding to the second target node, then executing average processing, and adding the result and the user characteristics of the second target node, then executing average processing to obtain a second aggregate vector;
The user recommending device performs combination processing on the first aggregate vector and the second aggregate vector to obtain a first combined vector;
the user recommendation device determines a target loss function according to the first merged vector. One possible implementation is: determining the target loss function using a logistic regression algorithm;
and the user recommending device optimizes parameters of the graph neural network to be trained until the target loss function converges, so as to obtain the trained graph neural network.
In a possible implementation manner, aggregating the user characteristics of the first target node and the user characteristics of the neighboring nodes corresponding to the first target node to obtain a first aggregate vector, where the aggregating includes:
f′(v) = (1/2) · ( f(v) + (1/|N_v|) · Σ_{p∈N_v} w(v, p) · f(p) )

where v is the first target node, p is a neighbor node corresponding to the first target node, N_v comprises one or more nodes having a neighbor relation to the first target node, p belongs to N_v, f(v) is the user characteristic of the first target node, f(p) is the user characteristic of a neighbor node corresponding to the first target node, w(v, p) is the first parameter to be learned in the graph neural network model, and f′(v) is the first aggregate vector.
In a possible implementation manner, aggregating the user characteristics of the second target node and the user characteristics of the neighboring nodes corresponding to the second target node to obtain a second aggregate vector, where the aggregating includes:
f′(u) = ½ · ( f(u) + (1/|N_u|) · Σ_{p′∈N_u} w(u, p′) · f(p′) )

wherein u is the second target node, p′ is a neighbor node corresponding to the second target node, N_u comprises one or more nodes having a neighbour relation to the second target node, p′ belongs to N_u, f(u) is the user characteristic of the second target node, f(p′) is the user characteristic of the neighbor node corresponding to the second target node, w(u, p′) is the first parameter to be learned in the graph neural network model, and f′(u) is the second aggregate vector.
In one possible implementation manner, the merging processing is performed on the first aggregate vector and the second aggregate vector to obtain a first merged vector, which includes:
g(v,u)=(f′(v),f′(u));
where g (v, u) is the first combined vector, f '(v) is the first aggregate vector, and f' (u) is the second aggregate vector.
In one possible implementation, determining the target loss function from the first combining vector includes:
Loss = − Σ_{(v,u)} [ ŷ(v, u) · log σ(W · g(v, u) + b) + (1 − ŷ(v, u)) · log(1 − σ(W · g(v, u) + b)) ]

wherein Loss is the target loss function, W is a second parameter to be learned, g(v, u) is the first merged vector, b is a bias parameter in the logistic regression model, σ is the Sigmoid function, and ŷ(v, u) indicates whether the user corresponding to the first target node and the user corresponding to the second target node have a friend relationship.
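Since the loss is obtained with a logistic regression algorithm over the merged vector, it can be sketched as a standard binary cross-entropy. The exact functional form, and the example values of W, b, and g, are assumptions for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def target_loss(g, W, b, y):
    """Binary cross-entropy over the merged vector g(v, u).

    W plays the role of the second parameter to be learned, b the bias of
    the logistic-regression layer, and y is 1 when the two users have a
    friend relationship and 0 otherwise.
    """
    p = sigmoid(np.dot(W, g) + b)   # predicted friend probability
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

g = np.array([0.5, 1.0, -0.5, 0.0])   # a hypothetical merged vector g(v, u)
W = np.array([1.0, -1.0, 0.5, 0.0])
loss_if_friends = target_loss(g, W, b=0.0, y=1)
loss_if_not = target_loss(g, W, b=0.0, y=0)
```

Here the raw score W·g + b is negative, so the loss is larger when the label says the pair are friends than when it says they are not.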
By training the graph neural network model in this way, the embodiment of the present application makes full use of various training samples during training, and can effectively distinguish training samples with higher learning cost from training samples that are difficult to distinguish, thereby effectively improving the discriminative power of the graph neural network model and further reducing its training cost.
The foregoing description of the solution provided by the embodiments of the present application has been mainly presented in terms of a method. It will be appreciated that the user recommendation device, in order to implement the above-mentioned functions, comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the function modules of the user recommending device according to the method example, for example, each function module can be divided corresponding to each function, and two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a user recommendation device according to an embodiment of the present application. The user recommendation device 800 provided in the embodiment of the present application includes: a processing module 801 and a transceiver module 802;
a transceiver module 802, configured to obtain a user characteristic of the first user and a user characteristic of the second user, where the user characteristic includes one or more of the following: user active features, user gaming features, user payment features, or user social features;
the processing module 801 is configured to process the user characteristics of the first user and the user characteristics of the second user by using a graph neural network model to obtain a friend probability, where the friend probability indicates the probability that the second user and the first user form a friend relationship, and a training sample of the graph neural network model includes: user characteristics of users corresponding to a first neighbor node set, where the first neighbor node set includes one or more neighbor nodes, and the users corresponding to the neighbor nodes in the first neighbor node set have a friend relationship with the user corresponding to a target node;
the processing module 801 is further configured to determine, according to the plurality of friend probabilities, a second user recommended to the first user.
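Turning the per-pair friend probabilities into a recommendation can be as simple as a top-k cut over the candidates. The selection rule below (plain top-k) is an assumption, since the embodiment only states that the recommended second user is determined according to the friend probabilities; all user ids are illustrative.

```python
def recommend(friend_probs, k=2):
    """Rank candidate second users by friend probability and keep the top k.

    `friend_probs` maps a candidate user id to the probability produced
    by the graph neural network model.
    """
    ranked = sorted(friend_probs.items(), key=lambda kv: kv[1], reverse=True)
    return [user for user, _ in ranked[:k]]

probs = {"user_a": 0.91, "user_b": 0.12, "user_c": 0.57}
top = recommend(probs, k=2)
```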
In one possible implementation of the present invention,
the training sample of the graph neural network model further comprises: user characteristics of users corresponding to a second neighbor node set, where the second neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the second neighbor node set have a friend relationship with the users corresponding to the first neighbor nodes.
In one possible implementation of the present invention,
the second neighbor node set comprises a first neighbor node subset, the users corresponding to the neighbor nodes included in the first neighbor node subset do not have friend relations with the users corresponding to the target nodes, and the user characteristics of the users corresponding to the neighbor nodes in the first neighbor node subset are used as negative training samples of the graph neural network.
In one possible implementation of the present invention,
the second neighbor node set comprises a second neighbor node subset, and the users corresponding to the neighbor nodes included in the second neighbor node subset have friend relations with the users corresponding to the target nodes;
the positive training samples of the graph neural network include: the user characteristics of the users corresponding to the neighbor nodes in the first neighbor node set, and the user characteristics of the users corresponding to the neighbor nodes in the second neighbor node subset.
In one possible implementation of the present invention,
the processing module 801 is further configured to determine a user characteristic of a first target node in a training sample of the graph neural network model, and a user characteristic of a neighboring node corresponding to the first target node;
the processing module 801 is further configured to aggregate the user characteristics of the first target node and the user characteristics of the neighboring node corresponding to the first target node to obtain a first aggregate vector;
the processing module 801 is further configured to determine a user characteristic of a second target node in the training sample of the graph neural network model, and a user characteristic of a neighboring node corresponding to the second target node;
the processing module 801 is further configured to aggregate the user characteristics of the second target node and the user characteristics of the neighboring node corresponding to the second target node to obtain a second aggregate vector;
the processing module 801 is further configured to combine the first aggregate vector and the second aggregate vector to obtain a first combined vector;
the processing module 801 is further configured to determine a target loss function according to the first combining vector;
the processing module 801 is further configured to optimize parameters of the graph neural network to be trained until the target loss function converges, thereby obtaining a trained graph neural network.
In one possible design, in another implementation of another aspect of the embodiments of the present application,
the processing module 801 is further configured to normalize the user characteristics of the first target node and the user characteristics of the neighboring node corresponding to the first target node, to obtain the user characteristics of the first target node after normalization processing and the user characteristics of the neighboring node corresponding to the first target node after normalization processing;
the processing module 801 is further configured to aggregate the user characteristics of the normalized first target node and the user characteristics of the neighboring nodes corresponding to the normalized first target node to obtain a first aggregate vector;
the processing module 801 is further configured to normalize the user characteristics of the second target node and the user characteristics of the neighboring node corresponding to the second target node, to obtain the user characteristics of the second target node after normalization processing and the user characteristics of the neighboring node corresponding to the second target node after normalization processing;
the processing module 801 is further configured to aggregate the user characteristics of the normalized second target node and the user characteristics of the neighboring nodes corresponding to the normalized second target node, to obtain a second aggregate vector.
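The normalization step is not pinned down above; per-dimension min-max scaling across the node features is one common choice and is assumed here purely for illustration, with hypothetical feature values.

```python
import numpy as np

def min_max_normalize(features):
    """Scale each feature dimension to [0, 1] across a batch of nodes."""
    feats = np.asarray(features, dtype=float)
    lo, hi = feats.min(axis=0), feats.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard against constant columns
    return (feats - lo) / span

raw = [[10.0, 300.0],   # e.g. [login count, online minutes] for one node
       [20.0, 100.0],
       [30.0, 200.0]]
norm = min_max_normalize(raw)
```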
In one possible design, in another implementation of another aspect of the embodiments of the present application, the user features further include:
user gaming characteristics including one or more of the following: game level, game segment, game play count, game play odds, or game scene preference.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the user-active features include one or more of the following: login times, login time, sign-in times, or online time.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the user payment feature includes one or more of the following: the number of payments, the total amount paid, the amount paid over a period of time, or the maximum amount paid.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the user social features include one or more of the following: chat times, chat duration, praise times, team formation duration, gift times, or gift amount.
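The four feature groups above can be flattened into one numeric vector per user before being fed to the graph neural network. Every field name and value below is hypothetical, chosen only to illustrate the grouping, not taken from the patent.

```python
import numpy as np

# Hypothetical grouped features for one user.
user = {
    "active":  {"login_count": 30, "online_hours": 12.5},
    "gaming":  {"game_level": 42, "win_rate": 0.55},
    "payment": {"pay_count": 3, "pay_total": 18.0},
    "social":  {"chat_count": 120, "team_hours": 6.0},
}

def to_feature_vector(u):
    """Flatten the grouped features into one vector with a fixed key order."""
    groups = ("active", "gaming", "payment", "social")
    return np.array([v for g in groups for v in u[g].values()], dtype=float)

vec = to_feature_vector(user)
```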
In one possible implementation of the present application,
The processing module 801 is further configured to aggregate the user characteristics of the first target node and the user characteristics of the neighboring node corresponding to the first target node to obtain a first aggregate vector, where the first aggregate vector includes:
f′(v) = ½ · ( f(v) + (1/|N_v|) · Σ_{p∈N_v} w(v, p) · f(p) )

wherein v is the first target node, p is a neighbor node corresponding to the first target node, N_v comprises one or more nodes having a neighbour relation to the first target node, p belongs to N_v, f(v) is the user characteristic of the first target node, f(p) is the user characteristic of the neighbor node corresponding to the first target node, w(v, p) is the first parameter to be learned in the graph neural network model, and f′(v) is the first aggregate vector.
In one possible implementation of the present invention,
the processing module 801 is further configured to aggregate the user characteristics of the second target node and the user characteristics of the neighboring node corresponding to the second target node to obtain a second aggregate vector, where the processing module includes:
f′(u) = ½ · ( f(u) + (1/|N_u|) · Σ_{p′∈N_u} w(u, p′) · f(p′) )

wherein u is the second target node, p′ is a neighbor node corresponding to the second target node, N_u comprises one or more nodes having a neighbour relation to the second target node, p′ belongs to N_u, f(u) is the user characteristic of the second target node, f(p′) is the user characteristic of the neighbor node corresponding to the second target node, w(u, p′) is the first parameter to be learned in the graph neural network model, and f′(u) is the second aggregate vector.
In one possible implementation of the present invention,
the processing module 801 is further configured to combine the first aggregate vector and the second aggregate vector to obtain a first combined vector, and includes:
g(v,u)=(f′(v),f′(u));
where g (v, u) is the first combined vector, f '(v) is the first aggregate vector, and f' (u) is the second aggregate vector.
In one possible implementation of the present invention,
the processing module 801 is further configured to determine, according to the first combining vector, a target loss function, including:
Loss = − Σ_{(v,u)} [ ŷ(v, u) · log σ(W · g(v, u) + b) + (1 − ŷ(v, u)) · log(1 − σ(W · g(v, u) + b)) ]

wherein Loss is the target loss function, W is a second parameter to be learned, g(v, u) is the first merged vector, b is a bias parameter in the logistic regression model, σ is the Sigmoid function, and ŷ(v, u) indicates whether the user corresponding to the first target node and the user corresponding to the second target node have a friend relationship.
In one possible implementation of the present invention,
the processing module 801 is further configured to process the user characteristics of the first user and the user characteristics of the second user by using a graph neural network model, and obtain a friend probability, where the processing module includes:
let the node corresponding to the first user be v and the node corresponding to the second user be u; the processing module 801 then obtains the friend probability of the first user and the second user as follows:

θ(v, u) = σ(W · g(v, u) + b)

wherein θ(v, u) is the friend probability of the first user and the second user.
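A minimal sketch of this inference step, assuming g(v, u) is the concatenation of the two aggregate vectors and that the trained logistic-regression parameters W and b are reused to score a user pair; all concrete values are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def friend_probability(f_v_agg, f_u_agg, W, b):
    """Score one user pair: theta(v, u) = sigma(W . g(v, u) + b)."""
    g = np.concatenate([f_v_agg, f_u_agg])  # merged vector g(v, u)
    return sigmoid(np.dot(W, g) + b)

theta = friend_probability(np.array([1.0, 0.0]), np.array([0.0, 1.0]),
                           W=np.array([0.5, -0.5, -0.5, 0.5]), b=0.0)
```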
In one possible implementation of the present application,
the training sample of the graph neural network model further comprises: user characteristics of users corresponding to a third neighbor node set, where the third neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the third neighbor node set have a friend relationship with the users corresponding to the second neighbor nodes.
Fig. 9 is a schematic diagram of a server structure provided in an embodiment of the present application, where the server 700 may vary considerably in configuration or performance, and may include one or more central processing units (central processing units, CPU) 722 (e.g., one or more processors) and memory 732, one or more storage media 730 (e.g., one or more mass storage devices) storing applications 742 or data 744. Wherein memory 732 and storage medium 730 may be transitory or persistent. The program stored in the storage medium 730 may include one or more modules (not shown), each of which may include a series of instruction operations on a server. Still further, the central processor 722 may be configured to communicate with the storage medium 730 and execute a series of instruction operations on the server 700 in the storage medium 730.
The server 700 may also include one or more power supplies 726, one or more wired or wireless network interfaces 750, one or more input/output interfaces 758, and/or one or more operating systems 741, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
The steps performed by the server in the above embodiments may be based on the server structure shown in fig. 9.
Fig. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present application, as shown in fig. 10, for convenience of explanation, only a portion related to the embodiment of the present application is shown, and specific technical details are not disclosed, please refer to a method portion according to an embodiment of the present application. The terminal device is also called a user terminal, and the terminal device may be any terminal device including a mobile phone, a tablet computer, a personal digital assistant (Personal Digital Assistant, PDA), a Point of Sales (POS), a vehicle computer, etc., and the user terminal includes, but is not limited to, a mobile phone, a computer, an intelligent voice interaction device, an intelligent home appliance, a vehicle terminal, an aircraft, etc. Taking a terminal device as a mobile phone as an example:
fig. 10 is a block diagram showing a part of the structure of a mobile phone related to a terminal device provided by an embodiment of the present application. Referring to fig. 10, the mobile phone includes: radio Frequency (RF) circuitry 810, memory 820, input unit 830, display unit 840, sensor 850, audio circuitry 860, wireless fidelity (wireless fidelity, wiFi) module 870, processor 880, power supply 890, and the like. It will be appreciated by those skilled in the art that the handset construction shown in fig. 10 is not limiting of the handset and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
The following describes the components of the mobile phone in detail with reference to fig. 10:
the RF circuit 810 may be used for receiving and transmitting signals during a message or a call; in particular, after receiving downlink information of a base station, it forwards the information to the processor 880 for processing; in addition, it transmits uplink data to the base station. Typically, the RF circuitry 810 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (Low Noise Amplifier, LNA), a duplexer, and the like. In addition, the RF circuitry 810 may also communicate with networks and other devices via wireless communications. The wireless communications may use any communication standard or protocol, including, but not limited to, global system for mobile communications (Global System of Mobile communication, GSM), general packet radio service (General Packet Radio Service, GPRS), code division multiple access (Code Division Multiple Access, CDMA), wideband code division multiple access (Wideband Code Division Multiple Access, WCDMA), long term evolution (Long Term Evolution, LTE), email, short message service (Short Messaging Service, SMS), and the like.
The memory 820 may be used to store software programs and modules, and the processor 880 performs various functional applications and data processing of the cellular phone by executing the software programs and modules stored in the memory 820. The memory 820 may mainly include a storage program area that may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and a storage data area; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, memory 820 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The input unit 830 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function controls of the handset. In particular, the input unit 830 may include a touch panel 831 and other input devices 832. The touch panel 831, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on the touch panel 831 or thereabout using any suitable object or accessory such as a finger, stylus, etc.), and actuate the corresponding connection device according to a predetermined program. Alternatively, the touch panel 831 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device and converts it into touch point coordinates, which are then sent to the processor 880 and can receive commands from the processor 880 and execute them. In addition, the touch panel 831 may be implemented in various types of resistive, capacitive, infrared, surface acoustic wave, and the like. The input unit 830 may include other input devices 832 in addition to the touch panel 831. In particular, other input devices 832 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, mouse, joystick, etc.
The display unit 840 may be used to display information input by a user or information provided to the user and various menus of the mobile phone. The display unit 840 may include a display panel 841; optionally, the display panel 841 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like. Further, the touch panel 831 may overlay the display panel 841; when the touch panel 831 detects a touch operation on or near it, the touch operation is transferred to the processor 880 to determine the type of touch event, and the processor 880 then provides a corresponding visual output on the display panel 841 according to the type of touch event. Although in fig. 10 the touch panel 831 and the display panel 841 are implemented as two separate components to implement the input and output functions of the mobile phone, in some embodiments the touch panel 831 and the display panel 841 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 850, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 841 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 841 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and direction when stationary, and can be used for applications of recognizing the gesture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and knocking), and the like; other sensors such as gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc. that may also be configured with the handset are not described in detail herein.
Audio circuitry 860, speaker 861, microphone 862 may provide an audio interface between the user and the handset. The audio circuit 860 may transmit the received electrical signal converted from audio data to the speaker 861, and the electrical signal is converted into a sound signal by the speaker 861 to be output; on the other hand, microphone 862 converts the collected sound signals into electrical signals, which are received by audio circuit 860 and converted into audio data, which are processed by audio data output processor 880 for transmission to, for example, another cell phone via RF circuit 810, or which are output to memory 820 for further processing.
WiFi belongs to a short-distance wireless transmission technology, and the mobile phone can help a user to send and receive emails, browse webpages, access streaming media, and the like through the WiFi module 870, which provides wireless broadband Internet access for the user. Although fig. 10 shows a WiFi module 870, it is understood that it is not an essential component of the handset and may be omitted as needed without changing the essence of the invention.
The processor 880 is the control center of the mobile phone; it connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the mobile phone and processes data by running or executing software programs and/or modules stored in the memory 820 and calling data stored in the memory 820, thereby monitoring the mobile phone as a whole. Optionally, the processor 880 may include one or more processing units; optionally, the processor 880 may integrate an application processor that primarily handles the operating system, user interfaces, and applications with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may alternatively not be integrated into the processor 880.
The handset further includes a power supply 890 (e.g., a battery) for powering the various components, optionally in logical communication with the processor 880 through a power management system, as well as performing functions such as managing charge, discharge, and power consumption by the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which will not be described herein.
The steps performed by the terminal device in the above-described embodiments may be based on the terminal device structure shown in fig. 10.
Embodiments of the present application also provide a computer-readable storage medium having a computer program stored therein, which when run on a computer, causes the computer to perform the method as described in the foregoing embodiments.
Embodiments of the present application also provide a computer program product comprising a program which, when run on a computer, causes the computer to perform the method described in the previous embodiments.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (15)

1. A method of user recommendation, comprising:
acquiring user characteristics of a first user and user characteristics of a second user, wherein the user characteristics comprise one or more of the following: user active features, user payment features, or user social features;
processing the user characteristics of the first user and the user characteristics of the second user by adopting a graph neural network model to obtain a friend probability, wherein the friend probability indicates the probability that the second user and the first user form a friend relationship, and a training sample of the graph neural network model comprises: user characteristics of users corresponding to a first neighbor node set, wherein the first neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the first neighbor node set have a friend relationship with a user corresponding to a target node;
and determining a target user recommended to the first user according to one or more friend probabilities, wherein the target user belongs to the second user.
2. The method of claim 1, wherein the training sample of the graph neural network model further comprises: the second neighbor node set corresponds to user characteristics of users, the second neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the second neighbor node set have friend relations with the users corresponding to the first neighbor nodes.
3. The method of claim 2, wherein the second neighbor node set comprises a first neighbor node subset, the users corresponding to the neighbor nodes in the first neighbor node subset have no friend relationship with the user corresponding to the target node, and the user characteristics of the users corresponding to the neighbor nodes in the first neighbor node subset are used as negative training samples of the graph neural network.
4. The method according to any of claims 2-3, wherein the second neighbor node set comprises a second neighbor node subset, and the users corresponding to the neighbor nodes in the second neighbor node subset have a friend relationship with the user corresponding to the target node;
the positive training samples of the graph neural network include: the user features of the users corresponding to the neighbor nodes in the first neighbor node set, and the user features of the users corresponding to the neighbor nodes in the second neighbor node subset.
5. The method according to any one of claims 1-4, further comprising:
determining user characteristics of a first target node in a training sample of the graph neural network model and user characteristics of the neighbor nodes corresponding to the first target node;
performing aggregation processing on the user characteristics of the first target node and the user characteristics of the neighbor nodes corresponding to the first target node to obtain a first aggregation vector;
determining user characteristics of a second target node in a training sample of the graph neural network model and user characteristics of the neighbor nodes corresponding to the second target node;
performing aggregation processing on the user characteristics of the second target node and the user characteristics of the neighbor nodes corresponding to the second target node to obtain a second aggregation vector;
combining the first aggregate vector and the second aggregate vector to obtain a first combined vector;
determining a target loss function according to the first merging vector;
and optimizing parameters of the graph neural network to be trained until the target loss function converges, so as to obtain the trained graph neural network.
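The training procedure of claim 5 — aggregate each target node with its neighbors, merge the two aggregation vectors, compute a loss, and update parameters until convergence — can be sketched as one gradient step. This is a minimal sketch under assumed choices (mean aggregation, concatenation as the merge, a linear scorer with sigmoid, binary cross-entropy as the target loss); the patent does not fix any of these.

```python
import math

def aggregate(node_feat, neighbor_feats):
    """Mean-aggregate a node's features with its neighbors' features."""
    rows = [node_feat] + neighbor_feats
    dim = len(node_feat)
    return [sum(r[d] for r in rows) / len(rows) for d in range(dim)]

def training_step(w, sample, lr=0.1):
    """One optimization step on a merged pair of aggregation vectors.

    sample = (feats1, neigh1, feats2, neigh2, label): features of the
    first and second target nodes, their neighbors' features, and a
    1/0 label for friend / not-friend.
    """
    f1, n1, f2, n2, y = sample
    merged = aggregate(f1, n1) + aggregate(f2, n2)   # first merge vector
    z = sum(wi * xi for wi, xi in zip(w, merged))
    p = 1.0 / (1.0 + math.exp(-z))
    loss = -(y * math.log(p + 1e-12) + (1 - y) * math.log(1 - p + 1e-12))
    grad = p - y                                     # dL/dz for sigmoid + BCE
    new_w = [wi - lr * grad * xi for wi, xi in zip(w, merged)]
    return new_w, loss
```

Repeating the step on the same sample should drive the loss down, which is the convergence criterion the claim describes.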
6. The method of claim 5, wherein
the performing aggregation processing on the user characteristics of the first target node and the user characteristics of the neighbor nodes corresponding to the first target node to obtain the first aggregation vector comprises:
normalizing the user characteristics of the first target node and the user characteristics of the neighbor nodes corresponding to the first target node, to obtain normalized user characteristics of the first target node and normalized user characteristics of the neighbor nodes corresponding to the first target node;
and performing aggregation processing on the normalized user characteristics of the first target node and the normalized user characteristics of the neighbor nodes corresponding to the first target node, to obtain the first aggregation vector;
the performing aggregation processing on the user characteristics of the second target node and the user characteristics of the neighbor nodes corresponding to the second target node to obtain the second aggregation vector comprises:
normalizing the user characteristics of the second target node and the user characteristics of the neighbor nodes corresponding to the second target node, to obtain normalized user characteristics of the second target node and normalized user characteristics of the neighbor nodes corresponding to the second target node;
and performing aggregation processing on the normalized user characteristics of the second target node and the normalized user characteristics of the neighbor nodes corresponding to the second target node, to obtain the second aggregation vector.
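Claim 6's normalize-then-aggregate ordering can be sketched as follows. L2 normalization and mean aggregation are assumed choices for the sketch; the claim names neither. Normalizing first matters because raw feature groups live on very different scales (payment amounts versus login counts), and unnormalized aggregation would let the largest scale dominate.

```python
import math

def l2_normalize(vec):
    """Scale a feature vector to unit length so differently scaled
    features contribute comparably to the aggregation."""
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec] if norm > 0 else list(vec)

def normalized_aggregate(node_feat, neighbor_feats):
    """Normalize the node's and neighbors' features first, then
    mean-aggregate them into one aggregation vector (claim 6's order)."""
    rows = [l2_normalize(node_feat)] + [l2_normalize(n) for n in neighbor_feats]
    dim = len(node_feat)
    return [sum(r[d] for r in rows) / len(rows) for d in range(dim)]
```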
7. The method of any of claims 1-6, wherein the user characteristic further comprises:
a user gaming feature, the user gaming feature comprising one or more of: game level, game segment, game play count, game play odds, or game scene preference.
8. The method of any of claims 1-7, wherein the user-active features include one or more of: login times, login time, sign-in times, or online time.
9. The method of any of claims 1-8, wherein the user payment feature comprises one or more of: the number of payments, the total amount paid, the amount paid over a period of time, or the maximum amount paid.
10. The method of any of claims 1-9, wherein the user social features include one or more of: chat times, chat duration, praise times, team formation duration, gift times, or gift amount.
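The four feature groups of claims 1 and 7-10 can be flattened into one fixed-length input vector per user. The field names below are illustrative, not taken from the patent; missing fields default to 0 so every user maps to the same vector layout.

```python
def build_user_features(user):
    """Flatten the active / game / payment / social feature groups
    into a single fixed-length feature vector."""
    layout = {
        "active": ["login_count", "checkin_count", "online_minutes"],
        "game": ["level", "rank", "matches", "win_rate"],
        "payment": ["pay_count", "pay_total", "pay_max"],
        "social": ["chat_count", "likes", "team_minutes"],
    }
    vec = []
    for group, fields in layout.items():
        data = user.get(group, {})
        # Absent groups or fields become 0.0, keeping the layout fixed.
        vec.extend(float(data.get(f, 0)) for f in fields)
    return vec
```

A fixed layout like this is what lets the model compare any first user against any second user with the same network weights.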
11. The method according to any one of claims 2-10, wherein the training sample of the graph neural network model further comprises: user characteristics of users corresponding to a third neighbor node set, wherein the third neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the third neighbor node set have a friend relationship with the users corresponding to the nodes in the second neighbor node set.
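Claims 1, 2, and 11 extend the sampled neighborhood hop by hop: the first set is the target's friends, the second the friends of those, the third the friends of the second. This can be collected generically for k hops; as in the claims' literal definitions, a node may appear in more than one hop set, and the friend graph is assumed to be an adjacency dict.

```python
def k_hop_neighbors(graph, target, k):
    """Collect the 1st..k-th neighbor node sets of the target.

    graph: dict mapping each user to the set of that user's friends.
    Returns a list of sets: result[0] is the first neighbor node set,
    result[1] the second, and so on. The target itself is excluded,
    but overlap between hop sets is kept.
    """
    sets, frontier = [], {target}
    for _ in range(k):
        frontier = set().union(*(graph.get(u, set()) for u in frontier))
        frontier.discard(target)
        sets.append(frontier)
    return sets
```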
12. A user recommendation device, comprising:
a transceiver module, configured to acquire user characteristics of a first user and user characteristics of a second user, wherein the user characteristics comprise one or more of the following: user active features, user gaming features, user payment features, or user social features;
a processing module, configured to process the user characteristics of the first user and the user characteristics of the second user by using a graph neural network model to obtain a friend probability, wherein the friend probability indicates the probability that the second user and the first user form a friend relationship, and a training sample of the graph neural network model comprises: user characteristics of users corresponding to a first neighbor node set, wherein the first neighbor node set comprises one or more neighbor nodes, and the users corresponding to the neighbor nodes in the first neighbor node set have a friend relationship with the user corresponding to a target node;
wherein the processing module is further configured to determine, according to one or more friend probabilities, the second user recommended to the first user.
13. A computer device, comprising: a memory, a processor, and a bus system;
wherein the memory is used for storing a program;
the processor is configured to execute the program in the memory, so as to perform the method of any one of claims 1 to 11 according to instructions in the program code;
the bus system is used for connecting the memory and the processor so as to enable the memory and the processor to communicate.
14. A computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1 to 11.
15. A computer program product comprising a computer program and instructions which, when executed by a processor, implement the method of any one of claims 1 to 11.
CN202210192040.5A 2022-02-28 2022-02-28 User recommendation method and related device Pending CN116712736A (en)

Priority Applications (1)

Application Number: CN202210192040.5A; Priority/Filing Date: 2022-02-28; Title: User recommendation method and related device

Publications (1)

Publication Number: CN116712736A; Publication Date: 2023-09-08

Family ID: 87873920


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination