CN113159893B - Message pushing method and device based on gate control graph neural network and computer equipment

Message pushing method and device based on gate control graph neural network and computer equipment

Info

Publication number
CN113159893B
CN113159893B (granted from application CN202110452588.4A)
Authority
CN
China
Prior art keywords
product
customer
client
neural network
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110452588.4A
Other languages
Chinese (zh)
Other versions
CN113159893A (en)
Inventor
李雷来
王健宗
瞿晓阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202110452588.4A priority Critical patent/CN113159893B/en
Publication of CN113159893A publication Critical patent/CN113159893A/en
Application granted granted Critical
Publication of CN113159893B publication Critical patent/CN113159893B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0631 Item recommendations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Biophysics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Strategic Management (AREA)
  • Biomedical Technology (AREA)
  • Development Economics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a message pushing method and device based on a gated graph neural network, and computer equipment. The method comprises the following steps: constructing a bipartite graph taking customers and products as nodes, wherein the purchase relationships between the customers and the products form the edges of the bipartite graph; inputting the bipartite graph into a gated graph neural network model to obtain, for each customer, the purchase probability of purchasing the products connected with that customer's neighboring customers; acquiring, based on a proximity algorithm, a customer matched with the customer to be pushed from the bipartite graph as a target customer; obtaining the target products connected with the target customer's neighboring customers from the bipartite graph; and determining a product to be pushed based on the purchase probability of the target customer purchasing the target products, and pushing the message. The method is based on graph neural network technology: by constructing a bipartite graph of customers and products and applying a graph neural network with gating units, higher-order information between a customer and the products purchased by neighboring customers is obtained from the bipartite graph, so that more accurate recommendations are made to the user.

Description

Message pushing method and device based on gate control graph neural network and computer equipment
Technical Field
The present invention relates to graph neural network technology, and in particular to a message pushing method and apparatus based on a gated graph neural network, and a computer device.
Background
In recent years, in order to provide accurate personalized services for customers, e-commerce platforms have matched users with items by modeling users' historical behaviors, helping users select information of interest from massive data. In the prior art, an e-commerce platform generally recommends products to a user by means of a graph neural network, but only uses the low-order information in the graph composed of products and users for personalized recommendation, so the accuracy and efficiency of product recommendation are low.
Disclosure of Invention
The embodiments of the present invention provide a message pushing method and device based on a gated graph neural network, and computer equipment, aiming to solve the problems of low accuracy and low efficiency when recommending products to users in the prior art.
In a first aspect, an embodiment of the present invention provides a message pushing method based on a gated graph neural network, including:
constructing a bipartite graph taking customers and products as nodes, wherein the purchase relationships between the customers and the products form the edges of the bipartite graph;
inputting the bipartite graph into a preset gated graph neural network model to obtain, for each customer in the bipartite graph, the purchase probability of purchasing the products connected with that customer's neighboring customers;
acquiring, based on a proximity algorithm, a customer matched with a customer to be pushed from the bipartite graph as a target customer;
obtaining, from the bipartite graph according to the target customer, a plurality of products connected with the target customer's neighboring customers as target products;
and determining a product to be pushed based on the purchase probability of the target customer purchasing the target products, and pushing the product to be pushed to the user terminal corresponding to the customer to be pushed in the form of a message.
In a second aspect, an embodiment of the present invention provides a message pushing device based on a gated graph neural network, including:
a first construction unit configured to construct a bipartite graph taking customers and products as nodes, wherein the purchase relationships between the customers and the products form the edges of the bipartite graph;
a first input unit configured to input the bipartite graph into a preset gated graph neural network model to obtain, for each customer in the bipartite graph, the purchase probability of purchasing the products connected with that customer's neighboring customers;
a first acquisition unit configured to acquire, based on a proximity algorithm, a customer matched with the customer to be pushed from the bipartite graph as a target customer;
a second acquisition unit configured to obtain, from the bipartite graph according to the target customer, a plurality of products connected with the target customer's neighboring customers as target products;
and a pushing unit configured to determine a product to be pushed based on the purchase probability of the target customer purchasing the target products and push the product to be pushed to the user terminal corresponding to the customer to be pushed in the form of a message.
In a third aspect, an embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the message pushing method based on the gated graph neural network according to the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer readable storage medium, where the computer readable storage medium stores a computer program, where the computer program when executed by a processor causes the processor to perform the message pushing method based on the gated graph neural network according to the first aspect.
The embodiments of the present invention provide a message pushing method and device based on a gated graph neural network, and computer equipment, wherein the method includes: constructing a bipartite graph taking customers and products as nodes, wherein the purchase relationships between the customers and the products form the edges of the bipartite graph; inputting the bipartite graph into a preset gated graph neural network model to obtain, for each customer in the bipartite graph, the purchase probability of purchasing the products connected with that customer's neighboring customers; then acquiring, based on a proximity algorithm, a customer matched with the customer to be pushed from the bipartite graph as a target customer, and obtaining, from the bipartite graph, a plurality of products connected with the target customer's neighboring customers as target products; and finally determining a product to be pushed based on the purchase probability of the target customer purchasing the target products, and pushing the product to be pushed to the user terminal corresponding to the customer to be pushed in the form of a message. By constructing a bipartite graph of customers and products and applying a graph neural network with gating units, the method obtains higher-order information between a customer and the products purchased by neighboring customers from the bipartite graph, thereby achieving more accurate recommendations for the customer and improving the success rate of recommendation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
Fig. 1 is a schematic flow chart of a message pushing method based on a gated graph neural network according to an embodiment of the present invention;
Fig. 2 is a schematic sub-flowchart of a message pushing method based on a gated graph neural network according to an embodiment of the present invention;
Fig. 3 is another schematic sub-flowchart of a message pushing method based on a gated graph neural network according to an embodiment of the present invention;
Fig. 4 is another schematic sub-flowchart of a message pushing method based on a gated graph neural network according to an embodiment of the present invention;
Fig. 5 is another schematic sub-flowchart of a message pushing method based on a gated graph neural network according to an embodiment of the present invention;
Fig. 6 is another schematic sub-flowchart of a message pushing method based on a gated graph neural network according to an embodiment of the present invention;
Fig. 7 is another schematic sub-flowchart of a message pushing method based on a gated graph neural network according to an embodiment of the present invention;
Fig. 8 is a schematic block diagram of a message pushing device based on a gated graph neural network according to an embodiment of the present invention;
Fig. 9 is a schematic block diagram of a subunit of a message pushing device based on a gated graph neural network according to an embodiment of the present invention;
Fig. 10 is a schematic block diagram of another subunit of a message pushing device based on a gated graph neural network according to an embodiment of the present invention;
Fig. 11 is a schematic block diagram of another subunit of a message pushing device based on a gated graph neural network according to an embodiment of the present invention;
Fig. 12 is a schematic block diagram of another subunit of a message pushing device based on a gated graph neural network according to an embodiment of the present invention;
Fig. 13 is a schematic block diagram of another subunit of a message pushing device based on a gated graph neural network according to an embodiment of the present invention;
Fig. 14 is a schematic block diagram of another subunit of a message pushing device based on a gated graph neural network according to an embodiment of the present invention;
Fig. 15 is a schematic block diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Referring to fig. 1, fig. 1 is a schematic flow chart of a message pushing method based on a gated graph neural network according to an embodiment of the present invention. The message pushing method based on the gated graph neural network is applied to a server, and the method is executed through application software installed in the server. The message pushing method based on the gated graph neural network is described in detail below.
As shown in fig. 1, the method includes the following steps S110 to S150.
S110, constructing a bipartite graph taking a customer and a product as nodes, wherein the purchase relation of the customer and the product forms an edge of the bipartite graph.
Specifically, a customer is a user who has purchased a product. The nodes in the bipartite graph are the customers and the products, and the edges of the bipartite graph are formed by the purchase relationships between customers and products and are undirected. By constructing a bipartite graph whose nodes are customers and products and whose edges are the purchase relationships between them, and inputting the constructed bipartite graph into a preset gated graph neural network model, potential high-order information between customers and products can be obtained, which greatly improves the accuracy of message pushing when recommending products to a user.
In another embodiment, as shown in fig. 2, step S110 includes sub-steps S111, S112, and S113.
S111, acquiring customer information, product information and purchase information of the customer for purchasing the product, wherein the customer information and the product information are required for constructing the bipartite graph.
Specifically, the customer information includes attributes such as the customer's gender, age, occupation, income and assets; the product information includes attributes such as the payment amount involved when a customer purchases the product; and the purchase information is the connection information between a customer and a product, i.e., the record that the customer has purchased the product. In this embodiment, the customer information and product information of all customers are acquired from the existing business system, and the bipartite graph can then be constructed from the customer information, the product information and the purchase information.
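As a purely illustrative sketch of the three kinds of records described above (the patent does not prescribe a storage format; pandas and all field names below are assumptions):

```python
import pandas as pd

# Customer attributes (gender, age, occupation, income, assets, ...) -- field names are illustrative.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "gender": ["F", "M", "F"],
    "age": [34, 41, 29],
    "income": [12000.0, 8500.0, 15300.0],
})

# Product attributes, e.g. the payment amount involved -- again illustrative.
products = pd.DataFrame({
    "product_id": [101, 102],
    "payment_amount": [499.0, 1299.0],
})

# Purchase records: each row links one customer to one product and becomes an edge of the bipartite graph.
purchases = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "product_id": [101, 102, 101, 102],
})
```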
S112, preprocessing the client information and the product information respectively to obtain preprocessed client information and preprocessed product information.
Specifically, the collected customer information and product information contain a large amount of disordered, duplicated and incomplete data, which seriously affects the execution efficiency of the data mining algorithm and may bias the mining results. Therefore, the customer information and the product information need to be preprocessed so that a bipartite graph whose nodes are the customers and the products can be conveniently constructed.
In another embodiment, as shown in FIG. 3, step S112 includes sub-steps S1121 and S1122.
S1121, discretizing the customer information and the product information respectively to obtain discretized customer information and discretized product information.
Specifically, the customer information and product information acquired from the existing business system are usually continuous feature information, so they need to be discretized so that the subsequent gated graph neural network model can rapidly and iteratively process the graph embedding information in the bipartite graph. In this embodiment, the discretization of the customer information and the product information is implemented by data binning. Data binning is a data preprocessing technique that reduces the effect of minor observation errors by grouping many continuous values into a smaller number of "bins".
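A minimal sketch of the binning step, assuming pandas; the attribute values and bin boundaries are illustrative and not given in the patent:

```python
import pandas as pd

# Hypothetical continuous customer attributes (field names are illustrative).
customers = pd.DataFrame({"age": [34, 41, 29, 58], "income": [12000.0, 8500.0, 15300.0, 6100.0]})

# Equal-width binning: group ages into a small number of discrete "bins".
customers["age_bin"] = pd.cut(customers["age"], bins=[0, 30, 45, 60, 120],
                              labels=["young", "middle", "senior", "elder"])

# Quantile binning for a skewed attribute such as income.
customers["income_bin"] = pd.qcut(customers["income"], q=2, labels=["low", "high"])
```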
And S1122, respectively performing characteristic embedding on the discretized customer information and the discretized product information to obtain the preprocessed customer information and the preprocessed product information.
Specifically, feature embedding converts the data into feature representations (vectors) of fixed size to facilitate processing and computation, i.e., dimensionality reduction is performed on the discretized customer information and the discretized product information. In this embodiment, after the discretized customer information and discretized product information are obtained, One-Hot encoding is first applied to them so that the distances between features become more reasonable; then PCA (Principal Component Analysis) is used to obtain the feature-embedded customer information and product information from the One-Hot encoded customer information and product information. One-Hot encoding is the process of converting categorical variables into a form that is easy for machine learning algorithms to use, and PCA transforms the data into a new coordinate system through a linear transformation so as to reduce the dimensionality of the data, thereby realizing feature embedding of the discretized customer information and the discretized product information.
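The encoding-then-projection step might look like the following sketch, assuming scikit-learn; the bin labels and the number of retained principal components are illustrative assumptions:

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder
from sklearn.decomposition import PCA

# Discretized (binned) customer attributes -- purely illustrative values.
binned = np.array([["young", "low"], ["middle", "high"], ["senior", "high"], ["young", "high"]])

# One-Hot encoding turns the categorical bins into indicator features.
onehot = OneHotEncoder().fit_transform(binned).toarray()

# PCA projects the one-hot vectors to a fixed, lower-dimensional embedding.
embedding = PCA(n_components=3).fit_transform(onehot)
print(embedding.shape)  # (4, 3): one fixed-size feature vector per customer
```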
S113, constructing the bipartite graph according to the preprocessed customer information, the preprocessed product information and the purchasing information.
In this embodiment, the bipartite graph is represented by an adjacency matrix, where the entries of the adjacency matrix represent the association relationship between customers and products: if an entry of the adjacency matrix is 1, the customer has purchased the product, and if it is 0, the customer has not purchased the product. After the bipartite graph is constructed, it is input into the pre-trained gated graph neural network model to obtain the probability of each customer in the bipartite graph purchasing the products connected with neighboring customers, and products can then be recommended to the user according to these probabilities.
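A small sketch of building such an adjacency-matrix representation with NumPy, under the assumption that purchases are stored as (customer, product) pairs; the identifiers are illustrative:

```python
import numpy as np

# Hypothetical index maps for customers and products.
customer_ids = [1, 2, 3]
product_ids = [101, 102]
purchases = [(1, 101), (1, 102), (2, 101), (3, 102)]  # (customer_id, product_id) pairs

# Biadjacency matrix A: A[i, j] = 1 if customer i purchased product j, else 0.
A = np.zeros((len(customer_ids), len(product_ids)), dtype=np.int8)
for c, p in purchases:
    A[customer_ids.index(c), product_ids.index(p)] = 1

# The full (undirected) adjacency matrix stacks customers and products into one node set.
n_c, n_p = A.shape
adj = np.zeros((n_c + n_p, n_c + n_p), dtype=np.int8)
adj[:n_c, n_c:] = A
adj[n_c:, :n_c] = A.T
```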
S120, inputting the two graphs into a preset gating graph neural network model, and obtaining the purchase probability of each customer in the two graphs for purchasing the products connected with the adjacent customers.
Generally, a gated graph neural network is a spatial-domain message-passing model based on gated units; its characteristic is that a GRU network from the family of recurrent neural networks is used for information propagation, the GRU acting as a memory-bearing unit for transmitting information in the bipartite graph. In this embodiment, the distances between nodes in the bipartite graph are used as the propagation time sequence, and the bipartite graph is propagated through the gated graph neural network model to obtain the higher-order information between customers and the products connected with neighboring customers, from which the purchase probabilities of the customers for those products can be obtained; a customer and the products connected with neighboring customers are linked through those neighboring customers.
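The following is a minimal sketch of one gated propagation step in the spirit of a gated graph neural network, using a GRU cell as the memory-bearing update unit; PyTorch, the hidden dimension and the sum-over-neighbours aggregation are assumptions, not details fixed by the patent:

```python
import torch
import torch.nn as nn

class GatedPropagation(nn.Module):
    """One propagation step: aggregate neighbour messages, then update node states with a GRU cell."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.message = nn.Linear(hidden_dim, hidden_dim)  # edge-wise message transform
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)     # gated (update/reset) state update

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, hidden_dim) node states; adj: (num_nodes, num_nodes) adjacency of the bipartite graph.
        m = adj @ self.message(h)   # sum messages from neighbouring nodes
        return self.gru(m, h)       # GRU acts as the memory-bearing unit for propagation

# Toy usage: 5 nodes (3 customers + 2 products), 8-dimensional states.
h = torch.randn(5, 8)
adj = torch.tensor([[0, 0, 0, 1, 1], [0, 0, 0, 1, 0], [0, 0, 0, 0, 1],
                    [1, 1, 0, 0, 0], [1, 0, 1, 0, 0]], dtype=torch.float32)
h = GatedPropagation(8)(h, adj)
```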
In another embodiment, as shown in fig. 4, step S120 includes sub-steps S121 and S122.
S121, performing a convolution operation on the bipartite graph to obtain the feature vector of each node in the bipartite graph and the feature vectors of each node's adjacent edges.
S122, inputting the feature vector of each node and the feature vectors of each node's adjacent edges into the gated graph neural network model based on an attention mechanism, to obtain the purchase probability of each customer in the bipartite graph for purchasing the products connected with neighboring customers.
In this embodiment, a three-layer graph convolutional neural network is used to perform the convolution operation on the bipartite graph, so as to obtain the feature vector at each node in the bipartite graph. When the three-layer graph convolutional neural network performs the convolution operation on the bipartite graph, the first graph convolutional layer performs the convolution operation on an arbitrary customer node in the bipartite graph, referred to as the first customer, on the first customer's neighboring node, which is a first product, and on the edge connecting the first customer with the first product, thereby obtaining the feature vector of the first customer, the feature vector of the first product and the feature vector of the edge connecting them. The second graph convolutional layer performs the convolution operation on a second customer, which is a neighboring node of the first product, and on the edge connecting the first product with the second customer, thereby obtaining the feature vector of the second customer and the feature vector of that edge. The third graph convolutional layer performs the convolution operation on a second product, which is a neighboring node of the second customer, and on the edge connecting the second customer with the second product, thereby obtaining the feature vector of the second product and the feature vector of that edge. The feature vector of the first customer, the feature vector of the first product, the feature vector of the edge connecting the first customer with the first product, the feature vector of the second customer, the feature vector of the edge connecting the first product with the second customer, the feature vector of the second product and the feature vector of the edge connecting the second customer with the second product are then input into the gated graph neural network model in time-sequence order, and prediction is performed on the resulting higher-order information, so as to obtain the purchase probability of each customer in the bipartite graph purchasing the products connected with neighboring customers. In the bipartite graph, the first customer and the second customer are connected through the first product, and the second product is connected with the first product through the second customer.
In another embodiment, as shown in fig. 5, step S122 includes sub-steps S1221, S1222, S1223, S1224, and S1225.
S1221, selecting an arbitrary customer node in the bipartite graph as a first customer, taking a product connected with the first customer as a first product, taking another customer connected with the first product as a second customer, and taking another product connected with the second customer as a second product.
S1222, inputting the feature vector of the first customer, the feature vector of the first product, and the feature vector of the edge connecting the first customer and the first product into the gated graph neural network model to obtain a first hidden state of the first customer.
S1223, inputting the first hidden state, the feature vector of the second customer and the feature vector of the edge connecting the second customer and the first product into the gated graph neural network model based on an attention mechanism to obtain a second hidden state of the first customer.
S1224, inputting the second hidden state, the feature vector of the second product, and the feature vector of the edge connecting the second customer and the second product into the gated graph neural network model based on an attention mechanism to obtain a third hidden state of the first customer.
In this embodiment, an arbitrary customer node in the bipartite graph is selected as the first customer, a product connected with the first customer is taken as the first product, another customer connected with the first product is taken as the second customer, and another product connected with the second customer is taken as the second product. The first hidden state is obtained by inputting the feature vector of the first customer, the feature vector of its neighboring node (the first product) and the feature vector of the first customer's adjacent edge into the gated graph neural network model; the feature vector of the first customer's adjacent edge is the feature vector of the edge connecting the first customer and the first product, i.e., the feature vector of their connection relationship. The update function of the first hidden state is:

h_1 = f(x_1, x_2, x_(1,2))

where h_1 denotes the first hidden state, x_1 denotes the feature vector of the first customer, x_2 denotes the feature vector of the first product, and x_(1,2) denotes the feature vector of the first customer's adjacent edge. After the first hidden state, the feature vector of the first product's neighboring node (the second customer) and the feature vector of the edge connecting the first product and the second customer are input into the gated graph neural network model with the attention mechanism, they are screened through the update gate and the reset gate of the gated neural network model, and the second hidden state is output; the third hidden state is output on the same principle as the second hidden state, so that the third hidden state is finally obtained.
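A compact sketch of the chained hidden-state updates described above (first, second and third hidden states of the first customer), with a GRU cell standing in for the gated update with reset and update gates; how the node and edge feature vectors are combined into one GRU input, and the simple attention weight, are assumptions:

```python
import torch
import torch.nn as nn

dim = 8
gru = nn.GRUCell(3 * dim, dim)   # gated update with reset/update gates
attn = nn.Linear(2 * dim, 1)     # simple attention weight over the incoming neighbour (assumption)

def step(h_prev, node_feat, edge_feat, ctx_feat):
    # Attention weight conditioned on the previous hidden state and the neighbour's features.
    a = torch.sigmoid(attn(torch.cat([h_prev, node_feat], dim=-1)))
    x = torch.cat([ctx_feat, a * node_feat, edge_feat], dim=-1)
    return gru(x, h_prev)

x1, x2, e12 = torch.randn(1, dim), torch.randn(1, dim), torch.randn(1, dim)  # first customer, first product, their edge
x3, e23 = torch.randn(1, dim), torch.randn(1, dim)                           # second customer, edge to the first product
x4, e34 = torch.randn(1, dim), torch.randn(1, dim)                           # second product, edge to the second customer

h1 = gru(torch.cat([x1, x2, e12], dim=-1), torch.zeros(1, dim))  # h_1 = f(x_1, x_2, x_(1,2))
h2 = step(h1, x3, e23, x2)                                       # second hidden state of the first customer
h3 = step(h2, x4, e34, x3)                                       # third hidden state of the first customer
```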
S1225, classifying the third hidden state according to a preset classifier to obtain the purchase probability of the first customer purchasing the second product.
Specifically, the classifier is configured to normalize the score of the third hidden state output by the gated graph neural network model; a Sigmoid function is used for the normalization so as to predict the probability of the first customer purchasing the second product, where the third hidden state is the higher-order information between the first customer and the second product. By classifying the third hidden state with the classifier, the purchase probability of the first customer purchasing the second product can be predicted. In this embodiment, a three-layer BP neural network is used as the classifier, with a hyperbolic tangent function as the activation function, and the calculation process is as follows:

hd_ij = tanh(w_11 × h_j + w_12 × s_{t-1} + b)

e_ij = hd_ij × w_21

where h_j is the feature vector sequence associated with the third hidden state, hd_ij is the third hidden state, s_{t-1} is the second hidden state, e_ij is the score, w_11 and w_12 are the second-layer weights of the three-layer BP neural network, b is a bias term, and w_21 is the third-layer weight of the three-layer BP neural network.
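These formulas could be realized roughly as follows; the sketch assumes PyTorch, treats w_11, w_12 and w_21 as learned linear layers, folds the bias term b into the second layer, and applies the Sigmoid normalization described above:

```python
import torch
import torch.nn as nn

class PurchaseClassifier(nn.Module):
    """Three-layer scoring network with tanh activation, followed by a Sigmoid yielding a purchase probability."""
    def __init__(self, dim: int):
        super().__init__()
        self.w11 = nn.Linear(dim, dim, bias=False)  # second-layer weight applied to h_j
        self.w12 = nn.Linear(dim, dim, bias=True)   # second-layer weight applied to s_{t-1}; bias term b folded in
        self.w21 = nn.Linear(dim, 1, bias=False)    # third-layer weight producing the score e_ij

    def forward(self, h_j: torch.Tensor, s_prev: torch.Tensor) -> torch.Tensor:
        hidden = torch.tanh(self.w11(h_j) + self.w12(s_prev))  # hd_ij
        score = self.w21(hidden)                                # e_ij
        return torch.sigmoid(score)                             # normalized purchase probability

# Toy usage with 8-dimensional hidden states.
clf = PurchaseClassifier(8)
prob = clf(torch.randn(1, 8), torch.randn(1, 8))
```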
In another embodiment, as shown in fig. 6, step S1225 includes sub-steps S12251 and S12252.
S12251, splicing the third hidden state and the feature vector of the first customer based on an attention mechanism to obtain a spliced feature vector.
S12252, classifying the spliced feature vector with the classifier to obtain the purchase probability of the first customer purchasing the second product.
In this embodiment, the third hidden state and the feature vector of the first customer are both represented as matrices. Weights are assigned to the third hidden state and to the feature vector of the first customer through an attention mechanism, the two are then concatenated horizontally to obtain the spliced feature vector, and finally a Sigmoid function is applied to normalize the spliced feature vector, yielding the purchase probability of the first customer purchasing the second product.
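A short sketch of this attention-weighted splicing followed by Sigmoid normalization; PyTorch is assumed, and the particular form of the attention weights is an assumption:

```python
import torch
import torch.nn as nn

dim = 8
attn = nn.Linear(2 * dim, 2)   # produces one weight for the hidden state and one for the customer vector (assumption)
out = nn.Linear(2 * dim, 1)    # classifier head over the spliced vector

h3, x1 = torch.randn(1, dim), torch.randn(1, dim)              # third hidden state, first customer's feature vector
w = torch.softmax(attn(torch.cat([h3, x1], dim=-1)), dim=-1)   # attention weights for the two parts
spliced = torch.cat([w[:, :1] * h3, w[:, 1:] * x1], dim=-1)    # horizontal concatenation of the weighted parts
prob = torch.sigmoid(out(spliced))                             # purchase probability of the first customer
```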
S130, acquiring a client matched with the client to be pushed from the bipartite graph based on a proximity algorithm as a target client.
Specifically, the proximity algorithm is also called the K-nearest neighbor (KNN) classification algorithm; its inputs are the test data and the training sample data set, and its outputs are the classes of the test samples. That is, the principle of the proximity algorithm is that, when predicting a new value x, the class of x is determined by the classes of its K nearest points. In this embodiment, the customer to be pushed is classified using the proximity algorithm, a target customer matched with the customer to be pushed is then obtained from the bipartite graph according to the classification result, and finally products are recommended to the customer to be pushed according to the target customer.
In another embodiment, as shown in fig. 7, step S130 includes sub-steps S131, S132, and S133.
S131, receiving a recommendation request of the client to be pushed and acquiring characteristic information of the client to be pushed according to the recommendation request.
Specifically, the recommendation request is the instruction information requesting message pushing for the customer to be pushed. The feature information of the customer to be pushed can be obtained through this instruction information; the customer to be pushed is then classified against the customers in the bipartite graph according to this feature information, and the target customer matched with the customer to be pushed can thus be obtained from the bipartite graph.
S132, acquiring a plurality of customers matched with the customer to be pushed from the bipartite graph according to the feature information of the customer to be pushed.
In this embodiment, a Euclidean distance calculation formula is applied to the attribute information of the customer to be pushed, matching it against each customer in the bipartite graph to obtain the distance between each customer in the bipartite graph and the customer to be pushed; the customers adjacent to the customer to be pushed are then obtained from these distances according to a preset threshold. The Euclidean distance calculation formula is as follows:

d = sqrt( Σ_n (x_n - y_n)² )

where x_n denotes an attribute of the customer to be pushed, y_n denotes the corresponding attribute of a customer in the bipartite graph, and d denotes the distance between that customer in the bipartite graph and the customer to be pushed. The preset threshold is used to judge the distance between a customer in the bipartite graph and the customer to be pushed: when the distance is larger than the threshold, the customer in the bipartite graph is judged to be far from the customer to be pushed; otherwise, the customer is adjacent to the customer to be pushed.
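A sketch of the threshold-based neighbour selection built on this Euclidean distance; NumPy is assumed, and the attribute values and threshold are illustrative:

```python
import numpy as np

def neighbouring_customers(pushed: np.ndarray, graph_customers: np.ndarray, threshold: float) -> np.ndarray:
    """Return indices of bipartite-graph customers whose Euclidean distance to the customer
    to be pushed is below the preset threshold (i.e. the candidate neighbours)."""
    dists = np.sqrt(((graph_customers - pushed) ** 2).sum(axis=1))
    return np.where(dists <= threshold)[0]

# Toy usage with 3 illustrative attributes per customer.
pushed = np.array([0.2, 1.5, 0.7])
graph_customers = np.array([[0.1, 1.4, 0.9], [2.3, 0.2, 1.8], [0.3, 1.6, 0.6]])
print(neighbouring_customers(pushed, graph_customers, threshold=0.5))  # -> [0 2]
```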
S133, obtaining the maximum likelihood value of the client to be pushed and obtaining the target client from the clients according to the maximum likelihood value of the client to be pushed.
In this embodiment, the maximum likelihood value of each of the plurality of customers is obtained by fitting a normal distribution function to the customers in the bipartite graph and obtaining its mean; the customer to be pushed is then classified through the Bayes formula to obtain the classification label of the customer to be pushed, and finally the target customer that best matches the customer to be pushed is obtained from the plurality of customers according to that classification label. The maximum likelihood value is obtained from the normal distribution function of the sample distribution of the customers in the bipartite graph through maximum likelihood estimation, a statistical method based on the maximum likelihood principle: a random experiment has several possible outcomes A, B, C, ...; if outcome A appears in one trial, it can be concluded that the experimental conditions favour A, that is, the probability P(A) is large.
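A rough sketch of this step, fitting per-class normal distributions by maximum likelihood and classifying the customer to be pushed with Bayes' rule; the class labels, attribute values and the diagonal-covariance simplification are assumptions:

```python
import numpy as np

# Hypothetical neighbouring customers from the bipartite graph, each with a class label.
features = np.array([[0.1, 1.4], [0.3, 1.6], [2.3, 0.2], [2.1, 0.4]])
labels = np.array([0, 0, 1, 1])
pushed = np.array([0.2, 1.5])   # feature vector of the customer to be pushed

def gaussian_loglik(x, mu, var):
    # Log-likelihood of x under an independent (diagonal) normal distribution.
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

posteriors = {}
for c in np.unique(labels):
    pts = features[labels == c]
    mu, var = pts.mean(axis=0), pts.var(axis=0) + 1e-6   # maximum-likelihood estimates of the normal distribution
    prior = np.log(len(pts) / len(features))
    posteriors[c] = prior + gaussian_loglik(pushed, mu, var)  # Bayes' rule, up to a shared constant

label = max(posteriors, key=posteriors.get)   # classification label of the customer to be pushed
```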
S140, acquiring a plurality of products connected with adjacent clients of the target client from the bipartite graph according to the target client as target products.
S150, determining a product to be pushed based on the purchase probability of the target customer purchasing the target product, and pushing the product to be pushed to a user terminal corresponding to the customer to be pushed in a message mode.
Specifically, the plurality of products are connected to the target customer in the bipartite graph through the target customer's neighboring customers, that is, the target customer has not yet purchased these products. After the target customer matched with the customer to be pushed is obtained, these products and the probabilities of the target customer purchasing them can be obtained, and the products with high probability can then be screened out according to a preset threshold as the products to be pushed, so as to make a Top-K recommendation to the customer to be pushed. The Top-K recommendation customizes the recommendation for the customer to be pushed based on the purchase probabilities of the screened products for the target customer, and at the same time improves the success rate of recommendation to the customer to be pushed.
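A minimal sketch of the threshold filtering and Top-K selection described above; the product names, probabilities, threshold and K are illustrative assumptions:

```python
def top_k_push(products, probabilities, threshold=0.5, k=3):
    """Keep products whose predicted purchase probability exceeds the preset threshold,
    then recommend the K most probable ones to the customer to be pushed."""
    candidates = [(p, prob) for p, prob in zip(products, probabilities) if prob > threshold]
    candidates.sort(key=lambda item: item[1], reverse=True)
    return [p for p, _ in candidates[:k]]

# Toy usage with hypothetical target products and purchase probabilities.
print(top_k_push(["fund_A", "insurance_B", "bond_C", "deposit_D"], [0.91, 0.42, 0.77, 0.63], k=2))
# -> ['fund_A', 'bond_C']
```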
In the message pushing method based on the gated graph neural network provided by the embodiment of the present invention, a bipartite graph taking customers and products as nodes is constructed, wherein the purchase relationships between the customers and the products form the edges of the bipartite graph; the bipartite graph is input into a preset gated graph neural network model to obtain, for each customer in the bipartite graph, the purchase probability of purchasing the products connected with neighboring customers; a customer matched with the customer to be pushed is acquired from the bipartite graph as a target customer based on a proximity algorithm; a plurality of products connected with the target customer's neighboring customers are obtained from the bipartite graph as target products; and a product to be pushed is determined based on the purchase probability of the target customer purchasing the target products and pushed to the user terminal corresponding to the customer to be pushed. By constructing the bipartite graph of customers and products and applying the graph neural network with gating units, the method obtains the potential higher-order information between customers and products from the bipartite graph, so that more accurate recommendations are made to the user and the success rate of recommendation is improved; at the same time, basic support is provided for the user to select suitable financial resources, improving both the success rate of recommendation and the user's trust in the company.
The embodiment of the present invention further provides a message pushing device 100 based on a gated graph neural network, which is used to execute any embodiment of the aforementioned message pushing method based on the gated graph neural network.
Specifically, referring to fig. 8, fig. 8 is a schematic block diagram of the message pushing device 100 based on a gated graph neural network according to an embodiment of the present invention.
As shown in fig. 8, the message pushing device 100 based on the gated graph neural network includes a first construction unit 110, a first input unit 120, a first obtaining unit 130, a second obtaining unit 140, and a pushing unit 150.
A first construction unit 110 for constructing a bipartite graph having a customer and a product as nodes, wherein a purchase relationship of the customer and the product constitutes an edge of the bipartite graph.
In other embodiments of the invention, as shown in fig. 9, the first construction unit 110 includes: a third acquisition unit 111, a preprocessing unit 112 and a second construction unit 113.
A third acquisition unit 111 for acquiring customer information, product information, and purchase information of the customer for purchasing the product, which are required for constructing the bipartite graph; a preprocessing unit 112, configured to preprocess the client information and the product information respectively, to obtain preprocessed client information and preprocessed product information; and a second construction unit 113, configured to construct the bipartite graph according to the preprocessed customer information, the preprocessed product information, and the purchase information.
In other embodiments of the invention, as shown in fig. 10, the preprocessing unit 112 includes: a discretization processing unit 1121, and a feature embedding unit 1122.
A discretization processing unit 1121, configured to perform discretization processing on the client information and the product information, respectively, to obtain discretized client information and discretized product information; and a feature embedding unit 1122, configured to perform feature embedding on the discretized customer information and the discretized product information, respectively, to obtain the preprocessed customer information and the preprocessed product information.
The first input unit 120 is configured to input the bipartite graph into a preset gated graph neural network model, so as to obtain a purchase probability of each customer in the bipartite graph for purchasing a product connected with an adjacent customer.
In other inventive embodiments, as shown in fig. 11, the first input unit 120 includes: a convolution unit 121 and a second input unit 122.
A convolution unit 121, configured to perform a convolution operation on the bipartite graph to obtain a feature vector of each node in the bipartite graph and a feature vector of an adjacent edge of each node; the second input unit 122 is configured to input the feature vector of each node and the feature vector of the adjacent side of each node into the gated graph neural network model based on the attention mechanism, so as to obtain a purchase probability of each customer in the bipartite graph for purchasing a product connected with the adjacent customer.
In other inventive embodiments, as shown in fig. 12, the second input unit 122 includes: a selection unit 1221, a third input unit 1222, a fourth input unit 1223, a fifth input unit 1224, and a first classification unit 1225.
A selecting unit 1221, configured to select an arbitrary customer node in the bipartite graph as a first customer, take a product connected to the first customer as a first product, take another customer connected to the first product as a second customer, and take another product connected to the second customer as a second product; a third input unit 1222, configured to input the feature vector of the first customer node in the bipartite graph, the feature vector of the first customer's neighboring node (the first product) and the feature vector of the first customer's adjacent edge into the gated graph neural network model to obtain a first hidden state of the first customer; a fourth input unit 1223, configured to input, based on an attention mechanism, the first hidden state, the feature vector of the second customer (a neighboring node of the first product) and the feature vector of the edge connecting the first product and the second customer into the gated graph neural network model to obtain a second hidden state of the first customer; a fifth input unit 1224, configured to input, based on an attention mechanism, the second hidden state, the feature vector of the second product (a neighboring node of the second customer) and the feature vector of the edge connecting the second customer and the second product into the gated graph neural network model to obtain a third hidden state of the first customer; and a first classification unit 1225, configured to classify the third hidden state according to a preset classifier to obtain the purchase probability of the first customer purchasing the second product.
In other embodiments of the invention, as shown in fig. 13, the first classification unit 1225 includes: a stitching unit 12251 and a second classification unit 12252.
A stitching unit 12251, configured to stitch the third hidden state and the feature vector of the first client based on an attention mechanism to obtain a stitched feature vector; and a second classification unit 12252, configured to classify the spliced feature vector according to the classifier, to obtain a purchase probability of the first customer purchasing the second product.
The first obtaining unit 130 is configured to obtain, from the bipartite graph, a client matching with the client to be pushed as a target client based on a proximity algorithm.
In other inventive embodiments, as shown in fig. 14, the first obtaining unit 130 includes: a receiving unit 131, a fourth acquiring unit 132, and a fifth acquiring unit 133.
The receiving unit 131 is configured to receive a recommendation request of the client to be pushed and obtain feature information of the client to be pushed according to the recommendation request; a fourth obtaining unit 132, configured to obtain, from the bipartite graph, a plurality of clients that are matched with the client to be pushed according to the feature information of the client to be pushed; and a fifth obtaining unit 133, configured to obtain the maximum likelihood value of the client to be pushed and obtain the target client from the plurality of clients according to the maximum likelihood value of the client to be pushed.
And a second obtaining unit 140, configured to obtain, from the bipartite graph, a plurality of products connected to adjacent clients of the target client as target products according to the target client.
And the pushing unit 150 is configured to determine a product to be pushed based on a purchase probability of the target customer purchasing the target product, and push the product to be pushed to a user terminal corresponding to the customer to be pushed in a message manner.
The message pushing device 100 based on the gated graph neural network provided by the embodiment of the present invention is used to execute the above method: constructing a bipartite graph taking customers and products as nodes, wherein the purchase relationships between the customers and the products form the edges of the bipartite graph; inputting the bipartite graph into a preset gated graph neural network model to obtain, for each customer in the bipartite graph, the purchase probability of purchasing the products connected with neighboring customers; acquiring, based on a proximity algorithm, a customer matched with the customer to be pushed from the bipartite graph as a target customer; obtaining, from the bipartite graph according to the target customer, a plurality of products connected with the target customer's neighboring customers as target products; and determining a product to be pushed based on the purchase probability of the target customer purchasing the target products, and pushing the product to be pushed to the user terminal corresponding to the customer to be pushed in the form of a message.
Referring to fig. 15, fig. 15 is a schematic block diagram of a computer device according to an embodiment of the present invention.
With reference to fig. 15, the device 500 includes a processor 502, a memory, and a network interface 505, which are connected by a system bus 501, wherein the memory may include a storage medium 503 and an internal memory 504.
The storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032, when executed, may cause the processor 502 to perform a message pushing method based on a gated graph neural network.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall device 500.
The internal memory 504 provides an environment for the execution of the computer program 5032 in the non-volatile storage medium 503; when executed by the processor 502, the computer program 5032 causes the processor 502 to perform a message pushing method based on a gated graph neural network.
The network interface 505 is used for network communication, such as providing for transmission of data information, etc. It will be appreciated by those skilled in the art that the structure shown in fig. 15 is merely a block diagram of a portion of the structure associated with the present inventive arrangements and is not limiting of the apparatus 500 to which the present inventive arrangements are applied, and that a particular apparatus 500 may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
The processor 502 is configured to execute the computer program 5032 stored in the memory to perform the following functions: constructing a bipartite graph taking customers and products as nodes, wherein the purchase relationships between the customers and the products form the edges of the bipartite graph; inputting the bipartite graph into a preset gated graph neural network model to obtain, for each customer in the bipartite graph, the purchase probability of purchasing the products connected with neighboring customers; acquiring, based on a proximity algorithm, a customer matched with the customer to be pushed from the bipartite graph as a target customer; obtaining, from the bipartite graph according to the target customer, a plurality of products connected with the target customer's neighboring customers as target products; and determining a product to be pushed based on the purchase probability of the target customer purchasing the target products, and pushing the product to be pushed to the user terminal corresponding to the customer to be pushed in the form of a message.
Those skilled in the art will appreciate that the embodiment of the apparatus 500 shown in fig. 15 is not limiting of the specific construction of the apparatus 500, and in other embodiments, the apparatus 500 may include more or less components than illustrated, or certain components may be combined, or a different arrangement of components. For example, in some embodiments, the device 500 may include only the memory and the processor 502, and in such embodiments, the structure and the function of the memory and the processor 502 are consistent with the embodiment shown in fig. 15, and will not be described herein.
It should be appreciated that in the embodiments of the invention, the processor 502 may be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
In another embodiment of the invention, a computer storage medium is provided. The storage medium may be a non-volatile computer-readable storage medium or a volatile storage medium. The storage medium stores a computer program 5032 which, when executed by the processor 502, performs the following steps: constructing a bipartite graph taking customers and products as nodes, wherein the purchase relationships between the customers and the products form the edges of the bipartite graph; inputting the bipartite graph into a preset gated graph neural network model to obtain, for each customer in the bipartite graph, the purchase probability of purchasing the products connected with neighboring customers; acquiring, based on a proximity algorithm, a customer matched with the customer to be pushed from the bipartite graph as a target customer; obtaining, from the bipartite graph according to the target customer, a plurality of products connected with the target customer's neighboring customers as target products; and determining a product to be pushed based on the purchase probability of the target customer purchasing the target products, and pushing the product to be pushed to the user terminal corresponding to the customer to be pushed in the form of a message.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus, device and unit described above may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein. Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied in electronic hardware, in computer software, or in a combination of the two, and that the elements and steps of the examples have been generally described in terms of function in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus, device and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the units is merely a logical function division, there may be another division manner in actual implementation, or units having the same function may be integrated into one unit, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or elements, or may be an electrical, mechanical, or other form of connection.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment of the present invention.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units may be stored in a storage medium if implemented in the form of software functional units and sold or used as stand-alone products. Based on such understanding, the technical solution of the present invention may be essentially or a part contributing to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing an apparatus 500 (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (8)

1. A message pushing method based on a gated graph neural network, characterized by comprising the following steps:
constructing a bipartite graph taking customers and products as nodes, wherein the purchase relations between the customers and the products form the edges of the bipartite graph;
inputting the bipartite graph into a preset gated graph neural network model to obtain the purchase probability of each customer in the bipartite graph purchasing the products connected to its adjacent customers;
acquiring, based on a proximity algorithm, a customer matched with a customer to be pushed from the bipartite graph as a target customer;
obtaining, from the bipartite graph, a plurality of products connected to the adjacent customers of the target customer as target products; and
determining a product to be pushed based on the purchase probability of the target customer purchasing the target products, and pushing the product to be pushed, in the form of a message, to a user terminal corresponding to the customer to be pushed;
wherein the inputting the bipartite graph into a preset gated graph neural network model to obtain the purchase probability of each customer in the bipartite graph purchasing the products connected to its adjacent customers comprises:
performing a convolution operation on the bipartite graph to obtain a feature vector of each node in the bipartite graph and a feature vector of each adjacent edge of each node;
inputting the feature vector of each node and the feature vector of each adjacent edge of each node into the gated graph neural network model based on an attention mechanism to obtain the purchase probability of each customer in the bipartite graph purchasing a product connected to an adjacent customer;
wherein the acquiring, based on the proximity algorithm, a customer matched with the customer to be pushed from the bipartite graph as a target customer comprises:
receiving a recommendation request of the customer to be pushed and acquiring feature information of the customer to be pushed according to the recommendation request;
acquiring a plurality of customers matched with the customer to be pushed from the bipartite graph according to the feature information of the customer to be pushed; and
obtaining the maximum likelihood value of the customer to be pushed, and obtaining the target customer from the plurality of matched customers according to the maximum likelihood value of the customer to be pushed.
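As an informal illustration of the proximity matching recited in claim 1 (not a definitive implementation), the sketch below compares the feature vector of the customer to be pushed against the customer nodes of the bipartite graph by cosine similarity and keeps the best-scoring match as the target customer; the similarity measure, the value of `k` and the function names are assumptions introduced only for this example.

```python
import numpy as np

def nearest_customers(push_feature, customer_features, k=5):
    """Return the target customer plus the k customer nodes closest to the customer to be pushed."""
    def cosine(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    scored = sorted(
        ((cosine(push_feature, vec), cid) for cid, vec in customer_features.items()),
        reverse=True,
    )
    top_k = scored[:k]                       # the plurality of matched customers
    best_score, target_customer = top_k[0]   # highest-scoring match taken as the target customer
    return target_customer, [cid for _, cid in top_k]

# Usage with toy feature vectors:
feats = {"c1": np.array([1.0, 0.0]), "c2": np.array([0.9, 0.1])}
print(nearest_customers(np.array([1.0, 0.05]), feats, k=2))   # -> ('c1', ['c1', 'c2'])
```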
2. The message pushing method based on a gated graph neural network according to claim 1, wherein the constructing a bipartite graph taking customers and products as nodes, wherein the purchase relations between the customers and the products form the edges of the bipartite graph, comprises:
acquiring the customer information, the product information and the purchase information of customers purchasing products that are required for constructing the bipartite graph;
preprocessing the customer information and the product information respectively to obtain preprocessed customer information and preprocessed product information; and
constructing the bipartite graph according to the preprocessed customer information, the preprocessed product information and the purchase information.
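One way the construction in claim 2 could look in practice is sketched below. The dictionary-based inputs and the use of networkx are illustrative assumptions, not requirements of the claim; the preprocessed customer and product information is simply attached to the nodes as attributes, and each purchase record becomes an edge.

```python
import networkx as nx

def build_graph(customers, products, purchases):
    """customers/products: dict id -> preprocessed feature dict; purchases: (customer_id, product_id) pairs."""
    g = nx.Graph()
    for cid, feats in customers.items():
        g.add_node(("customer", cid), bipartite=0, **feats)   # customer nodes on one side
    for pid, feats in products.items():
        g.add_node(("product", pid), bipartite=1, **feats)    # product nodes on the other side
    for cid, pid in purchases:
        g.add_edge(("customer", cid), ("product", pid))       # each purchase relation becomes an edge
    return g

# Usage with toy records:
g = build_graph({"c1": {"age_bucket": 2}}, {"p1": {"category": "fund"}}, [("c1", "p1")])
print(g.number_of_nodes(), g.number_of_edges())   # 2 1
```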
3. The message pushing method based on a gated graph neural network according to claim 2, wherein the preprocessing the customer information and the product information respectively to obtain preprocessed customer information and preprocessed product information comprises:
discretizing the customer information and the product information respectively to obtain discretized customer information and discretized product information; and
performing feature embedding on the discretized customer information and the discretized product information respectively to obtain the preprocessed customer information and the preprocessed product information.
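A minimal sketch of the discretization and feature-embedding steps in claim 3 follows: a continuous field is mapped to a bucket index, and each discretized field value is looked up in a learned embedding table. The bucket boundaries, vocabulary sizes and embedding dimension are assumptions made for illustration only.

```python
import torch
import torch.nn as nn

AGE_BUCKETS = [18, 25, 35, 50, 65]                 # illustrative bucket boundaries

def discretise_age(age):
    """Map a continuous age to a bucket index in 0..len(AGE_BUCKETS)."""
    return sum(age >= b for b in AGE_BUCKETS)

class FeatureEmbedder(nn.Module):
    """Looks up a learned embedding for every discretized field and concatenates them."""
    def __init__(self, vocab_sizes, dim=16):
        super().__init__()
        self.tables = nn.ModuleList([nn.Embedding(v, dim) for v in vocab_sizes])

    def forward(self, discrete_fields):
        # discrete_fields: LongTensor of shape (batch, num_fields), one column per field
        parts = [table(discrete_fields[:, i]) for i, table in enumerate(self.tables)]
        return torch.cat(parts, dim=-1)            # preprocessed node feature vector

# Usage: two fields (age bucket with 6 values, occupation code with 10 values).
embedder = FeatureEmbedder([6, 10])
x = torch.tensor([[discretise_age(42), 3]])
print(embedder(x).shape)                           # torch.Size([1, 32])
```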
4. The message pushing method based on a gated graph neural network according to claim 1, wherein the inputting the feature vector of each node and the feature vector of each adjacent edge of each node into the gated graph neural network model based on an attention mechanism, to obtain the purchase probability of each customer in the bipartite graph purchasing a product connected to an adjacent customer, comprises:
selecting any customer node in the bipartite graph as a first customer, taking a product connected to the first customer as a first product, taking another customer connected to the first product as a second customer, and taking another product connected to the second customer as a second product;
inputting the feature vector of the first customer, the feature vector of the first product and the feature vector of the connecting edge between the first customer and the first product into the gated graph neural network model to obtain a first hidden state of the first customer;
inputting the first hidden state, the feature vector of the second customer and the feature vector of the connecting edge between the second customer and the first product into the gated graph neural network model based on an attention mechanism to obtain a second hidden state of the first customer;
inputting the second hidden state, the feature vector of the second product and the feature vector of the connecting edge between the second customer and the second product into the gated graph neural network model based on an attention mechanism to obtain a third hidden state of the first customer; and
classifying the third hidden state with a preset classifier to obtain the purchase probability of the first customer purchasing the second product.
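The three-step propagation in claim 4 can be sketched as below, with a GRU cell standing in for the gated update and a simple dot-product gate standing in for the attention mechanism; the exact gating form, tensor shapes and layer sizes are assumptions for illustration, not the patented model itself.

```python
import torch
import torch.nn as nn

class GatedPropagation(nn.Module):
    """Three gated updates of the first customer's hidden state; all tensors have shape (batch, dim)."""
    def __init__(self, node_dim, edge_dim, hidden_dim):
        super().__init__()
        self.msg = nn.Linear(node_dim + edge_dim, hidden_dim)    # message built from a neighbour node and its edge
        self.cell = nn.GRUCell(hidden_dim, hidden_dim)           # gated (GRU-style) state update
        self.init = nn.Linear(node_dim, hidden_dim)              # initial state from the first customer's features

    def attend_and_update(self, state, neighbour_feat, edge_feat):
        message = self.msg(torch.cat([neighbour_feat, edge_feat], dim=-1))
        attn = torch.sigmoid((state * message).sum(dim=-1, keepdim=True))   # scalar attention weight
        return self.cell(attn * message, state)

    def forward(self, c1, p1, e_c1p1, c2, e_c2p1, p2, e_c2p2):
        h1 = self.cell(self.msg(torch.cat([p1, e_c1p1], dim=-1)), self.init(c1))   # first hidden state
        h2 = self.attend_and_update(h1, c2, e_c2p1)                                 # second hidden state
        h3 = self.attend_and_update(h2, p2, e_c2p2)                                 # third hidden state
        return h3

# Usage with random toy tensors (batch of 1, node_dim=8, edge_dim=4, hidden_dim=16):
m = GatedPropagation(8, 4, 16)
t = lambda d: torch.randn(1, d)
print(m(t(8), t(8), t(4), t(8), t(4), t(8), t(4)).shape)   # torch.Size([1, 16])
```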
5. The message pushing method based on a gated graph neural network according to claim 4, wherein the classifying the third hidden state with a preset classifier to obtain the purchase probability of the first customer purchasing the second product comprises:
concatenating the third hidden state and the feature vector of the first customer based on an attention mechanism to obtain a concatenated feature vector; and
classifying the concatenated feature vector with the classifier to obtain the purchase probability of the first customer purchasing the second product.
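A possible form of the classification step in claim 5 is sketched below; the attention-based concatenation is simplified here to a plain concatenation, and the classifier is an assumed two-layer network with a sigmoid output read as the purchase probability. Layer sizes are illustrative.

```python
import torch
import torch.nn as nn

class PurchaseClassifier(nn.Module):
    """Concatenates the third hidden state with the first customer's feature vector and scores it."""
    def __init__(self, hidden_dim, node_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_dim + node_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, third_hidden_state, first_customer_feat):
        joint = torch.cat([third_hidden_state, first_customer_feat], dim=-1)   # concatenated feature vector
        return torch.sigmoid(self.net(joint)).squeeze(-1)                      # purchase probability in [0, 1]

# Usage: hidden_dim=16, node_dim=8.
clf = PurchaseClassifier(16, 8)
print(clf(torch.randn(1, 16), torch.randn(1, 8)))   # e.g. tensor([0.47], grad_fn=<SqueezeBackward1>)
```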
6. A message pushing device based on a gated graph neural network, wherein the device is configured to perform the message pushing method based on a gated graph neural network according to any one of claims 1 to 5, and the device comprises:
a first construction unit, configured to construct a bipartite graph taking customers and products as nodes, wherein the purchase relations between the customers and the products form the edges of the bipartite graph;
a first input unit, configured to input the bipartite graph into a preset gated graph neural network model to obtain the purchase probability of each customer in the bipartite graph purchasing the products connected to its adjacent customers;
a first acquisition unit, configured to acquire, based on a proximity algorithm, a customer matched with a customer to be pushed from the bipartite graph as a target customer;
a second acquisition unit, configured to obtain, from the bipartite graph, a plurality of products connected to the adjacent customers of the target customer as target products; and
a pushing unit, configured to determine a product to be pushed based on the purchase probability of the target customer purchasing the target products, and to push the product to be pushed, in the form of a message, to a user terminal corresponding to the customer to be pushed.
7. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the message pushing method based on a gated graph neural network according to any one of claims 1 to 5.
8. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to perform the message pushing method based on a gated graph neural network according to any one of claims 1 to 5.
CN202110452588.4A 2021-04-26 2021-04-26 Message pushing method and device based on gate control graph neural network and computer equipment Active CN113159893B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110452588.4A CN113159893B (en) 2021-04-26 2021-04-26 Message pushing method and device based on gate control graph neural network and computer equipment

Publications (2)

Publication Number Publication Date
CN113159893A (en) 2021-07-23
CN113159893B (en) 2023-08-29

Family

ID=76870722

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110452588.4A Active CN113159893B (en) 2021-04-26 2021-04-26 Message pushing method and device based on gate control graph neural network and computer equipment

Country Status (1)

Country Link
CN (1) CN113159893B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114022246B (en) * 2021-11-05 2023-08-11 平安科技(深圳)有限公司 Product information pushing method and device, terminal equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105893585A (en) * 2016-04-05 2016-08-24 电子科技大学 Label data-based bipartite graph model academic paper recommendation method
CN111782765A (en) * 2020-06-24 2020-10-16 安徽农业大学 Recommendation method based on graph attention machine mechanism
CN111967972A (en) * 2020-08-18 2020-11-20 中国银行股份有限公司 Financial product recommendation method and device

Also Published As

Publication number Publication date
CN113159893A (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN109242633B (en) Commodity pushing method and device based on bipartite graph network
US6636862B2 (en) Method and system for the dynamic analysis of data
Murillo et al. k-maxitive fuzzy measures: A scalable approach to model interactions
CN112348079B (en) Data dimension reduction processing method and device, computer equipment and storage medium
CN112085615A (en) Method and device for training graph neural network
Koloseni et al. Differential evolution based nearest prototype classifier with optimized distance measures for the features in the data sets
Tomani et al. Parameterized temperature scaling for boosting the expressive power in post-hoc uncertainty calibration
CN109937421B (en) Two-class classification method for predicting class to which specific item belongs and computing device using same
Azath et al. Software effort estimation using modified fuzzy C means clustering and hybrid ABC-MCS optimization in neural network
KR20230056239A (en) Ai-based vegan cosmetic recommendation method
CN113159893B (en) Message pushing method and device based on gate control graph neural network and computer equipment
CN111159481A (en) Edge prediction method and device of graph data and terminal equipment
CN114511387A (en) Product recommendation method and device, electronic equipment and storage medium
KR20220107940A (en) Method for measuring lesion of medical image
Rahmat et al. Supervised feature selection using principal component analysis
Tao et al. A Model of High‐Dimensional Feature Reduction Based on Variable Precision Rough Set and Genetic Algorithm in Medical Image
CN112991026A (en) Commodity recommendation method, system, equipment and computer readable storage medium
CN115905648B (en) Gaussian mixture model-based user group and financial user group analysis method and device
Huynh-Van et al. Classifying the lung images for people infected with COVID-19 based on the extracted feature interval
CN110751501A (en) Commodity shopping guide method, device, equipment and storage medium in new retail mode
Eastman et al. A weighted normalized likelihood procedure for empirical land change modeling
Godichon-Baggioni et al. A penalized criterion for selecting the number of clusters for K-medians
Pelegrina et al. A novel multi-objective-based approach to analyze trade-offs in Fair Principal Component Analysis
Lai et al. Efficient guided hypothesis generation for multi-structure epipolar geometry estimation
Chabane et al. Intelligent personalized shopping recommendation using clustering and supervised machine learning algorithms

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant