CN116561429A - Training of product recommendation model, product recommendation method and device and electronic equipment - Google Patents

Training of product recommendation model, product recommendation method and device and electronic equipment

Info

Publication number
CN116561429A
Authority
CN
China
Prior art keywords
product
tag
user
information
label
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310610396.0A
Other languages
Chinese (zh)
Inventor
姚俊展
罗剑平
周小涵
夏沛霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd (ICBC)
Priority to CN202310610396.0A
Publication of CN116561429A
Legal status: Pending (Current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28 Databases characterised by their database models, e.g. relational or object models
    • G06F16/284 Relational databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/901 Indexing; Data structures therefor; Storage structures
    • G06F16/9024 Graphs; Linked lists
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Abstract

The disclosure provides a training method and device for a product recommendation model, a product recommendation method and device, and an electronic device, which can be applied to the technical fields of big data and artificial intelligence, as well as to financial technology. The training method includes the following steps: obtaining a product relationship graph model constructed according to product labels, where the product relationship graph model includes first node information, second node information and edge information, the first node information includes a first product label and a first user label, the second node information includes only a second product label, and the edge information characterizes that an association exists between the two nodes connected by the corresponding edge; determining a second user label corresponding to the second product label according to the first product label, the first user label, the second product label and the edge information; and obtaining a trained product recommendation model according to the first node information, the edge information, and the second node information carrying the second product label and the second user label.

Description

Training of product recommendation model, product recommendation method and device and electronic equipment
Technical Field
The disclosure relates to the technical fields of big data and artificial intelligence, as well as to financial technology, and in particular to a training method and device for a product recommendation model, a product recommendation method and device, and an electronic device.
Background
Banks currently offer many types of investment products, including their own loan products and financial market products, as well as agency-distributed fund products, wealth management products and insurance products, and each type contains a large number of individual products. So that ordinary users can find products that meet their needs, each type of product has its own recommendation model.
In the process of realizing the concept of the disclosure, the inventors found that recommendation models divided by product type have a drawback: a customer must enter the page of a specific product type to see recommendations for that single type of product, which leads to a poor user experience.
Therefore, how to provide customers with accurate, multi-dimensional product profiles and product recommendations across many investment product types and a massive number of investment products is a problem that the industry needs to solve.
Disclosure of Invention
In view of the above, the present disclosure provides a training method for a product recommendation model, a product recommendation method, corresponding devices, and an electronic device.
According to one aspect of the present disclosure, there is provided a training method for a product recommendation model, including: obtaining a product relationship graph model constructed according to product labels, where the product relationship graph model includes first node information, second node information and edge information, the first node information includes a first product label and a first user label, the second node information includes only a second product label, and the edge information characterizes that an association exists between the two nodes connected by the corresponding edge; determining a second user label corresponding to the second product label according to the first product label, the first user label, the second product label and the edge information; and obtaining a trained product recommendation model according to the first node information, the edge information, and the second node information carrying the second product label and the second user label.
According to another aspect of the present disclosure, there is provided a product recommendation method, including: determining a target user tag according to target user attribute information of a target user; and inputting the target user tag into a product recommendation model to obtain a target product recommended for the target user, where the product recommendation model is a trained product recommendation model obtained according to the above training method of the product recommendation model.
According to another aspect of the present disclosure, there is provided a training apparatus for a product recommendation model, including: a first acquisition module configured to acquire a product relationship graph model constructed according to product labels, where the product relationship graph model includes first node information, second node information and edge information, the first node information includes a first product label and a first user label, the second node information includes only a second product label, and the edge information characterizes that an association exists between the two nodes connected by the corresponding edge; a first determining module configured to determine a second user label corresponding to the second product label according to the first product label, the first user label, the second product label and the edge information; and a first obtaining module configured to obtain a trained product recommendation model according to the first node information, the edge information, and the second node information carrying the second product label and the second user label.
According to another aspect of the present disclosure, there is provided a product recommendation device, including: a fifth determining module configured to determine a target user tag according to target user attribute information of a target user; and a second obtaining module configured to input the target user tag into a product recommendation model to obtain a target product recommended for the target user, where the product recommendation model is a trained product recommendation model obtained by the training apparatus of the product recommendation model according to the disclosure.
According to another aspect of the present disclosure, there is provided an electronic device, including: one or more processors; and a memory for storing one or more programs, where the one or more programs, when executed by the one or more processors, cause the one or more processors to perform at least one of the training method for the product recommendation model and the product recommendation method described in the present disclosure.
According to another aspect of the present disclosure, there is also provided a computer-readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to perform at least one of the training method for the product recommendation model and the product recommendation method described in the present disclosure.
According to another aspect of the present disclosure, there is also provided a computer program product comprising a computer program which, when executed by a processor, implements at least one of the training method for the product recommendation model and the product recommendation method described in the present disclosure.
According to the training method for the product recommendation model, the product recommendation method, the devices and the electronic device of the disclosure, a product relationship graph model constructed according to product labels is obtained, where the product relationship graph model includes first node information, second node information and edge information, the first node information includes a first product label and a first user label, the second node information includes only a second product label, and the edge information characterizes that an association exists between the two nodes connected by the corresponding edge; a second user label corresponding to the second product label is determined according to the first product label, the first user label, the second product label and the edge information; and a trained product recommendation model is obtained according to the first node information, the edge information, and the second node information carrying the second product label and the second user label. Product nodes can thus be annotated with user labels, and a label propagation algorithm can use the annotated product nodes to update the label information of product nodes that have not yet been annotated, propagating labels through the whole network until convergence. This forms community labels determined jointly by the basic product label information and the user labels, so that products of various types within a community can be recommended to users whose labels match that community, which at least partially alleviates the technical problem that recommendations for a single type of product can only be seen after entering the corresponding product page. Multi-dimensional product profiles and product recommendations are obtained from the user feature labels of customers, providing users with comprehensive investment guidance.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be more apparent from the following description of embodiments of the disclosure with reference to the accompanying drawings, in which:
FIG. 1 schematically illustrates an application scenario diagram of at least one of a training method and a product recommendation method of a product recommendation model according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow chart of a method of training a product recommendation model, according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a schematic diagram of a product relationship similarity matrix model in accordance with an embodiment of the present disclosure;
FIG. 4 schematically illustrates a schematic view of an initial product relationship graph model in accordance with an embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow chart of training a product relationship graph model in accordance with an embodiment of the present disclosure;
FIG. 6 schematically illustrates a schematic diagram of a trained product relationship graph model, according to an embodiment of the disclosure;
FIG. 7 schematically illustrates a flow chart of a method of implementing product recommendation based on the trained product recommendation model described above, according to an embodiment of the disclosure;
FIG. 8 schematically illustrates a block diagram of a training device of a product recommendation model, according to an embodiment of the present disclosure;
FIG. 9 schematically illustrates a block diagram of a product recommendation device, according to an embodiment of the present disclosure; and
Fig. 10 schematically illustrates a block diagram of an electronic device adapted to implement at least one of a training method and a product recommendation method of a product recommendation model, in accordance with an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is only exemplary and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. In addition, in the following description, descriptions of well-known structures and techniques are omitted so as not to unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and/or the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It should be noted that the terms used herein should be construed to have meanings consistent with the context of the present specification and should not be construed in an idealized or overly formal manner.
Where expressions like "at least one of A, B and C" are used, they should generally be interpreted in accordance with the meaning commonly understood by those skilled in the art (e.g., "a system having at least one of A, B and C" shall include, but not be limited to, systems having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
In the technical solution of the disclosure, the collection, storage, use, processing, transmission, provision, disclosure and application of the data involved (including but not limited to users' personal information) all comply with the relevant laws and regulations, necessary security measures are taken, and public order and good customs are not violated.
The embodiments of the disclosure provide a training method and device for a product recommendation model, a product recommendation method and device, and an electronic device. The training method of the product recommendation model includes the following steps: obtaining a product relationship graph model constructed according to product labels, where the product relationship graph model includes first node information, second node information and edge information, the first node information includes a first product label and a first user label, the second node information includes only a second product label, and the edge information characterizes that an association exists between the two nodes connected by the corresponding edge; determining a second user label corresponding to the second product label according to the first product label, the first user label, the second product label and the edge information; and obtaining a trained product recommendation model according to the first node information, the edge information, and the second node information carrying the second product label and the second user label.
Fig. 1 schematically illustrates an application scenario diagram of at least one of a training method and a product recommendation method of a product recommendation model according to an embodiment of the present disclosure.
As shown in fig. 1, an application scenario 100 according to this embodiment may include a first terminal device 101, a second terminal device 102, a third terminal device 103, a network 104, and a server 105. The network 104 is a medium used to provide a communication link between the first terminal device 101, the second terminal device 102, the third terminal device 103, and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 through the network 104 using at least one of the first terminal device 101, the second terminal device 102 and the third terminal device 103, to receive or send messages and the like. Various communication client applications may be installed on the first terminal device 101, the second terminal device 102 and the third terminal device 103, such as shopping applications, web browser applications, search applications, instant messaging tools, email clients and social platform software (by way of example only).
The first terminal device 101, the second terminal device 102, the third terminal device 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablets, laptop and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (by way of example only) providing support for websites browsed by the user using the first terminal device 101, the second terminal device 102, and the third terminal device 103. The background management server may analyze and process the received data such as the user request, and feed back the processing result (e.g., the web page, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that at least one of the training method and the product recommendation method of the product recommendation model provided in the embodiments of the present disclosure may be generally executed by the server 105. Accordingly, at least one of the training device and the product recommendation device of the product recommendation model provided in the embodiments of the present disclosure may be generally disposed in the server 105. At least one of the training method and the product recommendation method of the product recommendation model provided in the embodiments of the present disclosure may also be performed by a server or a server cluster that is different from the server 105 and is capable of communicating with the first terminal device 101, the second terminal device 102, the third terminal device 103, and/or the server 105. Accordingly, at least one device of the training device and the product recommendation device of the product recommendation model provided in the embodiments of the present disclosure may also be provided in a server or a server cluster that is different from the server 105 and is capable of communicating with the first terminal device 101, the second terminal device 102, the third terminal device 103, and/or the server 105.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
It should be noted that the training method for the product recommendation model, the product recommendation method, the devices and the electronic device of the disclosure may be used in technical fields such as big data and artificial intelligence and in financial technology, and may also be used in any field other than those; the application fields of the training method, the product recommendation method, the devices and the electronic device of the disclosure are not limited.
The training method of the product recommendation model of the disclosed embodiments will be described in detail below with reference to FIGS. 2 to 6, in conjunction with the scenario described in FIG. 1.
FIG. 2 schematically illustrates a flow chart of a method of training a product recommendation model, according to an embodiment of the disclosure.
As shown in fig. 2, the training method of the product recommendation model of this embodiment includes operations S210 to S230.
In operation S210, a product relationship graph model constructed according to product labels is obtained, where the product relationship graph model includes first node information, second node information and edge information, the first node information includes a first product label and a first user label, the second node information includes only a second product label, and the edge information characterizes that an association exists between the two nodes connected by the corresponding edge.
According to embodiments of the present disclosure, products may include various types of multimedia content such as news, novels and videos, and may also include various types of financial products such as funds, deposits and investment products, but are not limited thereto. For example, the product may be a financial product, and the product label (including the first product label and the second product label) may include at least one of: product type, number of purchases, number of redemptions, rate of return, total rate of return, net value, risk index, and the like, but is not limited thereto. The feature values of each financial product may be collected using a data acquisition tool and assembled into one data record to obtain the product label. If a certain type of product does not involve some of these features, those features may be marked as not involved.
For example, corresponding to the product type, number of purchases, number of redemptions, rate of return, total rate of return, net value and risk index, the following data may be obtained for investment product A: fund, 1000000, 500000, 3.00%, 10.55%, 1.5502, medium risk; and the following data may be obtained for investment product B: deposit, 1000000, not involved, 2.45%, 2.45%, 1.00, low risk.
According to embodiments of the present disclosure, the product relationship graph model may be a model having a graph structure. The graph structure may include nodes and edges. Each node in the product relationship graph model may correspond to one product, that is, it carries the first node information or the second node information corresponding to that product. Each edge in the product relationship graph model may carry edge information.
According to embodiments of the present disclosure, a product relationship graph may be determined from the first product labels and the second product labels. Then, a first user label may be configured for one or more nodes in the product relationship graph to obtain an initial product relationship graph model.
In operation S220, a second user tag corresponding to the second product tag is determined according to the first product tag, the first user tag, the second product tag, and the edge information.
According to an embodiment of the present disclosure, each of the first user tag and the second user tag may include at least one of: age, gender, income, education level, occupation, risk tolerance, customer grade, and the like, but is not limited thereto. The feature values of each individual user, or of each existing historical user of a product, may be collected using a data acquisition tool and assembled into one data record to obtain the user tag. The risk tolerance may be obtained from a risk assessment survey completed by the user when purchasing a product.
It should be noted that the data acquisition tool may include, for example, a crawler tool, and the like, and may not be limited thereto.
According to embodiments of the present disclosure, training the initial product relationship graph model by using the first user tag as an input, in conjunction with the LPA (Label Propagation Algorithm), may, for example, yield a second user tag adapted to each second product tag.
In operation S230, a trained product recommendation model is obtained based on the first node information, the edge information, and the second node information having the second product tag and the second user tag.
According to embodiments of the present disclosure, each node in the trained product recommendation model may include a product tag (including a first product tag or a second product tag) and a user tag (including a first user tag or a second user tag) that is adapted to the product tag.
According to the embodiments of the present disclosure, product nodes can be annotated with user labels, and a label propagation algorithm can use the annotated product nodes to update the label information of product nodes that have not yet been annotated, propagating labels through the whole network until convergence. This forms community labels determined jointly by the basic product label information and the user labels, so that products of various types within a community can be recommended to users whose labels match that community, which at least partially alleviates the technical problem that recommendations for a single type of product can only be seen after entering the corresponding product page. Multi-dimensional product profiles and product recommendations are obtained from the user feature labels of customers, providing users with comprehensive investment guidance.
The method shown in fig. 2 is described in further detail below in connection with specific embodiments.
According to an embodiment of the present disclosure, before operation S210 is performed, an initial product relationship graph model may first be constructed. The construction method may include: obtaining first product attribute information of a first product and second product attribute information of a second product; performing structured conversion on the first product attribute information according to a first predefined mapping relationship to obtain the first product label, where the first predefined mapping relationship characterizes the mapping between various types of product attribute information and numerical information; performing structured conversion on the second product attribute information according to the first predefined mapping relationship to obtain the second product label; obtaining user attribute information of a user associated with the first product; performing structured conversion on the user attribute information according to a second predefined mapping relationship to obtain the first user label, where the second predefined mapping relationship characterizes the mapping between various types of user attribute information and numerical information; determining the first node information according to the first product label and the first user label corresponding to the same first product; determining the second node information according to the second product label; and determining the edge information according to a first similarity between every two product labels among the first product labels and the second product labels.
According to an embodiment of the present disclosure, the first product attribute information and the second product attribute information may each include at least one of: product type, number of purchases, number of redemptions, rate of return, total rate of return, net value, risk index, and the like, but are not limited thereto. The first product label and the second product label can also be regarded as the feature values of these types of product attribute information. The first predefined mapping relationship may include, but is not limited to, the correspondence between various types of product attribute information and their feature values. For example, the first predefined mapping relationship may include the contents shown in Tables 1 and 2. Table 1 may define the correspondence between whether an attribute is involved and dictionary values. Table 2 may define the correspondence between product types and dictionary values.
Table 1:
Whether this label is involved    Dictionary value
Involved                          1
Not involved                      0
Table 2:
Investment product type    Dictionary value
Fund                       1
Wealth management          2
Insurance                  3
Deposit                    4
...                        ...
According to embodiments of the present disclosure, the structured conversion may represent, for example, a process of converting textual information into dictionary values. For example, in combination with the foregoing Tables 1 and 2, after the obtained first product attribute information and second product attribute information are structurally converted, a product label table as shown in Table 3 may be obtained.
Table 3:
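To make the structured conversion concrete, the following is a minimal Python sketch (not part of the patent): the dictionary values follow Tables 1 and 2, while the field names, the numeric risk-index value and the use of 0 for features a product type does not involve are illustrative assumptions.

```python
# Illustrative structured conversion of product attribute text into a numeric product label.
PRODUCT_TYPE_DICT = {"fund": 1, "wealth management": 2, "insurance": 3, "deposit": 4}
NOT_INVOLVED = 0  # assumed placeholder for features a product type does not involve (cf. Table 1)

def to_product_label(raw: dict) -> list:
    """Convert one product's raw attribute record into a numeric label vector."""
    return [
        PRODUCT_TYPE_DICT[raw["type"]],
        raw.get("purchase_count", NOT_INVOLVED),
        raw.get("redemption_count", NOT_INVOLVED),
        raw.get("rate_of_return", NOT_INVOLVED),
        raw.get("total_rate_of_return", NOT_INVOLVED),
        raw.get("net_value", NOT_INVOLVED),
        raw.get("risk_index", NOT_INVOLVED),
    ]

# Example: investment product B from the description (a deposit with no redemption count).
product_b = {"type": "deposit", "purchase_count": 1_000_000, "rate_of_return": 2.45,
             "total_rate_of_return": 2.45, "net_value": 1.00, "risk_index": 1}
print(to_product_label(product_b))  # -> [4, 1000000, 0, 2.45, 2.45, 1.0, 1]
```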
According to an embodiment of the present disclosure, after the numeric first and second product labels are obtained, the edge information may be determined according to the first similarity between every two of the first and second product labels. After the edge information is determined, a product relationship similarity matrix model may be constructed, for example.
FIG. 3 schematically illustrates a schematic diagram of a product relationship similarity matrix model in accordance with an embodiment of the present disclosure.
As shown in FIG. 3, the product relationship similarity matrix model includes nodes a, b, c, d, e, f, g, h, i, and edges a-b, a-c, b-c, a-d, c-e, e-f, e-i, e-g, e-f, f-h, f-i, g-h, h-i. Each node includes a product label (including the first product label and the second product label described above) for the corresponding product.
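The construction of such a similarity matrix can be sketched as follows. This is an illustrative implementation, not the patent's prescribed formula: it assumes a Gaussian similarity over the Euclidean distance between label vectors and a hypothetical threshold for drawing an edge.

```python
import numpy as np

def build_similarity_graph(labels, alpha: float = 1.0, threshold: float = 0.5):
    """labels: (n, d) array of numeric product labels (rows = products a, b, c, ...).
    Returns the (n, n) similarity matrix and the list of edges above the threshold."""
    labels = np.asarray(labels, dtype=float)        # features would normally be normalised first
    diff = labels[:, None, :] - labels[None, :, :]  # pairwise differences
    dist2 = np.sum(diff ** 2, axis=-1)              # squared Euclidean distances
    sim = np.exp(-dist2 / (alpha ** 2))             # Gaussian similarity
    np.fill_diagonal(sim, 0.0)                      # no self-edges
    n = len(sim)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if sim[i, j] > threshold]
    return sim, edges
```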
According to an embodiment of the present disclosure, the user attribute information may include at least one of: age, gender, income, education level, occupation, risk tolerance, customer grade, and the like, but is not limited thereto. The first user tag can also be regarded as the feature values of the aforementioned types of user attribute information. The second predefined mapping relationship may include, but is not limited to, the correspondence between various types of user attribute information and their feature values. For example, the second predefined mapping relationship may include the contents shown in Table 4. Table 4 may define a gender data dictionary.
Table 4:
According to an embodiment of the present disclosure, the gender, risk tolerance, occupation and education level in the user attribute information are unstructured data, which may be converted into numeric structured data before modeling by creating a data dictionary according to the second predefined mapping relationship. For example, after the obtained user attribute information is structurally converted, an existing investment user information table as shown in Table 5 can be obtained.
Table 5:
according to the embodiment of the disclosure, the first user tag can also be used for determining the tag of the user according to the existing user tag system and combining one or more characteristic combinations of the user. For example, existing user tagging systems may include guest group tags (18-30 years old: young guests; 31_40 years old: strong guests; 45-60 years old: middle aged guests), risk bearing tags (1-2: low risk, 3_4: medium risk, 5: high risk), customer quality tags (income > 50 and educational level 5: premium, 50 > income > 35 and educational level 4: medium), and the like, and may not be so limited. Based on this existing user tagging system, the user in Table 5 is tagged, for example, to obtain the results shown in Table 6.
Table 6:
User' s Customer base label Risk bearing label Customer quality label
User A Adult group of guests Risk in Medium and medium
User B Senior light passenger group Risk in High quality
... ... ... ...
According to an embodiment of the present disclosure, after the user tags shown in Table 6 are obtained, the unstructured data in Table 6 may be further structurally converted in combination with the second predefined mapping relationship to obtain numeric first user tags. The first user tags can then be taken as reference input to the LPA-based product relationship graph model, and some of the nodes in the product relationship graph model are annotated with them, so that, for example, an initial product relationship graph model can be obtained.
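A minimal sketch of this kind of rule-based tagging is shown below; the brackets and cut-offs follow the example tag system above, while the field names and the fallback "ordinary" quality bucket are assumptions.

```python
def tag_user(user: dict) -> dict:
    """Assign customer group, risk bearing and quality tags from numeric user attributes."""
    age, risk, income, edu = user["age"], user["risk_tolerance"], user["income"], user["education"]

    if 18 <= age <= 30:
        guest = "young customer group"
    elif 31 <= age <= 40:
        guest = "prime-age customer group"
    else:
        guest = "middle-aged customer group"   # ages outside the example brackets are lumped here

    risk_tag = "low risk" if risk <= 2 else ("medium risk" if risk <= 4 else "high risk")

    if income > 50 and edu == 5:
        quality = "premium"
    elif 35 < income < 50 and edu == 4:
        quality = "medium"
    else:
        quality = "ordinary"                   # fallback bucket not spelled out in the example system

    return {"customer_group": guest, "risk_bearing": risk_tag, "customer_quality": quality}

print(tag_user({"age": 35, "risk_tolerance": 3, "income": 40, "education": 4}))
```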
FIG. 4 schematically illustrates a schematic diagram of an initial product relationship graph model, according to an embodiment of the present disclosure.
As shown in FIG. 4, the initial product relationship graph model includes nodes a, b, c, d, e, f, g, h, i, and edges a-b, a-c, b-c, a-d, c-e, e-f, e-i, e-g, e-f, f-h, f-i, g-h, h-i. A first user tag such as "young customer group" is configured for node a, and a first user tag such as "prime-age customer group" is configured for node h, but the configuration is not limited thereto.
Through the above-described embodiments of the present disclosure, a structuring process is incorporated, so that the data used in the model calculation can be converted into a unified form, which facilitates model processing and improves model precision.
According to an embodiment of the present disclosure, the above operation S220 may include: determining an information propagation probability between the first node information and the second node information according to a second similarity between the first product label and the second product label; and determining the second user tag according to the first user tag and the information propagation probability.
According to an embodiment of the present disclosure, the above-described second similarity may be determined by calculating a Euclidean distance between the first product tag and the second product tag. The second similarity may be taken directly as the information propagation probability, or may be further processed in combination with a predefined formula to obtain the information propagation probability.
According to an embodiment of the present disclosure, determining the information propagation probability between the first node information and the second node information according to the second similarity between the first product label and the second product label may include: determining first edge weight information between the first node information and the second node information according to a preset constraint parameter and the Euclidean distance between the first product label and the second product label; determining second edge weight information between every two pieces of node information among the first node information and the second node information; and determining the information propagation probability according to the first edge weight information and the second edge weight information.
According to embodiments of the present disclosure, edge weight information may characterize the weights of edges in the product relationship graph model. The greater the weight of an edge, the more similar the two nodes it connects are, and the more easily the user label on one node propagates to the other node.
For example, the edge weight information (including the first edge weight information and the second edge weight information described above) may be determined in conjunction with formula (1):

w_ij = exp(-||x_i - x_j||^2 / α^2)    (1)

In formula (1), w_ij represents the first edge weight information between nodes i and j; x_i and x_j respectively represent the first product label and the second product label, or the product labels corresponding to every two pieces of node information among the first node information and the second node information; and α represents the preset constraint parameter.
For example, the information propagation probability may be determined in conjunction with formula (2):

P_ij = w_ij / Σ_k w_ik,  k = 1, ..., n    (2)

In formula (2), P_ij represents the probability of information propagating from node i to node j, w_ik represents the second edge weight information between every two pieces of node information among the first node information and the second node information, n represents the number of nodes, and i, j and k are node indices.
It should be noted that determining the second similarity based on the Euclidean distance, determining the edge weight information based on formula (1) and determining the information propagation probability based on formula (2) are merely exemplary embodiments; in an actual implementation, the method is not limited thereto, and any manner capable of determining the second similarity, the edge weight information and the information propagation probability may be used.
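For illustration, a short sketch of formulas (1) and (2) as reconstructed above; the Gaussian-kernel form of the edge weight is an assumption, and feature scaling and self-loop handling are simplified.

```python
import numpy as np

def transition_matrix(labels, alpha: float = 1.0) -> np.ndarray:
    """labels: (n, d) numeric product labels. Returns the n x n matrix P where
    P[i, j] = w_ij / sum_k w_ik is the probability of a tag propagating from node i to node j."""
    labels = np.asarray(labels, dtype=float)
    diff = labels[:, None, :] - labels[None, :, :]
    w = np.exp(-np.sum(diff ** 2, axis=-1) / (alpha ** 2))   # formula (1): edge weights w_ij
    P = w / w.sum(axis=1, keepdims=True)                     # formula (2): row-normalise
    return P
```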
According to an embodiment of the disclosure, the second user tag can be determined from the first user tag when the information propagation probability is determined to be greater than a preset probability value. In this process, for example, the first user tag may be determined to be the second user tag, or the first user tag together with the probability value characterized by the information propagation probability may be determined as the second user tag.
Through the embodiment of the disclosure, the user label corresponding to each product label can be calculated more accurately based on the label propagation algorithm, and the obtained trained model can have higher output precision.
According to an embodiment of the present disclosure, the above operation S220 may further include: determining a probability transition matrix according to the information propagation probability between every two pieces of node information among the first node information and the second node information; determining an initial user soft label matrix according to the first user tag and initial parameter information representing the second user tag; determining a target user soft label matrix according to the probability transition matrix and the initial user soft label matrix, where the elements of the target user soft label matrix include user tag information and the probability values corresponding to the user tag information; and determining the second user tag according to the target user soft label matrix.
For example, let (x_1, y_1), ..., (x_l, y_l) denote the first node information, where X_L = {x_1, ..., x_l} are the first product labels and Y_L = {y_1, ..., y_l} ∈ {1, ..., C} are the first user labels; the number of label classes C is known, and both exist in the labeled data. Let (x_{l+1}, y_{l+1}), ..., (x_{l+u}, y_{l+u}) denote the second node information, where X_U = {x_{l+1}, ..., x_{l+u}} are the second product labels and Y_U = {y_{l+1}, ..., y_{l+u}} are the second user labels to be predicted, with l << u and n = l + u. Let the data set be X = {x_1, ..., x_{l+u}} ⊂ R. The training process of the product relationship graph model can then be cast as: using the data set X and learning from Y_L, predict the corresponding second user label for each second product label in the unlabeled set Y_U.
According to an embodiment of the present disclosure, the information propagation probability may, for example, be calculated for every two pieces of node information among the first node information and the second node information in combination with the foregoing formulas (1) and (2). Corresponding to the above-described embodiment, by calculating the information propagation probability between every two pieces of node information, the n×n probability transition matrix P can be obtained. Y_L and Y_U are combined to obtain the n×C initial user soft label matrix F = [Y_L; Y_U]. By training F for multiple rounds according to P, the target user soft label matrix can be obtained, and the second user label is then determined according to the parameter information representing the second user label in the target user soft label matrix.
It should be noted that a soft label characterizes, in the corresponding matrix, the probability that a product label fits each user label, rather than exclusively assigning the product label to a single user label with probability 1.
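A sketch of assembling the initial soft label matrix F described above, under the usual convention that labeled rows are one-hot and unlabeled rows start at zero (variable names are illustrative):

```python
import numpy as np

def initial_soft_labels(y_labeled: np.ndarray, n: int, num_classes: int) -> np.ndarray:
    """y_labeled: length-l array of known user-tag indices (0 .. C-1) for the first l nodes.
    Returns the n x C soft label matrix F with one-hot rows for labeled nodes and
    all-zero rows for the u = n - l unlabeled nodes."""
    F = np.zeros((n, num_classes))
    F[np.arange(len(y_labeled)), y_labeled] = 1.0
    return F

# e.g. 2 labeled nodes out of 9, with 3 possible user tags
F0 = initial_soft_labels(np.array([0, 2]), n=9, num_classes=3)
```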
According to an embodiment of the present disclosure, determining the target user soft label matrix according to the probability transition matrix and the initial user soft label matrix may include: obtaining the (i+1)-th round user soft label matrix according to the probability transition matrix and the i-th round user soft label matrix, where the 1st round user soft label matrix is the initial user soft label matrix, and i is an integer greater than or equal to 1; and, in response to determining that the information propagation probability between first target product label information and second target product label information obtained in the i-th round is smaller than a preset threshold, determining the i-th round user soft label matrix as the target user soft label matrix. The first target product label information characterizes the product label information corresponding to the nodes for which second user label information is obtained in the i-th round, and the second target product label information is product label information whose nodes have an association relationship with those of the first target product label information.
It should be noted that the first target product label may include a plurality of first target product labels, which is not limited herein.
FIG. 5 schematically illustrates a flow chart of training a product relationship graph model in accordance with an embodiment of the present disclosure.
As shown in fig. 5, the method includes operations S510 to S540.
In operation S510, a probability transition matrix P and an initial user soft tag matrix F are obtained.
In operation S520, propagation is performed: F = PF.
In this process, according to the calculation result, the parameter information of newly labeled user tags in F can be updated accordingly.
In operation S530, it is determined whether F converges. If yes, operation S540 is performed; if not, operations S520 to S530 are repeated.
In this process, whether F converges may be determined by judging whether the information propagation probability between the first target product label information and the second target product label information is smaller than the preset threshold. In the case where it is determined that this information propagation probability is smaller than the preset threshold, it may be determined that F has converged.
In operation S540, a trained product relationship graph model is obtained from the first product label, the second product label, and F.
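The loop of operations S510 to S540 can be sketched as follows. This is a generic label-propagation iteration: it re-clamps the labeled rows after every round (common LPA practice, not stated explicitly above) and, for simplicity, tests convergence on the change in F rather than on the per-edge propagation probability described above.

```python
import numpy as np

def propagate(P: np.ndarray, F: np.ndarray, n_labeled: int,
              tol: float = 1e-6, max_rounds: int = 1000) -> np.ndarray:
    """Iterate F <- P F until convergence, keeping the first n_labeled rows fixed
    to their known one-hot user tags. Returns the converged soft label matrix."""
    clamped = F[:n_labeled].copy()
    for _ in range(max_rounds):
        F_next = P @ F                      # operation S520: propagate
        F_next[:n_labeled] = clamped        # keep the already-labeled nodes fixed
        if np.abs(F_next - F).max() < tol:  # operation S530: convergence check
            return F_next
        F = F_next
    return F

# The second user tag of each unlabeled node is then the highest-probability column:
# predicted = F[n_labeled:].argmax(axis=1)
```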
FIG. 6 schematically illustrates a schematic diagram of a trained product relationship graph model, according to an embodiment of the disclosure.
As shown in FIG. 6, nodes a, b, c, d, e, f, g, h, i are included in the trained product relationship graph model, as well as edges a-b, a-c, b-c, a-d, c-e, e-f, e-i, e-g, e-f, f-h, f-i, g-h, h-i. Each node includes a product tag and a user tag.
It should be noted that, for the user's other labels, such as risk bearing labels and customer quality labels, corresponding labeling results may be obtained in the same manner.
According to an embodiment of the present disclosure, determining the second user tag according to the target user soft tag matrix may also include: in response to determining that the user tag corresponding to the same second node information includes a plurality of candidate user tags, determining a candidate user tag having a highest probability value among the plurality of candidate user tags as the second user tag of the second node information.
For example, in the case where it is finally determined that the predicted user tag for a certain product tag includes a plurality of user tags having probability values, the user tag having the highest probability value may be taken as the user tag corresponding to the product tag.
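For instance, selecting the winning candidate from the soft labels of one product node (illustrative values):

```python
# Candidate user tags of one product node with their propagated probabilities.
row = {"young customer group": 0.7, "prime-age customer group": 0.3}
second_user_tag = max(row, key=row.get)   # -> "young customer group"
```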
Through the embodiment of the disclosure, the model output precision can be further improved, and the accuracy of the follow-up product recommendation based on the model can be effectively improved.
According to the embodiments of the disclosure, after the trained product recommendation model is obtained by the above method, the model takes a user label as input and can recommend matching products for different users.
Fig. 7 schematically illustrates a flowchart of a method for implementing product recommendation based on the trained product recommendation model described above, according to an embodiment of the disclosure.
As shown in fig. 7, the method includes operations S710 to S720.
In operation S710, a target user tag is determined according to target user attribute information of the target user.
In operation S720, the target user tag is input into the trained product recommendation model to obtain a target product recommended for the target user.
According to embodiments of the present disclosure, from each node in the trained product recommendation model, the user label adapted to the product of that node may be determined. Based on this, when a product needs to be recommended for the target user, the target product recommended for the target user can be obtained by executing operations S710 to S720 described above.
For example, table 7 shows a product recommendation result.
Table 7:
according to an embodiment of the present disclosure, the target user tag may include at least a first user sub-tag and a second user sub-tag, where the first user sub-tag has a first weight, and the second user sub-tag has a second weight, and the second weight is greater than the first weight. The step of inputting the target user tag into the product recommendation model to obtain the target product recommended for the target user may include: and in response to determining that no product which is completely matched with the target user label exists in the products to be recommended, inputting the second user sub-label into a product recommendation model to obtain the product which is matched with the second user sub-label, and taking the product as the target product. And in response to determining that no product matched with the second user sub-label exists in the products to be recommended, inputting the first user sub-label into a product recommendation model to obtain the product matched with the first user sub-label as a target product.
For example, suppose the target user's tags include a customer group tag of "senior customer group" and a risk bearing tag of "low risk", and the risk bearing tag is weighted more heavily than the customer group tag. If none of the products to be recommended carries user tags that perfectly match the target user, a product whose risk bearing tag is "low risk" may be recommended first. If there is no product whose risk bearing tag is "low risk" among the products to be recommended, a product whose customer group tag is "senior customer group" may be recommended to the target user.
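That fallback logic can be sketched as follows, assuming the trained model is available as a simple mapping from products to the user tags on their nodes (all names and the data structure are illustrative):

```python
def recommend(model: dict, first_sub_tag: str, second_sub_tag: str) -> list:
    """model maps product name -> set of user tags on that product's node.
    The second sub-tag carries the larger weight, so it is matched first; the
    first sub-tag is only used if nothing matches the second."""
    exact = [p for p, tags in model.items() if {first_sub_tag, second_sub_tag} <= tags]
    if exact:
        return exact
    by_second = [p for p, tags in model.items() if second_sub_tag in tags]
    if by_second:
        return by_second
    return [p for p, tags in model.items() if first_sub_tag in tags]

model = {"deposit B": {"low risk", "middle-aged customer group"},
         "fund A": {"medium risk", "prime-age customer group"}}
print(recommend(model, "middle-aged customer group", "low risk"))  # -> ["deposit B"]
```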
Through the embodiments of the disclosure, the label propagation algorithm is used to form label communities out of many investment product types and a massive number of investment products, providing customers with accurate, multi-dimensional product profiles and product recommendations, helping customers understand market investment trends, and achieving a better investment guiding effect.
Based on the above training method for the product recommendation model, the disclosure further provides a training apparatus for the product recommendation model. The apparatus will be described in detail below in conjunction with FIG. 8.
Fig. 8 schematically illustrates a block diagram of a training apparatus of a product recommendation model according to an embodiment of the present disclosure.
As shown in fig. 8, the training apparatus 800 for the product recommendation model of this embodiment includes a first acquisition module 810, a first determining module 820, and a first obtaining module 830.
The first acquisition module 810 is configured to acquire a product relationship graph model constructed according to product labels, where the product relationship graph model includes first node information, second node information and edge information, the first node information includes a first product label and a first user label, the second node information includes only a second product label, and the edge information characterizes that an association exists between the two nodes connected by the corresponding edge. In an embodiment, the first acquisition module 810 may be configured to perform operation S210 described above, which is not repeated here.
The first determining module 820 is configured to determine a second user tag corresponding to the second product tag according to the first product tag, the first user tag, the second product tag, and the edge information. In an embodiment, the first determining module 820 may be configured to perform operation S220 described above, which is not repeated here.
The first obtaining module 830 is configured to obtain a trained product recommendation model according to the first node information, the edge information, and the second node information having the second product tag and the second user tag. In an embodiment, the first obtaining module 830 may be configured to perform operation S230 described above, which is not repeated here.
According to an embodiment of the present disclosure, the first determination module includes a first determination unit and a second determination unit.
The first determining unit is configured to determine the information propagation probability between the first node information and the second node information according to the second similarity between the first product label and the second product label.
The second determining unit is configured to determine the second user tag according to the first user tag and the information propagation probability.
According to an embodiment of the present disclosure, the first determination unit includes a first determination subunit, a second determination subunit, and a third determination subunit.
The first determining subunit is configured to determine the first edge weight information between the first node information and the second node information according to the preset constraint parameter and the Euclidean distance between the first product tag and the second product tag.
The second determining subunit is configured to determine the second edge weight information between every two pieces of node information among the first node information and the second node information.
The third determining subunit is configured to determine the information propagation probability according to the first edge weight information and the second edge weight information.
According to an embodiment of the present disclosure, the first determination module includes a third determination unit, a fourth determination unit, a fifth determination unit, and a sixth determination unit.
The third determining unit is configured to determine the probability transition matrix according to the information propagation probability between every two pieces of node information among the first node information and the second node information.
The fourth determining unit is configured to determine the initial user soft label matrix according to the first user tag and the initial parameter information characterizing the second user tag.
The fifth determining unit is configured to determine the target user soft label matrix according to the probability transition matrix and the initial user soft label matrix, where the elements of the target user soft label matrix include user tag information and the probability values corresponding to the user tag information.
The sixth determining unit is configured to determine the second user tag according to the target user soft label matrix.
According to an embodiment of the present disclosure, the fifth determining unit includes an obtaining subunit and a fourth determining subunit.
The obtaining subunit is configured to obtain the (i+1)-th round user soft label matrix according to the probability transition matrix and the i-th round user soft label matrix, where the 1st round user soft label matrix is the initial user soft label matrix, and i is an integer greater than or equal to 1.
The fourth determining subunit is configured to determine, in response to determining that the information propagation probability between the first target product label information and the second target product label information obtained in the i-th round is smaller than the preset threshold, the i-th round user soft label matrix as the target user soft label matrix, where the first target product label information characterizes the product label information corresponding to the nodes for which second user label information is obtained in the i-th round, and the second target product label information is product label information whose nodes have an association relationship with those of the first target product label information.
According to an embodiment of the present disclosure, the sixth determining unit comprises a fifth determining subunit.
The fifth determining subunit is configured to determine, in response to determining that the user tags corresponding to the same second node information include a plurality of candidate user tags, the candidate user tag having the highest probability value among the plurality of candidate user tags as the second user tag of that second node information.
According to an embodiment of the disclosure, the training device for the product recommendation model further includes a second acquisition module, a first conversion module, a second conversion module, a third acquisition module, a third conversion module, a second determining module, a third determining module, and a fourth determining module.
The second acquisition module is used for acquiring the first product attribute information of the first product and the second product attribute information of the second product.
The first conversion module is used for carrying out structural conversion on the first product attribute information according to a first predefined mapping relation to obtain a first product label, wherein the first predefined mapping relation represents the mapping relation between various product attribute information and numerical information.
And the second conversion module is used for carrying out structural conversion on the second product attribute information according to the first predefined mapping relation to obtain a second product label.
And the third acquisition module is used for acquiring the user attribute information of the user related to the first product.
And the third conversion module is used for carrying out structural conversion on the user attribute information according to a second predefined mapping relation to obtain the first user tag, wherein the second predefined mapping relation characterizes the mapping relation between various types of user attribute information and numerical information.
And the second determining module is used for determining the first node information according to the first product label and the first user label corresponding to the same first product.
And the third determining module is used for determining second node information according to the second product label.
And the fourth determining module is used for determining the side information according to the first similarity between every two product labels among the first product label and the second product label.
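As a rough illustration of the structural conversion and graph construction performed by these modules, the sketch below maps categorical attribute values to numbers through predefined dictionaries and records side information whenever the similarity between two product tag vectors passes a cutoff. The dictionaries, the cosine similarity, and the cutoff value are illustrative assumptions, not mappings or thresholds defined by the disclosure.

    import numpy as np

    # Hypothetical first predefined mapping relation: product attribute values -> numerical values.
    PRODUCT_ATTR_MAP = {"risk_level": {"low": 0, "medium": 1, "high": 2},
                        "term": {"short": 0, "medium": 1, "long": 2}}
    # Hypothetical second predefined mapping relation: user attribute values -> numerical values.
    USER_ATTR_MAP = {"age_band": {"youth": 0, "middle": 1, "senior": 2}}

    def to_product_tag(attrs):
        """Structurally convert product attribute information into a numeric product tag vector."""
        return np.array([PRODUCT_ATTR_MAP[k][v] for k, v in sorted(attrs.items())], dtype=float)

    def to_user_tag(attrs):
        """Structurally convert user attribute information into a numeric user tag vector."""
        return np.array([USER_ATTR_MAP[k][v] for k, v in sorted(attrs.items())], dtype=float)

    def build_side_information(product_tags, cutoff=0.9):
        """Connect two nodes when the similarity between their product tags passes the cutoff."""
        edges = []
        for a in range(len(product_tags)):
            for b in range(a + 1, len(product_tags)):
                x, y = product_tags[a], product_tags[b]
                sim = x @ y / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)  # cosine similarity
                if sim >= cutoff:
                    edges.append((a, b))
        return edges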
Based on the product recommendation method, the disclosure further provides a product recommendation device. The device will be described in detail below in connection with Fig. 9.
Fig. 9 schematically shows a block diagram of a product recommendation device according to an embodiment of the present disclosure.
As shown in Fig. 9, the product recommendation device of this embodiment includes a fifth determining module 910 and a second obtaining module 920.
The fifth determining module 910 is configured to determine a target user tag according to target user attribute information of the target user. In an embodiment, the fifth determining module 910 may be configured to perform the operation S710 described above, which will not be repeated here.
The second obtaining module 920 is configured to input the target user tag into a product recommendation model to obtain a target product recommended for the target user, where the product recommendation model is a trained product recommendation model obtained according to the training method of the product recommendation model provided in the present disclosure. In an embodiment, the second obtaining module 920 may be configured to perform the operation S720 described above, which will not be repeated here.
According to an embodiment of the disclosure, the target user tag includes at least a first user sub-tag and a second user sub-tag, the first user sub-tag having a first weight, the second user sub-tag having a second weight, the second weight being greater than the first weight. The second obtaining module includes a first obtaining unit and a second obtaining unit.
The first obtaining unit is used for, in response to determining that no product completely matching the target user tag exists among the products to be recommended, inputting the second user sub-tag into the product recommendation model to obtain a product matching the second user sub-tag as the target product.
The second obtaining unit is used for, in response to determining that no product matching the second user sub-tag exists among the products to be recommended, inputting the first user sub-tag into the product recommendation model to obtain a product matching the first user sub-tag as the target product.
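A minimal sketch of this weighted fallback is given below; it assumes the product recommendation model exposes a match(tag) lookup over the products to be recommended, which is a hypothetical interface rather than one defined by the disclosure.

    def recommend(model, target_user_tag, first_sub_tag, second_sub_tag):
        """Try the full target user tag first, then fall back to the heavier second sub-tag,
        and finally to the lighter first sub-tag."""
        product = model.match(target_user_tag)      # exact match on the complete target user tag
        if product is None:
            product = model.match(second_sub_tag)   # second sub-tag carries the larger weight
        if product is None:
            product = model.match(first_sub_tag)    # last resort: the lighter first sub-tag
        return product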
According to an embodiment of the present disclosure, any of the first acquisition module 810, the first determination module 820 and the first obtaining module 830, or the fifth determining module 910 and the second obtaining module 920, may be combined into one module for implementation, or any one of these modules may be split into a plurality of modules. Alternatively, at least some of the functionality of one or more of these modules may be combined with at least some of the functionality of other modules and implemented in one module. According to embodiments of the present disclosure, at least one of the first acquisition module 810, the first determination module 820 and the first obtaining module 830, or the fifth determining module 910 and the second obtaining module 920, may be implemented at least in part as hardware circuitry, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system in a package, or an Application Specific Integrated Circuit (ASIC), or as hardware or firmware in any other reasonable manner of integrating or packaging the circuitry, or as any one of, or a suitable combination of any of, the three implementation manners of software, hardware, and firmware. Alternatively, at least one of the first acquisition module 810, the first determination module 820 and the first obtaining module 830, or the fifth determining module 910 and the second obtaining module 920, may be at least partially implemented as a computer program module which, when executed, may perform the corresponding functions.
Fig. 10 schematically illustrates a block diagram of an electronic device adapted to implement at least one of a training method and a product recommendation method of a product recommendation model, in accordance with an embodiment of the present disclosure.
As shown in fig. 10, an electronic device 1000 according to an embodiment of the present disclosure includes a processor 1001 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1002 or a program loaded from a storage section 1008 into a Random Access Memory (RAM) 1003. The processor 1001 may include, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or an associated chipset and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), or the like. The processor 1001 may also include on-board memory for caching purposes. The processor 1001 may include a single processing unit or multiple processing units for performing different actions of the method flows according to embodiments of the present disclosure.
In the RAM 1003, various programs and data necessary for the operation of the electronic device 1000 are stored. The processor 1001, the ROM 1002, and the RAM 1003 are connected to each other by a bus 1004. The processor 1001 performs various operations of the method flow according to the embodiments of the present disclosure by executing programs in the ROM 1002 and/or the RAM 1003. Note that the programs may also be stored in one or more memories other than the ROM 1002 and the RAM 1003. The processor 1001 may also perform various operations of the method flow according to the embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the disclosure, the electronic device 1000 may also include an input/output (I/O) interface 1005, the input/output (I/O) interface 1005 also being connected to the bus 1004. The electronic device 1000 may also include one or more of the following components connected to an input/output (I/O) interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output portion 1007 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), etc., and a speaker, etc.; a storage portion 1008 including a hard disk or the like; and a communication section 1009 including a network interface card such as a LAN card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The drive 1010 is also connected to an input/output (I/O) interface 1005 as needed. A removable medium 1011, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is installed as needed in the drive 1010, so that a computer program read out therefrom is installed as needed in the storage section 1008.
The present disclosure also provides a computer-readable storage medium that may be embodied in the apparatus/device/system described in the above embodiments; or may exist alone without being assembled into the apparatus/device/system. The computer-readable storage medium carries one or more programs which, when executed, implement methods in accordance with embodiments of the present disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example, but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the present disclosure, the computer-readable storage medium may include ROM 1002 and/or RAM 1003 and/or one or more memories other than ROM 1002 and RAM 1003 described above.
Embodiments of the present disclosure also include a computer program product comprising a computer program containing program code for performing the methods shown in the flowcharts. The program code, when executed in a computer system, causes the computer system to implement the product recommendation method provided by embodiments of the present disclosure.
The above-described functions defined in the system/apparatus of the embodiments of the present disclosure are performed when the computer program is executed by the processor 1001. The systems, apparatus, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the disclosure.
In one embodiment, the computer program may be carried on a tangible storage medium such as an optical storage device, a magnetic storage device, or the like. In another embodiment, the computer program may also be transmitted and distributed in the form of a signal over a network medium, and downloaded and installed via the communication section 1009, and/or installed from the removable medium 1011. The program code contained in the computer program may be transmitted using any appropriate medium, including but not limited to wireless, wired, or any suitable combination of the foregoing.
According to embodiments of the present disclosure, program code for carrying out the computer programs provided by the embodiments of the present disclosure may be written in any combination of one or more programming languages; in particular, such computer programs may be implemented in high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. Programming languages include, but are not limited to, Java, C++, Python, the "C" language, or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments of the disclosure and/or in the claims may be combined and/or integrated in a variety of ways, even if such combinations or integrations are not explicitly recited in the disclosure. In particular, the features recited in the various embodiments and/or claims of the present disclosure may be combined and/or integrated in various ways without departing from the spirit and teachings of the present disclosure. All such combinations and/or integrations fall within the scope of the present disclosure.
The embodiments of the present disclosure are described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described above separately, this does not mean that the measures in the embodiments cannot be used advantageously in combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be made by those skilled in the art without departing from the scope of the disclosure, and such alternatives and modifications are intended to fall within the scope of the disclosure.

Claims (14)

1. A training method of a product recommendation model, comprising:
obtaining a product relation graph model constructed according to product labels, wherein the product relation graph model comprises first node information, second node information and side information, the first node information comprises a first product label and a first user label, the second node information comprises only a second product label, and the side information represents an association relation between two nodes corresponding to the side information;
determining a second user tag corresponding to the second product tag according to the first product tag, the first user tag, the second product tag and the side information; and
and obtaining a trained product recommendation model according to the first node information, the side information and the second node information with the second product label and the second user label.
2. The method of claim 1, wherein the determining a second user tag corresponding to the second product tag according to the first product tag, the first user tag, the second product tag, and the side information comprises:
determining an information propagation probability between the first node information and the second node information according to a second similarity between the first product tag and the second product tag; and
and determining the second user tag according to the first user tag and the information propagation probability.
3. The method of claim 2, wherein the determining the information propagation probability between the first node information and the second node information according to a second similarity between the first product tag and the second product tag comprises:
determining first side weight information between the first node information and the second node information according to preset constraint parameters and a Euclidean distance between the first product label and the second product label;
determining second side weight information between every two pieces of node information among the first node information and the second node information; and
and determining the information propagation probability according to the first side weight information and the second side weight information.
4. The method of claim 1, wherein the determining a second user tag corresponding to the second product tag according to the first product tag, the first user tag, the second product tag, and the side information comprises:
determining a probability transition matrix according to the information propagation probability between every two pieces of node information among the first node information and the second node information;
determining an initial user soft tag matrix according to the first user tag and initial parameter information representing the second user tag;
determining a target user soft tag matrix according to the probability transition matrix and the initial user soft tag matrix, wherein elements in the target user soft tag matrix comprise user tag information and probability values corresponding to the user tag information; and
and determining the second user tag according to the target user soft tag matrix.
5. The method of claim 4, wherein the determining a target user soft tag matrix according to the probability transition matrix and the initial user soft tag matrix comprises:
obtaining an (i+1)-th round user soft tag matrix according to the probability transition matrix and an i-th round user soft tag matrix, wherein the 1st round user soft tag matrix is the initial user soft tag matrix, and i is an integer greater than or equal to 1; and
and determining an I-th round user soft tag matrix as the target user soft tag matrix in response to determining that the information propagation probability between the first target product tag information and the second target product tag information obtained in the I-th round is smaller than a preset threshold, wherein the first target product tag information characterizes product tag information corresponding to a node for obtaining the second user tag information in the I-th round, and the second target product tag information is product tag information with a node association relation with the first target product tag information.
6. The method of claim 4, wherein the determining the second user tag according to the target user soft tag matrix comprises:
in response to determining that the user tags corresponding to the same second node information comprise a plurality of candidate user tags, determining the candidate user tag with the highest probability value among the plurality of candidate user tags as the second user tag of the second node information.
7. The method of any of claims 1-6, further comprising, prior to the obtaining of the product relation graph model constructed according to the product labels:
acquiring first product attribute information of a first product and second product attribute information of a second product;
carrying out structural conversion on the first product attribute information according to a first predefined mapping relation to obtain the first product label, wherein the first predefined mapping relation represents the mapping relation between various product attribute information and numerical information;
carrying out structural conversion on the second product attribute information according to the first predefined mapping relation to obtain the second product label;
acquiring user attribute information of a user related to the first product;
carrying out structural conversion on the user attribute information according to a second predefined mapping relation to obtain the first user tag, wherein the second predefined mapping relation represents the mapping relation between various user attribute information and numerical information;
determining the first node information according to the first product tag and the first user tag corresponding to the same first product;
determining the second node information according to the second product label; and
and determining the side information according to the first similarity between every two product labels among the first product label and the second product label.
8. A product recommendation method comprising:
determining a target user tag according to target user attribute information of a target user; and
inputting the target user tag into a product recommendation model to obtain a target product recommended for the target user, wherein the product recommendation model is a trained product recommendation model obtained according to the method of any one of claims 1-7.
9. The method of claim 8, wherein the target user tag comprises at least a first user sub-tag and a second user sub-tag, the first user sub-tag having a first weight and the second user sub-tag having a second weight, the second weight being greater than the first weight; and the inputting the target user tag into a product recommendation model to obtain the target product recommended for the target user comprises:
in response to determining that no product which is completely matched with the target user tag exists in the products to be recommended, inputting the second user sub-tag into the product recommendation model to obtain a product which is matched with the second user sub-tag as the target product; and
and in response to determining that no product matched with the second user sub-label exists in the products to be recommended, inputting the first user sub-label into the product recommendation model to obtain the product matched with the first user sub-label as the target product.
10. A training device for a product recommendation model, comprising:
the first acquisition module is used for acquiring a product relation graph model constructed according to the product labels, wherein the product relation graph model comprises first node information, second node information and side information, the first node information comprises a first product label and a first user label, the second node information comprises only a second product label, and the side information represents that an association relationship exists between two nodes corresponding to the side information;
a first determining module, configured to determine a second user tag corresponding to the second product tag according to the first product tag, the first user tag, the second product tag, and the side information; and
The first obtaining module is configured to obtain a trained product recommendation model according to the first node information, the side information, and second node information having the second product tag and the second user tag.
11. A product recommendation device, comprising:
a fifth determining module, configured to determine a target user tag according to target user attribute information of the target user; and
a second obtaining module, configured to input the target user tag into a product recommendation model to obtain a target product recommended for the target user, where the product recommendation model is a trained product recommendation model obtained according to the apparatus of claim 10.
12. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-9.
13. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method according to any of claims 1 to 9.
14. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 9.
CN202310610396.0A 2023-05-26 2023-05-26 Training of product recommendation model, product recommendation method and device and electronic equipment Pending CN116561429A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310610396.0A CN116561429A (en) 2023-05-26 2023-05-26 Training of product recommendation model, product recommendation method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN116561429A true CN116561429A (en) 2023-08-08

Family

ID=87486078

Country Status (1)

Country Link
CN (1) CN116561429A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination