CN117474634A - Product recommendation method, device, equipment and storage medium - Google Patents


Info

Publication number: CN117474634A
Application number: CN202311669698.1A
Authority: CN (China)
Prior art keywords: information, initial, model, training, sample
Legal status: Pending (the listed status is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 李文启
Current Assignee: Agricultural Bank of China
Original Assignee: Agricultural Bank of China
Application filed by: Agricultural Bank of China

Classifications

    • G06Q 30/0631: Electronic shopping [e-shopping], item recommendations
    • G06F 18/214: Pattern recognition, generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F 18/2411: Classification techniques based on the proximity to a decision surface, e.g. support vector machines
    • G06N 20/00: Machine learning
    • G06Q 40/02: Banking, e.g. interest calculation or account maintenance
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Strategic Management (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Marketing (AREA)
  • Software Systems (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Medical Informatics (AREA)
  • Technology Law (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a product recommendation method, device, equipment and storage medium. The method comprises: acquiring initial information corresponding to an object to be recommended; inputting the initial information into a first model to obtain target information corresponding to the object, wherein the first model is obtained by iteratively training a first initial model through a first training sample set comprising initial information samples and their corresponding target information samples; inputting the target information into a second model to obtain target product information corresponding to the object, wherein the second model is obtained by iteratively training a second initial model through a second training sample set comprising target information samples and their corresponding product information samples; and recommending the target product information to the object. Through this technical scheme, financial products can be pushed to users efficiently and accurately, improving the user experience.

Description

Product recommendation method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of data processing, in particular to a product recommendation method, a device, equipment and a storage medium.
Background
At present, banks serve a huge user base, so discovering more potential customers for financial products and pushing the bank's financial products to that customer base is a feasible scheme. However, locating potential high-quality users and pushing financial products to them accurately takes a great deal of time and labor. Therefore, an efficient financial product pushing method is needed to push financial products to potential premium customers better and faster.
Disclosure of Invention
The embodiment of the invention provides a product recommendation method, device, equipment and storage medium, which can push financial products to users efficiently and accurately and improve the user experience.
According to an aspect of the present invention, there is provided a product recommendation method including:
acquiring initial information corresponding to an object to be recommended;
inputting the initial information into a first model to obtain target information corresponding to the object to be recommended, wherein the first model is obtained by iteratively training a first initial model through a first training sample set, and the first training sample set comprises: initial information samples and target information samples corresponding to the initial information samples;
inputting the target information into a second model to obtain target product information corresponding to the object to be recommended, wherein the second model is obtained by iteratively training a second initial model through a second training sample set, and the second training sample set comprises: target information samples and product information samples corresponding to the target information samples;
and recommending the target product information to the object to be recommended.
According to another aspect of the present invention, there is provided a product recommendation apparatus, comprising:
the acquisition module is used for acquiring initial information corresponding to the object to be recommended;
the first input module is configured to input the initial information into a first model to obtain target information corresponding to the object to be recommended, where the first model is obtained by iteratively training a first initial model through a first training sample set, and the first training sample set includes: initial information samples and target information samples corresponding to the initial information samples;
the second input module is configured to input the target information into a second model to obtain target product information corresponding to the object to be recommended, where the second model is obtained by iteratively training a second initial model through a second training sample set, and the second training sample set includes: target information samples and product information samples corresponding to the target information samples;
and the recommending module is used for recommending the target product information to the object to be recommended.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the product recommendation method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute the product recommendation method according to any of the embodiments of the present invention.
According to the embodiment of the invention, initial information corresponding to the object to be recommended is acquired and input into a first model to obtain target information corresponding to the object, the first model being obtained by iteratively training a first initial model through a first training sample set comprising initial information samples and their corresponding target information samples; the target information is then input into a second model to obtain target product information corresponding to the object, the second model being obtained by iteratively training a second initial model through a second training sample set comprising target information samples and their corresponding product information samples; finally, the target product information is recommended to the object. Through this technical scheme, financial products can be pushed to users efficiently and accurately, and the user experience is improved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a product recommendation method in an embodiment of the invention;
FIG. 2 is a schematic diagram of a product recommendation device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device implementing a product recommendation method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It will be appreciated that, before the technical solutions disclosed in the embodiments of the present disclosure are used, the user should be informed, in an appropriate manner and in accordance with relevant laws and regulations, of the type, scope of use, usage scenarios, etc. of the personal information involved, and the user's authorization should be obtained.
Example 1
Fig. 1 is a flowchart of a product recommendation method in an embodiment of the present invention, where the method may be applied to a product recommendation case, and the method may be performed by a product recommendation device in an embodiment of the present invention, where the device may be implemented in a software and/or hardware manner, as shown in fig. 1, and the method specifically includes the following steps:
S101, obtaining initial information corresponding to an object to be recommended.
In this embodiment, the object to be recommended may be, for example, a user in a bank's customer base to whom product information is to be recommended.
Optionally, the initial information corresponding to the object to be recommended includes: attribute information, behavior information, and historical product information.
The attribute information may be basic information data of an object to be recommended, for example, may be personal information of a client; the behavior information may be behavior data of an object to be recommended, for example, the user purchases a financial product, or the user performs a deposit and withdrawal operation; the historical product information may be financial product information that the object to be recommended has purchased.
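For illustration, the three kinds of initial information described above could be grouped into one record per object to be recommended. The field names below are assumptions made for this sketch, not terms from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class InitialInfo:
    """Hypothetical container for one object's initial information."""
    attribute_info: dict        # basic personal data, e.g. {"age": 35}
    behavior_info: list         # behavior records, e.g. deposits, purchases
    historical_products: list = field(default_factory=list)  # products already bought

sample = InitialInfo(
    attribute_info={"age": 35, "region": "north"},
    behavior_info=["deposit", "purchased_fund"],
    historical_products=["money_market_fund"],
)
print(sample.attribute_info["age"])  # → 35
```

A real system would populate such records from the bank's user tables before passing them to the first model.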
Advances in information technology and their massive use in the banking industry have resulted in massive amounts of user personal basic information data and user behavior data, which have embedded therein a great deal of information useful to the industry. By data mining of such user information, such as information of the user's financial condition, risk preference, financial product purchase history, financial product purchase category, etc., a huge data set can be obtained. However, the data set formed by these data contains a large number of features with different information and often has a high dimension, so that further processing of these data requires a large amount of computational cost. In addition, these data often contain partially redundant, uncorrelated, and noisy features, resulting in misinterpretation of the correlation between the data and the class labels by the machine learning model during the training process. The existence of these features not only increases the consumption of computing resources, but also brings about the problem of overfitting, which reduces the classification accuracy of machine learning to some extent. Therefore, in order to more efficiently and accurately use these data to determine whether a banking customer is inclined to purchase a financial product, it is necessary to pre-process these data using feature selection.
Feature selection is an important step in data preprocessing aimed at selecting the feature subset with the greatest amount of information from a given feature set, thereby better extracting feature information for proper classification. By eliminating redundant, uncorrelated or noisy features in the original dataset, feature selection can achieve the goal of saving computing resources and improve the classification accuracy and generalization ability of the machine learning algorithm to some extent. Based on these advantages of feature selection, feature selection is considered an important element in refining data used in machine learning models.
The feature selection method can effectively eliminate irrelevant, redundant and noise features in massive user data, simplify the data set, reduce the consumption of computing resources for the training process of the subsequent machine learning model, and improve the classification performance of machine learning to a certain extent, so that the obtained model can judge high-quality clients more accurately. Feature selection is a complex optimization problem that is difficult to solve not only because the search space is large, but also because the data features are not independent of each other. Interactions between features may cause individually related features to become redundant or individually weakly related features to become highly related when combined with other features. This clearly increases the complexity of the problem. Therefore, the use of scientifically efficient search techniques is a key factor in the feature selection problem.
The existing feature selection method for simplifying mass data information can be divided into two main types according to the evaluation standard of feature subsets: filter-based methods and wrapper-based methods. Filter-based methods utilize knowledge from different disciplines as criteria for measuring subsets of features, and in particular can be further divided into univariate methods of evaluating and ordering individual features to select features, such as: chi-squared, information gain, mutual information, etc., and multivariate methods such as FCBF and CFS that evaluate feature subsets as a whole. In the wrapper-based method, a machine learning classification model is used to determine whether the feature contains more valid information. In addition, the search techniques used in the present feature selection problem can be divided into: exhaustive searches, random searches, and heuristic searches.
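As a sketch of one univariate filter criterion named above, information gain can be computed in pure Python for a discrete feature against a class label (the toy data are invented for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """H(labels) minus the entropy of labels conditioned on the feature."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

# Toy data: the feature predicts the label perfectly, so the gain
# equals the full label entropy (1 bit for a balanced binary label).
print(information_gain([0, 0, 1, 1], [0, 0, 1, 1]))  # → 1.0
```

A filter method would score every feature this way and keep the top-ranked ones, without consulting any classifier.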
The wrapper-based approach relies on the classification results of the machine learning model in the feature subset evaluation process. The introduction of the classification model enables the wrapper-based method to more effectively analyze the merits of the feature subsets, thereby enabling the selected feature subsets to iterate in a more optimal direction during feature selection. Compared to wrapper-based methods, filter-based methods ignore the performance of selected features on classification algorithms, and thus the resulting feature subsets tend to be lower in classification performance than wrapper-based methods.
Some search techniques, such as exhaustive and random searches, have been applied to solve the feature selection problem. However, when the number of features of a dataset is too large, an exhaustive search may become impractical: for a dataset with N features there are 2^N possible feature subsets, so the computational cost of an exhaustive search increases exponentially. A random search selects feature subsets in a completely random manner and cannot exploit earlier experience during the search, so it is somewhat blind. In the worst case, a random search falls into the same dilemma as an exhaustive search.
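The exponential size of the search space, and the blindness of random search, can be made concrete with a short sketch (the toy fitness below simply counts bit differences from a known-good mask; it stands in for a real classifier-based evaluation):

```python
import random

def n_subsets(n_features):
    """Number of candidate feature subsets (including the empty subset)."""
    return 2 ** n_features

def random_search(n_features, fitness, n_trials, seed=0):
    """Baseline: draw masks uniformly at random; past draws teach it nothing."""
    rng = random.Random(seed)
    best_mask, best_fit = None, float("inf")
    for _ in range(n_trials):
        mask = [rng.randint(0, 1) for _ in range(n_features)]
        f = fitness(mask)
        if f < best_fit:
            best_mask, best_fit = mask, f
    return best_mask, best_fit

# Toy fitness: bit distance to a known-good mask.
target = [1, 0, 1, 0]
fit = lambda m: sum(a != b for a, b in zip(m, target))
print(n_subsets(20))   # → 1048576, for just 20 features
mask, best = random_search(4, fit, 200)
```

With only 4 features the random baseline works, but nothing in it scales: each trial is independent, which is exactly the blindness the text describes.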
Therefore, the present embodiment uses an efficient search technique for feature selection to enable a resulting feature subset to have a smaller number of features while ensuring high classification accuracy.
S102, inputting initial information into the first model to obtain target information corresponding to the object to be recommended.
In this embodiment, the first model may be a model for screening the initial information to obtain a reduced data set. Preferably, the first model may be a trained gray wolf optimization algorithm model.
The first model is obtained by iteratively training a first initial model through a first training sample set.
In the actual operation process, the first training sample set may be constructed by using basic information data and user behavior data of a large number of users actually acquired by a certain bank.
Preferably, in this embodiment, the first initial model may be an untrained gray wolf optimization algorithm model.
Wherein the first training sample set comprises: the initial information sample and the target information sample corresponding to the initial information sample.
In this embodiment, the initial information sample may be basic information data and user behavior data of a large number of users that are not filtered, and the target information sample may be a reduced data set determined from the initial information sample.
The target information may be the reduced basic information data and user behavior data output after the first model filters the input initial information.
According to the method, a feature selection method is introduced in the process of training the prediction model, initial information of users to be recommended is input into the first model for screening, irrelevant, redundant and noise features in massive user data are eliminated, a data set is simplified, the calculation resource consumption is reduced for the training process of a subsequent machine learning model, and the classification performance of machine learning is improved to a certain extent, so that the obtained model can judge high-quality clients more accurately.
The embodiment of the invention selects the gray wolf optimization algorithm with higher performance as the search technology in the characteristic selection process, so that the exploration degree of a problem solution space is improved, a better characteristic subset is easy to obtain, and the obtained characteristic subset is effectively prevented from sinking into local optimum.
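The patent does not disclose the internals of its grey wolf optimizer, so the following is only a generic sketch of a binary grey wolf optimizer (BGWO) for feature selection; the sigmoid transfer function and parameter schedule are common choices from the literature, not taken from the patent:

```python
import math
import random

def bgwo_feature_select(fitness, n_features, n_wolves=8, n_iters=30, seed=1):
    """Binary grey wolf optimizer sketch: each wolf is a 0/1 feature mask.

    `fitness` maps a mask to a value to minimize. The three best wolves
    (alpha, beta, delta) attract the rest; a sigmoid transfer function
    maps the continuous grey-wolf step back to a bit.
    """
    rng = random.Random(seed)
    wolves = [[rng.randint(0, 1) for _ in range(n_features)]
              for _ in range(n_wolves)]
    for t in range(n_iters):
        wolves.sort(key=fitness)
        leaders = wolves[:3]                 # alpha, beta, delta
        a = 2 - 2 * t / n_iters              # control parameter decays 2 -> 0
        for w in wolves[3:]:
            for d in range(n_features):
                x = 0.0
                for leader in leaders:
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - w[d])
                    x += (leader[d] - A * D) / 3
                prob = 1 / (1 + math.exp(-10 * (x - 0.5)))   # transfer function
                w[d] = 1 if rng.random() < prob else 0
    return min(wolves, key=fitness)

# Toy fitness: distance to a known-good mask (stands in for a classifier score).
target = [1, 1, 0, 0, 1, 0]
best = bgwo_feature_select(lambda m: sum(a != b for a, b in zip(m, target)), 6)
```

In the embodiment, `fitness` would instead be the weighted combination of classification error rate and feature selectivity described later in this description.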
S103, inputting the target information into the second model to obtain target product information corresponding to the object to be recommended.
In this embodiment, the second model may be a model for determining product information suitable for recommendation to the user to be recommended based on the target information. Preferably, the second model may be a trained machine learning model; specifically, it may be, for example, a trained SVM (Support Vector Machine) model.
The second model is obtained by iteratively training a second initial model through a second training sample set.
In the actual operation process, the second training sample set can be constructed using the reduced data set obtained from the basic information data and user behavior data of massive users actually acquired by a bank, together with the product information suitable for each user.
Preferably, in the present embodiment, the second initial model may be an untrained machine learning model, and specifically, may be an untrained SVM model, for example.
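The patent names an SVM as a preferred second model but gives no training procedure. As an illustration under that assumption, a minimal linear SVM can be trained by full-batch sub-gradient descent on the regularized hinge loss (a production system would more likely use a library implementation):

```python
def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=500):
    """Full-batch sub-gradient descent on the L2-regularized hinge loss.
    X: feature vectors (bias folded in as a constant 1); y: labels in {-1, +1}.
    """
    n, dim = len(X), len(X[0])
    w = [0.0] * dim
    for _ in range(epochs):
        grad = [lam * wj for wj in w]              # regularizer gradient
        for xi, yi in zip(X, y):
            if yi * sum(wj * xj for wj, xj in zip(w, xi)) < 1:
                for j in range(dim):               # hinge-loss sub-gradient
                    grad[j] -= yi * xi[j] / n
        w = [wj - lr * gj for wj, gj in zip(w, grad)]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Linearly separable toy data.
X = [[1, 2, 1], [2, 3, 1], [-1, -2, 1], [-2, -1, 1]]
y = [1, 1, -1, -1]
w = train_linear_svm(X, y)
print([predict(w, x) for x in X])  # → [1, 1, -1, -1]
```

In the embodiment's setting, the labels would indicate whether a customer is inclined to purchase a given financial product, and the inputs would be the reduced target-information vectors.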
Wherein the second training sample set comprises: the target information sample and the product information sample corresponding to the target information sample.
It should be noted that, in this embodiment, the product information samples may be appropriate recommended product information corresponding to a huge amount of target information samples.
The target product information may be suitable recommended product information output after the second model is matched according to the input target information.
Specifically, the target information is input into the second model to obtain target product information corresponding to the object to be recommended, so that financial products possibly purchased by the user to be recommended can be predicted more quickly and accurately.
S104, recommending the target product information to the object to be recommended.
Specifically, the determined target product information is recommended to the object to be recommended.
According to the embodiment of the invention, initial information corresponding to the object to be recommended is acquired and input into a first model to obtain target information corresponding to the object, the first model being obtained by iteratively training a first initial model through a first training sample set comprising initial information samples and their corresponding target information samples; the target information is then input into a second model to obtain target product information corresponding to the object, the second model being obtained by iteratively training a second initial model through a second training sample set comprising target information samples and their corresponding product information samples; finally, the target product information is recommended to the object. Through this technical scheme, financial products can be pushed to users efficiently and accurately, and the user experience is improved.
Optionally, iteratively training the first initial model through the first training sample set includes:
a first initial model is established.
And inputting the initial information samples in the first training sample set into a first initial model to obtain prediction information.
The prediction information may be a filtered reduced data set output by the first initial model, and may include basic information data of the user, user behavior data, and the like.
And training parameters of a first initial model according to a first function formed by the target information sample corresponding to the initial information sample and the prediction information.
The first function may be a function formed according to the target information sample and the prediction information corresponding to the initial information sample, and used for improving the accuracy of selecting the data screening features by the model. For example, the parameter of the first initial model may be a weight of the first initial model.
Then, the step of inputting the initial information samples in the first training sample set into the first initial model to obtain prediction information is executed again, iterating until the first model is obtained.
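The loop described in the steps above (predict, score with the first function, update the parameters, repeat until the model is obtained) can be sketched generically; the mean-squared-error score and gradient update below are placeholders, since the patent leaves the concrete first function and update rule open:

```python
def iterative_train(samples, targets, predict, update, n_iters=50):
    """Generic sketch of the loop: predict, score with a stand-in 'first
    function' (mean squared error here), update the parameter, repeat."""
    w = 0.0
    loss = float("inf")
    for _ in range(n_iters):
        preds = [predict(w, x) for x in samples]
        loss = sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(samples)
        w = update(w, samples, targets)
    return w, loss

# Toy instance: learn y = 2x; the update rule is a plain gradient step.
def grad_step(w, xs, ys, lr=0.05):
    g = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    return w - lr * g

xs, ys = [1, 2, 3], [2, 4, 6]
w, loss = iterative_train(xs, ys, lambda w, x: w * x, grad_step)
print(round(w, 6))  # → 2.0
```

The same skeleton covers the second model's training, with predicted product information scored by the second function instead.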
Optionally, iteratively training a second initial model through a second training sample set, including:
a second initial model is established.
And inputting the target information sample in the second training sample set into a second initial model to obtain predicted product information.
The predicted product information may be appropriate product information corresponding to the target information sample output by the second initial model.
And training parameters of a second initial model according to a second function formed by the product information sample corresponding to the target information sample and the predicted product information.
The second function may be formed according to a product information sample corresponding to the target information sample and predicted product information, and is a function for improving matching accuracy of the model to the user information and the product information. Illustratively, the parameter of the second initial model may be a weight of the second initial model.
Then, the step of inputting the target information samples in the second training sample set into the second initial model to obtain predicted product information is executed again, iterating until the second model is obtained.
In the actual operation process, the purpose of the feature selection problem is to obtain an optimal feature subset by deleting redundant features and retaining the features carrying the largest amount of information. Each solution to the feature selection problem can therefore be represented simply and clearly by a one-dimensional vector: the dimension of the vector equals the number of features of the dataset to be processed, each dimension corresponds to a feature of the dataset in order, and the value of each dimension is either 0 or 1, where 1 means the corresponding feature is selected and 0 means it is discarded. Thus, the feature selection problem can be seen as a binary optimization problem whose solution space is a discrete binary space.

Evaluation criteria for solutions: the evaluation criterion for a feature selection solution can be defined mathematically as a fitness function value. Constructing a fitness function that matches the goals of feature selection is an important step. There are two main goals in the feature selection problem: one is to improve classification accuracy, i.e. to reduce the classification error rate; the other is to minimize the number of selected features. To cope with this, the present embodiment proceeds as follows:
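The 0/1-vector encoding of a candidate solution can be shown in a few lines (the feature names are invented for illustration):

```python
features = ["age", "balance", "deposit_freq", "risk_pref", "product_count"]
mask = [1, 0, 1, 1, 0]   # 1 = feature selected, 0 = feature discarded

selected = [f for f, m in zip(features, mask) if m == 1]
print(selected)  # → ['age', 'deposit_freq', 'risk_pref']
```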
Optionally, training parameters of the first initial model according to a first function formed by the target information sample corresponding to the initial information sample and the prediction information includes:
and determining a first metric and a second metric according to the target information sample and the prediction information corresponding to the initial information sample.
In this embodiment, the first metric may be a classification error rate, and the second metric may be a feature selection rate.
Specifically, the classification error rate can be expressed as:
f1 = 1 - Arr;
where f1 represents the classification error rate and Arr represents the classification accuracy. The classification accuracy Arr may be obtained by a specific classifier; for example, an SVM classifier may be used to calculate it. The SVM classifier may be a separately trained classifier, or it may be the second model in this embodiment; this embodiment does not limit this. Specifically, the embodiment of the invention may use a dataset consisting of customer personal information, behavior data, purchased financial product information and other class labels, select 80% of the data as a training set for training the SVM model, and use the remaining 20% as a test set for verifying the accuracy of the classification model.
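The 80/20 split and the accuracy term Arr can be sketched without any ML library; the "classifier" below is a stand-in threshold rule rather than a real SVM:

```python
import random

def split_80_20(data, seed=0):
    """Shuffle a copy, then take 80% for training and 20% for testing."""
    rng = random.Random(seed)
    data = data[:]
    rng.shuffle(data)
    cut = int(0.8 * len(data))
    return data[:cut], data[cut:]

def accuracy(y_true, y_pred):
    return sum(a == b for a, b in zip(y_true, y_pred)) / len(y_true)

data = [(x, int(x > 5)) for x in range(10)]    # toy (feature, label) pairs
train, test = split_80_20(data)
preds = [int(x > 5) for x, _ in test]          # stand-in "classifier"
arr = accuracy([y for _, y in test], preds)
f1_error = 1 - arr                             # the classification error rate f1
print(len(train), len(test), f1_error)  # → 8 2 0.0
```

In the embodiment, `preds` would instead come from the trained SVM evaluated on the held-out 20%.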
Further, the feature selectivity can be expressed as:
f2 = N_S / N_F;

where f2 represents the feature selectivity, and N_S and N_F represent the number of selected features and the total number of features, respectively.
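A minimal sketch of the feature selectivity, assuming a plain Python list as the 0/1 solution vector (the function name is illustrative):

```python
def feature_selectivity(mask):
    """f2 = N_S / N_F: selected-feature count over total feature count."""
    n_s = sum(mask)   # N_S: number of dimensions with value 1 (selected)
    n_f = len(mask)   # N_F: total number of features in the data set
    return n_s / n_f

# Solution vector: 1 keeps the corresponding feature, 0 discards it.
mask = [1, 0, 1, 1, 0, 0, 0, 1]
f2 = feature_selectivity(mask)
print(f2)  # 4 selected out of 8 features -> 0.5
```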
The first function is determined from the first metric and the second metric.
Specifically, the first function is determined according to the classification error rate f1 and the feature selectivity f2.
Parameters of the first initial model are trained according to a first function.
In practice, the two goals of improving classification accuracy (reducing the classification error rate) and minimizing the number of selected features are often contradictory: reducing the number of features may reduce accuracy, while increasing accuracy may require selecting more features. Taking both cases into consideration, a linear weighting method is adopted to construct a first function from the classification error rate f1 and the feature selectivity f2.
Optionally, determining the first function according to the first metric and the second metric includes:
and acquiring a first weight, a second weight and a preset step length.
The first weight may be a weight corresponding to a first metric preset according to an actual situation, and the second weight may be a weight corresponding to a second metric preset according to an actual situation. In this embodiment, the first weight and the second weight satisfy the constraint condition that the sum is 1, but specific values of the first weight and the second weight may be set by the user, which is not limited in this embodiment.
The preset step length may be a step length of performing iterative update calculation on the first weight and the second weight, and specific data of the preset step length is not limited in this embodiment.
And determining an initial function according to the first weight, the first metric, the second weight and the second metric.
In this embodiment, the representation of the initial function may be, for example: first weight x first metric + second weight x second metric. Specifically, the initial function may be expressed as:
fitness = a·f1 + b·f2;

where a ∈ [0, 1] and b = 1 - a are used to balance the two parameters, the classification error rate f1 and the feature selectivity f2.
And determining a first function according to the preset step length and the initial function.
Specifically, the first weight and the second weight in the initial function are iteratively updated according to a preset step length, and finally the first function is determined.
Optionally, determining the first function according to the preset step size and the initial function includes:
and acquiring the fitness function value corresponding to the initial function.
Specifically, the fitness function value fitness corresponding to the initial function is calculated.
And carrying out iterative updating on the first weight and the second weight in the initial function based on a preset step length to obtain new first weight and second weight, and obtaining a new fitness function value corresponding to the function after each iterative updating.
For example, the first weight may be 0.1, the second weight 0.9, and the preset step size 0.01. The initial function is then 0.1×first metric+0.9×second metric; after the first iterative update the function becomes 0.11×first metric+0.89×second metric, after the second it becomes 0.12×first metric+0.88×second metric, and so on. Two new weights and a new function are obtained after each iterative update, and the fitness function value corresponding to each updated function is calculated.
The function formed by the first metric, the second metric, and the first weight and second weight corresponding to the lowest fitness function value is determined as the first function.
For example, if the fitness function value is lowest when a takes the value 0.99, then 0.99×first metric+0.01×second metric may be determined as the first function, and the parameters of the first initial model are trained according to this first function until the first model is obtained.
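The weight sweep described above can be sketched as follows; the metric values passed in are placeholder numbers and the function name is an assumption, not part of the patent text:

```python
def best_weights(f1, f2, step=0.01):
    """Sweep a over [0, 1] in `step` increments with b = 1 - a,
    returning (a, b, fitness) for the lowest fitness = a*f1 + b*f2."""
    best = None
    n_steps = int(round(1 / step))
    for i in range(n_steps + 1):
        a = i * step
        b = 1 - a
        fitness = a * f1 + b * f2
        if best is None or fitness < best[2]:
            best = (a, b, fitness)
    return best

# Placeholder metric values for illustration only.
a, b, fit = best_weights(f1=0.08, f2=0.35)
print(a, b, round(fit, 4))
```

Because the fitness is linear in a, the sweep always settles on one end of [0, 1]; with a nonlinear trade-off or per-candidate retraining, intermediate weights can win.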
According to the technical scheme of this embodiment, an efficient feature selection method is combined with the bank's financial product business to design a set of accurate financial product pushing strategies. Massive user personal data are screened by the feature selection method to find the features most highly correlated with whether a customer purchases financial products, and a simplified data set is built from this information-rich feature subset. The simplified data set is used to train a machine learning model that captures the potential associations between user information, user behavior, and whether the user wishes to purchase financial products. Finally, newly obtained data of customers to be recommended are analyzed by the trained model, and the corresponding financial products are pushed to the high-quality customers predicted to have purchase intent. With the technical scheme provided by the embodiment of the invention, newly obtained user information data and behavior data need not be used in full: only part of the data is selected, so the trained SVM model can predict and recommend financial products the user may purchase more quickly and accurately, improving user experience.
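The end-to-end pushing strategy summarized above can be sketched as follows; the synthetic data, the illustrative feature mask, and the use of scikit-learn are all assumptions standing in for the bank's actual data and models:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Stand-in for customer personal information and behavior data with a
# purchased-or-not label; the real bank data is not public.
X, y = make_classification(n_samples=400, n_features=12, n_informative=5,
                           random_state=1)

# Feature subset found by the feature-selection step (illustrative mask).
mask = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0], dtype=bool)
X_reduced = X[:, mask]  # simplified data set built from the subset

X_train, X_test, y_train, y_test = train_test_split(
    X_reduced, y, test_size=0.2, random_state=1)
model = SVC().fit(X_train, y_train)

# "New customer data to be recommended": predict purchase intent and push
# the product only to customers predicted to buy (label 1).
new_customers = X_test[:5]
will_buy = model.predict(new_customers)
print(will_buy)
```

Only the masked columns of the new customers' data are needed at prediction time, which is what lets the recommendation run on part of the data rather than all of it.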
Example two
Fig. 2 is a schematic structural diagram of a product recommendation device in an embodiment of the invention. The embodiment may be applicable to the case of product recommendation, and the device may be implemented in a software and/or hardware manner, and may be integrated in any device that provides a function of product recommendation, as shown in fig. 2, where the product recommendation device specifically includes: the system comprises an acquisition module 201, a first input module 202, a second input module 203 and a recommendation module 204.
The acquiring module 201 is configured to acquire initial information corresponding to an object to be recommended;
the first input module 202 is configured to input the initial information into a first model to obtain target information corresponding to the object to be recommended, where the first model is obtained by iteratively training a first initial model through a first training sample set, and the first training sample set includes an initial information sample and a target information sample corresponding to the initial information sample;
the second input module 203 is configured to input the target information into a second model to obtain target product information corresponding to the object to be recommended, where the second model is obtained by iteratively training a second initial model through a second training sample set, and the second training sample set includes a target information sample and a product information sample corresponding to the target information sample;
And the recommending module 204 is configured to recommend the target product information to the object to be recommended.
Optionally, the first input module 202 includes:
the first establishing sub-module is used for establishing a first initial model;
the first input sub-module is used for inputting initial information samples in the first training sample set into the first initial model to obtain prediction information;
the first training submodule is used for training parameters of the first initial model according to a first function formed by the target information sample corresponding to the initial information sample and the prediction information;
and the first execution sub-module is used for returning to execute the operation of inputting the initial information samples in the first training sample set into the first initial model to obtain the prediction information until the first model is obtained.
Optionally, the second input module 203 includes:
the second building sub-module is used for building a second initial model;
the second input sub-module is used for inputting the target information sample in the second training sample set into the second initial model to obtain predicted product information;
the second training submodule is used for training parameters of the second initial model according to a second function formed by the product information sample corresponding to the target information sample and the predicted product information;
And the second execution sub-module is used for returning to execute the operation of inputting the target information sample in the second training sample set into the second initial model to obtain the predicted product information until the second model is obtained.
Optionally, the first training submodule includes:
the first determining unit is used for determining a first metric index and a second metric index according to the target information sample corresponding to the initial information sample and the prediction information;
a second determining unit configured to determine a first function according to the first metric and the second metric;
and the training unit is used for training the parameters of the first initial model according to the first function.
Optionally, the second determining unit includes:
the acquisition subunit is used for acquiring the first weight, the second weight and a preset step length;
a first determining subunit, configured to determine an initial function according to the first weight, the first metric, the second weight, and the second metric;
and the second determining subunit is used for determining a first function according to the preset step length and the initial function.
Optionally, the second determining subunit is specifically configured to:
acquiring a fitness function value corresponding to the initial function;
Iteratively updating the first weight and the second weight in the initial function based on the preset step length to obtain new first weight and second weight, and obtaining a new fitness function value corresponding to the function after each iteration update;
and determining, as the first function, a function formed by the first metric, the second metric, and the first weight and second weight corresponding to the lowest fitness function value.
Optionally, the initial information corresponding to the object to be recommended includes: attribute information, behavior information, and historical product information.
The above device can execute the product recommendation method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to execution of that method.
Example III
Fig. 3 shows a schematic diagram of an electronic device 30 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 3, the electronic device 30 includes at least one processor 31, and a memory, such as a Read Only Memory (ROM) 32, a Random Access Memory (RAM) 33, etc., communicatively connected to the at least one processor 31, wherein the memory stores a computer program executable by the at least one processor, and the processor 31 can perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM) 32 or the computer program loaded from the storage unit 38 into the Random Access Memory (RAM) 33. In the RAM 33, various programs and data required for the operation of the electronic device 30 may also be stored. The processor 31, the ROM 32 and the RAM 33 are connected to each other via a bus 34. An input/output (I/O) interface 35 is also connected to bus 34.
Various components in electronic device 30 are connected to I/O interface 35, including: an input unit 36 such as a keyboard, a mouse, etc.; an output unit 37 such as various types of displays, speakers, and the like; a storage unit 38 such as a magnetic disk, an optical disk, or the like; and a communication unit 39 such as a network card, modem, wireless communication transceiver, etc. The communication unit 39 allows the electronic device 30 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 31 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 31 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 31 performs the various methods and processes described above, such as the product recommendation method:
acquiring initial information corresponding to an object to be recommended;
inputting the initial information into a first model to obtain target information corresponding to the object to be recommended, wherein the first model is obtained by iteratively training a first initial model through a first training sample set, and the first training sample set comprises an initial information sample and a target information sample corresponding to the initial information sample;
inputting the target information into a second model to obtain target product information corresponding to the object to be recommended, wherein the second model is obtained by iteratively training a second initial model through a second training sample set, and the second training sample set comprises a target information sample and a product information sample corresponding to the target information sample;
And recommending the target product information to the object to be recommended.
In some embodiments, the product recommendation method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 38. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 30 via the ROM 32 and/or the communication unit 39. When the computer program is loaded into RAM 33 and executed by processor 31, one or more steps of the product recommendation method described above may be performed. Alternatively, in other embodiments, the processor 31 may be configured to perform the product recommendation method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs, the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service expansibility that exist in traditional physical host and VPS (Virtual Private Server) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method of product recommendation, comprising:
acquiring initial information corresponding to an object to be recommended;
inputting the initial information into a first model to obtain target information corresponding to the object to be recommended, wherein the first model is obtained by iteratively training a first initial model through a first training sample set, and the first training sample set comprises an initial information sample and a target information sample corresponding to the initial information sample;
inputting the target information into a second model to obtain target product information corresponding to the object to be recommended, wherein the second model is obtained by iteratively training a second initial model through a second training sample set, and the second training sample set comprises a target information sample and a product information sample corresponding to the target information sample;
and recommending the target product information to the object to be recommended.
2. The method of claim 1, wherein iteratively training the first initial model through the first training sample set comprises:
establishing a first initial model;
inputting an initial information sample in the first training sample set into the first initial model to obtain prediction information;
Training parameters of the first initial model according to a first function formed by the target information sample corresponding to the initial information sample and the prediction information;
and returning to the operation of inputting the initial information samples in the first training sample set into the first initial model to obtain the prediction information until the first model is obtained.
3. The method of claim 1, wherein iteratively training the second initial model through the second training sample set comprises:
establishing a second initial model;
inputting the target information sample in the second training sample set into the second initial model to obtain predicted product information;
training parameters of the second initial model according to a second function formed by the product information sample corresponding to the target information sample and the predicted product information;
and returning to the operation of inputting the target information samples in the second training sample set into the second initial model to obtain predicted product information until a second model is obtained.
4. The method of claim 2, wherein training parameters of the first initial model according to a first function formed by the target information samples corresponding to the initial information samples and the prediction information comprises:
Determining a first metric index and a second metric index according to the target information sample corresponding to the initial information sample and the prediction information;
determining a first function according to the first metric and the second metric;
and training parameters of the first initial model according to the first function.
5. The method of claim 4, wherein determining a first function from the first metric and the second metric comprises:
acquiring a first weight, a second weight and a preset step length;
determining an initial function according to the first weight, the first metric, the second weight and the second metric;
and determining a first function according to the preset step length and the initial function.
6. The method of claim 5, wherein determining a first function from the preset step size and the initial function comprises:
acquiring a fitness function value corresponding to the initial function;
iteratively updating the first weight and the second weight in the initial function based on the preset step length to obtain new first weight and second weight, and obtaining a new fitness function value corresponding to the function after each iteration update;
and determining, as the first function, a function formed by the first metric, the second metric, and the first weight and second weight corresponding to the lowest fitness function value.
7. The method of claim 1, wherein the initial information corresponding to the object to be recommended includes: attribute information, behavior information, and historical product information.
8. A product recommendation device, comprising:
the acquisition module is used for acquiring initial information corresponding to the object to be recommended;
the first input module is configured to input the initial information into a first model to obtain target information corresponding to the object to be recommended, where the first model is obtained by iteratively training a first initial model through a first training sample set, and the first training sample set includes an initial information sample and a target information sample corresponding to the initial information sample;
the second input module is configured to input the target information into a second model to obtain target product information corresponding to the object to be recommended, where the second model is obtained by iteratively training a second initial model through a second training sample set, and the second training sample set includes a target information sample and a product information sample corresponding to the target information sample;
And the recommending module is used for recommending the target product information to the object to be recommended.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the product recommendation method of any one of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to implement the product recommendation method of any one of claims 1-7 when executed.
CN202311669698.1A 2023-12-07 2023-12-07 Product recommendation method, device, equipment and storage medium Pending CN117474634A (en)

Publications (1)

Publication Number Publication Date
CN117474634A true CN117474634A (en) 2024-01-30

