CN112765482A - Product delivery method, device, equipment and computer readable medium - Google Patents

Product delivery method, device, equipment and computer readable medium

Info

Publication number
CN112765482A
CN112765482A (application CN202011545806.0A)
Authority
CN
China
Prior art keywords
feature
product
target
cross
feature vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011545806.0A
Other languages
Chinese (zh)
Inventor
王卿
张懿
付雅馨
陈锋杰
刘冬冬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Weimin Insurance Agency Co Ltd
Original Assignee
Weimin Insurance Agency Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Weimin Insurance Agency Co Ltd filed Critical Weimin Insurance Agency Co Ltd
Priority claimed from CN202011545806.0A
Publication of CN112765482A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9536 Search customisation based on social or collaborative filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/048 Activation functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241 Advertisements
    • G06Q30/0251 Targeted advertisements
    • G06Q30/0253 During e-commerce, i.e. online transactions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241 Advertisements
    • G06Q30/0251 Targeted advertisements
    • G06Q30/0269 Targeted advertisements based on user profile or attribute
    • G06Q30/0271 Personalized advertisement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0631 Item recommendations

Abstract

The application relates to a product delivery method, apparatus, device, and computer-readable medium. The method comprises the following steps: acquiring first feature information of a target object and second feature information of products to be delivered, wherein the first feature information represents the historical behavior of the target object on a target platform, the second feature information represents product features of the products to be delivered, and the products to be delivered comprise virtual object resources; extracting cross features and associated features between the first feature information and the second feature information, and determining the target object's preference result for the products to be delivered by using the cross features and the associated features; determining a target product combination from the products to be delivered according to the preference result; and displaying a target display card matched with the target product combination to the target object. By mining the association between the user features and the features of the products to be delivered, the application finds the products to be delivered that the target object prefers, thereby addressing the technical problem that recommended products do not closely fit users' actual preferences.

Description

Product delivery method, device, equipment and computer readable medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a computer-readable medium for product delivery.
Background
With the rapid development of information technology and the Internet, and especially the rise of e-commerce and social networking websites in recent years, information resource overload has become a serious problem for both users and providers. Recommendation systems have become an effective means of solving this problem: they analyze user interests, or the connections among items, according to user information, so as to provide personalized recommendation services to users.
At present, in the related art, a recommendation system displays and delivers all products in full, so that every user sees the same page after entering the system; the system then delivers certain products to users by guiding them to perform certain actions. Product recommendation is also performed with a collaborative filtering recommendation algorithm, or an improvement of it, but collaborative filtering suffers from problems such as feature sparsity and cold start. In the related art described above, the fit between the recommended products and the user is not high.
No effective solution has yet been proposed for the problem that recommended products do not closely fit users' actual preferences.
Disclosure of Invention
The application provides a product delivery method, apparatus, device, and computer-readable medium, which are used to solve the technical problem that recommended products do not closely fit users' actual preferences.
According to an aspect of the embodiments of the present application, a product delivery method is provided, including: acquiring first feature information of a target object and second feature information of products to be delivered, wherein the first feature information represents the historical behavior of the target object on a target platform, the second feature information represents product features of the products to be delivered, and the products to be delivered comprise virtual object resources used for exchanging virtual resources; extracting cross features and associated features between the first feature information and the second feature information, and determining the target object's preference result for the products to be delivered by using the cross features and the associated features; determining a target product combination from the products to be delivered according to the preference result; and displaying a target display card matched with the target product combination to the target object.
According to another aspect of the embodiments of the present application, a product delivery apparatus is provided, including: an information acquisition module, configured to acquire first feature information of a target object and second feature information of products to be delivered, wherein the first feature information represents the historical behavior of the target object on a target platform, the second feature information represents product features of the products to be delivered, and the products to be delivered comprise virtual object resources used for virtual resource exchange; a preference matching module, configured to extract cross features and associated features between the first feature information and the second feature information, and determine the target object's preference result for the products to be delivered by using the cross features and the associated features; a combination determining module, configured to determine a target product combination from the products to be delivered according to the preference result; and a delivery display module, configured to display a target display card matched with the target product combination to the target object.
According to another aspect of the embodiments of the present application, there is provided an electronic device, including a memory, a processor, a communication interface, and a communication bus, where the memory stores a computer program executable on the processor, and the memory and the processor communicate with each other through the communication bus and the communication interface, and the processor implements the method when executing the computer program.
According to another aspect of embodiments of the present application, there is also provided a computer readable medium having non-volatile program code executable by a processor, the program code causing the processor to perform the above-mentioned method.
Compared with the related art, the technical scheme provided by the embodiment of the application has the following advantages:
in the technical solution, first feature information of a target object and second feature information of products to be delivered are acquired, the first feature information representing the historical behavior of the target object on a target platform, the second feature information representing product features of the products to be delivered, and the products to be delivered being virtual object resources; cross features and associated features between the first feature information and the second feature information are extracted, and the target object's preference result for the products to be delivered is determined by using the cross features and the associated features; a target product combination is determined from the products to be delivered according to the preference result; and a target display card matched with the target product combination is displayed to the target object. By mining the association between the user features and the features of the products to be delivered, the application finds the products to be delivered that the target object prefers, thereby addressing the technical problem that recommended products do not closely fit users' actual preferences.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the technical solutions in the embodiments of the present application or in the related art, the drawings needed in the description of the embodiments or the related art are briefly introduced below; obviously, those skilled in the art can also obtain other drawings from these drawings without any creative effort.
Fig. 1 is a schematic diagram of a hardware environment of an alternative product delivery method according to an embodiment of the present application;
FIG. 2 is a flow chart of an alternative product delivery method according to an embodiment of the present application;
FIG. 3 is a flow chart illustrating an alternative user preference mining process provided in accordance with an embodiment of the present application;
FIG. 4 is a schematic diagram of an alternative dual tower feature network provided in accordance with an embodiment of the present application;
FIG. 5 is a schematic diagram of an alternative product recommendation provided in accordance with an embodiment of the present application;
FIG. 6 is a block diagram of an alternative product delivery apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an alternative electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for convenience of description of the present application and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
First, some of the nouns or terms appearing in the description of the embodiments of the present application are explained below:
a neural network: the neural network may be composed of neural units, which may be referred to as xsAnd an arithmetic unit with intercept b as input, the output of the arithmetic unit may be:
Figure BDA0002856145870000041
wherein s is 1, 2, …… n, n is a natural number greater than 1, WsIs xsB is the bias of the neural unit. f is an activation function (activation functions) of the neural unit for introducing a nonlinear characteristic into the neural network to convert an input signal in the neural unit into an output signal. The output signal of the activation function may be used as an input to the next convolutional layer. The activation function may be a sigmoid function. A neural network is a network formed by a number of the above-mentioned single neural units joined together, i.e. the output of one neural unit may be the input of another neural unit. The input of each neural unit can be connected with the local receiving domain of the previous layer to extract the characteristics of the local receiving domain, and the local receiving domain can be a region composed of a plurality of neural units.
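As an illustration, the neural-unit formula above can be sketched in a few lines. This is a minimal sketch with made-up weights; the function name and the sigmoid choice are our own, not taken from the patent.

```python
import math

def neural_unit(xs, ws, b):
    """Single neural unit: f(sum_s W_s * x_s + b), with f a sigmoid."""
    z = sum(w * x for w, x in zip(ws, xs)) + b
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation f

# Two inputs whose weighted sum cancels: z = 0.5*1.0 + (-0.25)*2.0 + 0 = 0
out = neural_unit([1.0, 2.0], [0.5, -0.25], 0.0)  # sigmoid(0) = 0.5
```

The output of one such unit could then be fed as an input to another, which is exactly how the units are joined into a network.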
Deep neural network: deep Neural Networks (DNNs), also known as multi-layer neural networks, can be understood as neural networks having many hidden layers, where "many" has no particular metric. From the division of DNNs by the location of different layers, neural networks inside DNNs can be divided into three categories: input layer, hidden layer, output layer. Generally, the first layer is an input layer, the last layer is an output layer, and the middle layers are hidden layers. For example, a fully-connected neural network is fully connected between layers, that is, any neuron at the i-th layer must be connected with any neuron at the i + 1-th layer. Although DNN appears complex, it is not really complex in terms of the work of each layer, simply the following linear relational expression:
Figure BDA0002856145870000051
wherein the content of the first and second substances,
Figure BDA0002856145870000052
is the input vector of the input vector,
Figure BDA0002856145870000053
is the output vector of the output vector,
Figure BDA0002856145870000054
is an offset vectorW is a weight matrix (also called coefficient), and α () is an activation function. Each layer is only for the input vector
Figure BDA0002856145870000052
Obtaining the output vector through such simple operation
Figure BDA0002856145870000056
Due to the large number of DNN layers, the coefficient W and the offset vector
Figure BDA0002856145870000057
The number of the same is large. The definition of these parameters in DNN is as follows: taking coefficient W as an example: assume that in a three-layer DNN, the linear coefficients of the 4 th neuron of the second layer to the 2 nd neuron of the third layer are defined as
Figure BDA0002856145870000058
The superscript 3 represents the number of layers in which the coefficient W is located, while the subscripts correspond to the third layer index 2 of the output and the second layer index 4 of the input. The summary is that: the coefficients of the kth neuron of the L-1 th layer to the jth neuron of the L-1 th layer are defined as
Figure BDA0002856145870000059
Note that the input layer is without the W parameter. In deep neural networks, more hidden layers make the network more able to depict complex situations in the real world. Theoretically, the more parameters the higher the model complexity, the larger the "capacity", which means that it can accomplish more complex learning tasks. The final goal of the process of training the deep neural network, i.e., learning the weight matrix, is to obtain the weight matrix (the weight matrix formed by the vectors W of many layers) of all the layers of the deep neural network that is trained.
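A minimal sketch of the layer-by-layer relation $\vec{y} = \alpha(W\vec{x} + \vec{b})$ described above; the shapes, names, and ReLU activation here are illustrative assumptions, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def dnn_forward(x, weights, biases):
    """Apply y = alpha(W x + b) layer by layer.
    W[j, k] links neuron k of the input layer to neuron j of the output
    layer, matching the W^L_jk index convention described above."""
    for W, b in zip(weights, biases):
        x = relu(W @ x + b)
    return x

# Toy three-layer DNN: 4 inputs -> 5 hidden -> 2 outputs.
# The input layer itself carries no W parameter, so only two matrices exist.
Ws = [rng.standard_normal((5, 4)), rng.standard_normal((2, 5))]
bs = [np.zeros(5), np.zeros(2)]
y = dnn_forward(rng.standard_normal(4), Ws, bs)
```

Note how the weight-matrix shape (output dimension first) mirrors the subscript order of $W^L_{jk}$: output index $j$ before input index $k$.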
Cross network: a Cross Network (Cross Network) is used to explicitly and efficiently learn Cross features, which is composed of a plurality of Cross layers (Cross Layer), each Layer being represented by the following formula:
Figure BDA00028561458700000510
wherein the content of the first and second substances,
Figure BDA0002856145870000061
the output characteristics of the l-th layer and the l + 1-th layer respectively,
Figure BDA0002856145870000062
is the weight and offset of the l-th layer. The particular structure of Cross networks allows the order (degree) of features to grow as the depth of the Network increases. For example, a Cross Network with l layers, whose high polymeric purity degree is l + 1. Since the weight w and the offset b of each cross layer are vectors, assuming that the dimensions are b, the number of layers is LcThe total parameter of Cross Network is d Lc*2。
In the related art, the recommendation system displays and delivers all products in full, so that every user sees the same page after entering the system, and the system then delivers certain products to users by guiding them to perform certain actions. Meanwhile, the complex interaction and information output cause "information overload" for the user, and the user experience is poor. Users who have no need for the products randomly issued by the system will not use them, wasting various company resources, such as the manual effort of verification and redemption accounting after the products are issued. The user perceives no difference in activities compared with other platforms, the products are insufficiently attractive, and the user has no motivation to share or to return for more.
Product recommendation is also performed by a collaborative filtering recommendation algorithm, or an improvement of it, but the collaborative filtering algorithm suffers from problems such as feature sparsity and cold start.
To address the problems noted in the background, according to an aspect of embodiments of the present application, embodiments of a product delivery method are provided.
Optionally, in this embodiment of the present application, the product delivery method may be applied to a hardware environment formed by the terminal 101 and the server 103 shown in fig. 1. The terminal may be a user terminal: when a user accesses the software platform through the terminal, the recommendation system displays the target products delivered for that target user in the terminal's display interface. The server is the data server of the software platform; it mines associations between users and products by retrieving user data and product data, so as to recommend to the target user the target products the user is more likely to like and be interested in.
As shown in fig. 1, a server 103 is connected to a terminal 101 through a network, which may be used to provide services for the terminal or a client installed on the terminal, and a database 105 may be provided on the server or separately from the server, and is used to provide data storage services for the server 103, and the network includes but is not limited to: wide area network, metropolitan area network, or local area network, and the terminal 101 includes but is not limited to a PC, a cell phone, a tablet computer, and the like.
A product delivery method in this embodiment of the present application may be executed by the server 103, or may be executed by both the server 103 and the terminal 101, as shown in fig. 2, where the method may include the following steps:
step S202, first characteristic information of the target object and second characteristic information of the product to be released are obtained, the first characteristic information is used for representing the historical behavior of the target object on the target platform, the second characteristic information is used for representing the product characteristic of the product to be released, and the product to be released comprises virtual object resources used for virtual resource exchange.
In this embodiment of the present application, the target object is a user who accesses a target software platform through a terminal device; preferably, the target software platform is an application program, applet, website, or the like of an insurance service.
The first feature information of the target object includes basic information of the user, such as name, age, usual address, gender, and the home location of the current mobile phone number; this basic information is filled in when the user registers on the insurance-service software platform before first use. The first feature information may further include the user's behavioral-habit preference information, such as a fondness for smoking or drinking, or staying up late, which may be collected through a user behavior-habit survey questionnaire. The first feature information may further include record information generated by the user's specific operations on the insurance-service software platform, such as browsing, clicking, and following, for example browsing a promotional operation activity, a platform points-exchange activity, a product introduction page, or the personal center.
The product to be released can be an insurance product, and can also be a virtual object resource such as a platform activity prize and the like for exchanging virtual resources, such as a virtual resource representing a cash red envelope, a gift certificate, a gift, a VIP card and the like. The second characteristic information of the product to be released comprises the type, price, label, bonus limit, the number of clicks in the last week, the click rate in the last week and the like of the product.
And step S204, extracting cross features and associated features between the first feature information and the second feature information, and determining a preference result of the target object to the product to be released by using the cross features and the associated features.
In this embodiment of the application, starting from the two aspects of user features and product features, the correlation between the user features and the product features is captured to obtain the cross features, and the deeper nonlinear correlation is captured to obtain the associated features, so as to find the products the user prefers with greater probability and deliver them.
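As a hedged sketch of how cross features and deeper associated features might be combined into a preference probability: this is a generic cross-plus-deep illustration under our own assumptions about names and shapes, not the patent's exact dual-tower model from FIG. 4.

```python
import numpy as np

rng = np.random.default_rng(2)

def preference_score(user_vec, item_vec, cross_w, cross_b, deep_W, deep_b, out_w):
    """Concatenate user/item feature vectors, compute one explicit cross
    layer (cross features) and one hidden layer (deeper nonlinear,
    'associated' features), then combine both into a sigmoid probability."""
    x0 = np.concatenate([user_vec, item_vec])
    x_cross = x0 * (x0 @ cross_w) + cross_b + x0        # cross features
    x_deep = np.maximum(deep_W @ x0 + deep_b, 0.0)      # associated features
    logit = np.concatenate([x_cross, x_deep]) @ out_w
    return 1.0 / (1.0 + np.exp(-logit))                 # preference result

d_u, d_i, d_h = 4, 4, 6  # illustrative user, item, and hidden dimensions
p = preference_score(
    rng.standard_normal(d_u), rng.standard_normal(d_i),
    rng.standard_normal(d_u + d_i), np.zeros(d_u + d_i),
    rng.standard_normal((d_h, d_u + d_i)), np.zeros(d_h),
    rng.standard_normal(d_u + d_i + d_h),
)
```

Products whose score exceeds a chosen threshold would then be candidates for the target product combination.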
Optionally, as shown in fig. 3, the step S204 of extracting a cross feature and an associated feature between the first feature information and the second feature information, and determining the preference result of the target object for the product to be delivered by using the cross feature and the associated feature may specifically include the following steps:
step 302, converting the first feature information into a first feature vector, and converting the second feature information into a second feature vector, where the first feature information includes at least one of feature information of a numerical class and feature information of an attribute class of the target object, and the second feature information includes at least one of feature information of a numerical class and feature information of an attribute class of a product to be delivered.
In this embodiment of the application, the first feature information may be converted into the first feature vector, and the second feature information into the second feature vector, in an embedding manner. The first feature information includes the user's basic features, such as gender, age, province of residence, and home location of the current number, as well as behavior preference features, such as the user's product browsing history in the last week and product purchase history on the target platform in the last week. Both the user features and the product features include feature information of the numerical class and of the attribute class. Among the user and product features, one feature may contain multiple values; for example, the price of a product differs at different times, and a person's age also increases over time. Vector conversion therefore cannot be performed directly with embedding_lookup, which only accepts one-hot encodings containing a single value. Hence, the first feature information and the second feature information need to be discretized first into multi-hot encodings (multi-hot), and then embedded into the first feature vector and the second feature vector through the embedding layer.
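The multi-hot-then-embed step described above can be illustrated as follows. This is a sketch; representing the embedding lookup as a matrix product is our own simplification of what an embedding layer does.

```python
import numpy as np

def multi_hot(active_indices, size):
    """Multi-hot 0/1 vector: several positions may be 1 simultaneously
    (unlike one-hot, which carries exactly a single 1)."""
    v = np.zeros(size)
    v[list(active_indices)] = 1.0
    return v

# Embedding as a matrix product: a multi-hot vector of vocabulary size V
# times a V x d embedding table sums the embeddings of all active values.
# (A gather-style embedding_lookup only handles one index at a time.)
V, d = 8, 3
rng = np.random.default_rng(3)
table = rng.standard_normal((V, d))

mh = multi_hot([1, 4], V)   # a feature that took two values over time
vec = mh @ table            # equals table[1] + table[4]
```

The same construction serves both the first feature vector (user side) and the second feature vector (product side).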
Optionally, converting the first feature information into a first feature vector, and converting the second feature information into a second feature vector includes:
determining the data type of the first characteristic information and the second characteristic information;
under the condition that the data type is an attribute type, discretizing the first characteristic information to obtain a first multi-hot code; inputting the first multi-hot code into the embedding layer, and acquiring a first feature vector obtained by vectorizing the first multi-hot code by the embedding layer; discretizing the second characteristic information to obtain a second multi-hot code; inputting the second multi-hot code into the embedding layer, and acquiring a second feature vector obtained by vectorizing the second multi-hot code by the embedding layer;
under the condition that the data type is a numerical type: acquiring an initial feedback data set, the initial feedback data set being behavior data generated when users log in to the target platform for the first time; dividing the initial feedback data set into a plurality of sub data sets, and classifying the first feature information into the sub data sets; discretizing each sub data set to obtain a third multi-hot code; inputting the third multi-hot code into the embedding layer, and acquiring the first feature vector obtained by the embedding layer vectorizing the third multi-hot code, the vector length of the first feature vector matching the number of sub data sets; dividing the initial feedback data set into a plurality of sub data sets, and classifying the second feature information into the sub data sets; discretizing each sub data set to obtain a fourth multi-hot code; and inputting the fourth multi-hot code into the embedding layer, and acquiring the second feature vector obtained by the embedding layer vectorizing the fourth multi-hot code, the vector length of the second feature vector matching the number of sub data sets. For example, if the vector length is consistent with the number of sub data sets and the data is divided into 10 buckets, a vector of length 10 is used to record the feature.
In this embodiment of the application, the feature information may be discretized. Both the first feature information and the second feature information contain attribute-type feature information and numerical continuous features, and different types of feature information are discretized in different ways.
For attribute-type features, such as the user's gender and province of residence, or the product's classification label and type, discretization may be performed directly on the attributes: the first feature information is discretized to obtain the first multi-hot code, and the second feature information is discretized to obtain the second multi-hot code. The first multi-hot code is input into the embedding layer to obtain the first feature vector, and the second multi-hot code is input into the embedding layer to obtain the second feature vector.
For numerical continuous features, such as the user's age, the number of browses of each product, and the number of clicks a product received in the last week, the cold-start samples (i.e., the initial feedback data set) can be retrieved and divided into a plurality of sub data sets, i.e., buckets. The first feature information is classified into the sub data sets, evenly or randomly, and each sub data set is discretized to obtain the third multi-hot code. The initial feedback data set is likewise divided into a plurality of sub data sets, preferably using equal-frequency bucketing, and the second feature information is classified into the sub data sets, preferably evenly; each sub data set is then discretized to obtain the fourth multi-hot code. This strengthens the statistical significance of the features and avoids the sensitivity to extreme values caused by equal-width bucketing.
Equal-frequency bucketing means that each divided bucket (sub data set) contains the same or a similar number of samples. After the feature information is bucketed, the split points can be extracted and the feature discretized. For example, if feature a falls into bucket 7 of 10, the present application can record the feature with a vector of length 10 whose 7th element is set to 1, thereby characterizing feature a. The third multi-hot code is input into the embedding layer to obtain the first feature vector, and the fourth multi-hot code is input into the embedding layer to obtain the second feature vector.
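A small numpy sketch of equal-frequency bucketing followed by the 0/1 encoding described above (the function name and sample data are illustrative, not from the patent):

```python
import numpy as np

def equal_freq_bucket_onehot(sample, value, n_buckets=10):
    """Equal-frequency bucketing: split points are quantiles of the
    cold-start sample, so every bucket holds roughly the same number of
    samples; the value is then encoded as a 0/1 vector of length n_buckets."""
    # Interior quantile levels 1/n, 2/n, ..., (n-1)/n give n-1 split points.
    cuts = np.quantile(sample, np.linspace(0, 1, n_buckets + 1)[1:-1])
    bucket = int(np.searchsorted(cuts, value, side="right"))
    code = np.zeros(n_buckets, dtype=np.float32)
    code[bucket] = 1.0
    return code

ages = np.random.default_rng(0).integers(18, 90, size=1000)  # cold-start ages
code = equal_freq_bucket_onehot(ages, 50)                    # encoding for age 50
```

Because the split points are quantiles rather than equal-width intervals, a few very large or very small values cannot squeeze most samples into one bucket.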
In the embodiment of the present application, discretizing the feature information yields 0/1 vectors, namely the first multi-hot code, the second multi-hot code, the third multi-hot code, and the fourth multi-hot code; these multi-hot codes are input into the embedding layer to obtain the feature vectors.
Optionally, before obtaining the initial feedback data set, the method further comprises constructing the initial feedback data set as follows:
determining an initial user group;
dividing an initial user group into a plurality of putting groups;
releasing at least one product to each release group;
and acquiring feedback data of the users in each releasing group on the released product to obtain an initial feedback data set.
In the embodiment of the application, for a software platform offering insurance services for the first time, the system automatically takes the users who use the platform for the first time as the initial user group, divides the initial user group into a plurality of delivery groups, delivers at least one product to each delivery group, and then obtains the feedback data of the users in each delivery group on the delivered products to obtain the initial feedback data set, also called the cold-start sample. If the feedback data indicates that a conversion occurred, i.e. the product delivered first is consistent with the user's preference, the sample is labeled 1; if no conversion occurred, the product delivered first is inconsistent with the user's preference and the sample is labeled 0.
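The group split and feedback collection above can be sketched as follows (all names — build_initial_feedback, get_feedback — are hypothetical illustrations, not from the patent):

```python
import random

def build_initial_feedback(users, products, get_feedback, seed=0):
    """Split the initial user group into one delivery group per product,
    deliver that product to the group, and record 0/1 conversion labels."""
    rng = random.Random(seed)
    shuffled = list(users)
    rng.shuffle(shuffled)
    # Round-robin split: group i gets every len(products)-th shuffled user.
    groups = [shuffled[i::len(products)] for i in range(len(products))]
    dataset = []  # rows of (user, delivered product, conversion label)
    for product, group in zip(products, groups):
        for user in group:
            dataset.append((user, product, get_feedback(user, product)))
    return dataset
```

In practice get_feedback would come from logged user behavior (click, claim, purchase), with 1 standing for a conversion and 0 for none.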
Step S304, determining cross features and associated features between the first feature information and the second feature information by using the first feature vector and the second feature vector, wherein the cross features and the associated features are used for representing the association relationship between the target object and the product to be released.
In the embodiment of the application, the first feature vector and the second feature vector can be input into a double-tower feature network, so that the association relationship between the product to be delivered and the target object is mined by using the double-tower feature network.
In the embodiment of the present application, as shown in fig. 4, the dual-tower feature network includes a cross network and a deep network. The cross network borrows the idea of the FM model (Factorization Machine): automated feature crossing captures the correlation between user-side features and product-side features — for example, that users aged 50 prefer the micro-insurance experience ticket and similar information. The cross network turns each feature into a vector and automatically learns the correlation between every pair of vectors, such as the "age 50" feature and the "prefers micro-insurance experience ticket" feature above; the pairs are arbitrary two-vector pairs, and pairs with low correlation are automatically ignored. The deep network comprises a plurality of fully connected layers and captures the deeper nonlinear relationship between users and products, combining the input features nonlinearly in the form wx + b. The more fully connected layers the deep network has, the stronger its ability to capture nonlinearity.
Optionally, step S304 may further include the following steps:
inputting the first feature vector and the second feature vector into a cross network, and acquiring cross features output by the cross network, wherein a cross layer of the cross network is used for determining the cross features between the first feature vector and the second feature vector, and the incidence relation comprises a relation expressed by the cross features;
inputting the first feature vector and the second feature vector into a deep network, and acquiring associated features output by the deep network, wherein at least one layer of full-connection layer in the deep network is used for extracting the associated features between the first feature vector and the second feature vector, and the associated relationships comprise relationships represented by the associated features.
Optionally, inputting the first feature vector and the second feature vector into a cross network, and acquiring cross features output by the cross network includes:
passing the first feature vector and the second feature vector to an intersection layer;
extracting, in the cross layer, co-occurrence features that occur in pairs across the first feature vector and the second feature vector;
determining the weight of the co-occurrence feature according to the total proportion of the co-occurrence feature in the first feature vector and the second feature vector;
and determining the first-order characteristic and the second-order characteristic by utilizing the co-occurrence characteristic and the weight to obtain a cross characteristic, wherein the cross characteristic comprises the first-order characteristic and the second-order characteristic.
In the embodiment of the present application, the length of the multi-hot coding vector may be recorded as M, and by default a dimension may be set for the feature vectors; as a preference, the feature vectors may be set to 128 dimensions. As shown in fig. 4, the first feature vector is partially input into the cross network and partially into the deep network, and the same holds for the second feature vector, so after the corresponding vectors are taken out, the present application processes the vectors to be input into the cross network and the deep network separately. The vectors input into the cross network are left unprocessed and recorded as H_i, so the vector dimension at this point is M × 128. The vectors input into the deep network are summed into a single feature vector characterizing the current input sample, denoted X_i, calculated as:

X_i = Σ_j emb_j

where emb_j represents the fetched feature vectors, i.e. the first feature vector and the second feature vector.
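The split into the two towers can be sketched as follows, with small illustrative dimensions standing in for M and 128:

```python
import numpy as np

rng = np.random.default_rng(0)
M, D = 6, 8                    # M multi-hot fields, embedding dim D (128 in the text)
emb = rng.normal(size=(M, D))  # the fetched first and second feature vectors

H_i = emb                      # cross-network input: kept as-is, shape M x D
X_i = emb.sum(axis=0)          # deep-network input: X_i = sum over j of emb_j
```

Sum pooling collapses a variable number of field embeddings into one fixed-length vector, which is what a stack of fully connected layers requires as input.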
In the embodiment of the application, the feature cross calculation in the cross network is optimized with reference to the way the FM model captures cross features. After the feature vectors are input into the cross layer of the cross network, the cross network can automatically learn: (1) which features have co-occurring response features, i.e. features that match each other and occur in pairs; (2) how important these co-occurrence response features are, i.e. the magnitude of their weights. Meanwhile, the original FM algorithm is optimized: element-level feature crossing is no longer adopted, and vector-granularity feature crossing is adopted instead. With reference to the FM equation:

y = w_0 + Σ_{j=1..n} w_j·x_j + Σ_{j=1..n} Σ_{k=j+1..n} <v_j, v_k>·x_j·x_k

the embedding-layer input is adopted as the cross term in the application, namely h_j and h_k, which are elements (row vectors) of H_i; w is the first-order weight, and v_j and v_k are the hidden vectors. For the cross feature output by the cross network, the first-order feature is:

<w, H_i>

and the second-order feature is:

Σ_{j=1..M} Σ_{k=j+1..M} <v_j, v_k>·<h_j, h_k>
the operation has the advantages that the cross features among feature vectors can be captured, so that the statistical significance of feature cross is more obvious, and the noise is reduced; meanwhile, the hidden vector needing to be calculated is reduced from M to M from M128, so that the model complexity is greatly reduced, the time overhead of online prediction is reduced, and the online prediction efficiency is improved.
Optionally, inputting the first feature vector and the second feature vector into a deep network, and acquiring the associated features output by the deep network includes:
fitting the first and second feature vectors by target regularization to determine the associated features while avoiding overfitting.
In the embodiment of the application, the sparsity of the network can be controlled in the deep network; specifically, feature overfitting can be avoided through L1 regularization, so that the generalization ability of the model is stronger. The L1 regularization is calculated as:

l1-norm = λ·|Ω|

where λ represents the penalty coefficient, controlling the degree of penalty, and |Ω| represents the sum of the absolute values of all parameters in the model, characterizing model complexity. The vector X_i is input into the deep network, which captures the deeper nonlinear relationship between users and products and outputs the associated feature d_i.

If the deep network consists of L fully connected layers and the output of the j-th fully connected layer is h^(j), then the output of layer j+1 is:

h^(j+1) = σ(W_j·h^(j) + b_j)

where W_j is the weight term of the j-th layer, b_j is the bias term of the j-th layer, and σ is the activation function, which can be chosen as ReLU:

ReLU(x) = max(0, x)

The associated feature output by the deep network is the output of the last fully connected layer:

d_i = h^(L)
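A minimal numpy sketch of the deep-tower forward pass with ReLU and the L1 penalty λ·|Ω| (layer sizes and the deep_tower name are illustrative assumptions):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def deep_tower(x, weights, biases, lam=1e-4):
    """Forward pass through fully connected layers, h <- ReLU(W h + b),
    plus the L1 penalty lam * |Omega| summed over all parameters."""
    h = x
    for W, b in zip(weights, biases):
        h = relu(W @ h + b)
    l1 = lam * (sum(np.abs(W).sum() for W in weights)
                + sum(np.abs(b).sum() for b in biases))
    return h, l1

rng = np.random.default_rng(0)
x = rng.normal(size=128)                 # X_i: summed embedding vector
dims = [128, 64, 32]                     # two fully connected layers
weights = [rng.normal(scale=0.1, size=(dims[i + 1], dims[i])) for i in range(2)]
biases = [np.zeros(dims[i + 1]) for i in range(2)]
d_i, l1 = deep_tower(x, weights, biases)  # associated feature and L1 penalty
```

During training the L1 term would be added to the loss, pushing small weights toward zero and sparsifying the network as the text describes.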
and S306, splicing the cross features and the associated features into a target feature vector.
Optionally, the step S306 of stitching the cross feature and the associated feature into the target feature vector may further include:
passing the cross feature and the associated feature to a connection layer;
and acquiring a target feature vector obtained by splicing the cross feature and the associated feature by the connection layer.
In the embodiment of the application, the connection layer collects the output results of the dual-tower network, namely the cross feature output by the cross network and the associated feature d_i output by the deep network, where the cross feature further comprises the first-order feature <w, H_i> and the second-order feature. The first-order feature, the second-order feature, and the associated feature can therefore be spliced into the target feature vector K_i, specifically:

q_1,i = <w, H_i>

q_2,i = Σ_{j=1..M} Σ_{k=j+1..M} <v_j, v_k>·<h_j, h_k>

K_i = concat(q_1,i, q_2,i, d_i)

where concat denotes the join operation, q_1,i represents the first-order feature of sample i in the cross feature, and q_2,i represents the second-order feature of sample i in the cross feature.
And step S308, determining a preference result according to the target feature vector.
In the embodiment of the present application, the target feature vector K_i may be input into the loss layer, and the final output is calculated by the target loss function. The application modifies the loss function and directly adopts the softmax function to calculate the probability output p_i = P(X_i ∈ G_j), where G_j denotes the j-th class. Then p_i can be expressed as:

p_i = exp(w_j·K_i + b_j) / Σ_k exp(w_k·K_i + b_k)

where exp denotes the natural exponential, w_j is the weight of the last softmax layer, and b_j is the bias parameter of the last softmax layer.

When the loss function is calculated, the calculation of the softmax function can be referred to directly. If the final output label is y_i, the target loss function can be expressed as:

Loss = -(1/N)·Σ_{i=1..N} y_i·log(p_i)

where N denotes the number of samples, y_i represents the real classification result of the current sample, and log denotes the natural logarithm. The finally output label indicates the product to be delivered that is consistent with or similar to the user's preference habits.
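The softmax probability and the loss can be sketched in numpy as follows (shapes and the class count are illustrative):

```python
import numpy as np

def softmax(z):
    z = z - z.max()              # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(probs, labels):
    """Mean negative log-likelihood of the true class over N samples."""
    return float(-np.mean(np.log(probs[np.arange(len(labels)), labels])))

rng = np.random.default_rng(0)
K = rng.normal(size=(4, 6))      # four samples' target feature vectors K_i
W = rng.normal(size=(6, 3))      # last-layer softmax weights, 3 product classes
b = np.zeros(3)                  # last-layer softmax biases

probs = np.vstack([softmax(k @ W + b) for k in K])
loss = cross_entropy(probs, np.array([0, 2, 1, 0]))
```

Each row of probs sums to 1, and the class with the largest probability corresponds to the product predicted to match the user's preference.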
In the embodiment of the application, by analyzing user features, feature labels can be attached to users, narrowing the target range. This makes it convenient to formulate operation activity strategies and deliver products precisely, further stimulating users to reach the expected business goals and improving business attainment efficiency.
And step S206, determining a target product combination from the products to be released according to the preference result.
Optionally, the preference result includes the preference degree of the target object for each product to be delivered (i.e. the probability output p_i calculated by the softmax function mentioned above), and determining a target product combination from the products to be delivered according to the preference result includes:
sequencing the plurality of preference degrees according to a target sequence to obtain a preference degree sequence;
selecting a target number of preference degrees from the preference degree sequence according to a sorting sequence;
and taking the products to be launched corresponding to each preference degree as target products, and sequencing and combining the target products with the target number according to the corresponding arrangement sequence of the preference degrees to obtain a target product combination.
In the embodiment of the application, the preference results of the target object for the products to be delivered are obtained through the calculation in the above steps, that is, the preference degrees and the corresponding labels. The preference degrees can therefore be sorted in a certain order, such as from large to small or from small to large, and the products to be delivered corresponding to the top 3 preference degrees are selected as target products and combined according to the order of their preference degrees to obtain the target product combination.
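A minimal sketch of selecting the top-3 products by preference degree (product names and scores are illustrative):

```python
def target_combination(preferences, k=3):
    """Sort (product, preference) pairs by preference, descending, and keep
    the top k products in that order as the target product combination."""
    ranked = sorted(preferences.items(), key=lambda kv: kv[1], reverse=True)
    return [product for product, _ in ranked[:k]]

prefs = {"gift A": 0.62, "gift B": 0.21, "gift C": 0.09, "gift D": 0.08}
combo = target_combination(prefs)   # ["gift A", "gift B", "gift C"]
```

The resulting order of the combination is itself meaningful, since the display cards are later arranged by preference degree.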
And step S208, displaying the target display card matched with the target product combination to the target object.
In the embodiment of the application, the target products corresponding to the target product combination are determined, and the data packet corresponding to each target product is obtained. The data packet comprises a display style and a corresponding virtual resource, where a virtual resource refers to a resource that, after being triggered (clicked, touched, slid, etc.) by the user through a front-end page, can be claimed and used for deduction at consumption. The data packets corresponding to the target products are combined, and the interaction displayed at the front end is simplified into a card for display; further, the card content information (the target product combination) can comprise a prize picture, a prize introduction text, and a corresponding jump link or virtual function button. The user can claim the prize directly after clicking, or claim it on the jump landing page.
As shown in fig. 5, the main content W1 of the display page is the same for all users, while the target product combination in the recommendation area W2 varies with the delivery strategy matched to the user. If the preference result calculated for the first user indicates that gifts A, B and C are the top 3 by preference degree from large to small, delivery strategy 1 is adopted for the first user, display combination 1 is determined, and the corresponding interface is displayed to the first user. Similarly, if the second user matches delivery strategy 2, display combination 2 is delivered to the second user, and if the third user matches delivery strategy 3, display combination 3 is delivered to the third user. The target product combination conforms to the user's preference, which increases the precision of product delivery and further improves the conversion rate.
According to still another aspect of an embodiment of the present application, as shown in fig. 6, there is provided a product dispensing device including:
the information acquisition module 601 is configured to acquire first characteristic information of a target object and second characteristic information of a product to be released, where the first characteristic information is used to represent a historical behavior of the target object on a target platform, the second characteristic information is used to represent a product characteristic of the product to be released, and the product to be released is a virtual object resource;
the preference matching module 603 is configured to extract cross features and associated features between the first feature information and the second feature information, and determine a preference result of the target object for the product to be delivered by using the cross features and the associated features;
a combination determining module 605, configured to determine a target product combination from the products to be released according to the preference result;
and the release display module 607 is used for displaying the target display card matched with the target product combination to the target object.
It should be noted that the information obtaining module 601 in this embodiment may be configured to execute step S202 in this embodiment, the preference matching module 603 in this embodiment may be configured to execute step S204 in this embodiment, the combination determining module 605 in this embodiment may be configured to execute step S206 in this embodiment, and the placement displaying module 607 in this embodiment may be configured to execute step S208 in this embodiment.
It should be noted here that the modules described above are the same as the examples and application scenarios implemented by the corresponding steps, but are not limited to the disclosure of the above embodiments. It should be noted that the modules described above as a part of the apparatus may operate in a hardware environment as shown in fig. 1, and may be implemented by software or hardware.
Optionally, the preference matching module comprises:
the embedding unit is used for converting the first characteristic information into a first characteristic vector and converting the second characteristic information into a second characteristic vector, wherein the first characteristic information comprises at least one of characteristic information of a numerical class and characteristic information of an attribute class of the target object, and the second characteristic information comprises at least one of characteristic information of the numerical class and characteristic information of the attribute class of a product to be released;
the characteristic extraction unit is used for determining cross characteristics and associated characteristics between the first characteristic information and the second characteristic information by utilizing the first characteristic vector and the second characteristic vector, and the cross characteristics and the associated characteristics are used for expressing the association relationship between the target object and the product to be released;
the vector splicing unit is used for splicing the cross features and the associated features into target feature vectors;
and the preference result determining unit is used for determining a preference result according to the target feature vector.
Optionally, the embedding unit comprises:
the data type determining subunit is used for determining the data types of the first characteristic information and the second characteristic information;
the first embedding subunit is used for discretizing the first characteristic information to obtain a first multi-hot code under the condition that the data type is an attribute type; inputting the first multi-hot code into the embedding layer, and acquiring a first feature vector obtained by vectorizing the first multi-hot code by the embedding layer; discretizing the second characteristic information to obtain a second multi-hot code; inputting the second multi-hot code into the embedding layer, and acquiring a second feature vector obtained by vectorizing the second multi-hot code by the embedding layer;
the second embedding subunit is used for acquiring an initial feedback data set under the condition that the data type is a numerical type, wherein the initial feedback data set is behavior data generated when a user logs in a target platform for the first time; dividing the initial feedback data set into a plurality of sub data sets, and classifying the first characteristic information into each sub data set; discretizing each subdata set to obtain a third multi-hot code; inputting the third multi-hot code into the embedding layer, and acquiring a first characteristic vector obtained by vectorizing the third multi-hot code by the embedding layer, wherein the vector length of the first characteristic vector is matched with the number of the sub data sets; dividing the initial feedback data set into a plurality of sub data sets, and classifying the second characteristic information into each sub data set; discretizing each subdata set to obtain a fourth multi-hot code; and inputting the fourth multi-hot code into the embedding layer, and acquiring a second characteristic vector obtained by vectorizing the fourth multi-hot code by the embedding layer, wherein the vector length of the second characteristic vector is matched with the number of the sub-data sets.
Optionally, the feature extraction unit includes:
the cross network subunit is used for inputting the first feature vector and the second feature vector into a cross network and acquiring cross features output by the cross network, a cross layer of the cross network is used for determining the cross features between the first feature vector and the second feature vector, and the association relationship comprises a relationship expressed by the cross features;
and the deep network subunit is used for inputting the first feature vector and the second feature vector into a deep network and acquiring the association features output by the deep network, at least one layer of full connection layer in the deep network is used for extracting the association features between the first feature vector and the second feature vector, and the association relationship comprises the relationship represented by the association features.
Optionally, the vector stitching unit further includes:
an output transfer subunit for transferring the cross feature and the associated feature to the connection layer;
and the splicing vector obtaining subunit is used for obtaining a target feature vector obtained by splicing the cross feature and the associated feature by the connecting layer.
Optionally, the combination determination module comprises:
the preference degree sequencing unit is used for sequencing the preference degrees according to a target sequence to obtain a preference degree sequence;
the preference degree selecting unit is used for selecting a target number of preference degrees from the preference degree sequence according to the sorting sequence;
and the combination determining unit is used for taking the products to be released corresponding to the preference degrees as target products, and sequencing and combining the target products with the target quantity according to the arrangement sequence of the corresponding preference degrees to obtain the target product combination.
According to another aspect of the embodiments of the present application, an electronic device is provided, as shown in fig. 7, and includes a memory 701, a processor 703, a communication interface 705, and a communication bus 707, where the memory 701 stores a computer program that is executable on the processor 703, the memory 701 and the processor 703 communicate with each other through the communication interface 705 and the communication bus 707, and the processor 703 implements the steps of the method when executing the computer program.
The memory and the processor in the electronic equipment are communicated with the communication interface through a communication bus. The communication bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc.
The Memory may include a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; the Integrated Circuit may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, or a discrete hardware component.
There is also provided, in accordance with yet another aspect of an embodiment of the present application, a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the steps of any of the embodiments described above.
Optionally, in an embodiment of the present application, the computer program product or the computer program is a program code for a processor to execute the following steps:
acquiring first characteristic information of a target object and second characteristic information of a product to be released, wherein the first characteristic information is used for representing the historical behavior of the target object on a target platform, and the second characteristic information is used for representing the product characteristic of the product to be released;
determining a preference result of the target object to the product to be launched by utilizing the first characteristic information and the second characteristic information;
and determining a target product to be released to the target object from the products to be released according to the preference result.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments, and this embodiment is not described herein again.
When the embodiments of the present application are specifically implemented, reference may be made to the above embodiments, and corresponding technical effects are achieved.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units performing the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk. It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is merely exemplary of the present application and is presented to enable those skilled in the art to understand and practice the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method of product delivery, comprising:
acquiring first feature information of a target object and second feature information of a product to be released, wherein the first feature information is used for representing historical behavior of the target object on a target platform, the second feature information is used for representing product features of the product to be released, and the product to be released comprises virtual object resources used for exchanging virtual resources;
extracting cross features and associated features between the first feature information and the second feature information, and determining a preference result of the target object to the product to be released by using the cross features and the associated features;
determining a target product combination from the products to be released according to the preference result;
and displaying the target display card matched with the target product combination to the target object.
2. The method of claim 1, wherein extracting cross features and associated features between the first feature information and the second feature information, and determining the preference result of the target object for the product to be released by using the cross features and the associated features comprises:
converting the first feature information into a first feature vector, and converting the second feature information into a second feature vector, wherein the first feature information includes at least one of feature information of a numerical class and feature information of an attribute class of the target object, and the second feature information includes at least one of feature information of a numerical class and feature information of an attribute class of the product to be released;
determining the cross feature and the associated feature between the first feature information and the second feature information by using the first feature vector and the second feature vector, wherein the cross feature and the associated feature are used for representing an association relationship between the target object and the product to be released;
splicing the cross features and the associated features into target feature vectors;
and determining the preference result according to the target feature vector.
3. The method of claim 2, wherein converting the first feature information into a first feature vector and converting the second feature information into a second feature vector comprises:
determining a data type of the first feature information and the second feature information;
when the data type is the attribute class, discretizing the first feature information to obtain a first multi-hot code; inputting the first multi-hot code into an embedding layer, and acquiring the first feature vector obtained by the embedding layer vectorizing the first multi-hot code; discretizing the second feature information to obtain a second multi-hot code; and inputting the second multi-hot code into the embedding layer, and acquiring the second feature vector obtained by the embedding layer vectorizing the second multi-hot code;
when the data type is the numerical class, acquiring an initial feedback data set, wherein the initial feedback data set is behavior data generated when a user logs in to the target platform for the first time; dividing the initial feedback data set into a plurality of sub data sets, and classifying the first feature information into the sub data sets; discretizing each sub data set to obtain a third multi-hot code; inputting the third multi-hot code into the embedding layer, and obtaining the first feature vector obtained by the embedding layer vectorizing the third multi-hot code, wherein the vector length of the first feature vector matches the number of the sub data sets; dividing the initial feedback data set into a plurality of sub data sets, and evenly classifying the second feature information into the sub data sets; discretizing each sub data set to obtain a fourth multi-hot code; and inputting the fourth multi-hot code into the embedding layer, and obtaining the second feature vector obtained by the embedding layer vectorizing the fourth multi-hot code, wherein the vector length of the second feature vector matches the number of the sub data sets.
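As a concrete illustration of the attribute-class branch in claim 3, the sketch below discretizes a set of attribute values into a multi-hot code and maps it through an embedding (lookup) layer to a dense feature vector. The vocabulary, embedding size, and attribute values are the editor's hypothetical examples, not values from the patent.

```python
import numpy as np

# Hypothetical attribute vocabulary; a real system would build this from data.
VOCAB = ["sports", "finance", "travel", "health"]
EMBED_DIM = 4

def multi_hot(values):
    """Discretize a set of attribute values into a 0/1 multi-hot code."""
    code = np.zeros(len(VOCAB), dtype=np.float32)
    for v in values:
        code[VOCAB.index(v)] = 1.0
    return code

rng = np.random.default_rng(0)
# Embedding layer: one (learnable) row per vocabulary entry.
embedding_table = rng.normal(size=(len(VOCAB), EMBED_DIM)).astype(np.float32)

def embed(code):
    """Vectorize a multi-hot code by summing the embedding rows it selects."""
    return code @ embedding_table  # shape: (EMBED_DIM,)

first_feature_vector = embed(multi_hot(["sports", "travel"]))
print(first_feature_vector.shape)  # (4,)
```

The same encoding path applies to the product-side features; for numerical features, the claim instead buckets values into sub data sets before the multi-hot step.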
4. The method of claim 3, wherein determining the cross feature and the associated feature between the first feature information and the second feature information using the first feature vector and the second feature vector comprises:
inputting the first feature vector and the second feature vector into a cross network, and acquiring the cross features output by the cross network, wherein a cross layer of the cross network is used for determining the cross features between the first feature vector and the second feature vector, and the association relationship comprises a relationship represented by the cross features;
inputting the first feature vector and the second feature vector into a deep network, and acquiring the associated features output by the deep network, wherein at least one fully-connected layer in the deep network is used for extracting the associated features between the first feature vector and the second feature vector, and the association relationship comprises a relationship represented by the associated features.
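The two-tower structure in claim 4 resembles a Deep & Cross style model. The following is a minimal numerical sketch assuming the commonly used cross-layer form x_{l+1} = x0 · (x_l·w) + b + x_l and a ReLU fully-connected deep tower; dimensions, layer counts, and random weights are the editor's illustration, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8                       # length of the concatenated input vector
x0 = rng.normal(size=d)     # first feature vector spliced with second feature vector

def cross_layer(x0, xl, w, b):
    # Explicit feature crossing: x_{l+1} = x0 * (xl . w) + b + xl
    return x0 * (xl @ w) + b + xl

def fc_layer(x, W, b):
    # Fully-connected layer with ReLU, as used in the deep tower
    return np.maximum(0.0, x @ W + b)

xl = x0
for _ in range(2):          # cross network: two cross layers
    xl = cross_layer(x0, xl, rng.normal(size=d), rng.normal(size=d))
cross_features = xl

h = x0
for _ in range(2):          # deep network: two fully-connected layers
    h = fc_layer(h, rng.normal(size=(d, d)), rng.normal(size=d))
associated_features = h

# Splice the two tower outputs into a target feature vector, then score it
# with a logistic output to obtain a preference degree in (0, 1).
target_feature_vector = np.concatenate([cross_features, associated_features])
preference = 1.0 / (1.0 + np.exp(-(target_feature_vector @ rng.normal(size=2 * d))))
print(target_feature_vector.shape)  # (16,)
```

In training, the weights here would be learned jointly with the embedding tables rather than drawn at random.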
5. The method of claim 4, wherein inputting the first feature vector and the second feature vector into a cross network, and acquiring the cross features output by the cross network comprises:
passing the first feature vector and the second feature vector to the cross layer;
extracting, in the cross layer, co-occurrence features that occur in pairs in the first feature vector and the second feature vector;
determining the weight of the co-occurrence feature according to the total proportion of the co-occurrence feature in the first feature vector and the second feature vector;
determining a first-order feature and a second-order feature by using the co-occurrence feature and the weight to obtain the cross feature, wherein the cross feature comprises the first-order feature and the second-order feature.
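One plausible reading of claim 5, sketched below: features active in both vectors are treated as co-occurring, weighted by their share of the two vectors' total magnitude, and then used to form first-order (linear) and second-order (pairwise product) terms. This interpretation, and every name in the code, is the editor's illustration rather than the patent's specification.

```python
import numpy as np

def cross_feature_terms(v1, v2):
    """Weight co-occurring features and form first- and second-order terms."""
    v1, v2 = np.asarray(v1, dtype=float), np.asarray(v2, dtype=float)
    co = (v1 != 0) & (v2 != 0)                   # features occurring in pairs
    total = np.abs(v1).sum() + np.abs(v2).sum()  # total magnitude of both vectors
    weight = (np.abs(v1) + np.abs(v2)) / total   # proportion-based weights
    first_order = weight * (v1 + v2) * co        # weighted linear terms
    second_order = weight * (v1 * v2) * co       # weighted pairwise products
    return first_order, second_order

fo, so = cross_feature_terms([1.0, 0.0, 2.0], [0.5, 3.0, 1.0])
print(fo, so)  # [0.3 0.  1.2] [0.1 0.  0.8]
```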
6. The method of claim 2, wherein stitching the cross features and the associated features into a target feature vector comprises:
passing the cross feature and the associated feature to a connection layer;
and acquiring the target feature vector obtained by splicing the cross feature and the associated feature by the connecting layer.
7. The method according to any one of claims 1 to 6, wherein the preference result comprises a preference degree of the target object for each product to be released, and determining a target product combination from the products to be released according to the preference result comprises:
sorting the plurality of preference degrees in a target order to obtain a preference degree sequence;
selecting a target number of preference degrees from the preference degree sequence in sorted order;
and taking the products to be released corresponding to the selected preference degrees as target products, and ordering and combining the target number of target products according to the arrangement order of their corresponding preference degrees to obtain the target product combination.
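The selection in claim 7 amounts to a plain top-k ranking of preference degrees. A short sketch follows; the product names and scores are invented for illustration.

```python
def target_product_combination(preference_degrees, k):
    """Sort preference degrees in descending (target) order, keep the top k,
    and return the corresponding products in that order."""
    ranked = sorted(preference_degrees.items(), key=lambda kv: kv[1], reverse=True)
    return [product for product, _ in ranked[:k]]

prefs = {"plan_a": 0.91, "plan_b": 0.35, "plan_c": 0.77}
print(target_product_combination(prefs, 2))  # ['plan_a', 'plan_c']
```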
8. A product delivery device, comprising:
the system comprises an information acquisition module, a storage module and a processing module, wherein the information acquisition module is used for acquiring first characteristic information of a target object and second characteristic information of a product to be released, the first characteristic information is used for representing the historical behavior of the target object on a target platform, the second characteristic information is used for representing the product characteristic of the product to be released, and the product to be released comprises virtual object resources used for exchanging virtual resources;
the preference matching module is used for extracting cross features and associated features between the first feature information and the second feature information and determining a preference result of the target object on the product to be released by utilizing the cross features and the associated features;
the combination determining module is used for determining a target product combination from the products to be released according to the preference result;
and the releasing and displaying module is used for displaying the target display card matched with the target product combination to the target object.
9. An electronic device comprising a memory, a processor, a communication interface and a communication bus, wherein the memory stores a computer program operable on the processor, and the memory and the processor communicate with the communication interface via the communication bus, wherein the processor implements the method of any of claims 1 to 7 when executing the computer program.
10. A computer-readable medium having non-volatile program code executable by a processor, wherein the program code causes the processor to perform the method of any of claims 1 to 7.
CN202011545806.0A 2020-12-23 2020-12-23 Product delivery method, device, equipment and computer readable medium Pending CN112765482A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011545806.0A CN112765482A (en) 2020-12-23 2020-12-23 Product delivery method, device, equipment and computer readable medium


Publications (1)

Publication Number Publication Date
CN112765482A true CN112765482A (en) 2021-05-07

Family

ID=75695481

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011545806.0A Pending CN112765482A (en) 2020-12-23 2020-12-23 Product delivery method, device, equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN112765482A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113407851A * 2021-07-15 2021-09-17 北京百度网讯科技有限公司 Method, device, equipment and medium for determining recommendation information based on double-tower model
CN113407851B * 2021-07-15 2024-05-03 北京百度网讯科技有限公司 Method, device, equipment and medium for determining recommended information based on double-tower model
CN113469752A * 2021-07-22 2021-10-01 北京沃东天骏信息技术有限公司 Content recommendation method and device, storage medium and electronic equipment
CN115222461A * 2022-09-19 2022-10-21 杭州数立信息技术有限公司 Intelligent marketing accurate recommendation method
CN115222461B * 2022-09-19 2023-01-10 杭州数立信息技术有限公司 Intelligent marketing accurate recommendation method



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination