CN115705383A - Sequence recommendation algorithm, system, terminal and medium based on graph neural network time sequence feature extraction - Google Patents


Info

Publication number
CN115705383A
CN115705383A
Authority
CN
China
Prior art keywords: user, term, neural network, attention, long
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110931379.8A
Other languages
Chinese (zh)
Inventor
丁玥
李蕴哲
陈皓
王东
Current Assignee
Shanghai Dingsuan Intelligent Technology Co ltd
Original Assignee
Shanghai Dingsuan Intelligent Technology Co ltd
Application filed by Shanghai Dingsuan Intelligent Technology Co ltd
Priority to CN202110931379.8A
Publication of CN115705383A


Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application provides a sequence recommendation algorithm, system, terminal and medium based on graph neural network temporal feature extraction. The method defines user, item and time-period parameters and sets corresponding initial feature embeddings; calculates corresponding self-attention values based on the initial feature embeddings of the user, the item and the time period; inputs the attention values and the user parameters into a multi-head attention neural network so as to output the user's corresponding long-term interest; and inputs the long-term interest together with the user's short-term interaction item set into a multi-head attention neural network so as to output the user's corresponding long-short term interest. By jointly analyzing item features and temporal features, the method describes an item's popularity at the current time more accurately; the introduction of temporal features provides a new feature dimension rich in context information; and once the time features are obtained, the long-ignored problem of computing item popularity can be solved.

Description

Sequence recommendation algorithm, system, terminal and medium based on graph neural network time sequence feature extraction
Technical Field
The present application relates to the field of sequence recommendation technologies, and in particular to a sequence recommendation algorithm, system, terminal and medium based on graph neural network temporal feature extraction.
Background
A Recommendation System (RS) aims to generate a personalized recommendation list for each user. Collaborative Filtering (CF) is the most important technique for building RSs; representative CF methods include matrix factorization (MF)-based techniques, Probabilistic Graphical Models (PGM) and Deep Learning (DL) methods. However, recommendation algorithms have long been plagued by the temporal-dynamics problem, which is considered one of the classical problems in RS. Both user preferences and item popularity may change over time, yet most conventional CF methods analyze user-item interaction pairs in a static manner. Some studies have attempted to introduce temporal windows into the predictive model. For example, the time-weight CF method computes time-window weights and instance decay for items; an MF method fuses time-varying baseline predictors to model users' temporal behavior in a latent feature interaction function; and the Session-based Temporal Graph (STG) method fuses users' long-term and short-term preferences in a graph. These methods do little to improve prediction accuracy, for two main reasons. First, applying time-window analysis may lose many signals. Second, these efforts struggle to mine the latent relationships between user-interacted items.
Recently, sequential recommendation has become an emerging topic in RS and has attracted increasing attention. It has significant practical value because users' actions typically occur sequentially in the real world. Sequential recommendation aims to analyze a given sequence of user-interacted items and predict the next interacted item, uncovering the dependencies it contains and estimating the user's current interests. Sequential pattern mining and Markov chains are two representative solutions, but both have significant drawbacks. Sequential pattern mining tends to generate redundant patterns and lose infrequent ones. Markov chain-based models capture the latest user-item interactions but ignore long-term dependencies. Recurrent Neural Network (RNN)-based approaches are effective at modeling order dependencies, but most existing RNN-based work faces the problem of losing highly correlated relationships and introducing noise. Models based on attention mechanisms achieve satisfactory results in sequential recommendation, mainly because the dependencies between a user's historically rated items are correctly modeled from a global perspective.
The importance of describing temporal features has largely gone unnoticed, yet it is very important in sequential recommendation. For example, the feature vectors of the two months "December" and "August" provide a new feature dimension rich in context information. The significance of obtaining time features is that they solve the long-ignored problem of computing item popularity. Although the popularity of all items at different times is easily obtained from statistical information, doing so is computationally inconvenient and inefficient, and cannot be effectively integrated into a predictive algorithm. More importantly, statistics alone do not reveal the reasons behind popularity, i.e. which factors affect these items. Through the joint analysis of item features and temporal features, an item's popularity at the current time can be described more accurately. Another emerging problem in sequential recommendation is finding multi-scale sequential patterns when dealing with each user's long item sequence. As a simple example, a user purchases five items: an iPhone, an iWatch, a pair of AirPods, a coat and a pair of trousers, with the first three purchased in November and the last two in December. Intuitively, there are strong dependencies among the first three items and between the last two, while the relationship between the AirPods and the coat is relatively weak. The most likely pattern behind the whole sequence is a promotional event in November and cooling weather in December. Therefore, introducing temporal features is very important for mining the complex patterns that make sequential recommendation effective.
Disclosure of Invention
In view of the above shortcomings of the prior art, the present application aims to provide a sequence recommendation algorithm, system, terminal and medium based on graph neural network temporal feature extraction, so as to solve the problem that existing recommendation systems cannot mine users' latent relationships.
To achieve the above and other related objects, a first aspect of the present application provides a sequence recommendation algorithm based on graph neural network temporal feature extraction, comprising: defining user, item and time-period parameters respectively, and setting corresponding initial feature embeddings according to the user, item and time-period parameters; calculating a corresponding self-attention value based on the initial feature embeddings of the user, the item and the time period; inputting the attention values and the user parameters into a multi-head attention neural network so as to output the user's corresponding long-term interest; and inputting the user's long-term interest and the user's short-term interaction item set into a multi-head attention neural network so as to output the user's corresponding long-short term interest.
In some embodiments of the first aspect of the present application, the calculating of the corresponding self-attention value comprises: inputting the initial feature embeddings of the user, the item and the time period into a two-layer multilayer perceptron (MLP), and calculating and outputting the self-attention value by the MLP.
In some embodiments of the first aspect of the present application, the attention value is calculated by:

a_j = h^T tanh(W_t (e_ij + e_sn) + b_t);

α_j = exp(a_j) / Σ_k exp(a_k);

where α_j denotes the self-attention value of long-term interaction item i_j^u, calculated and normalized by softmax; e_ij denotes the feature embedding of the item, h^T denotes a projection vector, W_t a projection matrix, e_sn the time embedding and b_t a bias constant.
In some embodiments of the first aspect of the present application, the long-term interest of the user is calculated by:

Q_l = X W_l^Q, K_l = X W_l^K, V_l = X W_l^V;

Attention(Q_l, K_l, V_l) = softmax(Q_l K_l^T / √d_k) V_l;

u^long = MH(X) = [Attention(Q_1, K_1, V_1) || ... || Attention(Q_L, K_L, V_L)] W^O;

where MH(·) denotes the feature embedding obtained after the multi-head attention calculation, Q denotes a Query projection matrix, K a Key projection matrix and V a Value projection matrix; W_l^Q, W_l^K and W_l^V denote the l-th projection space, d_k denotes the feature dimension, Attention(Q_l, K_l, V_l) denotes the attention of the l-th head, and u^long denotes the user's long-term interest.
In some embodiments of the first aspect of the present application, the long-short term interest of the user is calculated by:

head_h = Attention(Q_h, K_h, V_h) = softmax(Q_h K_h^T / √d_k) V_h, where Q_h = X W_h^Q, K_h = X W_h^K, V_h = X W_h^V;

u^ls = MH(X) = [head_1 || ... || head_H] W^O;

where u^ls denotes the user's long-short term interest, MH(·) denotes the feature embedding obtained after the multi-head attention calculation, Q_h denotes the h-th Query projection matrix, K_h the h-th Key projection matrix, V_h the h-th Value projection matrix, and W_h^Q, W_h^K and W_h^V the projection matrices of the h-th projection space.
To achieve the above and other related objects, a second aspect of the present application provides a sequence recommendation system based on graph neural network temporal feature extraction, comprising: an initial feature module for defining user, item and time-period parameters and setting corresponding initial feature embeddings according to the parameters; a self-attention module for calculating a corresponding self-attention value based on the initial feature embeddings of the user, the item and the time period; and a multi-head attention module for inputting the attention values and the user parameters into a multi-head attention neural network so as to output the user's corresponding long-term interest, and for inputting the user's long-term interest and short-term interaction item set into the multi-head attention neural network so as to output the user's corresponding long-short term interest.
To achieve the above and other related objects, a third aspect of the present application provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the sequence recommendation method based on graph neural network temporal feature extraction.
To achieve the above and other related objects, a fourth aspect of the present application provides an electronic terminal, comprising a processor and a memory; the memory is used for storing a computer program, and the processor is used for executing the computer program stored in the memory so as to cause the terminal to execute the sequence recommendation method based on graph neural network temporal feature extraction.
As described above, the sequence recommendation algorithm, system, terminal and medium based on graph neural network temporal feature extraction of the present application have the following beneficial effects: by jointly analyzing item features and temporal features, the method describes an item's popularity at the current time more accurately; the introduction of temporal features provides a new feature dimension rich in context information; and once the time features are obtained, the long-ignored problem of computing item popularity can be solved.
Drawings
Fig. 1 is a flowchart illustrating a sequence recommendation algorithm according to an embodiment of the present application.
Fig. 2 is a schematic diagram of the user-item-time heterogeneous graph in an embodiment of the present application.
Fig. 3 is a flowchart illustrating a sequence recommendation algorithm according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a sequence recommendation system according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an electronic terminal according to an embodiment of the application.
Detailed Description
The following description of the embodiments of the present application is provided by way of specific examples, and other advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein. The present application is capable of other and different embodiments and its several details are capable of modifications and/or changes in various respects, all without departing from the spirit of the present application. It should be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It is noted that in the following description, reference is made to the accompanying drawings, which illustrate several embodiments of the present application. It is to be understood that other embodiments may be utilized, and that mechanical, structural, electrical and operational changes may be made without departing from the spirit and scope of the present application. The following detailed description is not to be taken in a limiting sense, and the scope of the embodiments of the present application is defined only by the claims of the issued patent. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. Spatially relative terms, such as "upper," "lower," "left," "right," "below," "above," and the like, may be used herein to facilitate describing the relationship of one element or feature to another element or feature as illustrated in the figures.
In this application, unless expressly stated or limited otherwise, the terms "mounted," "connected," "secured," "retained," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. The terms "first," "second," "third," "fourth," and the like in the description, the claims and the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that terms so used may be interchanged under appropriate circumstances, such that the embodiments described herein may be practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and/or "including" specify the presence of stated features, operations, elements, components, items, species and/or groups, but do not preclude the presence or addition of one or more other features, operations, elements, components, items, species and/or groups. The terms "or" and "and/or" as used herein are to be interpreted as inclusive, meaning any one or any combination; thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when a combination of elements, functions or operations is inherently mutually exclusive in some way.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions in the embodiments of the present invention are further described in detail by the following embodiments in conjunction with the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
Fig. 1 shows a schematic flow chart of a sequence recommendation algorithm based on the time series feature extraction of the graph neural network in an embodiment of the present invention. The sequence recommendation algorithm of the present embodiment includes the following steps.
It should be noted that the sequence recommendation algorithm based on graph neural network temporal feature extraction may be applied to various types of hardware devices, such as an ARM (Advanced RISC Machines) controller, an FPGA (Field Programmable Gate Array) controller, an SoC (System on Chip) controller, a DSP (Digital Signal Processing) controller or an MCU (Microcontroller Unit) controller; to personal computers such as desktop computers, notebook computers and tablet computers, and to devices such as smartphones, smart bracelets, smart watches, smart helmets and smart televisions; or to servers, which may be arranged on one or more physical servers according to function, load and other factors, or may be formed as a distributed or centralized server cluster.
Step S11: the user, the item and the time period are respectively defined and embedded according to the setting corresponding to the initial characteristic.
Specifically, defining U as a user, I as an article, and S as a time window set; and | U |, | I |, | S | respectively represent the lengths thereof. Wherein S = { S = { S = 1 ,S 2 ,......,S t-1 ,S t H, for each user U e U:
the interaction sequence of user u is represented as:
Figure BDA0003210890950000051
the long-term interactive item set for user u is represented as:
Figure BDA0003210890950000052
user' sThe short-term interactive item set of u is represented as:
Figure BDA0003210890950000053
the user-item-time triplet is represented in an anomaly graph as shown in fig. 2.
For each user, item and time window, a corresponding initial feature embedding e_u, e_i and e_s is set.
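As an intuitive illustration of these definitions, initializing the embeddings and splitting a user's interaction sequence into long-term and short-term sets by time window can be sketched as follows (a minimal Python sketch; the random initialization, dimensions and the "earlier windows vs. current window" split rule are illustrative assumptions, not fixed by this application):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

# Initial feature embeddings for users, items and time windows
# (randomly initialized here; learned during training in practice).
num_users, num_items, num_windows = 3, 10, 4
e_user = rng.normal(size=(num_users, d))
e_item = rng.normal(size=(num_items, d))
e_time = rng.normal(size=(num_windows, d))

def split_interactions(sequence, current_window):
    """Split a user's (item, time_window) sequence into the
    long-term set (earlier windows) and short-term set (current window)."""
    long_term = [i for i, s in sequence if s < current_window]
    short_term = [i for i, s in sequence if s == current_window]
    return long_term, short_term

seq = [(1, 0), (4, 0), (7, 2), (2, 3), (5, 3)]
long_term, short_term = split_interactions(seq, current_window=3)
print(long_term, short_term)  # [1, 4, 7] [2, 5]
```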
further, for the time feature embedding, a subchannel convolution aggregation mode can be adopted to obtain:
Figure BDA0003210890950000055
Figure BDA0003210890950000061
S new =σ(W s [u agg ||i agg ||s]+b s );
where σ (·) represents a nonlinear activation function; u. of agg Temporal feature embedding, i, representing a user agg Representing temporal feature embedding of an item, S new Time characteristic embedding representing a time window, W representing a weight, b representing a constant.
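The sub-channel aggregation step can be sketched as follows (a minimal numpy sketch; the mean aggregator, the choice of tanh as the nonlinearity σ, and all dimensions are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8  # embedding dimension (assumed)

# Embeddings of the users and items connected to one time window s
# in the heterogeneous user-item-time graph.
neighbor_users = rng.normal(size=(5, d))   # N_U(s)
neighbor_items = rng.normal(size=(7, d))   # N_I(s)
e_s = rng.normal(size=d)                   # initial time-window embedding

# Aggregate each channel separately (mean aggregation assumed).
u_agg = neighbor_users.mean(axis=0)
i_agg = neighbor_items.mean(axis=0)

# s_new = sigma(W_s [u_agg || i_agg || s] + b_s)
W_s = rng.normal(size=(d, 3 * d)) * 0.1
b_s = np.zeros(d)
concat = np.concatenate([u_agg, i_agg, e_s])
s_new = np.tanh(W_s @ concat + b_s)  # sigma: tanh chosen for illustration

print(s_new.shape)  # (8,)
```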
Step S12: based on the initial feature embeddings of the user, the item and the time period, calculate the corresponding self-attention value.

Specifically, the attention values are calculated using a two-layer multilayer perceptron (MLP), and the output yields the user's preference feature vectors in different time periods. The attention value is calculated as follows:

a_j = h^T tanh(W_t (e_ij + e_sn) + b_t);

α_j = exp(a_j) / Σ_k exp(a_k);

where α_j denotes the self-attention value of long-term interaction item i_j^u, calculated and normalized by softmax; e_ij denotes the feature embedding of the item, h^T denotes a projection vector, W_t a projection matrix, e_sn the time embedding and b_t a bias constant.
It should be noted that a multilayer perceptron (MLP) is a feed-forward artificial neural network (ANN) that maps a set of input vectors to a set of output vectors. An MLP can be viewed as a directed graph consisting of multiple layers of nodes, each layer fully connected to the next; except for the input nodes, each node is a neuron with a nonlinear activation function. The MLP is trained with the supervised back-propagation (BP) algorithm. The MLP generalizes the perceptron and overcomes the perceptron's inability to recognize linearly inseparable data.
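The additive self-attention scoring of step S12 can be sketched as follows (a minimal numpy sketch; the random parameters, item count and dimensions are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 8  # embedding dimension (assumed)

# Long-term interaction item embeddings e_{i_j} and time embedding e_{s_n}.
items = rng.normal(size=(6, d))
e_sn = rng.normal(size=d)

# Two-layer MLP scoring: a_j = h^T tanh(W_t (e_{i_j} + e_{s_n}) + b_t)
W_t = rng.normal(size=(d, d)) * 0.1
b_t = np.zeros(d)
h = rng.normal(size=d)
a = np.tanh((items + e_sn) @ W_t.T + b_t) @ h

# alpha_j: softmax normalization over the long-term items
alpha = np.exp(a - a.max())
alpha /= alpha.sum()

print(alpha.sum())  # sums to 1 up to floating-point error
```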
Step S13: input the attention values and the user parameters into a multi-head attention neural network so as to output the user's corresponding long-term interest. The calculation formulas are as follows:

Q_l = X W_l^Q, K_l = X W_l^K, V_l = X W_l^V;

Attention(Q_l, K_l, V_l) = softmax(Q_l K_l^T / √d_k) V_l;

u^long = MH(X) = [Attention(Q_1, K_1, V_1) || ... || Attention(Q_L, K_L, V_L)] W^O;

where MH(·) denotes the feature embedding obtained after the multi-head attention calculation, Q denotes a Query projection matrix, K a Key projection matrix and V a Value projection matrix; W_l^Q, W_l^K and W_l^V denote the l-th projection space, d_k denotes the feature dimension, Attention(Q_l, K_l, V_l) denotes the attention of the l-th head, and u^long denotes the user's long-term interest.
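The multi-head attention computation of step S13 can be sketched as follows (a minimal numpy sketch of standard scaled dot-product multi-head attention; the head count, dimensions and output projection W^O are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
d, num_heads = 8, 2
d_k = d // num_heads  # per-head dimension

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, Wq, Wk, Wv, Wo):
    """MH(X): concatenate scaled dot-product attention over each head."""
    heads = []
    for l in range(num_heads):
        Q, K, V = X @ Wq[l], X @ Wk[l], X @ Wv[l]
        # Attention(Q_l, K_l, V_l) = softmax(Q_l K_l^T / sqrt(d_k)) V_l
        heads.append(softmax(Q @ K.T / np.sqrt(d_k)) @ V)
    return np.concatenate(heads, axis=-1) @ Wo

X = rng.normal(size=(6, d))  # attention-weighted long-term item embeddings
Wq = rng.normal(size=(num_heads, d, d_k)) * 0.1
Wk = rng.normal(size=(num_heads, d, d_k)) * 0.1
Wv = rng.normal(size=(num_heads, d, d_k)) * 0.1
Wo = rng.normal(size=(d, d)) * 0.1

out = multi_head_attention(X, Wq, Wk, Wv, Wo)
print(out.shape)  # (6, 8)
```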
Step S14: input the user's long-term interest and short-term interaction item set into the multi-head attention neural network, and output the user's corresponding long-short term interest. The calculation formulas are as follows:

head_h = Attention(Q_h, K_h, V_h) = softmax(Q_h K_h^T / √d_k) V_h, where Q_h = X W_h^Q, K_h = X W_h^K, V_h = X W_h^V;

u^ls = MH(X) = [head_1 || ... || head_H] W^O;

where u^ls denotes the user's long-short term interest, MH(·) denotes the feature embedding obtained after the multi-head attention calculation, Q_h denotes the h-th Query projection matrix, K_h the h-th Key projection matrix, V_h the h-th Value projection matrix, and W_h^Q, W_h^K and W_h^V the projection matrices of the h-th projection space.
So far, for the overall flow of the sequence recommendation algorithm based on graph neural network temporal feature extraction in this embodiment, reference may be made to fig. 3: module 31 is the user's long-term interaction item set L^u; module 32 is the user's short-term interaction item set S^u; module 33 includes a multi-head attention network; module 34 is the user's long-short term interest u^ls.

The user's long-term interaction item i_1^u aggregates item1 to generate the item initial feature embedding e_i1, and long-term interaction item i_2^u aggregates item2 to generate the item initial feature embedding e_i2. The item initial feature embeddings are aggregated with the time-window initial feature embeddings, i.e. e_i1 with e_s1 and e_i2 with e_s1, and the aggregation results are input into a self-attention pooling layer to output the attention value α_1 of i_1^u. On the same principle, the attention values α_2, ..., α_n corresponding to the user's remaining long-term interaction items i_2^u, ..., i_n^u are calculated. The user initial feature embedding and the attention values α_1, ..., α_n are input into the multi-head attention network, which outputs the user's corresponding long-term interest u^long.

The user's short-term interaction item set S^u aggregates item7 and item8 to generate e_i7 and e_i8; the aggregation results of e_i7 and e_i8, together with the long-term interest u^long, are input into the multi-head attention network, which outputs the user's corresponding long-short term interest u^ls.
For ease of understanding, the present invention was tested on three datasets: Gowalla, Amazon Video Games and Foursquare. The Gowalla dataset records users' check-in information together with point-of-interest information and times; the Amazon Video Games dataset contains product reviews and metadata from the Amazon website, of which only the (user, item, rating, timestamp) tuples are used; the Foursquare dataset contains check-in data obtained from a location-based social network.
The statistical information of the datasets is summarized in a table (figure omitted).
Evaluation was performed using two common evaluation metrics, Recall and MRR, specifically defined as:

Recall@N = (1/|U|) Σ_{u∈U} |R_u^N ∩ T_u| / |T_u|;

MRR = (1/|U|) Σ_{u∈U} 1 / rank_u;

where R_u^N denotes the top-N recommendation list for user u, T_u denotes the set of ground-truth items of user u, and rank_u denotes the rank of the first correctly recommended item for user u.
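For a single user, these two metrics can be computed as follows (a minimal Python sketch; in the experiments the values are averaged over all users):

```python
def recall_at_n(recommended, ground_truth, n):
    """Fraction of the ground-truth items that appear in the top-n list."""
    hits = len(set(recommended[:n]) & set(ground_truth))
    return hits / len(ground_truth)

def mrr(recommended, ground_truth):
    """Reciprocal rank of the first relevant item (0 if none is ranked)."""
    for rank, item in enumerate(recommended, start=1):
        if item in ground_truth:
            return 1.0 / rank
    return 0.0

recs = [5, 2, 9, 1, 7]
truth = {2, 7}
print(recall_at_n(recs, truth, 3))  # 0.5  (item 2 is in the top 3)
print(mrr(recs, truth))             # 0.5  (first hit at rank 2)
```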
Comparative testing was performed using the following baseline methods:

BPR-MF: a Bayesian personalized ranking framework using matrix factorization as its internal predictor. BPR is the most representative traditional method for ranking tasks.

FPMC: factorization of personalized transition matrices combined with a personalized Markov chain model. It is a generalization of Markov chains and matrix factorization.

HRM: a hierarchical model that captures a user's sequential behavior and overall preferences. HRM has two types of aggregation operation, average pooling and max pooling; max pooling is chosen here because it leads to better results.

Caser: this method uses convolution operations; Caser captures sequential patterns by treating the most recent L items as an L × d "image", where d is the embedding size.

SHAN: this method is based on a hierarchical attention network that models users' long-term and short-term preferences for sequential recommendation.

SASRec: this method employs a self-attention mechanism to capture long-term semantics and makes predictions based on relatively few actions.

SR-GNN: this method models session sequences as graph-structured data to capture complex item transitions.

TiSASRec: a time-interval-aware self-attention model for sequential recommendation that models both the absolute positions of items and the relative time intervals between them. It achieves state-of-the-art performance.
The experimental results show that the proposed HGSHAN method of the present invention outperforms all baseline methods (results table omitted).
Tests were also performed on different aggregation modes of the nodes in the graph, corresponding to the following cases:

Case 1: the time embedding is calculated in the standard graph-aggregation manner.

Case 2: the connections between time windows and items are removed from the graph.

Case 3: the connections between time windows and users are removed from the graph.

Case 4: randomized time feature embeddings are used instead of the graph structure.

Case 5: the graph is removed from the time-embedding computation entirely.
The results of these tests show that the sub-channel aggregation method proposed in this embodiment obtains the best results on all three datasets (results table omitted).
The time-window granularity was also tested: the algorithm was evaluated on the three datasets with time windows of one day, one week and one month, respectively (results table omitted).
During model training, prediction is still performed in an inner-product manner, with the time feature embedding added to both the user features and the item features before the inner product is taken.
Updating adopts the Bayesian personalized ranking approach. That user u's preference for item i is greater than that for item k can be defined as:

r_{u,i} > r_{u,k}, with i ∈ L^u ∪ S^u and k ∉ L^u ∪ S^u;

where L^u denotes the long-term interaction item set of user u and S^u denotes the short-term interaction item set of user u.
The overall objective function is defined as the pairwise BPR loss with L2 regularization:

min_Θ Σ_{(u,i,k)} -ln σ(r_{u,i} - r_{u,k}) + λ ||Θ||^2.
The parameter space of the whole model is Θ = {Θ_g, Θ_t, Θ_l, Θ_h}. The search space for the hidden-variable size is set to [40, 60, 80, 100, 120, 140]; the L2-regularization parameter search space is [0.0001, 0.001, 0.01, 0.1]; the learning-rate search space is [0.0001, 0.001, 0.01, 0.1]. The batch size is set to 1, and the search space for the number of graph samples for each user and item is [10, 20, 30, 40, 50].
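The Bayesian personalized ranking update can be sketched as follows (a minimal numpy sketch of a pairwise BPR loss with L2 regularization; the score values, parameter list and regularization weight are illustrative assumptions):

```python
import numpy as np

def bpr_loss(score_pos, score_neg, params, reg=0.001):
    """BPR pairwise loss: -sum ln sigma(r_ui - r_uk) + reg * ||Theta||^2."""
    diff = np.asarray(score_pos) - np.asarray(score_neg)
    nll = -np.sum(np.log(1.0 / (1.0 + np.exp(-diff))))  # -ln sigmoid(diff)
    l2 = reg * sum(np.sum(p ** 2) for p in params)
    return nll + l2

# Predicted scores for observed (positive) items i and unobserved items k.
pos = [2.0, 1.5, 0.3]
neg = [0.5, 1.0, 0.6]
theta = [np.array([0.1, -0.2]), np.array([0.05])]
loss = bpr_loss(pos, neg, theta)
print(loss > 0)  # True
```

A larger margin between positive and negative scores yields a smaller loss, which is what gradient descent on this objective pushes toward.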
Fig. 4 is a schematic structural diagram of a sequence recommendation system based on graph neural network temporal feature extraction according to an embodiment of the present invention. The sequence recommendation system of this embodiment comprises an initial feature module 401, a self-attention module 402 and a multi-head attention module 403.

The initial feature module 401 is used for defining user, item and time-period parameters and setting corresponding initial feature embeddings according to the parameters; the self-attention module 402 is configured to calculate a corresponding self-attention value based on the initial feature embeddings of the user, the item and the time period; the multi-head attention module 403 is used for inputting the attention values and the user parameters into a multi-head attention neural network so as to output the user's corresponding long-term interest, and for inputting the user's long-term interest and short-term interaction item set into the multi-head attention neural network so as to output the user's corresponding long-short term interest.
It should be understood that the division of the modules of the above apparatus is only a logical division; in an actual implementation they may be wholly or partially integrated into one physical entity or kept physically separate. These modules may all be implemented as software invoked by a processing element, all as hardware, or partly as software invoked by a processing element and partly as hardware. For example, the initial feature module may be a separately installed processing element, may be integrated into a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code that a processing element of the apparatus calls to execute the functions of the module. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal-processing capability. In implementation, each step of the above method or each of the above modules may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above method, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). As another example, when one of the above modules is implemented by a processing element scheduling program code, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. As a further example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 5 is a schematic structural diagram of an electronic terminal according to an embodiment of the present invention. This embodiment provides an electronic terminal, comprising: a processor 51, a memory 52, and a communicator 53. The memory 52 is connected with the processor 51 and the communicator 53 through a system bus and completes mutual communication; the memory 52 is used for storing a computer program, the communicator 53 is used for communicating with other devices, and the processor 51 is used for running the computer program so that the electronic terminal executes the steps of the above sequence recommendation algorithm based on graph neural network time-sequence feature extraction.
The system bus mentioned above may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown, but this is not intended to represent only one bus or type of bus. The communication interface is used for realizing communication between the database access device and other equipment (such as a client, a read-write library and a read-only library). The Memory may include a Random Access Memory (RAM), and may further include a non-volatile Memory (non-volatile Memory), such as at least one disk Memory.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the above sequence recommendation algorithm based on graph neural network time-sequence feature extraction.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the above method embodiments may be performed by hardware associated with a computer program. The aforementioned computer program may be stored in a computer readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
In the embodiments provided herein, the computer-readable and/or writable storage medium may include read-only memory, random-access memory, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory, a USB flash drive, a removable hard disk, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable and writable storage media and data storage media do not include connections, carrier waves, signals, or other transitory media, but are intended to be non-transitory, tangible storage media. Disk and disc, as used in this application, include Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
In summary, the present application provides a sequence recommendation algorithm, system, terminal, and medium based on graph neural network time-sequence feature extraction. Through the joint analysis of commodity features and time features, the popularity of a commodity at the current time can be described more accurately; the introduction of time features provides a new feature dimension rich in context information; and once the time features are obtained, the long-neglected problem of computing item popularity can be addressed. The application therefore effectively overcomes various defects in the prior art and has high industrial utilization value.
The above embodiments are merely illustrative of the principles and utilities of the present application and are not intended to limit the present application. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present application. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical concepts disclosed in the present application shall be covered by the claims of the present application.

Claims (8)

1. A sequence recommendation algorithm based on the time sequence feature extraction of a graph neural network is characterized by comprising the following steps:
respectively defining user, article and time period parameters, and setting corresponding initial feature embedding according to the user, article and time period parameters;
calculating a corresponding self-attention value based on the initial feature embedding of the user, the article and the time period;
inputting the attention value and the user parameters into a multi-head attention neural network so as to output the corresponding long-term interest of the user;
and inputting the long-term interest and the short-term interaction item set of the user into a multi-head attention neural network so as to output the corresponding long-term and short-term interest of the user.
2. The sequence recommendation algorithm based on graph neural network time-sequence feature extraction of claim 1, wherein calculating the corresponding self-attention value comprises: inputting the initial feature embeddings of the user, the item, and the time period into a two-layer multilayer perceptron, which calculates and outputs the self-attention value.
3. The sequence recommendation algorithm based on graph neural network time-sequence feature extraction of claim 1, wherein the attention value is calculated as follows:

$\hat{s}_u^l = \sum_j \alpha_j \hat{e}_{ij}$;

$\alpha_j = \frac{\exp(a_j)}{\sum_{j'} \exp(a_{j'})}$;

$a_j = h^T \tanh\!\left(W_t(\hat{e}_{ij} + e_{sn}) + b_t\right)$;

wherein $\alpha_j$ represents the attention value of long-term interactive item $i_j$, normalized through softmax; $\hat{e}_{ij}$ represents the feature embedding of the item; $h^T$ represents a projection vector; $W_t$ represents a projection matrix; $e_{sn}$ represents the time embedding; and $b_t$ is a bias constant.
4. The sequence recommendation algorithm based on graph neural network time-sequence feature extraction of claim 1, wherein the long-term interest of the user is calculated as follows:

$\mathrm{MH}(Q, K, V) = \mathrm{Concat}(\mathrm{head}_1, \ldots, \mathrm{head}_L)\, W^O$;

$\mathrm{head}_l = \mathrm{Attention}(Q W_l^Q, K W_l^K, V W_l^V)$;

$\mathrm{Attention}(Q_l, K_l, V_l) = \mathrm{softmax}\!\left(\frac{Q_l K_l^T}{\sqrt{d_k}}\right) V_l$;

wherein MH(·) represents the feature embedding obtained after multi-head attention calculation; Q represents the Query projection matrix, K the Key projection matrix, and V the Value projection matrix; $W_l^Q, W_l^K, W_l^V$ denote the l-th projection space; $d_k$ represents the feature dimension; $\mathrm{Attention}(Q_l, K_l, V_l)$ denotes one of the plurality of attention heads; and $s_u^l$ represents the long-term interest of the user.
5. The sequence recommendation algorithm based on graph neural network time-sequence feature extraction of claim 1, wherein the long-short-term interest of the user is calculated as follows:

$s_u^{ls} = \mathrm{MH}(Q_h, K_h, V_h) = \mathrm{Concat}(\mathrm{head}_1, \ldots, \mathrm{head}_H)\, W^O$;

$\mathrm{head}_h = \mathrm{Attention}(Q_h W_h^Q, K_h W_h^K, V_h W_h^V)$;

wherein $s_u^{ls}$ represents the long-short-term interest of the user; MH(·) represents the feature embedding obtained after multi-head attention calculation; $Q_h$ denotes the h-th Query projection matrix, $K_h$ the h-th Key projection matrix, and $V_h$ the h-th Value projection matrix; and $W_h^Q, W_h^K, W_h^V$ denote the h-th projection space and represent different projection matrices.
6. A sequence recommendation system based on graph neural network time-sequence feature extraction, characterized by comprising:
an initial feature module for defining user, item, and time-period parameters and setting corresponding initial feature embeddings according to the parameters;
a self-attention module for calculating corresponding self-attention values based on the initial feature embeddings of the user, item, and time period;
a multi-head attention module for inputting the attention values and the user parameters into a multi-head attention neural network to output the corresponding long-term interest of the user, and for inputting the long-term interest of the user and the user's short-term interaction item set into a multi-head attention neural network to output the corresponding long-short-term interest of the user.
7. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the sequence recommendation algorithm based on graph neural network time-sequence feature extraction of any one of claims 1 to 5.
8. An electronic terminal, comprising: a processor and a memory;
the memory is used for storing a computer program;
the processor is configured to execute the computer program stored in the memory to cause the terminal to execute the sequence recommendation algorithm based on graph neural network time-sequence feature extraction according to any one of claims 1 to 5.
CN202110931379.8A 2021-08-13 2021-08-13 Sequence recommendation algorithm, system, terminal and medium based on graph neural network time sequence feature extraction Pending CN115705383A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110931379.8A CN115705383A (en) 2021-08-13 2021-08-13 Sequence recommendation algorithm, system, terminal and medium based on graph neural network time sequence feature extraction


Publications (1)

Publication Number Publication Date
CN115705383A 2023-02-17

Family

ID=85180169

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110931379.8A Pending CN115705383A (en) 2021-08-13 2021-08-13 Sequence recommendation algorithm, system, terminal and medium based on graph neural network time sequence feature extraction

Country Status (1)

Country Link
CN (1) CN115705383A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116595157A (en) * 2023-07-17 2023-08-15 江西财经大学 Dynamic interest transfer type session recommendation method and system based on user intention fusion
CN116595157B (en) * 2023-07-17 2023-09-19 江西财经大学 Dynamic interest transfer type session recommendation method and system based on user intention fusion

Similar Documents

Publication Publication Date Title
US11741361B2 (en) Machine learning-based network model building method and apparatus
Bolón-Canedo et al. Feature selection for high-dimensional data
CN111815415B (en) Commodity recommendation method, system and equipment
CN111798273A (en) Training method of purchase probability prediction model of product and purchase probability prediction method
US20110282861A1 (en) Extracting higher-order knowledge from structured data
JP2019507398A (en) Collaborative filtering method, apparatus, server, and storage medium for fusing time factors
Wang et al. Hierarchical attentive transaction embedding with intra-and inter-transaction dependencies for next-item recommendation
CN116601626A (en) Personal knowledge graph construction method and device and related equipment
CN110598084A (en) Object sorting method, commodity sorting device and electronic equipment
CN113159450A (en) Prediction system based on structured data
CN113268667A (en) Chinese comment emotion guidance-based sequence recommendation method and system
CN112749737A (en) Image classification method and device, electronic equipment and storage medium
CN114417161B (en) Virtual article time sequence recommendation method, device, medium and equipment based on special-purpose map
CN115879508A (en) Data processing method and related device
CN116049536A (en) Recommendation method and related device
WO2023050143A1 (en) Recommendation model training method and apparatus
CN115705383A (en) Sequence recommendation algorithm, system, terminal and medium based on graph neural network time sequence feature extraction
CN113159449A (en) Structured data-based prediction method
Trirat et al. Universal time-series representation learning: A survey
Wang et al. Jointly modeling intra-and inter-transaction dependencies with hierarchical attentive transaction embeddings for next-item recommendation
CN115344794A (en) Scenic spot recommendation method based on knowledge map semantic embedding
CN115705384A (en) Decoupling recommendation method, system, terminal and medium based on knowledge graph fusion
US20230267277A1 (en) Systems and methods for using document activity logs to train machine-learned models for determining document relevance
CN113420214B (en) Electronic transaction object recommendation method, device and equipment
US11934384B1 (en) Systems and methods for providing a nearest neighbors classification pipeline with automated dimensionality reduction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination