CN111738780A - Method and system for recommending object

Method and system for recommending object

Info

Publication number
CN111738780A
Authority
CN
China
Prior art keywords
user behavior
behavior sequence
feature
user
cross
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010757535.9A
Other languages
Chinese (zh)
Inventor
钱浩
崔卿
周俊
李龙飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202010757535.9A
Publication of CN111738780A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0255Targeted advertisements based on user history
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0269Targeted advertisements based on user profile or attribute
    • G06Q30/0271Personalized advertisement

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Theoretical Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiment of the specification discloses a method and a system for recommending objects, wherein the method comprises the following steps: obtaining candidate object characteristics, user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation; processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation; and processing the candidate object feature, the user feature and the cross feature to determine a recommendation score of the candidate object relative to the user.

Description

Method and system for recommending object
Technical Field
The embodiment of the specification relates to the technical field of computers, in particular to a method and a system for recommending objects.
Background
With the rapid development of the internet, personalized recommendation has come to occupy a considerable position in internet products. Personalized recommendation can effectively improve the stickiness of a product for its users, thereby encouraging users to generate more behaviors. Personalized recommendation can discover and analyze a user's interest preferences from the user's historical behavior data, and further recommend content that the user may be interested in. Accurately analyzing the user's preferences is key to successful recommendation.
Therefore, the embodiment of the specification provides a method and a system for recommending an object, so as to realize accurate recommendation for a user.
Disclosure of Invention
One aspect of embodiments of the present specification provides a method of recommending an object, the method including: obtaining candidate object characteristics, user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation; processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation; and processing the candidate object feature, the user feature and the cross feature to determine a recommendation score of the candidate object relative to the user.
An aspect of an embodiment of the present specification provides a system for recommending an object, the system including: the characteristic acquisition module is used for acquiring candidate object characteristics, user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation; the characteristic operation module is used for processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation; and the recommendation score determining module is used for processing the candidate object characteristics, the user characteristics and the cross characteristics and determining the recommendation score of the candidate object relative to the user.
One aspect of embodiments of the present specification provides an apparatus for recommending an object, the apparatus comprising a processor and a memory; the memory is used for storing instructions, and the processor is used for executing the instructions to realize the corresponding operation of the method for recommending the object in any previous item.
One aspect of an embodiment of the present specification provides a method of representing a serialized user behavior, the method comprising: acquiring user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation; processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation; determining the cross feature as a user behavior representation.
One aspect of an embodiment of the present specification provides a representation system for serializing user behavior, the system comprising: the characteristic acquisition module is used for acquiring user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation; the characteristic operation module is used for processing screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation; and the representation determining module is used for determining the cross features as user behavior representations.
One aspect of embodiments of the present specification provides an apparatus for representing a serialized user behavior, the apparatus comprising a processor and a memory; the memory is used for storing instructions, and the processor is used for executing the instructions to realize the corresponding operation of the representation method of the serialized user behavior.
Drawings
The present description will be further described by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a system for recommending objects, shown in some embodiments of the present description;
FIG. 2 is a flow diagram of a method of recommending objects, shown in accordance with some embodiments of the present description;
FIG. 3 is a flow diagram illustrating obtaining cross features in accordance with some embodiments of the present description;
FIG. 4 is a flow diagram illustrating the determination of updated user behavior sequence characteristics and user behavior characteristics for a current round in accordance with some embodiments of the present description;
FIG. 5 is a schematic diagram of a structure of a recommendation model shown in accordance with some embodiments of the present description;
FIG. 6 is a schematic diagram of a structure of a feature crossing network shown in accordance with some embodiments of the present description;
FIG. 7 is a block diagram of a system for recommending objects, shown in accordance with some embodiments of the present description;
FIG. 8 is a block diagram of a presentation system for serialized user behavior shown in accordance with some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used in this specification is a method for distinguishing different components, elements, parts or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and/or "the" may include plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the preceding or following operations are not necessarily performed exactly in the order shown. Rather, the various steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to the processes, or one or more steps may be removed from the processes.
Fig. 1 is a schematic diagram of an application scenario of a system for recommending an object according to some embodiments of the present description.
In people's production and daily life, scenarios in which objects are recommended based on a user's historical behavior data are frequently encountered. The user behavior may be various activities performed by the user on the service platform, such as exposure, clicking, adding to a shopping cart, purchasing, and the like. The objects may be various types of products, such as goods, services, news, etc., that the respective service platforms provide to the user.
However, not every historical behavior represents the user's will or intent. For example, a user may browse merely to pass the time, with no desire or need to purchase; as another example, a user may browse simply to complete some reward task, without being genuinely interested. Thus, historical behavior data often contains noise that can affect the accuracy of determining recommended objects. Therefore, this specification provides an algorithm that operates on a screening feature comprising at least the user feature together with the user behavior sequence feature, automatically removes meaningless information from the historical behavior data during the operation, and screens out the important features (i.e., the cross features) that are valuable for the final purchasing behavior; a target object recommended to the user is then determined based on these valuable important features, improving the recommendation accuracy.
As shown in fig. 1, an application scenario 100 of a system for recommending objects may include a processing device 110, a network 120, and a user terminal 130.
The processing device 110 may be used to process information and/or data associated with a recommended object to perform one or more of the functions disclosed in this specification. In some embodiments, the processing device 110 may determine candidate recommendation objects and their characteristics, user characteristics, and initial user behavior sequence characteristics. In some embodiments, the processing device 110 may process the screening factor features at least including the user features and the initial user behavior sequence features to obtain cross features (e.g., a user behavior representation). In some embodiments, processing device 110 may process the candidate object features, the user features, and the cross features to determine a recommendation score for the candidate object relative to the user. In some embodiments, the processing device 110 may select a target object from a plurality of candidate objects to recommend to the user based on the recommendation score. In some embodiments, processing device 110 may include one or more processing engines (e.g., single core processing engines or multiple core processing engines). By way of example only, the processing device 110 may include one or more combinations of a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, and the like. In some embodiments, one or more storage devices may be included in the processing device for storing data that needs to be processed by the processing device, result data of the processing, and the like. For example, initial user behavior characteristics, environmental characteristics, etc. may be stored in the storage device.
Network 120 may facilitate the exchange of information and/or data. In some embodiments, one or more components (e.g., processing device 110, user terminal 130) in the application scenario 100 may communicate information to other components in the application scenario 100 over the network 120. For example, processing device 110 may obtain user characteristics from user terminal 130 via network 120. As another example, the user terminal 130 may obtain the candidate object recommended by the processing device 110 through the network 120. In some embodiments, the network 120 may be any form of wired network, wireless network, or any combination thereof. By way of example only, network 120 may be one or more combinations of a wired network, a fiber optic network, a telecommunications network, an intranet, the internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a bluetooth network, and so forth.
User terminal 130 may be a device having data acquisition, storage, and/or transmission capabilities. In some embodiments, user terminal 130 may obtain user characteristics. In some embodiments, the user terminal 130 may receive the target object determined by the processing device 110. In some embodiments, the user of the user terminal 130 may be a user of an online service using the application platform. The user terminal 130 may obtain behavior data of the user on the application platform. In some embodiments, the user terminal 130 may include, but is not limited to, a mobile device 130-1, a tablet 130-2, a laptop 130-3, a desktop 130-4, and the like, or any combination thereof. Exemplary mobile devices 130-1 may include, but are not limited to, smart phones, Personal Digital Assistants (PDAs), handheld game consoles, smart watches, wearable devices, virtual display devices, display enhancement devices, and the like, or any combination thereof. In some embodiments, the user terminal 130 may send the retrieved data to one or more devices in the application scenario 100.
It should be noted that the above description of the various components in the application scenario 100 is for illustration and description only and does not limit the scope of applicability of the present description. It will be apparent to those skilled in the art, given the benefit of this disclosure, that additions or subtractions of components in the application scenario 100 may be made. However, such variations are still within the scope of the present description.
FIG. 2 is a flow diagram illustrating a method of recommending objects in accordance with some embodiments of the present description. In some embodiments, the process 200 may be implemented by the system for recommending objects 700, or the processing device 110 shown in FIG. 1. As shown in fig. 2, the process 200 may include the following steps:
step 202, obtaining candidate object characteristics, user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation. In some embodiments, step 202 may be performed by feature acquisition module 710.
The object is any content that can be provided by the online service platform to the user, such as commodities, songs, videos, pictures and the like. The object feature refers to related information of the object. In some embodiments, object features may include, but are not limited to: one or more of the group of category, merchandise identification, price, production related information, sales related information, and the like. The production-related information may reflect the source of the commodity, such as the manufacturer or the place of production. The sales related information may reflect the supply or sales channel of the goods, e.g., agent, sales platform, etc. The object characteristics may also include other characteristics, which are not limited in this embodiment, such as material and production process of the object.
The candidate object is an alternative for the target object recommended to the user by the online service platform. In some embodiments, the processing device may determine the candidate object based on a triggering operation of the user (e.g., the user searching, clicking on a platform, etc.). For example, the processing device recalls, by a recall system, the at least one candidate object based on the object operated on by the user. Typically, the recall system performs a preliminary screening, narrowing a large pool of objects down to candidates on the order of hundreds to thousands. The recall system may recall according to a certain policy, for example, based on the similarity between the candidate object and the object operated on by the user.
The candidate object feature refers to the relevant information of the candidate object, and can be represented by I. In some embodiments, the candidate object feature can be represented as a vector, which may be either sparse or dense. For example, the candidate object feature may be converted by one-hot coding into an $(n_1 \times m_1)$ (or $(m_1 \times n_1)$) sparse vector, where $n_1$ represents the number of features of the candidate object and $m_1$ represents the code length of each feature. For another example, the candidate object feature can be converted by vector embedding into an $(n_1 \times d)$ (or $(d \times n_1)$) dense vector, where $d$ represents the dimensionality of the embedded vector. Vector embedding may be achieved through embedding models, including, for example, the Word2Vec model, the Glove model, the Bert model, and the like.
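By way of illustration only, the two encodings can be sketched as follows; the number of features, code length, embedding dimension, and feature indices are all assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n1, m1, d = 4, 100, 8              # assumed: 4 candidate-object features, code length 100, dim 8
feature_indices = [3, 17, 42, 99]  # assumed categorical value index of each of the n1 features

# Sparse (n1 x m1) one-hot representation of the candidate object feature I.
I_sparse = np.zeros((n1, m1))
for row, idx in enumerate(feature_indices):
    I_sparse[row, idx] = 1.0

# Dense (n1 x d) representation obtained by vector embedding (a lookup table per feature).
embedding_tables = rng.normal(size=(n1, m1, d))
I_dense = np.stack([embedding_tables[row, idx] for row, idx in enumerate(feature_indices)])
print(I_sparse.shape, I_dense.shape)  # (4, 100) (4, 8)
```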
The user is a service user of the online service platform. The user feature refers to the related information of the user and can be represented by U. In some embodiments, user features may include, but are not limited to: gender, age, occupation, income, current location, place of residence, and the like. In some embodiments, similar to the candidate object feature, the user feature may also be represented as a vector, which may be an $(n_2 \times m_2)$ (or $(m_2 \times n_2)$) sparse vector, or an $(n_2 \times d)$ (or $(d \times n_2)$) dense vector, where $n_2$ represents the number of features of the user and $m_2$ represents the code length of each feature.
The initial user behavior sequence feature is serialization information of user historical operations which are initially acquired from the platform. The history operation may be any operation of the object (e.g., a good) by the user, such as browsing, clicking, purchasing, commenting, and the like. In some embodiments, the initial user behavior sequence feature may reflect information related to historical user operations over a predetermined time (e.g., last month, last 10 days, etc.). The initial user behavior sequence feature may also reflect information about a user's historical actions that meet preset requirements (e.g., type of action (e.g., purchase), duration of action (e.g., browsing duration exceeds 5 min), etc.).
In some embodiments, the initial user behavior sequence feature comprises feature information of at least one object related to a user's historical operations. It is understood that the sequence feature formed by the feature of the at least one object related to the user's historical operation may reflect the user's historical operation or behavior in a certain platform. In some embodiments, the processing device may perform serialization arrangement on the feature information of at least one object of the user history operation based on a preset serialization rule, so as to obtain an initial user behavior sequence feature. The serialization rule may be a sequence of operation time of each object by the user, or may be a sequence of operation time duration of each object by the user. Wherein, the characteristic information of each object of the user history operation can be similar to the candidate object characteristics.
For example, if the user sequentially clicks L items in chronological order, i.e., item 1, item 2, ..., item $l$, ..., item L, then, similar to the candidate object feature described above, item $l$ can be expressed as a vector $h_l$, which may be an $(n_3 \times m_1)$ (or $(m_1 \times n_3)$) sparse vector, or an $(n_3 \times d)$ (or $(d \times n_3)$) dense vector, where $n_3$ represents the number of features per item. Further, the initial user behavior sequence feature may be $H = \{h_1, h_2, \ldots, h_L\}$.
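As an illustrative sketch, the initial user behavior sequence feature can be assembled by ordering the operated items (here by operation time, one of the serialization rules mentioned above) and stacking their feature vectors; the item identifiers, timestamps, and feature values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n3, d = 4, 8                        # assumed: n3 features per item, embedding dimension d
item_features = {                   # assumed (n3 x d) dense feature matrix h_l per clicked item
    "item_a": rng.normal(size=(n3, d)),
    "item_b": rng.normal(size=(n3, d)),
    "item_c": rng.normal(size=(n3, d)),
}
clicks = [("item_b", 1690000000), ("item_a", 1690000300), ("item_c", 1690000900)]  # (id, time)

# Serialization rule assumed here: order by operation time.
ordered = [item_id for item_id, _ in sorted(clicks, key=lambda c: c[1])]
H = np.stack([item_features[item_id] for item_id in ordered])  # H = {h_1, ..., h_L}, (L, n3, d)
print(H.shape)  # (3, 4, 8)
```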
In some embodiments, the feature acquisition module 710 may also acquire environmental features. The environmental feature refers to information related to the natural environment and can be represented by C. The environmental feature may include, but is not limited to, one or a combination of seasonal information, climate information, and the like. In some embodiments, the environmental feature may also be represented as a vector, which may be an $(n_4 \times m_4)$ (or $(m_4 \times n_4)$) sparse vector, or an $(n_4 \times d)$ (or $(d \times n_4)$) dense vector, where $n_4$ represents the number of environmental features and $m_4$ represents the code length of each feature.
In some embodiments, the feature acquisition module 710 may acquire the user features from the user terminal 130 or a storage device; acquiring candidate object characteristics or environment characteristics from a storage device; the initial user behavior sequence features are obtained from a service log or a storage device.
Step 204, processing the screening factor features and the initial user behavior sequence features to obtain cross features, wherein the screening factor features include the user features, and the cross features include attention information of the user to at least one object related to the user historical operation. In some embodiments, step 204 may be performed by feature operation module 720.
The attention information may represent the user's intention with respect to an object, for example, the user's degree of attention to, liking of, preference for, or likelihood of purchasing the object. In some embodiments, the attention information may be related to an attention parameter: the greater the attention parameter, the greater the user's attention to the object or the likelihood that the user will purchase it, and so on.
The screening factor feature can be used to rank or filter the at least one object related to the user's historical operations to obtain the cross feature. Filtering may mean filtering out objects that the user does not actually pay attention to or like, and ranking may mean ordering the objects according to the degree of the user's actual attention or preference. In some embodiments, the screening or ranking may be performed through the attention information. For the specific details of processing the screening factor feature and the initial user behavior sequence feature to obtain the cross feature, reference may be made to fig. 3 and its related description, which are not repeated here.
In some embodiments, the screening factor feature may include at least the user feature. In still other embodiments, the screening factor feature may further include at least one of the environmental feature and the candidate object feature. When the screening factor feature includes a plurality of features (i.e., includes not only the user feature), the screening factor feature may be obtained by fusing the plurality of features. For example, the screening factor feature may be obtained by fusing the user feature and the environmental feature (or the candidate object feature). For another example, the screening factor feature may be obtained by fusing the user feature, the environmental feature, and the candidate object feature. The fusion process may be a vector stitching process. Illustratively, if the candidate object feature $I$ is an $(n_1 \times d)$ vector, the user feature $U$ is an $(n_2 \times d)$ vector, and the environmental feature $C$ is an $(n_4 \times d)$ vector, then the screening factor feature $S$ may be an $((n_1+n_2+n_4) \times d)$ (i.e., $(n_5 \times d)$) vector.
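A minimal sketch of this fusion by vector stitching (row-wise concatenation), with assumed shapes:

```python
import numpy as np

rng = np.random.default_rng(2)
d = 8
I = rng.normal(size=(4, d))   # assumed candidate object feature, (n1 x d)
U = rng.normal(size=(6, d))   # assumed user feature, (n2 x d)
C = rng.normal(size=(2, d))   # assumed environmental feature, (n4 x d)

# Fusion by vector stitching into the screening factor feature S.
S = np.concatenate([I, U, C], axis=0)  # ((n1 + n2 + n4) x d) = (n5 x d)
print(S.shape)  # (12, 8)
```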
It can be understood that, when the screening factor feature includes the user feature, the objects in the initial user behavior sequence feature can be evaluated from the perspective of the user, and the objects that the user is more likely to pay attention to can be determined. For example, if the user is a teacher and the objects in the initial user behavior sequence feature include a textbook and a gaming machine, the user's attention parameter for the textbook may be greater than the attention parameter for the gaming machine. When the screening factor feature includes the environmental feature, the objects that the user is more interested in can be determined according to the climate or the season, and objects that do not fit the current environment can be filtered out. Therefore, through the effect of the screening factor feature on the initial user behavior sequence feature, the user's interests and preferences can be mined from multiple angles, and the user's historical operation behavior can be depicted accurately.
In some embodiments, the cross feature may be taken as a user behavior representation. The user behavior representation is an expression of the user's historical behavior that reflects the user's true intention. The user behavior representation can be regarded as a further feature extraction or processing result of the initial user behavior sequence feature, and, as a feature with richer information content, can be applied to the training or prediction of various types of related models.
In essence, the operations on the features in this specification are performed on the vectors corresponding to those features, for example, on the dense vectors of the respective features.
Step 206, processing the candidate object feature, the user feature and the cross feature to determine a recommendation score of the candidate object relative to the user. In some embodiments, step 206 may be performed by recommendation score determination module 730.
In some embodiments, processing the candidate object features, the user features, and the cross-features may be determining a relationship between the candidate object features, the user features, and the cross-features. For example, the correlation between features of objects of different attention parameters in the cross-feature and candidate object features. Also for example, correlations between user features and candidate object features, and the like.
In some embodiments, the characteristics processed by the recommendation score determination module 730 may also include environmental characteristics. It can be understood that the processing result may further include a relationship between the environmental feature and the candidate object feature.
In some embodiments, the recommendation score determination module 730 may determine the recommendation score of the candidate object according to these relationships. For example, the greater the relevance between the features of an object with a large attention parameter and the candidate object feature, the higher the recommendation score; the greater the relevance between the features of an object with a small attention parameter and the candidate object feature, the lower the recommendation score. As another example, the greater the relevance between the user feature and the candidate object feature, the higher the recommendation score. As another example, the greater the relevance between the environmental feature and the candidate object feature, the higher the recommendation score.
In some embodiments, the processing device may process the candidate object feature, the user feature, and the cross feature using a multi-layer perceptron of the recommendation model to determine the recommendation score of the candidate object relative to the user, see fig. 5 and its associated description.
Further, the processing device 110 may determine whether the candidate object is a target object recommended to the user based on the recommendation score. For example, the candidate object with the recommendation score larger than the preset threshold value is recommended to the user.
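For illustration only, the selection of target objects from the scored candidates might look like the following; the scores and threshold are assumed values:

```python
# Keep candidates whose recommendation score exceeds a preset threshold.
scores = {"obj_1": 0.82, "obj_2": 0.41, "obj_3": 0.67}   # assumed candidate -> score
threshold = 0.6                                          # assumed preset threshold

targets = [obj for obj, score in scores.items() if score > threshold]
# Alternatively, keep the top-k candidates: sorted(scores, key=scores.get, reverse=True)[:k]
print(targets)  # ['obj_1', 'obj_3']
```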
Fig. 3 is a flow diagram illustrating obtaining cross features according to some embodiments of the present description. In some embodiments, the process 300 may be implemented by the feature operation module 720 of the system 700 for recommending objects, or the processing device 110 shown in FIG. 1. As shown in fig. 3, the process 300 may include the following steps:
Step 302, performing at least one round of iterative processing based on the screening factor feature and the initial user behavior sequence feature to obtain the user behavior feature corresponding to each round of iteration.
Each iteration process is to perform one screening on at least one object related to the historical operation based on the characteristics of the screening factors. The calculation mode of each iteration is the same, but the calculated parameters are different, specifically see steps 312 and 322, that is, any iteration of the at least one iteration includes:
step 312, based on the weight matrix, computing the screening factor features and the user behavior sequence features corresponding to the current round, and determining attention parameters corresponding to each object in the initial user behavior sequence features; the weight matrix is derived based on training.
The user behavior sequence feature corresponding to the current round may be determined based on the number of rounds corresponding to the current round. In some embodiments, when the iteration process is a first round, the user behavior sequence feature corresponding to the current round is an initial user behavior sequence feature; otherwise, determining the user behavior sequence characteristics corresponding to the current round based on the user behavior sequence characteristics corresponding to the previous round and the updated user behavior sequence characteristics corresponding to the previous round.
In some embodiments, the determining the user behavior sequence feature corresponding to the current round based on the user behavior sequence feature corresponding to the previous round and the updated user behavior sequence feature corresponding to the previous round is specifically: and accumulating the user behavior sequence characteristics corresponding to the previous round, the updated user behavior sequence characteristics corresponding to the previous round and the parameter vectors according to the position to obtain the user behavior sequence characteristics corresponding to the current round. The parameter vectors are obtained through training, and the corresponding parameter vectors corresponding to different iterations can be different. For specific details of determining the updated user behavior sequence characteristics, reference may be made to step 322 and the related description thereof, which are not described herein again.
Specifically, if the current round is the j-th round, the feature operation module 720 may obtain the user behavior sequence feature corresponding to the j-th round based on the following formula (1):

$$h_l^{(j)} = \tilde{h}_l^{(j-1)} + h_l^{(j-1)} + b^{(j-1)} \qquad (1)$$

wherein $h_l^{(j)}$ represents the feature information of the $l$-th object in the user behavior sequence feature corresponding to the j-th round; $\tilde{h}_l^{(j-1)}$ represents the feature information of the $l$-th object in the updated user behavior sequence feature corresponding to the (j-1)-th round; $h_l^{(j-1)}$ represents the feature information of the $l$-th object in the user behavior sequence feature corresponding to the (j-1)-th round; and $b^{(j-1)}$ represents the parameter vector corresponding to the (j-1)-th round.
It is understood that the feature operation module 720 can combine the L vectors $h_l^{(j)}$ obtained by formula (1) into a new vector or matrix, thereby obtaining the feature information of all objects in the user behavior sequence feature corresponding to the j-th round, i.e., the user behavior sequence feature corresponding to the j-th round.
In some embodiments, the above formula (1) may be implemented by a residual error network, and specific details refer to fig. 5 and the related description thereof, which are not described herein again.
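By way of a non-limiting illustration, formula (1) can be sketched in numpy as follows, assuming the per-round sequence features are stored as $(L \times n_3 \times d)$ arrays and the parameter vector is a length-$d$ vector added by position (the shapes and random values are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(3)
L, n3, d = 3, 4, 8
H_prev = rng.normal(size=(L, n3, d))          # user behavior sequence feature of round j-1
H_prev_updated = rng.normal(size=(L, n3, d))  # updated user behavior sequence feature of round j-1
b_prev = rng.normal(size=(d,))                # trained parameter vector of round j-1 (shape assumed)

# Formula (1): accumulation "by position" (element-wise) for every object l.
H_cur = H_prev_updated + H_prev + b_prev      # user behavior sequence feature of round j
print(H_cur.shape)                            # (3, 4, 8)
```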
The weight matrix refers to parameters of the attention mechanism network and is obtained based on training. It can be understood that, when training, parameters of different rounds of iterative processing are different, and corresponding weight matrixes obtained by training may also be different. Further details of model training are provided with reference to fig. 5 and its associated description.
In some embodiments, the feature operation module 720 may operate on the screening factor feature and the user behavior sequence feature corresponding to the current round based on the weight matrix corresponding to the current round, and determine the attention parameter corresponding to each object in the initial user behavior sequence feature.
Specifically, if the current round is the j-th round, the feature operation module 720 may determine the attention parameter $a_l^{(j)}$ corresponding to the $l$-th object in the initial user behavior sequence feature based on the following formula (2):

$$a_l^{(j)} = h_l^{(j)} \otimes W^{(j)} \otimes S \qquad (2)$$

wherein $h_l^{(j)}$ represents the feature information of the $l$-th object in the user behavior sequence feature corresponding to the j-th round; $S$ represents the screening factor feature; $W^{(j)}$ represents the weight matrix of the j-th round; and $\otimes$ represents matrix multiplication. In some embodiments, $h_l^{(j)}$ may be an $(n_3 \times d)$ sparse vector (or matrix), $S$ may be an $(n_5 \times d)$ vector, and the weight matrix may be $(d \times n_5)$, so that the calculated attention parameter $a_l^{(j)}$ is a vector (or matrix) of the same dimensions as $h_l^{(j)}$.
It can be understood that the objects in the user behavior sequence feature corresponding to each round are the same; however, because the weight matrix and the user behavior sequence feature used in each round of iterative processing are different, the attention parameters calculated in different rounds may differ for the same object.
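For illustration only, formula (2) can be sketched in numpy with shapes matching the dimensions given above ($h_l$ is $(n_3 \times d)$, the weight matrix is $(d \times n_5)$, and the screening factor feature is $(n_5 \times d)$); the concrete sizes and random values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
L, n3, d, n5 = 3, 4, 8, 12
H_cur = rng.normal(size=(L, n3, d))   # user behavior sequence feature of the current round
W_j = rng.normal(size=(d, n5))        # trained weight matrix of round j
S = rng.normal(size=(n5, d))          # screening factor feature

# Formula (2): a_l = h_l (x) W_j (x) S, one (n3 x d) attention parameter per object.
A = np.stack([H_cur[l] @ W_j @ S for l in range(L)])
print(A.shape)                        # (3, 4, 8) -- same dimensions as H_cur
```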
Step 322, determining the updated user behavior sequence characteristics and the user behavior characteristics of the current round based on at least the attention parameter and the user behavior sequence characteristics corresponding to the current round.
The updated user behavior sequence feature refers to a user behavior sequence feature containing attention information. Specifically, the updated user behavior sequence feature determined in the current round is based on the user behavior sequence feature derived from the previous round, and includes the attention parameters determined in the current round. For the updated user behavior sequence feature corresponding to the first round, the attention parameters determined by the first round of processing are added on the basis of the initial user behavior sequence feature.
The user behavior characteristics are obtained by fusing the characteristics of at least one object related to the historical operation of the user based on the attention information. It can be understood that the user behavior characteristics cover information of all objects related to the user historical operation, and are an overall reflection of the user historical operation behavior. The user behavior characteristics corresponding to the current round are obtained by fusing the characteristics of at least one object related to the historical operation of the user based on the attention parameters obtained by the current round.
For specific details of determining the updated user behavior sequence characteristics and the user behavior characteristics of the current round based on at least the attention parameter and the user behavior sequence characteristics corresponding to the current round, please refer to fig. 4 and related descriptions thereof, which are not described herein again.
Step 304, fusing the user behavior features obtained by each round of iteration to obtain the cross feature.
In some embodiments, the manner of fusing the user behavior features obtained by each iteration may be vector splicing or bitwise accumulation of the user behavior features, that is, a vector obtained by splicing or bitwise accumulation of the user behavior features obtained by each iteration is used as a cross vector.
The user behavior characteristics corresponding to each round comprise the attention parameters of the corresponding round, the determined cross characteristics can reflect all attention information by fusing the user behavior characteristics corresponding to all rounds, and all the attention information jointly reflect the real intention of the user in historical operation. Thus, as described in step 204, the cross-feature can be taken as a representation of user behavior.
According to the foregoing, the current round of iteration is performed on the basis of the previous round; therefore, each round of iteration can be regarded as performing one screening of the user behavior sequence feature based on the screening factor. Moreover, compared with the previous round, the current round contains richer attention information and the screening is more thorough, i.e., the iterative process can be regarded as a process of strengthening the screening effect. Meanwhile, when the user behavior sequence feature corresponding to the current round is determined, not only the updated user behavior sequence feature corresponding to the previous round is used, but also the user behavior sequence feature corresponding to the previous round (i.e., the user behavior sequence feature that does not contain the attention information of the previous round) is used, which ensures the reinforcement of the screening while preventing excessive data loss and maintaining data fidelity.
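As an illustrative sketch of step 304, the per-round user behavior features can be fused either by vector splicing or by accumulation by position; here each round's user behavior feature is simply assumed to be a flattened vector of an assumed size:

```python
import numpy as np

rng = np.random.default_rng(5)
J, dim = 3, 16                                              # assumed: 3 rounds, feature size 16
round_features = [rng.normal(size=dim) for _ in range(J)]   # u^(1), ..., u^(J)

cross_by_concat = np.concatenate(round_features)            # vector splicing
cross_by_sum = np.sum(round_features, axis=0)               # accumulation by position
print(cross_by_concat.shape, cross_by_sum.shape)            # (48,) (16,)
```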
FIG. 4 is a flow diagram illustrating a determination of updated user behavior sequence characteristics and user behavior characteristics for a current round in accordance with some embodiments of the present description. In some embodiments, the process 400 may be implemented by the feature operation module 720 of the system 700 for recommending objects, or the processing device 110 shown in fig. 1. As shown in fig. 4, the process 400 may include the following steps:
and 402, multiplying each attention parameter by the feature information of the corresponding object in the user behavior sequence feature corresponding to the current wheel in a bit manner to obtain the updated user behavior sequence feature.
The bitwise multiplication is an inner product or dot multiplication in the matrix.
In some embodiments, the feature operation module 720 may multiply, bitwise, the attention parameter corresponding to at least one object in the initial user behavior sequence feature in the current round with the feature information of the corresponding object in the user behavior sequence feature corresponding to the current round, to obtain the updated user behavior sequence feature. In some embodiments, the attention parameter used for the bitwise multiplication may be the result after normalization, for example, normalization of the attention parameter by softmax.
Specifically, if the current round is the j-th round, the feature operation module 720 may determine the updated user behavior sequence feature of the j-th round based on the following formula (3):

$$\tilde{h}_l^{(j)} = \mathrm{softmax}(a_l^{(j)}) \odot h_l^{(j)} \qquad (3)$$

wherein $\tilde{h}_l^{(j)}$ represents the feature information of the $l$-th object in the updated user behavior sequence feature of the j-th round; $h_l^{(j)}$ represents the feature information of the $l$-th object in the user behavior sequence feature corresponding to the j-th round; $\odot$ represents bitwise multiplication; and $\mathrm{softmax}(\cdot)$ represents the normalization process. For $a_l^{(j)}$, which is an $(n_3 \times d)$ matrix, the $\mathrm{softmax}$ operation may be applied to each column of the matrix, so that each column of elements is normalized separately. It can be understood that the feature information of all objects in the updated user behavior sequence feature of the current round can be obtained based on formula (3), thereby obtaining the updated user behavior sequence feature corresponding to the current round.
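A minimal sketch of formula (3), applying the column-wise softmax described above followed by bitwise (element-wise) multiplication; shapes and values are assumptions:

```python
import numpy as np

def column_softmax(a):
    # Normalize each column of an (n3 x d) matrix separately, as described for formula (3).
    e = np.exp(a - a.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

rng = np.random.default_rng(6)
n3, d = 4, 8
a_l = rng.normal(size=(n3, d))   # attention parameter of the l-th object, round j
h_l = rng.normal(size=(n3, d))   # feature information of the l-th object, round j

# Formula (3): bitwise multiplication with the normalized attention parameter.
h_l_updated = column_softmax(a_l) * h_l
print(h_l_updated.shape)         # (4, 8)
```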
Step 404, performing matrix multiplication on each attention parameter and the feature information of the corresponding object in the user behavior sequence feature corresponding to the current round to obtain the user behavior feature corresponding to each object.
The matrix multiplication is the outer product or cross product of the matrix.
Specifically, if the current round is the j-th round, the feature operation module 720 may obtain the user behavior feature $u_l^{(j)}$ of the $l$-th object in the initial user behavior sequence feature based on the following formula (4):

$$u_l^{(j)} = a_l^{(j)} \otimes \left(h_l^{(j)}\right)^{T} \qquad (4)$$

wherein $a_l^{(j)}$ represents the attention parameter of the $l$-th object in the initial user behavior sequence feature corresponding to the j-th round; and $\left(h_l^{(j)}\right)^{T}$ represents the transpose of $h_l^{(j)}$.
Step 406, accumulating, by position, the user behavior features corresponding to the objects to obtain the user behavior feature of the current round.
Specifically, if the current round is the j-th round, the feature operation module 720 may obtain the user behavior feature $u^{(j)}$ of the j-th round based on the following formula (5):

$$u^{(j)} = \sum_{l=1}^{L} u_l^{(j)} \qquad (5)$$

wherein $L$ represents the number of objects contained in the initial user behavior sequence feature, and $u_l^{(j)}$ represents the user behavior feature of the $l$-th object in the initial user behavior sequence feature.
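For illustration, formulas (4) and (5) can be sketched together: a matrix multiplication with the transposed feature information per object, followed by accumulation by position over all L objects (shapes and values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
L, n3, d = 3, 4, 8
A = rng.normal(size=(L, n3, d))      # attention parameters a_l^(j) for the L objects
H_cur = rng.normal(size=(L, n3, d))  # feature information h_l^(j) for the L objects

# Formula (4): matrix multiplication per object, giving an (n3 x n3) user behavior feature.
U_per_object = [A[l] @ H_cur[l].T for l in range(L)]

# Formula (5): accumulate the per-object user behavior features by position.
u_round = np.sum(U_per_object, axis=0)
print(u_round.shape)                 # (4, 4)
```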
In some embodiments, all of the steps included in fig. 2-4 above may be implemented by a pre-trained recommendation model. FIG. 5 is a block diagram of a recommendation model in accordance with some embodiments of the present description. As shown in fig. 5, the recommendation model 500 may include a coding network 5001, a feature crossing network 5002, and a multi-layer perceptron 5003.
In some embodiments, the encoding network 5001 may encode (e.g., embed) the candidate object features, the user features, the environment features, and the initial user behavior sequence features obtained by the feature obtaining module 710 to obtain respective vectors. E.g. respective dense vectors.
In some embodiments, the feature crossing network 5002 may operate on the vector of the screening factor feature and the vector of the initial user behavior sequence feature to obtain the cross feature (i.e., the user behavior representation), i.e., implement formulas (1)-(5). The specific structure of the feature crossing network is shown in fig. 6 and its related description, which are not repeated here.
In some embodiments, a multi-layered perceptron (MLP) may process candidate object features, user features, and cross-features to determine a recommendation score for a candidate object relative to a user.
In some embodiments, the recommendation model may be obtained through end-to-end training based on multiple sets of training samples. Each set of training samples includes at least: a sample user feature, a sample candidate object feature, and a sample initial user behavior sequence feature. In some embodiments, each set of training samples further includes a sample environmental feature. The label of each set of training samples represents whether the corresponding sample candidate object is recommended (e.g., 1 for recommended, 0 for not recommended), or the recommendation score. In some embodiments, the sample candidate object may be an object that the platform has recommended to the user, and accordingly, the label of the sample may represent whether the user purchased it. In either case, whether a recommendation label or a purchase label, the label can be acquired from the platform and annotated automatically. A loss function may be constructed based on the difference between the model's predicted score and the label. During training, the parameters of the model are continuously adjusted until the loss function meets a preset condition (e.g., convergence, or a value smaller than a threshold).
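As a purely illustrative skeleton of such end-to-end training with a binary recommended/not-recommended label, a cross-entropy-style loss can be used; the `model.predict` and `model.update` calls are assumed stand-ins for the recommendation model's forward pass and parameter adjustment, which the text leaves abstract:

```python
import numpy as np

def binary_cross_entropy(pred, label, eps=1e-7):
    # Loss built from the difference between the model's predicted score and the label.
    pred = np.clip(pred, eps, 1 - eps)
    return -(label * np.log(pred) + (1 - label) * np.log(1 - pred))

def train(model, samples, max_epochs=10, tol=1e-4):
    # samples: iterable of (user_f, candidate_f, behavior_seq_f, env_f, label) tuples.
    prev_loss = float("inf")
    for _ in range(max_epochs):
        losses = []
        for user_f, candidate_f, behavior_seq_f, env_f, label in samples:
            score = model.predict(user_f, candidate_f, behavior_seq_f, env_f)  # assumed interface
            loss = binary_cross_entropy(score, label)
            model.update(loss)                         # adjust the model parameters
            losses.append(loss)
        epoch_loss = float(np.mean(losses))
        if abs(prev_loss - epoch_loss) < tol:          # preset condition, e.g. convergence
            break
        prev_loss = epoch_loss
```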
Fig. 6 is a schematic diagram of a structure of a feature crossing network shown in accordance with some embodiments of the present description. As shown in fig. 6, the feature crossing network 600 includes: a multi-layer crossing network 6001 and a user behavior representation network 6002.
The multi-layer crossing network 6001 is used to implement the multiple rounds of iterative processing described in fig. 3. That is, the input of the multi-layer crossing network is the initial user behavior sequence feature and the screening factor feature, and the output is the cross feature. Each layer of the multi-layer crossing network corresponds to one round of iteration, and the number of layers of the crossing network corresponds to the number of iterations. The input of each layer of the crossing network at least comprises: the screening factor feature and the user behavior sequence feature corresponding to the current layer (i.e., the user behavior sequence feature corresponding to the current round); the output at least comprises: the user behavior sequence feature corresponding to the next layer and the user behavior feature corresponding to the current layer (i.e., the user behavior feature corresponding to the current round).
Each layer of the crossing network may include an attention mechanism network 6011, a feature update network 6021, and a residual network 6031. The attention mechanism network is used to determine the attention parameter corresponding to at least one object in the initial user behavior sequence feature, i.e., to implement the algorithm of formula (2); the weight matrix is a parameter of the trained attention mechanism network, and the parameters of the attention mechanism networks of different layers are different. The feature update network is used to determine, based on the attention parameters and the user behavior sequence feature corresponding to the current layer, the user behavior feature of the current layer and the updated user behavior sequence feature, i.e., to implement the algorithms of formulas (3), (4), and (5). The residual network is used to determine the user behavior sequence feature corresponding to the next layer based on the updated user behavior sequence feature corresponding to the current layer and the user behavior sequence feature of the current layer, i.e., to implement the algorithm of formula (1); the parameter vector is a parameter of the trained residual network, and the parameters of the residual networks of different layers are different.
As shown in fig. 6, for the multi-layer crossing network, the user behavior sequence feature output by the residual network in the (j-1)-th layer of the crossing network is the user behavior sequence feature input to the j-th layer of the crossing network (i.e., the user behavior sequence feature corresponding to the j-th layer). Within each layer of the crossing network, the attention parameter output by the attention mechanism network is part of the input of the feature update network, and the updated user behavior sequence feature output by the feature update network is part of the input of the residual network.
The user behavior representation network 6002 is configured to fuse the user behavior features corresponding to each layer to obtain the cross feature (i.e., the user behavior representation). That is, the input of the user behavior representation network is the user behavior features output by each layer of the crossing network, and its output is the cross feature.
It should be noted that the above description of the model and its components does not limit the scope of the present description to the illustrated embodiments. It will be appreciated by those skilled in the art that, having the benefit of the teachings of this system, adjustments can be made to the components of the model without departing from such teachings. For example, the residual network in each layer of the crossing network may be moved to the next layer, that is, the current layer of the crossing network may determine the user behavior sequence feature corresponding to the current layer based on the updated user behavior sequence feature corresponding to the previous layer and the user behavior sequence feature of the previous layer.
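To tie the pieces together, the following compact numpy sketch chains formulas (1)-(5) into one forward pass of a multi-layer crossing network; the number of layers, all shapes, the random parameters, and the concatenation-based fusion at the end are assumptions consistent with the examples above, not the trained network itself:

```python
import numpy as np

def column_softmax(a):
    e = np.exp(a - a.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def feature_crossing_forward(H0, S, weights, param_vectors):
    """H0: (L, n3, d) initial user behavior sequence feature; S: (n5, d) screening factor feature;
    weights: per-layer (d, n5) matrices; param_vectors: per-layer (d,) residual parameter vectors."""
    H, round_features = H0, []
    for W, b in zip(weights, param_vectors):
        A = np.stack([H[l] @ W @ S for l in range(H.shape[0])])                     # formula (2)
        H_upd = np.stack([column_softmax(A[l]) * H[l] for l in range(H.shape[0])])  # formula (3)
        u = sum(A[l] @ H[l].T for l in range(H.shape[0]))                           # formulas (4)-(5)
        round_features.append(u.ravel())
        H = H_upd + H + b                      # formula (1): sequence feature for the next layer
    return np.concatenate(round_features)      # fused cross feature (user behavior representation)

rng = np.random.default_rng(8)
L, n3, d, n5, layers = 3, 4, 8, 12, 2
cross = feature_crossing_forward(
    rng.normal(size=(L, n3, d)), rng.normal(size=(n5, d)),
    [rng.normal(size=(d, n5)) for _ in range(layers)],
    [rng.normal(size=(d,)) for _ in range(layers)],
)
print(cross.shape)  # (layers * n3 * n3,) = (32,)
```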
FIG. 7 is a block diagram of a system for recommending objects, shown in some embodiments of the present description. As shown in fig. 7, the system 700 for recommending an object may include a feature obtaining module 710, a feature operation module 720, and a recommendation score determining module 730.
The feature obtaining module 710 may be configured to obtain a candidate object feature, a user feature, and an initial user behavior sequence feature; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation. In some embodiments, the user features include: a combination of one or more of gender, age, occupation, income, current location, and residence; the object comprises a commodity, and the candidate object features comprise: category, merchandise identification, price, production related information, sales related information, or any combination thereof. In some embodiments, the feature obtaining module 710 may be further configured to obtain environmental features, the environmental features including: seasonal information and/or climate information.
The feature operation module 720 may be configured to process the screening factor feature and the initial user behavior sequence feature to obtain a cross feature, where the screening factor feature includes the user feature, and the cross feature includes attention information of the user to the at least one object related to the user's historical operations. In some embodiments, the screening factor feature further includes at least one of the environmental feature and the candidate object feature. In some embodiments, the screening factor feature is a feature obtained by fusing the user feature with at least one of the environmental feature and the candidate object feature.
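As a rough illustration of the fusion mentioned in the last sentence, the sketch below simply concatenates already-embedded user, environmental, and candidate object features into one screening factor feature; the concrete fusion operator and the dimensions are assumptions, not something fixed by this paragraph.

```python
import numpy as np

def screening_factor_feature(user_feat, env_feat=None, cand_feat=None):
    """Fuse the user feature with optional environmental and candidate object
    features into one screening factor feature. Concatenation is used purely
    for illustration; the fusion operator is not fixed by this paragraph."""
    parts = [user_feat] + [f for f in (env_feat, cand_feat) if f is not None]
    return np.concatenate(parts)

rng = np.random.default_rng(1)
user_feat = rng.normal(size=8)   # e.g. embedded gender/age/occupation/... fields
env_feat = rng.normal(size=4)    # e.g. embedded season/climate information
cand_feat = rng.normal(size=8)   # e.g. embedded category/price/... of the candidate
print(screening_factor_feature(user_feat, env_feat, cand_feat).shape)  # (20,)
```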
In some embodiments, the feature operation module 720 may be configured to process the screening factor feature and the initial user behavior sequence feature by using a feature cross network in a recommendation model to obtain the cross feature. In some embodiments, the feature operation module 720 may be configured to perform at least one round of iterative processing, where the iterative processing includes: operating on the screening factor feature and the user behavior sequence feature corresponding to the current round based on a weight matrix to determine an attention parameter corresponding to each object in the initial user behavior sequence feature, the weight matrix being obtained through training; determining an updated user behavior sequence feature and a user behavior feature of the current round at least based on the attention parameters and the user behavior sequence feature corresponding to the current round, where, when the iterative processing is the first round, the user behavior sequence feature corresponding to the current round is the initial user behavior sequence feature, and otherwise the user behavior sequence feature corresponding to the current round is determined based on the user behavior sequence feature corresponding to the previous round and the updated user behavior sequence feature corresponding to the previous round; and fusing the user behavior features obtained in each round of iterative processing to obtain the cross feature.
In some embodiments, each attention parameter is a matrix with the same dimensions as the feature information of the corresponding object in the user behavior sequence feature corresponding to the current round. To determine the updated user behavior sequence feature and the user behavior feature of the current round based at least on the attention parameters and the user behavior sequence feature corresponding to the current round, the feature operation module may be configured to: multiply each attention parameter element-wise by the feature information of the corresponding object in the user behavior sequence feature corresponding to the current round to obtain the updated user behavior sequence feature; perform matrix multiplication of each attention parameter with the feature information of the corresponding object in the user behavior sequence feature corresponding to the current round to obtain the user behavior feature corresponding to each object; and accumulate, element-wise, the user behavior features corresponding to the respective objects to obtain the user behavior feature of the current round.
In some embodiments, to obtain the user behavior sequence feature corresponding to the current round based at least on the updated user behavior sequence feature determined in the previous round of iterative processing, the feature operation module may be configured to accumulate, element-wise, the user behavior sequence feature corresponding to the previous round, the updated user behavior sequence feature determined in the previous round of iterative processing, and a parameter vector to obtain the user behavior sequence feature corresponding to the current round; the parameter vector is obtained through training.
The recommendation score determining module 730 may be configured to process the candidate object feature, the user feature, and the cross feature to determine a recommendation score of the candidate object relative to the user. In some embodiments, the recommendation score determining module 730 may be configured to process the candidate object feature, the user feature, and the cross feature using a multilayer perceptron of the recommendation model to determine a recommendation score of the candidate commodity relative to the user.
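The following sketch shows, under assumed layer sizes and with random stand-ins for trained parameters, how a multilayer perceptron head could map the concatenated candidate object feature, user feature, and cross feature to a recommendation score in (0, 1); it is a hypothetical stand-in, not the recommendation model's actual architecture.

```python
import numpy as np

def mlp_score(candidate_feat, user_feat, cross_feat, weights):
    """Hypothetical multilayer perceptron head: concatenate the three feature
    vectors and map them to a recommendation score in (0, 1)."""
    h = np.concatenate([candidate_feat, user_feat, cross_feat])
    for W, b in weights[:-1]:
        h = np.maximum(h @ W + b, 0.0)       # ReLU hidden layers
    W, b = weights[-1]
    logit = (h @ W + b)[0]
    return 1.0 / (1.0 + np.exp(-logit))      # sigmoid -> recommendation score

rng = np.random.default_rng(2)
dims = [24, 16, 1]                           # illustrative layer sizes (24 = 8 + 8 + 8)
weights = [(rng.normal(size=(dims[i], dims[i + 1])), rng.normal(size=dims[i + 1]))
           for i in range(len(dims) - 1)]
score = mlp_score(rng.normal(size=8), rng.normal(size=8), rng.normal(size=8), weights)
print(score)  # a value in (0, 1)
```

In a deployed recommender, the resulting score would be computed for each candidate object, and the candidates would be ranked (or thresholded) by it.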
FIG. 8 is a block diagram of a representation system for serialized user behavior according to some embodiments of the present specification. As shown in FIG. 8, the representation system 800 for serialized user behavior may include a feature obtaining module 710, a feature operation module 720, and a representation determining module 810.
For the feature obtaining module 710 and the feature operation module 720, refer to the description of FIG. 7; details are not repeated here.
The representation determining module 810 may be configured to determine the cross feature as a user behavior representation.
It should be understood that the systems shown in FIGS. 7 and 8 and their modules may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, CD- or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules in this specification may be implemented not only by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field programmable gate arrays and programmable logic devices, but also, for example, by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above descriptions of the system 700 for recommending an object and its modules, and of the representation system 800 for serialized user behavior and its modules, are only for convenience of description and do not limit the present specification to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, the modules may be combined in any manner or connected to other modules as sub-systems without departing from such teachings. For example, the feature obtaining module 710, the feature operation module 720, and the recommendation score determining module 730 disclosed in FIG. 7 may be different modules in one system, or two or more of them may be implemented as a single module that performs their combined functions. For another example, in the system 700 for recommending an object, the modules may share one storage module, or each module may have its own storage module. Such variations are within the scope of the present specification.
An embodiment of the present specification further provides a device for recommending an object, which includes a processor and a memory; the memory is configured to store instructions, and the processor is configured to execute the instructions to implement operations corresponding to the method for recommending an object described in any of the foregoing embodiments.
An embodiment of the present specification further provides a representation device for serialized user behavior, which includes a processor and a memory; the memory is configured to store instructions, and the processor is configured to execute the instructions to implement operations corresponding to the representation method of serialized user behavior described above.
The beneficial effects that may be brought by the embodiments of the present specification include, but are not limited to: (1) the initial user behavior sequence feature is screened by the screening factor feature, so that features of objects the user is not interested in are filtered out, the user's historical behaviors are described as accurately as possible, and an accurate user behavior representation (i.e., the cross feature) is obtained; (2) through multiple rounds of iterative processing, the user behavior sequence feature can be sufficiently denoised, so that, in the determined user behavior representation, the features of objects for which the user has no real intention are reduced as much as possible; (3) whether the candidate object is recommended to the user is determined based on the accurate user behavior representation, so that the influence of objects the user is not interested in, or only weakly interested in, on the recommendation result can be reduced, and the accuracy of object recommendation can be improved; (4) a residual network is introduced, through which the information of the multiple layers of the cross network can be superimposed, so that previously screened important features can be carried into subsequent feature generation, yielding a user behavior representation that better reflects the user's real intention. It should be noted that different embodiments may produce different advantages; in different embodiments, any one or a combination of the above advantages, or any other advantages, may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, this specification uses specific words to describe its embodiments. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with that embodiment is included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places throughout this specification do not necessarily all refer to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present specification may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufactures, or materials, or any new and useful improvement thereof. Accordingly, aspects of this specification may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present specification may be embodied as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of this specification may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python, conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in each claim. Rather, claimed subject matter may lie in less than all features of a single foregoing disclosed embodiment.
Numerals describing quantities of components, attributes, and the like are used in some embodiments. It should be understood that such numerals used in the description of the embodiments are modified in some instances by the qualifier "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the stated number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and adopt a general method of retaining significant digits. Notwithstanding that the numerical ranges and parameters used to define the broad scope in some embodiments of this specification are approximations, in specific embodiments such numerical values are set forth as precisely as practicable.
Each patent, patent application, patent application publication, and other material, such as articles, books, specifications, publications, documents, and the like, cited in this specification is hereby incorporated by reference in its entirety, except for any prosecution history that is inconsistent with or in conflict with the contents of this specification, and except for any such material that may have a limiting effect on the broadest scope of the claims associated with this specification (whether presently appended or appended later). It should be understood that if there is any inconsistency or conflict between the descriptions, definitions, and/or use of terms in the materials accompanying this specification and the contents of this specification, the descriptions, definitions, and/or use of terms in this specification shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (26)

1. A method of recommending an object, comprising:
obtaining candidate object characteristics, user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation;
processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation;
and processing the candidate object feature, the user feature and the cross feature to determine a recommendation score of the candidate object relative to the user.
2. The method of claim 1, further comprising obtaining environmental characteristics, the environmental characteristics comprising: seasonal information and/or climate information;
the screening factor feature further includes at least one of the environmental feature and the candidate object feature.
3. The method according to claim 2, wherein the screening factor feature is a feature obtained by fusing the user feature with at least one of the environmental feature and the candidate object feature.
4. The method of claim 1, wherein the processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics comprises:
and processing the screening factor characteristics and the initial user behavior sequence characteristics by using a characteristic cross network in a recommendation model to obtain cross characteristics.
5. The method of claim 1 or 4, wherein the processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics comprises:
performing at least one iteration, wherein the iteration comprises:
based on a weight matrix, operating on the screening factor characteristics and the user behavior sequence characteristics corresponding to the current round, and determining attention parameters corresponding to each object in the initial user behavior sequence characteristics; the weight matrix is obtained based on training;
determining updated user behavior sequence characteristics and user behavior characteristics of the current round at least based on the attention parameters and the user behavior sequence characteristics corresponding to the current round; when the iterative processing is the first round, the user behavior sequence feature corresponding to the current round is the initial user behavior sequence feature; otherwise, determining the user behavior sequence characteristics corresponding to the current round based on the user behavior sequence characteristics corresponding to the previous round and the updated user behavior sequence characteristics corresponding to the previous round;
and fusing the user behavior characteristics obtained by each iteration to obtain the cross characteristics.
6. The method of claim 5, wherein the attention parameter is a matrix having the same dimensions as the feature information of the corresponding object in the user behavior sequence characteristics corresponding to the current round, and the determining the updated user behavior sequence characteristics and the user behavior characteristics of the current round based at least on the attention parameters and the user behavior sequence characteristics corresponding to the current round comprises:
multiplying each attention parameter element-wise by the feature information of the corresponding object in the user behavior sequence characteristics corresponding to the current round to obtain the updated user behavior sequence characteristics;
performing matrix multiplication of each attention parameter with the feature information of the corresponding object in the user behavior sequence characteristics corresponding to the current round to obtain the user behavior characteristics corresponding to each object;
and accumulating, element-wise, the user behavior characteristics corresponding to the respective objects to obtain the user behavior characteristics of the current round.
7. The method of claim 5, wherein determining the user behavior sequence feature corresponding to the current round based on the user behavior sequence feature corresponding to the previous round and the updated user behavior sequence feature corresponding to the previous round further comprises:
accumulating, element-wise, the user behavior sequence characteristics corresponding to the previous round, the updated user behavior sequence characteristics corresponding to the previous round, and a parameter vector to obtain the user behavior sequence characteristics corresponding to the current round; the parameter vector is obtained through training.
8. The method of claim 1, wherein said processing the candidate object features, the user features, and the cross features to determine a recommendation score of the candidate object relative to the user comprises:
and processing the candidate object characteristics, the user characteristics and the cross characteristics by using a multilayer perceptron in a recommendation model, and determining the recommendation score of the candidate commodity relative to the user.
9. The method of claim 1, the user characteristics comprising: a combination of one or more of gender, age, occupation, income, current location, and residence; the object comprises a commodity, and the candidate object features comprise: a combination of one or more of a category, an item identifier, a price, production related information, and sales related information.
10. A system for recommending objects, comprising:
a feature acquisition module used for acquiring candidate object characteristics, user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation;
a feature operation module used for processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation;
and a recommendation score determining module used for processing the candidate object characteristics, the user characteristics and the cross characteristics and determining a recommendation score of the candidate object relative to the user.
11. The system of claim 10, wherein the feature acquisition module is further configured to acquire environmental characteristics, the environmental characteristics comprising: seasonal information and/or climate information;
the screening factor feature further includes at least one of the environmental feature and the candidate object feature.
12. The system according to claim 11, wherein the screening factor features are features obtained by fusing the user features with at least one of the environmental features and the candidate object features.
13. The system of claim 10, wherein the feature operation module is configured to:
and processing the screening factor characteristics and the initial user behavior sequence characteristics by using a characteristic cross network in a recommendation model to obtain cross characteristics.
14. The system of claim 10 or 13, wherein the feature operation module is configured to:
performing at least one iteration, wherein the iteration comprises:
based on a weight matrix, operating on the screening factor characteristics and the user behavior sequence characteristics corresponding to the current round, and determining attention parameters corresponding to each object in the initial user behavior sequence characteristics; the weight matrix is obtained based on training;
determining updated user behavior sequence characteristics and user behavior characteristics of the current round at least based on the attention parameters and the user behavior sequence characteristics corresponding to the current round;
when the iteration processing is a first round, the user behavior sequence feature corresponding to the current round is the initial user behavior sequence feature; otherwise, determining the user behavior sequence characteristics corresponding to the current round based on the user behavior sequence characteristics corresponding to the previous round and the updated user behavior sequence characteristics corresponding to the previous round;
and fusing the user behavior characteristics obtained by each iteration to obtain the cross characteristics.
15. The system of claim 14, wherein the attention parameter is a matrix having the same dimensions as the feature information of the corresponding object in the user behavior sequence characteristics corresponding to the current round;
in order to determine an updated user behavior sequence feature and a user behavior feature of the current round based on at least the attention parameter and the user behavior sequence feature corresponding to the current round, the feature operation module is configured to:
multiplying each attention parameter element-wise by the feature information of the corresponding object in the user behavior sequence characteristics corresponding to the current round to obtain the updated user behavior sequence characteristics;
performing matrix multiplication of each attention parameter with the feature information of the corresponding object in the user behavior sequence characteristics corresponding to the current round to obtain the user behavior characteristics corresponding to each object;
and accumulating, element-wise, the user behavior characteristics corresponding to the respective objects to obtain the user behavior characteristics of the current round.
16. The system of claim 14, wherein to obtain the user behavior sequence characteristics corresponding to the current round based at least on the updated user behavior sequence characteristics determined by the previous round of iterative processing, the feature operation module is configured to:
accumulating, element-wise, the user behavior sequence characteristics corresponding to the previous round, the updated user behavior sequence characteristics determined by the previous round of iterative processing, and a parameter vector to obtain the user behavior sequence characteristics corresponding to the current round; the parameter vector is obtained through training.
17. The system of claim 10, wherein the recommendation score determining module is configured to:
and processing the candidate object characteristics, the user characteristics and the cross characteristics by using a multilayer perceptron in a recommendation model, and determining the recommendation score of the candidate commodity relative to the user.
18. The system of claim 10, the user characteristics comprising: a combination of one or more of gender, age, occupation, income, current location, and residence; the object comprises a commodity, and the candidate object features comprise: a combination of one or more of a category, an item identifier, a price, production related information, and sales related information.
19. An apparatus for recommending an object, the apparatus comprising a processor and a memory; the memory is configured to store instructions, and the processor is configured to execute the instructions to implement operations corresponding to the method for recommending an object according to any one of claims 1 to 9.
20. A representation method of serialized user behavior, comprising:
acquiring user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation;
processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation;
determining the cross feature as a user behavior representation.
21. The method of claim 20, wherein the processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics comprises:
and processing the screening factor characteristics and the initial user behavior sequence characteristics by using a characteristic cross network to obtain cross characteristics.
22. The method of claim 20 or 21, wherein the processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics comprises:
performing at least one iteration, wherein the iteration comprises:
based on a weight matrix, operating on the screening factor characteristics and the user behavior sequence characteristics corresponding to the current round, and determining attention parameters corresponding to each object in the initial user behavior sequence characteristics; the weight matrix is obtained based on training;
determining updated user behavior sequence characteristics and user behavior characteristics of the current round at least based on the attention parameters and the user behavior sequence characteristics corresponding to the current round;
when the iterative processing is the first round, the user behavior sequence feature corresponding to the current round is the initial user behavior sequence feature; otherwise, determining the user behavior sequence characteristics corresponding to the current round based on the user behavior sequence characteristics corresponding to the previous round and the updated user behavior sequence characteristics corresponding to the previous round;
and fusing the user behavior characteristics obtained by each iteration to obtain the cross characteristics.
23. A representation system for serialized user behavior, comprising:
a feature acquisition module used for acquiring user characteristics and initial user behavior sequence characteristics; the initial user behavior sequence feature comprises feature information of at least one object related to the user historical operation;
a feature operation module used for processing the screening factor characteristics and the initial user behavior sequence characteristics to obtain cross characteristics, wherein the screening factor characteristics comprise the user characteristics, and the cross characteristics comprise attention information of the user to at least one object related to the user historical operation;
and a representation determining module used for determining the cross characteristics as a user behavior representation.
24. The system of claim 23, wherein the feature operation module is configured to:
and processing the screening factor characteristics and the initial user behavior sequence characteristics by using a characteristic cross network to obtain cross characteristics.
25. The system of claim 23 or 24, wherein the feature operation module is configured to:
performing at least one iteration, wherein the iteration comprises:
based on a weight matrix, operating on the screening factor characteristics and the user behavior sequence characteristics corresponding to the current round, and determining attention parameters corresponding to each object in the initial user behavior sequence characteristics; the weight matrix is obtained based on training;
determining updated user behavior sequence characteristics and user behavior characteristics of the current round at least based on the attention parameters and the user behavior sequence characteristics corresponding to the current round;
when the iteration processing is a first round, the user behavior sequence feature corresponding to the current round is the initial user behavior sequence feature; otherwise, determining the user behavior sequence characteristics corresponding to the current round based on the user behavior sequence characteristics corresponding to the previous round and the updated user behavior sequence characteristics corresponding to the previous round;
and fusing the user behavior characteristics obtained by each iteration to obtain the cross characteristics.
26. A representation apparatus for serialized user behavior, the apparatus comprising a processor and a memory; the memory is configured to store instructions, and the processor is configured to execute the instructions to implement operations corresponding to the representation method of serialized user behavior according to any one of claims 20 to 22.
CN202010757535.9A 2020-07-31 2020-07-31 Method and system for recommending object Pending CN111738780A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010757535.9A CN111738780A (en) 2020-07-31 2020-07-31 Method and system for recommending object

Publications (1)

Publication Number Publication Date
CN111738780A true CN111738780A (en) 2020-10-02

Family

ID=72656802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010757535.9A Pending CN111738780A (en) 2020-07-31 2020-07-31 Method and system for recommending object

Country Status (1)

Country Link
CN (1) CN111738780A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015034850A2 (en) * 2013-09-06 2015-03-12 Microsoft Corporation Feature selection for recommender systems
CN109509054A (en) * 2018-09-30 2019-03-22 平安科技(深圳)有限公司 Method of Commodity Recommendation, electronic device and storage medium under mass data
CN111339415A (en) * 2020-02-25 2020-06-26 中国科学技术大学 Click rate prediction method and device based on multi-interactive attention network

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112070181A (en) * 2020-11-16 2020-12-11 深圳市华汉伟业科技有限公司 Image stream-based cooperative detection method and device and storage medium
CN112070181B (en) * 2020-11-16 2021-02-19 深圳市华汉伟业科技有限公司 Image stream-based cooperative detection method and device and storage medium
CN112434173A (en) * 2021-01-26 2021-03-02 浙江口碑网络技术有限公司 Search content output method and device, computer equipment and readable storage medium
CN112837095A (en) * 2021-02-01 2021-05-25 支付宝(杭州)信息技术有限公司 Object distribution method and system
CN112785390A (en) * 2021-02-02 2021-05-11 微民保险代理有限公司 Recommendation processing method and device, terminal device and storage medium
CN112785390B (en) * 2021-02-02 2024-02-09 微民保险代理有限公司 Recommendation processing method, device, terminal equipment and storage medium
CN112712418A (en) * 2021-03-25 2021-04-27 腾讯科技(深圳)有限公司 Method and device for determining recommended commodity information, storage medium and electronic equipment
CN114491283A (en) * 2022-04-02 2022-05-13 浙江口碑网络技术有限公司 Object recommendation method and device and electronic equipment
CN114491283B (en) * 2022-04-02 2022-07-22 浙江口碑网络技术有限公司 Object recommendation method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN111738780A (en) Method and system for recommending object
CN110717098B (en) Meta-path-based context-aware user modeling method and sequence recommendation method
CN106651542B (en) Article recommendation method and device
US10354184B1 (en) Joint modeling of user behavior
CN112487278A (en) Training method of recommendation model, and method and device for predicting selection probability
Miao et al. Context‐based dynamic pricing with online clustering
CN110008397B (en) Recommendation model training method and device
CN110413888B (en) Book recommendation method and device
CN107786943A (en) A kind of tenant group method and computing device
CN111950593A (en) Method and device for recommending model training
CN109117442B (en) Application recommendation method and device
CN112529663A (en) Commodity recommendation method and device, terminal equipment and storage medium
CN113191838A (en) Shopping recommendation method and system based on heterogeneous graph neural network
CN112818218A (en) Information recommendation method and device, terminal equipment and computer readable storage medium
CN111582932A (en) Inter-scene information pushing method and device, computer equipment and storage medium
CN110866191A (en) Recommendation recall method, apparatus and storage medium
CN111680213B (en) Information recommendation method, data processing method and device
CN117217284A (en) Data processing method and device
CN115760271A (en) Electromechanical commodity personalized recommendation method and system based on graph neural network
US11210624B1 (en) Method and system for determining quantity predictions
CN114880566A (en) User behavior analysis method, device, equipment and medium based on graph neural network
WO2023050143A1 (en) Recommendation model training method and apparatus
CN112785390B (en) Recommendation processing method, device, terminal equipment and storage medium
CN110930223A (en) Recommendation recall method, device and storage medium based on field-aware factorization machine
CN110807693A (en) Album recommendation method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20201002