CN114579860A - User behavior portrait generation method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114579860A
Authority
CN
China
Prior art keywords
behavior
matrix
user
portrait
joint
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210214272.6A
Other languages
Chinese (zh)
Other versions
CN114579860B (en)
Inventor
成杰峰
杨晓月
彭奕
蒋佳峻
李杨
丁琴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Life Insurance Company of China Ltd
Original Assignee
Ping An Life Insurance Company of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Life Insurance Company of China Ltd filed Critical Ping An Life Insurance Company of China Ltd
Priority to CN202210214272.6A priority Critical patent/CN114579860B/en
Publication of CN114579860A publication Critical patent/CN114579860A/en
Application granted granted Critical
Publication of CN114579860B publication Critical patent/CN114579860B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Mathematics (AREA)
  • Algebra (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses a user behavior portrait generation method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: performing data extraction on user behavior data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix; performing matrix decomposition on each of the single behavior matrix, the joint behavior matrix and the non-specific behavior matrix, and constructing a user portrait from the resulting single behavior feature group, joint behavior feature group and non-specific behavior feature group, respectively, to obtain a single behavior portrait, a joint behavior portrait and a non-specific behavior portrait; inputting the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait into a weight determination model, and determining a first weight, a second weight and a third weight; and performing weighted summation on the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait according to the first weight, the second weight and the third weight to obtain the user behavior portrait of the user.

Description

User behavior portrait generation method and device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a user behavior portrait generation method and device, electronic equipment and a storage medium.
Background
The core of user portrait technology lies in tagging users with labels; at present, a label is usually a highly refined feature abstracted from human attributes, such as age, region, or interest. User portrait technology describes the overall appearance of a user by characterizing the user with labels from different dimensions. In the business field, user portrait technology can effectively predict and infer a user's needs and the content the user is interested in, so that corresponding products or services can be delivered to the user accordingly, improving the conversion rate of real customers; it can also provide auxiliary support for business activity planning and marketing decisions.
However, the user behavior portraits generated by current user portrait technology are not fine-grained enough: they can only roughly reflect in which areas a user is more active, but not which factors make the user active in those areas. As a result, the products and services delivered to the user cannot accurately match the user's real needs and preferences, the effective delivery rate is low, the conversion rate of real customers is low, and the delivery cost increases.
Disclosure of Invention
In order to solve the above problems in the prior art, embodiments of the present application provide a user behavior portrait generation method, apparatus, electronic device, and storage medium, which can generate a user behavior portrait with finer granularity and thus more accurately infer the needs and preferences of a user.
In a first aspect, an embodiment of the present application provides a user behavior representation generation method, including:
performing data extraction on the user behavior data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix;
performing matrix decomposition on the single behavior matrix to obtain a single behavior feature group, and constructing a user portrait according to the single behavior feature group to obtain a single behavior portrait;
performing matrix decomposition on the joint behavior matrix to obtain a joint behavior feature group, and constructing a user portrait according to the joint behavior feature group to obtain a joint behavior portrait;
performing matrix decomposition on the nonspecific behavior matrix to obtain a nonspecific behavior feature group, and constructing a user portrait according to the nonspecific behavior feature group to obtain a nonspecific behavior portrait;
inputting the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait into a weight determination model, and determining a first weight corresponding to the single behavior portrait, a second weight corresponding to the joint behavior portrait and a third weight corresponding to the non-specific behavior portrait;
and according to the first weight, the second weight and the third weight, carrying out weighted summation processing on the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait to obtain a user behavior portrait of the user.
In a second aspect, an embodiment of the present application provides a user behavior representation generating apparatus, including:
the extraction module is used for performing data extraction on the user behavior data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix;
the decomposition module is used for performing matrix decomposition on the single behavior matrix to obtain a single behavior feature group, constructing a user portrait according to the single behavior feature group to obtain a single behavior portrait, performing matrix decomposition on the joint behavior matrix to obtain a joint behavior feature group, constructing a user portrait according to the joint behavior feature group to obtain a joint behavior portrait, performing matrix decomposition on the non-specific behavior matrix to obtain a non-specific behavior feature group, and constructing a user portrait according to the non-specific behavior feature group to obtain a non-specific behavior portrait;
the weight determining module is used for inputting the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait into the weight determination model, and determining a first weight corresponding to the single behavior portrait, a second weight corresponding to the joint behavior portrait and a third weight corresponding to the non-specific behavior portrait;
and the portrait generation module is used for carrying out weighted summation processing on the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait according to the first weight, the second weight and the third weight so as to obtain a user behavior portrait of the user.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor coupled to a memory for storing a computer program, the processor being configured to execute the computer program stored in the memory to cause the electronic device to perform the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored thereon, the computer program causing a computer to perform the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method according to the first aspect.
The implementation of the embodiment of the application has the following beneficial effects:
in the embodiments of the present application, user behavior is divided into single behaviors, joint behaviors and non-specific behaviors, so that the user behavior data is further refined. Matrix decomposition is then performed on the single behavior matrix, the joint behavior matrix and the non-specific behavior matrix respectively, and a single behavior portrait, a joint behavior portrait and a non-specific behavior portrait are constructed from the results. Finally, the respective weights of the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait are determined through a weight determination model, and the three portraits are combined by weighted summation to obtain the final user behavior portrait. The generated user behavior portrait therefore has a finer granularity: it reflects not only how active the user is in each field or topic, but also why the user is active there. On this basis, the user's preferences can be inferred accurately, and corresponding products and services can be delivered to the user in a targeted manner, improving delivery efficiency and the conversion rate of real customers while reducing delivery cost.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application; other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
FIG. 1 is a schematic diagram of a hardware structure of a user behavior representation generation apparatus according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart illustrating a method for generating a user behavior representation according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a method for extracting data from user behavior data to obtain a single behavior matrix, a joint behavior matrix, and a non-specific behavior matrix according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a single behavior matrix, a joint behavior matrix, and a non-specific behavior matrix according to an embodiment of the present disclosure;
FIG. 5 is a block diagram illustrating functional modules of an apparatus for generating a user behavior representation according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art without any inventive work based on the embodiments in the present application are within the scope of protection of the present application.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by those skilled in the art that the embodiments described herein can be combined with other embodiments.
First, referring to fig. 1, fig. 1 is a schematic diagram of the hardware structure of a user behavior representation generation apparatus according to an embodiment of the present disclosure. The user behavior representation generation apparatus 100 includes at least one processor 101, a communication line 102, a memory 103, and at least one communication interface 104.
In this embodiment, the processor 101 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solutions of the present disclosure.
The communication line 102 may include a path that carries information between the aforementioned components.
The communication interface 104 may be any transceiver-like device (e.g., an antenna) for communicating with other devices or communication networks, such as an Ethernet, a radio access network (RAN), or a wireless local area network (WLAN).
The memory 103 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact disc, laser disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
In this embodiment, the memory 103 may be independent and connected to the processor 101 through the communication line 102. The memory 103 may also be integrated with the processor 101. The memory 103 provided in the embodiments of the present application may generally have a nonvolatile property. The memory 103 is used for storing computer-executable instructions for executing the present application, and is controlled by the processor 101 to execute. The processor 101 is configured to execute computer-executable instructions stored in the memory 103, thereby implementing the methods provided in the embodiments of the present application described below.
In alternative embodiments, computer-executable instructions may also be referred to as application code, which is not specifically limited in this application.
In alternative embodiments, processor 101 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 1.
In an alternative embodiment, user behavior representation generation device 100 may include multiple processors, such as processor 101 and processor 107 of FIG. 1. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores that process data (e.g., computer program instructions).
In an alternative embodiment, if the user behavior representation generating apparatus 100 is a server, it may be, for example, an independent server, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, web services, cloud communication, middleware services, domain name services, security services, a content delivery network (CDN), big data, and artificial intelligence platforms. The user behavior representation generation apparatus 100 may further include an output device 105 and an input device 106. The output device 105 is in communication with the processor 101 and may display information in a variety of ways. For example, the output device 105 may be a liquid crystal display (LCD), a light emitting diode (LED) display device, a cathode ray tube (CRT) display device, a projector, or the like. The input device 106 is in communication with the processor 101 and may receive user input in a variety of ways. For example, the input device 106 may be a mouse, a keyboard, a touch screen device, or a sensing device, among others.
The user behavior representation generation apparatus 100 may be a general-purpose device or a special-purpose device. The embodiment of the present application does not limit the type of the user behavior representation generation apparatus 100.
Next, it should be noted that the embodiments disclosed in the present application may acquire and process related data based on artificial intelligence technology. Artificial intelligence (AI) is the theory, method, technology and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain optimal results.
The artificial intelligence infrastructure generally includes technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, mechatronics, and the like. The artificial intelligence software technology mainly comprises a computer vision technology, a robot technology, a biological recognition technology, a voice processing technology, a natural language processing technology, machine learning/deep learning and the like.
Finally, the user behavior portrait generation method in the present application can be applied to scenarios such as e-commerce sales, offline entity sales, service promotion, outbound telephone calls, and social platform promotion. In this application, the social platform promotion scenario is mainly taken as an example to explain the user behavior portrait generation method; the implementations in other scenarios are similar and will not be described here.
Hereinafter, a user behavior representation generation method disclosed in the present application will be described:
referring to fig. 2, fig. 2 is a schematic flowchart of a user behavior representation generation method according to an embodiment of the present disclosure. The user behavior portrait generation method comprises the following steps:
201: and performing data extraction on the user behavior data to obtain a single behavior matrix, a combined behavior matrix and a non-specific behavior matrix.
In this embodiment, when a user interacts on the social platform, interaction for a corresponding purpose is usually completed through a series of actions, such as posting, liking, favoriting, commenting, purchasing, and sharing. Based on this, the online behavior data of each user can be recorded through the system background of the social platform and used as that user's behavior data.
In this embodiment, user behaviors can be divided into single behaviors, joint behaviors and non-specific behaviors. Specifically, a single behavior is a behavior that cannot be subdivided; on a social platform this may be, for example, a like behavior or a comment behavior. A joint behavior is a combination of multiple single behaviors that are related to one another; on a social platform this may be, for example, a publishing behavior comprising the single behaviors of posting and commenting. A non-specific behavior is the combination of all single behaviors; simply put, as long as the user performs any behavior at all, it falls under the non-specific category. Dividing user behaviors into single, joint and non-specific behaviors thus further refines the granularity of user behaviors and, at the same time, makes the relationship between the user behavior data and specific behaviors (single, joint or non-specific) more evident, which improves the fineness of subsequent model training.
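As a minimal illustrative sketch (all behavior names and data structures here are hypothetical, not prescribed by this application), the three behavior categories could be represented as follows:

```python
# Hypothetical sketch of the three behavior categories described above.
SINGLE_BEHAVIORS = {"like", "post", "comment"}  # indivisible actions

# A joint behavior is a named combination of related single behaviors,
# e.g. "publish" covers both posting and commenting.
JOINT_BEHAVIORS = {"publish": {"post", "comment"}}

# The non-specific behavior is simply the union of all single behaviors:
# any recorded action counts.
NON_SPECIFIC_BEHAVIOR = set().union(SINGLE_BEHAVIORS)
```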
For example, the present embodiment provides a method for extracting data from user behavior data to obtain a single behavior matrix, a joint behavior matrix, and a non-specific behavior matrix, as shown in fig. 3, where the method includes:
301: and determining at least one domain label according to the platform to which the user behavior data belongs.
In this embodiment, a plurality of domains or topics may be included in the social platform, and for this, each domain or topic of the plurality of domains or topics is assigned a corresponding domain tag as a distinction between different domains or topics.
302: and extracting the single behavior characteristics of each field label in the at least one field label to obtain at least one single behavior label corresponding to each field label.
In this embodiment, a user's behaviors differ from one field or topic to another. Based on this, historical user behavior data in a field or topic can be obtained through its domain tag, the frequency of each behavior in the historical data can be analyzed, and the behaviors whose frequency meets a preset condition can be extracted as tags, yielding the at least one single behavior tag. In this way, the single behavior tags can be called directly for behavior matching when behavior data is subsequently extracted, so that behaviors do not need to be re-identified during extraction, which improves data extraction efficiency.
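A minimal sketch of this frequency-based tag extraction, assuming the historical data is available as a list of behavior-name strings and the preset condition is a hypothetical minimum count:

```python
from collections import Counter

def extract_single_behavior_tags(historical_behaviors, min_frequency=10):
    """Extract single behavior tags for one domain tag.

    `historical_behaviors` is assumed to be a list of behavior-name
    strings drawn from the domain's historical user behavior data; a
    behavior becomes a tag when its frequency meets the preset
    condition (modeled here as a minimum count).
    """
    counts = Counter(historical_behaviors)
    return {b for b, freq in counts.items() if freq >= min_frequency}
```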
303: and matching at least one single behavior tag corresponding to each field tag with a preset joint behavior table to obtain at least one joint behavior tag.
In this embodiment, after the single behavior tags included under each domain tag are determined, a preset joint behavior table may be queried according to those single behavior tags to obtain at least one joint behavior tag. Specifically, the joint behavior table records common mapping relationships between joint behaviors and single behaviors. For example, for the joint behavior "publish", in the sense of publishing one's own opinions, the single behaviors "post" and "comment" both satisfy the meaning of "publish"; therefore, the two single behavior tags "post" and "comment" can be combined into the joint behavior tag "publish" and stored in the joint behavior table. As with the single behavior tags, the joint behavior tags can be called directly for behavior matching when behavior data is subsequently extracted, so that behaviors do not need to be re-identified during extraction, which improves data extraction efficiency.
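A sketch of the table lookup, under the assumption that the joint behavior table maps each joint behavior tag to the set of single behaviors composing it (the table contents and names are illustrative):

```python
# Hypothetical preset joint behavior table.
JOINT_BEHAVIOR_TABLE = {
    "publish": {"post", "comment"},
    "watch": {"like", "watch_game"},
}

def match_joint_behavior_tags(single_tags):
    """Return every joint behavior tag whose component single behaviors
    are all present among the domain's single behavior tags."""
    return {joint for joint, parts in JOINT_BEHAVIOR_TABLE.items()
            if parts <= set(single_tags)}

# e.g. match_joint_behavior_tags({"like", "post", "comment", "watch_game"})
# -> {"publish", "watch"}
```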
304: and splitting the user behavior data according to the at least one field label to obtain at least one piece of user behavior sub-data corresponding to the at least one field label one to one.
In this embodiment, the user behavior data is split according to the active field or topic of the user, and then the user behavior sub-data of the active field or topic of the user is obtained. Specifically, the domain label of the page where each user behavior occurs may be taken as the domain label of each user behavior, and then the user behaviors having the same domain label may be classified into one class. Based on this, the number of the user behavior sub-data obtained by splitting should be equal to the number of the active domains or topics of the user. For example: if a user is active in the sports and entertainment fields on the social media platform, the behavior data of the user should be split into the sports behavior subdata and the entertainment behavior subdata according to whether the behavior occurs in the sports field or the entertainment field.
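A sketch of this splitting step, assuming each recorded behavior event is a dict carrying the domain tag of the page on which it occurred (the event schema is an assumption of this sketch):

```python
from collections import defaultdict

def split_by_domain_tag(behavior_events):
    """Split user behavior data into per-domain sub-data (step 304).

    Each event is assumed to look like
    {"user": "user1", "behavior": "like", "domain": "sports"}.
    """
    sub_data = defaultdict(list)
    for event in behavior_events:
        sub_data[event["domain"]].append(event)
    return dict(sub_data)  # one entry per field or topic the user is active in
```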
305: and for each user behavior subdata in the at least one user behavior subdata, performing data extraction on each user behavior subdata according to at least one single behavior tag and at least one combined behavior tag corresponding to each user behavior subdata to obtain single sub-behavior data, combined sub-behavior data and non-specific sub-behavior data.
Illustratively, a user has performed a number of like, post and comment behaviors on a social platform, concentrated in the sports and entertainment fields. The user behavior data of this user will be split into sports behavior sub-data and entertainment behavior sub-data. Meanwhile, the single behavior tags in the sports field are: like, post, comment, and watch game; the joint behavior tags are: publish (i.e., the combination of post and comment) and watch (i.e., the combination of like and watch game); the non-specific behavior tag is: all behaviors (i.e., the combination of like, post, comment and watch game). The single behavior tags in the entertainment field are: like, post, and comment; the joint behavior tag is: publish (i.e., the combination of post and comment); the non-specific behavior tag is: all behaviors (i.e., the combination of like, post and comment).
Based on the above, when extracting data, taking the sports behavior sub-data as an example: the behavior data matching the like, post, comment and watch game behaviors are extracted separately as single sub-behavior data; different single sub-behavior data are combined according to the combination mode of each joint behavior tag to obtain joint sub-behavior data; and finally, all the single sub-behavior data are aggregated to obtain the non-specific sub-behavior data.
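A sketch of this extraction for one domain, under the same assumptions as the earlier sketches (behavior events reduced to behavior-name strings, tag sets as above):

```python
def extract_sub_behavior_data(domain_behaviors, single_tags, joint_table):
    """Split one domain's behaviors into single, joint and non-specific
    sub-behavior data (step 305)."""
    single = {tag: [b for b in domain_behaviors if b == tag]
              for tag in single_tags}
    # Joint sub-behavior data: combine the matching single sub-behavior
    # data according to each joint behavior tag's combination mode.
    joint = {tag: [b for b in domain_behaviors if b in parts]
             for tag, parts in joint_table.items()}
    # Non-specific sub-behavior data: the summary of all single behaviors.
    non_specific = [b for b in domain_behaviors if b in single_tags]
    return single, joint, non_specific
```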
306: and respectively combining the single sub-behavior data, the joint sub-behavior data and the non-specific sub-behavior data corresponding to each user behavior sub-data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix.
In this embodiment, the behavior matrix may be a two-dimensional matrix, where each row of the behavior matrix represents a user, each column represents a field or topic, and each intersecting element represents a behavior score of the user corresponding to the element under the topic or field corresponding to the element.
For example, in this embodiment, the behavior score may be obtained by counting the number of behaviors. Specifically, user 1 performed 7 like behaviors on the social platform, 4 in the sports field and 3 in the entertainment field; 5 post behaviors, 2 in the sports field and 3 in the entertainment field; and 10 comment behaviors, 4 in the sports field and 6 in the entertainment field. User 2 performed 6 like behaviors, 3 in the sports field and 3 in the entertainment field; 7 post behaviors, 4 in the sports field and 3 in the entertainment field; and 8 comment behaviors, 3 in the sports field and 5 in the entertainment field. From these counts, the single behavior matrix, joint behavior matrix and non-specific behavior matrix shown in FIG. 4 can be obtained.
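The following sketch assembles the three matrices from the counts in this example. Since FIG. 4's exact layout is not reproduced here, the arrangement (one user-by-domain block per behavior, and "publish" as the only joint behavior with nonzero counts) is an assumption:

```python
import numpy as np

# Counts from the worked example; rows = [user 1, user 2],
# columns = [sports, entertainment].
like = np.array([[4, 3], [3, 3]])
post = np.array([[2, 3], [4, 3]])
comment = np.array([[4, 6], [3, 5]])

# Single behavior matrix: the per-behavior user x domain blocks side by side.
single_matrix = np.hstack([like, post, comment])

# Joint behavior "publish" combines the scores of post and comment.
joint_matrix = post + comment

# Non-specific behavior: all single behaviors summed together.
non_specific_matrix = like + post + comment
print(non_specific_matrix)  # [[10 12] [10 11]]
```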
202: and performing matrix decomposition on the single behavior matrix to obtain a single behavior feature group, and constructing the user portrait according to the single behavior feature group to obtain the single behavior portrait.
In this embodiment, before matrix decomposition is performed on the single-behavior matrix, it is also necessary to complete missing values in the single-behavior matrix to eliminate or reduce errors caused by the missing values. Specifically, each row in the single behavior matrix represents a user, each column represents a domain or topic, and if a user has a behavior in a domain or topic, the element value at the intersection of the row where the user is located and the corresponding column where the domain or topic is located in the matrix is used to represent the score of a certain behavior of the user in the domain or topic. And if the user does not act on the domain or topic, the intersection position is subjected to null processing when the matrix is generated. However, these vacant positions may have a certain effect on the subsequent data analysis, and in turn, may affect the accuracy of the user behavior portrayal subsequently created. Therefore, in the present embodiment, it is necessary to complement these vacant positions.
For example, in this embodiment, the single behavior matrix may be decomposed into a first completion matrix and a second completion matrix according to the element value of each determined element in the single behavior matrix. In short, the determined elements are all elements of the single behavior matrix except the elements missing actual values, i.e., the elements having definite values. In other words, the decomposition considers only the element values of the determined elements in the single behavior matrix, and its accuracy is evaluated on those elements alone. In this way, a first completion matrix and a second completion matrix without empty positions are obtained. The decomposition is then reversed: the element value of each missing element in the single behavior matrix is determined through the first completion matrix and the second completion matrix, yielding the completed single behavior matrix, i.e., the first single behavior matrix.
Illustratively, singular value decomposition (SVD) may be employed to decompose the single behavior matrix. Specifically, the single behavior matrix M (m × n) is decomposed into a first completion matrix P (m × d) and a second completion matrix Q (d × n) by the Funk-SVD algorithm, where d is a custom dimension. The element values of the missing elements in the single behavior matrix M (m × n), the first completion matrix P (m × d) and the second completion matrix Q (d × n) then satisfy formula (1):

$$M_{uv} = \sum_{k=1}^{d} P_{uk} Q_{kv} \tag{1}$$

where $M_{uv}$ is the element value of the missing element in the u-th row and v-th column of the single behavior matrix M, d is the custom dimension, $P_{uk}$ is the value in the u-th row and k-th column of the first completion matrix P, $Q_{kv}$ is the value in the k-th row and v-th column of the second completion matrix Q, and d, u, v and k are integers greater than or equal to 1.
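A minimal Funk-SVD sketch of this completion: the factors are fitted by stochastic gradient descent over the determined entries only, and each missing element is then filled with the inner product of formula (1). The hyperparameters (dimension, learning rate, regularization, epochs) are illustrative, not from the patent:

```python
import numpy as np

def funk_svd_complete(M, d=2, lr=0.01, reg=0.02, epochs=500, seed=0):
    """Complete the single behavior matrix M; np.nan marks missing elements."""
    m, n = M.shape
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((m, d))
    Q = 0.1 * rng.standard_normal((d, n))
    observed = [(u, v) for u in range(m) for v in range(n)
                if not np.isnan(M[u, v])]
    for _ in range(epochs):
        for u, v in observed:
            p_u = P[u, :].copy()
            err = M[u, v] - p_u @ Q[:, v]     # error on a determined element
            P[u, :] += lr * (err * Q[:, v] - reg * p_u)
            Q[:, v] += lr * (err * p_u - reg * Q[:, v])
    completed = M.copy()
    missing = np.isnan(M)
    completed[missing] = (P @ Q)[missing]     # formula (1) fills the gaps
    return completed, P, Q
```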
In an alternative embodiment, a simpler completion manner may also be adopted: for example, the missing scores are completed with the global average value (or the average over the user's topics) to obtain the completed matrix. Specifically, the average of the element values of all determined elements in the single behavior matrix may be computed, where the determined elements are all elements of the single behavior matrix except those missing actual values. This average is then taken as the element value of each missing element, where the missing elements are all elements of the single behavior matrix other than the determined elements, yielding the first single behavior matrix.
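A sketch of the global-average variant, using np.nan to mark missing elements as in the previous sketch:

```python
import numpy as np

def mean_complete(M):
    """Fill every missing element (np.nan) with the average of all
    determined elements, yielding the first single behavior matrix."""
    completed = M.copy()
    completed[np.isnan(M)] = np.nanmean(M)  # mean over determined elements only
    return completed
```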
After the complemented first single behavior matrix is obtained, matrix decomposition can be performed on the first single behavior matrix to obtain a first decomposition matrix and a second decomposition matrix. And then determining the single behavior feature group according to the first decomposition matrix and the second decomposition matrix.
Specifically, in this embodiment, the first single behavior matrix may be decomposed in a non-negative manner, so that the first single behavior matrix, the first decomposition matrix and the second decomposition matrix satisfy formula (2):

$$\min \sum_{i,j} \lambda_{ij} \left( V_{ij} - \sum_{h=1}^{q} E_{ih} G_{hj} \right)^{2} \tag{2}$$

where $V_{ij}$ is the value in the i-th row and j-th column of the first single behavior matrix, $E_{ih}$ is the value in the i-th row and h-th column of the first decomposition matrix, $G_{hj}$ is the value in the h-th row and j-th column of the second decomposition matrix, $\lambda_{ij}$ is the weight of $V_{ij}$, q is a custom dimension, and h, q, i and j are integers greater than or equal to 0.
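A sketch of this non-negative factorization using the classic multiplicative updates; for brevity the per-element weights $\lambda_{ij}$ are taken as uniform, which is an assumption of this sketch rather than the patent's full formulation:

```python
import numpy as np

def nmf_decompose(V, q=2, iters=500, eps=1e-9, seed=0):
    """Non-negative factorization of the completed matrix V (all entries
    assumed >= 0) into E (m x q) and G (q x n)."""
    m, n = V.shape
    rng = np.random.default_rng(seed)
    E = rng.random((m, q)) + eps
    G = rng.random((q, n)) + eps
    for _ in range(iters):
        # Multiplicative updates keep E and G non-negative throughout.
        G *= (E.T @ V) / (E.T @ E @ G + eps)
        E *= (V @ G.T) / (E @ G @ G.T + eps)
    return E, G  # rows of E: per-user features; columns of G: per-domain features
```

The single behavior feature group is then read off the factors E and G.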
In this embodiment, constructing a user portrait from a feature group is a method commonly used in the art; for example, the single behavior feature group can be input into a preset portrait construction model to obtain the single behavior portrait. In other words, any method in the art that can construct a user portrait from behavior features can be applied in this embodiment, and details are not repeated here.
203: and performing matrix decomposition on the joint behavior matrix to obtain a joint behavior characteristic group, and constructing the user portrait according to the joint behavior characteristic group to obtain the joint behavior portrait.
In this embodiment, the manner of performing matrix decomposition on the joint behavior matrix to obtain the joint behavior feature group, and constructing a user portrait from the joint behavior feature group to obtain the joint behavior portrait, is similar to the processing of the single behavior matrix in step 202 and is not repeated here.
204: and performing matrix decomposition on the nonspecific behavior matrix to obtain a nonspecific behavior feature group, and constructing a user portrait according to the nonspecific behavior feature group to obtain a nonspecific behavior portrait.
In this embodiment, the manner of performing matrix decomposition on the non-specific behavior matrix to obtain the non-specific behavior feature group, and constructing a user portrait from the non-specific behavior feature group to obtain the non-specific behavior portrait, is likewise similar to the processing of the single behavior matrix in step 202 and is not repeated here.
205: and inputting the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait into a weight determination model, and determining a first weight corresponding to the single behavior portrait, a second weight corresponding to the joint behavior portrait and a third weight corresponding to the non-specific behavior portrait.
In this embodiment, the weight determination model may be a machine learning-based classifier model.
206: and according to the first weight, the second weight and the third weight, carrying out weighted summation processing on the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait to obtain a user behavior portrait of the user.
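The patent specifies only that the weight determination model may be a machine-learning classifier; the sketch below therefore treats it as an arbitrary callable producing three scores, and the softmax normalization that makes the weights sum to 1 is an assumption of this sketch:

```python
import numpy as np

def combine_portraits(single, joint, non_specific, weight_model):
    """Fuse the three portrait vectors into the final user behavior portrait
    (steps 205 and 206)."""
    scores = weight_model(np.concatenate([single, joint, non_specific]))
    exp = np.exp(scores - np.max(scores))
    w1, w2, w3 = exp / exp.sum()          # first, second and third weights
    return w1 * single + w2 * joint + w3 * non_specific

# e.g. with a trivial stand-in model giving equal scores:
# portrait = combine_portraits(s, j, n, lambda x: np.zeros(3))
```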
In summary, in the user behavior portrait generation method provided by the present invention, user behavior is divided into single behaviors, joint behaviors and non-specific behaviors, so that the user behavior data is further refined. Matrix decomposition is then performed on the single behavior matrix, the joint behavior matrix and the non-specific behavior matrix respectively, and a single behavior portrait, a joint behavior portrait and a non-specific behavior portrait are constructed from the results. Finally, the respective weights of the three portraits are determined through a weight determination model, and the portraits are combined by weighted summation to obtain the final user behavior portrait. The generated user behavior portrait therefore has a finer granularity: it reflects not only how active the user is in each field or topic, but also why the user is active there. On this basis, the user's preferences can be inferred accurately, and corresponding products and services can be delivered to the user in a targeted manner, improving delivery efficiency and the conversion rate of real customers while reducing delivery cost.
Referring to fig. 5, fig. 5 is a block diagram illustrating functional modules of a user behavior representation generating device according to an embodiment of the present disclosure. As shown in fig. 5, the user behavior representation generation apparatus 500 includes:
an extraction module 501, configured to perform data extraction on user behavior data to obtain a single behavior matrix, a joint behavior matrix, and a non-specific behavior matrix;
the decomposition module 502 is configured to perform matrix decomposition on the single behavior matrix to obtain a single behavior feature group, perform user portrait construction according to the single behavior feature group to obtain a single behavior portrait, perform matrix decomposition on the joint behavior matrix to obtain a joint behavior feature group, perform user portrait construction according to the joint behavior feature group to obtain a joint behavior portrait, perform matrix decomposition on the non-specific behavior matrix to obtain a non-specific behavior feature group, and perform user portrait construction according to the non-specific behavior feature group to obtain a non-specific behavior portrait;
the weight determining module 503 is configured to input the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait into the weight determination model, and determine a first weight corresponding to the single behavior portrait, a second weight corresponding to the joint behavior portrait and a third weight corresponding to the non-specific behavior portrait;
and the portrait generation module 504 is configured to perform weighted summation processing on the single behavior portrait, the joint behavior portrait, and the unspecified behavior portrait according to the first weight, the second weight, and the third weight, so as to obtain a user behavior portrait of the user.
In the embodiment of the present invention, in the aspect of extracting data from user behavior data to obtain a single behavior matrix, a joint behavior matrix, and a non-specific behavior matrix, the extracting module 501 is specifically configured to:
determining at least one domain label according to a platform to which user behavior data belongs;
performing single-behavior feature extraction on each domain label in at least one domain label to obtain at least one single-behavior label corresponding to each domain label;
matching at least one single behavior tag corresponding to each field tag with a preset joint behavior table to obtain at least one joint behavior tag;
splitting the user behavior data according to the at least one field label to obtain at least one user behavior subdata, wherein the at least one user behavior subdata corresponds to the at least one field label one to one;
for each user behavior subdata in the at least one user behavior subdata, performing data extraction on each user behavior subdata according to at least one single behavior tag and at least one joint behavior tag corresponding to each user behavior subdata to obtain single sub-behavior data, joint sub-behavior data and non-specific sub-behavior data;
and respectively combining the single sub-behavior data, the joint sub-behavior data and the non-specific sub-behavior data corresponding to each user behavior subdata to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix.
In an embodiment of the present invention, in performing matrix decomposition on the single behavior matrix to obtain a single behavior feature group, the decomposition module 502 is specifically configured to:
performing completion processing on the single behavior matrix to obtain a first single behavior matrix;
performing matrix decomposition on the first single behavior matrix to obtain a first decomposition matrix and a second decomposition matrix, wherein the first single behavior matrix, the first decomposition matrix and the second decomposition matrix satisfy formula (2):

$$\min \sum_{i,j} \lambda_{ij} \left( V_{ij} - \sum_{h=1}^{q} E_{ih} G_{hj} \right)^{2} \tag{2}$$

where $V_{ij}$ is the value in the i-th row and j-th column of the first single behavior matrix, $E_{ih}$ is the value in the i-th row and h-th column of the first decomposition matrix, $G_{hj}$ is the value in the h-th row and j-th column of the second decomposition matrix, $\lambda_{ij}$ is the weight of $V_{ij}$, q is a custom dimension, and h, q, i and j are integers greater than or equal to 0;
and determining the single behavior feature group according to the first decomposition matrix and the second decomposition matrix.
In an embodiment of the present invention, in terms of performing completion processing on the single behavior matrix to obtain the first single behavior matrix, the decomposition module 502 is specifically configured to:
decomposing the single behavior matrix into a first completion matrix and a second completion matrix according to the element value of each determined element in the single behavior matrix, wherein the determined elements are all elements except the elements lacking the actual values in the single behavior matrix;
determining the element value of each missing element in the single behavior matrix according to the first completion matrix and the second completion matrix to obtain the first single behavior matrix, wherein the missing elements are all elements of the single behavior matrix other than the determined elements, and the element value of each missing element, the first completion matrix and the second completion matrix satisfy formula (1):

$$M_{uv} = \sum_{k=1}^{d} P_{uk} Q_{kv} \tag{1}$$

where $M_{uv}$ is the element value of the missing element in the u-th row and v-th column of the single behavior matrix, d is the custom dimension, $P_{uk}$ is the value in the u-th row and k-th column of the first completion matrix, $Q_{kv}$ is the value in the k-th row and v-th column of the second completion matrix, and d, u, v and k are integers greater than or equal to 1.
In an embodiment of the present invention, in terms of performing completion processing on the single behavior matrix to obtain the first single behavior matrix, the decomposition module 502 is specifically configured to:
determining an average value of element values of all determined elements in the single behavior matrix, wherein the determined elements are all elements except elements lacking actual values in the single behavior matrix;
and taking the average value as the element value of each missing element in the single behavior matrix to obtain a first single behavior matrix, wherein the missing elements are all elements except all determined elements in the single behavior matrix.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 6, the electronic device 600 includes a transceiver 601, a processor 602 and a memory 603, which are connected to one another by a bus 604. The memory 603 is used to store computer programs and data, and can transfer the stored data to the processor 602.
The processor 602 is configured to read the computer program in the memory 603 to perform the following operations:
performing data extraction on the user behavior data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix;
performing matrix decomposition on the single behavior matrix to obtain a single behavior feature group, and constructing a user portrait according to the single behavior feature group to obtain a single behavior portrait;
performing matrix decomposition on the joint behavior matrix to obtain a joint behavior feature group, and constructing a user portrait according to the joint behavior feature group to obtain a joint behavior portrait;
performing matrix decomposition on the nonspecific behavior matrix to obtain a nonspecific behavior feature group, and constructing a user portrait according to the nonspecific behavior feature group to obtain a nonspecific behavior portrait;
inputting the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait into a weight determination model, and determining a first weight corresponding to the single behavior portrait, a second weight corresponding to the joint behavior portrait and a third weight corresponding to the non-specific behavior portrait;
and according to the first weight, the second weight and the third weight, carrying out weighted summation processing on the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait to obtain a user behavior portrait of the user.
In an embodiment of the present invention, in terms of performing data extraction on user behavior data to obtain a single behavior matrix, a joint behavior matrix, and a non-specific behavior matrix, the processor 602 is specifically configured to perform the following operations:
determining at least one domain label according to a platform to which user behavior data belongs;
performing single-behavior feature extraction on each domain label in at least one domain label to obtain at least one single-behavior label corresponding to each domain label;
matching at least one single behavior tag corresponding to each field tag with a preset joint behavior table to obtain at least one joint behavior tag;
splitting the user behavior data according to the at least one field label to obtain at least one user behavior subdata, wherein the at least one user behavior subdata corresponds to the at least one field label one to one;
for each user behavior subdata in the at least one user behavior subdata, performing data extraction on each user behavior subdata according to at least one single behavior tag and at least one joint behavior tag corresponding to each user behavior subdata to obtain single sub-behavior data, joint sub-behavior data and non-specific sub-behavior data;
and respectively combining the single sub-behavior data, the joint sub-behavior data and the non-specific sub-behavior data corresponding to each user behavior sub-data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix.
In an embodiment of the present invention, in matrix decomposing the single behavior matrix to obtain a single behavior feature set, the processor 602 is specifically configured to perform the following operations:
performing completion processing on the single behavior matrix to obtain a first single behavior matrix;
performing matrix decomposition on the first single behavior matrix to obtain a first decomposition matrix and a second decomposition matrix, wherein the first single behavior matrix, the first decomposition matrix and the second decomposition matrix satisfy formula (2):

$$\min \sum_{i,j} \lambda_{ij} \left( V_{ij} - \sum_{h=1}^{q} E_{ih} G_{hj} \right)^{2} \tag{2}$$

where $V_{ij}$ is the value in the i-th row and j-th column of the first single behavior matrix, $E_{ih}$ is the value in the i-th row and h-th column of the first decomposition matrix, $G_{hj}$ is the value in the h-th row and j-th column of the second decomposition matrix, $\lambda_{ij}$ is the weight of $V_{ij}$, q is a custom dimension, and h, q, i and j are integers greater than or equal to 0;
and determining the single behavior characteristic group according to the first decomposition matrix and the second decomposition matrix.
In an embodiment of the present invention, in performing a completion process on the single behavior matrix to obtain a first single behavior matrix, the processor 602 is specifically configured to perform the following operations:
decomposing the single behavior matrix into a first completion matrix and a second completion matrix according to the element value of each determined element in the single behavior matrix, wherein the determined elements are all elements except the elements lacking the actual values in the single behavior matrix;
determining the element value of each missing element in the single behavior matrix according to the first completion matrix and the second completion matrix to obtain the first single behavior matrix, wherein the missing elements are all elements of the single behavior matrix other than the determined elements, and the element value of each missing element, the first completion matrix and the second completion matrix satisfy formula (1):

$$M_{uv} = \sum_{k=1}^{d} P_{uk} Q_{kv} \tag{1}$$

where $M_{uv}$ is the element value of the missing element in the u-th row and v-th column of the single behavior matrix, d is the custom dimension, $P_{uk}$ is the value in the u-th row and k-th column of the first completion matrix, $Q_{kv}$ is the value in the k-th row and v-th column of the second completion matrix, and d, u, v and k are integers greater than or equal to 1.
In an embodiment of the present invention, in performing a completion process on the single behavior matrix to obtain a first single behavior matrix, the processor 602 is specifically configured to perform the following operations:
determining an average value of element values of all determined elements in the single behavior matrix, wherein the determined elements are all elements except elements lacking actual values in the single behavior matrix;
and taking the average value as the element value of each missing element in the single behavior matrix to obtain a first single behavior matrix, wherein the missing elements are all elements except all determined elements in the single behavior matrix.
It should be understood that the user behavior representation generation device in the present application may include a smart phone (e.g., an Android phone, an iOS phone, or a Windows Phone), a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID), a robot, or a wearable device. The devices listed above are merely examples and are not exhaustive. In practical applications, the user behavior representation generation device may further include an intelligent vehicle-mounted terminal, a computer device, and the like.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present invention can be implemented by means of software together with a hardware platform. With this understanding, all or part of the technical solutions of the present invention that contribute over the prior art can be embodied in the form of a software product, which can be stored in a storage medium such as a ROM/RAM, a magnetic disk or an optical disk, and which includes instructions for causing a computer device (which can be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present application.
Accordingly, the present application also provides a computer readable storage medium, which stores a computer program, where the computer program is executed by a processor to implement part or all of the steps of any one of the user behavior representation generation methods as described in the above method embodiments. For example, the storage medium may include a hard disk, a floppy disk, an optical disk, a magnetic tape, a magnetic disk, a flash memory, and the like.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the user behavior representation generation methods as described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are all alternative embodiments and that the acts and modules referred to are not necessarily required by the application.
In the above embodiments, the description of each embodiment has its own emphasis, and for parts not described in detail in a certain embodiment, reference may be made to the description of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is merely a logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be through some interfaces, indirect coupling or communication connection between devices or units, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such understanding, the part of the technical solution of the present application that contributes to the prior art may be embodied, in essence or in whole or in part, in the form of a software product. The software product is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing associated hardware. The program may be stored in a computer-readable memory, which may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing is a detailed description of the embodiments of the present application, in which specific examples are used to illustrate the principles and implementations of the present application. The above description of the embodiments is only intended to help understand the methods of the present application and their core ideas. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the ideas of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A method for generating a user behavior representation, the method comprising:
performing data extraction on the user behavior data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix;
performing matrix decomposition on the single behavior matrix to obtain a single behavior feature group, and constructing a user portrait according to the single behavior feature group to obtain a single behavior portrait;
performing matrix decomposition on the joint behavior matrix to obtain a joint behavior feature group, and constructing a user portrait according to the joint behavior feature group to obtain a joint behavior portrait;
performing matrix decomposition on the non-specific behavior matrix to obtain a non-specific behavior feature group, and constructing a user portrait according to the non-specific behavior feature group to obtain a non-specific behavior portrait;
inputting the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait into a weight determination model, and determining a first weight corresponding to the single behavior portrait, a second weight corresponding to the joint behavior portrait and a third weight corresponding to the non-specific behavior portrait;
and according to the first weight, the second weight and the third weight, performing weighted summation processing on the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait to obtain the user behavior portrait of the user.
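The fusion step of claim 1 can be illustrated with a minimal Python sketch. The internals of the weight determination model are not specified in the claim, so the norm-plus-softmax scorer below is purely an invented stand-in, and the portrait dimension is chosen arbitrarily.

    import numpy as np

    def determine_weights(portraits):
        # Hypothetical stand-in for the weight determination model: score each
        # portrait by its L2 norm, then softmax-normalize so the three weights
        # sum to 1. The model in the claim would be learned, not hand-coded.
        scores = np.array([np.linalg.norm(p) for p in portraits])
        exp = np.exp(scores - scores.max())
        return exp / exp.sum()

    def fuse_portraits(single, joint, non_specific):
        # Weighted summation of the three portraits (last step of claim 1).
        w1, w2, w3 = determine_weights([single, joint, non_specific])
        return w1 * single + w2 * joint + w3 * non_specific

    # Example: three 16-dimensional portrait vectors fuse into one user behavior portrait.
    user_portrait = fuse_portraits(np.random.rand(16), np.random.rand(16), np.random.rand(16))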
2. The method of claim 1, wherein performing data extraction on the user behavior data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix comprises:
determining at least one domain label according to a platform to which the user behavior data belongs;
performing single behavior feature extraction on each domain label in the at least one domain label to obtain at least one single behavior label corresponding to each domain label;
matching the at least one single behavior tag corresponding to each domain label with a preset joint behavior table to obtain at least one joint behavior tag;
splitting the user behavior data according to the at least one domain label to obtain at least one piece of user behavior subdata, wherein the at least one piece of user behavior subdata corresponds to the at least one domain label one by one;
for each piece of user behavior subdata in the at least one piece of user behavior subdata, respectively performing data extraction on the user behavior subdata according to the at least one single behavior tag and the at least one joint behavior tag corresponding to it, to obtain single sub-behavior data, joint sub-behavior data and non-specific sub-behavior data;
and respectively combining the single sub-behavior data, the joint sub-behavior data and the non-specific sub-behavior data corresponding to each user behavior sub-data to obtain the single behavior matrix, the joint behavior matrix and the non-specific behavior matrix.
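A toy sketch of the routing logic in claim 2: each observed behavior is matched against a preset joint behavior table, assigned to the single bucket if it carries a known single behavior tag, and otherwise treated as non-specific. The tag names, table contents and record format below are all invented for illustration; the claim does not enumerate them.

    # records: list of (user_index, tag_tuple, value) triples (hypothetical format)
    JOINT_TABLE = {("click", "purchase"), ("browse", "favorite")}  # assumed contents

    def split_records(records, single_tags):
        single, joint, non_specific = [], [], []
        for user, tags, value in records:
            if tags in JOINT_TABLE:
                joint.append((user, tags, value))         # joint sub-behavior data
            elif len(tags) == 1 and tags[0] in single_tags:
                single.append((user, tags, value))        # single sub-behavior data
            else:
                non_specific.append((user, tags, value))  # non-specific sub-behavior data
        return single, joint, non_specific

Each bucket would then be pivoted into a user-by-behavior matrix to form the single behavior matrix, the joint behavior matrix and the non-specific behavior matrix.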
3. The method of claim 1, wherein performing matrix decomposition on the single behavior matrix to obtain a single behavior feature group comprises:
completing the single behavior matrix to obtain a first single behavior matrix;
performing matrix decomposition on the first single behavior matrix to obtain a first decomposition matrix and a second decomposition matrix, wherein the first single behavior matrix, the first decomposition matrix and the second decomposition matrix satisfy the following formula:
$$\sum_{i}\sum_{j}\lambda_{ij}\Bigl(V_{ij}-\sum_{h=0}^{q}E_{ih}G_{hj}\Bigr)^{2}\rightarrow\min$$
wherein V_ij represents the value of the ith row and jth column in the first single behavior matrix, E_ih represents the value of the ith row and hth column in the first decomposition matrix, G_hj represents the value of the hth row and jth column in the second decomposition matrix, λ_ij represents the weight of V_ij, q is a user-defined dimension, and h, q, i and j are integers greater than or equal to 0;
and determining the single behavior feature group according to the first decomposition matrix and the second decomposition matrix.
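Read as a weighted factorization objective, claim 3 can be sketched with plain gradient descent in Python. The optimizer, learning rate and step count below are assumptions: the claim states only the relation among V, E, G and the weights λ, not how to solve for it.

    import numpy as np

    def weighted_mf(V, weights, q=8, lr=0.01, steps=500, seed=0):
        # Factor V (n x m) into E (n x q) and G (q x m), weighting each squared
        # residual by the corresponding lambda_ij supplied in `weights`.
        rng = np.random.default_rng(seed)
        n, m = V.shape
        E = rng.normal(scale=0.1, size=(n, q))  # first decomposition matrix
        G = rng.normal(scale=0.1, size=(q, m))  # second decomposition matrix
        for _ in range(steps):
            R = weights * (E @ G - V)           # weighted residual matrix
            E -= lr * (R @ G.T)                 # gradient step on E
            G -= lr * (E.T @ R)                 # gradient step on G
        return E, G

The rows of E can then serve as the per-user single behavior feature group, with G describing the behaviors themselves.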
4. The method of claim 3, wherein completing the single behavior matrix to obtain the first single behavior matrix comprises:
decomposing the single behavior matrix into a first completion matrix and a second completion matrix according to the element value of each determined element in the single behavior matrix, wherein the determined elements are all elements except the elements lacking the actual values in the single behavior matrix;
determining an element value of each missing element in the single behavior matrix according to the first completion matrix and the second completion matrix to obtain the first single behavior matrix, wherein the missing elements are all elements in the single behavior matrix except the determined elements, and the element value of each missing element, the first completion matrix and the second completion matrix satisfy the following formula:
$$M_{uv}=\sum_{k=1}^{d}P_{uk}\,Q_{kv}$$
wherein M_uv represents the element value of the missing element in the uth row and vth column of the single behavior matrix, d is a user-defined dimension, P_uk is the value of the uth row and kth column in the first completion matrix, Q_kv is the value of the kth row and vth column in the second completion matrix, and d, u, v and k are integers greater than or equal to 1.
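A minimal sketch of claim 4's completion in Python: factor the determined elements into P and Q by gradient descent, then fill each missing element M_uv with the inner product of row u of P and column v of Q. Here missing elements are marked with NaN, and the optimizer settings are illustrative.

    import numpy as np

    def complete_matrix(M, d=4, lr=0.01, steps=2000, seed=0):
        rng = np.random.default_rng(seed)
        mask = ~np.isnan(M)                              # True on determined elements
        P = rng.normal(scale=0.1, size=(M.shape[0], d))  # first completion matrix
        Q = rng.normal(scale=0.1, size=(d, M.shape[1]))  # second completion matrix
        target = np.where(mask, M, 0.0)
        for _ in range(steps):
            # Residual only on determined elements; missing ones exert no force.
            R = np.where(mask, P @ Q - target, 0.0)
            P -= lr * (R @ Q.T)
            Q -= lr * (P.T @ R)
        # Each missing M_uv becomes sum_k P_uk * Q_kv, per the formula above.
        return np.where(mask, M, P @ Q)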
5. The method of claim 3, wherein completing the single behavior matrix to obtain the first single behavior matrix comprises:
determining an average value of element values of all determined elements in the single behavior matrix, wherein the determined elements are all elements except elements lacking actual values in the single behavior matrix;
and taking the average value as an element value of each missing element in the single behavior matrix to obtain the first single behavior matrix, wherein the missing elements are all elements except all the determined elements in the single behavior matrix.
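Claim 5's alternative completion is plain mean imputation, sketched below with missing elements again marked as NaN:

    import numpy as np

    def complete_with_mean(M):
        # Average over all determined elements, then write that average into
        # every missing element to obtain the first single behavior matrix.
        mean = np.nanmean(M)
        return np.where(np.isnan(M), mean, M)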
6. A user behavior representation generation apparatus, the apparatus comprising:
the extraction module is used for performing data extraction on the user behavior data to obtain a single behavior matrix, a joint behavior matrix and a non-specific behavior matrix;
the decomposition module is used for performing matrix decomposition on the single behavior matrix to obtain a single behavior feature group and constructing a user portrait according to the single behavior feature group to obtain a single behavior portrait; performing matrix decomposition on the joint behavior matrix to obtain a joint behavior feature group and constructing a user portrait according to the joint behavior feature group to obtain a joint behavior portrait; and performing matrix decomposition on the non-specific behavior matrix to obtain a non-specific behavior feature group and constructing a user portrait according to the non-specific behavior feature group to obtain a non-specific behavior portrait;
the weight determining module is used for inputting the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait into a weight determining model, and determining a first weight corresponding to the single behavior portrait, a second weight corresponding to the joint behavior portrait and a third weight corresponding to the non-specific behavior portrait;
and the portrait generation module is used for carrying out weighted summation processing on the single behavior portrait, the joint behavior portrait and the non-specific behavior portrait according to the first weight, the second weight and the third weight so as to obtain the user behavior portrait of the user.
7. The apparatus according to claim 6, wherein in the aspect of performing data extraction on the user behavior data to obtain a single behavior matrix, a joint behavior matrix, and a non-specific behavior matrix, the extraction module is specifically configured to:
determining at least one domain label according to a platform to which the user behavior data belongs;
performing single behavior feature extraction on each domain label in the at least one domain label to obtain at least one single behavior label corresponding to each domain label;
matching the at least one single behavior tag corresponding to each domain label with a preset joint behavior table to obtain at least one joint behavior tag;
splitting the user behavior data according to the at least one domain label to obtain at least one piece of user behavior subdata, wherein the at least one piece of user behavior subdata corresponds to the at least one domain label one by one;
for each piece of user behavior subdata in the at least one piece of user behavior subdata, respectively performing data extraction on the user behavior subdata according to the at least one single behavior tag and the at least one joint behavior tag corresponding to it, to obtain single sub-behavior data, joint sub-behavior data and non-specific sub-behavior data;
and respectively combining the single sub-behavior data, the joint sub-behavior data and the non-specific sub-behavior data corresponding to each user behavior sub-data to obtain the single behavior matrix, the joint behavior matrix and the non-specific behavior matrix.
8. The apparatus according to claim 6, wherein in the matrix decomposition of the single behavior matrix to obtain the single behavior feature group, the decomposition module is specifically configured to:
performing completion processing on the single behavior matrix to obtain a first single behavior matrix;
performing matrix decomposition on the first single behavior matrix to obtain a first decomposition matrix and a second decomposition matrix, wherein the first single behavior matrix, the first decomposition matrix and the second decomposition matrix satisfy the following formula:
$$\sum_{i}\sum_{j}\lambda_{ij}\Bigl(V_{ij}-\sum_{h=0}^{q}E_{ih}G_{hj}\Bigr)^{2}\rightarrow\min$$
wherein V_ij represents the value of the ith row and jth column in the first single behavior matrix, E_ih represents the value of the ith row and hth column in the first decomposition matrix, G_hj represents the value of the hth row and jth column in the second decomposition matrix, λ_ij represents the weight of V_ij, q is a user-defined dimension, and h, q, i and j are integers greater than or equal to 0;
and determining the single behavior feature group according to the first decomposition matrix and the second decomposition matrix.
9. An electronic device comprising a processor, a memory, a communication interface, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor, the one or more programs including instructions for performing the steps in the method of any of claims 1-5.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, implements the method according to any one of claims 1-5.
CN202210214272.6A 2022-03-04 2022-03-04 User behavior portrait generation method, device, electronic equipment and storage medium Active CN114579860B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210214272.6A CN114579860B (en) 2022-03-04 2022-03-04 User behavior portrait generation method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114579860A true CN114579860A (en) 2022-06-03
CN114579860B CN114579860B (en) 2024-04-26

Family

ID=81772644

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210214272.6A Active CN114579860B (en) 2022-03-04 2022-03-04 User behavior portrait generation method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114579860B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111949859A (en) * 2019-05-16 2020-11-17 Oppo广东移动通信有限公司 User portrait updating method and device, computer equipment and storage medium
US20210073517A1 (en) * 2019-09-06 2021-03-11 Adobe, Inc. Selecting representative recent digital portraits as cover images
CN113934612A (en) * 2021-09-27 2022-01-14 科大讯飞股份有限公司 User portrait updating method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN114579860B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
CN107729937A (en) For determining the method and device of user interest label
CN111274330A (en) Target object determination method and device, computer equipment and storage medium
CN113743981B (en) Material delivery cost prediction method and device, computer equipment and storage medium
CN112926308B (en) Method, device, equipment, storage medium and program product for matching text
CN108140055A (en) Trigger application message
CN112785005A (en) Multi-target task assistant decision-making method and device, computer equipment and medium
CN113886721A (en) Personalized interest point recommendation method and device, computer equipment and storage medium
CN113656690A (en) Product recommendation method and device, electronic equipment and readable storage medium
CN112348300A (en) Method and device for pushing information
CN116304236A (en) User portrait generation method and device, electronic equipment and storage medium
CN114579860B (en) User behavior portrait generation method, device, electronic equipment and storage medium
CN115114500A (en) Rumor detection method and system based on reported information and propagation heteromorphic graph
JP2019125317A (en) Device, method, and program for processing information
CN113591881A (en) Intention recognition method and device based on model fusion, electronic equipment and medium
Narasiman et al. IndQuery-An Online Portal for Registering E-Complaints Integrated with Smart Chatbot
CN112949824A (en) Neural network-based multi-output multi-task feature evaluation method and device and electronic equipment
CN112328871A (en) Reply generation method, device, equipment and storage medium based on RPA module
CN108510071B (en) Data feature extraction method and device and computer readable storage medium
CN111460300A (en) Network content pushing method and device and storage medium
CN111507366B (en) Training method of recommendation probability model, intelligent completion method and related device
CN111046300A (en) Method and device for determining crowd attributes of users
CN106716403A (en) Automated generation of web site entry pages
CN111738789A (en) Article information pushing method, device, equipment and storage medium
CN113254622B (en) Knowledge point query method, knowledge point query device and knowledge point query server
CN114185618B (en) Service tool configuration method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant