CN110674144A - User portrait generation method and device, computer equipment and storage medium - Google Patents

User portrait generation method and device, computer equipment and storage medium

Info

Publication number
CN110674144A
CN110674144A (application CN201910747914.7A)
Authority
CN
China
Prior art keywords
app
data
user
information
tag
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910747914.7A
Other languages
Chinese (zh)
Inventor
郭凌峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OneConnect Smart Technology Co Ltd
Original Assignee
OneConnect Smart Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OneConnect Smart Technology Co Ltd filed Critical OneConnect Smart Technology Co Ltd
Priority to CN201910747914.7A priority Critical patent/CN110674144A/en
Publication of CN110674144A publication Critical patent/CN110674144A/en
Priority to PCT/CN2020/106222 priority patent/WO2021027595A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/23Updating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28Databases characterised by their database models, e.g. relational or object models
    • G06F16/284Relational databases
    • G06F16/285Clustering or classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application relates to the technical field of big data and provides a user portrait generation method and apparatus, a computer device, and a storage medium. The method comprises: obtaining the APP data information of a user to be analyzed carried by a portrait analysis request; reading the feature data and the usage frequency data of each APP in the APP data information; traversing the APP feature data set formed by the feature data to obtain the co-occurrence data of each APP's feature data within the set; searching an association database of APPs and tag information to obtain the tag information corresponding to each APP; and adding each APP's co-occurrence data and usage frequency data to the tag information to obtain updated tags, from which a user portrait of the user to be analyzed is generated. Even when the user to be analyzed has only a few APPs, each APP can still be matched to its corresponding tag information; adding the co-occurrence data and usage frequency data to the tags yields updated tags carrying more precise information, so the generated user portrait is more accurate.

Description

User portrait generation method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of big data technologies, and in particular, to a user portrait generation method, apparatus, computer device, and storage medium.
Background
With the development of information technology, user portrait analysis has emerged as a tool and method for describing a target user objectively and accurately. In the big-data era, user information floods the network; each piece of concrete information about a user is abstracted into tags, and those tags are used to reconstitute a concrete image of the user so that targeted services can be provided.
In practice, a user's attributes, behaviors, and expectations are usually described in the plainest, most everyday terms. A user persona formed from a user portrait is a virtual representation of an actual user and is not constructed apart from the product and the market. Users are divided into different types according to differences in their target behaviors and viewpoints, the types are quickly grouped together, and the newly obtained types are then distilled into a typed user portrait.
In the prior art, most user portrait generation methods produce user tags by keyword extraction. When only a small amount of user data is available, and in particular when portrait analysis is performed on a user's APP (application) data, keyword extraction yields user portraits of low accuracy.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a user representation generation method, apparatus, computer device, and storage medium capable of improving user representation accuracy.
A user representation generation method, the method comprising:
acquiring APP data information of a user to be analyzed carried by a portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information;
traversing APP characteristic data sets formed by the characteristic data to obtain co-occurrence data of the characteristic data of each APP in the APP characteristic data sets;
searching an association database of the APP and the tag information, acquiring the tag information corresponding to the APP, and adding co-occurrence data and use frequency data of the APP to the tag information to obtain an updated tag;
and generating the user portrait of the user to be analyzed according to the updating label.
In one embodiment, the adding the co-occurrence data and the usage frequency data of the APP to the tag information to obtain the updated tag includes:
calculating a weighting parameter corresponding to the feature data according to the co-occurrence data of the feature data and the number of APPs in the APP feature data set;
performing, according to the weighting parameters corresponding to the APP's feature data, a weighted calculation on the usage frequency data of the APP to obtain proportion data of the APP;
and adding the proportion data to the label information to obtain an updated label.
In one embodiment, before the searching for the associated database of the APP and the tag information, obtaining the tag information corresponding to the APP, and adding the co-occurrence data and the usage frequency data of the APP to the tag information to obtain the updated tag, the method further includes:
obtaining label information carried by sample users, classifying the sample users according to the label information, and obtaining a plurality of user classification sets;
obtaining APP data information corresponding to each sample user in the user classification set, determining common APPs of the sample users according to the APP data information, and establishing an association relation between the common APPs and the label information;
and updating the tag information and the common APPs into an initial database according to the association relationship to obtain the association database of APPs and tag information.
In one embodiment, the obtaining APP data information of a user to be analyzed carried by the portrait analysis request, and reading feature data of each APP and usage frequency data of each APP in the APP data information includes:
acquiring APP data information of a user to be analyzed carried by a portrait analysis request;
reading the usage frequency data in the APP data information, sorting the APPs by the value of the usage frequency data, and generating an APP list;
screening out the APPs in the APP data information whose usage frequency data does not meet the preset usage frequency requirement, and updating the APP list according to the screening result;
and reading the characteristic data of each APP and the use frequency data of each APP in the updated APP list.
In one embodiment, after reading the feature data of each APP and the usage frequency data of each APP in the updated APP list, the method further includes:
according to the feature data of each APP in the updated APP list, constructing a similar APP set containing the same feature data;
adding the co-occurrence data and the use frequency data of the APP to the tag information to obtain an updated tag comprises:
performing a first sorting of the similar APP sets according to the co-occurrence data;
performing a second sorting of the APPs within each similar APP set according to their usage frequency data, and selecting the APPs whose usage frequency data exceeds the preset threshold;
constructing a target APP set from the screening result of each similar APP set;
and adding the co-occurrence data and the usage frequency data of the feature data corresponding to each target APP in the target APP set to the tag information, to obtain the updated tag corresponding to each target APP.
In one embodiment, the generating a user representation of the user to be analyzed according to the update tag includes:
assigning a weight proportion to each updated tag according to the number of updated tags, and de-duplicating the updated tags that have identical tag content;
updating the proportion data of the de-duplicated tags according to the weight distribution result to obtain secondary updated tags;
and generating the user portrait of the user to be analyzed according to the secondary updated tags.
A user representation generation apparatus, the apparatus comprising:
the data reading module is used for acquiring APP data information of a user to be analyzed carried by a portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information;
the co-occurrence data acquisition module is used for traversing the APP feature data sets formed by the feature data to acquire co-occurrence data of the feature data of each APP in the APP feature data sets;
the tag updating module is used for searching an associated database of the APP and the tag information, acquiring tag information corresponding to the APP, and adding co-occurrence data and use frequency data of the APP to the tag information to obtain an updated tag;
and the user portrait generation module is used for generating the user portrait of the user to be analyzed according to the updating label.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
acquiring APP data information of a user to be analyzed carried by a portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information;
traversing APP characteristic data sets formed by the characteristic data to obtain co-occurrence data of the characteristic data of each APP in the APP characteristic data sets;
searching an association database of the APP and the tag information, acquiring the tag information corresponding to the APP, and adding co-occurrence data and use frequency data of the APP to the tag information to obtain an updated tag;
and generating the user portrait of the user to be analyzed according to the updating label.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring APP data information of a user to be analyzed carried by a portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information;
traversing APP characteristic data sets formed by the characteristic data to obtain co-occurrence data of the characteristic data of each APP in the APP characteristic data sets;
searching an association database of the APP and the tag information, acquiring the tag information corresponding to the APP, and adding co-occurrence data and use frequency data of the APP to the tag information to obtain an updated tag;
and generating the user portrait of the user to be analyzed according to the updating label.
With the user portrait generation method, apparatus, computer device, and storage medium above, the feature data and usage frequency data of each APP are read from the APP data information of the user to be analyzed carried by the portrait analysis request; the APP feature data set formed by the feature data is traversed to obtain the co-occurrence data of each APP's feature data within the set; and the association database of APPs and tag information is searched to obtain the tag information corresponding to each APP. Tags can better characterize the user's application preferences, so even when the user to be analyzed has only a few APPs, the corresponding tag information can still be obtained from the user's APP data information. Through co-occurrence analysis, each APP's co-occurrence data and usage frequency data are added to the tag information to obtain updated tags that characterize the user's feature information more accurately, from which the user portrait of the user to be analyzed is generated, improving the accuracy of the generated user portrait.
Drawings
FIG. 1 is a diagram illustrating an exemplary user representation generation method;
FIG. 2 is a flow diagram that illustrates a method for user representation generation, according to one embodiment;
FIG. 3 is a schematic flow chart diagram illustrating a user representation generation method in accordance with another embodiment;
FIG. 4 is a schematic flow chart diagram illustrating a user representation generation method in accordance with yet another embodiment;
FIG. 5 is a schematic flow chart diagram illustrating a user representation generation method in accordance with yet another embodiment;
FIG. 6 is a schematic flow chart diagram illustrating a user representation generation method in accordance with yet another embodiment;
FIG. 7 is a block diagram of an embodiment of a user representation generation apparatus;
FIG. 8 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The user portrait generation method provided by the application can be applied to the environment shown in FIG. 1, in which the terminal 102 and the server 104 communicate via a network. The server 104 obtains the APP data information of the user to be analyzed carried in the portrait analysis request sent by the terminal 102, reads the feature data and usage frequency data of each APP in the APP data information, traverses the APP feature data set formed by the feature data to obtain the co-occurrence data of each APP's feature data within the set, searches the association database of APPs and tag information to obtain the tag information corresponding to each APP, adds each APP's co-occurrence data and usage frequency data to the tag information to obtain updated tags, generates the user portrait of the user to be analyzed from the updated tags, and pushes the user portrait to the terminal 102. The terminal 102 may be, but is not limited to, a personal computer, notebook computer, smartphone, tablet computer, or portable wearable device, and the server 104 may be implemented as an independent server or as a server cluster.
In one embodiment, as shown in FIG. 2, a user representation generation method is provided, which is exemplified by the application of the method to the server in FIG. 1, and comprises the following steps:
Step S100, obtaining the APP data information of the user to be analyzed carried by the portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information.
The user to be analyzed is the object of data analysis: from the user's existing data information, such as the APP data information generated when the user uses a device such as a mobile phone, feature tags characterizing the user are obtained by analysis. The APP data information may refer to the data generated by the application programs the user has installed on the mobile device, including which APPs are installed and how each APP is used. The feature data of an APP includes the operation data generated when the user uses the APP, data type changes, data storage locations and types, and so on; it may be a single item or a combination of items, and can be obtained by collecting the user's behavior log data. For example, if the user's mobile device has the photo-editing APPs "Meitu" and "×P Tu" installed, their feature data may include "open camera", "open gallery/album", "edit picture", "store picture", and the like. The usage frequency data records the user's triggering of an APP: each time the user uses the APP, a usage record is generated in the APP data information, and the more often the user triggers the APP, the higher the usage frequency reflected in the corresponding usage records.
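The description of step S100 above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the field names "features" and "usage_count" and the APP names are assumptions for demonstration.

```python
# Hypothetical shape of the APP data information described above.
# Field names ("features", "usage_count") are illustrative assumptions.
app_data_info = {
    "PhotoEditorA": {"features": ["open camera", "open gallery", "edit picture", "store picture"],
                     "usage_count": 20},
    "PhotoEditorB": {"features": ["open camera", "edit picture", "store picture"],
                     "usage_count": 30},
}

def read_features_and_frequency(app_data_info):
    """Split the APP data information into per-APP feature data and
    per-APP usage frequency data, as in step S100."""
    features = {app: info["features"] for app, info in app_data_info.items()}
    frequency = {app: info["usage_count"] for app, info in app_data_info.items()}
    return features, frequency
```

In this sketch the two photo-editing APPs share most of their feature data, which is exactly what the co-occurrence analysis of step S200 later exploits.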
Step S200, traversing the APP feature data set formed by the feature data to obtain the co-occurrence data of each APP's feature data within the APP feature data set.
The APP feature data set refers to the database formed by the feature data of all APPs installed by the user: when the server reads the feature data of each APP in the APP data information, the feature data of all APPs together constitute the APP feature data set.
The co-occurrence data of feature data across different APPs is the number of APPs, among all APPs installed by the user, that share the same feature data. For example, if the user has installed the "Meitu" and "×P Tu" APPs and the two APPs have the same feature data, the co-occurrence data of that feature data is 2. The larger the value of the co-occurrence data, the more the user uses APPs of the same type.
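The co-occurrence count defined above — how many installed APPs exhibit a given piece of feature data — can be computed with a simple counter. A minimal sketch, with the demo data an assumption:

```python
from collections import Counter

def co_occurrence(features):
    """For each piece of feature data, count how many of the user's APPs
    exhibit it (the co-occurrence data of step S200)."""
    counts = Counter()
    for feats in features.values():
        for f in set(feats):          # de-duplicate within one APP
            counts[f] += 1
    return counts

# Two photo-editing APPs sharing "edit picture" -> co-occurrence data 2
demo = {"AppA": ["open camera", "edit picture"],
        "AppB": ["edit picture", "store picture"]}
```

A higher count for a feature means the user installs more APPs of that type, which is the signal the later weighting steps build on.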
Step S400, searching the association database of APPs and tag information, obtaining the tag information corresponding to each APP, and adding each APP's co-occurrence data and usage frequency data to the tag information to obtain updated tags.
The association database of APPs and tag information is a database in which the association between representative feature tags and APPs has been obtained in advance by feature analysis of a large amount of sample data. The applications installed or used on a user's mobile terminal generally share certain commonalities: a female user typically installs or uses APPs such as "Meix" that record physiological-period data; a user in a pregnancy or childbearing stage typically installs or uses APPs such as "Pex", "Baby X", and "X Caretaker"; a user who likes selfies typically installs or uses APPs such as "Meitu" and "×P Tu"; and a user who likes games typically installs various game APPs. By analyzing the APP data information of sample users who share the same habit characteristics, the association between each APP and its tag information is obtained, and the association database of APPs and tag information is constructed. Searching the association database yields the tag information corresponding to each APP; each piece of tag information in the database is initial tag information containing the user's feature information, and adding each APP's co-occurrence data and usage frequency data to the tag information produces an updated tag carrying the co-occurrence and usage frequency data.
And step S500, generating the user portrait of the user to be analyzed according to the updated tags.
A user portrait is a virtual representation of a real user. It is grounded in reality but does not correspond to a specific person: users are divided into different types according to differences in their target behaviors and viewpoints, the types are quickly grouped together, and the newly derived types are refined into a typed user portrait. With updated tags carrying co-occurrence data and usage frequency data, a user portrait describing the user's characteristics can be formed from the tags' content together with their co-occurrence and usage frequency data; the larger the proportion data corresponding to the co-occurrence and usage frequency data, the greater its influence on the description of the user's characteristics.
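Step S500 as described above amounts to ranking the updated tags by their proportion data, with larger values influencing the portrait more. A minimal sketch under that assumption:

```python
def generate_portrait(updated_tags):
    """updated_tags maps tag content -> proportion data. Tags with larger
    proportion data describe the user's characteristics more strongly,
    so the portrait is the tags ranked by that value (step S500)."""
    return [tag for tag, _ in sorted(updated_tags.items(),
                                     key=lambda kv: kv[1], reverse=True)]
```

For example, `generate_portrait({"student": 0.1, "gamer": 0.5, "selfie": 0.3})` ranks "gamer" first, reflecting the dominant preference.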
In this user portrait generation method, the feature data and usage frequency data of each APP are read from the APP data information of the user to be analyzed carried by the portrait analysis request; the APP feature data set formed by the feature data is traversed to obtain the co-occurrence data of each APP's feature data within the set; and the association database of APPs and tag information is searched to obtain the tag information corresponding to each APP. Tags better characterize the user's application preferences, so even when the user to be analyzed has only a few APPs, the corresponding tag information can still be obtained from the user's APP data information. Through co-occurrence analysis, each APP's co-occurrence data and usage frequency data are added to the tag information to obtain updated tags that characterize the user's feature information more accurately, from which the user portrait of the user to be analyzed is generated, improving the accuracy of the generated user portrait.
In one embodiment, as shown in fig. 3, step S400 of adding the co-occurrence data and the usage frequency data of each APP to the tag information to obtain the updated tag includes:
step S420, calculating a weighting parameter corresponding to the feature data according to the co-occurrence data of the feature data and the number of APPs in the APP feature data set;
step S440, performing a weighted calculation on the usage frequency data of the APP according to the weighting parameters corresponding to the APP's feature data, to obtain proportion data of the APP;
step S460, adding the proportion data to the tag information to obtain the updated tag.
Specifically, taking feature data A as an example: if the APPs containing feature data A are A1, A2, and A3, the co-occurrence data of feature data A is 3. Suppose the user's APP feature data also includes B, C, and D, with co-occurrence data of 3, 4, 1, and 2 for A, B, C, and D respectively, so the percentages corresponding to A, B, C, and D are 0.3, 0.4, 0.1, and 0.2. From the APP data information, the accumulated usage frequency of all APPs is 1000, with A1 used 20 times, A2 30 times, and A3 50 times; the proportions of A1, A2, and A3 in the total usage frequency are therefore 0.02, 0.03, and 0.05, and the calculated weight data corresponding to A1, A2, and A3 are 0.06, 0.09, and 0.15 respectively.
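The worked numbers above can be reproduced in a few lines. The exact combination rule is not spelled out in the text, so the formula below is an assumption chosen because it matches the stated results (0.06, 0.09, 0.15): weight = co-occurrence count × (APP usage frequency / total usage frequency).

```python
# Worked example from the text. ASSUMPTION: the weighted calculation is
# co-occurrence count * (usage frequency / total usage frequency),
# which reproduces the stated weight data 0.06, 0.09, 0.15.
total_usage = 1000
co_occurrence_A = 3                       # feature A appears in A1, A2, A3
usage = {"A1": 20, "A2": 30, "A3": 50}    # usage frequency of each APP

proportion = {app: co_occurrence_A * freq / total_usage
              for app, freq in usage.items()}
```

Under this reading, `proportion` matches the weight data in the example, and these values are what step S460 attaches to the tag information.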
In one embodiment, as shown in FIG. 4, before step S400 (searching the association database of APPs and tag information, obtaining the tag information corresponding to each APP, and adding each APP's co-occurrence data and usage frequency data to the tag information to obtain updated tags), the method further includes:
step S320, obtaining the label information carried by the sample user, classifying the sample user according to the label information, and obtaining a plurality of user classification sets.
Step S340, obtaining APP data information corresponding to each sample user in the user classification set, determining common APP of each sample user according to the APP data information, and establishing an association relationship between the common APP and the label information.
And S360, updating the tag information and the common APPs into an initial database according to the association relationship to obtain the association database of APPs and tag information.
A sample user is a user whose identity information and corresponding APP data information are already determined; by analyzing many sample users, the rules relating each application to the characteristics of its user population are obtained, so that a user portrait can later be derived from APP data information alone. The tag information of a sample refers to information with group-level common characteristics extracted from the user's identity information, such as gender, age group, education background, and industry. Classification is the process of grouping sample users carrying the same sample tags into a classification set: all sample users in one classification set share the same sample tag information and therefore have a certain similarity. For example, a user in a pregnancy or childbearing stage generally installs or uses APPs such as "Bei X", "Baby X", and "Pregnancy XX", and a user who likes selfies generally installs or uses APPs such as "Meitu" and "×P Tu"; the installation and usage data of APPs of the same category reflect the users' habit characteristics. If the sample tag information of a user classification set is "student", the corresponding APP data information may include installed or used applications such as "Homework X" and "XX Search Topic", which are marked as common APPs; an association is then established between the "student" tag information and the student-related common APPs, and the tag information and common APPs are updated into the initial database, yielding the association database of APPs and tag information.
Specifically, obtaining the tag information of a sample user may include: extracting user feature data from the sample user's basic information, and marking the extracted feature data as the sample user's tag information. The user's basic information may include all the information already held about the user, such as basic attributes (age, gender, region, constellation, etc.), social characteristics (family structure, marital status, etc.), interest characteristics (hobbies, interaction content, etc.), and other behavior, consumption, and purchasing-power characteristics required in different scenarios. The user feature data may include information with group-level common characteristics extracted from the basic information, including gender, age group, education background, type of work, and marital and child status. Tag information is data simplified from the feature data; for example, one sample user's tag information may include "male", "bachelor's degree", "owns a car", "married", "has a child", "plays games", and so on.
The process of classifying sample users may include: obtaining the tag information carried by the sample users and generating several tag groups, where a tag group may comprise a single tag, several similar tags, or a set of several different tags; then, according to the tags contained in each tag group, traversing the sample users' tag information to obtain the sample users corresponding to each tag group, thereby obtaining several classification sets. For example, a tag group may be "woman"; a combination of several similar tags such as "mom" and "pregnant woman"; or a combination of several different tags such as "plays games", "programmer", and "20-30 years old".
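The classification process just described can be sketched as subset matching: a sample user joins a classification set when it carries every tag in the tag group. The membership rule and the demo tags are assumptions for illustration.

```python
def classify_users(sample_users, tag_groups):
    """sample_users: user id -> set of tags carried by that sample user.
    tag_groups:   group name -> set of tags defining the group.
    ASSUMPTION: a user enters a classification set when it carries
    every tag in the group (subset test)."""
    return {group: sorted(u for u, tags in sample_users.items() if wanted <= tags)
            for group, wanted in tag_groups.items()}

# Illustrative sample users and tag groups
users = {"u1": {"female", "mom"}, "u2": {"male", "gamer"}, "u3": {"female"}}
groups = {"mothers": {"female", "mom"}, "women": {"female"}}
```

Note that classification sets may overlap (u1 is in both "mothers" and "women"), which matches the idea of tag groups built from single tags as well as combinations.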
In one embodiment, as shown in FIG. 5, step S100, acquiring APP data information of a user to be analyzed carried by the portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information, includes:
step S110, APP data information of a user to be analyzed carried by the portrait analysis request is obtained.
And step S120, reading the usage frequency data in the APP data information, sorting the APPs by the value of the usage frequency data, and generating an APP list.
Step S130, screening out the APPs in the APP data information whose usage frequency data does not meet the preset usage frequency requirement, and updating the APP list according to the screening result.
Step S140, reading the feature data of each APP and the frequency data of each APP in the updated APP list.
The APP list is the result of counting and sorting the APPs. The APP data information contains every usage record of each APP; the usage frequency data of each APP is obtained by counting these records, and the APPs are sorted by the value of the usage frequency data to generate the APP list. Specifically, the APPs may be sorted directly in descending order of the values, and APPs containing the same type of feature data may additionally be sorted according to that feature data. Screening out the APPs whose usage frequency data does not meet the preset usage frequency requirement is a data-cleaning step that removes interference data, for example, records where the APP failed to open (e.g., because it crashed), or where the user tapped into the APP but performed no operation before closing it in the background.
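Steps S120-S130 — sorting by usage frequency and dropping APPs that fail the preset requirement — can be sketched as follows; the threshold semantics (minimum count to keep) are an assumption.

```python
def build_app_list(usage_frequency, min_usage):
    """Sort APPs in descending order of usage frequency (step S120), then
    drop the ones whose frequency fails the preset requirement, i.e. the
    data-cleaning step S130. `min_usage` is an assumed threshold form."""
    ranked = sorted(usage_frequency, key=usage_frequency.get, reverse=True)
    return [app for app in ranked if usage_frequency[app] >= min_usage]
```

For example, with counts `{"a": 1, "b": 10, "c": 5}` and a requirement of 5 uses, the updated list keeps "b" then "c" and discards the noise entry "a".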
In one embodiment, step S140, after reading the feature data of each APP and the usage frequency data of each APP in the updated APP list, further includes:
and constructing a similar APP set containing the same characteristic data according to the characteristic data of each APP in the updated APP list.
Adding the co-occurrence data and the usage frequency data of each APP to the tag information to obtain the updated tag includes:
performing a primary sorting of the similar APP sets according to the co-occurrence data;
performing a secondary sorting of the APPs within each similar APP set according to the usage frequency data of each APP in the set, and selecting the APPs whose usage frequency data exceeds the preset threshold;
constructing a target APP set according to the selection result of each similar APP set; and
adding the co-occurrence data and the usage frequency data of the feature data corresponding to each target APP in the target APP set to the tag information, to obtain the updated tag corresponding to each target APP.
APPs containing the same feature data have higher similarity and can characterize the same user feature. Based on the APP list updated after data cleaning, similar APP sets are constructed according to the feature data of the APPs, so that highly similar APPs fall into the same set. Co-occurrence statistics are computed for each similar APP set, and a primary sorting of the sets determines the user's degree of preference for each category of APP; a secondary sorting is then performed on the APPs within each set. According to the results of the two sortings, the user's preference for APP categories and for the different APPs within each category are made explicit, and the target APP set is constructed. The co-occurrence data and usage frequency data of the feature data corresponding to each target APP in the target APP set are added to the tag information, obtaining the updated tag corresponding to each target APP.
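A minimal sketch of this two-stage sorting, under the assumptions that co-occurrence is tracked per shared feature and that the screening retains APPs above the threshold (all names are illustrative):

```python
def build_target_app_set(apps, feature_of, co_occurrence, freq, threshold):
    """Group APPs sharing the same feature data, sort the groups by the
    feature's co-occurrence data (primary sorting), sort APPs inside each
    group by usage frequency (secondary sorting), and keep only the APPs
    whose usage frequency exceeds the preset threshold."""
    similar_sets = {}
    for app in apps:
        similar_sets.setdefault(feature_of[app], []).append(app)
    target = []
    # primary sorting: similar-APP sets by co-occurrence, descending
    for feat in sorted(similar_sets, key=co_occurrence.get, reverse=True):
        # secondary sorting plus screening within each similar-APP set
        kept = sorted((a for a in similar_sets[feat] if freq[a] > threshold),
                      key=freq.get, reverse=True)
        target.extend(kept)
    return target

target = build_target_app_set(
    apps=["a", "b", "c"],
    feature_of={"a": "game", "b": "game", "c": "news"},
    co_occurrence={"game": 2, "news": 1},
    freq={"a": 5, "b": 1, "c": 3},
    threshold=2,
)
# target == ["a", "c"]: "b" falls below the usage frequency threshold
```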
In one embodiment, as shown in fig. 6, the step S500 of generating a user representation of a user to be analyzed according to tag information carrying weight data includes:
Step S520, performing weight proportion distribution on each updated tag according to the number of updated tags, and performing tag deduplication on the updated tags with the same tag content.
Step S540, updating the proportion data of the deduplicated updated tags according to the weight proportion distribution result, to obtain secondary updated tags.
Step S560, generating the user portrait of the user to be analyzed according to the secondary updated tags.
Weight proportion distribution is the process of distributing proportions according to the number of effective APPs remaining after filtering on the user's mobile device; if there are 20 effective APPs on the device, the weight proportion corresponding to each APP is 1/20. Tag deduplication is the process of removing identical tags. The proportion data update superposes weight data according to the number of repeated tags and the corresponding weight proportions: for example, if 5 APPs carry the tag information "game", the proportion data of that tag is the superposition of the proportion data of the five APPs fused with the weight proportion distribution result. Performing this proportion data update on the deduplicated tag information yields the secondary updated tags.
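The 1/20 example above generalizes to the following sketch (the superposition rule is as described: each of N effective APPs contributes a weight proportion of 1/N, and repeated tags accumulate; the function name is an assumption):

```python
from collections import Counter

def secondary_update_tags(tags_per_app, n_effective_apps):
    """tags_per_app: one tag list per effective APP. Each APP carries the
    weight proportion 1 / n_effective_apps; deduplication merges identical
    tags and superposes their weights."""
    share = 1.0 / n_effective_apps
    weights = Counter()
    for tags in tags_per_app:
        for tag in tags:
            weights[tag] += share  # superposition across repeated tags
    return dict(weights)

weights = secondary_update_tags(
    [["game"], ["game"], ["game"], ["game"], ["game"], ["news"]], 20)
# weights["game"] is approximately 0.25 (5/20); weights["news"] about 0.05
```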
It should be understood that, although the various steps in the flow charts of figs. 2-6 are shown in the order indicated by the arrows, the steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in figs. 2-6 may include multiple sub-steps or stages that are not necessarily completed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or in alternation with other steps or with sub-steps or stages of other steps.
In one embodiment, as shown in FIG. 7, a user representation generation apparatus is provided, comprising:
and the data reading module 100 is configured to obtain APP data information of a user to be analyzed, which is carried by the portrait analysis request, and read feature data of each APP and usage frequency data of each APP in the APP data information.
A co-occurrence data obtaining module 200, configured to traverse the APP feature data sets formed by the feature data, and obtain co-occurrence data of the feature data of each APP in the APP feature data sets.
And the tag updating module 300 is configured to search an association database of the APPs and the tag information, obtain tag information corresponding to the APPs, add co-occurrence data and usage frequency data of each APP to the tag information, and obtain an updated tag.
And a user representation generation module 400, configured to generate a user representation of the user to be analyzed according to the updated tag.
In one embodiment, the tag update module includes:
and the weighting parameter calculation unit is used for calculating the weighting parameters corresponding to the characteristic data according to the co-occurrence data of the characteristic data and the APP quantity of the APP characteristic data set.
And the proportion data calculation unit is used for performing weighting calculation on the use frequency data of the APP according to the weighting parameters corresponding to the characteristic data of the APP to obtain the proportion data of the APP.
And the label updating unit is used for adding the proportion data to the label information to obtain an updated label.
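The patent does not fix the exact formulas computed by the two units above; one plausible reading, shown purely as an assumption, takes the weighting parameter as co-occurrence count divided by APP-set size, and the proportion data as the weighting parameter times the usage frequency:

```python
def weighting_parameter(co_occurrence, n_apps_in_set):
    # assumed form: the feature's co-occurrence share within the APP set
    return co_occurrence / n_apps_in_set

def proportion_data(co_occurrence, n_apps_in_set, usage_frequency):
    # weight the APP's usage frequency by the feature's weighting parameter
    return weighting_parameter(co_occurrence, n_apps_in_set) * usage_frequency

# e.g. a feature co-occurring 3 times in a 10-APP set, APP used 2.0 times/day
p = proportion_data(3, 10, 2.0)   # approximately 0.6
```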
In one embodiment, the user portrait generation device further comprises an association database construction module, configured to: acquire the tag information carried by sample users; classify the sample users according to the tag information to obtain a plurality of user classification sets; acquire the APP data information corresponding to each sample user in each user classification set; determine the common APPs of the sample users according to the APP data information; establish an association relationship between the common APPs and the tag information; and update the tag information and the common APPs into an initial database according to the association relationship, to obtain the association database of APPs and tag information.
In one embodiment, the data reading module 100 is further configured to: obtain the APP data information of the user to be analyzed carried by the portrait analysis request; read the usage frequency data in the APP data information, sort the APPs according to the numerical value of the usage frequency data, and generate an APP list; filter out the APPs whose usage frequency data does not meet the preset usage frequency requirement and update the APP list according to the filtering result; and read the feature data and usage frequency data of each APP in the updated APP list.
In one embodiment, the data reading module 100 is further configured to construct a similar APP set including the same feature data according to the feature data of each APP in the updated APP list;
The tag updating module 300 is further configured to: perform a primary sorting of the similar APP sets according to the co-occurrence data; perform a secondary sorting of the APPs within each similar APP set according to their usage frequency data, and select the APPs whose usage frequency data exceeds the preset threshold; construct a target APP set according to the selection result of each similar APP set; and add the co-occurrence data and usage frequency data of the feature data corresponding to each target APP in the target APP set to the tag information, obtaining the updated tag corresponding to each target APP.
In one embodiment, the user portrait generation module 400 is further configured to: perform weight proportion distribution on each updated tag according to the number of updated tags; perform tag deduplication on updated tags with the same tag content; update the proportion data of the deduplicated tags according to the weight proportion distribution result to obtain secondary updated tags; and generate the user portrait of the user to be analyzed according to the secondary updated tags.
In the above user portrait generation device, based on the APP data information of the user to be analyzed carried by the portrait analysis request, the feature data and usage frequency data of each APP are read from the APP data information; the APP feature data set formed by the feature data is traversed to obtain the co-occurrence data of each APP's feature data within the set; and the association database of APPs and tag information is searched to obtain the tag information corresponding to each APP. The tags better represent the user's preferences in using applications, and corresponding tag information can be obtained from the APP data information even when the user to be analyzed has few APPs. By analyzing the co-occurrence data and adding the co-occurrence data and usage frequency data of each APP to the tag information, updated tags are obtained that represent the user's feature information more accurately, so that the resulting user portrait of the user to be analyzed is more accurate.
For specific limitations of the user portrait generation device, reference may be made to the limitations of the user portrait generation method above, which are not repeated here. The modules of the user portrait generation device may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in hardware in, or be independent of, the processor of the computer device, or may be stored in software form in the memory of the computer device, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 8. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is for storing user representation generation data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a user representation generation method.
Those skilled in the art will appreciate that the architecture shown in fig. 8 is merely a block diagram of part of the structure associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, there is provided a computer device comprising a memory storing a computer program and a processor implementing the following steps when the processor executes the computer program:
acquiring the APP data information of the user to be analyzed carried by the portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information;
traversing APP characteristic data sets formed by the characteristic data to obtain co-occurrence data of the characteristic data of each APP in the APP characteristic data sets;
searching an association database of the APPs and the tag information, acquiring the tag information corresponding to the APPs, and adding co-occurrence data and use frequency data of each APP to the tag information to obtain an updated tag;
and generating the user portrait of the user to be analyzed according to the updated tag.
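Taken together, the four steps above can be sketched end to end; the association database is modeled as a plain dict, and the tag-weight update rule (co-occurrence times frequency) is an assumed simplification rather than the claimed formula:

```python
from collections import Counter

def generate_user_portrait(app_data, association_db):
    """app_data: APP name -> (feature data, usage frequency data).
    association_db: APP name -> list of tag information strings."""
    # steps 1-2: form the APP feature data set and count co-occurrence
    feature_counts = Counter(feat for feat, _ in app_data.values())
    portrait = Counter()
    for app, (feat, freq) in app_data.items():
        co = feature_counts[feat]                 # co-occurrence data
        for tag in association_db.get(app, []):   # step 3: tag lookup
            portrait[tag] += co * freq            # updated tag (assumed rule)
    return dict(portrait)                         # step 4: the user portrait

portrait = generate_user_portrait(
    {"a": ("game", 2), "b": ("game", 1), "c": ("news", 3)},
    {"a": ["player"], "b": ["player"], "c": ["reader"]},
)
# portrait == {"player": 6, "reader": 3}
```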
In one embodiment, the processor, when executing the computer program, further performs the steps of:
calculating a weighting parameter corresponding to the characteristic data according to the co-occurrence data of the characteristic data and the number of APPs in the APP characteristic data set;
performing weighting calculation on the usage frequency data of the APP according to the weighting parameter corresponding to the feature data of the APP, to obtain the proportion data of the APP;
and adding the proportion data to the label information to obtain an updated label.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
obtaining label information carried by sample users, classifying the sample users according to the label information, and obtaining a plurality of user classification sets;
acquiring APP data information corresponding to each sample user in the user classification set, determining common APP of each sample user according to the APP data information, and establishing an association relation between the common APP and the label information;
and updating the tag information and the common APPs into an initial database according to the association relationship, to obtain the association database of APPs and tag information.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring the APP data information of the user to be analyzed carried by the portrait analysis request;
reading the usage frequency data in the APP data information, sorting the APPs according to the numerical value of the usage frequency data, and generating an APP list;
filtering out each APP whose usage frequency data in the APP data information does not meet the preset usage frequency requirement, and updating the APP list according to the filtering result;
and reading the characteristic data of each APP and the use frequency data of each APP in the updated APP list.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
constructing a similar APP set containing the same characteristic data according to the characteristic data of each APP in the updated APP list;
performing a primary sorting of the similar APP sets according to the co-occurrence data;
performing a secondary sorting of the APPs within each similar APP set according to the usage frequency data of each APP in the set, and selecting the APPs whose usage frequency data exceeds the preset threshold;
constructing a target APP set according to the screening result of each similar APP set;
adding co-occurrence data and use frequency data of the feature data corresponding to each target APP in the target APP set to the tag information, and obtaining the updated tag corresponding to each target APP.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
performing weight proportion distribution on each updated tag according to the number of updated tags, and performing tag deduplication on the updated tags with the same tag content;
updating the proportion data of the deduplicated updated tags according to the weight proportion distribution result, to obtain secondary updated tags;
and generating the user portrait of the user to be analyzed according to the secondary updated tags.
In the computer device implementing the above user portrait generation method, based on the APP data information of the user to be analyzed carried by the portrait analysis request, the feature data and usage frequency data of each APP are read from the APP data information; the APP feature data set formed by the feature data is traversed to obtain the co-occurrence data of each APP's feature data within the set; and the association database of APPs and tag information is searched to obtain the tag information corresponding to each APP. The tags better represent the user's preferences in using applications, and corresponding tag information can be obtained from the APP data information even when the user to be analyzed has few APPs. By analyzing the co-occurrence data and adding the co-occurrence data and usage frequency data of each APP to the tag information, updated tags are obtained that represent the user's feature information more accurately, so that the resulting user portrait of the user to be analyzed is more accurate.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring the APP data information of the user to be analyzed carried by the portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information;
traversing APP characteristic data sets formed by the characteristic data to obtain co-occurrence data of the characteristic data of each APP in the APP characteristic data sets;
searching an association database of the APPs and the tag information, acquiring the tag information corresponding to the APPs, and adding co-occurrence data and use frequency data of each APP to the tag information to obtain an updated tag;
and generating the user portrait of the user to be analyzed according to the updated tag.
In one embodiment, the computer program when executed by the processor further performs the steps of:
calculating a weighting parameter corresponding to the characteristic data according to the co-occurrence data of the characteristic data and the number of APPs in the APP characteristic data set;
performing weighting calculation on the usage frequency data of the APP according to the weighting parameter corresponding to the feature data of the APP, to obtain the proportion data of the APP;
and adding the proportion data to the label information to obtain an updated label.
In one embodiment, the computer program when executed by the processor further performs the steps of:
obtaining label information carried by sample users, classifying the sample users according to the label information, and obtaining a plurality of user classification sets;
acquiring APP data information corresponding to each sample user in the user classification set, determining common APP of each sample user according to the APP data information, and establishing an association relation between the common APP and the label information;
and updating the tag information and the common APPs into an initial database according to the association relationship, to obtain the association database of APPs and tag information.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring the APP data information of the user to be analyzed carried by the portrait analysis request;
reading the usage frequency data in the APP data information, sorting the APPs according to the numerical value of the usage frequency data, and generating an APP list;
filtering out each APP whose usage frequency data in the APP data information does not meet the preset usage frequency requirement, and updating the APP list according to the filtering result;
and reading the characteristic data of each APP and the use frequency data of each APP in the updated APP list.
In one embodiment, the computer program when executed by the processor further performs the steps of:
constructing a similar APP set containing the same characteristic data according to the characteristic data of each APP in the updated APP list;
performing a primary sorting of the similar APP sets according to the co-occurrence data;
performing a secondary sorting of the APPs within each similar APP set according to the usage frequency data of each APP in the set, and selecting the APPs whose usage frequency data exceeds the preset threshold;
constructing a target APP set according to the screening result of each similar APP set;
adding co-occurrence data and use frequency data of the feature data corresponding to each target APP in the target APP set to the tag information, and obtaining the updated tag corresponding to each target APP.
In one embodiment, the computer program when executed by the processor further performs the steps of:
performing weight proportion distribution on each updated tag according to the number of updated tags, and performing tag deduplication on the updated tags with the same tag content;
updating the proportion data of the deduplicated updated tags according to the weight proportion distribution result, to obtain secondary updated tags;
and generating the user portrait of the user to be analyzed according to the secondary updated tags.
In the computer-readable storage medium implementing the above user portrait generation method, based on the APP data information of the user to be analyzed carried by the portrait analysis request, the feature data and usage frequency data of each APP are read from the APP data information; the APP feature data set formed by the feature data is traversed to obtain the co-occurrence data of each APP's feature data within the set; and the association database of APPs and tag information is searched to obtain the tag information corresponding to each APP. The tags better represent the user's preferences in using applications, and corresponding tag information can be obtained from the APP data information even when the user to be analyzed has few APPs. By analyzing the co-occurrence data and adding the co-occurrence data and usage frequency data of each APP to the tag information, updated tags are obtained that represent the user's feature information more accurately, so that the resulting user portrait of the user to be analyzed is more accurate.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above embodiments express only several implementations of the present application, and while their description is relatively specific and detailed, it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A user representation generation method, the method comprising:
acquiring the APP data information of the user to be analyzed carried by a portrait analysis request, and reading the feature data of each APP and the usage frequency data of each APP in the APP data information;
traversing APP characteristic data sets formed by the characteristic data to obtain co-occurrence data of the characteristic data of each APP in the APP characteristic data sets;
searching an association database of the APP and the tag information, acquiring the tag information corresponding to the APP, and adding co-occurrence data and use frequency data of the APP to the tag information to obtain an updated tag;
and generating the user portrait of the user to be analyzed according to the updated tag.
2. The method of claim 1, wherein adding the co-occurrence data and the usage frequency data of the APP to the tag information to obtain an updated tag comprises:
calculating a weighting parameter corresponding to the feature data according to the co-occurrence data of the feature data and the number of APPs in the APP feature data set;
performing weighting calculation on the usage frequency data of the APP according to the weighting parameter corresponding to the feature data of the APP, to obtain the proportion data of the APP;
and adding the proportion data to the label information to obtain an updated label.
3. The method of claim 1, wherein before the searching for the associated database of the APP and the tag information, obtaining the tag information corresponding to the APP, and adding the co-occurrence data and the usage frequency data of the APP to the tag information to obtain the updated tag, the method further comprises:
obtaining label information carried by sample users, classifying the sample users according to the label information, and obtaining a plurality of user classification sets;
obtaining APP data information corresponding to each sample user in the user classification set, determining common APPs of the sample users according to the APP data information, and establishing an association relation between the common APPs and the label information;
and updating the tag information and the common APPs into an initial database according to the association relationship, to obtain the association database of the APPs and the tag information.
4. The method of claim 1, wherein the obtaining APP data information of a user to be analyzed carried by the image analysis request, and reading feature data of each APP and usage frequency data of each APP in the APP data information comprises:
acquiring the APP data information of the user to be analyzed carried by the portrait analysis request;
reading the usage frequency data in the APP data information, and sorting the APPs according to the numerical value of the usage frequency data to generate an APP list;
filtering out each APP whose usage frequency data in the APP data information does not meet the preset usage frequency requirement, and updating the APP list according to the filtering result;
and reading the characteristic data of each APP and the use frequency data of each APP in the updated APP list.
5. The method of claim 4, wherein after reading the feature data of each APP and the frequency of use data of each APP in the updated APP list, further comprising:
according to the feature data of each APP in the updated APP list, constructing a similar APP set containing the same feature data;
adding the co-occurrence data and the use frequency data of the APP to the tag information to obtain an updated tag comprises:
performing a primary sorting of the similar APP sets according to the co-occurrence data;
performing a secondary sorting of the APPs within each similar APP set according to the usage frequency data of each APP in the set, and selecting the APPs whose usage frequency data exceeds the preset threshold;
constructing a target APP set according to the screening result of each similar APP set;
adding co-occurrence data and use frequency data of the feature data corresponding to each target APP in the target APP set to the tag information, and obtaining an update tag corresponding to each target APP.
6. The method of claim 5, wherein generating the user representation of the user to be analyzed according to the update tag comprises:
performing weight proportion distribution on each updated tag according to the number of updated tags, and performing tag deduplication on the updated tags with the same tag content;
updating the proportion data of the deduplicated updated tags according to the weight proportion distribution result, to obtain secondary updated tags;
and generating the user portrait of the user to be analyzed according to the secondary updated tags.
7. A user representation generation apparatus, the apparatus comprising:
the data reading module is used for acquiring APP data information of a user to be analyzed carried by the image analysis request, and reading characteristic data of each APP and use frequency data of each APP in the APP data information;
the co-occurrence data acquisition module is used for traversing the APP feature data sets formed by the feature data to acquire co-occurrence data of the feature data of each APP in the APP feature data sets;
the tag updating module is used for searching an associated database of the APP and the tag information, acquiring tag information corresponding to the APP, and adding co-occurrence data and use frequency data of the APP to the tag information to obtain an updated tag;
and the user portrait generation module is used for generating the user portrait of the user to be analyzed according to the updating label.
8. The user representation generation apparatus of claim 7, wherein said tag update module comprises:
the weighting parameter calculation unit is used for calculating a weighting parameter corresponding to the feature data according to the co-occurrence data of the feature data and the number of APPs in the APP feature data set;
the proportion data calculation unit is used for performing weighting calculation on the use frequency data of the APP according to the weighting parameters corresponding to the characteristic data of the APP to obtain the proportion data of the APP;
and the label updating unit is used for adding the proportion data to the label information to obtain an updated label.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 6.
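The weighting units recited in claim 8 can be sketched as follows. The claims do not disclose the actual formulas, so the ratio of a feature's co-occurrence count to the number of APPs in the feature data set (for the weighting parameter) and a sum-weighted use frequency (for the proportion data) are assumptions for illustration only; every identifier below is hypothetical.

```python
# Illustrative sketch of the claim-8 units under stated assumptions:
# weighting parameter = co-occurrence count / number of APPs in the set,
# proportion data    = use frequency x sum of feature weights.

def weighting_parameter(co_occurrence: int, num_apps: int) -> float:
    """Weighting parameter for one piece of feature data (assumed ratio)."""
    return co_occurrence / num_apps

def proportion_data(use_frequency: float, weights: list[float]) -> float:
    """Weight the APP's use-frequency data by its feature weights (assumed sum)."""
    return use_frequency * sum(weights)

def updated_tag(tag_info: dict, proportion: float) -> dict:
    """Add the proportion data to the tag information, giving the updated tag."""
    return {**tag_info, "proportion": proportion}

# An APP with two features that co-occur in 3 and 1 of 4 APPs respectively:
weights = [weighting_parameter(3, 4), weighting_parameter(1, 4)]
tag = updated_tag({"tag": "finance"}, proportion_data(10.0, weights))
```

A feature shared by many APPs thus contributes a larger weight, so the use-frequency data of APPs with widely co-occurring features carries more proportion into the tag, which matches the intent of the co-occurrence-based weighting described in the claims.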
CN201910747914.7A 2019-08-14 2019-08-14 User portrait generation method and device, computer equipment and storage medium Pending CN110674144A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910747914.7A CN110674144A (en) 2019-08-14 2019-08-14 User portrait generation method and device, computer equipment and storage medium
PCT/CN2020/106222 WO2021027595A1 (en) 2019-08-14 2020-07-31 User portrait generation method and apparatus, computer device, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910747914.7A CN110674144A (en) 2019-08-14 2019-08-14 User portrait generation method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN110674144A (en) 2020-01-10

Family

ID=69068573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910747914.7A Pending CN110674144A (en) 2019-08-14 2019-08-14 User portrait generation method and device, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN110674144A (en)
WO (1) WO2021027595A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150332373A1 (en) * 2012-12-14 2015-11-19 Baidu Online Network Technology (Beijing) Co., Ltd Method and system for pushing mobile application
CN106940705A (en) * 2016-12-20 2017-07-11 上海掌门科技有限公司 A kind of method and apparatus for being used to build user's portrait
CN110069702A (en) * 2019-03-15 2019-07-30 深圳壹账通智能科技有限公司 User behavior data analysis method, device, computer equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109558530A (en) * 2018-10-23 2019-04-02 深圳壹账通智能科技有限公司 User's portrait automatic generation method and system based on data processing
CN110674144A (en) * 2019-08-14 2020-01-10 深圳壹账通智能科技有限公司 User portrait generation method and device, computer equipment and storage medium

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021027595A1 (en) * 2019-08-14 2021-02-18 深圳壹账通智能科技有限公司 User portrait generation method and apparatus, computer device, and computer-readable storage medium
CN111405030A (en) * 2020-03-12 2020-07-10 腾讯科技(深圳)有限公司 Message pushing method and device, electronic equipment and storage medium
CN111405030B (en) * 2020-03-12 2021-08-10 腾讯科技(深圳)有限公司 Message pushing method and device, electronic equipment and storage medium
CN113821703B (en) * 2020-06-18 2023-12-08 广州汽车集团股份有限公司 Internet of vehicles user portrait generation method and system thereof
CN113821703A (en) * 2020-06-18 2021-12-21 广州汽车集团股份有限公司 Internet of vehicles user portrait generation method and system
CN111861545B (en) * 2020-06-22 2022-10-18 国家计算机网络与信息安全管理中心 User behavior portrait construction method, device, equipment and storage medium
CN111861545A (en) * 2020-06-22 2020-10-30 国家计算机网络与信息安全管理中心 User behavior portrait construction method, device, equipment and storage medium
CN111753026A (en) * 2020-06-28 2020-10-09 中国银行股份有限公司 User portrait generation system, method, device, equipment and medium
CN111753026B (en) * 2020-06-28 2023-09-12 中国银行股份有限公司 User portrait generation system, method, device, equipment and medium
CN111833676A (en) * 2020-08-05 2020-10-27 北京育宝科技有限公司 Interactive learning auxiliary method, device and system
CN112948526A (en) * 2021-02-01 2021-06-11 大箴(杭州)科技有限公司 User portrait generation method and device, electronic equipment and storage medium
CN112988774A (en) * 2021-03-23 2021-06-18 汪威 User information updating method based on big data acquisition and information server
CN112988774B (en) * 2021-03-23 2021-10-15 宝嘉德(上海)文化发展有限公司 User information updating method based on big data acquisition and information server
CN113298145A (en) * 2021-05-24 2021-08-24 中国邮政储蓄银行股份有限公司 Label filling method and device

Also Published As

Publication number Publication date
WO2021027595A1 (en) 2021-02-18

Similar Documents

Publication Publication Date Title
CN110674144A (en) User portrait generation method and device, computer equipment and storage medium
CN108874992B (en) Public opinion analysis method, system, computer equipment and storage medium
CN109543925B (en) Risk prediction method and device based on machine learning, computer equipment and storage medium
WO2022105129A1 (en) Content data recommendation method and apparatus, and computer device, and storage medium
CN104077723B (en) A kind of social networks commending system and method
CN111143178B (en) User behavior analysis method, device and equipment
CN111163072B (en) Method and device for determining characteristic value in machine learning model and electronic equipment
CN110544109A (en) user portrait generation method and device, computer equipment and storage medium
CN112035549B (en) Data mining method, device, computer equipment and storage medium
WO2019061664A1 (en) Electronic device, user's internet surfing data-based product recommendation method, and storage medium
CN111090807A (en) Knowledge graph-based user identification method and device
CN110880006A (en) User classification method and device, computer equipment and storage medium
CN112784168B (en) Information push model training method and device, information push method and device
CN114511085A (en) Entity attribute value identification method, apparatus, device, medium, and program product
CN105389714B (en) Method for identifying user characteristics from behavior data
CN113918738A (en) Multimedia resource recommendation method and device, electronic equipment and storage medium
CN115687810A (en) Webpage searching method and device and related equipment
CN111651666A (en) User theme recommendation method and device, computer equipment and storage medium
CN112685618A (en) User feature identification method and device, computing equipment and computer storage medium
CN113961811B (en) Event map-based conversation recommendation method, device, equipment and medium
CN107003930B (en) User information recording method and device and electronic equipment
CN109242690A (en) Financial product recommendation method, device, computer equipment and readable storage medium
CN114741540A (en) Multimedia sequence recommendation method, operation prediction model training method, device, equipment and storage medium
CN104376021A (en) File recommending system and method
CN114021739B (en) Business processing method, business processing model training device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200110