CN114491205A - User portrait generation method and device, electronic equipment and readable medium - Google Patents

User portrait generation method and device, electronic equipment and readable medium

Info

Publication number
CN114491205A
CN114491205A
Authority
CN
China
Prior art keywords
recruitment
behavior
attribute value
user
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111676606.3A
Other languages
Chinese (zh)
Inventor
吴建斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Wuba Ganji Information Technology Co ltd
Original Assignee
Beijing 58 Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 58 Information Technology Co Ltd filed Critical Beijing 58 Information Technology Co Ltd
Priority to CN202111676606.3A
Publication of CN114491205A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/951 - Indexing; Web crawling techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/953 - Querying, e.g. by the use of web search engines
    • G06F16/9535 - Search customisation based on user profiles and personalisation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/10 - Office automation; Time management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Marketing (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Economics (AREA)
  • Evolutionary Computation (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention provides a user portrait generation method and device, an electronic device, and a readable medium. The method includes: acquiring user behavior information capable of reflecting the job hunting intention of a target user; extracting a recruitment attribute value sequence corresponding to each type of behavior from the user behavior information, the recruitment attribute value sequence comprising a plurality of recruitment attribute values; generating, for each type of behavior, recruitment attribute value vectors for the recruitment attribute values, the recruitment attribute value vectors being used for reflecting the association relationship between the recruitment attribute values; for a target type behavior, determining a portrait label of the target type behavior according to the recruitment attribute value vectors corresponding to the target type behavior; and determining the portrait label of the target user according to the portrait labels of the target type behaviors. The embodiment of the invention can obtain more accurate portrait labels for the target user.

Description

User portrait generation method and device, electronic equipment and readable medium
Technical Field
The embodiment of the invention relates to the technical field of the Internet, and in particular to a user portrait generation method, a user portrait generation device, an electronic device, and a computer-readable medium.
Background
A user portrait is a series of business-related portrait labels that describe the characteristics of a user and are generated by statistics and algorithmic calculation from information provided by the user and from the user's operation records on the service platform. The core task of constructing a user portrait is therefore to set portrait labels for the user; a portrait label is a highly refined feature identifier obtained by analyzing such information.
In a recruitment recommendation scenario, portrait labels are needed in order to provide more accurate personalized recommendation results for a user. Portrait labels of single types of behaviors are usually calculated, and these portrait labels are then used to screen out, from a large number of candidate positions, the positions that best match the user and recommend them to the user, which shortens the user's invalid browsing time and improves user experience.
However, although conventional user portrait calculation schemes can calculate the portrait labels of single types of behaviors, they do not consider the association relationship among the recruitment attribute values within a type of behavior. As a result, recommendation based on such portrait labels cannot be associated with the user's other recruitment requirements, and accurate recommendation for the user based on the portrait labels cannot be achieved.
Disclosure of Invention
The embodiment of the invention provides a user portrait generation method and device, electronic equipment and a computer readable storage medium, and aims to solve the problem that a portrait label cannot reflect other recruitment requirements of a user.
The embodiment of the invention discloses a user portrait generation method, which comprises the following steps:
acquiring user behavior information capable of reflecting job hunting intention of a target user;
extracting a recruitment attribute value sequence corresponding to each type of behavior from the user behavior information; the recruitment attribute value sequence comprises a plurality of recruitment attribute values;
acquiring a recruitment attribute value vector of the recruitment attribute value of each type of behavior; the recruitment attribute value vector is used for reflecting the association relationship between the recruitment attribute values;
aiming at a target type behavior, determining a portrait label of the target type behavior according to the recruitment attribute value vector corresponding to the target type behavior;
and determining the portrait label of the target user according to the portrait label of the target type behavior so as to form the user portrait of the target user.
Optionally, the determining the portrait label of the target type behavior according to the recruitment attribute value vector corresponding to the target type behavior includes:
forming a recruitment attribute value vector sequence by the recruitment attribute value vectors of the target type behaviors;
summing the recruitment attribute value vector sequences of the target type behaviors to obtain recruitment attribute direction vectors of the target type behaviors;
acquiring a weight value of the target type behavior;
weighting the recruitment attribute direction vector of the target type behavior by adopting the weight value;
calculating a first cosine similarity between the recruitment attribute value vector of the target type behavior and the weighted recruitment attribute direction vector;
and taking the recruitment attribute values corresponding to the recruitment attribute value vectors whose first cosine similarities rank within a preset number of top positions as the portrait labels of the target type behavior.
Optionally, the obtaining the weight value of the target type behavior includes:
forming a recruitment attribute value vector sequence by the recruitment attribute value vectors of other types of behaviors;
calculating second cosine similarities between the recruitment attribute direction vector of the target type behavior and the recruitment attribute value vector sequences of the other types of behaviors;
and taking the average value of the second cosine similarities as the weight value of the target type behavior.
Optionally, the determining a portrait label of the target user according to the portrait label of the target type behavior includes:
acquiring the first cosine similarity of the portrait label of each target type behavior; the first cosine similarity of the portrait label of each target type behavior is the cosine similarity between the recruitment attribute value vector of the corresponding recruitment attribute value and the weighted recruitment attribute direction vector;
and taking the portrait labels of the target type behaviors whose first cosine similarities rank within a preset number of top positions as the portrait labels of the target user.
Optionally, a point burying (event tracking) program is set on the service platform, and the acquiring of the user behavior information capable of reflecting the job hunting intention of the target user includes:
acquiring the user behavior information reported by the point burying program in the service platform.
Optionally, the method further comprises:
acquiring basic portrait information of the target user; the basic portrait information comprises biological basic information reflecting the characteristics of the target user and job hunting intention information;
and constructing the user portrait of the target user by using the basic portrait information and/or the portrait label.
Optionally, the biological basic information includes at least gender, age, educational background, and work experience; the job hunting intention information at least comprises a desired industry, a job type, and a job salary; the type behavior at least comprises delivery, communication, and clicking.
Optionally, the obtaining a recruitment attribute value vector of the recruitment attribute value for each of the types of behaviors includes:
inputting the recruitment attribute values of each type of behavior into a pre-trained recruitment attribute value prediction model to obtain the recruitment attribute value vectors corresponding to the recruitment attribute values; the recruitment attribute value prediction model is obtained by training with a historical recruitment attribute value sequence of each type of behavior, and the historical recruitment attribute value sequence comprises a plurality of historical recruitment attribute values.
The embodiment of the invention also discloses a user portrait generating device, which comprises:
the user behavior information acquisition module is used for acquiring user behavior information capable of reflecting the job hunting intention of the target user;
the recruitment attribute value extraction module is used for extracting a recruitment attribute value sequence corresponding to each type of behavior from the user behavior information; the recruitment attribute value sequence comprises a plurality of recruitment attribute values;
a recruitment attribute value vector generation module for acquiring a recruitment attribute value vector of the recruitment attribute value of each of the types of behaviors; the recruitment attribute value vector is used for reflecting the association relationship between the recruitment attribute values;
the type behavior tag determining module is used for determining a portrait tag of the target type behavior according to the recruitment attribute value vector corresponding to the target type behavior aiming at the target type behavior;
and the user tag determining module is used for determining the portrait tag of the target user according to the portrait tag of the target type behavior.
Optionally, the type behavior tag determination module is configured to combine the recruitment attribute value vectors of the target type behavior into a recruitment attribute value vector sequence; sum the recruitment attribute value vector sequence of the target type behavior to obtain the recruitment attribute direction vector of the target type behavior; acquire the weight value of the target type behavior, the weight value of the target type behavior having an association relationship with the other types of behaviors; weight the recruitment attribute direction vector of the target type behavior with the weight value; calculate the first cosine similarity between each recruitment attribute value vector of the target type behavior and the weighted recruitment attribute direction vector; and take the recruitment attribute values corresponding to the recruitment attribute value vectors whose first cosine similarities rank within a preset number of top positions as the portrait labels of the target type behavior.
Optionally, the type behavior tag determination module is configured to combine the recruitment attribute value vectors of the other types of behaviors into recruitment attribute value vector sequences; calculate second cosine similarities between the recruitment attribute direction vector of the target type behavior and the recruitment attribute value vector sequences of the other types of behaviors; and take the average value of the second cosine similarities as the weight value of the target type behavior.
Optionally, the user tag determination module is configured to acquire the first cosine similarity of the portrait label of each target type behavior, the first cosine similarity of the portrait label of each target type behavior being the cosine similarity between the recruitment attribute value vector of the corresponding recruitment attribute value and the weighted recruitment attribute direction vector; and take the portrait labels of the target type behaviors whose first cosine similarities rank within a preset number of top positions as the portrait labels of the target user.
Optionally, a point burying program is arranged on the service platform, and the user behavior information obtaining module is configured to obtain the user behavior information reported by the point burying program in the service platform.
Optionally, the apparatus further comprises: the user portrait generating module is used for acquiring basic portrait information of the target user; the basic portrait information comprises biological basic information reflecting the characteristics of the target user and job hunting intention information; and constructing the user portrait of the target user by the base portrait information and/or the portrait label of the target user.
Optionally, the biological basic information may include at least gender, age, educational background, and work experience; the job hunting intention information at least comprises a desired industry, a job type, and a job salary; the type behavior at least comprises delivery, communication, and clicking.
Optionally, the recruitment attribute value vector generation module is configured to input the recruitment attribute values of the types of behaviors into a pre-trained recruitment attribute value prediction model to obtain a recruitment attribute vector corresponding to the recruitment attribute value; the recruitment attribute value prediction model is obtained by training by adopting a historical recruitment attribute value sequence of each type of behavior, and the historical recruitment attribute value sequence comprises a plurality of historical recruitment attribute values.
The embodiment of the invention also discloses an electronic device, which comprises a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the method according to the embodiment of the present invention when executing the program stored in the memory.
Also disclosed are one or more computer-readable media having instructions stored thereon, which, when executed by one or more processors, cause the processors to perform a method according to an embodiment of the invention.
Embodiments of the present invention also disclose a computer program product, which is stored in a storage medium and executed by at least one processor to implement the method according to the embodiments of the present invention.
The embodiment of the invention has the following advantages:
In the embodiment of the invention, user behavior information capable of reflecting the job hunting intention of a target user is acquired; a recruitment attribute value sequence corresponding to each type of behavior is extracted from the user behavior information; recruitment attribute value vectors reflecting the association relationship between the recruitment attribute values are generated according to the recruitment attribute value sequence; then, for a target type behavior, the portrait label of the target type behavior is determined according to the recruitment attribute value vectors corresponding to the target type behavior; and finally, the portrait label of the target user is determined according to the portrait labels of the target type behaviors, so as to form the user portrait of the target user. In the embodiment of the invention, when the portrait label corresponding to a single type of behavior is calculated, recruitment attribute value vectors reflecting the association relationship between the recruitment attribute values are generated, and the portrait label of that type of behavior is determined according to the recruitment attribute value vectors of the single type of behavior, so that the resulting portrait label can be associated with the user's other recruitment requirements and a more accurate portrait label of the target user can be obtained.
Drawings
FIG. 1 is a flow chart illustrating steps of a method for generating a user representation according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps for portrait tag determination for a target user provided in an embodiment of the present invention;
FIG. 3 is a block diagram of a user portrait apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
In the current user portrait generation scheme, user behavior information of a user on a service platform is recorded, the corresponding recruitment attribute values in the user behavior information are extracted, and the recruitment attribute values of each type of behavior are attenuated according to the corresponding event occurrence time so as to update the portrait labels according to the event occurrence time; a preset fixed weight value is then set for the portrait labels of each type of behavior, and the final portrait labels are obtained from the portrait labels of each type of behavior according to the preset fixed weight values.
The disadvantages of this user portrait generation scheme include: 1) when the portrait label of a single type of behavior is calculated, attenuation based on the event occurrence time is used, so that although timeliness is considered, the association relationship between the recruitment attribute values within that type of behavior is not; 2) when the final portrait label is determined by integrating the portrait labels of all types of behaviors, a preset fixed weight value is adopted and is not adjusted according to the correlation among the user's types of behaviors, which is not flexible enough, so that when recommendation is performed based on the portrait label, other related information cannot be further provided for the user; for example, in job hunting and recruitment, more related position information cannot be provided for the user.
In view of the above problems, embodiments of the present invention provide a user portrait generation method which, when calculating the portrait label of a single type of behavior, determines the portrait labels of the various types of behaviors by using recruitment attribute value vectors that reflect the association relationship between the recruitment attribute values, thereby taking the association between recruitment attribute values within a single type of behavior into account; in addition, for the recruitment attribute value vectors of a given type of behavior, a weight value capable of reflecting the association relationship between that type of behavior and the other types of behaviors is calculated, so that the portrait labels of the various types of behaviors are fused more flexibly and can be adjusted along with the association relationship between the types of behaviors.
Referring to fig. 1, a flowchart illustrating steps of a user portrait generation method provided in an embodiment of the present invention is shown, which may specifically include the following steps:
and 102, acquiring user behavior information capable of reflecting the job hunting intention of the target user.
The service platform may refer to an application program (APP). The target user refers to a user of the service platform with a job hunting intention, for example, a user who posts a resume on the service platform, a user who browses and clicks recruitment information on the service platform, or a user who communicates with the publisher of a recruitment position through the service platform. Specifically, the application programs may include life service application programs, which may provide the target user with functions such as job hunting, house renting, and used car trading.
In the embodiment of the invention, the user behavior information which is generated by the user in the process of using the recruitment function of the service platform and can reflect the job hunting intention of the user is obtained. Illustratively, when the user uses the recruitment function of the service platform, the generated user behavior information may include [ click behavior information of the target user on the recruitment position ], [ resume delivery behavior information on the recruitment position ], [ communication behavior information with the publisher of the recruitment position ], and the like.
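Purely as an illustration (the patent does not prescribe any data schema), such user behavior information could be represented as simple records; the class and field names below are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserBehaviorEvent:
    """One piece of user behavior information on the service platform (hypothetical schema)."""
    user_id: str
    behavior_type: str                       # type behavior, e.g. "click", "deliver", "communicate"
    recruitment_attributes: Dict[str, str] = field(default_factory=dict)

# Example user behavior information reflecting the job hunting intention of target user "u1"
events: List[UserBehaviorEvent] = [
    UserBehaviorEvent("u1", "click",
                      {"industry": "Internet", "position_type": "product manager"}),
    UserBehaviorEvent("u1", "deliver",
                      {"position_type": "product manager", "salary": "10K-20K"}),
    UserBehaviorEvent("u1", "communicate",
                      {"salary": "15K-20K"}),
]
```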
It should be noted that the user behavior information and other private information involved in the embodiments of the present invention are information authorized by the user or fully authorized by all parties.
Step 104, extracting recruitment attribute value sequences corresponding to various types of behaviors from the user behavior information; the sequence of recruitment attribute values includes a plurality of recruitment attribute values.
Wherein, the type behaviors at least comprise delivery, communication, click, and the like; each type of behavior may include one or more recruitment attributes, for example, the recruitment attributes may include at least [ recruitment industry ], [ job type ], [ job salary ], and so on; each recruitment attribute may include a plurality of recruitment attribute values, and the recruitment attribute values of the plurality of recruitment attributes may constitute a recruitment attribute value sequence. Illustratively, a recruitment attribute value sequence may be [ Internet, product manager, 10K-20K ].
In the embodiment of the invention, the recruitment attribute value sequence corresponding to the recruitment attributes of each type of behavior can be extracted from the user behavior information. For example, when the user behavior information is [ the target user clicks on the product manager position of xx Internet company ], a type behavior [ click ] and a recruitment attribute value sequence [ Internet, product manager ], which includes the recruitment attribute values [ Internet ] and [ product manager ], can be obtained based on the user behavior information.
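Continuing the hypothetical schema above, step 104 could be sketched as grouping the recruitment attribute values of each event into one sequence per type behavior; this is an illustrative sketch, not the patent's implementation.

```python
from collections import defaultdict
from typing import Dict, List

def extract_attribute_value_sequences(events) -> Dict[str, List[str]]:
    """Group recruitment attribute values into one sequence per type behavior (sketch)."""
    sequences: Dict[str, List[str]] = defaultdict(list)
    for event in events:
        # Each event contributes its recruitment attribute values to the sequence
        # of its type behavior, e.g. a click on a product manager position of an
        # Internet company contributes ["Internet", "product manager"] to "click".
        sequences[event.behavior_type].extend(event.recruitment_attributes.values())
    return dict(sequences)

# With the example events above:
# {"click": ["Internet", "product manager"],
#  "deliver": ["product manager", "10K-20K"],
#  "communicate": ["15K-20K"]}
```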
Step 106, acquiring a recruitment attribute value vector of the recruitment attribute value of each type of behavior; the recruitment attribute value vector is used for reflecting the association relationship between the recruitment attribute values.
As an example of the present invention, the recruitment attribute values of each type of behavior are respectively input into the trained recruitment attribute value prediction model, so as to obtain the recruitment attribute value vectors corresponding to each type of behavior; for example, the recruitment attribute value [ product manager ] of the type behavior [ deliver ] is input into the trained recruitment attribute value prediction model, so as to obtain the recruitment attribute value vector corresponding to the recruitment attribute value [ product manager ] of the type behavior [ deliver ].
Step 108, for the target type behavior, determining the portrait label of the target type behavior according to the recruitment attribute value vectors corresponding to the target type behavior.
The target type behavior refers to a type behavior for which a corresponding portrait label is to be calculated. In the embodiment of the present invention, the type behaviors for which portrait labels are to be calculated may be all of the type behaviors or only a part of them, which is not limited here.
In the embodiment of the invention, for the target type behavior, the portrait label of the target type behavior can be determined according to the recruitment attribute value vectors corresponding to the target type behavior. For example, according to the recruitment attribute value vectors corresponding to the recruitment attribute values of the target type behavior [ click ] of the target user, the portrait label corresponding to the target type behavior [ click ] of the target user is determined as the recruitment attribute value [ Internet ]; or, according to the recruitment attribute value vectors corresponding to the recruitment attribute values of the target type behavior [ deliver ] of the target user, the portrait label corresponding to the target type behavior [ deliver ] of the target user is determined as the recruitment attribute value [ product manager ]; or, according to the recruitment attribute value vectors corresponding to the recruitment attribute values of the target type behavior [ communication ] of the target user, the portrait label corresponding to the target type behavior [ communication ] of the target user is determined as the recruitment attribute value [ 15K-20K ].
Step 110, determining the portrait label of the target user according to the portrait label of the target type behavior so as to form the user portrait of the target user.
In the embodiment of the invention, after the portrait tags of the target type behaviors are obtained, the portrait tag which can most express the characteristics of the target user can be further screened out from the portrait tags of the target type behaviors to be used as the portrait tag of the target user, so that the user portrait of the target user is formed.
Illustratively, suppose that the target type behavior of the target user has portrait tags [ internet ], [ product manager ], [ 15K-20K ], etc., and the portrait tags of the target user may be internet and/or product manager.
It can be understood that the portrait label of a target type behavior is determined based on recruitment attribute value vectors of the target user that can reflect the association relationship between the recruitment attribute values, and the portrait label of the target user is then determined based on the portrait labels of the target type behaviors, so that information related to the target user can be fully mined and the recommendation effect when recommending based on the portrait label of the target user can be improved.
According to the user portrait generation method described above, user behavior information capable of reflecting the job hunting intention of a target user is acquired; recruitment attribute value sequences corresponding to the various types of behaviors are extracted from the user behavior information; recruitment attribute value vectors reflecting the association relationship between the recruitment attribute values are generated according to the recruitment attribute value sequences; the portrait labels of the target type behaviors are determined according to the recruitment attribute value vectors corresponding to the target type behaviors; and the portrait label of the target user is finally determined according to the portrait labels of the target type behaviors to form the user portrait of the target user. In the embodiment of the invention, when the portrait label corresponding to a single type of behavior is calculated, recruitment attribute value vectors reflecting the association relationship between the recruitment attribute values are generated, and the portrait label of that type of behavior is determined according to the recruitment attribute value vectors of the single type of behavior, so that the resulting portrait label can be associated with the user's other recruitment requirements.
On the basis of the above-described embodiment, a modified embodiment of the above-described embodiment is proposed, and it is to be noted herein that, in order to make the description brief, only the differences from the above-described embodiment are described in the modified embodiment.
In an exemplary embodiment, the step 108 of determining the portrait label of the target type behavior according to the recruitment attribute value vector corresponding to the target type behavior may include the following steps:
forming a recruitment attribute value vector sequence from the recruitment attribute value vectors of the target type behavior;
summing the recruitment attribute value vector sequence of the target type behavior to obtain the recruitment attribute direction vector of the target type behavior;
acquiring a weight value of the target type behavior; the weight value of the target type behavior has an association relationship with the other types of behaviors;
weighting the recruitment attribute direction vector of the target type behavior by adopting the weight value;
calculating a first cosine similarity between the recruitment attribute value vector of the target type behavior and the weighted recruitment attribute direction vector;
and taking the recruitment attribute values corresponding to the recruitment attribute value vectors whose first cosine similarities rank within a preset number of top positions as the portrait labels of the target type behavior.
Cosine similarity evaluates the similarity between two vectors by calculating the cosine value of the angle between them.
In the specific implementation, user behavior information generated by a target user when the target user uses the service platform is acquired, recruitment attribute values corresponding to various types of behaviors are extracted from the user behavior information, and then the recruitment attribute values of various types of behaviors can be input into a pre-trained recruitment attribute value prediction model, so that a recruitment attribute value vector capable of reflecting the association relationship between the recruitment attribute values is obtained.
Aiming at the target type behavior, a recruitment attribute value vector of a recruitment attribute value of the target type behavior can be obtained, the recruitment attribute value vector forms a recruitment attribute value vector sequence, and the recruitment attribute value vector sequence is summed to obtain a recruitment attribute direction vector of the target type behavior. Exemplarily, assuming that the target type behavior is delivery, and there are 3 recruitment attribute value vectors corresponding to the target type behavior delivery, the 3 recruitment attribute value vectors may be combined to obtain a recruitment attribute value vector sequence, and then the recruitment attribute value vector sequence is summed to obtain a recruitment attribute direction vector corresponding to the target type behavior delivery of the target user.
In the embodiment of the invention, the weight value of the target type behavior is obtained, where the weight value is used for representing the importance of the target type behavior in constructing the user portrait; then the recruitment attribute direction vector of the target type behavior is weighted with the weight value, the cosine similarity between each recruitment attribute value vector of the target type behavior and the weighted recruitment attribute direction vector is calculated, and the recruitment attribute values corresponding to the recruitment attribute value vectors whose first cosine similarities rank within a preset number of top positions are taken as the portrait labels of the target type behavior.
Illustratively, assume that the weight value corresponding to the target type behavior [ delivery ] is 0.3 and the preset number of top positions N is 2. The recruitment attribute direction vector of the target type behavior [ delivery ] is weighted with the weight value, and then the cosine similarity between each recruitment attribute value vector of the target type behavior [ delivery ] and the weighted direction vector is calculated; if the recruitment attribute values whose cosine similarities rank in the top 2 positions are [ product manager ] and [ 10K-20K ], then [ product manager ] and [ 10K-20K ] may be used as the portrait labels of the target type behavior [ delivery ].
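A minimal sketch of this calculation for one target type behavior, assuming the recruitment attribute value vectors have already been obtained from the prediction model; numpy and all function names below are illustrative, not part of the patent.

```python
import numpy as np
from typing import Dict, List, Tuple

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors."""
    denom = float(np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.dot(a, b)) / denom if denom else 0.0

def portrait_labels_for_type(
    value_vectors: Dict[str, np.ndarray],  # recruitment attribute value -> its vector (one type behavior)
    weight: float,                         # weight value of the target type behavior
    top_n: int = 2,                        # preset number of top positions
) -> List[Tuple[str, float]]:
    """Sum the vector sequence into a recruitment attribute direction vector,
    weight it, rank the attribute values by first cosine similarity, and keep
    the top-N as portrait labels of the target type behavior (sketch)."""
    direction = np.sum(list(value_vectors.values()), axis=0)  # recruitment attribute direction vector
    weighted_direction = weight * direction                   # weighted by the weight value
    scored = [(value, cosine_similarity(vec, weighted_direction))
              for value, vec in value_vectors.items()]
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:top_n]
```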
In the above exemplary embodiment, when the portrait labels of a single type of behavior are calculated, the cosine similarity between the recruitment attribute value vector of each recruitment attribute value and the recruitment attribute direction vector of that type of behavior, weighted by the weight value, is calculated, and the recruitment attribute values serving as the portrait labels of that type of behavior are determined based on the cosine similarities. Because the weight value of the single type of behavior has an association relationship with the other types of behaviors, the portrait label of a single type of behavior is fused more flexibly with the other types of behaviors and can be continuously adjusted along with the user behavior information.
In an exemplary embodiment, the obtaining the weight value of the target-type behavior may include:
forming a recruitment attribute value vector sequence by the recruitment attribute value vectors of other types of behaviors;
calculating second cosine similarities between the recruitment attribute direction vector of the target type behavior and the recruitment attribute value vector sequences of the other types of behaviors;
and taking the average value of the second cosine similarities as the weight value of the target type behavior.
In the embodiment of the invention, for each type of behavior of the target user, corresponding recruitment attribute value vectors can be respectively obtained, the recruitment attribute value vectors form a recruitment attribute value vector sequence, and the recruitment attribute value vector sequences of each type of behavior are summed to obtain the recruitment attribute direction vector of each type of behavior.
For the target-type behavior, cosine similarity between the recruitment attribute direction vector of the target-type behavior and the recruitment attribute value vector sequence of other types of behaviors is calculated, for example, assuming that the behavior type of the target user includes S1, S2, S3 and S4, if the target-type behavior is S1, cosine similarity between the recruitment attribute direction vector of S1 and the recruitment attribute value vector sequences of S2, S3 and S4 needs to be calculated, and then, the mean value of the cosine similarity is taken as the weight value of the target-type behavior.
As an alternative example, the mathematical expression of the weight values corresponding to the target type behavior is as follows:
$$\operatorname{proj}(A', D_A) = \frac{1}{\lvert V_{A'} \rvert} \sum_{V_i \in V_{A'}} \cos(D_A, V_i)$$

$$w(A) = \frac{1}{\lvert A_s \rvert - 1} \sum_{A' \in A_s,\; A' \neq A} \operatorname{proj}(A', D_A)$$

wherein proj(A', D_A) is the cosine-value projection of the recruitment attribute value vector sequence V_{A'} of the A'-type behavior onto the recruitment attribute direction vector D_A of the A-type behavior, cos(D_A, V_i) is the cosine similarity between D_A and the recruitment attribute value vector V_i, |V_{A'}| is the length of the sequence V_{A'}, A_s is the set of all types of behaviors of the target user, and w(A) is the weight value corresponding to the A-type behavior of the target user.
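A minimal sketch of this weight calculation, reusing the cosine_similarity helper from the earlier sketch; the variable names and the fallback value for a user with a single type of behavior are assumptions, not taken from the patent.

```python
import numpy as np
from typing import Dict, List

def behavior_weight(
    target_type: str,
    direction_vectors: Dict[str, np.ndarray],             # type behavior -> recruitment attribute direction vector
    value_vector_sequences: Dict[str, List[np.ndarray]],  # type behavior -> recruitment attribute value vector sequence
) -> float:
    """Weight value of the target type behavior: the mean of the second cosine
    similarities between its direction vector and the recruitment attribute
    value vector sequences of the other type behaviors (sketch)."""
    d_target = direction_vectors[target_type]
    projections = []
    for other_type, sequence in value_vector_sequences.items():
        if other_type == target_type or not sequence:
            continue
        # Cosine-value projection of the other type behavior's sequence onto d_target.
        sims = [cosine_similarity(d_target, v) for v in sequence]
        projections.append(sum(sims) / len(sims))
    # Fallback of 1.0 if there are no other type behaviors (assumption, not from the patent).
    return sum(projections) / len(projections) if projections else 1.0
```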
In the above exemplary embodiment, the mean value of the cosine similarity of the recruitment attribute direction vector of the target type behavior and the recruitment attribute value vector sequence of the other type behavior is used as the weight value of the target type behavior, so that the weight value reflects the association relationship between the target type behavior and the other type behavior, and further the portrait label of the target type behavior determined based on the weight value is fused with the portrait labels of the other type behaviors, and the weight value can be flexibly adjusted along with the user behavior information.
In an exemplary embodiment, the step 110 of determining the portrait label of the target user based on the portrait label of the target type behavior may include the steps of:
acquiring the first cosine similarity of the portrait label of each target type behavior; the first cosine similarity of the portrait label of each target type behavior is the cosine similarity between the recruitment attribute value vector of the corresponding recruitment attribute value and the weighted recruitment attribute direction vector;
and taking the portrait labels of the target type behaviors whose first cosine similarities rank within a preset number of top positions as the portrait labels of the target user.
In the embodiment of the invention, after the portrait labels of a plurality of target type behaviors are obtained, the cosine similarity corresponding to the portrait label of each target type behavior can be obtained, namely, the cosine similarity between the recruitment attribute value vector of the recruitment attribute value determined as that portrait label and the weighted recruitment attribute direction vector; the portrait labels of the target type behaviors whose cosine similarities rank within a preset number of top positions are then taken as the portrait labels of the target user.
Illustratively, assume that portrait label 1 with cosine similarity C1, portrait label 2 with cosine similarity C2, and portrait label 3 with cosine similarity C3 have been calculated for the target type behaviors [ deliver ], [ click ], and [ communication ] of the target user, respectively. If the preset number of top positions is 1 and the cosine similarity C2 of portrait label 2 is the largest, portrait label 2 may be taken as the portrait label of the target user.
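A small sketch of this final selection, assuming each target type behavior already has its portrait labels paired with their first cosine similarities (for example as returned by the portrait_labels_for_type sketch above); names are illustrative.

```python
from typing import Dict, List, Tuple

def user_portrait_labels(
    labels_per_type: Dict[str, List[Tuple[str, float]]],  # type behavior -> [(label, first cosine similarity), ...]
    top_n: int = 1,                                        # preset number of top positions
) -> List[str]:
    """Rank all type-behavior portrait labels by first cosine similarity and
    keep the top-N as the portrait labels of the target user (sketch)."""
    ranked = sorted(
        (pair for labels in labels_per_type.values() for pair in labels),
        key=lambda item: item[1],
        reverse=True,
    )
    return [label for label, _ in ranked[:top_n]]

# With labels for [deliver], [click] and [communication] whose similarities are
# C1, C2, C3, top_n = 1 and C2 the largest, portrait label 2 is kept.
```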
In the above exemplary embodiment, after the portrait label of the target type behavior of the target user is obtained, the final portrait label of the target user may be further determined from the portrait labels of the target type behavior based on the cosine similarity corresponding to the portrait label, so that the user portrait of the target user is more accurate.
In an exemplary embodiment, a point burying program is provided on the service platform, and the step 102 of obtaining user behavior information capable of reflecting the job hunting intention of the target user may include the following steps:
and acquiring user behavior information reported by the embedded point program in the service platform.
Specifically, a point burying program can be set for the service platform. Point burying means inserting probes into the service platform while preserving the original logic of the service platform; a probe is essentially a code segment for information collection, for example an assignment statement or a function call that collects coverage information. The operation behaviors of the target user on the service platform can then be collected through the point burying program in the service platform, so as to obtain user behavior information capable of reflecting the user's job hunting intention, such as [ click behavior information of the target user on a recruitment position ], [ resume delivery behavior information on the recruitment position ], and [ communication behavior information with the publisher of the recruitment position ]; recruitment attribute values, such as the recruitment industry, position type, and position salary, can then be extracted from the user behavior information, facilitating the subsequent generation of portrait labels for the target user.
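Purely as an illustration of the kind of payload such a probe might report (the patent does not specify any reporting format), a probe could serialize one user behavior event as follows; every field name is hypothetical.

```python
import json
import time

def report_behavior_event(user_id: str, behavior_type: str, attributes: dict) -> str:
    """Hypothetical probe body: serialize one user behavior event for reporting
    to the service platform's collection endpoint."""
    payload = {
        "user_id": user_id,
        "behavior_type": behavior_type,        # click / deliver / communicate
        "recruitment_attributes": attributes,  # e.g. recruitment industry, position type, position salary
        "timestamp": int(time.time()),
    }
    return json.dumps(payload, ensure_ascii=False)

# report_behavior_event("u1", "click",
#                       {"industry": "Internet", "position_type": "product manager"})
```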
In an exemplary embodiment, the method may further include the steps of:
acquiring basic portrait information of the target user; the basic portrait information comprises biological basic information reflecting the characteristics of the target user and job hunting intention information;
and constructing the user portrait of the target user by the base portrait information and/or the portrait label of the target user.
Wherein the biological basic information at least comprises the gender, age, educational background, work experience, and the like of the target user; the job hunting intention information may include at least a desired industry, job type, and job salary, among others. Alternatively, the biological basic information and the job hunting intention information may be acquired from the resume of the target user or from registration information in the service platform, which is not limited in the embodiment of the present invention.
In the embodiment of the invention, the basic portrait information of the target user, namely the biological basic information and the job hunting intention information, is acquired, then the biological basic information and the job hunting intention information and the portrait label of the target user form the user portrait of the target user, and then the recommendation of information such as recruitment information and the like can be carried out for the target user based on the user portrait.
Illustratively, assume that the portrait labels in the user portrait of the target user include [ product manager ], and the basic portrait information includes an educational background of [ bachelor's degree ] and a work experience of [ five years ]. If a recruitment position publisher publishes a piece of recruitment information recruiting a product manager with a bachelor's degree and more than three years of work experience, the recruitment information can be accurately recommended to the target user.
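A small sketch of combining the basic portrait information with the portrait labels into a user portrait, following the example above; the structure and values are illustrative only.

```python
from typing import Dict, List

def build_user_portrait(basic_info: Dict[str, str], portrait_labels: List[str]) -> Dict[str, object]:
    """Combine basic portrait information with the derived portrait labels (sketch)."""
    return {**basic_info, "portrait_labels": portrait_labels}

portrait = build_user_portrait(
    {"education": "bachelor", "work_experience_years": "5"},  # basic portrait information (illustrative)
    ["product manager"],                                      # portrait label from the target type behaviors
)
# A recruitment posting for a product manager requiring a bachelor's degree and at
# least three years of experience would match this portrait and could be recommended.
```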
In the above exemplary embodiment, the basic portrait information of the target user and the portrait label of the target user, which is determined based on the recruitment attribute value vectors capable of reflecting the association relationship between the recruitment attribute values and the weight values capable of reflecting the association relationship between the types of behaviors, together form the user portrait of the target user, so that the recommended content is more personalized when information recommendation is performed based on the user portrait.
In an exemplary embodiment, step 106 of obtaining a recruitment attribute value vector of the recruitment attribute value of each type of behavior may include the following step:
inputting the recruitment attribute values of each type of behavior into a pre-trained recruitment attribute value prediction model to obtain the recruitment attribute value vectors corresponding to the recruitment attribute values; the recruitment attribute value prediction model is obtained by training with a historical recruitment attribute value sequence of each type of behavior, and the historical recruitment attribute value sequence comprises a plurality of historical recruitment attribute values.
In a specific implementation, historical user behavior information capable of reflecting the users' job hunting intention on the service platform is recorded. For the recruitment attributes of each type of behavior, the corresponding recruitment attribute values are extracted from the historical user behavior information, and the extracted recruitment attribute values are sorted by time for each group of users to obtain multiple groups of recruitment attribute value sequences. The multiple groups of recruitment attribute value sequences are then used as a training data set to train the recruitment attribute value prediction model, and the trained recruitment attribute value prediction model can output recruitment attribute value vectors that reflect the association relationship between the recruitment attribute values.
In the embodiment of the invention, the recruitment attribute values of each type of behavior are respectively input into the trained recruitment attribute value prediction model, so that a recruitment attribute value vector corresponding to each type of behavior is obtained, for example, a recruitment attribute value sequence of the type of behavior [ deliver ] is input into the trained recruitment attribute value prediction model, and a recruitment attribute value vector corresponding to the recruitment attribute value of the type of behavior [ deliver ] can be obtained.
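The patent does not name a concrete model architecture. One plausible instantiation of a prediction model whose vectors reflect the association relationship between co-occurring recruitment attribute values is a word2vec-style skip-gram model trained on the historical recruitment attribute value sequences, sketched here with gensim as an assumption rather than the patent's stated implementation.

```python
from gensim.models import Word2Vec

# Historical recruitment attribute value sequences, one per user and type behavior,
# ordered by event occurrence time (illustrative data).
historical_sequences = [
    ["Internet", "product manager", "10K-20K"],
    ["Internet", "operations", "8K-15K"],
    ["finance", "product manager", "15K-20K"],
]

# Skip-gram embeddings: recruitment attribute values that co-occur in the same
# sequences end up with nearby vectors, which is one way to realize the
# "association relationship" the recruitment attribute value vectors reflect.
model = Word2Vec(sentences=historical_sequences, vector_size=64, window=5,
                 min_count=1, sg=1, epochs=50)

vector_for_product_manager = model.wv["product manager"]
```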
In order to make those skilled in the art better understand the technical solution of the embodiment of the present invention, a specific exemplary process for generating the portrait label of the target user is described below. Referring to fig. 2, the specific steps include:
User behavior information reflecting the job hunting intention of a target user is acquired, and behavior sequences (recruitment attribute value sequences) corresponding to three types of behaviors, namely the A1 behavior sequence, the A2 behavior sequence, and the A3 behavior sequence, are extracted from the user behavior information. Direction vectors (recruitment attribute direction vectors) corresponding to A1, A2, and A3, namely the D1 direction vector, the D2 direction vector, and the D3 direction vector, are obtained based on the recruitment attribute value sequences. The D1, D2, and D3 direction vectors are weighted with their corresponding weight values to obtain the correspondingly weighted direction vectors Duser. Then, the cosine similarity between the recruitment attribute value vector of each recruitment attribute value of each type of behavior and the corresponding weighted direction vector Duser is calculated, and the N recruitment attribute values with the largest cosine similarities are taken as the portrait labels of that type of behavior. Finally, the portrait labels of the types of behaviors are integrated, and the N portrait labels with the largest cosine similarities are taken as the portrait labels of the target user, where N is a positive integer.
In summary, in the embodiment of the present invention, when the portrait labels of the target type behaviors of the target user are calculated, the recruitment attribute value vectors are used to take into account the association relationship between the recruitment attribute values within a target type behavior, and the weight values are used to take into account the association relationship between the target type behavior and the other types of behaviors, so that the portrait labels of the target type behaviors are associated both with the recruitment attribute values within the type behavior and with the other types of behaviors. The portrait labels of the target type behaviors are then integrated to obtain the portrait labels of the target user, which are more accurate and allow better information recommendation.
It should be noted that for simplicity of description, the method embodiments are shown as a series of combinations of acts, but those skilled in the art will recognize that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 3, a block diagram of a user portrait apparatus provided in an embodiment of the present invention is shown, where the apparatus may specifically include the following modules:
a user behavior information obtaining module 302, configured to obtain user behavior information that can reflect job hunting intention of a target user;
a recruitment attribute value extraction module 304, configured to extract a recruitment attribute value sequence corresponding to each type of behavior from the user behavior information; the recruitment attribute value sequence comprises a plurality of recruitment attribute values;
a recruitment attribute value vector generation module 306 for obtaining a recruitment attribute value vector for the recruitment attribute value of each of the types of behaviors; the recruitment attribute value vector is used for reflecting the association relationship between the recruitment attribute values;
a type behavior tag determination module 308, configured to determine, for a target type behavior, a portrait tag of the target type behavior according to the recruitment attribute value vector corresponding to the target type behavior;
and a user tag determination module 310, configured to determine the portrait tag of the target user according to the portrait tag of the target type behavior.
In an exemplary embodiment, the type behavior tag determination module 308 is configured to compose the recruitment attribute value vectors of the target type behavior into a recruitment attribute value vector sequence; sum the recruitment attribute value vector sequence of the target type behavior to obtain the recruitment attribute direction vector of the target type behavior; acquire the weight value of the target type behavior, the weight value of the target type behavior having an association relationship with the other types of behaviors; weight the recruitment attribute direction vector of the target type behavior with the weight value; calculate the first cosine similarity between each recruitment attribute value vector of the target type behavior and the weighted recruitment attribute direction vector; and take the recruitment attribute values corresponding to the recruitment attribute value vectors whose first cosine similarities rank within a preset number of top positions as the portrait labels of the target type behavior.
In an exemplary embodiment, the type behavior tag determination module 308 is configured to compose the recruitment attribute value vectors of the other types of behaviors into recruitment attribute value vector sequences; calculate second cosine similarities between the recruitment attribute direction vector of the target type behavior and the recruitment attribute value vector sequences of the other types of behaviors; and take the average value of the second cosine similarities as the weight value of the target type behavior.
In an exemplary embodiment, the user tag determining module 310 is configured to acquire the first cosine similarity of the portrait label of each target type behavior, the first cosine similarity of the portrait label of each target type behavior being the cosine similarity between the recruitment attribute value vector of the corresponding recruitment attribute value and the weighted recruitment attribute direction vector; and take the portrait labels of the target type behaviors whose first cosine similarities rank within a preset number of top positions as the portrait labels of the target user.
In an exemplary embodiment, a buried point program is disposed on a service platform, and the user behavior information obtaining module 302 is configured to obtain user behavior information reported by the buried point program in the service platform.
In an exemplary embodiment, the apparatus further comprises: a user portrait construction module, configured to acquire the basic portrait information of the target user, the basic portrait information comprising biological basic information reflecting the characteristics of the target user and job hunting intention information, and to construct the user portrait of the target user by using the basic portrait information and/or the portrait label of the target user.
In an exemplary embodiment, the biological basic information includes at least gender, age, educational background, and work experience; the job hunting intention information at least comprises a desired industry, a job type, and a job salary; the type behavior at least comprises delivery, communication, and clicking.
In an exemplary embodiment, the recruitment attribute value vector generating module 306 is configured to input the recruitment attribute values of each type of behavior into a pre-trained recruitment attribute value prediction model to obtain the recruitment attribute value vectors corresponding to the recruitment attribute values; the recruitment attribute value prediction model is obtained by training with a historical recruitment attribute value sequence of each type of behavior, and the historical recruitment attribute value sequence comprises a plurality of historical recruitment attribute values.
In summary, in the embodiment of the present invention, user behavior information capable of reflecting the job hunting intention of a target user is acquired; a recruitment attribute value sequence corresponding to each type of behavior is extracted from the user behavior information; recruitment attribute value vectors reflecting the association relationship between the recruitment attribute values are generated according to the recruitment attribute value sequence; then, for a target type behavior, the portrait label of the target type behavior is determined according to the recruitment attribute value vectors corresponding to the target type behavior; and finally, the portrait label of the target user is determined according to the portrait labels of the target type behaviors, so as to form the user portrait of the target user. In the embodiment of the invention, when the portrait label corresponding to a single type of behavior is calculated, recruitment attribute value vectors reflecting the association relationship between the recruitment attribute values are generated, and the portrait label of that type of behavior is determined according to the recruitment attribute value vectors of the single type of behavior, so that the resulting portrait label can be associated with the user's other recruitment requirements.
Because the device embodiment is substantially similar to the method embodiment, its description is kept brief; for relevant details, refer to the corresponding parts of the description of the method embodiment.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor, a memory, and a computer program stored in the memory and runnable on the processor. When the computer program is executed by the processor, it implements the processes of the user portrait generation method embodiment and achieves the same technical effects, which are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, it implements the processes of the user portrait generation method embodiment and achieves the same technical effects, which are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
An embodiment of the present invention further provides a computer program product stored in a storage medium. When the program product is executed by at least one processor, it implements the processes of the foregoing user portrait generation method embodiment and achieves the same technical effects, which are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to particular illustrative embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but is intended to cover the various modifications and equivalent arrangements that may be made by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (11)

1. A user representation generation method, comprising:
acquiring user behavior information capable of reflecting job hunting intention of a target user;
extracting a recruitment attribute value sequence corresponding to each type of behavior from the user behavior information; the recruitment attribute value sequence comprises a plurality of recruitment attribute values;
acquiring a recruitment attribute value vector of the recruitment attribute value of each type of behavior; the recruitment attribute value vector is used for reflecting the association relationship between the recruitment attribute values;
aiming at a target type behavior, determining a portrait label of the target type behavior according to the recruitment attribute value vector corresponding to the target type behavior;
and determining the portrait label of the target user according to the portrait label of the target type behavior so as to form the user portrait of the target user.
2. The method of claim 1, wherein said determining a portrait label for the target type of behavior based on the recruitment attribute value vector corresponding to the target type of behavior comprises:
forming a recruitment attribute value vector sequence from the recruitment attribute value vectors of the target type behavior;
summing the recruitment attribute value vectors in the recruitment attribute value vector sequence of the target type behavior to obtain a recruitment attribute direction vector of the target type behavior;
acquiring a weight value of the target type behavior; the weight value of the target type behavior has an association relationship with the other types of behaviors;
weighting the recruitment attribute direction vector of the target type behavior by adopting the weight value;
calculating a first cosine similarity between the recruitment attribute value vector of the target type behavior and the weighted recruitment attribute direction vector;
and taking the recruitment attribute value corresponding to the recruitment attribute value vector whose first cosine similarity ranks within a preset number of top positions as the portrait label of the target type behavior.
3. The method of claim 2, wherein obtaining the weight value of the target type behavior comprises:
forming recruitment attribute value vector sequences from the recruitment attribute value vectors of the other types of behaviors;
calculating second cosine similarities between the recruitment attribute direction vector of the target type behavior and the recruitment attribute value vector sequences of the other types of behaviors;
and taking the average of the second cosine similarities as the weight value of the target type behavior.
4. The method of claim 2, wherein determining the target user's portrait label based on the target type behavior portrait label comprises:
acquiring the first cosine similarity of the portrait label of each target type behavior; the first cosine similarity is the cosine similarity between the recruitment attribute value vector of the corresponding recruitment attribute value and the weighted recruitment attribute direction vector of the portrait label of each target type behavior;
and taking the portrait label of the target type behavior whose first cosine similarity ranks within a preset number of top positions as the portrait label of the target user.
5. The method according to claim 1, wherein an event-tracking (buried point) program is provided on the service platform, and the obtaining of the user behavior information capable of reflecting the job hunting intention of the target user comprises:
and acquiring user behavior information reported by the embedded point program in the service platform.
6. The method of claim 1, further comprising:
acquiring basic portrait information of the target user; the basic portrait information comprises biological basic information reflecting the characteristics of the target user and job hunting intention information;
and constructing the user portrait of the target user from the basic portrait information and/or the portrait label of the target user.
7. The method of claim 6,
the biological basic information comprises at least gender, age, educational background, and work experience;
the job hunting intention information comprises at least a desired industry, a desired position type, and an expected salary;
the types of behavior comprise at least delivery, communication, and clicking.
8. The method of claim 1, wherein said obtaining a recruitment attribute value vector for the recruitment attribute value for each of the types of behavior comprises:
inputting the recruitment attribute values of each type of behavior into a pre-trained recruitment attribute value prediction model to obtain the recruitment attribute value vectors corresponding to the recruitment attribute values; wherein the recruitment attribute value prediction model is trained on a historical recruitment attribute value sequence of each type of behavior, and the historical recruitment attribute value sequence comprises a plurality of historical recruitment attribute values.
9. A user representation generation apparatus, comprising:
the user behavior information acquisition module is used for acquiring user behavior information capable of reflecting the job hunting intention of the target user;
the recruitment attribute value extraction module is used for extracting a recruitment attribute value sequence corresponding to each type of behavior from the user behavior information; the recruitment attribute value sequence comprises a plurality of recruitment attribute values;
a recruitment attribute value vector generation module for acquiring a recruitment attribute value vector of the recruitment attribute value of each of the types of behaviors; the recruitment attribute value vector is used for reflecting the association relationship between the recruitment attribute values;
the type behavior tag determining module is used for determining, for a target type behavior, a portrait label of the target type behavior according to the recruitment attribute value vector corresponding to the target type behavior;
and the user tag determining module is used for determining the portrait label of the target user according to the portrait label of the target type behavior.
10. An electronic device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other via the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the method of any one of claims 1 to 8 when executing the program stored in the memory.
11. One or more computer-readable media having instructions stored thereon that, when executed by one or more processors, cause the processors to perform the method recited in any of claims 1-8.
CN202111676606.3A 2021-12-31 2021-12-31 User portrait generation method and device, electronic equipment and readable medium Pending CN114491205A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111676606.3A CN114491205A (en) 2021-12-31 2021-12-31 User portrait generation method and device, electronic equipment and readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111676606.3A CN114491205A (en) 2021-12-31 2021-12-31 User portrait generation method and device, electronic equipment and readable medium

Publications (1)

Publication Number Publication Date
CN114491205A true CN114491205A (en) 2022-05-13

Family

ID=81510702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111676606.3A Pending CN114491205A (en) 2021-12-31 2021-12-31 User portrait generation method and device, electronic equipment and readable medium

Country Status (1)

Country Link
CN (1) CN114491205A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114819924A (en) * 2022-06-28 2022-07-29 杭银消费金融股份有限公司 Enterprise information push processing method and device based on portrait analysis
CN115760200A (en) * 2023-01-06 2023-03-07 万链指数(青岛)信息科技有限公司 User portrait construction method based on financial transaction data
CN116401464A (en) * 2023-06-02 2023-07-07 深圳市一览网络股份有限公司 Professional user portrait construction method, device, equipment and storage medium
CN116401464B (en) * 2023-06-02 2023-08-04 深圳市一览网络股份有限公司 Professional user portrait construction method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN114491205A (en) User portrait generation method and device, electronic equipment and readable medium
CN109920174B (en) Book borrowing method and device, electronic equipment and storage medium
CN109614238B (en) Target object identification method, device and system and readable storage medium
CN108108821A (en) Model training method and device
CN109816420A (en) Customer data processing method, device, computer equipment and storage medium
US11080376B2 (en) Anonymous cross-device, cross-channel, and cross-venue user identification using adaptive deep learning
CN112613917A (en) Information pushing method, device and equipment based on user portrait and storage medium
CN108921587B (en) Data processing method and device and server
CN112651841A (en) Online business handling method and device, server and computer readable storage medium
CN110727860A (en) User portrait method, device, equipment and medium based on internet beauty platform
CN117114514B (en) Talent information analysis management method, system and device based on big data
CN111090807A (en) Knowledge graph-based user identification method and device
CN113393306A (en) Product recommendation method and device, electronic equipment and computer readable medium
CN108205575B (en) Data processing method and device
CN113112282A (en) Method, device, equipment and medium for processing consult problem based on client portrait
CN117349899B (en) Sensitive data processing method, system and storage medium based on forgetting model
CN113793174A (en) Data association method and device, computer equipment and storage medium
CN113656699A (en) User feature vector determination method, related device and medium
CN112287111A (en) Text processing method and related device
CN111753203A (en) Card number recommendation method, device, equipment and medium
JP2006018340A (en) Customer information integration system and method for preparing integrated customer information database
CN110705733A (en) Number obtaining method and device, electronic equipment and computer readable storage medium
CN113468886B (en) Work order processing method and device and computer equipment
US20220284319A1 (en) Intelligent guidance using machine learning for user navigation of multiple web pages
CN113505369B (en) Method and device for training user risk recognition model based on space-time perception

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20220608
Address after: Room 103, No. 19, Zhihui Road, Huishan Economic Development Zone, Wuxi, Jiangsu 214174
Applicant after: Wuxi May 8th Ganji Technology Service Co.,Ltd.
Address before: Room 301, 3/F, College Park, Dongsheng Science and Technology Park, Zhongguancun, No. A, Xueqing Road, Haidian District, Beijing 100083
Applicant before: BEIJING 58 INFORMATION TECHNOLOGY Co.,Ltd.
TA01 Transfer of patent application right
Effective date of registration: 20221020
Address after: Room 117, 1st Floor, Room 101, Building 1-7, Yard A 10, Jiuxianqiao North Road, Chaoyang District, Beijing 100015
Applicant after: Beijing Wuba Ganji Information Technology Co.,Ltd.
Address before: Room 103, No. 19, Zhihui Road, Huishan Economic Development Zone, Wuxi, Jiangsu 214174
Applicant before: Wuxi May 8th Ganji Technology Service Co.,Ltd.