CN115098793A - User portrait analysis method and system based on big data


Info

Publication number
CN115098793A
CN115098793A
Authority
CN
China
Prior art keywords
user
historical
information
portrait
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210321101.3A
Other languages
Chinese (zh)
Other versions
CN115098793B (en)
Inventor
陈应书
郭从仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Weimai Technology Co ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202210321101.3A priority Critical patent/CN115098793B/en
Publication of CN115098793A publication Critical patent/CN115098793A/en
Application granted granted Critical
Publication of CN115098793B publication Critical patent/CN115098793B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9536 Search customisation based on social or collaborative filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/906 Clustering; Classification
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention provides a user portrait analysis method and system based on big data, relating to the technical field of big data. In the invention, user portrait information of the device user corresponding to each user terminal device is obtained; for each device user of a plurality of device users corresponding to a plurality of user terminal devices, a user portrait matching degree for that device user is obtained based on the matching degree between the user portrait information corresponding to the device user and predetermined target user portrait information; and target device users are determined from the plurality of device users based on the user portrait matching degree corresponding to each device user, each determined target device user being used to construct and form a target user group. Based on the method, the problem of the poor reliability of user groups constructed with prior-art approaches can be solved.

Description

Big data-based user portrait analysis method and system
Technical Field
The invention relates to the technical field of big data, in particular to a user portrait analysis method and system based on big data.
Background
With the continuous development of internet and computer technology, the volume of users' network behavior data keeps growing, so classifying users based on their network behavior data has become an important application. For example, users can be grouped by the similarity between their network behavior data to form different user groups. However, this divides groups purely from the perspective of inter-user similarity, so the resulting user groups may be unreliable at the application level, for example by failing to match application requirements.
Disclosure of Invention
In view of the above, the present invention provides a method and a system for analyzing a user portrait based on big data, so as to solve the problem of poor reliability of a user group constructed and formed based on the prior art.
In order to achieve the above purpose, the embodiment of the invention adopts the following technical scheme:
a big data-based user portrait analysis method is applied to a user data analysis server, and comprises the following steps:
the method comprises the step of obtaining, for each user terminal device of a plurality of communicatively connected user terminal devices, user portrait information of the device user corresponding to that user terminal device, wherein the user portrait information is constructed based on user characteristic information obtained by data acquisition from the corresponding device user;
for each device user in a plurality of device users corresponding to the plurality of user terminal devices, obtaining a user portrait matching degree corresponding to the device user based on a matching degree between user portrait information corresponding to the device user and predetermined target user portrait information, wherein the target user portrait information is constructed and formed based on user characteristic information of a target user group to be constructed and formed;
and determining target device users in the plurality of device users based on the user portrait matching degree corresponding to each device user in the plurality of device users, wherein each determined target device user is used for constructing and forming the target user group.
In some preferred embodiments, in the method for analyzing a user portrait based on big data, the step of obtaining, for each user terminal device in the plurality of user terminal devices, user portrait information of a device user corresponding to the user terminal device includes:
judging whether a user portrait analysis instruction is acquired or not, and generating user portrait analysis notification information after the user portrait analysis instruction is acquired;
sending the user portrait analysis notification information to each of a plurality of user terminal devices in communication connection, wherein each of the user terminal devices is configured to display the user portrait analysis notification information to a device user corresponding to the user terminal device after receiving the user portrait analysis notification information, generate corresponding user portrait analysis confirmation information in response to an operation that the device user agrees to perform user portrait analysis based on the user portrait analysis notification information, and send the user portrait analysis confirmation information to the user data analysis server;
after the user portrait analysis confirmation information sent by each of the plurality of user terminal devices is obtained, user portrait information of a device user corresponding to each of the plurality of user terminal devices is obtained respectively.
In some preferred embodiments, in the method for analyzing a user profile based on big data, the step of obtaining, for each of a plurality of device users corresponding to the plurality of user terminal devices, a matching degree of the user profile corresponding to the device user based on a matching degree between the user profile information corresponding to the device user and predetermined target user profile information includes:
determining a fusion coefficient corresponding to each piece of user characteristic information included in the predetermined target user portrait information;
for each device user of the plurality of device users corresponding to the plurality of user terminal devices, respectively calculating the matching degree between each piece of user characteristic information included in the user portrait information corresponding to that device user and the corresponding piece of user characteristic information included in the target user portrait information, to obtain the characteristic matching degree corresponding to each piece of user characteristic information included in the user portrait information;
and for each device user of the plurality of device users, fusing, based on the fusion coefficient corresponding to each piece of user characteristic information, the characteristic matching degrees corresponding to the pieces of user characteristic information included in that device user's user portrait information, to obtain the user portrait matching degree corresponding to that device user.
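As a concrete illustration of the per-feature matching degree described above, the sketch below makes a minimal assumption: exact match for categorical features and normalized closeness for numeric ones. The function name, the value-range convention, and the choice of metric are illustrative assumptions, not taken from the patent.

```python
def feature_matching_degree(user_value, target_value, value_range=None):
    """Matching degree in [0, 1] between one piece of a device user's
    characteristic information and the corresponding target piece.

    value_range: (lo, hi) span for numeric features (assumed convention);
    None means the feature is categorical and compared by equality.
    """
    if value_range is not None:
        lo, hi = value_range
        # Normalized closeness: identical values score 1.0, values a
        # full range apart score 0.0.
        return max(0.0, 1.0 - abs(user_value - target_value) / (hi - lo))
    return 1.0 if user_value == target_value else 0.0
```

For example, comparing a user's age of 30 against a target age of 40 over an assumed 0-100 range yields a matching degree of 0.9, while a categorical mismatch yields 0.0.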
In some preferred embodiments, in the method for analyzing a user portrait based on big data, the step of determining, for each piece of user feature information included in the predetermined target user portrait information, a fusion coefficient corresponding to the piece of user feature information includes:
obtaining each historical target user group formed by historical construction to obtain at least one historical target user group, wherein each historical target user group in the at least one historical target user group comprises at least one historical device user;
aiming at each historical target user group in the at least one historical target user group, determining historical user portrait information corresponding to the historical target user group, and performing duplication elimination screening on historical user characteristic information included in the historical user portrait information corresponding to each historical target user group in the at least one historical target user group to obtain a corresponding historical characteristic information set;
for each piece of historical user characteristic information in the historical characteristic information set, determining a historical device user corresponding to the historical user characteristic information in the at least one historical target user group as a first historical device user corresponding to the historical user characteristic information, and determining a fusion coefficient corresponding to the historical user characteristic information based on the first historical device user;
and determining historical user feature information to which the user feature information belongs aiming at each piece of user feature information included in the predetermined target user portrait information, and determining a fusion coefficient corresponding to the historical user feature information as a fusion coefficient corresponding to the user feature information.
In some preferred embodiments, in the big data-based user representation analysis method, the step of determining, for each piece of historical user feature information in the set of historical feature information, a corresponding historical device user of the historical user feature information in the at least one historical target user group as a first historical device user corresponding to the historical user feature information, and determining a fusion coefficient corresponding to the historical user feature information based on the first historical device user includes:
for each piece of historical user characteristic information in the historical characteristic information set, determining a historical device user corresponding to the historical user characteristic information in the at least one historical target user group as a first historical device user corresponding to the historical user characteristic information, counting the number of the first historical device users corresponding to the historical user characteristic information to obtain a first user statistical number corresponding to the historical user characteristic information, and determining a first coefficient corresponding to the historical user characteristic information based on the first user statistical number corresponding to the historical user characteristic information, wherein the first coefficient and the first user statistical number have a positive correlation;
for each historical target user group in the at least one historical target user group, respectively determining the information attention of each historical device user in the historical target user group to the historical recommendation information corresponding to the historical target user group, and determining a group contribution coefficient of each historical device user in the historical target user group based on the information attention corresponding to each historical device user, wherein the group contribution coefficient and the information attention have a positive correlation;
for each group contribution coefficient, determining historical formation time of a historical target user group corresponding to the group contribution coefficient, constructing and forming a corresponding two-dimensional coordinate based on the group contribution coefficient and the historical formation time, and determining a coordinate vector corresponding to the two-dimensional coordinate;
for each piece of historical user feature information in the historical feature information set, sequentially connecting the coordinate vectors corresponding to the first historical device users corresponding to that historical user feature information to obtain one connection path, and executing this connecting step multiple times, with a different connection order each time, to obtain multiple distinct connection paths corresponding to that historical user feature information;
for each piece of historical user feature information in the historical feature information set, respectively calculating the vector distance between every two adjacent coordinate vectors in each connection path, summing these vector distances to obtain a vector distance sum for each connection path, determining the connection path with the minimum vector distance sum as the target connection path corresponding to that historical user feature information, and fusing the group contribution coefficients corresponding to the target connection path to obtain a contribution coefficient fusion value corresponding to that historical user feature information;
and for each piece of historical user characteristic information in the historical characteristic information set, determining the fusion coefficient corresponding to that historical user characteristic information according to the contribution coefficient fusion value and the first coefficient corresponding to that historical user characteristic information.
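The minimum-distance connection path and the resulting fusion coefficient can be sketched as follows. This is a hedged interpretation: the patent does not fix how the group contribution coefficients are fused along the target path or how the fusion value combines with the first coefficient, so position-decaying weights and a simple product are assumed here, and a brute-force search over orderings (viable only for small user counts) stands in for whatever path-selection procedure an implementation would use.

```python
from itertools import permutations
from math import hypot

def fusion_coefficient(points, first_coeff):
    """points: one (group_contribution, formation_time) coordinate per
    first historical device user of a feature; first_coeff: the
    coefficient positively correlated with the count of those users.

    Finds the ordering of points whose summed adjacent vector distances
    are minimal (the 'target connection path'), fuses the contribution
    coefficients along it with assumed position-decaying weights, and
    combines the result with first_coeff by an assumed product.
    """
    def path_length(path):
        return sum(hypot(a[0] - b[0], a[1] - b[1])
                   for a, b in zip(path, path[1:]))

    best = min(permutations(points), key=path_length)
    weights = [1.0 / (i + 1) for i in range(len(best))]  # assumption
    contribution_fusion = (sum(w * p[0] for w, p in zip(weights, best))
                           / sum(weights))
    return first_coeff * contribution_fusion
```

An exhaustive permutation search grows factorially; a production system would replace it with a heuristic shortest-path ordering, but the exhaustive form keeps the sketch unambiguous.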
In some preferred embodiments, in the method for analyzing a user portrait based on big data, the step of determining a target device user among the plurality of device users based on the user portrait matching degree corresponding to each of the plurality of device users includes:
for each device user in the plurality of device users, determining a relative size relationship between a user portrait matching degree corresponding to the device user and a preset portrait matching degree threshold;
and aiming at each of the plurality of equipment users, if the user portrait matching degree corresponding to the equipment user is greater than or equal to the portrait matching degree threshold value, determining the equipment user as a target equipment user, and if the user portrait matching degree corresponding to the equipment user is smaller than the portrait matching degree threshold value, determining the equipment user as a non-target equipment user.
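A minimal sketch of this threshold rule, assuming the matching degrees are held in a dict keyed by a user identifier (the names here are illustrative, not from the patent):

```python
def select_by_threshold(portrait_matches, threshold):
    """Split device users into target and non-target sets by comparing
    each user portrait matching degree with the preset threshold.
    Users at or above the threshold are target device users."""
    targets = {uid for uid, m in portrait_matches.items() if m >= threshold}
    non_targets = set(portrait_matches) - targets
    return targets, non_targets
```

For example, with matching degrees {"u1": 0.92, "u2": 0.41} and a threshold of 0.5, only "u1" becomes a target device user.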
In some preferred embodiments, in the method for analyzing a user portrait based on big data, the step of determining a target device user among the plurality of device users based on the user portrait matching degree corresponding to each of the plurality of device users includes:
sorting the device users based on the user portrait matching degree corresponding to each device user of the plurality of device users to obtain a user sorting sequence corresponding to the plurality of device users, wherein the device users are sorted by user portrait matching degree in either descending or ascending order;
and acquiring group quantity range information configured in advance for the target user group, and selecting, based on the group quantity range information, the corresponding number of device users with the highest user portrait matching degrees in the user sorting sequence as the target device users.
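This alternative selection strategy (rank the users and keep the best-matching ones up to the configured group size) could look like the following sketch; representing the group quantity range information as a single maximum size is a simplifying assumption.

```python
def select_top_users(portrait_matches, max_group_size):
    """Sort device users by user portrait matching degree (descending)
    and keep at most max_group_size of the best-matching ones."""
    ranked = sorted(portrait_matches.items(), key=lambda kv: kv[1],
                    reverse=True)
    return [uid for uid, _ in ranked[:max_group_size]]
```

Unlike the threshold rule, this variant always yields a group of the configured size (when enough users exist), at the cost of possibly admitting users with low absolute matching degrees.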
The embodiment of the invention also provides a user portrait analysis system based on big data, which is applied to a user data analysis server, and the user portrait analysis system based on big data comprises:
the user portrait acquisition module is used for acquiring user portrait information of an equipment user corresponding to each user terminal equipment in a plurality of user terminal equipments in communication connection, wherein the user portrait information is constructed and formed on the basis of user characteristic information acquired by data acquisition of the corresponding equipment user;
the portrait matching degree determining module is used for obtaining, for each device user of a plurality of device users corresponding to the plurality of user terminal devices, the user portrait matching degree corresponding to that device user based on the matching degree between the user portrait information corresponding to the device user and predetermined target user portrait information, wherein the target user portrait information is constructed based on user characteristic information of a target user group to be constructed and formed;
and the target user determining module is used for determining a target device user from the plurality of device users based on the user portrait matching degree corresponding to each device user, wherein each determined target device user is used for constructing and forming the target user group.
In some preferred embodiments, in the big data based user representation analysis system described above, the user representation acquisition module is specifically configured to:
judging whether a user portrait analysis instruction is acquired or not, and generating user portrait analysis notification information after the user portrait analysis instruction is acquired;
sending the user portrait analysis notification information to each of a plurality of user terminal devices in communication connection, wherein each of the user terminal devices is configured to, after receiving the user portrait analysis notification information, display the user portrait analysis notification information to a device user corresponding to the user terminal device, generate corresponding user portrait analysis confirmation information in response to an operation that the device user agrees to perform user portrait analysis based on the user portrait analysis notification information, and send the user portrait analysis confirmation information to the user data analysis server;
after the user portrait analysis confirmation information sent by each of the plurality of user terminal devices is obtained, user portrait information of a device user corresponding to each of the plurality of user terminal devices is obtained respectively.
In some preferred embodiments, in the big data based user representation analysis system, the representation matching degree determination module is specifically configured to:
determining, for each piece of user characteristic information included in the predetermined target user portrait information, the fusion coefficient corresponding to that user characteristic information;
for each device user of the plurality of device users corresponding to the plurality of user terminal devices, respectively calculating the matching degree between each piece of user characteristic information included in the user portrait information corresponding to that device user and the corresponding piece of user characteristic information included in the target user portrait information, to obtain the characteristic matching degree corresponding to each piece of user characteristic information included in the user portrait information;
and for each device user of the plurality of device users, fusing, based on the fusion coefficient corresponding to each piece of user characteristic information, the characteristic matching degrees corresponding to the pieces of user characteristic information included in that device user's user portrait information, to obtain the user portrait matching degree corresponding to that device user.
The method and system for user portrait analysis based on big data according to the embodiments of the present invention may first obtain the user portrait information of the device user corresponding to each user terminal device, and then, for each device user of the plurality of device users corresponding to the plurality of user terminal devices, obtain the user portrait matching degree corresponding to that device user based on the matching degree between the user portrait information corresponding to the device user and predetermined target user portrait information. The target device users may then be determined among the plurality of device users based on these matching degrees. In this way, each determined target device user is ensured to have a high matching degree with the target user portrait information (i.e. the characteristics of the required users) representing the application requirements, the reliability of the target user group constructed from the determined target device users is guaranteed, and the problem of the poor reliability of user groups constructed with prior-art approaches is solved.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1 is an application block diagram of a user data analysis server according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating steps included in a big data-based user representation analysis method according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating modules included in a big data based user representation analysis system according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, an embodiment of the present invention provides a user data analysis server. Wherein the user data analysis server may include a memory and a processor.
In detail, the memory and the processor are electrically connected directly or indirectly to realize data transmission or interaction. For example, they may be electrically connected to each other via one or more communication buses or signal lines. The memory can have stored therein at least one software function (computer program) which can be present in the form of software or firmware. The processor may be configured to execute the executable computer program stored in the memory to implement a big data based user representation analysis method provided by embodiments of the present invention (described below).
For example, in one possible embodiment, the Memory may be, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
For example, in one possible implementation, the Processor may be a general-purpose Processor including a Central Processing Unit (CPU), a Network Processor (NP), a System on Chip (SoC), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components.
Moreover, the structure shown in fig. 1 is only an illustration, and the user data analysis server may further include more or fewer components than those shown in fig. 1, or have a different configuration from that shown in fig. 1, for example, may include a communication unit for performing information interaction with other devices (such as user terminal devices, etc., where the user terminal devices may include, but are not limited to, mobile phones, computers, etc.).
With reference to fig. 2, an embodiment of the present invention further provides a user portrait analysis method based on big data, which is applicable to the user data analysis server. The method steps defined by the flow related to the big data-based user portrait analysis method can be realized by the user data analysis server. The specific process shown in FIG. 2 will be described in detail below.
Step S110 is to acquire user portrait information of an apparatus user corresponding to each of a plurality of user terminal apparatuses in communication connection.
In the embodiment of the present invention, the user data analysis server may obtain, for each user terminal device in a plurality of user terminal devices that are in communication connection, user portrait information of a device user corresponding to the user terminal device. The user portrait information is constructed and formed based on user characteristic information (such as gender, age, income and the like) obtained by data acquisition of corresponding equipment users.
Step S120 is performed to obtain, for each of a plurality of device users corresponding to the plurality of user terminal devices, a user portrait matching degree corresponding to the device user based on a matching degree between user portrait information corresponding to the device user and predetermined target user portrait information.
In this embodiment of the present invention, the user data analysis server may obtain, for each of a plurality of device users corresponding to the plurality of user terminal devices, a user portrait matching degree corresponding to the device user based on a matching degree between user portrait information corresponding to the device user and predetermined target user portrait information. And constructing and forming the target user portrait information based on the user characteristic information of a target user group to be constructed and formed.
Step S130, determining a target device user among the plurality of device users based on the user portrait matching degree corresponding to each device user among the plurality of device users.
In this embodiment of the present invention, the user data analysis server may determine a target device user among the multiple device users based on a user portrait matching degree corresponding to each of the multiple device users. Each determined target device user is used for constructing and forming the target user group (in this way, information to be recommended corresponding to the target user portrait information can be pushed to each target device user in the target user group).
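Steps S110 to S130 can be tied together in a minimal end-to-end sketch. Everything here (the data shapes, equal feature weights, equality-based feature matching, and the threshold rule) is an illustrative assumption layered on the method's three steps, not the patent's implementation.

```python
def build_target_group(user_portraits, target_portrait, threshold):
    """user_portraits: dict user_id -> dict feature_name -> value (step S110).
    target_portrait:   dict feature_name -> value.

    Computes each user portrait matching degree as the mean of simple
    per-feature equality matches (step S120, equal weights assumed) and
    keeps device users at or above the threshold (step S130)."""
    matching = {
        uid: sum(1.0 for f, v in target_portrait.items()
                 if portrait.get(f) == v) / len(target_portrait)
        for uid, portrait in user_portraits.items()
    }
    return {uid for uid, m in matching.items() if m >= threshold}
```

The resulting set is the target user group to which the information to be recommended would then be pushed.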
The user portrait analysis method based on big data can first acquire the user portrait information of the device user corresponding to each user terminal device. Then, for each device user of the plurality of device users corresponding to the plurality of user terminal devices, the user portrait matching degree corresponding to that device user can be obtained based on the matching degree between the user portrait information corresponding to the device user and the predetermined target user portrait information, so that the target device users can be determined among the plurality of device users based on these matching degrees. In this way, each determined target device user is ensured to have a high matching degree with the target user portrait information (namely, the characteristics of the required users) representing the application requirements, the reliability of the target user group constructed from the determined target device users is guaranteed, and the problem of the poor reliability of user groups constructed with prior-art approaches is solved.
For example, in a possible implementation, the step S110 in the above implementation may further include the following steps:
firstly, judging whether a user portrait analysis instruction is acquired (for example, after the information to be recommended is received, the user portrait analysis instruction is considered to be acquired), and generating user portrait analysis notification information after the user portrait analysis instruction is acquired;
secondly, sending the user portrait analysis notification information to each user terminal device in a plurality of user terminal devices in communication connection, wherein each user terminal device in the plurality of user terminal devices is used for displaying the user portrait analysis notification information to a device user corresponding to the user terminal device after receiving the user portrait analysis notification information, responding to an operation that the device user agrees to perform user portrait analysis based on the user portrait analysis notification information to generate corresponding user portrait analysis confirmation information, and sending the user portrait analysis confirmation information to the user data analysis server;
then, after the user portrait analysis confirmation information sent by each of the plurality of user terminal devices is obtained, user portrait information of a device user corresponding to each of the plurality of user terminal devices is obtained respectively.
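The notify-confirm-collect flow of step S110 above can be sketched as follows. This is a minimal Python sketch, not the patent's implementation: the `Terminal` class and its fields are hypothetical stand-ins, since the patent does not specify concrete interfaces between the user data analysis server and the user terminal devices.

```python
from dataclasses import dataclass

@dataclass
class Terminal:
    """Hypothetical stand-in for a user terminal device."""
    user_id: str
    agrees: bool      # whether the device user agreed to user portrait analysis
    portrait: dict    # user portrait information held for this device user

def collect_user_portraits(instruction_received, terminals):
    """Notify every terminal, then collect portraits only after all confirm.

    Returns a mapping of user id to portrait information, or None if the
    analysis instruction was not received or some user did not confirm.
    """
    if not instruction_received:
        # No user portrait analysis instruction acquired yet.
        return None
    confirmations = [t for t in terminals if t.agrees]
    if len(confirmations) < len(terminals):
        # Per the described flow, portraits are obtained only after the
        # confirmation information from *each* terminal has been received.
        return None
    return {t.user_id: t.portrait for t in terminals}
```

One design point worth noting: because collection waits for every terminal's confirmation, a single non-consenting user blocks the whole batch; a real system would likely proceed with the consenting subset instead.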
For example, in a possible implementation, the step S120 in the above implementation may further include the following steps:
firstly, aiming at each piece of user characteristic information included in predetermined target user portrait information, determining a fusion coefficient corresponding to the user characteristic information;
secondly, for each device user in the plurality of device users corresponding to the plurality of user terminal devices, respectively calculating the matching degree between each piece of user feature information included in the user portrait information corresponding to the device user and the corresponding piece of user feature information included in the target user portrait information, to obtain the feature matching degree corresponding to each piece of user feature information included in the user portrait information;
then, for each of multiple device users corresponding to the multiple user terminal devices, based on a fusion coefficient corresponding to each piece of the user feature information, performing fusion processing (such as weighted summation calculation) on a feature matching degree corresponding to each piece of user feature information included in the user portrait information corresponding to the device user, so as to obtain a user portrait matching degree corresponding to the device user.
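The weighted fusion in the steps above can be sketched as follows. This is a sketch under two assumptions the patent leaves open: feature values are hypothetically taken as numbers in [0, 1], and the per-feature matching degree is hypothetically computed as 1 minus the absolute difference; the fusion processing is the weighted summation the patent gives as an example.

```python
def user_portrait_matching_degree(user_features, target_features, fusion_coefficients):
    """Weighted-sum fusion of per-feature matching degrees.

    user_features / target_features: dicts mapping a feature name to a value
    in [0, 1] (assumed representation, not specified by the patent).
    fusion_coefficients: dict mapping a feature name to its fusion coefficient.
    """
    total = 0.0
    for name, target_value in target_features.items():
        # Assumed feature matching degree: 1 - |difference|; a missing
        # feature is treated as 0.0.
        feature_match = 1.0 - abs(user_features.get(name, 0.0) - target_value)
        # Fusion processing: weighted summation with the fusion coefficient.
        total += fusion_coefficients[name] * feature_match
    return total
```

With normalized fusion coefficients (summing to 1), the result stays in [0, 1] and can be compared directly against a portrait matching degree threshold.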
For example, in a possible implementation manner, the step of determining, for each piece of user feature information included in the predetermined target user portrait information in the above implementation manner, a fusion coefficient corresponding to the user feature information may further include the following steps:
firstly, acquiring each historical target user group formed by historical construction to obtain at least one historical target user group, wherein each historical target user group in the at least one historical target user group comprises at least one historical device user;
secondly, determining historical user portrait information corresponding to each historical target user group in the at least one historical target user group, and performing de-duplication screening on the historical user feature information included in the historical user portrait information corresponding to each historical target user group in the at least one historical target user group (that is, for identical pieces of historical user feature information, only one is retained) to obtain a corresponding historical feature information set;
then, for each piece of historical user feature information in the historical user feature information set, determining a historical device user corresponding to the historical user feature information in the at least one historical target user group as a first historical device user corresponding to the historical user feature information, and determining a fusion coefficient corresponding to the historical user feature information based on the first historical device user;
finally, for each piece of user feature information included in the predetermined target user portrait information, determining the historical user feature information to which the user feature information belongs (for example, the piece of historical user feature information identical to it), and determining the fusion coefficient corresponding to that historical user feature information as the fusion coefficient corresponding to the user feature information.
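The de-duplication screening and the collection of first historical device users described above can be sketched as follows; the group/user dictionary layout is a hypothetical representation chosen for illustration.

```python
def build_historical_feature_set(historical_groups):
    """De-duplication screening: keep one copy of each piece of historical
    user feature information across all historical target user groups."""
    feature_set = []
    seen = set()
    for group in historical_groups:
        for user in group["users"]:
            for feature in user["features"]:
                if feature not in seen:
                    seen.add(feature)
                    feature_set.append(feature)
    return feature_set

def first_historical_users(feature, historical_groups):
    """Collect every historical device user whose portrait contains
    `feature` — the 'first historical device users' for that feature."""
    return [user["id"]
            for group in historical_groups
            for user in group["users"]
            if feature in user["features"]]
```

The first user statistical number of a feature is then simply `len(first_historical_users(feature, groups))`, from which the first coefficient (positively correlated with it) can be derived.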
For example, in a possible implementation manner, in the foregoing implementation manner, for each piece of historical user feature information in the historical feature information set, the step of determining, as a first historical device user corresponding to the historical user feature information, a historical device user corresponding to the historical user feature information in the at least one historical target user group, and determining, based on the first historical device user, a fusion coefficient corresponding to the historical user feature information further includes the following steps:
firstly, for each piece of historical user characteristic information in the historical characteristic information set, determining a historical device user corresponding to the historical user characteristic information in the at least one historical target user group as a first historical device user corresponding to the historical user characteristic information, counting the number of the first historical device users corresponding to the historical user characteristic information to obtain a first user statistical number corresponding to the historical user characteristic information, and determining a first coefficient corresponding to the historical user characteristic information based on the first user statistical number corresponding to the historical user characteristic information, wherein the first coefficient and the first user statistical number have a positive correlation;
secondly, respectively determining the information attention of each historical device user in the historical target user group to the historical recommendation information corresponding to the historical target user group, for each historical target user group in the at least one historical target user group, and determining a group contribution coefficient of each historical device user in the historical target user group based on the information attention corresponding to each historical device user, wherein the group contribution coefficient and the information attention have a positive correlation (for example, the information attention can be directly used as the group contribution coefficient);
then, aiming at each group contribution coefficient, determining the historical formation time of a historical target user group corresponding to the group contribution coefficient, constructing and forming a corresponding two-dimensional coordinate based on the group contribution coefficient and the historical formation time, and determining a coordinate vector corresponding to the two-dimensional coordinate;
then, for each piece of historical user feature information in the historical feature information set, sequentially connecting the coordinate vectors corresponding to the first historical device users corresponding to the historical user feature information, to obtain one connection path corresponding to the historical user feature information, wherein this connecting step is performed multiple times with different connection orders, so as to obtain multiple corresponding connection paths, every two of which are different from each other;
further, for each piece of historical user feature information in the historical feature information set, respectively calculating a vector distance between two adjacent coordinate vectors in each connection path corresponding to the historical user feature information, respectively calculating a sum of the vector distances between two adjacent coordinate vectors in each connection path, obtaining a vector distance sum corresponding to each connection path, determining a connection path corresponding to the vector distance sum having the minimum value as a target connection path corresponding to the historical user feature information, and obtaining a contribution coefficient fusion value corresponding to the historical user feature information by fusing each group contribution coefficient corresponding to the target connection path;
finally, for each piece of historical user feature information in the historical feature information set, determining a fusion coefficient corresponding to the historical user feature information based on the contribution coefficient fusion value and the first coefficient corresponding to the historical user feature information (for example, by calculating their product or average value).
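The path-selection steps above amount to: place each first historical device user at a 2D coordinate (group contribution coefficient, historical formation time), enumerate connection orders, keep the path whose sum of adjacent Euclidean distances is minimal, and fuse the contribution coefficients along it. A minimal sketch follows; the brute-force permutation search is only feasible for small user counts, and the mean/product fusion choices are two of the examples the patent allows, not a prescribed formula.

```python
import itertools
import math

def target_connection_path(coords):
    """coords: list of (group_contribution_coefficient, historical_formation_time).

    Enumerate all connection orders and return the one whose sum of
    adjacent Euclidean distances is minimal (the target connection path).
    """
    best_path, best_sum = None, math.inf
    for perm in itertools.permutations(coords):
        dist = sum(math.dist(perm[i], perm[i + 1]) for i in range(len(perm) - 1))
        if dist < best_sum:
            best_sum, best_path = dist, perm
    return list(best_path)

def fusion_coefficient(coords, first_coefficient):
    """Fuse the group contribution coefficients along the target connection
    path (here: mean), then combine with the first coefficient (here:
    product) -- one possible choice among those the text permits."""
    path = target_connection_path(coords)
    fused = sum(c for c, _t in path) / len(path)
    return fused * first_coefficient
```

Because the path is chosen by minimal total distance, coordinates that are close in both contribution and formation time end up adjacent, which orders the coefficients smoothly before fusion.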
For example, in a possible implementation manner, in the foregoing implementation manner, for each piece of historical user feature information in the historical feature information set, the step of determining, as a first historical device user corresponding to the historical user feature information, a historical device user corresponding to the historical user feature information in the at least one historical target user group, and determining, based on the first historical device user, a fusion coefficient corresponding to the historical user feature information further includes the following steps:
firstly, for each piece of historical user characteristic information in the historical characteristic information set, determining a historical device user corresponding to the historical user characteristic information in the at least one historical target user group as a first historical device user corresponding to the historical user characteristic information, counting the number of the first historical device users corresponding to the historical user characteristic information to obtain a first user statistical number corresponding to the historical user characteristic information, and determining a first coefficient corresponding to the historical user characteristic information based on the first user statistical number corresponding to the historical user characteristic information, wherein the first coefficient and the first user statistical number have a positive correlation;
secondly, respectively determining the information attention of each historical device user in the historical target user group to the historical recommendation information corresponding to the historical target user group aiming at each historical target user group in the at least one historical target user group, and determining a group contribution coefficient of each historical device user in the historical target user group based on the information attention corresponding to each historical device user, wherein the group contribution coefficient and the information attention have positive correlation, and the number of the historical target user groups is multiple;
then, for each historical device user, counting the number of group contribution coefficients corresponding to the historical device user to obtain a coefficient statistical number corresponding to the historical device user, determining a relative size relationship between the coefficient statistical number and a pre-configured statistical number threshold, and determining the historical device user as a second historical device user when the coefficient statistical number is greater than the statistical number threshold, or determining the historical device user as a third historical device user when the coefficient statistical number is less than or equal to the statistical number threshold, wherein a plurality of group contribution coefficients corresponding to each historical device user are sequentially arranged based on the history formation time of a historical target user group corresponding to each group contribution coefficient;
then, determining the coefficient statistical number with the minimum value among the coefficient statistical numbers corresponding to the second historical device users as a first number reference value, and, for each third historical device user, calculating the average value of the group contribution coefficients corresponding to the third historical device user, determining a correlated second historical device user based on this average value and the average value of the group contribution coefficients corresponding to each second historical device user, and interpolating the group contribution coefficients corresponding to the third historical device user based on the group contribution coefficients corresponding to that second historical device user, so as to obtain new group contribution coefficients corresponding to the third historical device user, wherein the number of new group contribution coefficients corresponding to each third historical device user is the same as the number of group contribution coefficients corresponding to the correlated second historical device user;
further, for each of the historical device users, based on the statistical number threshold, performing sliding window processing on the plurality of group contribution coefficients currently possessed by the historical device user to obtain a plurality of coefficient sliding window sequences corresponding to the historical device user, calculating the sequence similarity between every two coefficient sliding window sequences in the plurality of coefficient sliding window sequences (e.g., calculating the coefficient similarity of the group contribution coefficients at corresponding sequence positions and then averaging these coefficient similarities), and, for each coefficient sliding window sequence corresponding to the historical device user, calculating the average value of the sequence similarities between that coefficient sliding window sequence and each other coefficient sliding window sequence to obtain a similarity average value corresponding to the coefficient sliding window sequence, and then determining, among the plurality of coefficient sliding window sequences, the coefficient sliding window sequence with the largest corresponding similarity average value as the target coefficient sliding window sequence corresponding to the historical device user;
further, for each historical device user, determining a target group contribution coefficient corresponding to the historical device user based on the plurality of group contribution coefficients included in the target coefficient sliding window sequence corresponding to the historical device user (for example, calculating the average value or median of those group contribution coefficients and taking it as the target group contribution coefficient), and, for each piece of historical user feature information in the historical feature information set, performing fusion processing on the target group contribution coefficients corresponding to the historical device users corresponding to the historical user feature information, to obtain a contribution coefficient fusion value corresponding to the historical user feature information;
and finally, for each piece of historical user feature information in the historical feature information set, determining a fusion coefficient corresponding to the historical user feature information based on the contribution coefficient fusion value and the first coefficient corresponding to the historical user feature information.
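The sliding-window selection in this alternative implementation can be sketched as follows. Assumptions are labeled in the comments: the per-position coefficient similarity is hypothetically taken as 1 minus the absolute difference, and the target group contribution coefficient is taken as the window mean — both are example choices consistent with, but not mandated by, the text.

```python
def sliding_windows(coefficients, window_size):
    """Sliding window processing over a user's group contribution coefficients."""
    return [coefficients[i:i + window_size]
            for i in range(len(coefficients) - window_size + 1)]

def sequence_similarity(a, b):
    # Assumed per-position coefficient similarity: 1 - |difference|, averaged.
    return sum(1.0 - abs(x - y) for x, y in zip(a, b)) / len(a)

def target_group_contribution(coefficients, window_size):
    """Pick the window most similar on average to all other windows,
    then reduce it (here: mean) to the target group contribution coefficient."""
    windows = sliding_windows(coefficients, window_size)
    if len(windows) == 1:
        best = windows[0]
    else:
        def mean_sim(w):
            others = [o for o in windows if o is not w]
            return sum(sequence_similarity(w, o) for o in others) / len(others)
        best = max(windows, key=mean_sim)
    return sum(best) / len(best)
```

Choosing the window with the highest average similarity to its peers acts as an outlier filter: a window dominated by an atypical early or late contribution value scores low and is discarded.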
For example, in a possible implementation, the step S130 in the above implementation may further include the following steps:
first, for each of the multiple device users, determining a relative size relationship between a user portrait matching degree corresponding to the device user and a pre-configured portrait matching degree threshold (e.g., whether the user portrait matching degree is greater than or equal to the portrait matching degree threshold);
secondly, for each of the multiple device users, if the user portrait matching degree corresponding to the device user is greater than or equal to the portrait matching degree threshold, the device user is determined as a target device user, and if the user portrait matching degree corresponding to the device user is smaller than the portrait matching degree threshold, the device user is determined as a non-target device user.
For example, in a possible implementation, the step S130 in the above implementation may further include the following steps:
firstly, sorting the device users based on the user portrait matching degree corresponding to each device user in the plurality of device users to obtain a user sorting sequence corresponding to the plurality of device users, wherein the sorting is performed in descending or ascending order of the user portrait matching degree corresponding to each device user;
and secondly, acquiring group quantity range information configured in advance for the target user group, and, based on the group quantity range information, selecting from the user sorting sequence the device users with the highest user portrait matching degrees, in a number falling within the configured quantity range, as the target device users.
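This sort-then-select variant of step S130 can be sketched as follows; the `(min_count, max_count)` tuple is a hypothetical encoding of the group quantity range information, and here the sketch simply takes up to the maximum allowed count.

```python
def select_target_users(matching_degrees, quantity_range):
    """matching_degrees: dict mapping a device user id to its user portrait
    matching degree. quantity_range: (min_count, max_count) configured in
    advance for the target user group (assumed encoding)."""
    # User sorting sequence: descending by user portrait matching degree.
    ranked = sorted(matching_degrees, key=matching_degrees.get, reverse=True)
    _min_count, max_count = quantity_range
    # Select the users with the highest matching degrees, up to max_count.
    return ranked[:max_count]
```

Compared with the fixed-threshold variant (steps after S130 above), this form guarantees the group size stays within the configured range even when many or few users clear any particular matching-degree level.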
With reference to fig. 3, an embodiment of the present invention further provides a big data-based user representation analysis system, which is applicable to the user data analysis server. The user representation analysis system may include a user representation acquisition module, a representation matching degree determination module, and a target user determination module.
The user portrait acquisition module is used for acquiring user portrait information of an equipment user corresponding to each user terminal equipment in a plurality of user terminal equipments in communication connection, wherein the user portrait information is constructed and formed on the basis of user characteristic information obtained by data acquisition of the corresponding equipment user.
The portrait matching degree determining module is used for obtaining, for each of a plurality of device users corresponding to the plurality of user terminal devices, a user portrait matching degree corresponding to the device user based on a matching degree between user portrait information corresponding to the device user and predetermined target user portrait information, wherein the target user portrait information is constructed and formed based on user feature information of a target user group to be constructed and formed.
The target user determination module is configured to determine a target device user among the multiple device users based on a user portrait matching degree corresponding to each device user among the multiple device users, where each determined target device user is used to construct and form the target user group.
For example, in a possible implementation, the user representation obtaining module is specifically configured to (refer to the related description of step S110 above):
judging whether a user portrait analysis instruction is acquired or not, and generating user portrait analysis notification information after the user portrait analysis instruction is acquired;
sending the user portrait analysis notification information to each of a plurality of user terminal devices in communication connection, wherein each of the user terminal devices is configured to, after receiving the user portrait analysis notification information, display the user portrait analysis notification information to a device user corresponding to the user terminal device, generate corresponding user portrait analysis confirmation information in response to an operation that the device user agrees to perform user portrait analysis based on the user portrait analysis notification information, and send the user portrait analysis confirmation information to the user data analysis server;
after the user portrait analysis confirmation information sent by each of the plurality of user terminal devices is obtained, user portrait information of a device user corresponding to each of the plurality of user terminal devices is obtained respectively.
For example, in a possible implementation, the image matching degree determining module is specifically configured to (refer to the related description of step S120 above):
determining a fusion coefficient corresponding to each piece of user characteristic information included in the predetermined target user portrait information;
for each equipment user in a plurality of equipment users corresponding to the plurality of user terminal equipment, respectively calculating the matching degree between each piece of user characteristic information included in the user portrait information corresponding to the equipment user and the corresponding piece of user characteristic information included in the target user portrait information to obtain the characteristic matching degree corresponding to each piece of user characteristic information included in the user portrait information;
and for each equipment user in a plurality of equipment users corresponding to the plurality of user terminal equipment, based on the fusion coefficient corresponding to each piece of user characteristic information, fusing the characteristic matching degree corresponding to each piece of user characteristic information included in the user portrait information corresponding to the equipment user to obtain the user portrait matching degree corresponding to the equipment user.
In summary, the method and system for user portrait analysis based on big data according to the present invention may first obtain the user portrait information of the device user corresponding to each user terminal device. Then, for each device user of the plurality of device users corresponding to the plurality of user terminal devices, a user portrait matching degree corresponding to the device user is obtained based on the matching degree between the user portrait information corresponding to the device user and predetermined target user portrait information, so that the target device users may be determined among the plurality of device users based on the user portrait matching degree corresponding to each device user. This ensures a high matching degree between the determined target device users and the target user portrait information (i.e., the characteristics of the required users) representing the application requirements, and thus ensures the reliability of the target user group constructed based on the determined target device users, thereby solving the problem of poor reliability of a user group constructed and formed based on the prior art.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus and method embodiments described above are illustrative only, as the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, an electronic device, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a portable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, and various media capable of storing program codes. It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A big data-based user portrait analysis method is applied to a user data analysis server, and comprises the following steps:
the method comprises the steps that user portrait information of an equipment user corresponding to a plurality of user terminal equipment in communication connection is obtained for each user terminal equipment, wherein the user portrait information is constructed and formed on the basis of user characteristic information obtained by data acquisition of the corresponding equipment user;
for each device user in a plurality of device users corresponding to the user terminal devices, obtaining a user portrait matching degree corresponding to the device user based on a matching degree between user portrait information corresponding to the device user and predetermined target user portrait information, wherein the target user portrait information is constructed and formed based on user characteristic information of a target user group to be constructed and formed;
and determining target device users in the plurality of device users based on the user portrait matching degree corresponding to each device user in the plurality of device users, wherein each determined target device user is used for constructing and forming the target user group.
2. The big data based user representation analysis method of claim 1, wherein the step of obtaining, for each of the plurality of user terminal devices, user representation information of a device user corresponding to the user terminal device comprises:
judging whether a user portrait analysis instruction is acquired or not, and generating user portrait analysis notification information after the user portrait analysis instruction is acquired;
sending the user portrait analysis notification information to each of a plurality of user terminal devices in communication connection, wherein each of the user terminal devices is configured to display the user portrait analysis notification information to a device user corresponding to the user terminal device after receiving the user portrait analysis notification information, generate corresponding user portrait analysis confirmation information in response to an operation that the device user agrees to perform user portrait analysis based on the user portrait analysis notification information, and send the user portrait analysis confirmation information to the user data analysis server;
after the user portrait analysis confirmation information sent by each of the plurality of user terminal devices is obtained, user portrait information of a device user corresponding to each of the plurality of user terminal devices is obtained respectively.
3. The big data-based user representation analysis method of claim 1, wherein the step of obtaining, for each of a plurality of device users corresponding to the plurality of user terminal devices, a user representation matching degree corresponding to the device user based on a matching degree between user representation information corresponding to the device user and predetermined target user representation information comprises:
aiming at each piece of user characteristic information included in the predetermined target user portrait information, determining a fusion coefficient corresponding to the user characteristic information;
for each equipment user in a plurality of equipment users corresponding to the plurality of user terminal equipment, respectively calculating the matching degree between each piece of user characteristic information included in the user portrait information corresponding to the equipment user and the corresponding piece of user characteristic information included in the target user portrait information to obtain the characteristic matching degree corresponding to each piece of user characteristic information included in the user portrait information;
and for each equipment user in a plurality of equipment users corresponding to the plurality of user terminal equipment, based on the fusion coefficient corresponding to each piece of user characteristic information, fusing the characteristic matching degree corresponding to each piece of user characteristic information included in the user portrait information corresponding to the equipment user to obtain the user portrait matching degree corresponding to the equipment user.
4. The big data-based user representation analysis method of claim 3, wherein the step of determining, for each piece of user feature information included in the predetermined target user representation information, a fusion coefficient corresponding to the user feature information comprises:
obtaining each historical target user group formed by historical construction to obtain at least one historical target user group, wherein each historical target user group in the at least one historical target user group comprises at least one historical device user;
aiming at each historical target user group in the at least one historical target user group, determining historical user portrait information corresponding to the historical target user group, and performing duplication elimination screening on historical user characteristic information included in the historical user portrait information corresponding to each historical target user group in the at least one historical target user group to obtain a corresponding historical characteristic information set;
for each piece of historical user characteristic information in the historical characteristic information set, determining a historical device user corresponding to the historical user characteristic information in the at least one historical target user group as a first historical device user corresponding to the historical user characteristic information, and determining a fusion coefficient corresponding to the historical user characteristic information based on the first historical device user;
and for each piece of user feature information included in the predetermined target user portrait information, determining the historical user feature information to which the user feature information belongs, and determining the fusion coefficient corresponding to that historical user feature information as the fusion coefficient corresponding to the user feature information.
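The deduplication screening and coefficient lookup of claim 4 can be sketched as a set union followed by a dictionary lookup. The group names, feature identifiers, and coefficient values below are hypothetical stand-ins for the claimed data:

```python
# Hypothetical structure: each historical target user group maps to the
# feature ids contained in its historical user portrait information.
historical_groups = {
    "group_a": ["age_band", "region", "interest"],
    "group_b": ["region", "device_type"],
}

# Deduplication screening: the union of feature info across all groups
# forms the historical feature information set.
historical_feature_set = set()
for features in historical_groups.values():
    historical_feature_set.update(features)

# Fusion-coefficient lookup: a piece of target user feature information
# inherits the coefficient of the historical feature it belongs to
# (coefficients here are placeholder values).
coeff_by_feature = {f: 1.0 for f in historical_feature_set}
target_features = ["region", "interest"]
fusion_coeffs = {f: coeff_by_feature[f] for f in target_features}
```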
5. The big data-based user portrait analysis method of claim 4, wherein the step of determining, for each piece of historical user feature information in the historical feature information set, the historical device users corresponding to the historical user feature information in the at least one historical target user group as first historical device users, and determining a fusion coefficient corresponding to the historical user feature information based on the first historical device users, comprises:
for each piece of historical user feature information in the historical feature information set, determining the historical device users corresponding to the historical user feature information in the at least one historical target user group as first historical device users corresponding to the historical user feature information, counting the number of first historical device users corresponding to the historical user feature information to obtain a first user statistical number corresponding to the historical user feature information, and determining a first coefficient corresponding to the historical user feature information based on the first user statistical number, wherein the first coefficient is positively correlated with the first user statistical number;
for each historical target user group in the at least one historical target user group, respectively determining the information attention degree of each historical device user in the historical target user group to the historical recommendation information corresponding to the historical target user group, and determining a group contribution coefficient of each historical device user in the historical target user group based on the information attention degree corresponding to the historical device user, wherein the group contribution coefficient is positively correlated with the information attention degree;
for each group contribution coefficient, determining the historical formation time of the historical target user group corresponding to the group contribution coefficient, constructing a corresponding two-dimensional coordinate based on the group contribution coefficient and the historical formation time, and determining the coordinate vector corresponding to the two-dimensional coordinate;
for each piece of historical user feature information in the historical feature information set, sequentially connecting the coordinate vectors corresponding to the first historical device users corresponding to the historical user feature information to obtain one connection path corresponding to the historical user feature information, and repeating this connection step with different orderings to obtain multiple corresponding connection paths, wherein every two connection paths in the multiple connection paths are different from each other;
for each piece of historical user feature information in the historical feature information set, respectively calculating the vector distance between every two adjacent coordinate vectors in each connection path corresponding to the historical user feature information, summing these vector distances for each connection path to obtain a vector distance sum corresponding to each connection path, determining the connection path with the smallest vector distance sum as the target connection path corresponding to the historical user feature information, and fusing the group contribution coefficients corresponding to the target connection path to obtain a contribution coefficient fusion value corresponding to the historical user feature information;
and for each piece of historical user feature information in the historical feature information set, determining the fusion coefficient corresponding to the historical user feature information based on the contribution coefficient fusion value and the first coefficient corresponding to the historical user feature information.
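For a small number of first historical device users, the target connection path of claim 5 can be found by exhaustively enumerating orderings of the coordinate vectors. The sketch below uses Euclidean distance and an averaging fusion; both choices are assumptions, since the claim leaves the distance measure and fusion operation open:

```python
from itertools import permutations
from math import dist

def target_path_fusion(points):
    """points: list of (group_contribution_coefficient, formation_time) pairs,
    one two-dimensional coordinate per first historical device user.

    Enumerates all orderings (connection paths), picks the one whose sum of
    adjacent Euclidean distances is smallest (the target connection path),
    and fuses the contribution coefficients along that path by averaging.
    """
    best_path, best_len = None, float("inf")
    for path in permutations(points):
        length = sum(dist(path[i], path[i + 1]) for i in range(len(path) - 1))
        if length < best_len:
            best_path, best_len = path, length
    # Fuse the group contribution coefficients (x-coordinates) on the path.
    fused = sum(p[0] for p in best_path) / len(best_path)
    return best_path, fused
```

Brute-force enumeration is only feasible for a handful of users: the shortest open path through many points is an instance of the travelling-salesman problem, so a production system would need a heuristic (e.g. nearest-neighbor ordering) instead.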
6. The big data-based user portrait analysis method of any one of claims 1-5, wherein the step of determining a target device user among the plurality of device users based on the user portrait matching degree corresponding to each device user in the plurality of device users comprises:
for each device user in the plurality of device users, determining the relative magnitude relation between the user portrait matching degree corresponding to the device user and a preset portrait matching degree threshold;
and for each device user in the plurality of device users, if the user portrait matching degree corresponding to the device user is greater than or equal to the portrait matching degree threshold, determining the device user as a target device user, and if the user portrait matching degree corresponding to the device user is less than the portrait matching degree threshold, determining the device user as a non-target device user.
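The threshold rule of claim 6 amounts to a simple partition of the device users. A minimal sketch, where the threshold value and user ids are illustrative assumptions:

```python
def select_by_threshold(matching_degrees, threshold=0.7):
    """Partition device users by user portrait matching degree.

    matching_degrees: dict user_id -> user portrait matching degree.
    Returns (target_users, non_target_users) as sorted lists.
    """
    targets = sorted(u for u, m in matching_degrees.items() if m >= threshold)
    non_targets = sorted(u for u, m in matching_degrees.items() if m < threshold)
    return targets, non_targets

targets, non_targets = select_by_threshold({"u1": 0.9, "u2": 0.4, "u3": 0.7})
```

Note that a user exactly at the threshold (here `u3`) counts as a target device user, matching the "greater than or equal to" condition in the claim.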
7. The big data-based user portrait analysis method of any one of claims 1-5, wherein the step of determining a target device user among the plurality of device users based on the user portrait matching degree corresponding to each device user in the plurality of device users comprises:
ranking the device users based on the user portrait matching degree corresponding to each device user in the plurality of device users to obtain a user ranking sequence corresponding to the plurality of device users, wherein the device users are ranked in order of their user portrait matching degrees, either in descending order or in ascending order;
and acquiring group quantity range information configured in advance for the target user group, and selecting, based on the group quantity range information, the corresponding number of device users with the highest user portrait matching degrees in the user ranking sequence as target device users.
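The selection of claim 7 can be sketched as a sort followed by a top-N cut. The representation of the group quantity range as a (min, max) pair is an assumption for illustration:

```python
def select_top_users(matching_degrees, quantity_range):
    """Select target device users by ranked user portrait matching degree.

    matching_degrees: dict user_id -> user portrait matching degree.
    quantity_range:   (min_count, max_count) configured for the target group.
    """
    # Descending order: best-matching device users first.
    ranked = sorted(matching_degrees, key=matching_degrees.get, reverse=True)
    min_n, max_n = quantity_range
    if len(ranked) < min_n:
        raise ValueError("not enough device users to form the target group")
    # Take the highest-matching users, up to the configured maximum.
    return ranked[:min(max_n, len(ranked))]

chosen = select_top_users({"u1": 0.9, "u2": 0.4, "u3": 0.7}, (1, 2))
```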
8. A big data-based user portrait analysis system, applied to a user data analysis server, the big data-based user portrait analysis system comprising:
a user portrait acquisition module, configured to acquire, for each user terminal device in a plurality of communicatively connected user terminal devices, user portrait information of the device user corresponding to the user terminal device, wherein the user portrait information is constructed based on user feature information obtained by data acquisition for the corresponding device user;
a portrait matching degree determining module, configured to obtain, for each device user corresponding to the plurality of user terminal devices, a user portrait matching degree based on the matching degree between the user portrait information corresponding to the device user and predetermined target user portrait information, wherein the target user portrait information is constructed based on the user feature information of a target user group to be formed;
and a target user determining module, configured to determine a target device user from the plurality of device users based on the user portrait matching degree corresponding to each device user, wherein the determined target device users are used to form the target user group.
9. The big data-based user portrait analysis system of claim 8, wherein the user portrait acquisition module is specifically configured to:
judge whether a user portrait analysis instruction has been acquired, and generate user portrait analysis notification information after the user portrait analysis instruction is acquired;
send the user portrait analysis notification information to each of the plurality of communicatively connected user terminal devices, wherein each user terminal device is configured, after receiving the user portrait analysis notification information, to display it to the device user corresponding to the user terminal device, to generate corresponding user portrait analysis confirmation information in response to an operation by which the device user agrees to user portrait analysis based on the notification information, and to send the user portrait analysis confirmation information to the user data analysis server;
and after acquiring the user portrait analysis confirmation information sent by each of the plurality of user terminal devices, respectively acquire the user portrait information of the device user corresponding to each user terminal device.
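The notification-and-confirmation handshake of claim 9 can be sketched with in-memory stand-ins; the function and data names below are hypothetical, and the terminal dictionary stands in for real message exchange with the user terminal devices:

```python
def run_portrait_analysis(instruction_received, terminals):
    """Model the consent flow between server and user terminal devices.

    instruction_received: whether a user portrait analysis instruction arrived.
    terminals: dict terminal_id -> device user's consent decision (bool).

    Returns the terminal ids whose users confirmed, i.e. the terminals whose
    user portrait information the server may subsequently acquire.
    """
    if not instruction_received:      # no analysis instruction yet: do nothing
        return []
    notified = list(terminals)        # send notification info to every terminal
    # A terminal replies with confirmation info only if its user agrees.
    return [t for t in notified if terminals[t]]
```

The key property is that portrait information is acquired only for users who explicitly confirmed, never for the merely notified.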
10. The big data-based user portrait analysis system of claim 8, wherein the portrait matching degree determining module is specifically configured to:
determine the fusion coefficient corresponding to each piece of user feature information included in the predetermined target user portrait information;
for each device user in the plurality of device users corresponding to the plurality of user terminal devices, respectively calculate the matching degree between each piece of user feature information included in the user portrait information corresponding to the device user and the corresponding piece of user feature information included in the target user portrait information, to obtain the feature matching degree corresponding to each piece of user feature information included in the user portrait information;
and for each device user in the plurality of device users corresponding to the plurality of user terminal devices, fuse, based on the fusion coefficient corresponding to each piece of user feature information, the feature matching degree corresponding to each piece of user feature information included in the user portrait information corresponding to the device user, to obtain the user portrait matching degree corresponding to the device user.
CN202210321101.3A 2022-03-30 2022-03-30 User portrait analysis method and system based on big data Active CN115098793B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210321101.3A CN115098793B (en) 2022-03-30 2022-03-30 User portrait analysis method and system based on big data


Publications (2)

Publication Number Publication Date
CN115098793A true CN115098793A (en) 2022-09-23
CN115098793B CN115098793B (en) 2023-05-09

Family

ID=83287628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210321101.3A Active CN115098793B (en) 2022-03-30 2022-03-30 User portrait analysis method and system based on big data

Country Status (1)

Country Link
CN (1) CN115098793B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150356439A1 (en) * 2014-06-09 2015-12-10 Cognite, Inc. Travel-Related Cognitive Personas
CN110110321A (en) * 2019-03-19 2019-08-09 深圳壹账通智能科技有限公司 Products Show method, apparatus, equipment and storage medium based on voice data
CN109710811B (en) * 2018-11-28 2021-03-02 汉海信息技术(上海)有限公司 User portrait detection method, device and application system
CN113254833A (en) * 2021-06-07 2021-08-13 深圳市中元产教融合科技有限公司 Information pushing method and service system based on birth teaching fusion
CN113282631A (en) * 2020-02-20 2021-08-20 上海哔哩哔哩科技有限公司 Method and equipment for determining target user based on user portrait data
CN113641899A (en) * 2021-08-04 2021-11-12 王钊 User portrait analysis method and device based on distance education and server
CN114021016A (en) * 2021-11-05 2022-02-08 山东库睿科技有限公司 Data recommendation method, device, equipment and storage medium


Also Published As

Publication number Publication date
CN115098793B (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN107305611B (en) Method and device for establishing model corresponding to malicious account and method and device for identifying malicious account
CN109299356B (en) Activity recommendation method and device based on big data, electronic equipment and storage medium
CN110909222B (en) User portrait establishing method and device based on clustering, medium and electronic equipment
CN110782286A (en) Advertisement pushing method and device, server and computer readable storage medium
CN111797320A (en) Data processing method, device, equipment and storage medium
CN112214677A (en) Interest point recommendation method and device, electronic equipment and storage medium
CN111626780A (en) Information pushing method and device, server and storage medium
CN113485931A (en) Test method, test device, electronic equipment and computer readable storage medium
CN115098793B (en) User portrait analysis method and system based on big data
CN113949881B (en) Business processing method and system based on smart city data
CN114125813B (en) Signal coverage range determining method based on mobile phone signaling and related device
CN111368858A (en) User satisfaction evaluation method and device
EP4349055A2 (en) Dimensioning of telecommunication infrastructure
CN114416786A (en) Stream data processing method and device, storage medium and computer equipment
KR102323424B1 (en) Rating Prediction Method for Recommendation Algorithm Based on Observed Ratings and Similarity Graphs
CN113256366A (en) Order data processing method and system based on big data and cloud computing
CN113609111A (en) Big data testing method and system
CN113407750A (en) Image distributed storage method and system
CN114416829A (en) Network training method based on machine learning and cloud authentication service system
CN113724023B (en) Media resource pushing method and device, electronic equipment and storage medium
CN111198941A (en) Problem discovery method and device, electronic equipment and storage medium
CN110688508A (en) Image-text data expansion method and device and electronic equipment
CN115375412B (en) Intelligent commodity recommendation processing method and system based on image recognition
CN113836402B (en) Order screening method based on data processing
CN116501993B (en) House source data recommendation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230414

Address after: Room 2201-1, Building 1, No. 2167, Zhenbei Road, Putuo District, Shanghai, 200333

Applicant after: Weimaikejian Group Co.,Ltd.

Address before: No. 7 Zhenxing Avenue, Simao District, Pu'er City, Yunnan Province 665000

Applicant before: Chen Yingshu

GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Room 2201-1, Building 1, No. 2167, Zhenbei Road, Putuo District, Shanghai, 200333

Patentee after: Weimai Technology Co.,Ltd.

Address before: Room 2201-1, Building 1, No. 2167, Zhenbei Road, Putuo District, Shanghai, 200333

Patentee before: Weimaikejian Group Co.,Ltd.
