CN112948226B - User portrait drawing method and device - Google Patents

User portrait drawing method and device

Info

Publication number
CN112948226B
CN112948226B (application CN202110163487.5A)
Authority
CN
China
Prior art keywords
data
user
analysis
interface
embedding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110163487.5A
Other languages
Chinese (zh)
Other versions
CN112948226A (en)
Inventor
袁潇锋
肖群
王进
关宇坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Construction Bank Corp
Original Assignee
China Construction Bank Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Construction Bank Corp
Priority to CN202110163487.5A
Publication of CN112948226A
Application granted
Publication of CN112948226B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3438: Monitoring of user actions
    • G06F 11/3003: Monitoring arrangements specially adapted to the computing system or computing system component being monitored
    • G06F 11/302: Where the computing system component is a software system
    • G06F 11/3065: Monitoring arrangements determined by the means or processing involved in reporting the monitored data
    • G06F 11/3072: Where the reporting involves data filtering, e.g. pattern matching, time or event triggered, adaptive or policy-based reporting
    • G06F 11/3089: Monitoring arrangements determined by the means or processing involved in sensing the monitored data, e.g. interfaces, connectors, sensors, probes, agents
    • G06F 11/3093: Configuration details thereof, e.g. installation, enabling, spatial arrangement of the probes

Abstract

The invention relates to the technical field of user behavior data acquisition and analysis, in particular to a user portrait drawing method and device. The method comprises the following steps: burying points in a target program by utilizing the SDK; the target program comprises a webpage program and/or an APP program; monitoring behavior events of an interface when a target program runs to obtain buried data; processing and transmitting the obtained buried point data according to preset data screening and cleaning rules; analyzing the buried point data and drawing a target user portrait; wherein the target user is a representative of at least one user using the target program. Compared with the prior art, the user portrait drawing scheme improves the accuracy of user portrait drawing.

Description

User portrait drawing method and device
Technical Field
The invention relates to the technical field of user behavior data acquisition and analysis, in particular to a user portrait drawing method and device.
Background
A user portrait, also called a user role, is an effective tool for profiling target users and connecting user demands with design direction, and is widely applied in many fields. Essentially every APP needs user portrait drawing in order to provide targeted services to users and improve user experience and user stickiness. To cope with increasingly fierce market competition and win more user resources, how to accurately grasp customers, accurately describe user portraits and improve user experience is a topic of continuous, in-depth research. The collection and analysis of user behavior data is a vital precondition for user portrait drawing. At present, mainstream Internet user behavior data acquisition adopts buried point (event tracking) technology, and many buried point schemes exist; however, the existing buried point technical schemes are highly invasive to application code during the burying process, which affects subsequent application management and maintenance, increases resource consumption during burying, and reduces burying efficiency.
Disclosure of Invention
The object of the present application is to solve at least one of the technical drawbacks mentioned above. The technical scheme adopted by the application is as follows:
in a first aspect, an embodiment of the present application discloses a user portrait drawing method, where the method includes:
burying points in a target program by utilizing the SDK; the target program comprises a webpage program and/or an APP program;
monitoring behavior events of an interface when a target program runs to obtain buried data;
processing and transmitting the obtained buried point data according to preset data screening and cleaning rules;
analyzing the buried point data and drawing a target user portrait; wherein the target user is a representative of at least one user using the target program.
Further, performing the buried point in the target program includes: embedding points in the target program by adopting code embedding points and/or visual embedding point technology.
Further, embedding points in the target program by using the code embedding point technology comprises:
selecting a computer language in which to write the buried point code; wherein the computer language includes, but is not limited to, any one of the following: HTML, CSS, JavaScript;
embedding the completed embedded point code file into a target program through a target program interface; wherein the embedded point code file and the files of the target program remain independent.
Further, the embedding the point in the target program by adopting the code embedding point technology further comprises:
writing a self-invoking function in the embedded point code;
setting the operation enabling state of the embedded point code file to be an opening state; or, setting the operation enabling parameter of the target program and the enabling parameter of the embedded point code file to be the same parameter.
Further, the buried point data acquired from behavior events of the interface includes, but is not limited to: click events, exposure events, and page dwell time.
Further, the preset data screening rule includes at least one of the following:
eliminating or correcting interface up-down sliding events exceeding a preset frequency;
eliminating or correcting click events on page elements which do not relate to skip operation in the interface;
and eliminating or correcting other interface behavior events exceeding a preset frequency.
Further, the preset data cleaning rule includes: culling or correcting non-closed loop behavioral operational events.
Further, the buried data processed according to the preset data screening and cleaning rules are stored and then transmitted to a server for analysis.
Further, the analyzing the buried point data and drawing the target user portrait includes: analyzing the obtained buried point data according to preset dimensions, wherein the preset dimensions comprise: social attributes, lifestyle habits, consumption behavior.
Further, after analyzing the acquired buried data according to a preset dimension, the method further includes: analyzing the acquired buried data in at least one of the following ways: flow statistics analysis, behavior path construction analysis and user equipment management analysis.
In another aspect, an embodiment of the present application provides a data embedding device, where the device includes: the system comprises a buried point module, a monitoring module, a storage module, a processing module, an analysis module and a drawing module, wherein,
the embedded point module is used for embedding points in the target program by utilizing the SDK; the target program comprises a webpage program and/or an APP program;
the monitoring module is used for monitoring behavior events of the interface when the target program runs to obtain buried data;
the storage module is used for storing preset data screening and cleaning rules;
the processing module is used for processing and transmitting the obtained buried point data according to preset data screening and cleaning rules;
the analysis module is used for analyzing the buried point data;
the drawing module is used for drawing the target user portrait; wherein the target user is a representative of at least one user using the target program.
Further, the embedded point module is specifically configured to embed points in the target program by using a code embedded point and/or visual embedded point technology.
Further, the buried point module further includes: a selection unit and an interface unit; wherein,
the selecting unit is used for selecting a computer language in which to write the buried point code; wherein the computer language includes, but is not limited to, any one of the following: HTML, CSS, JavaScript;
the interface unit is used for embedding the completed embedded point code file into a target program through a target program interface; wherein the embedded point code file and the files of the target program remain independent.
Further, the preset data screening rule includes: eliminating or correcting up-and-down sliding events of the interface exceeding a preset frequency, and/or eliminating or correcting clicking events of page elements which do not relate to jumping operation in the interface; and/or eliminating or correcting other interface behavior events exceeding a preset frequency;
the preset data cleaning rule comprises the following steps: culling or correcting non-closed loop behavioral operational events.
In a third aspect, embodiments of the present application provide an electronic device including a processor and a memory;
The memory is used for storing operation instructions;
the processor is configured to execute the method described in any one of the foregoing embodiments by calling the operation instruction.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements a method according to any of the embodiments described above.
According to the data point embedding scheme provided by the embodiment of the application, the SDK is utilized to embed points in the target program; the target program comprises a webpage program and/or an APP program; monitoring behavior events of an interface when a target program runs to obtain buried data; processing and transmitting the obtained buried point data according to preset data screening and cleaning rules; analyzing the buried point data and drawing a target user portrait; wherein the target user is a representative of at least one user using the target program. The technical scheme provided by the embodiment of the application has the beneficial effects that the method at least comprises one of the following steps:
(1) In the embodiment of the application, the embedded point code file and the target program service file are relatively independent, do not influence each other and do not interfere each other, so that the problem that various embedded point codes invade original pages and service codes greatly at present is solved.
(2) According to the embodiment of the application, the buried point data after data processing is temporarily stored, so that the loss of the buried point data can be avoided, and the problem that the buried point data cannot be transmitted to a server side in real time due to the fact that a user network or equipment fails in some schemes in the prior art is solved.
(3) According to the embodiment of the application, the buried point data are screened and cleaned according to the preset rule, so that the problem that the data are not filtered in the data uploading process of the client side in the buried point code in the prior art is solved, and then a plurality of invalid data are uploaded to the server side, and finally the server side receives a large amount of redundant data to influence the accuracy of data analysis.
(4) Compared with the prior art, the scheme of the embodiment of the application improves the accuracy of drawing the user portrait.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings that are required to be used in the description of the embodiments of the present application will be briefly described below.
FIG. 1 is a flow chart of a user portrait drawing method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a data point burying device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of illustrating the present application and are not to be construed as limiting the invention.
It will be appreciated by those of skill in the art that, unless expressly stated otherwise, the singular forms "a," "an," and "the" are intended to include the plural forms as well, and that "first," "second," etc. are used only for clarity of description and are not intended to limit the subject itself; the subjects defined by "first" and "second" may be the same or different terminals, devices, users, etc. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The term "and/or" as used herein includes all or any element and all combinations of one or more of the associated listed items.
Furthermore, it should be understood that in embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a alone, a and B together, and B alone, wherein A, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one (item) below" or the like, refers to any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b or c may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple.
In order to more clearly describe the technical solutions of the present application, the following describes some concepts, terms or devices that may be related to the following embodiments, so as to help understand the data embedding scheme disclosed in the present application:
A user portrait, also called a user role, is an effective tool for profiling target users and connecting user demands with design direction, and is widely used in many fields. A user portrait is a virtual representation of real users: it is grounded in reality but is not a specific person. Users are distinguished into different types according to differences in their goals and behaviors, similar users are quickly clustered together, and each cluster is then refined into one type of user portrait. For a specific requirement, roughly 4-8 types of user portraits are usually needed for analysis. In practice, the user's attributes, behaviors and expectations are usually expressed in the most intuitive, everyday language. As a virtual representation of actual users, the user portrait cannot be built apart from the product and the market, and the resulting user role needs to represent the main audience and target group of the product.
SDKs (Software Development Kit, software development kits) are typically a collection of development tools that some software engineers build application software for a particular software package, software framework, hardware platform, operating system, etc.
"Buried point" (event tracking) refers to the techniques and implementations for capturing, processing and transmitting data about specific user actions or events. Buried point techniques can be classified into code buried points and visual buried points. Burying points is an indispensable step both in the test stage of an application and in its on-line operation stage; especially during operation, an APP needs to analyze each user operation behavior acquired through buried points, abstract the information into tags, and then use the tags to better characterize the user portrait, thereby providing targeted services to the user. The embodiment of the application provides a user portrait drawing scheme based on buried point technology.
IMEI (International Mobile Equipment Identity): international Mobile equipment identity code
IMSI (International Mobile Subscriber Identification Number): international mobile subscriber identity
UUID (Universally Unique Identifier): universal unique identification code
MAC (Media Access Control Address): media access control address
The placeholder attribute provides a hint message that may describe the expected value of an input field, which hint may be displayed when the input field is empty and may disappear when the field gets focus.
Fig. 1 shows a schematic flow chart of a data embedding point provided in an embodiment of the present application, and as shown in fig. 1, the method may mainly include:
s101, burying points in a target program by utilizing an SDK; the target program comprises a webpage program and/or an APP program;
in a further alternative embodiment, performing the burial point in the target program includes: embedding points in the target program by adopting code embedding points and/or visual embedding point technology.
In a preferred embodiment, the implementation procedure of embedding points in the target program by adopting the code embedding point technology includes:
step 1, selecting a computer language in which to write the buried point code; wherein the computer language includes, but is not limited to, any one of the following: HTML, CSS, JavaScript;
step 2, embedding the completed embedded point code file into a target program through a target program interface; wherein the embedded code file or files in the object remain independent. In the embodiment of the application, the embedded point code file and the target program service file are relatively independent, do not influence each other and do not interfere each other, so that the problem that various embedded point codes invade original pages and service codes greatly at present is solved.
Specifically, embedding points in the target program by using the code embedding point technology further comprises:
Step 1, writing a self-invoking function in the embedded point code, so that the embedded point code runs automatically as soon as the code file is loaded in the target program.
Step 2-1, setting the operation enabling state of the embedded point code file to the open state, i.e. keeping the embedded point code running at all times to monitor the user terminal interface and acquire interface behavior events; or, alternatively, adopting the mode of step 2-2;
Step 2-2, setting the operation enabling parameter of the target program and the enabling parameter of the embedded point code file to the same parameter, i.e. the associated embedded point code runs synchronously when the target program runs and starts monitoring behavior events on the target program's operation interface; likewise, when the target program stops running, the associated embedded point code also stops loading and running.
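A sketch of how the self-invoking function and the enabling parameter described above might look inside the embedded point file; the flag name APP_RUNNING and the default-on behavior are assumptions, not part of the patent.

```javascript
// tracker.js: runs by itself as soon as the file is loaded (step 1),
// but only starts listening when the enabling state/parameter is on (steps 2-1 / 2-2).
(function () {
  'use strict';
  // Either a constant "open" state, or a parameter shared with the target program (assumed name).
  var enabled = window.APP_RUNNING !== undefined ? window.APP_RUNNING : true;

  if (!enabled) {
    return; // embedded point code stays idle when the target program is not running
  }

  // start monitoring interface behavior events here ...
  document.addEventListener('click', function (event) {
    console.log('buried point captured click on', event.target.tagName);
  });
})();
```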
Further, a global click event binding is adopted in the embedded point code file to acquire the user's interface operation behavior: when the user enters the program operation interface and performs operations such as sliding to browse or clicking content, the events are processed according to the DOM attributes of the different elements clicked in the interface, as illustrated by the sketch after the following list. For example:
(1) When the user clicks a <b></b>, <p></p>, <span></span>, <text></text> or <view></view> tag in the interface, the content of the clicked element is acquired through the innerText attribute of the element.
(2) When the user clicks an <img/> tag in the interface, the network address of the clicked element is acquired through the currentSrc attribute of the element, and the title value of the picture is acquired through the title attribute of the element. In addition, information such as the size of the picture and the clicking position can be obtained from other attributes of the element.
(3) When the user clicks a <button> tag in the interface, the content of the clicked element is acquired through the innerText attribute of the element, and the type of the clicked element is acquired through the type attribute.
(4) Corresponding content can also be obtained by clicking on a page element tag customized by a developer in the interface.
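The following sketch shows one way to implement the global click binding and the per-tag handling of items (1)-(4); the collect() queue and the data-track attribute used for developer-customized tags are assumptions.

```javascript
// Minimal collector queue (assumed helper); a real SDK would buffer and upload these records.
var collected = [];
function collect(record) { collected.push(record); }

// Global click delegation: one listener inspects the DOM attributes of the clicked element.
document.addEventListener('click', function (event) {
  var el = event.target;
  var tag = el.tagName ? el.tagName.toLowerCase() : '';
  var record = { tag: tag, time: Date.now() };

  if (['b', 'p', 'span', 'text', 'view'].indexOf(tag) !== -1) {
    record.content = el.innerText;                       // (1) text-like tags: take innerText
  } else if (tag === 'img') {
    record.src = el.currentSrc;                          // (2) picture: network address
    record.title = el.title;                             //     picture title
    record.size = { w: el.naturalWidth, h: el.naturalHeight };
    record.position = { x: event.clientX, y: event.clientY };
  } else if (tag === 'button') {
    record.content = el.innerText;                       // (3) button content
    record.type = el.type;                               //     and button type
  } else if (el.dataset && el.dataset.track) {
    record.custom = el.dataset.track;                    // (4) developer-customized tag (assumed data-track attribute)
  }

  collect(record);                                       // hand the record to the buried point pipeline
}, true);                                                // capture phase so page handlers cannot swallow the event
```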
S102, monitoring behavior events of the interface while the target program runs to obtain buried point data. In a further alternative embodiment, the buried point data acquired from behavior events of the interface includes, but is not limited to: click events, exposure events, page dwell time and other interface operation behavior events, so that the user's operation behavior information is acquired in real time. Meanwhile, the attributes of the visited pages are analyzed according to the user's operation behaviors, including the current interface path, the entering time, the leaving time, the dwell time, the source channel, whether the page is the home page, and whether it is the exit page.
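A sketch of how the page attributes listed above (interface path, entering/leaving time, dwell time, source channel, home-page and exit-page flags) could be gathered around one page visit; all field names are assumptions.

```javascript
// Build one page-visit record for S102; filled in at load time and finalized on unload.
var pageVisit = {
  path: location.pathname,                      // current interface path
  enterTime: Date.now(),                        // entering time
  leaveTime: null,                              // leaving time (set on unload)
  dwellMs: null,                                // dwell time
  sourceChannel: document.referrer || 'direct', // source channel
  isHomePage: location.pathname === '/',        // whether this is the home page
  isExitPage: false                             // whether this is the exit page (decided later)
};

window.addEventListener('beforeunload', function () {
  pageVisit.leaveTime = Date.now();
  pageVisit.dwellMs = pageVisit.leaveTime - pageVisit.enterTime;
  pageVisit.isExitPage = true;                  // no further in-site navigation was observed
  // queue the record for the screening/cleaning step (S103) ...
});
```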
S103, processing and transmitting the obtained buried point data according to preset data screening and cleaning rules;
in a further optional embodiment, the preset data filtering rule includes at least one of:
(1) Eliminating or correcting interface up-down sliding events exceeding a preset frequency;
(2) Eliminating or correcting click events on page elements which do not relate to skip operation in the interface; for example: buttons, pictures, text, page blank areas, etc.
(3) Eliminating or correcting other interface behavior events exceeding a preset frequency, for example when the user frequently slides the banner advertisement carousel on the page, frequently clicks "next page", or frequently clicks page elements to perform jump and return operations. A filtering sketch covering these screening rules is given below.
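A minimal sketch of the screening rules: events of a given type that exceed a per-minute threshold are dropped, and clicks on elements that do not trigger a jump are filtered out. The thresholds and the jumpTags list are assumptions.

```javascript
// Screening (filtering) of buried point events before upload.
var MAX_EVENTS_PER_MINUTE = { scroll: 30, click: 60 };   // assumed preset frequencies
var jumpTags = ['a', 'button'];                          // elements assumed to trigger a jump

function screenEvents(events) {
  var counts = {};
  return events.filter(function (e) {
    // rule (2): drop clicks on page elements that do not involve a jump operation
    if (e.type === 'click' && jumpTags.indexOf(e.tag) === -1) return false;

    // rules (1) and (3): drop events of any type that exceed the preset frequency
    var minute = Math.floor(e.time / 60000);
    var key = e.type + ':' + minute;
    counts[key] = (counts[key] || 0) + 1;
    var limit = MAX_EVENTS_PER_MINUTE[e.type] || 100;
    return counts[key] <= limit;
  });
}
```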
In a further optional embodiment, the preset data cleansing rule includes: rejecting or correcting non-closed-loop behavioral operational events, i.e., checking whether the analytical process data belongs to a behavioral closed loop, such as: interface open, interface browse, interface close (or interface jump), and the like. For example, during use by a user, the embedded point code listens for the following behavior information:
(1) The user opens interface A at 2020-6-28 14:52:28 and stays in interface A for 13 seconds, while also triggering an interface sliding event; from this behavior we can infer that the user is browsing interface A.
(2) The user clicks a certain advertisement picture B in interface A and jumps into interface B at 2020-6-28 14:52:41; the click and the interface jump behavior are recorded.
(3) If the user's network or hardware device has a problem at this moment, the embedded point code does not acquire the specific behavior of clicking picture B in interface A or the specific time of leaving interface A, but only the specific time of entering interface B. Before the missing data is transmitted to the server through the interface, it is partially completed, rewritten or even overwritten according to logic, and the data is marked as abnormal. A sketch of this closed-loop check is given below.
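A sketch of the closed-loop cleaning rule: each interface visit should contain an open, optional browsing events, and a close or jump; if the leave time is missing (as in case (3) above), it is completed from the next interface's enter time and the record is marked abnormal. The field names are assumptions.

```javascript
// Cleaning: ensure every visit forms a behavioral closed loop (open -> browse -> close/jump).
function cleanVisits(visits) {
  return visits.map(function (visit, i) {
    var cleaned = Object.assign({}, visit);
    if (cleaned.leaveTime == null) {
      var next = visits[i + 1];
      // complete the missing leave time from the next interface's enter time, if known
      cleaned.leaveTime = next ? next.enterTime : cleaned.enterTime;
      cleaned.dwellMs = cleaned.leaveTime - cleaned.enterTime;
      cleaned.abnormal = true;                 // abnormal marking before transmission to the server
    }
    return cleaned;
  });
}
```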
According to the embodiment of the application, the buried point data are screened and cleaned according to the preset rule, so that the problem that the data are not filtered in the data uploading process of the client side in the buried point code in the prior art is solved, and then a plurality of invalid data are uploaded to the server side, and finally the server side receives a large amount of redundant data to influence the accuracy of data analysis.
In a further alternative embodiment, the embedded point data processed according to the preset data screening and cleaning rules is stored and then transmitted to the server for analysis, so that the loss of the embedded point data can be avoided, and the problem that the embedded point data cannot be transmitted to the server in real time due to the fact that a user network or equipment breaks down in some schemes in the prior art is solved.
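One possible sketch of the store-then-transmit behavior, assuming browser localStorage as the temporary store and an assumed /collect endpoint on the server side: records are buffered locally and flushed when the network is available, so a temporary network or device failure does not lose data.

```javascript
// Buffer processed buried point data locally, then transmit when possible.
var BUFFER_KEY = 'buriedPointBuffer';                    // assumed storage key

function bufferRecords(records) {
  var buffered = JSON.parse(localStorage.getItem(BUFFER_KEY) || '[]');
  localStorage.setItem(BUFFER_KEY, JSON.stringify(buffered.concat(records)));
}

function flushBuffer() {
  var buffered = JSON.parse(localStorage.getItem(BUFFER_KEY) || '[]');
  if (buffered.length === 0 || !navigator.onLine) return;
  fetch('/collect', {                                    // assumed server-side collection interface
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buffered)
  }).then(function (res) {
    if (res.ok) localStorage.removeItem(BUFFER_KEY);     // clear only after the server accepted the data
  });
}

window.addEventListener('online', flushBuffer);          // retry as soon as the network comes back
```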
After the data processing is performed in the above manner, the processed buried point data is transmitted to the server through the relevant interface for summarizing and analyzing, that is, the operation of step S104 is performed.
S104, analyzing the buried point data and drawing a target user portrait; wherein the target user is a representative of at least one user using the target program.
In a further alternative embodiment, the analyzing step of analyzing the buried point data and rendering the target user representation includes:
step 1, conventional data statistics processing, analysis and modeling are carried out on summarized buried data;
step 2, analyzing the obtained buried point data according to preset dimensions, wherein the preset dimensions comprise social attributes, lifestyle and consumption behavior; the specific information of the three dimensions is as follows (a profile sketch is given after the list):
social attributes may include age, gender, territory, academic, profession, marital status, residential vehicles, and the like.
Lifestyle may include exercise, leisure, travel, dining and daily living, shopping, gaming, sports, culture, and the like.
The consumption behavior (based on the product) may include a consumption amount, a number of consumption times, a consumption time, a consumption frequency, and the like.
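A sketch of how the three analysis dimensions could be represented as a tag structure on the analysis side; every field name and value here is an illustrative assumption.

```javascript
// One target-user portrait record organized by the three preset dimensions.
var userPortrait = {
  userId: 'u-12345',                            // assumed identifier
  socialAttributes: {
    ageRange: '25-34', gender: 'female', region: 'Beijing',
    education: 'bachelor', profession: 'engineer', maritalStatus: 'single'
  },
  lifestyle: {
    interests: ['travel', 'shopping', 'gaming'], activeHours: ['20:00-23:00']
  },
  consumptionBehavior: {
    totalAmount: 1820.5, orderCount: 12, ordersPerMonth: 3, lastPurchase: '2020-06-28'
  }
};
```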
In a further alternative embodiment, after analyzing the acquired buried data according to a preset dimension, the method further comprises: analyzing the acquired buried data in at least one of the following ways: flow statistics analysis, behavior path construction analysis and user equipment management analysis. The specific analysis contents of the three analysis modes are as follows:
1. Flow statistical analysis
(1) According to on-line condition analysis
The online condition analysis records the activity information of online users respectively, and comprises the following steps: visiting time, visitor region, incoming page, current stay page, etc., which are helpful for enterprises to grasp their own website traffic in real time.
(2) Analysis by time period
The time period analysis can provide the traffic change of the website (or APP) in any time period, or from one period to another, such as the hourly distribution and the daily visit-volume distribution, helping the enterprise learn in which time periods users browse its web pages.
(3) Analysis by source
The source analysis can provide the enterprise with data brought by each incoming domain name, such as the number of visits, IPs, unique visitors, new visitors, page views by new visitors, and total page views within the site. These data directly show the enterprise which incoming channels produce a promotion effect, so that promotion channels can be analyzed and the website's advertising effect becomes clearer. A statistics sketch over these dimensions is given below.
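A sketch of the traffic statistics described above: per-hour distribution and per-source visit counts computed from cleaned visit records (the record shape follows the earlier sketches and is an assumption).

```javascript
// Traffic statistics: hourly distribution and counts by source channel.
function trafficStats(visits) {
  var byHour = {};     // time-period analysis: visits per hour of day
  var bySource = {};   // source analysis: visits per incoming channel

  visits.forEach(function (v) {
    var hour = new Date(v.enterTime).getHours();
    byHour[hour] = (byHour[hour] || 0) + 1;
    var src = v.sourceChannel || 'direct';
    bySource[src] = (bySource[src] || 0) + 1;
  });

  return { byHour: byHour, bySource: bySource };
}
```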
2. Behavioral path construction analysis
The whole behavior link of the user is obtained by correlating the processed information. For example: user A clicks the third picture of the banner on the home page to enter activity page H, clicks the fourth promotion position in the third row of the navigation on activity page H to enter a certain commodity D page, and finally, after browsing for thirty seconds, adds commodity D to the shopping cart and pays for the order.
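A sketch of behavior path construction: one user's events are ordered by time and joined into a single behavior link like the home page, activity page and commodity page example above; the event fields are assumptions.

```javascript
// Build the whole behavior link of one user by ordering that user's events in time.
function buildBehaviorPath(events, userId) {
  return events
    .filter(function (e) { return e.userId === userId; })
    .sort(function (a, b) { return a.time - b.time; })
    .map(function (e) { return e.page + (e.element ? '#' + e.element : ''); })
    .join(' -> ');   // e.g. "home#banner3 -> activityH#promo4 -> commodityD -> cart -> order"
}
```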
3. User device management analysis includes, but is not limited to, managing user devices using several data:
the international mobile equipment identity IMEI, the international mobile subscriber identity IMSI, the Android ID, the device ID, a UUID (an ID generated on the first run after program installation and used as the unique installation ID of each application instance, which can be used to track the number of installations, etc.), the MAC address, the download channel, the APP version, user behavior parameters (user behaviors may be classified and given classification parameter definitions in the APP or page design), and the like.
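A sketch of the UUID-based installation ID described above: a UUID is generated on the first run, persisted, and reused afterwards. The storage key and field names are assumptions, and identifiers such as IMEI/IMSI are normally only available to native APP code, not to a web page.

```javascript
// Generate a UUID on first run and reuse it afterwards as the installation ID.
function getInstallId() {
  var id = localStorage.getItem('installId');          // assumed storage key
  if (!id) {
    id = 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
      var r = Math.random() * 16 | 0;
      return (c === 'x' ? r : (r & 0x3) | 0x8).toString(16);
    });
    localStorage.setItem('installId', id);
  }
  return id;
}

var deviceInfo = {
  installId: getInstallId(),                           // UUID-based installation ID
  userAgent: navigator.userAgent,                      // stands in for device model / APP version on the web side
  channel: 'web'                                       // assumed download channel tag
};
```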
Through the above data analysis, a complete user information database can be obtained, and users can be screened by various tags; user screening then enables accurate SMS pushing, APP message pushing, personalized advertising and other content, guides product optimization, and even supports private customization of product functions. Looking at raw data alone makes it difficult to find subtle links between the data, so the data can also be analyzed and studied using charts such as line graphs, bar graphs, pie charts, scatter plots, and the like.
Based on the user portrait drawing method shown in fig. 1, on the other hand, the embodiment of the present application provides a data embedding device, as shown in fig. 2, the device may include: a 201 buried point module, a 202 monitoring module, a 203 storage module, a 204 processing module, a 205 analysis module and a 206 drawing module, wherein,
The 201 embedded point module is used for embedding points in a target program by utilizing the SDK; the target program comprises a webpage program and/or an APP program;
the 202 monitoring module is used for monitoring behavior events of an interface when the target program runs to obtain buried data;
the 203 storage module is used for storing preset data screening and cleaning rules;
the 204 processing module is used for processing and transmitting the obtained buried point data according to preset data screening and cleaning rules;
the 205 analysis module is configured to analyze the buried point data;
the 206 drawing module is used for drawing the target user portrait; wherein the target user is a representative of at least one user using the target program.
Further, the 201 embedding point module is specifically configured to embed points in the target program by using a code embedding point and/or a visual embedding point technology.
Further, the 201 embedded point module further includes: 2011 selecting a unit and 2012 an interface unit; wherein,
the 2011 selecting unit is used for selecting a computer language in which to write the buried point code; wherein the computer language includes, but is not limited to, any one of the following: HTML, CSS, JavaScript;
the 2012 interface unit is used for embedding the completed embedded point code file into a target program through a target program interface; wherein the embedded point code file and the files of the target program remain independent.
Further, the preset data screening rule includes: eliminating or correcting up-and-down sliding events of the interface exceeding a preset frequency, and/or eliminating or correcting clicking events of page elements which do not relate to jumping operation in the interface; and/or eliminating or correcting other interface behavior events exceeding a preset frequency;
the preset data cleaning rule comprises the following steps: culling or correcting non-closed loop behavioral operational events.
It will be appreciated that the above-described constituent devices of the data point burying apparatus in the present embodiment have functions of implementing the respective steps of the method in the embodiment shown in fig. 1. The functions can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules or devices corresponding to the functions described above. The modules and the devices can be software and/or hardware, and each module and the devices can be implemented separately or multiple modules and devices can be integrated. The functional description of the above modules and apparatuses may be specifically referred to the corresponding description of the method in the embodiment shown in fig. 1, and therefore, the advantages achieved by the functional description may be referred to the advantages of the corresponding method provided above, which are not described herein.
It should be understood that the structure illustrated in the embodiments of the present invention does not constitute a specific limitation on the specific structure of the data burying device. In other embodiments of the present application, the data point device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The embodiment of the application provides electronic equipment, which comprises a processor and a memory;
a memory for storing operation instructions;
and the processor is used for executing the user portrait drawing method provided in any embodiment of the application by calling the operation instruction.
As an example, fig. 3 shows a schematic structural diagram of an electronic device to which the embodiment of the present application is applied, and as shown in fig. 3, the electronic device 300 includes: a processor 301 and a memory 303. Wherein the processor 301 is coupled to the memory 303, such as via a bus 302. Optionally, the electronic device 300 may also include a transceiver 304. It should be noted that, in practical application, the transceiver 304 is not limited to one. It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the specific structure of the electronic device 300. In other embodiments of the present application, electronic device 300 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. Optionally, the electronic device may further comprise a display screen 305 for displaying images or receiving user operation instructions if necessary.
The processor 301 is applied in the embodiments of the present application, and is configured to implement the method shown in the above-described method embodiments. Transceiver 304 may include a receiver and a transmitter, with transceiver 304 being used in embodiments of the present application to perform functions that enable an electronic device of embodiments of the present application to communicate with other devices.
The processor 301 may be a CPU (Central Processing Unit ), general purpose processor, DSP (Digital Signal Processor, data signal processor), ASIC (Application Specific Integrated Circuit ), FPGA (Field Programmable Gate Array, field programmable gate array) or other programmable logic device, transistor logic device, hardware components, or any combination thereof. Which may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. Processor 301 may also be a combination that implements computing functionality, e.g., comprising one or more microprocessor combinations, a combination of a DSP and a microprocessor, etc.
Processor 301 may also include one or more processing units such as, for example: the processor 301 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a Neural network processor (Neural-network Processing Unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller may be a neural hub and a command center of the electronic device 300, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution. A memory may also be provided in the processor 301 for storing instructions and data. In some embodiments, the memory in the processor 301 is a cache memory. The memory may hold instructions or data that the processor 301 has just used or recycled. If the processor 301 needs to reuse the instruction or data, it may be called directly from the memory. Repeated accesses are avoided and the latency of the processor 301 is reduced, thus improving the efficiency of the system.
The processor 301 may run the user portrait drawing method provided in the embodiment of the present application, so as to reduce the operation complexity of the user, improve the intelligent degree of the terminal device, and improve the user experience. The processor 301 may include different devices, for example, when the CPU and the GPU are integrated, the CPU and the GPU may cooperate to perform the user portrait drawing method provided in the embodiments of the present application, for example, a part of algorithms in the user portrait drawing method are executed by the CPU, and another part of algorithms are executed by the GPU, so as to obtain faster processing efficiency.
Bus 302 may include a path to transfer information between the components. Bus 302 may be a PCI (Peripheral Component Interconnect, peripheral component interconnect Standard) bus or an EISA (Extended Industry Standard Architecture ) bus, or the like. Bus 302 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in fig. 3, but not only one bus or one type of bus.
The Memory 303 may be, but is not limited to, ROM (Read Only Memory) or other type of static storage device that can store static information and instructions, RAM (Random Access Memory ) or other type of dynamic storage device that can store information and instructions, EEPROM (Electrically Erasable Programmable Read Only Memory ), CD-ROM (Compact Disc Read Only Memory, compact disc Read Only Memory), high speed random access Memory, nonvolatile Memory such as at least one magnetic disk storage device, flash Memory device, universal flash Memory (universal flash storage, UFS), or other optical disk storage, optical disk storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store the desired program code in the form of instructions or data structures and that can be accessed by a computer.
Optionally, the memory 303 is used for storing application program codes for executing the embodiments of the present application, and the processor 301 controls the execution. The processor 301 is configured to execute application code stored in the memory 303 to implement the user portrayal rendering method provided in any of the embodiments of the present application.
Memory 303 may be used to store computer executable program code that includes instructions. The processor 301 executes instructions stored in the memory 303 to thereby perform various functional applications and data processing of the electronic device 300. The memory 303 may include a stored program area and a stored data area. The storage program area may store, among other things, an operating system, code for an application program, and the like. The storage data area may store data created during use of the electronic device 300 (e.g., images, video, etc. captured by a camera application), and so on.
The memory 303 may also store one or more computer programs corresponding to the user portrait drawing method provided in the embodiments of the present application. The one or more computer programs are stored in the memory 303 and configured to be executed by the one or more processors 301, the one or more computer programs comprising instructions that can be used to perform the various steps in the respective embodiments described above.
Of course, the code of the user portrait drawing method provided in the embodiment of the present application may also be stored in an external memory. In this case, the processor 301 may run the code of the user portrait drawing method stored in the external memory through the external memory interface, and the processor 301 may control the running of the data buried point flow.
The display screen 305 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 300 may include 1 or N display screens 305, N being a positive integer greater than 1. The display screen 305 may be used to display information entered by a user or provided to a user, as well as various graphical user interfaces (GUIs). For example, the display screen 305 may display photographs, videos, web pages, or files, etc.
The electronic device provided in the embodiment of the present application is applicable to any embodiment of the foregoing method, so the beneficial effects that can be achieved by the electronic device can refer to the beneficial effects in the corresponding method provided above, and will not be described herein.
The present embodiment provides a computer-readable storage medium having a computer program stored thereon, which when executed by a processor, implements the user portrait drawing method shown in the above method embodiment.
The computer readable storage medium provided in the embodiments of the present application is applicable to any one of the embodiments of the above method, and therefore, the beneficial effects that can be achieved by the computer readable storage medium can refer to the beneficial effects in the corresponding method provided above, and are not repeated here.
The present application also provides a computer program product which, when run on a computer, causes the computer to perform the above-mentioned related steps to implement the method in the above-mentioned embodiments. The computer program product provided in the embodiments of the present application is applicable to any of the embodiments of the above-mentioned method, and therefore, the advantages achieved by the computer program product can refer to the advantages provided in the corresponding method, and are not described herein.
The data embedding scheme provided by the embodiment of the application comprises the steps of embedding points in a target program by utilizing an SDK; the target program comprises a webpage program and/or an APP program; monitoring behavior events of an interface when a target program runs to obtain buried data; processing and transmitting the obtained buried point data according to preset data screening and cleaning rules; analyzing the buried point data and drawing a target user portrait; wherein the target user is a representative of at least one user using the target program. In the embodiment of the application, the embedded point code file and the target program service file are relatively independent, do not influence each other and do not interfere each other, so that the problem that various embedded point codes invade original pages and service codes greatly at present is solved. In addition, the embodiment of the application solves the problem that buried data cannot be transmitted to a server side in real time due to faults of a user network or equipment in the prior art, and solves the problem that the buried data cannot be filtered in the data uploading process of a client side in the prior art, so that a plurality of invalid data are uploaded to a server side, and finally the server side receives a plurality of redundant data, so that the accuracy of data analysis is affected.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely one logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be discarded or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited in order and may be performed in other orders, unless explicitly stated herein. Moreover, at least some of the steps in the flowcharts of the figures may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order of their execution not necessarily being sequential, but may be performed in turn or alternately with other steps or at least a portion of the other steps or stages.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of changes or substitutions, and can make several improvements and modifications within the technical scope of the present application, and these changes, substitutions, improvements and modifications are also considered to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A user portrayal rendering method, the method comprising:
burying points in a target program by utilizing the SDK; the target program comprises a webpage program and/or an APP program;
monitoring behavior events of an interface when a target program runs to obtain buried data;
processing and transmitting the obtained buried point data according to preset data screening and cleaning rules;
analyzing the buried point data and drawing a target user portrait; wherein the target user is a representation of at least one user using the target program;
the preset data screening rule comprises at least one of the following: eliminating or correcting interface up-down sliding events exceeding a preset frequency; eliminating or correcting click events on page elements which do not relate to skip operation in the interface; eliminating or correcting other interface behavior events exceeding a preset frequency;
The preset data cleaning rule comprises the following steps: rejecting or correcting non-closed-loop behavior operation events;
the preset data cleaning rule further comprises: when the problem occurs in a user network or hardware equipment and the acquired buried data is lost, before the lost data is transmitted to a server through an interface, the lost data is subjected to partial complementation, overwriting and even covering according to logic, and the abnormal marking of the lost data is performed;
the analyzing the buried point data and drawing the target user portrait includes: analyzing the obtained buried point data according to preset dimensions, wherein the preset dimensions comprise: social attributes, lifestyle and consumption behavior;
after analyzing the acquired buried data according to a preset dimension, the user portrait drawing method further comprises the following steps: analyzing the acquired buried data in at least one of the following ways: flow statistics analysis, behavior path construction analysis and user equipment management analysis;
the flow statistical analysis comprises on-line condition analysis, time period analysis and source analysis; according to the online condition analysis, the activity information of the online user is recorded respectively, wherein the activity information comprises visiting time, visitor region, incoming page and current stay page; according to the time interval analysis, the flow change condition of a website or a mobile terminal application program in any time interval is provided, or the flow change condition from a certain time interval to a certain time interval comprises small-time interval distribution and daily access volume distribution; the analysis according to the source refers to providing data brought by the domain name of the incoming route, including visiting times, IP, independent visitors, new visitors, browsing times of the new visitors and total browsing times in the station;
The behavior path construction analysis comprises the steps of obtaining the whole behavior link of a user by correlating the processed information;
the data types used in the user equipment management analysis comprise the international mobile equipment identity IMEI, the international mobile subscriber identity IMSI, the Android ID, the device ID, the UUID, the MAC address, the download channel, the APP version and user behavior parameters.
2. The user portrait drawing method according to claim 1, wherein embedding points in the target program includes:
embedding points in the target program by adopting code embedding points and/or visual embedding point technology.
3. The user portrait drawing method according to claim 2, wherein embedding points in the target program using a code embedding technique includes:
selecting a computer language in which to write the buried point code; wherein the computer language includes, but is not limited to, any one of the following: HTML, CSS, JavaScript;
embedding the completed embedded point code file into the target program through a target program interface; wherein the embedded point code file and the files of the target program remain independent.
4. The user portrait drawing method according to claim 3 wherein said embedding points in the object program using a code embedding technique further includes:
Writing a self-invoking function in the embedded point code;
setting the operation enabling state of the embedded point code file to be an opening state; or, setting the operation enabling parameter of the target program and the enabling parameter of the embedded point code file to be the same parameter.
5. The user portrait rendering method of claim 4 where behavioral event acquisition burial data of the interface includes, but is not limited to:
click event, exposure event, page dwell time.
6. The user portrait drawing method according to claim 1 or 5, wherein buried data processed according to a preset data screening and cleaning rule is stored and then transferred to a server for analysis.
7. A data burial point device, comprising: a buried point module, a monitoring module, a storage module, a processing module, an analysis module and a drawing module, wherein,
the buried point module is configured to embed points in a target program by using an SDK; the target program comprises a webpage program and/or an APP program;
the monitoring module is configured to monitor behavior events of the interface while the target program is running, so as to obtain buried point data;
the storage module is configured to store preset data screening and cleaning rules;
the processing module is configured to process and transmit the obtained buried point data according to the preset data screening and cleaning rules;
the analysis module is configured to analyze the buried point data;
the drawing module is configured to draw a target user portrait; wherein the target user portrait represents at least one user that uses the target program;
the preset data screening rules comprise: eliminating or correcting up-and-down sliding events of the interface that exceed a preset frequency, and/or eliminating or correcting click events on page elements of the interface that do not involve a jump operation, and/or eliminating or correcting other interface behavior events that exceed a preset frequency;
the preset data cleaning rules comprise: rejecting or correcting non-closed-loop behavior operation events;
the preset data cleaning rules further comprise: when a problem occurs in the user's network or hardware and part of the acquired buried point data is lost, before the data is transmitted to the server through an interface, partially completing, rewriting or even overwriting the lost data according to logic, and marking the lost data as abnormal;
the analyzing the buried point data comprises: analyzing the obtained buried point data according to preset dimensions, wherein the preset dimensions comprise social attributes, lifestyle and consumption behavior;
the drawing the target user portrait comprises: analyzing the obtained buried point data in at least one of the following ways: traffic statistics analysis, behavior path construction analysis and user equipment management analysis;
the traffic statistics analysis comprises online-status analysis, time-period analysis and source analysis; the online-status analysis records the activity information of each online user, wherein the activity information comprises visit time, visitor region, entry page and currently viewed page; the time-period analysis provides the traffic variation of a website or mobile application within any time period, or from one time period to another, including hourly distribution and daily visit-volume distribution; the source analysis provides the data brought by each referring domain name, including visit count, IP count, unique visitors, new visitors, page views by new visitors and total on-site page views;
the behavior path construction analysis comprises correlating the processed information to obtain the complete behavior link of a user (see the sketch after this claim);
the data types managed and used by the user equipment management analysis comprise the International Mobile Equipment Identity (IMEI), the International Mobile Subscriber Identity (IMSI), the Android ID, the device ID, the UUID, the MAC address, the download channel, the APP version and user behavior parameters.
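As a sketch of the behavior path construction analysis recited above (the record format, field names and path encoding are assumptions): processed buried point records are grouped per user, sorted by timestamp, and chained into the complete behavior link.

// Hypothetical behavior-path construction: correlate processed records into per-user links.
// Each record is assumed to look like { userId, page, type, ts }.
function buildBehaviorPaths(records) {
  const byUser = new Map();
  for (const r of records) {
    if (!byUser.has(r.userId)) byUser.set(r.userId, []);
    byUser.get(r.userId).push(r);
  }

  const paths = {};
  for (const [userId, events] of byUser) {
    events.sort((a, b) => a.ts - b.ts);                       // chronological order
    paths[userId] = events.map((e) => `${e.page}:${e.type}`); // the whole behavior link
  }
  return paths;
}

// Example: two users interleaved in the raw stream still yield two clean paths.
console.log(buildBehaviorPaths([
  { userId: 'u1', page: 'home', type: 'view', ts: 1 },
  { userId: 'u2', page: 'home', type: 'view', ts: 2 },
  { userId: 'u1', page: 'detail', type: 'click', ts: 3 },
]));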
8. The data burial point device according to claim 7, wherein the buried point module is specifically configured to embed points in the target program by using a code embedding technique and/or a visual embedding technique.
9. The data burial point device according to claim 8, wherein the buried point module further comprises: a selection unit and an interface unit; wherein,
the selection unit is configured to select a computer language in which to write the buried point code; wherein the computer language includes, but is not limited to, any one of the following: HTML, CSS, JavaScript;
the interface unit is configured to embed the completed buried point code file into the target program through a target program interface; wherein the embedded code file remains independent of the files of the target program.
10. An electronic device, comprising a processor and a memory; wherein,
the memory is configured to store operation instructions;
the processor is configured to perform the method according to any one of claims 1 to 6 by invoking the operation instructions.
11. A computer-readable storage medium, wherein the storage medium stores a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 6.
CN202110163487.5A 2021-02-05 2021-02-05 User portrait drawing method and device Active CN112948226B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110163487.5A CN112948226B (en) 2021-02-05 2021-02-05 User portrait drawing method and device

Publications (2)

Publication Number Publication Date
CN112948226A CN112948226A (en) 2021-06-11
CN112948226B true CN112948226B (en) 2024-04-02

Family

ID=76242739

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110163487.5A Active CN112948226B (en) 2021-02-05 2021-02-05 User portrait drawing method and device

Country Status (1)

Country Link
CN (1) CN112948226B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114051167B (en) * 2021-10-28 2023-09-26 北京金堤科技有限公司 Video processing method, device and processor
CN113986954B (en) * 2021-12-30 2022-04-08 深圳市明源云科技有限公司 User event acquisition method and device, intelligent terminal and readable storage medium
CN115757980A (en) * 2022-12-21 2023-03-07 北京政务科技有限公司 User portrait method, device, equipment and medium for government affair service
CN116502054A (en) * 2023-05-12 2023-07-28 上海邮电设计咨询研究院有限公司 Flow data analysis method, system, medium and electronic equipment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107145489A (en) * 2016-03-01 2017-09-08 阿里巴巴集团控股有限公司 A kind of information statistical method and device of the client application based on cloud platform
CN109255640A (en) * 2017-07-13 2019-01-22 阿里健康信息技术有限公司 A kind of method, apparatus and system of determining user grouping
CN108492224A (en) * 2018-03-09 2018-09-04 上海开放大学 Based on deep learning online education Students ' Comprehensive portrait tag control system
WO2020252639A1 (en) * 2019-06-17 2020-12-24 深圳市欢太科技有限公司 Content pushing method and related product
WO2020257990A1 (en) * 2019-06-24 2020-12-30 深圳市欢太科技有限公司 Device recommendation method and related product
CN111553729A (en) * 2020-04-27 2020-08-18 广州探途网络技术有限公司 Method and device for generating portrait data of e-commerce user and computing equipment
CN111597422A (en) * 2020-05-14 2020-08-28 腾讯科技(深圳)有限公司 Buried point mapping method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN112948226A (en) 2021-06-11

Similar Documents

Publication Publication Date Title
CN112948226B (en) User portrait drawing method and device
US9390438B2 (en) Systems and methods for capturing and reporting metrics regarding user engagement including a canvas model
US8997081B1 (en) Analytics for mobile applications
US7305622B2 (en) Graphical user interface and web site evaluation tool for customizing web sites
US8725794B2 (en) Enhanced website tracking system and method
US8533141B2 (en) Systems and methods for rule based inclusion of pixel retargeting in campaign management
US20170013085A1 (en) Method of website optimisation
US10600075B2 (en) Proactive web content attribute recommendations
US20140229271A1 (en) System and method to analyze and rate online advertisement placement quality and potential value
CN108132814A (en) Page loading method, device, computer equipment and the storage medium of application program
KR20150130282A (en) Intelligent platform for real-time bidding
CN102314455A (en) Method and system for calculating click flow of web page
CN107357903B (en) User behavior data integration method and device and electronic equipment
CN103606094A (en) Mobile Internet advertisement monitoring method and system thereof
US11727082B2 (en) Machine-learning based personalization
CN110059223A (en) Circulation, image to video computer vision guide in machine
US11893076B2 (en) Systems and methods for managing an online user experience
US20210073893A1 (en) Back End Server Modification And Model Visualization
CN108984070B (en) Method, apparatus, electronic device and readable medium for thermodynamic diagram imaging
US20140052851A1 (en) Systems and methods for discovering sources of online content
CN113626624B (en) Resource identification method and related device
Chaqfeh et al. Jsanalyzer: A web developer tool for simplifying mobile web pages through non-critical javascript elimination
CN111200639A (en) Information pushing method and device based on user operation behavior and electronic equipment
Percival HTML5 advertising
EP3542342A1 (en) Automatic generation of interactive web page content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant