CN113099275A - User behavior statistical method, device and equipment for interactive video - Google Patents

User behavior statistical method, device and equipment for interactive video

Info

Publication number
CN113099275A
Authority
CN
China
Prior art keywords
interactive
user
data
interactive video
video
Prior art date
2021-03-16
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110279492.2A
Other languages
Chinese (zh)
Inventor
刘杰
杜欢
张弢帅
曾筑娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Altstory Technology Beijing Co ltd
Original Assignee
Altstory Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2021-03-16
Filing date
2021-03-16
Publication date
2021-07-09
Application filed by Altstory Technology Beijing Co ltd filed Critical Altstory Technology Beijing Co ltd
Priority to CN202110279492.2A priority Critical patent/CN113099275A/en
Publication of CN113099275A publication Critical patent/CN113099275A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258 Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25866 Management of end-user data
    • H04N21/25891 Management of end-user data being end-user preferences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
    • H04N21/485 End-user interface for client configuration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/485 End-user interface for client configuration

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a user behavior statistical method, device and equipment for interactive video. The method comprises: during playback of an interactive video, automatically collecting user core index data in a non-buried-point manner (i.e., without manually instrumented tracking points), the user core index data comprising user interaction behavior data related to the interactive nodes of the interactive video; preprocessing the user core index data to obtain basic data, and storing the basic data; calculating and analyzing the basic data to obtain user behavior statistical data; and outputting the user behavior statistical data. Because the invention adopts a non-buried-point technique, no tracking point needs to be embedded at the client for each data index to be counted, and the requirements of collecting and analyzing user behavior data for interactive video can be fully met. The performance and results of the interactive parts of an interactive video can thus be understood, giving producers and operators of the interactive video useful guidance and feedback and allowing the interactive video to be improved in a more targeted way.

Description

User behavior statistical method, device and equipment for interactive video
Technical Field
The invention relates to the technical field of information processing, and in particular to a user behavior statistical method, device and equipment for interactive video.
Background
With the commercialization and wide deployment of the Internet, application terminals of all kinds have developed rapidly, and in many application scenarios statistics on user behavior are collected in order to provide comprehensive, high-quality services to users.
Statistics on user behavior mainly rely on a basic framework of data reporting and data analysis: real, accurate user behavior data are obtained from users' actual viewing and playback processes, and the data are processed and analyzed to reach conclusions of reference value. Specifically, business logic for recording user actions is first placed in the client or web page source code; when a user performs a specific operation (such as clicking, selecting or focusing), the recording code is triggered and the current information is uploaded to a server. The server then performs operations such as data cleaning and desensitization, data storage, analysis and calculation, reporting and visualization, and summarizes user behavior for relevant personnel to analyze and make decisions.
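For concreteness, the TypeScript sketch below shows what such a conventional, manually buried reporting call might look like inside the business logic of one control; the collector URL, function name and payload fields are illustrative assumptions, not part of any system described here.

```typescript
// Hypothetical example of a manually buried point: the tracking call is
// hand-written into the business logic of every control to be tracked.
async function onChoiceButtonClick(choiceId: string): Promise<void> {
  // ...business logic handling the user's choice goes here...

  // Manually added reporting call (URL and payload fields are assumptions).
  await fetch("https://collector.example.com/track", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ event: "choice_click", choiceId, ts: Date.now() }),
  });
}
```

With the many interactive components of an interactive video, each such call would have to be written and maintained separately, which is exactly the workload the non-buried-point approach described below avoids.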
At present, methods for user behavior statistics in video follow the indices used for web page statistics and focus mainly on page views (including UV/VV), user dwell time, the number of page links or page jumps, and so on. Video-viewing data cover video loading time, pause time, viewing duration and the like. These statistical indices are aimed mainly at the traditional page or on-demand mode: they either equate a video with a web page or track only simple indices of an on-demand video (such as playing duration). For interactive video, however, they cannot reflect the behavioral characteristics and statistics of user interaction; the traditional statistical approach and indices are not sufficient to fully describe user behavior in interactive video, and they offer little guidance or feedback to producers and operators of interactive video.
In addition, interactive videos usually contain a large number of interactive components, and reporting them one by one with the conventional data-embedding (buried point) approach greatly increases the workload. An interactive behavior reporting scheme designed and developed specifically for interactive video can therefore realize non-buried-point reporting and obtain the statistical functions and results.
Disclosure of Invention
In view of this, the invention provides a user behavior statistical method, device and equipment for interactive video, which count the interaction behavior characteristics of users at interactive nodes according to the characteristics of interactive video, so that the requirements of user behavior data statistics and analysis for interactive video can be fully met.
Specifically, the technical solutions are as follows:
In a first aspect, an embodiment of the present invention provides a user behavior statistical method for interactive video, comprising:
during playback of an interactive video, automatically collecting user core index data in a non-buried-point manner, the user core index data comprising user interaction behavior data related to the interactive nodes of the interactive video;
preprocessing the user core index data to obtain basic data, and storing the basic data;
calculating and analyzing the basic data to obtain user behavior statistical data; and
outputting the user behavior statistical data.
Optionally, the method further comprises:
performing non-buried-point processing on the parts of the interactive video related to the user core index data.
Optionally, automatically collecting user core index data in a non-buried-point manner during playback of the interactive video comprises:
when the interactive video plays to an interactive node, collecting user interaction behavior data related to the user's interactive operation.
Optionally, preprocessing the user core index data comprises:
cleaning, screening, merging, supplementing and compressing the user core index data.
Optionally, calculating and analyzing the basic data comprises:
calculating the number of effective users, an effective user being a user who has participated in at least one interactive node of the interactive video.
Optionally, calculating and analyzing the basic data comprises:
calculating the number of clearing users, where clearance means that the user has watched the designated target video content after finishing any interactive node of the interactive video.
Optionally, calculating and analyzing the basic data comprises:
calculating the user retention rate of an interactive node, specifically calculating the ratio between the number of users who watched the specified video content before the interactive node and the number of users who watched the specified video content after the interactive node.
Optionally, calculating and analyzing the basic data comprises:
calculating the average, over users, of the total number of interactions performed at a given interactive node of the interactive video.
In a second aspect, an embodiment of the present invention provides a user behavior statistics apparatus for interactive video, comprising:
an acquisition module, configured to automatically collect user core index data in a non-buried-point manner during playback of an interactive video, the user core index data comprising user interaction behavior data related to the interactive nodes of the interactive video;
a preprocessing module, configured to preprocess the user core index data to obtain basic data and to store the basic data;
a calculation and analysis module, configured to calculate and analyze the basic data to obtain user behavior statistical data; and
an output module, configured to output the user behavior statistical data.
In a third aspect, an embodiment of the present invention provides a device comprising a processor and a memory, the memory storing executable instructions of the processor, the processor being configured to perform the above method by executing the executable instructions.
The technical solutions provided by the embodiments of the invention have at least the following beneficial effects:
During playback of an interactive video, user core index data containing user interaction behavior data related to the interactive nodes are automatically collected in a non-buried-point manner and are calculated and analyzed to obtain user behavior statistical data. Because the interaction behavior characteristics of users are counted at the interactive nodes, the requirements of user behavior data statistics and analysis for interactive video can be fully met, so that the performance and results of the interactive parts of the interactive video can be understood, producers and operators of the interactive video obtain useful guidance and feedback, and the interactive video can be improved in a more targeted way.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment for a statistical method for interactive video user behavior according to an exemplary embodiment of the present invention.
Fig. 2 is a flowchart illustrating a user behavior statistical method for interactive video according to an exemplary embodiment of the present invention.
Fig. 3 illustrates data presentation based on interactive video branching options.
Fig. 4 is a block diagram illustrating a user behavior statistics apparatus for interactive video according to an exemplary embodiment of the invention.
Fig. 5 is a block diagram of an apparatus according to an exemplary embodiment of the present invention.
The above drawings illustrate certain embodiments of the invention, which are described in more detail below. The drawings and the description are not intended to limit the scope of the inventive concept in any way, but to explain it to those skilled in the art by reference to specific embodiments.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
Here, an interactive video may be a film or television interactive video, an animation interactive video, a game interactive video, an advertising interactive video, a virtual reality interactive video, an augmented reality interactive video, or the like. The interactive video may also take the form of images or other media such as pictures and text.
An interactive video is a video with an interactive function, or a video over which interactive controls are displayed to realize the interactive function. Optionally, the interactive video is played according to an interactive story line. Optionally, the interactive video comprises at least one interactive node; when playback reaches an interactive node, an interactive control is overlaid on the current video picture. Optionally, when the user triggers the interactive control, the terminal displays interactive content, i.e., content capable of interacting with the user in multiple forms. By setting interactive nodes, the interactive video realizes interaction between the user and the video.
Fig. 1 is a schematic diagram of an implementation environment of the user behavior statistical method for interactive video according to an exemplary embodiment of the present invention. As shown in Fig. 1, the system includes a terminal device 1, a server 2 and a communication network 3.
The terminal device 1 acquires an interactive video from the server 2 through the communication network 3. The interactive video acquired by the terminal device 1 from the server 2 includes video information related to the interactive video, namely the video content and the display configuration information of the interactive controls to be displayed in the video. When the video plays to an interactive node, the interactive control appears on the user's video playing interface, and multiple users trigger the interactive control on their respective playing interfaces.
The system automatically monitors interaction events: when a user interacts, a preset reporting event is triggered automatically and the corresponding data reporting action is executed within that event, so that the user core index data (in particular the user behavior data related to the interactive nodes) are reported to the server 2, thereby realizing the collection of the user core index data.
Optionally, the terminal device may be a mobile terminal such as a mobile phone, tablet computer or smart watch, or a terminal device such as a desktop or notebook computer.
The server 2 stores the interactive video and transmits it to the user's terminal device, and also receives the user core index data (in particular the user behavior data related to the interactive nodes) reported by the terminal device 1. The server calculates and analyzes the reported user core index data to obtain user behavior statistical data, which it can output as reports or pass to a data visualization tool, providing useful guidance and feedback to producers and operators of the interactive video.
Optionally, the server 2 may be a stand-alone server or a server cluster, and may be a physical server or a cloud server; this is not limited in the embodiments of the present invention.
The communication network 3 may be a wired or wireless communication network.
The user behavior statistical method for interactive video provided by the invention is described below with reference to the above implementation environment. Fig. 2 is a flowchart of the user behavior statistical method for interactive video according to an exemplary embodiment of the present invention. As shown in Fig. 2, the method includes:
Step S101: during playback of the interactive video, automatically collect user core index data in a non-buried-point manner, the user core index data comprising user interaction behavior data related to the interactive nodes of the interactive video.
Non-buried-point processing is performed on the parts of the interactive video related to the user core index data: a script program (for example a JavaScript script) adapted to various terminal devices is written for the interactive video and integrated into the interactive video player. When the interactive video is played and a predetermined event is triggered or a predetermined condition is met (for example, the user loads the interactive video, participates in an interactive node, or watches predetermined video content), the required data are sent in a fixed format to the entry point of the data collection node. If sending fails, for example for network reasons, the data are cached and sent again at the next start-up.
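A minimal sketch of this fixed-format reporting with a retry cache follows, assuming the data collection entry point is an HTTP endpoint and that browser local storage is available; the endpoint URL, field names and storage key are hypothetical, not part of the patent.

```typescript
// Minimal sketch of fixed-format reporting with retry on next start-up.
// COLLECT_URL, the payload fields and CACHE_KEY are assumptions.
interface ReportEvent {
  eventType: string;                  // e.g. "interaction", "heartbeat", "load"
  videoId: string;
  nodeId?: string;                    // interactive node, when applicable
  userId: string;
  timestamp: number;
  payload: Record<string, unknown>;
}

const COLLECT_URL = "https://collector.example.com/ingest"; // assumed entry point
const CACHE_KEY = "pending_reports";

async function report(event: ReportEvent): Promise<void> {
  try {
    const res = await fetch(COLLECT_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(event),
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
  } catch {
    // Sending failed (e.g. network reasons): cache the event locally so it
    // can be sent again the next time the player starts.
    const pending: ReportEvent[] = JSON.parse(localStorage.getItem(CACHE_KEY) ?? "[]");
    pending.push(event);
    localStorage.setItem(CACHE_KEY, JSON.stringify(pending));
  }
}

// Called once at player start-up: flush events cached by earlier failed sends.
async function flushPendingReports(): Promise<void> {
  const pending: ReportEvent[] = JSON.parse(localStorage.getItem(CACHE_KEY) ?? "[]");
  localStorage.removeItem(CACHE_KEY);
  for (const ev of pending) {
    await report(ev);
  }
}
```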
Data reporting is a very important link in user behavior analysis: it directly determines the breadth, depth and quality of the data, influences all subsequent links, and needs to be adapted to different types of terminal devices. Data reporting can be realized by code buried points, full buried points or non-buried points. The code buried point is the most classical approach: buried-point code is merged into the business code by developers, thereby collecting user behavior data. Code buried points can be divided into front-end and back-end buried points according to their position: a front-end buried point records the user's operations on the client (terminal device), while a back-end buried point records the logs of server requests made by the client.
Interactive videos usually have a large number of interactive components, and reporting them one by one with the conventional buried-point approach would greatly increase the workload. Therefore, interactive behavior reporting is designed and developed for the interactive video: interaction events are monitored automatically, a preset reporting event is triggered automatically when a user interacts, and the corresponding data reporting action is executed within that event. This realizes non-buried-point reporting for the parts of the interactive video related to user interaction (such as the interactive nodes), so that data reflecting the user's interaction characteristics are collected. The user interaction behavior data related to the interactive nodes, contained in the user core index data, can fully reflect the user's interaction characteristics.
Optionally, when the interactive video plays to an interactive node, user interaction behavior data related to the user's interactive operation are collected.
For example, in the interactive video of a drama, when playback reaches an interactive node at which two characters in the drama carry out a robbery, the user takes part in the robbery by clicking an interactive control displayed on the terminal device.
At this interactive node (the robbery), user interaction behavior data related to the user's interactive operation can then be collected, for example the number of times the user clicks the interactive control, or the time from the moment the user triggers the interactive control of the node to the end of the interactive segment.
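The sketch below illustrates how such per-node collection might be wired up; the control element, the "interaction-end" event name and the reporting callback are assumptions made purely for illustration.

```typescript
// Hypothetical per-node observer: counts clicks on the interactive control
// and measures the time from the first trigger to the end of the segment.
function observeInteractiveNode(
  control: HTMLElement,
  nodeId: string,
  send: (data: { nodeId: string; clicks: number; durationMs: number }) => void,
): void {
  let clicks = 0;
  let firstTriggerAt: number | null = null;

  control.addEventListener("click", () => {
    clicks += 1;
    if (firstTriggerAt === null) {
      firstTriggerAt = Date.now();
    }
  });

  // Assumed custom event fired by the player when the interactive segment
  // ends (a choice was made, or the node timed out).
  control.addEventListener("interaction-end", () => {
    const durationMs = firstTriggerAt === null ? 0 : Date.now() - firstTriggerAt;
    send({ nodeId, clicks, durationMs });
  });
}
```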
A set of reporting events is designed for the characteristics of interactive video and embedded into the production tool for interactive video. Interactive videos finished with this production tool automatically report the corresponding interaction events; the non-buried-point collection and statistics of user behavior are completed by this process, and no additional development is needed.
The events mainly fall into the following categories (a sketch of such an event set follows the list):
1. interaction and operation, including but not limited to recording interaction results, playback-progress checkpoints, etc.;
2. logic jumps and variable updates, including but not limited to recording triggered event behavior and updates of variables/factors, etc.;
3. player behavior, including but not limited to playback heartbeats, player operations performed by the user, etc.;
4. video loading behavior, including but not limited to application initialization, application loading, user initialization, etc.
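One possible way to represent this event set in code is a small discriminated union, sketched below; the category and field names are illustrative assumptions rather than the schema actually used by the production tool.

```typescript
// Hypothetical typing of the four reported event categories listed above.
type ReportedEvent =
  | { category: "interaction"; result: string; checkpoint?: number }        // interaction results, progress checkpoints
  | { category: "logic"; variable: string; value: unknown }                 // logic jumps, variable/factor updates
  | { category: "player"; action: "heartbeat" | "pause" | "seek" }          // playback heartbeats, player operations
  | { category: "loading"; stage: "app-init" | "app-load" | "user-init" };  // loading behavior

// Example: a heartbeat event as it might be handed to the reporting layer.
const heartbeat: ReportedEvent = { category: "player", action: "heartbeat" };
```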
Step S102: preprocess the user core index data to obtain basic data, and store the basic data.
The user core index data collected in step S101 are raw big data. As a raw resource they suffer from poor quality, incompleteness, corruption and similar problems, so "desensitization" and "packaging" are needed to obtain complete, high-quality data (i.e., the basic data) for subsequent calculation and analysis.
Optionally, the user core index data may be preprocessed in units of a certain time interval (e.g., one hour).
Optionally, the user core index data may be cleaned, screened, merged, supplemented and compressed: the data collection node transmits the collected user core index data to a first database; at a predetermined time interval the user core index data in the first database are cleaned, screened, merged, supplemented and compressed to obtain the basic data, which are stored in a second database.
For example, the entry point of the data collection node may use a web beacon technique; the collection part then drops the data reported from the interactive video into an HDFS store (the first database) in a streaming manner, forming the raw data. At a certain time interval (for example, every 5 hours) the raw data are cleaned, screened, merged and supplemented, and then compressed to form the basic data used for all subsequent calculation and analysis; the basic data are stored in a Hive table (the second database).
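The in-memory sketch below illustrates what one such periodic preprocessing pass might do (cleaning, screening/de-duplication and merging) before the result is compressed and written to the base store; the record shape and field names are assumptions.

```typescript
// Hypothetical periodic pre-processing pass over the raw reported records.
interface RawRecord {
  userId?: string;
  videoId?: string;
  eventType?: string;
  timestamp?: number; // milliseconds since epoch
}

function preprocess(raw: RawRecord[]): Required<RawRecord>[] {
  // Cleaning: drop records with missing required fields or invalid values.
  const cleaned = raw.filter(
    (r): r is Required<RawRecord> =>
      !!r.userId && !!r.videoId && !!r.eventType &&
      typeof r.timestamp === "number" && r.timestamp > 0,
  );

  // Screening / de-duplication: keep one record per (user, video, event, time).
  const seen = new Set<string>();
  const deduped = cleaned.filter((r) => {
    const key = `${r.userId}|${r.videoId}|${r.eventType}|${r.timestamp}`;
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });

  // Merging: sort by time so downstream aggregation can scan events in order.
  return deduped.sort((a, b) => a.timestamp - b.timestamp);
}
```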
Data cleaning is the procedure of finding and correcting recognizable errors in a data file. In this step, suitable methods are chosen to "clean" the obvious error values, missing values, outliers and suspicious data found during data inspection, which helps subsequent statistical analysis reach reliable conclusions.
In general, data cleaning is the process of condensing a database by removing duplicate records and converting the remainder into a standard acceptable format. The standard model of data cleaning is to feed data into a cleaning processor, "clean" them through a series of steps, and output the cleaned data in the desired format. Data cleaning deals with missing values, out-of-range values, inconsistent codes, duplicate records and similar problems from the perspectives of accuracy, completeness, consistency, uniqueness, timeliness and validity.
After cleaning, whether the cleaned data conform to the expected format can be verified by sampling inspection.
Step S103: calculate and analyze the basic data to obtain user behavior statistical data.
The basic data comprehensively describe the relevant information about the users and the playing and playback states and conditions of the interactive video.
According to the requirements of user behavior statistics for interactive video, the basic data are calculated and analyzed, and the resulting user behavior statistical data can include the items below (a sketch computing several of these indicators follows the list):
(1) A user summary (user_summary), comprising:
Total play count: counted with reference to the time point of the user's first loading step. This is the earliest hit point that can be reached for user behavior and is theoretically closest to the "play count" on a video platform. At this moment loading has not yet completed, so viewing data for unique/logged-in users cannot be counted.
Number of loaded users: counted at the time point when loading of the interactive video's basic functions completes, as the number of unique users at that moment. This can generally be understood as the state in which loading is finished but no video has been played yet; users who could not play because loading failed for compatibility or system reasons are excluded.
Number of effective users: an effective user is a user who has participated in at least one interactive node of the interactive video, and the count is the number of such unique users.
(2) An interactive content and playback summary (content_summary), comprising:
Average number of interactions: the average, over users, of the total number of interactions at interactive nodes (including interactions completed automatically on timeout).
Average number of active interactions: similar to the above, but interactions completed automatically because the interactive segment timed out are not counted.
Average playing duration: the average physical playing time per user; time spent paused or skipped by fast-forwarding is not counted, and content played at higher speed is not converted. Owing to the way the data are collected, this statistic has an error within +/-1 minute.
Clearance rate: clearance means the user has watched the target video content designated after any interactive node of the interactive video, in other words has watched a final video designated as the target or has reached any ending. The clearance rate is the ratio of the number of clearing users to the number of effective users.
Average clearance count: the ratio of the number of times clearance was achieved to the number of effective users. It differs from the clearance rate in that repeated clearances by the same user are counted each time, so the average clearance count can theoretically be greater than 1.
Review rate: the percentage of effective users who viewed repeatedly. For interactive content that actively lets users restart (e.g. "from the beginning" or a "plot tree"), the number of users who performed such operations is recorded; for interactive content without such a mechanism, the number of users who reached the first interactive node (a potential content branching point) more than once is recorded.
Average review count: similar to the average clearance count; repeated review operations by the same user are counted each time.
Sharing rate: the percentage of effective users who performed a sharing operation, possibly including but not limited to sharing provided by the player or by system components.
Average share count: similar to the average clearance count.
(3) A chapter or key-video behavior list (chapter_list), comprising:
Chapter retention rate: key checkpoints can be defined in some videos, most commonly the beginning and end of a chapter. The percentage of users reaching each checkpoint relative to the number of users who completed loading is counted as the retention rate. Optionally, the ratio of the number of users at each checkpoint relative to the previous checkpoint is also provided. This index targets playback-driven rather than interaction-driven videos, so it is calculated on the basis of the number of users who completed loading.
(4) An achievement or key behavior list (activity_list), comprising:
Achievement rate: special user behaviors can be listed and manually instrumented with buried points; one common case is an achievement. The proportion of users who obtained the achievement among the effective users can be counted.
(5) An interactive node behavior list (intpt_list), comprising:
Interactive node retention rate: for a defined interactive node, the ratio between the number of users who watched the specified video content before the node and the number of users who watched the specified video content after the node can be calculated, i.e., the proportion of users of the video leading into the node who went on to reach content after the node.
Interactive node result distribution: if the node has branches, the proportion of users reaching each branch (including timeout) among the users who reached the node.
(6) A system component usage behavior list (component_list), recording, for each system component or custom non-interactive component, the number of users who used it and the proportion of clicks. Expanded presentation of more component context is also supported (possibly via an out-link to another report), e.g. whether the same "close" button was used in chapter one or chapter two.
(7) A player operation list (player_list), comprising:
An operation list recording the number of times each player behavior was triggered, broken down by the specific nature of the behavior (such as fast-forward or rewind).
A per-video operation list, which adds to the previous result a dimension indicating in which specific video the operation occurred.
(8) A video loading capability list (performance_list), comprising:
A loading time list recording the p90 and p99 values of several loading-completion time indices, broken down by terminal device family.
A loading step list recording the arrival rate of each loading stage, broken down by terminal device family. This entry does not necessarily apply to all projects.
(9) A headline board (spotlight), comprising:
PCU (peak concurrent users, the highest number of people online at the same time): directly gives the time point (by hour) with the largest play volume after the video goes online.
Longest playing time and shortest clearance time: in practice the 0.01 or 0.99 quantile is counted so as to remove obviously unreasonable results (e.g. clearing in one minute or playing for four or five days). This entry does not necessarily apply to all projects.
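The sketch below shows how several of the indicators above could be computed once the basic data have been reduced to a per-user aggregate; the record shape and field names are assumptions, not the patent's schema.

```typescript
// Hypothetical per-user aggregate derived from the basic data.
interface UserAggregate {
  interactedNodes: number;     // distinct interactive nodes the user took part in
  interactions: number;        // total interactions across all nodes
  clearances: number;          // times the designated target content was reached
  reachedBeforeNode: boolean;  // watched the specified content before a given node
  reachedAfterNode: boolean;   // watched the specified content after that node
}

function summarize(users: UserAggregate[]) {
  // Effective users: participated in at least one interactive node.
  const effective = users.filter((u) => u.interactedNodes >= 1);
  const effectiveCount = effective.length;

  // Clearance rate: users who cleared at least once / effective users.
  const clearanceRate =
    effectiveCount === 0 ? 0 :
    effective.filter((u) => u.clearances > 0).length / effectiveCount;

  // Average clearance count: total clearances / effective users (may exceed 1).
  const avgClearances =
    effectiveCount === 0 ? 0 :
    effective.reduce((s, u) => s + u.clearances, 0) / effectiveCount;

  // Average number of interactions per effective user.
  const avgInteractions =
    effectiveCount === 0 ? 0 :
    effective.reduce((s, u) => s + u.interactions, 0) / effectiveCount;

  // Retention at a given interactive node: users after the node / users before it.
  const before = users.filter((u) => u.reachedBeforeNode).length;
  const after = users.filter((u) => u.reachedAfterNode).length;
  const nodeRetention = before === 0 ? 0 : after / before;

  return { effectiveCount, clearanceRate, avgClearances, avgInteractions, nodeRetention };
}
```

In practice such aggregation would run as the T+1 offline computation described below, over the basic data stored in the warehouse.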
To reflect intuitively how each branch of the interactive video is chosen, Fig. 3 shows a data presentation based on the branch options of the interactive video. As shown in Fig. 3, each branch video node displays information such as the proportion of users who chose that branch and the proportion of users who quit when the branch timed out.
The calculation results use a T+1 delayed offline computation mode, forming long-term stored data with day-level granularity. Besides the scheduled tasks, simple task development and warehouse-supported customized calculations are also supported, to supplement a single project with custom indices beyond the standard ones.
Step S104: output the user behavior statistical data.
For the index results in the warehouse, the data can be exported as reports, or a number of general-purpose dashboards can be provided for direct viewing by connecting a data visualization tool of the user's choice, so that video content creators and operators can see more clearly and intuitively how the data results relate to the design and configuration of the content.
In summary, in the user behavior statistical method for interactive video provided by the embodiment of the invention, user core index data containing the user interaction behavior data related to the interactive nodes are automatically collected in a non-buried-point manner during playback of the interactive video and are calculated and analyzed to obtain user behavior statistical data. Because the interaction behavior characteristics of users are counted at the interactive nodes, the requirements of user behavior data statistics and analysis for interactive video can be fully met, so that the performance and results of the interactive parts of the interactive video can be understood, producers and operators of the interactive video obtain useful guidance and feedback, and the interactive video can be improved in a more targeted way.
Fig. 4 is a block diagram of a user behavior statistics apparatus for interactive video according to an exemplary embodiment of the present invention. The apparatus can implement all or part of the steps of the user behavior statistical method for interactive video provided by the embodiment shown in Fig. 2.
As shown in Fig. 4, the embodiment of the present invention provides a user behavior statistics apparatus for interactive video, which includes an acquisition module 10, a preprocessing module 20, a calculation and analysis module 30 and an output module 40.
The acquisition module 10 is configured to automatically collect user core index data in a non-buried-point manner during playback of the interactive video, the user core index data comprising user interaction behavior data related to the interactive nodes of the interactive video.
The preprocessing module 20 is configured to preprocess the user core index data to obtain basic data and to store the basic data.
The calculation and analysis module 30 is configured to calculate and analyze the basic data to obtain the user behavior statistical data.
The output module 40 is configured to output the user behavior statistical data.
In summary, in the solution shown in this embodiment, user core index data containing the user interaction behavior data related to the interactive nodes are automatically collected in a non-buried-point manner during playback of the interactive video and are calculated and analyzed to obtain user behavior statistical data. Because the interaction behavior characteristics of users are counted at the interactive nodes, the requirements of user behavior data statistics and analysis for interactive video can be fully met: the performance and results of the interactive parts of the interactive video can be understood, producers and operators of the interactive video obtain useful guidance and feedback, and the interactive video can be improved in a more targeted way.
It should be noted that the division into the above functional modules is only an example of how the apparatus provided in the foregoing embodiment implements its functions. In practical applications, the above functions may be assigned to different functional modules as needed, i.e., the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
The specific manner in which the modules perform their operations has been described in detail in the method embodiments and is not repeated here for the apparatus embodiment.
Fig. 5 is a block diagram of an apparatus according to an exemplary embodiment of the present invention.
Based on the method shown in Fig. 2, an embodiment of the present invention correspondingly further provides a device, which includes a processor 51 and a memory 52, the memory 52 storing executable instructions of the processor 51, and the processor 51 being configured to perform the method shown in Fig. 2 by executing the executable instructions. The device further comprises a bus 53 configured to couple the processor 51 and the memory 52.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments. It will be appreciated that the relevant features of the method and apparatus described above may be referred to one another.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
Unless otherwise defined, technical or scientific terms used herein have the meaning commonly understood by those of ordinary skill in the art to which the invention belongs. The terms "first", "second" and similar terms in the present application do not denote any order, quantity or importance, but are used to distinguish one element from another. Likewise, the terms "a", "an" and the like do not denote a limitation of quantity, but the presence of at least one. Terms such as "connected" or "coupled" are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Terms such as "upper", "lower", "left" and "right" merely indicate relative positional relationships, which change accordingly when the absolute position of the described object changes.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A user behavior statistical method for interactive video, characterized by comprising the following steps:
during playback of an interactive video, automatically collecting user core index data in a non-buried-point manner, the user core index data comprising user interaction behavior data related to the interactive nodes of the interactive video;
preprocessing the user core index data to obtain basic data, and storing the basic data;
calculating and analyzing the basic data to obtain user behavior statistical data; and
outputting the user behavior statistical data.
2. The method of claim 1, further comprising:
performing non-buried-point processing on the parts of the interactive video related to the user core index data.
3. The method of claim 1, wherein automatically collecting user core index data in a non-buried-point manner during playback of the interactive video comprises:
when the interactive video plays to an interactive node, collecting user interaction behavior data related to the user's interactive operation.
4. The method of claim 1, wherein preprocessing the user core index data comprises:
cleaning, screening, merging, supplementing and compressing the user core index data.
5. The method of claim 1, wherein calculating and analyzing the basic data comprises:
calculating the number of effective users, an effective user being a user who has participated in at least one interactive node of the interactive video.
6. The method of claim 1, wherein calculating and analyzing the basic data comprises:
calculating the number of clearing users, where clearance means that the user has watched the designated target video content after finishing any interactive node of the interactive video.
7. The method of claim 1, wherein calculating and analyzing the basic data comprises:
calculating the user retention rate of an interactive node, including calculating the ratio between the number of users who watched the specified video content before the interactive node and the number of users who watched the specified video content after the interactive node.
8. The method of claim 1, wherein calculating and analyzing the basic data comprises:
calculating the average, over users, of the total number of interactions performed at a given interactive node of the interactive video.
9. A user behavior statistics apparatus for interactive video, characterized by comprising:
an acquisition module, configured to automatically collect user core index data in a non-buried-point manner during playback of an interactive video, the user core index data comprising user interaction behavior data related to the interactive nodes of the interactive video;
a preprocessing module, configured to preprocess the user core index data to obtain basic data and to store the basic data;
a calculation and analysis module, configured to calculate and analyze the basic data to obtain user behavior statistical data; and
an output module, configured to output the user behavior statistical data.
10. A device, comprising a processor and a memory for storing executable instructions of the processor, wherein the processor is configured to perform the method of any one of claims 1-8 by executing the executable instructions.
CN202110279492.2A 2021-03-16 2021-03-16 User behavior statistical method, device and equipment for interactive video Pending CN113099275A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110279492.2A CN113099275A (en) 2021-03-16 2021-03-16 User behavior statistical method, device and equipment for interactive video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110279492.2A CN113099275A (en) 2021-03-16 2021-03-16 User behavior statistical method, device and equipment for interactive video

Publications (1)

Publication Number Publication Date
CN113099275A true CN113099275A (en) 2021-07-09

Family

ID=76668140

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110279492.2A Pending CN113099275A (en) 2021-03-16 2021-03-16 User behavior statistical method, device and equipment for interactive video

Country Status (1)

Country Link
CN (1) CN113099275A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102307315A (en) * 2011-04-22 2012-01-04 赛特斯网络科技(南京)有限责任公司 User behavior analysis device in Internet protocol television (IPTV) system, and system for realizing analysis application
US20150120470A1 (en) * 2013-10-24 2015-04-30 Yahoo! Inc. Multi-protocol interactive mobile video advertising
CN106933472A (en) * 2017-05-20 2017-07-07 南京西桥科技有限公司 A kind of user behavior data acquisition system and its control method based on mobile phone APP
CN108769814A (en) * 2018-06-01 2018-11-06 腾讯科技(深圳)有限公司 Video interaction method, device and readable medium
CN112416995A (en) * 2019-08-23 2021-02-26 腾讯科技(深圳)有限公司 Data statistical method and device, computer equipment and storage medium
CN110515679A (en) * 2019-08-28 2019-11-29 北京思维造物信息科技股份有限公司 Collecting method, device, equipment and storage medium
CN110716848A (en) * 2019-10-18 2020-01-21 广州华多网络科技有限公司 Data collection method and device, electronic equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116320517A (en) * 2023-03-21 2023-06-23 北京网梯科技发展有限公司 Learning track merging method and device and electronic equipment
CN116320517B (en) * 2023-03-21 2024-05-24 北京网梯科技发展有限公司 Learning track merging method and device and electronic equipment

Similar Documents

Publication Publication Date Title
US10926184B2 (en) Synchronized video with in game telemetry
US20170230731A1 (en) System and method to create a media content summary based on viewer annotations
US10986064B2 (en) Ascertaining events in media
CN110830735B (en) Video generation method and device, computer equipment and storage medium
CN111294609A (en) Live content display method and device, electronic equipment and readable storage medium
US20120144311A1 (en) Computerized system and method for commenting on sub-events within a main event
CN109151488B (en) Method and system for recommending live broadcast room in real time according to user behaviors
CN107239389A (en) A kind of method and device that user operation records are determined in mixing APP
CN111263170B (en) Video playing method, device and equipment and readable storage medium
WO2021106034A1 (en) Server device and electronic commerce method
CN104205862A (en) Dynamic search service
CN111107434A (en) Information recommendation method and device
CN109522191A (en) A kind of method and device of the attribute information of acquisition interbehavior instruction
US20050177613A1 (en) Statistical and vouyeristic link behavioral tracking and presentation tools
CN113099275A (en) User behavior statistical method, device and equipment for interactive video
CN109388737A (en) A kind of sending method, device and the storage medium of the exposure data of content item
CN108334429A (en) Method, apparatus and system for investigating front end page problem
CN107343221B (en) Online multimedia interaction system and method
CN111353455B (en) Video content determining method and device, storage medium and electronic equipment
CN111182316B (en) Stream switching method and device of media resource, storage medium and electronic device
CN111581518A (en) Information pushing method and device
US20190313156A1 (en) Asynchronous Video Conversation Systems and Methods
US11000771B1 (en) Gameplay telemetry and video acquisition system
CN105847898A (en) Video automatic releasing method and device
CN104410874A (en) A method, a device, and a system for detecting video viscosity information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210709