CN110971976B - Audio and video file analysis method and device - Google Patents

Audio and video file analysis method and device

Info

Publication number
CN110971976B
Authority
CN
China
Prior art keywords
behavior
data
playing
play
audio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911159203.4A
Other languages
Chinese (zh)
Other versions
CN110971976A (en)
Inventor
胡慧
贾宝军
徐雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China United Network Communications Group Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd filed Critical China United Network Communications Group Co Ltd
Priority to CN201911159203.4A priority Critical patent/CN110971976B/en
Publication of CN110971976A publication Critical patent/CN110971976A/en
Application granted granted Critical
Publication of CN110971976B publication Critical patent/CN110971976B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/466 Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44204 Monitoring of content usage, e.g. the number of times a movie has been viewed, copied or the amount which has been watched

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The invention discloses an audio and video file analysis method and device. The method comprises the following steps: obtaining access data of an audio and video file to be analyzed according to a log of the audio and video file; determining behavior data according to the access data, the behavior data being operation data of a user on the audio and video file; and determining the difficulty of the content of the audio and video file and/or the interest of the user in the audio and video file according to the behavior data. The method analyzes the difficulty of the content and the user's interest in the file from the user's operation data. Compared with analyzing an audio and video file according to indexes such as average watching time and audience coverage rate, it takes into account situations in which a user plays the audio and video file without actually paying attention to it, yields more accurate and reasonable analysis results, and provides powerful decision support for operators adjusting the content of audio and video files.

Description

Audio and video file analysis method and device
Technical Field
The invention relates to the technical field of audio and video file analysis, in particular to an audio and video file analysis method and device.
Background
With the development of informatization, various media institutions have established online audio/video playing platforms and put large numbers of audio/video files online. The difficulty of the content of an audio/video file and the degree of user interest in it are generally analyzed by referring to several indexes: user scores, user comments, average watching time and audience coverage rate. User scores and user comments come from users' subjective evaluations and therefore have certain limitations. The average watching time is the average length of time users watch the audio/video file, and the audience coverage rate is the proportion of users who finish watching it. If a user plays the audio/video file but does not focus attention on its content, or leaves the player to do other things, these two indexes cannot reflect the difficulty of the content or the user's interest in the file, and so they also have certain limitations.
Therefore, there is a need for an audio/video file analysis scheme to solve the above technical problems.
Disclosure of Invention
Therefore, the invention provides an audio and video file analysis method and device, aiming to solve the problem that analysis of the difficulty of the content of an audio and video file and of the user's interest in the audio and video file is inaccurate.
In order to achieve the above object, a first aspect of the present invention provides an audio/video file analysis method, including:
acquiring access data of the audio and video file according to a log of the audio and video file to be analyzed;
determining behavior data according to the access data, wherein the behavior data is operation data of the user on the audio and video file;
and determining the difficulty of the content of the audio and video file and/or the interest of the user in the audio and video file according to the behavior data.
Preferably, after the determining the behavior data according to the access data, before determining the difficulty level of the content of the audio-video file and/or the interest level of the user in the audio-video file according to the behavior data, the method further includes:
determining behavior attributes of the behavior data;
and processing the behavior data according to the behavior attribute.
Preferably, the behavior data includes variable-speed playing behavior data, behavior attributes of the variable-speed playing behavior data include a playing speed before the variable-speed playing behavior occurs and a playing speed after the variable-speed playing behavior occurs, and the processing the behavior data according to the behavior attributes includes:
and deleting the variable-speed playing behavior data with the same playing speed before the variable-speed playing behavior occurs and the same playing speed after the variable-speed playing behavior occurs.
Preferably, the behavior data includes drag play behavior data, the behavior attribute of the drag play behavior data includes a play time before the occurrence of the drag play behavior and a play time after the occurrence of the drag play behavior, and the processing the behavior data according to the behavior attribute includes:
deleting the drag play behavior data in which the play time before the drag play behavior occurs is the same as the play time after the drag play behavior occurs.
Preferably, the behavior data includes drag play behavior data, behavior attributes of the drag play behavior data include a play time before occurrence of a drag play behavior, a play time after occurrence of the drag play behavior, an executor of the drag play behavior, and an occurrence time of the drag play behavior, and the processing the behavior data according to the behavior attributes includes:
determining drag play behavior data to be processed from each drag play behavior data of the audio and video file to be analyzed, wherein the play time before the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time after the drag play behavior of other data in the drag play behavior data to be processed occurs, or the play time after the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time before the drag play behavior of other data in the drag play behavior data to be processed occurs;
determining the executor of the drag play behavior and the occurrence time of the drag play behavior of the drag play behavior data to be processed;
and merging, according to the play time before the drag play behavior occurs and the play time after the drag play behavior occurs, the drag play behavior data to be processed whose executors of the drag play behavior and whose occurrence times of the drag play behavior are respectively the same.
Preferably, the behavior data includes play behavior data, and after the determining the behavior data according to the access behavior, the method further includes:
determining the playing time period of the audio/video file to be analyzed according to the playing behavior data;
and adjusting the load of the audio and video server according to the playing time period of each audio and video file to be analyzed.
The invention provides an audio and video file analysis device, which comprises an acquisition module, a first determination module and a second determination module, wherein the acquisition module is used for acquiring access data of an audio and video file to be analyzed according to a log of the audio and video file;
the first determining module is used for determining behavior data according to the access data, wherein the behavior data is operation data of a user on the audio and video file;
and the second determining module is used for determining the difficulty of the content of the audio and video file and/or the interest of the user in the audio and video file according to the behavior data.
Preferably, the apparatus further includes a processing module, and the processing module is configured to determine a behavior attribute of the behavior data, and process the behavior data according to the behavior attribute.
Preferably, the behavior data includes variable-speed playing behavior data, and the behavior attribute of the variable-speed playing behavior data includes a playing speed before the variable-speed playing behavior occurs and a playing speed after the variable-speed playing behavior occurs;
the processing module is used for deleting the variable-speed playing behavior data in which the playing speed before the variable-speed playing behavior occurs is the same as the playing speed after the variable-speed playing behavior occurs.
Preferably, the behavior data includes drag play behavior data, and the behavior attribute of the drag play behavior data includes a play time before the occurrence of the drag play behavior and a play time after the occurrence of the drag play behavior;
the processing module is used for deleting the drag play behavior data in which the play time before the drag play behavior occurs is the same as the play time after the drag play behavior occurs.
Preferably, the behavior data includes drag play behavior data, and the behavior attribute of the drag play behavior data includes a play time before the occurrence of the drag play behavior, a play time after the occurrence of the drag play behavior, an executor of the drag play behavior, and an occurrence time of the drag play behavior;
the processing module is configured to determine drag play behavior data to be processed from each drag play behavior data of the audio/video file to be analyzed, where the play time before the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time after the drag play behavior of other data in the drag play behavior data to be processed occurs, or the play time after the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time before the drag play behavior of other data in the drag play behavior data to be processed occurs; determine the executor of the drag play behavior and the occurrence time of the drag play behavior of the drag play behavior data to be processed; and merge, according to the play time before the drag play behavior occurs and the play time after the drag play behavior occurs, the drag play behavior data to be processed whose executors of the drag play behavior and whose occurrence times of the drag play behavior are respectively the same.
Preferably, the behavior data includes play behavior data, and the apparatus further includes an adjustment module; the adjusting module is used for determining the playing time period of the audio and video files to be analyzed according to the playing behavior data and adjusting the load of the audio and video server according to the playing time period of each audio and video file to be analyzed.
The invention has the following advantages:
according to the method, the access data of the audio and video file is acquired according to the log of the audio and video file to be analyzed, behavior data is determined according to the access data, the behavior data being operation data of a user on the audio and video file, and the difficulty of the content of the audio and video file and/or the interest of the user in the audio and video file is determined according to the behavior data. The method can analyze the difficulty of the content and the user's interest in the file from the user's operation data. Compared with analyzing an audio and video file according to indexes such as average watching time and audience coverage rate, it takes into account situations in which a user plays the audio and video file without actually paying attention to it, yields more accurate and reasonable analysis results, and provides powerful decision support for operators adjusting the content of audio and video files.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention.
Fig. 1 is a schematic flow diagram of an audio/video file analysis method according to an embodiment of the present invention;
fig. 2 is a data statistics diagram of playing behavior of a file in a video server according to an embodiment of the present invention;
FIG. 3a is a statistical chart of behavior data of a video file according to an embodiment of the present invention;
fig. 3b is a statistical chart of pause play behavior data, drag play behavior data, and variable speed play behavior data of a certain video file according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an audio/video file analysis apparatus according to an embodiment of the present invention;
fig. 5 is a second schematic structural diagram of an audio/video file analysis apparatus according to an embodiment of the present invention;
fig. 6 is a third schematic structural diagram of an audio/video file analysis apparatus according to an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples are intended to illustrate and explain the present invention only, not to limit it.
As shown in fig. 1, the audio/video file analysis method provided by the present invention may include the following steps:
and step S101, acquiring access data of the audio and video file according to the log of the audio and video file to be analyzed.
The server of the audio/video playing platform can generate a server log from all of a user's activities on audio/video files. The types of information included in the server log include the user name accessing the audio/video file, the page address of the audio/video file, the access time at which the user accessed the audio/video file, IP (Internet Protocol) address information, page operation information (Page_action), the audio/video file number, and the like. The server log therefore has great value for analyzing the audio and video files on the platform and for optimizing the platform. According to the analysis requirement, the user name, access time, page address and page operation information can be selected as the access data for further analysis of the audio and video file.
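By way of illustration only, the selection of access data from a server log could be sketched in Python as follows; the CSV-style layout and the field names (username, page_url, access_time, ip, page_action, file_id) are assumptions made for this example and are not fixed by the method.

    import csv

    # Assumed column order of the server log; real platforms will differ.
    LOG_FIELDS = ["username", "page_url", "access_time", "ip", "page_action", "file_id"]

    def load_access_data(log_path):
        """Read the server log and keep only the fields selected as access data."""
        access_data = []
        with open(log_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f, fieldnames=LOG_FIELDS):
                # Keep user name, access time, page address and page operation information.
                access_data.append({
                    "username": row["username"],
                    "access_time": row["access_time"],
                    "page_url": row["page_url"],
                    "page_action": row["page_action"],
                })
        return access_data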
And S102, determining behavior data according to the access data, wherein the behavior data is operation data of the user on the audio and video file.
In the embodiment of the present invention, the event_type carried in the page operation information indicates the type of operation performed by the user on the audio/video file, and may include video loading (load_video), video playing (play_video), pause playing (page_video), variable-speed playing (speed_change_video), skip playing (seq_video), drag playing (seek_video), page closing (page_close), and the like.
In the embodiment of the invention, since the event type is converted from JSON (JavaScript Object Notation) format into text format for storage, the access data is not classified at a low dimension; the features of low-dimensional data (the types of information contained in the access data) carry less content, and the specific operations of a user on the audio/video file cannot be reflected if low-dimensional data is used directly for analysis. The loads() function may be used to parse the page operation information, and the event type and its value may be matched using the regular expression functions compile() and search(). According to the analysis requirement, behavior data of any of several operation types, such as video playing, variable-speed playing, skip playing and drag playing, can be selected to further analyze the audio and video file. It should be noted that the behavior data of the present invention is not limited to the above four types of operation behavior data, and may also include pause play behavior data.
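As a purely illustrative sketch of this parsing step, the event type could be extracted as follows; the 'event_type' key and the quoting style of the stored text are assumptions.

    import json
    import re

    # Fallback pattern for stored text that is no longer valid JSON (assumed quoting style).
    EVENT_PATTERN = re.compile(r"['\"]event_type['\"]\s*:\s*['\"](\w+)['\"]")

    def extract_event_type(page_action):
        """Return the operation type carried in the page operation information."""
        try:
            data = json.loads(page_action)
            return data.get("event_type") if isinstance(data, dict) else None
        except (json.JSONDecodeError, TypeError):
            # Fall back to a regular-expression search on the text form.
            match = EVENT_PATTERN.search(str(page_action))
            return match.group(1) if match else None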
And step S103, determining the difficulty of the content of the audio and video file and/or the interest of the user in the audio and video file according to the behavior data.
In the embodiment of the invention, the number of each kind of behavior can be counted for each audio/video file, and the number of behaviors intuitively reflects the difficulty of the content of the audio/video file and/or the degree of user interest in the audio/video file. For example, the difficulty of the content can be analyzed according to the number of occurrences of drag play behaviors and variable-speed play behaviors. A drag play can be a rewind or a fast-forward, and a variable-speed play can be slow-speed playing or fast-speed playing. If the number of rewind behaviors or slow-play behaviors within a certain playing duration of the audio/video file is large, the content difficulty is high; if the number of fast-forward or fast-play behaviors is large, the content difficulty is low. The degree of user interest in the audio/video file can be analyzed according to the numbers of play behaviors and skip play behaviors: if the number of play behaviors within a certain playing duration is large, users may be interested in the content, and if the number of skip play behaviors is large, users are not interested in the content.
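A minimal, purely illustrative heuristic for this counting step is sketched below; the event names (rewind/forward drags, slow/fast speed changes) and the simple majority comparison are assumptions used only to show the idea, not values fixed by the method.

    from collections import Counter

    def analyse_file(behavior_records):
        """behavior_records: list of dicts for one file, each with an 'event_type' key."""
        counts = Counter(r["event_type"] for r in behavior_records)
        rewind_like = counts["seek_back"] + counts["slow_play"]      # assumed event names
        forward_like = counts["seek_forward"] + counts["fast_play"]  # assumed event names
        difficulty = "high" if rewind_like > forward_like else "low"
        interest = "high" if counts["play_video"] > counts["skip_video"] else "low"
        return difficulty, interest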
It can be seen from the above steps S101-S103 that, in the present invention, access data of an audio/video file is obtained according to a log of the audio/video file to be analyzed, behavior data is determined according to the access data, the behavior data being operation data of a user on the audio/video file, and the difficulty of the content of the audio/video file and/or the user's interest in the audio/video file is determined according to the behavior data. The method provided by the invention can analyze the difficulty of the content and the user's interest in the file from the user's operation data. Compared with analyzing an audio/video file according to data such as average watching time and audience coverage rate, it takes into account situations in which a user plays the audio/video file without actually paying attention to it, yields more accurate and reasonable analysis results, and provides powerful decision support for operators adjusting the content of audio/video files.
Further, the audio/video file analysis method provided by the present invention may further include the following steps after determining the behavior data according to the access data, and before determining the difficulty level of the content of the audio/video file and/or the interest level of the user in the audio/video file according to the behavior data: determining behavior attributes of the behavior data; the behavior data is processed according to the behavior attributes.
The behavior data is derived from the server log file, and if the interference data in the server log file is not filtered, the accuracy of the analysis result may be directly or indirectly affected, so that the behavior data needs to be processed to ensure the accuracy of the result.
It should be noted that the behavior attributes are not particularly limited in the present invention, and different behavior attributes may be selected for different behavior data according to the analysis requirement. For example, for play behavior data or pause play behavior data, attributes such as the executor of the behavior, the occurrence time of the behavior, and the current playing time of the audio/video file may be selected for analysis. For skip play behavior data, attributes such as the executor of the behavior, the occurrence time of the behavior, the current video number, and the number of the video jumped to may be selected for analysis.
Further, in the embodiment of the present invention, the behavior data may include variable-speed playing behavior data, and the behavior attributes of the variable-speed playing behavior data may include the playing speed before the variable-speed playing behavior occurs and the playing speed after the variable-speed playing behavior occurs. The processing of the behavior data according to the behavior attributes in the method provided by the invention may include the following step: deleting the variable-speed playing behavior data in which the playing speed before the variable-speed playing behavior occurs is the same as the playing speed after the variable-speed playing behavior occurs.
If the speed before the change carried in the variable-speed playing behavior data is the same as the speed after the change, the variable-speed playing behavior is a meaningless operation: the user did not actually change the current playing speed of the audio/video file, but selected an option identical to the current playing speed after clicking the variable-speed option. Deleting such variable-speed playing behavior data avoids affecting the accuracy of the analysis result. Specifically, for the variable-speed playing behavior data, its behavior attributes may be determined to be the playing speed before the variable-speed playing behavior occurs (i.e., the speed before the change, old_speed) and the playing speed after the variable-speed playing behavior occurs (i.e., the speed after the change, new_speed). The variable-speed playing behavior data can be found by judging whether the event type is variable-speed playing, and the variable-speed playing behavior data whose speed before the change is the same as its speed after the change is deleted.
Further, in the embodiment of the present invention, the behavior data may include drag play behavior data, and the behavior attributes of the drag play behavior data may include the play time before the drag play behavior occurs and the play time after the drag play behavior occurs. The processing of the behavior data according to the behavior attributes in the method provided by the invention may include the following step:
deleting the drag play behavior data in which the play time before the drag play behavior occurs is the same as the play time after the drag play behavior occurs.
If the audio/video playing time at the beginning of the drag carried in the drag play behavior data is the same as the audio/video playing time at the end of the drag, the drag play behavior is a meaningless operation: the user did not actually change the current playing time of the audio/video file, but returned to the playing time at which the drag began when the drag action ended. Deleting such drag play behavior data avoids affecting the accuracy of the analysis result. Specifically, for the drag play behavior data, its behavior attributes may be determined to be the play time before the drag play behavior occurs (i.e., the drag start time, old_time) and the play time after the drag play behavior occurs (i.e., the drag end time, new_time). The drag play behavior data can be found by judging whether the event type is drag play, and the drag play behavior data whose drag start time is the same as its drag end time is deleted.
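Both no-op filters described above (the unchanged-speed case and the unchanged-position case) can be sketched together as follows; the record layout is an assumption, while old_speed/new_speed and old_time/new_time follow the attribute names used in the description.

    def drop_noop_records(behavior_data):
        """Remove meaningless variable-speed and drag play records."""
        cleaned = []
        for rec in behavior_data:
            if rec["event_type"] == "speed_change_video" and rec["old_speed"] == rec["new_speed"]:
                continue  # speed unchanged: meaningless variable-speed play record
            if rec["event_type"] == "seek_video" and rec["old_time"] == rec["new_time"]:
                continue  # drag returned to the same position: meaningless drag play record
            cleaned.append(rec)
        return cleaned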
Further, in the embodiment of the present invention, the behavior data may include drag play behavior data, and the behavior attributes of the drag play behavior data may include the play time before the drag play behavior occurs, the play time after the drag play behavior occurs, the executor of the drag play behavior, and the occurrence time of the drag play behavior. The processing of the behavior data according to the behavior attributes in the method provided by the invention may include the following steps: determining drag play behavior data to be processed from each drag play behavior data of the audio/video file to be analyzed, where the play time before the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time after the drag play behavior of other data in the drag play behavior data to be processed occurs, or the play time after the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time before the drag play behavior of other data in the drag play behavior data to be processed occurs; determining the executor of the drag play behavior and the occurrence time of the drag play behavior of the drag play behavior data to be processed; and merging, according to the play time before the drag play behavior occurs and the play time after the drag play behavior occurs, the drag play behavior data to be processed whose executors of the drag play behavior and whose occurrence times of the drag play behavior are respectively the same.
Since the user's drag play operation on an audio/video file is a continuous behavior, one drag play operation may generate multiple drag play behavior records; merging all data records generated by one drag play operation when analyzing the drag play behavior data avoids affecting the accuracy of the analysis result. Specifically, for the drag play behavior data, its behavior attributes may be determined to be the play time before the drag play behavior occurs (i.e., the drag start time, old_time), the play time after the drag play behavior occurs (i.e., the drag end time, new_time), the executor of the drag play behavior (i.e., the Username), and the occurrence time of the drag play behavior (i.e., the access time, Date). If several drag play behavior records are generated under one drag play operation of the user, the user names of these consecutive records are the same, and since the time required to complete one drag operation is generally short, the access times of these consecutive records (i.e., the times at which the drag play behaviors were recorded) are also considered the same. In consecutive drag play behavior records, the drag end time in the previous record is the drag start time in the next record. Only the first record and the last record generated under one drag play operation need to be merged and the redundant records deleted, so that finally one drag play operation corresponds to only one drag play behavior record.
The drag play behavior data can be found by judging whether the event type is drag play. The drag play behavior data to be processed is determined according to the drag start time and the drag end time, that is, from the drag play behavior data of the audio/video file to be analyzed, a series of records is found in which the drag end time of the previous record connects to the drag start time of the next record. Further, it can be determined whether the user names and access times of these records are respectively the same. For the drag play behavior records generated under one drag play operation, the drag start time of the previous record is assigned to the drag start time of the next record and the previous record is deleted, until one drag play operation corresponds to only one drag play behavior record.
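The merging rule can be sketched as follows; the record fields follow the attributes above, and it is assumed for illustration that the records of one user arrive in log order.

    def merge_drag_records(drag_records):
        """Collapse chained drag play records into one record per drag operation."""
        merged = []
        for rec in drag_records:  # assumed to be in log order
            prev = merged[-1] if merged else None
            if (prev
                    and prev["username"] == rec["username"]
                    and prev["access_time"] == rec["access_time"]
                    and prev["new_time"] == rec["old_time"]):
                # Pass the start time of the previous record on to this one,
                # then drop the previous record, keeping only the overall span.
                rec = dict(rec, old_time=prev["old_time"])
                merged[-1] = rec
            else:
                merged.append(rec)
        return merged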
Further, in the embodiment of the present invention, the behavior data may further include play behavior data, and after determining the behavior data according to the access behavior, the method provided by the present invention may further include the following steps:
determining the playing time period of the audio/video file according to the playing behavior data; and adjusting the load of the audio and video server according to the playing time period of each audio and video file.
In the embodiment of the present invention, the number of play behaviors may be counted according to the occurrence time of the play behaviors (i.e., the access time, Date) carried in the play behavior data. For example, the number of play behavior records of an audio/video file in each time period of the 24 hours of a day may be counted, and the playing time periods of the audio/video file may be determined. Further, the total amount of play behavior data in each time period can be counted according to the playing time periods of all the audio/video files on the audio/video server, and compared with a threshold preset according to historical records, so as to determine the playing peak periods and playing idle periods of all the audio/video files on the audio/video server. The load of the audio/video server can be appropriately increased during playing peak periods and appropriately reduced during playing idle periods, which enhances user experience while avoiding waste of network resources.
As shown in fig. 2, taking the analysis of a certain video server as an example, after counting the play behavior data of all video files on the video server, it is found that the number of video play behavior records keeps increasing from 6:00 to 8:00 and peaks at 8:00, exceeding the threshold set in advance according to the access volume of the past week, so the load needs to be increased; the number of video play behavior records decreases after 8:00 and falls below the preset threshold at 9:00, so the load can be appropriately reduced.
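An illustrative per-hour decision corresponding to the scenario of fig. 2 could look as follows; the timestamp format and the single fixed threshold are assumptions made for the example.

    from collections import Counter

    def plan_load(play_records, threshold):
        """play_records carry an 'access_time' such as '2019-11-22 08:13:05'."""
        per_hour = Counter(r["access_time"][11:13] for r in play_records)
        # Increase the server load for hours above the preset threshold, reduce it otherwise.
        return {hour: ("increase" if count > threshold else "reduce")
                for hour, count in sorted(per_hour.items())}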
In addition, the audio and video file analysis method provided by the invention can also determine the optimal playing time of the audio and video file according to the behavior data.
Specifically, the total amount of behavior data in different playing time periods may be counted according to the behavior data of the audio/video file, for example, the total amount of behavior data within playing minute 0-1, within playing minute 1-2, ..., and within playing minute (n-1)-n. If the total amount of behavior data shows a clearly decreasing trend starting from the (m-1)th minute, the optimal playing duration of the audio/video file can be considered to be m minutes, where m is smaller than n.
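This per-minute statistic can be sketched as follows; the play_position field (seconds into the file at which a behavior occurred) and the 50% drop cutoff are assumptions used only for illustration.

    from collections import Counter

    def first_sharp_drop_minute(behavior_records, drop_ratio=0.5):
        """Return the first playing minute from which behavior counts fall sharply."""
        per_minute = Counter(int(r["play_position"]) // 60 for r in behavior_records)
        minutes = sorted(per_minute)
        for prev, cur in zip(minutes, minutes[1:]):
            if per_minute[cur] < per_minute[prev] * drop_ratio:
                return cur  # counts drop clearly from this minute onward
        return None  # no clear drop detected

In the example of fig. 3a below, such a drop appears after 360 seconds, consistent with the 5-minute optimal playing duration read from that figure.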
As a specific implementation, the method provided by the invention can be applied to online teaching scenarios. With the development of education informatization, colleges and universities have established online teaching cloud platforms and put large numbers of video courses online. Compared with traditional open courses, online teaching has no limitation on time, place or number of participants, and learners can study through the network. However, because online teaching lacks the supervision measures and teaching environment of the conventional education mode, it is difficult to adjust the teaching plan and course structure according to learners' learning effects. It is therefore possible to analyze the difficulty of the content of a video file, learners' interest in the video file, and the optimal playing duration of the video file, and to adjust the teaching plan and course structure accordingly, thereby achieving the best teaching effect.
As shown in fig. 3a and 3b, taking the analysis of a video file on a certain video learning server as an example, after counting the behavior data of the video file, it is found that after 360 seconds (i.e. 5 minutes) the amount of video learning behavior data (i.e. all behavior data) shows a clearly decreasing trend, as do the amounts of pause play behavior data, drag play behavior data and variable-speed play behavior data; it can be seen that the optimal playing duration of the video file is 5 minutes.
The audio and video file analysis method provided by the invention can determine the difficulty of the content of an audio/video file and/or the user's interest in the audio/video file according to the user's operation data on the file, and can determine the optimal playing duration of the audio/video file, providing decision support for operators of an audio/video playing platform in adjusting audio/video content and playing duration. It can also adjust the load of the audio/video server according to the user's operation data on the audio/video files, providing decision support for operation and maintenance personnel of the platform in adjusting the server load.
Based on the same technical concept, an embodiment of the present invention further provides an audio/video file analysis device, as shown in fig. 4, the device includes an obtaining module 401, a first determining module 402, and a second determining module 403, where the obtaining module 401 is configured to obtain access data of an audio/video file according to a log of the audio/video file to be analyzed;
the first determining module 402 is configured to determine behavior data according to the access data, where the behavior data is operation data of a user on an audio/video file;
the second determining module 403 is configured to determine, according to the behavior data, a difficulty level of content of the audio/video file and/or an interest level of the user in the audio/video file.
Preferably, as shown in fig. 5, the audio/video file analysis apparatus provided by the present invention further includes a processing module 404, where the processing module 404 is configured to determine a behavior attribute of the behavior data; the behavior data is processed according to the behavior attributes.
Preferably, the behavior data includes variable-speed playing behavior data, and the behavior attributes of the variable-speed playing behavior data include the playing speed before the variable-speed playing behavior occurs and the playing speed after the variable-speed playing behavior occurs. As shown in fig. 5, the processing module 404 is configured to delete the variable-speed playing behavior data in which the playing speed before the variable-speed playing behavior occurs is the same as the playing speed after the variable-speed playing behavior occurs.
Preferably, the behavior data includes drag play behavior data, and the behavior attributes of the drag play behavior data include the play time before the drag play behavior occurs and the play time after the drag play behavior occurs. As shown in fig. 5, the processing module 404 is configured to delete the drag play behavior data in which the play time before the drag play behavior occurs is the same as the play time after the drag play behavior occurs.
Preferably, the behavior data includes drag play behavior data, and the behavior attributes of the drag play behavior data include the play time before the drag play behavior occurs, the play time after the drag play behavior occurs, the executor of the drag play behavior, and the occurrence time of the drag play behavior. As shown in fig. 5, the processing module 404 is configured to determine drag play behavior data to be processed from each drag play behavior data of the audio/video file to be analyzed, where the play time before the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time after the drag play behavior of other data in the drag play behavior data to be processed occurs, or the play time after the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time before the drag play behavior of other data in the drag play behavior data to be processed occurs; determine the executor of the drag play behavior and the occurrence time of the drag play behavior of the drag play behavior data to be processed; and merge, according to the play time before the drag play behavior occurs and the play time after the drag play behavior occurs, the drag play behavior data to be processed whose executors of the drag play behavior and whose occurrence times of the drag play behavior are respectively the same.
Preferably, the behavior data may further include play behavior data. As shown in fig. 6, the audio/video file analysis device provided by the present invention may further include an adjustment module 405, where the adjustment module 405 is configured to determine the playing time periods of the audio/video file according to the play behavior data, and to adjust the load of the audio/video server according to the playing time periods of each audio/video file.
It will be understood that the above embodiments are merely exemplary embodiments taken to illustrate the principles of the present invention, which is not limited thereto. It will be apparent to those skilled in the art that various modifications and improvements can be made without departing from the spirit and substance of the invention, and these modifications and improvements are also considered to be within the scope of the invention.

Claims (8)

1. An audio/video file analysis method is characterized by comprising the following steps:
acquiring access data of the audio and video file according to a log of the audio and video file to be analyzed;
determining behavior data according to the access data, wherein the behavior data is operation data of the user on the audio and video file;
determining behavior attributes of the behavior data;
processing the behavior data according to the behavior attribute;
determining the difficulty of the content of the audio and video file and/or the interest of a user in the audio and video file according to the behavior data;
the behavior data includes drag play behavior data, behavior attributes of the drag play behavior data include play time before occurrence of the drag play behavior, play time after occurrence of the drag play behavior, an executor of the drag play behavior, and occurrence time of the drag play behavior, and the processing of the behavior data according to the behavior attributes includes:
determining drag play behavior data to be processed from each drag play behavior data of the audio and video file to be analyzed, wherein the play time before the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time after the drag play behavior of other data in the drag play behavior data to be processed occurs, or the play time after the drag play behavior of any data in the drag play behavior data to be processed occurs is the same as the play time before the drag play behavior of other data in the drag play behavior data to be processed occurs;
determining the executor of the drag play behavior and the occurrence time of the drag play behavior of the drag play behavior data to be processed;
and merging, according to the play time before the drag play behavior occurs and the play time after the drag play behavior occurs, the drag play behavior data to be processed whose executors of the drag play behavior and whose occurrence times of the drag play behavior are respectively the same.
2. The method of claim 1, wherein the behavior data further comprises variable-speed playback behavior data, behavior attributes of the variable-speed playback behavior data comprising a playback speed before the variable-speed playback behavior occurs and a playback speed after the variable-speed playback behavior occurs, and wherein processing the behavior data according to the behavior attributes comprises:
deleting the variable-speed playback behavior data in which the playback speed before the variable-speed playback behavior occurs is the same as the playback speed after the variable-speed playback behavior occurs.
3. The method according to claim 1, wherein the behavior data includes drag play behavior data, behavior attributes of the drag play behavior data include a play time before a drag play behavior occurs and a play time after the drag play behavior occurs, and the processing the behavior data according to the behavior attributes includes:
deleting the drag play behavior data in which the play time before the drag play behavior occurs is the same as the play time after the drag play behavior occurs.
4. A method according to any of claims 1-3, wherein the behavioural data comprises play behavioural data, and after said determining behavioural data from the access data, the method further comprises:
determining the playing time period of the audio/video file to be analyzed according to the playing behavior data;
and adjusting the load of the audio and video server according to the playing time period of each audio and video file to be analyzed.
5. The device for analyzing the audio and video files is characterized by comprising an acquisition module, a first determination module and a second determination module, wherein the acquisition module is used for acquiring access data of the audio and video files according to logs of the audio and video files to be analyzed;
the first determining module is used for determining behavior data according to the access data, wherein the behavior data is operation data of a user on the audio and video file;
the second determining module is used for determining the difficulty of the content of the audio and video file and/or the interest of the user in the audio and video file according to the behavior data;
the device further comprises a processing module, wherein the processing module is used for determining the behavior attribute of the behavior data and processing the behavior data according to the behavior attribute; the behavior data comprises drag play behavior data, and the behavior attributes of the drag play behavior data comprise play time before the occurrence of the drag play behavior, play time after the occurrence of the drag play behavior, an executor of the drag play behavior and occurrence time of the drag play behavior;
the processing module is configured to determine drag play behavior data to be processed from each drag play behavior data of the audio/video file to be analyzed, where a play time before a drag play behavior of any data in the drag play behavior data to be processed occurs is the same as a play time after a drag play behavior of other data in the drag play behavior data to be processed occurs, or a play time after a drag play behavior of any data in the drag play behavior data to be processed occurs is the same as a play time before a drag play behavior of other data in the drag play behavior data to be processed occurs; determine the executor of the drag play behavior and the occurrence time of the drag play behavior of the drag play behavior data to be processed; and merge, according to the play time before the drag play behavior occurs and the play time after the drag play behavior occurs, the drag play behavior data to be processed whose executors of the drag play behavior and whose occurrence times of the drag play behavior are respectively the same.
6. The apparatus of claim 5, wherein the behavior data comprises variable speed playback behavior data, behavior attributes of which include a playback speed before the variable speed playback behavior occurs and a playback speed after the variable speed playback behavior occurs;
the processing module is used for deleting the variable speed playback behavior data in which the playback speed before the variable speed playback behavior occurs is the same as the playback speed after the variable speed playback behavior occurs.
7. The apparatus according to claim 5, wherein the behavior data includes drag play behavior data, and the behavior attribute of the drag play behavior data includes a play time before the occurrence of the drag play behavior and a play time after the occurrence of the drag play behavior;
the processing module is used for deleting the drag play behavior data in which the play time before the drag play behavior occurs is the same as the play time after the drag play behavior occurs.
8. The apparatus of any of claims 5-7, wherein the behavior data comprises play behavior data, the apparatus further comprising an adjustment module; the adjusting module is used for determining the playing time period of the audio and video files to be analyzed according to the playing behavior data and adjusting the load of the audio and video server according to the playing time period of each audio and video file to be analyzed.
CN201911159203.4A 2019-11-22 2019-11-22 Audio and video file analysis method and device Active CN110971976B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911159203.4A CN110971976B (en) 2019-11-22 2019-11-22 Audio and video file analysis method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911159203.4A CN110971976B (en) 2019-11-22 2019-11-22 Audio and video file analysis method and device

Publications (2)

Publication Number Publication Date
CN110971976A CN110971976A (en) 2020-04-07
CN110971976B true CN110971976B (en) 2021-08-27

Family

ID=70031297

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911159203.4A Active CN110971976B (en) 2019-11-22 2019-11-22 Audio and video file analysis method and device

Country Status (1)

Country Link
CN (1) CN110971976B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102868936A (en) * 2012-09-06 2013-01-09 北京邮电大学 Method and system for storing video logs
CN104581400A (en) * 2015-02-10 2015-04-29 飞狐信息技术(天津)有限公司 Video content processing method and video content processing device
CN106503044A (en) * 2016-09-21 2017-03-15 北京小米移动软件有限公司 Interest characteristics distribution acquiring method and device
WO2017124116A1 (en) * 2016-01-15 2017-07-20 Bao Sheng Searching, supplementing and navigating media

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8687023B2 (en) * 2011-08-02 2014-04-01 Microsoft Corporation Cross-slide gesture to select and rearrange
CN105828100A (en) * 2016-03-21 2016-08-03 乐视网信息技术(北京)股份有限公司 Audio and video files simultaneous playing method, device and system
CN107743248A (en) * 2017-09-28 2018-02-27 北京奇艺世纪科技有限公司 A kind of video fast forward method and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102868936A (en) * 2012-09-06 2013-01-09 北京邮电大学 Method and system for storing video logs
CN104581400A (en) * 2015-02-10 2015-04-29 飞狐信息技术(天津)有限公司 Video content processing method and video content processing device
WO2017124116A1 (en) * 2016-01-15 2017-07-20 Bao Sheng Searching, supplementing and navigating media
CN106503044A (en) * 2016-09-21 2017-03-15 北京小米移动软件有限公司 Interest characteristics distribution acquiring method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Key points of instructional video design in xMOOCs: a case-based video analysis study (xMOOCs中的教学视频设计要点：基于案例的视频分析研究); 李秋菊 et al.; Journal of Distance Education (《远程教育杂志》); 2014-11-21 (No. 06); pp. 95-102 *

Also Published As

Publication number Publication date
CN110971976A (en) 2020-04-07

Similar Documents

Publication Publication Date Title
Li et al. How do in-video interactions reflect perceived video difficulty?
US20200380989A1 (en) Electronic transcription job market
US8099459B2 (en) Content feedback for authors of web syndications
US20230119753A1 (en) Parameters for generating an automated survey
US20090012965A1 (en) Network Content Objection Handling System and Method
JP6364424B2 (en) Method and system for displaying contextually relevant information about media assets
CN108595492B (en) Content pushing method and device, storage medium and electronic device
CN110941738B (en) Recommendation method and device, electronic equipment and computer-readable storage medium
US20140137144A1 (en) System and method for measuring and analyzing audience reactions to video
CN109996122B (en) Video recommendation method and device, server and storage medium
RU2583764C1 (en) Method of processing request for user to access web resource and server
KR20120081554A (en) Data highlighting and extraction
Chen et al. A study of user behavior in online VoD services
CN104486649A (en) Video content rating method and device
Heerwegh Internet survey paradata
CN104081386A (en) Content evaluation/playback device
CN107810638A (en) By the transmission for skipping redundancy fragment optimization order content
CN203206260U (en) System and method for flow analysis
CN110971976B (en) Audio and video file analysis method and device
CN110569425A (en) generating customized learning paths
CN110139160B (en) Prediction system and method
Trippas et al. Crowdsourcing user preferences and query judgments for speech-only search
CN114219346A (en) Method and system for improving service quality of network learning environment
WO2020117806A1 (en) Methods and systems for generating curated playlists
US20220138257A1 (en) System and method for journey recording

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant