CN111711854A - Movie analysis and integration method based on user attention - Google Patents
- Publication number: CN111711854A
- Application number: CN202010459420.1A
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; no legal analysis has been performed)
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44222—Analytics of user selections, e.g. selection of programs or purchase activity
- H04N21/4667—Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections
Abstract
A movie analysis and integration method based on user attention comprises the following steps: acquiring file information of a movie file, the file information comprising the name, description information and content of the movie; extracting playing information of the movie file, the playing information comprising watching data, fast-forward data and backward data; acquiring the start and end nodes of all fast-forward operations on the movie file, matching the fast-forward start and end nodes, and calculating the fast-forward time periods to obtain an average fast-forward period; acquiring the start and end nodes of all backtracking operations, matching them, and calculating the backtracking time periods to obtain an average backtracking period; acquiring the start and end nodes of all slow-play operations, matching them, and calculating the slow-play time periods to obtain an average slow-play period; and integrating the average backtracking period and the average slow-play period of the movie file while eliminating the average fast-forward period, to generate the clipped film.
Description
Technical Field
The invention relates to the field of video media, and in particular to a movie analysis and integration method based on user attention.
Background
Movie clipping is the decomposition and recombination of film footage and sound material: from the large amount of material shot during production, segments are selected, broken down and assembled into a coherent, fluent work with clear meaning, a distinct theme and artistic appeal.
Because film and television works are long, time-pressed viewers often want to watch only the highlights. Existing selections, however, are cut according to the editor's preferences rather than the viewers' actual preferences. The viewers' real impression of a film therefore needs to be judged from the operations they perform while watching it, and the film cut accordingly.
Disclosure of Invention
The purpose of the invention is as follows:
In view of the technical problems mentioned in the background, the invention provides a movie analysis and integration method based on user attention.
The technical scheme is as follows:
a movie analysis and integration method based on user attention comprises the following steps:
acquiring file information of a movie file, wherein the file information comprises a name, description information and content of a movie;
extracting playing information of a movie file, wherein the playing information comprises watching data, fast forward data and backward data;
acquiring starting nodes and ending nodes of all fast forward operations of the movie file;
matching a starting node and an ending node of fast forward;
calculating a fast forward time period to obtain an average fast forward time period;
acquiring starting nodes and ending nodes of all backtracking operations of the movie file;
matching a backtracking starting node and an ending node;
calculating a backtracking time period to obtain an average backtracking time period;
acquiring starting nodes and ending nodes of all slow playing operations of the movie file;
matching a slow playing starting node with a slow playing ending node;
calculating a slow-play time period to obtain an average slow-play time period;
and integrating the average backtracking time period and the average slow-play time period of the movie file, and eliminating the average fast-forward time period, to generate the clipped film.
As a preferred mode of the present invention, the calculating of the fast-forwarding time period includes the steps of:
acquiring all starting nodes of the fast forward operation, and acquiring starting average time points of the starting nodes on a time axis;
acquiring all end nodes of the fast forward operation, and acquiring the average end time point of the end nodes on a time axis;
integrating the starting average time point and the ending average time point generates an average fast forward period.
As a preferred mode of the present invention, the calculating of the backtracking time period includes the following steps:
acquiring all starting nodes of backtracking operation, and acquiring starting average time points of the starting nodes on a time axis;
acquiring all end nodes of backtracking operation, and acquiring an end average time point of the end nodes on a time axis;
integrating the starting average time point and the ending average time point generates an average backtracking period.
As a preferred mode of the present invention, the calculation of the slow-play time period includes the steps of:
acquiring all starting nodes of slow play operation, and acquiring starting average time points of the starting nodes on a time axis;
acquiring all end nodes of the slow playing operation, and acquiring an average end time point of the end nodes on a time axis;
integrating the starting average time point and the ending average time point generates an average slow playing period.
As a preferred mode of the present invention, the method further comprises the following steps:
acquiring keywords of a film file, and capturing network related information according to the keywords of the film file;
extracting related videos in the network related information;
comparing the time axes of the related videos and the film files;
intercepting the portion of the time axis with the greatest overlap;
taking the movie file content corresponding to that portion of the time axis as a fine-selection time period;
integrating the fine-selection period with the average backtracking period and the average slow-play period.
As a preferred mode of the present invention, the method further comprises the following steps:
acquiring complete playing data of the movie file on a network;
acquiring playing information of movie files whose playing amount exceeds a preset number threshold, wherein the playing information further comprises an interaction amount;
the interaction amount comprises the users who interacted and the time axis at which the interactions occurred;
acquiring the time period on the time axis with the greatest interaction amount as the interaction time period;
and integrating the interaction period with the average backtracking period and the average slow playing period.
As a preferred mode of the present invention, acquiring the time period on the time axis with the greatest interaction amount comprises:
acquiring a curve of the interaction amount against time;
extracting the time point at which the interaction amount peaks;
and intercepting the movie file for a preset duration before and after the peak time point as the interaction time period.
As a preferred mode of the present invention, the time slot integration of the movie file includes the steps of:
acquiring a time period to be integrated;
extracting a time axis of each time interval;
the time periods are integrated in order of the time axis.
As a preferred mode of the present invention, the method further comprises the following steps:
after integration, the average slow-play period is displayed in slow motion.
The invention realizes the following beneficial effects:
The preferences of historical viewers for the film's content are judged from the operations they performed on the film, and the popular segments are extracted and integrated to generate a selected portion of the movie.
From the film's playing data on the network, interaction data are obtained; the popular parts of the film's content are judged from the interaction amount, and the segments with large interaction amounts are intercepted to generate the selected portion.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a method for analyzing and integrating movies based on user attention according to the present invention;
fig. 2 is a flowchart illustrating fast forward time period integration of a movie analysis integration method based on user attention according to the present invention;
fig. 3 is a flowchart illustrating a backtracking period integration method for a movie analysis integration method based on user attention according to the present invention;
fig. 4 is a flowchart of slow playing period integration of a movie analysis and integration method based on user attention according to the present invention.
Fig. 5 is a flowchart illustrating integration of interaction time periods of a movie analysis and integration method based on user attention according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
Example one
Reference is made to fig. 1 as an example.
A movie analysis and integration method based on user attention comprises the following steps:
and S100, acquiring file information of a movie file, wherein the file information comprises the name, the description information and the content of the movie.
The name, synopsis, file and other contents of the movie are acquired, and information is extracted from the movie file that serves as the data source for the clip.
And S110, extracting the playing information of the movie file, wherein the playing information comprises watching data, fast forward data and backward data.
The movie file serving as the data source is processed to extract the operations of historical viewers, including viewing amount, fast-forward, backtracking, speed-up, slow-play and other operation information.
And S121, acquiring the starting node and the ending node of all fast forward operations of the movie file.
The fast-forward start and end nodes of all users who watched the movie are extracted, with a single fast-forward operation as the extraction unit.
Multiple consecutive fast-forwards are extracted as a single fast-forward: the time interval between consecutive fast-forwards serves as the criterion, and consecutive fast-forwards separated by less than a certain time value, which may be 1 s, are merged into one fast-forward.
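The merging rule above can be sketched in Python; this is a minimal illustration, assuming each operation carries the start and end positions of its nodes on the playback time axis (the class and function names are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class FastForward:
    start: float  # start node position on the time axis, in seconds
    end: float    # end node position on the time axis, in seconds

def merge_operations(ops, max_gap=1.0):
    """Merge consecutive fast-forward operations separated by less than
    max_gap seconds (the 1 s criterion above) into a single operation."""
    if not ops:
        return []
    ops = sorted(ops, key=lambda o: o.start)
    merged = [FastForward(ops[0].start, ops[0].end)]
    for op in ops[1:]:
        if op.start - merged[-1].end < max_gap:
            # gap below the threshold: treat as the same fast-forward
            merged[-1].end = max(merged[-1].end, op.end)
        else:
            merged.append(FastForward(op.start, op.end))
    return merged
```

The same merging would apply per user account, so that one viewer's rapid taps on the fast-forward button count as one operation.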
And S122, matching the starting node and the ending node of the fast forward.
The fast-forward start and end nodes are matched by user account: a start node and an end node from the same user account delimit one time period.
And S123, calculating the fast forward time period to obtain an average fast forward time period.
And acquiring fast-forward time periods of all the user accounts on a time axis, and averaging to obtain an average fast-forward time period.
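The averaging step can be sketched as follows, under the assumption that each user account contributes one matched (start, end) fast-forward period on the time axis:

```python
def average_period(periods):
    """periods: one (start, end) pair per user account, in seconds on the
    film's time axis. Returns the average period: the mean of the start
    nodes paired with the mean of the end nodes."""
    starts = [s for s, _ in periods]
    ends = [e for _, e in periods]
    return (sum(starts) / len(starts), sum(ends) / len(ends))
```

The same routine would serve for the backtracking and slow-play periods in the later steps.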
And S131, acquiring the starting node and the ending node of all backtracking operations of the movie file.
And extracting backtracking starting nodes and ending nodes of all users watching the film, wherein a single backtracking operation is used as an extraction standard.
Multiple consecutive backtracks are extracted as a single backtrack: the time interval between consecutive backtracks serves as the criterion, and consecutive backtracks separated by less than a certain time value, which may be 1 s, are merged into one backtrack.
And S132, matching the backtracking starting node and the backtracking ending node.
And matching the backtracked starting node and the backtracked ending node according to the user account, wherein the backtracked starting node and the backtracked ending node of the same user account are in a time interval.
And S133, calculating a backtracking time period to obtain an average backtracking time period.
And obtaining the backtracking time periods of all the user accounts on a time axis, and averaging to obtain an average backtracking time period.
And S141, acquiring the starting node and the ending node of all slow-play operations of the movie file.
The slow-play start and end nodes of all users who watched the movie are extracted, with a single slow-play operation as the extraction unit.
And S142, matching the slow play starting node with the slow play ending node.
And matching the slow playing starting node and the slow playing ending node according to the user account, wherein the slow playing starting node and the slow playing ending node of the same user account form a time interval.
And S143, calculating the slow playing time period to obtain the average slow playing time period.
And acquiring the slow playing time periods of all the user accounts on a time axis, and averaging to obtain an average slow playing time period.
And S150, integrating the average backtracking time period and the average slow playing time period in the film file, and eliminating the average fast forwarding time period to generate a clipped film.
And extracting the average backtracking period and the average slow playing period for integration, and eliminating the fast forwarding period.
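The final step, keeping the average backtracking and slow-play periods while excising the average fast-forward period, amounts to interval subtraction; a sketch (function names are assumptions, not from the patent):

```python
def clip_segments(keep, cut):
    """keep: (start, end) periods to retain in the clipped film;
    cut: (start, end) periods to eliminate (the average fast-forward
    period). Returns the retained periods with any overlap excised."""
    result = []
    for ks, ke in keep:
        segments = [(ks, ke)]
        for cs, ce in cut:
            next_segments = []
            for s, e in segments:
                if ce <= s or cs >= e:
                    next_segments.append((s, e))  # no overlap, keep whole
                else:
                    if s < cs:
                        next_segments.append((s, cs))  # part before the cut
                    if ce < e:
                        next_segments.append((ce, e))  # part after the cut
            segments = next_segments
        result.extend(segments)
    return result
```

The surviving segments would then be rendered in time-axis order to produce the clipped film.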
Example two
Reference is made to fig. 2-4 for example.
The present embodiment is substantially the same as the first embodiment, except that, as a preferred mode of the present embodiment, the calculating the fast forward time period includes the following steps:
s201: all the start nodes of the fast forward operation are acquired, and the start average time points of the start nodes are acquired on the time axis.
The fast-forward start nodes of all users are extracted; all start nodes are averaged on the time axis to obtain the start average time point.
S202: all end nodes of the fast forward operation are acquired, and an end average time point of the end nodes is acquired on a time axis.
The fast-forward end nodes of all users are extracted; all end nodes are averaged on the time axis to obtain the end average time point.
S203: integrating the starting average time point and the ending average time point generates an average fast forward period.
As a preferred mode of this embodiment, the calculating the backtracking time period includes the following steps:
s301: all the starting nodes of the backtracking operation are acquired, and the average starting time points of the starting nodes are acquired on a time axis.
The backtracking start nodes of all users are extracted; all start nodes are averaged on the time axis to obtain the start average time point.
S302: and acquiring all end nodes of the backtracking operation, and acquiring an end average time point of the end nodes on a time axis.
The backtracking end nodes of all users are extracted; all end nodes are averaged on the time axis to obtain the end average time point.
S303: integrating the starting average time point and the ending average time point generates an average backtracking period.
As a preferable mode of this embodiment, the calculating the slow-release time period includes the following steps:
s401: all starting nodes of the slow playing operation are acquired, and the average starting time point of the starting nodes is acquired on a time axis.
The slow-play start nodes of all users are extracted; all start nodes are averaged on the time axis to obtain the start average time point.
S402: and acquiring all end nodes of the slow playing operation, and acquiring an average end time point of the end nodes on a time axis.
The slow-play end nodes of all users are extracted; all end nodes are averaged on the time axis to obtain the end average time point.
S403: integrating the starting average time point and the ending average time point generates an average slow playing period.
EXAMPLE III
The present embodiment is substantially the same as the first embodiment, except that, as a preferred mode of the present embodiment, the following steps are included:
and acquiring a film file keyword, and capturing network related information according to the film file keyword.
And capturing related information of the film on the network.
And extracting related videos in the network related information.
The videos in the related information are captured; these are clip videos, i.e. incomplete excerpts of the film.
The time axis of each related video is compared with the time axis of the movie file.
The portion of the time axis with the greatest overlap is intercepted, and the corresponding part of the film is extracted.
The movie file content corresponding to that portion of the time axis serves as the fine-selection time period.
The fine-selection period is integrated with the average backtracking period and the average slow-play period, so that the selected period is added to the clipped film.
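One way to find the most-overlapped stretch of the timeline is per-second coverage counting; a sketch, assuming each related clip has already been aligned to the film's time axis as a (start, end) range in seconds (the function name is illustrative):

```python
from collections import Counter

def most_overlapped_window(clip_ranges, resolution=1):
    """clip_ranges: (start, end) of each related network clip, mapped onto
    the film's time axis. Returns the first contiguous run of seconds
    covered by the largest number of clips, as the fine-selection period."""
    coverage = Counter()
    for s, e in clip_ranges:
        for t in range(int(s), int(e), resolution):
            coverage[t] += 1
    if not coverage:
        return None
    peak = max(coverage.values())
    peak_times = sorted(t for t, c in coverage.items() if c == peak)
    # take the first contiguous run at peak coverage
    start = end = peak_times[0]
    for t in peak_times[1:]:
        if t == end + resolution:
            end = t
        else:
            break
    return (start, end + resolution)
```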
Example four
Reference is made to fig. 5 as an example.
The present embodiment is substantially the same as the first embodiment, except that, as a preferred mode of the present embodiment, the following steps are included:
s501: and acquiring complete playing data of the movie file on the network.
S502: and acquiring the playing information of the film files with the playing amount larger than the preset number threshold, wherein the playing information also comprises the mutual amount.
The preset number threshold may be determined according to a configured filtering value.
S503: the amount of interaction includes the user at which the interaction occurred and the timeline at which the interaction occurred. The interaction comprises barrage, message leaving and comment, and a time axis of the interaction when the user account watches the film is obtained.
S504: and acquiring a time period on a time axis with the most interaction amount as an interaction time period.
S505: and integrating the interaction period with the average backtracking period and the average slow playing period.
As a preferable mode of the present embodiment, acquiring the time period on the time axis with the greatest interaction amount comprises the following steps:
A curve of the interaction amount against time is acquired.
The time point at which the interaction amount peaks is extracted.
The movie file is intercepted for a preset duration before and after the peak time point as the interaction time period.
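The peak-interaction extraction can be sketched as follows, assuming the interactions are given as timestamps on the film's time axis and the "preset time" is a fixed window either side of the peak (names and the 30 s default are illustrative):

```python
from collections import Counter

def interaction_period(timestamps, window=30.0):
    """timestamps: seconds into the film at which interactions
    (bullet-screen comments, messages, comments) occurred. Builds the
    interaction-vs-time histogram, finds its peak, and returns the window
    of `window` seconds before and after the peak time point."""
    counts = Counter(int(t) for t in timestamps)
    peak_t = max(counts, key=counts.get)
    return (max(0, peak_t - window), peak_t + window)
```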
EXAMPLE five
The present embodiment is substantially the same as the first embodiment, except that, as a preferred mode of the present embodiment, the time period integration of the movie file includes the following steps:
the time period to be integrated is acquired.
The time axis of each period is extracted.
The time periods are integrated in order of the time axis.
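The steps above, integrating the selected periods in time-axis order, can be sketched as a sort-and-merge over (start, end) pairs (the function name is illustrative):

```python
def integrate_periods(periods):
    """Sort the selected periods by their position on the time axis and
    merge any that overlap, so the clipped film plays them in the
    original order without repetition."""
    merged = []
    for s, e in sorted(periods):
        if merged and s <= merged[-1][1]:
            # overlaps the previous period: extend it
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged
```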
As a preferable mode of the present embodiment, the method includes the steps of:
the average slow release period is subjected to slow release display after integration.
The above embodiments are merely illustrative of the technical ideas and features of the present invention, and are intended to enable those skilled in the art to understand the contents of the present invention and implement the present invention, and not to limit the scope of the present invention. All equivalent changes or modifications made according to the spirit of the present invention should be covered within the protection scope of the present invention.
Claims (9)
1. A movie analysis and integration method based on user attention is characterized by comprising the following steps:
acquiring file information of a movie file, wherein the file information comprises a name, description information and content of a movie;
extracting playing information of a movie file, wherein the playing information comprises watching data, fast forward data and backward data;
acquiring starting nodes and ending nodes of all fast forward operations of the movie file;
matching a starting node and an ending node of fast forward;
calculating a fast forward time period to obtain an average fast forward time period;
acquiring starting nodes and ending nodes of all backtracking operations of the movie file;
matching a backtracking starting node and an ending node;
calculating a backtracking time period to obtain an average backtracking time period;
acquiring starting nodes and ending nodes of all slow playing operations of the movie file;
matching a slow playing starting node with a slow playing ending node;
calculating a slow-play time period to obtain an average slow-play time period;
and integrating the average backtracking time period and the average slow playing time period in the film file, and eliminating the average fast forward time period to generate the clipped film.
2. A method for movie analysis and integration based on user attention as claimed in claim 1, wherein the step of calculating the fast forward time period comprises the steps of:
acquiring all starting nodes of the fast forward operation, and acquiring starting average time points of the starting nodes on a time axis;
acquiring all end nodes of the fast forward operation, and acquiring the average end time point of the end nodes on a time axis;
integrating the starting average time point and the ending average time point generates an average fast forward period.
3. The method as claimed in claim 1, wherein the step of calculating the time period of backtracking comprises the steps of:
acquiring all starting nodes of backtracking operation, and acquiring starting average time points of the starting nodes on a time axis;
acquiring all end nodes of backtracking operation, and acquiring an end average time point of the end nodes on a time axis;
integrating the starting average time point and the ending average time point generates an average backtracking period.
4. A method for movie analysis and integration based on user attention as claimed in claim 1, wherein the step of calculating the slow-play time period comprises the steps of:
acquiring all starting nodes of slow play operation, and acquiring starting average time points of the starting nodes on a time axis;
acquiring all end nodes of the slow playing operation, and acquiring an average end time point of the end nodes on a time axis;
integrating the starting average time point and the ending average time point generates an average slow playing period.
5. A method for analyzing and integrating movies based on user attention as claimed in claim 1, further comprising the steps of:
acquiring keywords of a film file, and capturing network related information according to the keywords of the film file;
extracting related videos in the network related information;
comparing the time axes of the related videos and the film files;
intercepting the portion of the time axis with the greatest overlap;
taking the movie file content corresponding to that portion of the time axis as a fine-selection time period;
integrating the fine-selection period with the average backtracking period and the average slow-play period.
6. A method for analyzing and integrating movies based on user attention as claimed in claim 1, further comprising the steps of:
acquiring complete playing data of the movie file on a network;
acquiring playing information of movie files whose playing amount exceeds a preset number threshold, wherein the playing information further comprises an interaction amount;
the interaction amount comprises the users who interacted and the time axis at which the interactions occurred;
acquiring a time period on a time axis with the most interaction amount as an interaction time period;
and integrating the interaction period with the average backtracking period and the average slow playing period.
7. The method as claimed in claim 6, wherein the step of obtaining the time segment on the time axis where the amount of interaction occurs the most includes the steps of:
acquiring a curve of the interaction amount against time;
extracting the time point at which the interaction amount peaks;
and intercepting the movie file for a preset duration before and after the peak time point as the interaction time period.
8. A method for analyzing and integrating movies based on user attention as claimed in claim 1, wherein the time-interval integration of movie files comprises the following steps:
acquiring a time period to be integrated;
extracting a time axis of each time interval;
the time periods are integrated in order of the time axis.
9. The method as claimed in claim 8, further comprising the following steps:
after integration, the average slow-play period is displayed in slow motion.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010459420.1A CN111711854B (en) | 2020-05-27 | 2020-05-27 | Film analysis and integration method based on user attention |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111711854A true CN111711854A (en) | 2020-09-25 |
CN111711854B CN111711854B (en) | 2023-01-06 |
Family
ID=72538238
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010459420.1A Active CN111711854B (en) | 2020-05-27 | 2020-05-27 | Film analysis and integration method based on user attention |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111711854B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080317433A1 (en) * | 2005-12-28 | 2008-12-25 | Sony Corporation | Playback Apparatus, Playback Method, Storage Apparatus, Storage Method, Program, Storage Medium, Data Structure, and Method of Producing a Storage Medium |
CN104410920A (en) * | 2014-12-31 | 2015-03-11 | 合一网络技术(北京)有限公司 | Video segment playback amount-based method for labeling highlights |
CN104796781A (en) * | 2015-03-31 | 2015-07-22 | 小米科技有限责任公司 | Video clip extraction method and device |
CN108093297A (en) * | 2017-12-29 | 2018-05-29 | 厦门大学 | A kind of method and system of filmstrip automatic collection |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
20221207 | TA01 | Transfer of patent application right | Address after: No. 803-1, Building 3, Baoneng Times Bay, No. 1, Shangang Road, East Coast New City, Shantou City, Guangdong Province, 515000; Applicant after: Shantou Daogu Culture Media Co.,Ltd. Address before: Room 437, building 1, No.503, Xingguo Road, economic and Technological Development Zone, Yuhang District, Hangzhou, Zhejiang 311100; Applicant before: Hangzhou cloud cultural creativity Co.,Ltd.
| GR01 | Patent grant | |