CN115048010A - Method, device, equipment and medium for displaying audiovisual works - Google Patents
Method, device, equipment and medium for displaying audiovisual works
- Publication number
- CN115048010A (application number CN202110250896.9A)
- Authority
- CN
- China
- Prior art keywords
- plot
- map
- route
- scenario
- work
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Databases & Information Systems (AREA)
- Remote Sensing (AREA)
- Data Mining & Analysis (AREA)
- Controls And Circuits For Display Device (AREA)
- Navigation (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses a method, a device, equipment and a storage medium for presenting audiovisual works, belonging to the field of multimedia playing. The method comprises the following steps: displaying a map control of the audiovisual work, the map control displaying a map associated with the plot of the audiovisual work; in response to a plot route selection operation on the map, determining map marking points on the plot route; and presenting the plot segments of the audiovisual work corresponding to the plot route. With the method and device, a user can choose among different plot directions based on the plot routes on the map during interaction, and the presentation can be switched among the video clips corresponding to the different plot routes.
Description
Technical Field
The embodiment of the application relates to the field of multimedia playing, in particular to a method, a device, equipment and a medium for displaying audiovisual works.
Background
Audiovisual works, such as television shows, novels, comics, and the like, are the most widely used multimedia content on the internet.
In the related art, in order to improve the user experience and allow users to participate more in the plot development of an audiovisual work, an interactive segment is often inserted after the story background has been established, and the work jumps to different plot routes according to the user's selection, leading to different plot endings. Taking a television drama as an example, at a certain key node the user can make a choice for a character based on his or her own judgment and then watch the drama that unfolds from that choice.
For audiovisual works with complex geography, such as adventure or puzzle-solving series, or works that require reasoning, it is often difficult for the user to picture the geographic locations and predict how the plot will develop when making an interactive selection in this way. As a result, the selection made in the interactive segment may not match the user's own expectation, and the video has to be watched again from the beginning in order to make a different selection, which is inefficient.
Disclosure of Invention
The application provides a method, a device, equipment and a storage medium for presenting audiovisual works, so that a user can select different plot directions based on plot routes on a map and switch the presentation among different plot routes. The technical solution is as follows:
According to an aspect of the present application, there is provided a method of presenting an audiovisual work, the method comprising:
displaying a map control of the audiovisual work, the map control displaying a map associated with the plot of the audiovisual work and at least two plot routes;
in response to a selection operation on one of the at least two plot routes on the map, determining a map marking point on the selected plot route;
and presenting the plot segments of the audiovisual work corresponding to the map marking point.
According to another aspect of the present application, there is provided an apparatus for presenting an audiovisual work, the apparatus comprising:
a display module for displaying a map control of the audiovisual work, the map control displaying a map associated with a scenario of the audiovisual work and at least two scenario routes;
the interactive module is used for responding to plot route selection operation on the map and determining a selected plot route on the map;
and the display module is used for displaying the plot sections corresponding to the plot routes in the audiovisual works.
In an alternative design of the present application, a place marking control is displayed on the map control;
the display module is used for responding to a first movement operation of moving the location marking control to a first plot route of the at least two plot routes, and displaying plot sections corresponding to the first plot route on the work display page.
In an optional design of the present application, the interaction module is configured to determine, in response to a first moving operation of moving the location marking control to a first scenario route of the at least two scenario routes, a first map marking point indicated by the location marking control on the first scenario route;
the display module is used for displaying a first plot segment which corresponds to the first plot route and occurs on the first map mark point on the work display page.
In an alternative design of the present application, the apparatus further includes:
the interaction module, which is used for responding to a second movement operation of moving the place marking control from the first map marking point on the first plot route, and determining a second place marking point indicated by the place marking control on the first plot route;
and the display module, which is used for displaying, on the work display page, a second plot segment corresponding to the first plot route and occurring at the second place marking point.
In an optional design of the present application, the presentation module is configured to display a fast-forward animation in the case that the first plot segment is earlier than the second plot segment, and to display a fast-rewind animation in the case that the first plot segment is later than the second plot segment.
In an optional design of the present application, the presentation module is configured to present, in response to a third selection operation on a second scenario route of the at least two scenario routes, a scenario segment corresponding to the second scenario route on the work presentation page.
In an optional design of the present application, the display module is configured to display a plot element selection control of the audiovisual work;
the interaction module is used for responding to element selection operation on the plot element selection control and determining the selected first plot element;
the display module is used for responding to selection operation on a first plot route in the at least two plot routes and displaying plot fragments corresponding to the first plot route and the first plot elements on the work display page.
In an optional design of the present application, the interaction module is configured to determine a selected second scenario element in response to an element selection operation on the scenario element selection control;
the display module is used for displaying the plot sections corresponding to the first plot route and the second plot element in the audiovisual work in a switching mode on the work display page.
In an optional design of the present application, the display module is configured to highlight, on the map control, the map point corresponding to the plot segment being played; or to highlight, on the map control, the map point corresponding to the target plot character currently being presented;
wherein the highlighting comprises at least one of: bold display, enlarged display, inverted-color display, a foreground color change, a background color change, and an added animation effect.
In an optional design of the present application, the display module is configured to display a map control of the audiovisual work in response to a first trigger condition being met; the first trigger condition includes any one of the following conditions:
playing to the designated progress on the playing progress bar;
receiving a start presentation operation of the audiovisual work;
receiving a dragging operation or a skipping operation on a playing progress bar of the audiovisual work;
receiving the dragging operation on a playing progress bar of the audiovisual work, wherein the dragging distance of the dragging operation is greater than a first threshold value;
receiving the jump operation on a playing progress bar of the audiovisual work, wherein the operation times of the jump operation is greater than a second threshold value;
receiving a fast forward operation or a fast rewind operation of the audiovisual work;
an open operation or a jump operation or a forward operation or a backward operation of a directory of the audiovisual work is received.
In an optional design of the present application, the display module is configured to cancel displaying the map control of the audiovisual work in response to a second trigger condition being met; the second trigger condition includes any one of the following conditions:
receiving a closing operation for closing the map control;
the duration of time for which no operation is received on the map control is greater than a third threshold;
starting to play the plot segments;
and receiving human-computer interaction operation on other interaction areas except the map control.
According to an aspect of the present application, there is provided a computer device including: a processor and a memory, said memory storing a computer program that is loaded and executed by said processor to implement the method of presenting an audiovisual work as described above.
According to an aspect of the present application, there is provided a computer readable storage medium storing a computer program which is loaded and executed by a processor to implement the method of presenting an audiovisual work as described above.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The computer instructions are read by a processor of a computer device from a computer-readable storage medium, and the computer instructions are executed by the processor to cause the computer device to perform the methods provided in the various alternative implementations of the audiovisual work presentation aspect described above.
The beneficial effects brought by the technical solutions provided in the embodiments of the application at least include:
By displaying a map associated with the plot of the audiovisual work and, in response to the plot route selected by the user on the map, presenting the plot segments of the audiovisual work corresponding to that plot route, a map-based interactive presentation scheme is provided. The user can switch the presentation among different plot routes on the map, which offers a more efficient human-computer interaction scheme for audiovisual works that involve exploration and puzzle-solving, complex geographic locations, and strong interactivity.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 illustrates a block diagram of a computer system provided by an exemplary embodiment;
FIG. 2 illustrates a flow chart of a method of presenting an audiovisual work provided by an exemplary embodiment;
FIG. 3 illustrates an interface diagram of a presentation method of an audiovisual work provided by an exemplary embodiment;
FIG. 4 illustrates a flow chart of a method of presenting an audiovisual work provided by an exemplary embodiment;
FIG. 5 illustrates an interface diagram of a method for presenting an audiovisual work provided by an exemplary embodiment;
FIG. 6 illustrates a flow chart of a method of presenting an audiovisual work provided by an exemplary embodiment;
FIG. 7 illustrates an interface diagram of a presentation method of an audiovisual work provided by an exemplary embodiment;
FIG. 8 illustrates an interface diagram of a presentation method of an audiovisual work provided by an exemplary embodiment;
FIG. 9 illustrates an interface diagram of a presentation method of an audiovisual work provided by an exemplary embodiment;
FIG. 10 illustrates a flow chart of a method of presenting an audiovisual work provided by an exemplary embodiment;
FIG. 11 illustrates an interface diagram of a presentation method of an audiovisual work provided by an exemplary embodiment;
FIG. 12 shows a block diagram of a presentation apparatus for an audiovisual work provided by an exemplary embodiment;
FIG. 13 illustrates a block diagram of a computer device, which is provided in an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatuses and methods consistent with certain aspects of the application, as detailed in the appended claims.
It is to be understood that reference herein to "a number" means one or more and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate that A exists alone, that A and B exist simultaneously, or that B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
FIG. 1 is a block diagram illustrating a computer system provided in an exemplary embodiment of the present application. The computer system 100 includes: a terminal 120 and a server 140.
The terminal 120 is equipped with and runs a client for presenting audiovisual works, which may be an application or a web client. Taking the client as an application program as an example, the application program may be any one of a video playing program, a novel reading program, a cartoon reading program, and an audio playing program. In this embodiment, the application program is a video playing program for example.
The terminal 120 is connected to the server 140 through a wireless network or a wired network.
The server 140 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 140 is used to provide background services for clients. Optionally, the server 140 undertakes primary computational work and the terminals undertake secondary computational work; or, the server 140 undertakes the secondary computing work, and the terminal undertakes the primary computing work; alternatively, the server 140 and the terminal perform cooperative computing by using a distributed computing architecture.
With reference to the above description of the implementation environment, the method for presenting audiovisual works provided in the embodiment of the present application is described, and an execution subject of the method is illustrated as a client running on the terminal shown in fig. 1.
FIG. 2 illustrates a flow chart of a method of presentation of an audiovisual work shown in an exemplary embodiment of the present application. The embodiment is exemplified by the method being performed by the terminal 120 (or a client within the terminal 120) shown in fig. 1. The method comprises the following steps:
step 220: displaying a work presentation page of an audiovisual work;
audiovisual works are works containing scenarios. The manner of presentation of the audiovisual work includes at least one of visual display and auditory playback.
An audiovisual work includes: at least one of video, audio, literature, and caricatures. Such as movies, television shows, temperaments, novels, biographies, audio books, animations, and the like.
Step 240: displaying a map control of the audiovisual work, the map control displaying a map associated with the plot of the audiovisual work, with at least two plot routes displayed on the map;
The map of the audiovisual work is related to its plot; for example, the map may be a map of the areas in which the characters of the audiovisual work are active, or a map related to the world view constructed in the audiovisual work.
Illustratively, the map may be a two-dimensional map or a three-dimensional map, and may be a natural map, a social map, a humanistic map, an economic map, a historical map, a cultural map, a traffic map, an aeronautical chart, an astronautical chart, a navigation chart, or a tourist map.
In one embodiment, the map control of the audiovisual work is initially displayed when the presentation of the audiovisual work reaches a plot key point. A plot key point is a point in time at which at least two plot directions begin to appear.
The map control is a full screen control, a window control, or an element control on the interface. Illustratively, a map control is a control displayed on a localized area of a presentation interface of an audiovisual work. A plurality of candidate storyline routes may be displayed on the map control. Each candidate storyline route is a storyline route that appears in the storyline.
Step 260: in response to a selection operation on a first plot route of the at least two plot routes, displaying, on the work presentation page, the plot segments corresponding to the first plot route.
When the audiovisual work is video or audio, the terminal plays the plot segments of the audiovisual work corresponding to the plot route selected by the user; when the audiovisual work is a literary work, the terminal displays the text segments of the audiovisual work corresponding to the plot route selected by the user, such as chapters, sections, pages, and paragraphs; and when the audiovisual work is a comic work, the terminal displays the image-text segments of the audiovisual work corresponding to the plot route selected by the user.
As shown in fig. 3, taking a video as an example of the audiovisual work, a video A is being played on the terminal. When video A is played to a plot key point, a map control 20 is displayed on the playing interface of video A, and three plot routes are displayed on the map control 20: route 1 (coastal), route 2 (mountain road), and route 3 (jungle). By clicking on the map control 20, the user can select one of the plot routes 22, and the terminal jumps to play the video clip 24 corresponding to the selected plot route 22.
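The following is a minimal sketch, in TypeScript, of the interaction illustrated in fig. 3. The data shape, route identifiers, clip start times, and player interface are assumptions made for illustration and are not taken from the patent.

```typescript
// Illustrative data model (assumed): each candidate plot route shown on the
// map control points at the start of a video clip.
interface PlotRoute {
  id: string;
  label: string;
  clipStartSec: number; // where the clip for this route begins
}

const candidateRoutes: PlotRoute[] = [
  { id: "route1", label: "coastal", clipStartSec: 15 * 60 },
  { id: "route2", label: "mountain road", clipStartSec: 40 * 60 },
  { id: "route3", label: "jungle", clipStartSec: 65 * 60 },
];

// Called when the user taps a plot route 22 on the map control 20:
// jump playback to the video clip 24 for that route.
function onRouteSelected(routeId: string, player: { seek(t: number): void }): void {
  const route = candidateRoutes.find(r => r.id === routeId);
  if (route) player.seek(route.clipStartSec);
}
```

In practice, the candidate route list would typically be delivered by the backend together with the map assets for the current plot key point.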
In summary, the method provided by this embodiment displays a map associated with the plot of the audiovisual work and, in response to the plot route selected by the user on the map, presents the plot segments of the audiovisual work corresponding to that plot route. This provides a map-based interactive presentation scheme in which the user can switch the presentation among different plot routes on the map, and offers a more efficient human-computer interaction scheme for audiovisual works whose plots involve complex geographic locations, require deep user immersion, and are strongly interactive.
FIG. 4 illustrates a flow chart of a method of presentation of an audiovisual work illustrated in an exemplary embodiment of the present application. The embodiment is illustrated by the method being performed by the terminal 120 (or a client in the terminal 120) shown in fig. 1. The method comprises the following steps:
step 242: in response to a first trigger condition being met, displaying a map control of the audiovisual work, the map control displaying a map associated with a scenario of the audiovisual work, the map displaying at least two scenario routes and location marking controls;
the first trigger condition includes, but is not limited to, at least one of the following conditions:
a specified progress of playing onto the play progress bar.
The specified progress is the point at which the plot has developed to a key point. In the case that the audiovisual work is video or audio, if the audiovisual work has been played to a plot key point and the user needs to make a plot direction selection, the map control of the audiovisual work is displayed to help the user anticipate the plot and select a plot route.
Receiving a start presentation operation on the audiovisual work;
The start presentation operation includes: an operation of starting to play the audiovisual work, an operation of starting to listen to the audiovisual work, or an operation of starting to display the audiovisual work. In response to the start presentation operation, the terminal starts presenting the audiovisual work and displays the map control of the audiovisual work.
Receiving a drag operation or a jump operation on the play progress bar of the audiovisual work;
In the case that the audiovisual work is video or audio, if a drag operation or a jump operation on the play progress bar of the audiovisual work is received, it is predicted with a certain probability that the user intends to jump within the playback based on the map control.
Receiving a drag operation on the play progress bar of the audiovisual work, the drag distance of the drag operation being greater than a first threshold;
In the case that the audiovisual work is video or audio, if a large drag operation on the play progress bar of the audiovisual work is received, it is predicted with a certain probability that the user intends to jump within the playback based on the map control. The first threshold may be, for example, 10%, 20%, or 30% of the total length of the play progress bar.
Receiving a jump operation on the play progress bar of the audiovisual work, the number of jump operations being greater than a second threshold;
In the case that the audiovisual work is video or audio, if multiple jump operations on the play progress bar of the audiovisual work are received, it is predicted with a certain probability that the user intends to jump within the playback based on the map control. The second threshold may be, for example, 3 or 5 operations.
Receiving a fast-forward operation or a fast-rewind operation on the audiovisual work;
In the case that the audiovisual work is video or audio, if a fast-forward operation or a fast-rewind operation on the play progress bar of the audiovisual work is received, it is predicted with a certain probability that the user intends to jump within the playback based on the map control.
Receiving an open operation, a jump operation, a forward operation, or a backward operation on a directory of the audiovisual work.
In the case that the audiovisual work is a literary work or a comic work, if an open operation, a jump operation, a forward operation, or a backward operation on the directory of the audiovisual work is received, it is predicted with a certain probability that the user intends to jump within the work based on the map control.
In response to at least one of the above conditions being met, the terminal displays a map control of the audiovisual work.
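Purely as an illustration, the first trigger condition could be evaluated along the following lines; the field names and the example thresholds are assumptions, since the embodiment leaves the concrete values open.

```typescript
// Illustrative only: deciding whether to show the map control based on recent
// interaction with the work. Field names and thresholds are assumed.
interface PlaybackActivity {
  reachedKeyPoint: boolean;      // played to the designated progress
  startedPresentation: boolean;  // start presentation operation received
  dragDistanceRatio: number;     // drag distance / total progress bar length
  jumpCount: number;             // number of jump operations received
  fastForwardOrRewind: boolean;  // fast-forward or fast-rewind received
  directoryOperation: boolean;   // directory open/jump/forward/backward received
}

const FIRST_THRESHOLD = 0.2; // e.g. 20% of the progress bar
const SECOND_THRESHOLD = 3;  // e.g. 3 jump operations

function shouldShowMapControl(a: PlaybackActivity): boolean {
  return (
    a.reachedKeyPoint ||
    a.startedPresentation ||
    a.dragDistanceRatio > FIRST_THRESHOLD ||
    a.jumpCount > SECOND_THRESHOLD ||
    a.fastForwardOrRewind ||
    a.directoryOperation
  );
}
```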
Illustratively, the map control displays a map associated with the plot of the audiovisual work, at least two plot routes corresponding to different plot developments, and a location marking control. The location marking control is a control for selecting a map marking point; for example, the location marking control may be a map flag, a floating water droplet, a triangular indicator, a floating cube or cone, or the like.
Illustratively, the map control displays one or more maps. When a plurality of maps need to be displayed, a user can slide left and right or up and down on the map control, and the terminal switches different map displays in response to the sliding operation on the map control. Or the terminal simultaneously displays a plurality of candidate maps and responds to the confirmation operation of the user to display the selected target map.
Step 262: in response to a first movement operation of moving the location marking control to a first plot route in the plot routes, determining a first map marking point indicated by the location marking control on the first plot route;
the first storyline route is one of at least two alternative storyline routes.
The first moving operation is an operation for selecting a scenario route on at least two scenario route selection controls. The first moving operation may be at least one of a click operation, a drag operation, a double click operation, a voice operation, a pressure touch operation, an eye control, and a body control, which is not limited in the present application.
The first map marker point is a map marker point on the first storyline route.
As shown in fig. 3, route 1 (coastal) among the plot routes is the currently selected plot route, i.e., the first plot route; the operation of selecting route 1 (coastal) is the first moving operation; after the first plot route is selected through the first moving operation, the position of the location marking control is the first map marking point.
Step 264: displaying, on the work presentation page, a first plot segment corresponding to the first plot route and occurring at the first map marking point;
When the audiovisual work is video or audio, the terminal plays the plot segment of the audiovisual work corresponding to the first map marking point on the first plot route; when the audiovisual work is a literary work, the terminal displays the text segments of the audiovisual work corresponding to the first map marking point on the first plot route, such as chapters, sections, pages, and paragraphs; and when the audiovisual work is a comic work, the terminal displays the image-text segment of the audiovisual work corresponding to the first map marking point on the first plot route.
Step 282: in response to a second movement operation of moving the location marking control on the first storyline route, determining a second location marking point indicated by the location marking control on the first storyline route;
the second moving operation is an operation for moving the location marking control on the same plot route. The second moving operation may be at least one of a click operation, a drag operation, a double click operation, a voice operation, a pressure touch operation, an eye control, and a body control, which is not limited in the present application.
The second place marking point is a map marking point determined by moving the place marking control along the same plot route.
Illustratively, a first correspondence relationship is acquired, the first correspondence relationship including the correspondences between map marking points and plot segments;
the first correspondence includes correspondence between different locations and different storyline segments. The scenario segment is represented by a start time point, or a start time point and an end time point.
As shown in fig. 5, for example, the plot segment corresponding to location 1 on the coastal route starts at 15:00 on the play progress bar; the plot segment corresponding to location 2 on the coastal route starts at 23:00; the plot segment corresponding to location 3 on the coastal route starts at 33:10; and the plot segment corresponding to location 4 on the coastal route starts at 42:32.
Step 284: displaying, on the work presentation page, a second plot segment corresponding to the first plot route and occurring at the second place marking point;
For example, assuming that the map marking point selected by the user is location 3 on the coastal route, the plot segment corresponding to location 3 on the coastal route starts playing from 33:10 on the play progress bar.
When the audiovisual work is video or audio, the terminal plays the plot segment of the audiovisual work corresponding to the second place marking point on the first plot route; when the audiovisual work is a literary work, the terminal displays the text segments of the audiovisual work corresponding to the second place marking point on the first plot route, such as chapters, sections, pages, and paragraphs; and when the audiovisual work is a comic work, the terminal displays the image-text segment of the audiovisual work corresponding to the second place marking point on the first plot route.
In an alternative design, the method may further include: displaying a fast-forward animation in the case that the first plot segment is earlier than the second plot segment; and displaying a fast-rewind animation in the case that the first plot segment is later than the second plot segment.
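A minimal sketch of steps 282 to 284, assuming the first correspondence relationship is stored as a simple lookup table keyed by plot route and map marking point; the identifiers, the times taken from the fig. 5 example, and the player interface are illustrative assumptions.

```typescript
// Illustrative sketch: first correspondence relationship as
// (route id, marking point id) -> plot segment start time, in seconds.
const firstCorrespondence: Record<string, Record<number, number>> = {
  coastal: { 1: 15 * 60, 2: 23 * 60, 3: 33 * 60 + 10, 4: 42 * 60 + 32 },
};

function jumpToMarkingPoint(
  player: { currentTime: number; seek(t: number): void },
  routeId: string,
  pointId: number,
  showAnimation: (kind: "fast-forward" | "fast-rewind") => void,
): void {
  const target = firstCorrespondence[routeId]?.[pointId];
  if (target === undefined) return; // no plot segment recorded for this point
  // Earlier current position -> fast-forward animation; later -> fast-rewind.
  showAnimation(player.currentTime <= target ? "fast-forward" : "fast-rewind");
  player.seek(target);
}
```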
Step 292: in response to a third selection operation of moving the location marking control to a second plot route of the at least two plot routes, showing plot segments corresponding to the second plot route on the work showing page;
the second scenario route is a scenario route different from the first scenario route among the at least two scenario routes.
The third selection operation is an operation for selecting a second plot route different from the currently selected plot route. The third selection operation may be at least one of a click operation, a drag operation, a double-click operation, a voice operation, a pressure touch operation, an eye-control operation, and a motion-sensing operation, which is not limited in this application.
Step 294: in response to a second trigger condition being satisfied, canceling the display of the map control of the audiovisual work.
The second trigger condition includes any one of the following conditions:
receiving a close operation for closing the map control;
the length of time that no operation is received on the map control is greater than a third threshold;
start playing the episode of the plot;
receiving human-computer interaction operations on other interaction regions than map controls.
In summary, the method provided in this embodiment displays a map associated with the plot of the audiovisual work and responds to the plot route and map marking point selected by the user on the map, thereby providing a map-based interactive presentation scheme. The user can select a plot direction through the plot route, freely switch the presentation among different plot routes and different locations, learn what happens on different plot routes during the same time period, and choose to re-watch more complex plot points, so as to experience the plot of the audiovisual work more comprehensively and deeply. This improves the user experience and provides an efficient human-computer interaction scheme.
The direction of the plot in an audiovisual work is related not only to the plot route but possibly also to other plot elements, such as the characters in the plot, the items in the plot, the time period of the plot, and the like. The embodiments of the present application therefore also provide the following embodiments.
FIG. 6 illustrates a flow chart of a method of presenting an audiovisual work illustrated in an exemplary embodiment of the present application. The embodiment is exemplified by the method being performed by the terminal 120 (or a client within the terminal 120) shown in fig. 1. The method comprises the following steps:
step 602: displaying a map control and a plot element selection control of the audiovisual work;
the scenario elements include: at least one of a time in the scenario, a character in the scenario, an item in the scenario. The embodiment is exemplified by the scenario element being a character in the scenario, but the specific type of the scenario element is not limited.
The scenario element selection control is a control for selecting one or more scenario elements among the plurality of scenario elements. The plot element selection control may be a button, menu, list, or the like. The embodiment is exemplified by the scenario element selection control being a plurality of buttons, each button corresponding to a role.
In one embodiment, the plot element selection control is displayed at the same time as the map control and is dismissed at the same time as the map control. In another embodiment, the plot element selection control and the map control are not necessarily displayed, or dismissed, at the same time.
As shown in fig. 7, for example, in response to the trigger condition being met, a map control 20 and a storyline element selection control 26 of the audiovisual work are simultaneously displayed. The plot element selection control 26 includes a selection button corresponding to actor a, a selection button corresponding to actor B, and a selection button corresponding to actor C. Optionally, a plot route control 22 associated with a plot of the audiovisual work is displayed on the map control 20.
Step 604: responding to a first element selection operation on the plot element selection control, and determining a selected first plot element;
the first element selection operation is an operation for selecting a scenario element on the scenario element selection control. The first element selection operation may be at least one of a click operation, a drag operation, a double click operation, a voice operation, a pressure touch operation, an eye control, and a body control, which is not limited in this application.
The first scenario element is one of a plurality of scenario elements.
Step 606: in response to a selection operation on a first plot route of the at least two plot routes, determining a selected first plot route;
the execution sequence of step 604 and step 606 is not limited in this embodiment, step 604 may be executed before step 606, and step 606 may be executed before step 604.
Step 608: displaying, on the work presentation page, a plot segment corresponding to the first plot route and the first plot element;
When the audiovisual work is video or audio, the terminal plays the plot segments of the audiovisual work that correspond simultaneously to the first plot route and the first plot element; when the audiovisual work is a literary work, the terminal displays the text segments of the audiovisual work that correspond to the first plot route and the first plot element, such as chapters, sections, pages, and paragraphs; and when the audiovisual work is a comic work, the terminal displays the image-text segments of the audiovisual work that correspond simultaneously to the first plot route and the first plot element.
In an example where the first plot element is selected first and then the plot route is selected, as shown in fig. 8:
The terminal displays a selection button 26a for actor A, a selection button for actor B, a selection button for actor C, and the map control 20; the map control 20 displays a plot route control 22 and a location marking control 28. In response to a click operation triggered by the user on the selection button 26a for actor A, the terminal displays candidate plot routes 1, 2, and 3 corresponding to actor A on the map control 20. In response to a drag operation by the user on the location marking control 28, plot route 1, which is closest to the dragged location marking control 28, is determined as the selected plot route, and the terminal displays a video clip in which actor A appears on plot route 1.
That is, in response to the first plot element being determined, the candidate plot routes corresponding to the first plot element are displayed on the map control. A candidate plot route corresponding to the first plot element is a candidate plot route in which the first plot element appears. If there are candidate plot routes that do not correspond to the first plot element, those candidate plot routes are not displayed, or are displayed in an unselectable style.
Illustratively, after the first scenario element is determined, the scenario element selection control corresponding to the first scenario element is highlighted. For example, the selection button corresponding to actor a may be enlarged and bolded.
Illustratively, in response to a drag operation on the location marking control, the terminal determines the candidate plot route on the map closest to the moved location marking control as the selected plot route. In response to the end position of the drag operation not coinciding with the nearest candidate plot route, the terminal displays a snap animation that attaches the location marking control to the nearest candidate plot route. In this way, the user can perceive that the selected plot route is the candidate plot route closest to the moved location marking control.
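A sketch of the nearest-route selection and snap behavior described above; the route geometry (a polyline of screen points) and the snapTo callback are assumptions used only for illustration.

```typescript
// Illustrative sketch: choose the candidate plot route nearest to the dragged
// location marking control, then snap the control onto that route.
interface Point { x: number; y: number; }
interface CandidateRoute { id: string; polyline: Point[]; }

function distanceToRoute(p: Point, route: CandidateRoute): number {
  // Simplified: distance to the nearest vertex of the route polyline.
  return Math.min(...route.polyline.map(v => Math.hypot(p.x - v.x, p.y - v.y)));
}

function selectNearestRoute(
  dropPosition: Point,
  candidates: CandidateRoute[],
  snapTo: (routeId: string) => void, // plays the snap animation onto the route
): CandidateRoute | undefined {
  if (candidates.length === 0) return undefined;
  const nearest = candidates.reduce((best, r) =>
    distanceToRoute(dropPosition, r) < distanceToRoute(dropPosition, best) ? r : best,
  );
  snapTo(nearest.id); // attach the marker to the nearest route
  return nearest;
}
```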
In an example where the plot route is selected first and then the first plot element is selected, as shown in fig. 9:
the terminal displays a selection button 26a of actor a, a selection button of actor B, a selection button of actor C, and a map control 20, and the map control 20 displays a scenario route control 22 and a place mark control 28.
In the initial state, the map point indicated by the location mark control 28 is the map point corresponding to the current scenario. In response to receiving a touch operation (start touch event) on the place mark control 28, a plurality of candidate storyline routes are displayed on the map control. Optionally, the plurality of candidate storyline routes are storyline routes in which at least two storyline elements appear. The plot routes where different plot elements appear can be displayed in different styles.
In response to receiving the drag operation on the location mark control 28, the terminal determines the plot route closest to the dragged location mark control 28 as the plot route, such as plot route 1 is selected as the plot route.
After the plot route is determined, the terminal displays plot element selection controls of at least two candidate plot elements corresponding to (or appearing on) the plot route. In response to a click operation triggered on the selection button 26a of actor a by the user, the terminal displays a video clip in which actor a appears on scenario route 1.
The candidate plot elements corresponding to the selected map marking point are the candidate plot elements that appear at that map marking point. If there are other candidate plot elements that do not appear at the map marking point, those candidate plot elements are not displayed, or are displayed in an unselectable style.
Illustratively, after the first scenario element is determined, the scenario element selection control corresponding to the first scenario element is highlighted. For example, the selection button corresponding to actor a may be enlarged and bolded.
Illustratively, in response to a drag operation on the location marking control, the terminal determines the candidate plot route on the map closest to the moved location marking control as the selected plot route. In response to the end position of the drag operation not coinciding with the nearest candidate plot route, the terminal displays a snap animation that attaches the location marking control to the nearest candidate plot route. In this way, the user can perceive that the selected plot route is the candidate plot route closest to the moved location marking control.
Step 610: in response to a second element selection operation on the plot element selection control, determining the selected second plot element;
Optionally, while the plot segments corresponding to the first plot route and the first plot element are being presented, in response to the first plot route having a plot segment corresponding to a second plot element, the selection button corresponding to the second plot element may be displayed in a selectable style.
The user can reselect a second plot element on the plot element selection control while watching the plot segment corresponding to the first plot element. In response to the second element selection operation on the plot element selection control, the terminal determines the selected second plot element.
Step 612: switching to display, on the work presentation page, the plot segments of the audiovisual work corresponding to the first plot route and the second plot element.
When the audiovisual work is video or audio, the terminal plays the plot segments of the audiovisual work that correspond simultaneously to the first plot route and the second plot element; when the audiovisual work is a literary work, the terminal displays the text segments of the audiovisual work that correspond simultaneously to the first plot route and the second plot element, such as chapters, sections, pages, and paragraphs; and when the audiovisual work is a comic work, the terminal displays the image-text segments of the audiovisual work that correspond simultaneously to the first plot route and the second plot element.
While watching the plot segments of the audiovisual work corresponding simultaneously to the first plot route and the second plot element, the user can drag the location marking control along the first plot route. In response to a second moving operation of the location marking control on the first plot route, a second place marking point indicated by the location marking control on the first plot route is determined. The first correspondence relationship is then acquired, and the plot segment that corresponds simultaneously to the first plot route and the second plot element and occurs at the second place marking point is displayed on the work presentation page; the specific steps are similar to steps 282 to 284.
In summary, the method provided in this embodiment, in response to the plot route and the plot elements selected by the user on the map associated with the plot of the audiovisual work, displays the plot segments of the audiovisual work corresponding to that plot route and those plot elements, and thus provides an interactive jump-and-present scheme based on plot routes and plot elements. The user can not only switch the presentation among different plot routes but also switch among different plot elements, so as to experience the details of the plot more deeply. This provides a more efficient human-computer interaction scheme for audiovisual works that involve exploration and puzzle-solving, have high demands on user immersion, and are strongly interactive.
In an alternative embodiment based on fig. 6, the above step 608 has a plurality of technical implementations, including but not limited to at least one of the following technical implementations:
the first implementation mode comprises the following steps: based on the corresponding relation;
the terminal acquires a second corresponding relation, wherein the second corresponding relation comprises a corresponding relation among the plot lines, plot elements and plot fragments; and displaying the plot segments corresponding to the plot routes and the first plot elements in the audiovisual works based on the second corresponding relation.
That is, the terminal or the server is preset with: the corresponding relation among the plot lines, the plot elements and the plot segments. The correspondence may be stored as a data table, a database, or the like. After determining the plot route and the first plot elements, the terminal takes the plot route and the first plot elements as query input to query and obtain corresponding plot segments.
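A minimal sketch of this first implementation, assuming the second correspondence relationship is preset as a key-value table keyed by plot route and plot element; the keys and times are made up for illustration.

```typescript
// Illustrative sketch: preset (plot route, plot element) -> plot segment table.
interface PlotSegment { startSec: number; endSec?: number; }

const secondCorrespondence = new Map<string, PlotSegment>([
  ["coastal|actorA", { startSec: 15 * 60, endSec: 23 * 60 }],
  ["coastal|actorB", { startSec: 23 * 60 }],
  ["mountain|actorA", { startSec: 40 * 60 }],
]);

function lookupSegment(routeId: string, elementId: string): PlotSegment | undefined {
  // The plot route and plot element together form the query input.
  return secondCorrespondence.get(`${routeId}|${elementId}`);
}
```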
The second implementation: based on face recognition.
Assume that the audiovisual work comprises a video and the first plot element comprises a first character; the first character is actor A.
The terminal acquires the first correspondence relationship, which includes the correspondences between map marking points and plot segments; determines, based on the first correspondence relationship, the video segments corresponding to the map marking point; identifies, among the video segments corresponding to the map marking point, the video segment corresponding to the first character, the video segment containing at least one of a video frame showing the face of the first character and an audio frame containing the voice of the first character; and displays that video segment.
That is, after the candidate video segments corresponding to the map marking point are obtained, the video segment corresponding to the first character is determined among the candidate video segments based on face recognition technology or voice recognition technology. The video frames of the video segment contain the face of the first character, and/or the audio frames of the video segment contain the voice features of the first character.
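A sketch of the idea only: detectCharacterIn stands in for a face or voice recognition helper and is not a real library API; in practice such filtering would more likely be done offline by the backend server, as in the example of fig. 10.

```typescript
// Illustrative sketch: keep the candidate clips in which the first character is detected.
interface CandidateClip { startSec: number; endSec: number; }

async function clipsWithCharacter(
  clips: CandidateClip[],
  characterId: string,
  detectCharacterIn: (clip: CandidateClip) => Promise<string[]>, // character ids found in the clip
): Promise<CandidateClip[]> {
  const kept: CandidateClip[] = [];
  for (const clip of clips) {
    const found = await detectCharacterIn(clip);
    if (found.includes(characterId)) kept.push(clip); // the character appears in this clip
  }
  return kept;
}
```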
The third implementation: based on speech recognition.
Assume that the audiovisual work comprises audio.
The terminal acquires the first correspondence relationship, which includes the correspondences between map marking points and plot segments; determines, based on the first correspondence relationship, the audio segments corresponding to the map marking point; identifies, among the audio segments corresponding to the map marking point, the audio segment containing a keyword of the first plot element; and plays that audio segment.
That is, after the candidate audio segments corresponding to the map marking point are obtained, the audio segment corresponding to the first character is determined among the candidate audio segments based on speech recognition technology. The audio frames of the audio segment contain the voice features of the first character.
The fourth implementation: based on text matching.
Assume that the audiovisual work comprises a literary work.
The terminal acquires the first correspondence relationship, which includes the correspondences between map marking points and plot segments; determines, based on the first correspondence relationship, the plot segment corresponding to the map marking point; identifies, among the plot segments corresponding to the map marking point, the chapter sections containing a keyword of the first plot element; and displays those chapter sections.
For example, the unit of measurement for a chapter section may be a chapter, a section, a word count, a page, a paragraph, or a line.
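A minimal sketch of this fourth implementation, assuming the chapter sections mapped to a map marking point are available as plain-text paragraphs; the data shape is an assumption for illustration.

```typescript
// Illustrative sketch: within the chapter sections mapped to a map marking
// point, find the paragraphs containing a keyword of the selected plot element
// (e.g. a character's name).
interface ChapterSection { chapter: number; paragraphs: string[]; }

function sectionsMentioning(
  sections: ChapterSection[],
  keyword: string,
): { chapter: number; paragraphIndex: number }[] {
  const hits: { chapter: number; paragraphIndex: number }[] = [];
  for (const s of sections) {
    s.paragraphs.forEach((text, i) => {
      if (text.includes(keyword)) hits.push({ chapter: s.chapter, paragraphIndex: i });
    });
  }
  return hits;
}
```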
In the illustrative example shown in fig. 10, the method is performed cooperatively by a user, a terminal, and a backend server:
Step 1002: the user watches the film to the time point 02:30, at which a map has been pre-embedded;
Step 1014: the backend server identifies key frames in the whole video using image recognition technology and marks the key frames in which character A appears;
Step 1018: end.
In another illustrative example, the method described above may also be applied to a novel-reading scenario. As shown in fig. 11, when the user reads the content of chapter 15, the user triggers the display of the map control 132 and the character selection control 134. The map control 132 includes: a selection control for map route 1, a selection control for map route 2, and a selection control for map route 3; the character selection control 134 includes: a selection control for character A, a selection control for character B, and a selection control for character C. After the user selects map route 3 in the map control 132, the user starts to read chapter 315, which corresponds to map route 3; after the user selects character A, the terminal jumps to display the paragraph segment corresponding to character A in chapter 315. After the user selects map route 2 in the map control 132, assuming that chapter 215 is the chapter corresponding to map route 2, the terminal jumps to display the paragraph segment corresponding to character A in chapter 215; after the user then selects character B, the terminal jumps to display the paragraph segment corresponding to character B in chapter 215.
FIG. 12 shows a block diagram of an apparatus for presenting an audiovisual work provided by an exemplary embodiment of the present application, the apparatus comprising:
the display module 1220 is used for displaying a map control of the audiovisual work, the map control displays a map related to the plot of the audiovisual work, and the map is provided with at least two plot routes;
an interaction module 1240 for determining a selected plot route on the map in response to a plot route selection operation on the map;
and the display module 1260 is used for displaying the plot segments corresponding to the plot routes in the audiovisual works.
In an optional design of this embodiment, a place marking control is displayed on the map control;
the display module 1260 is configured to display, on the work display page, a scenario segment corresponding to a first scenario route in response to a first moving operation of moving the location marking control to the first scenario route of the at least two scenario routes.
In an optional design of this embodiment, the interaction module 1240 is configured to determine, in response to a first moving operation of moving the location marking control to a first scenario route of the at least two scenario routes, a first map marking point indicated by the location marking control on the first scenario route;
the display module 1260 is configured to display a first plot segment corresponding to the first plot route and occurring on the first map marking point on the work display page.
In an optional design of this embodiment, the apparatus further includes:
the interaction module 1240 is configured to determine, in response to a second moving operation of moving the location marking control from the first map marking point on the first plot route, a second place marking point indicated by the location marking control on the first plot route;
the display module 1260 is configured to display, on the work display page, a second plot segment corresponding to the first plot route and occurring at the second place marking point.
In an optional design of this embodiment, the presentation module 1260 is configured to display a fast-forward animation in the case that the first plot segment is earlier than the second plot segment, and to display a fast-rewind animation in the case that the first plot segment is later than the second plot segment.
In an optional design of this embodiment, the presentation module 1260 is configured to present, in response to a third selection operation on a second scenario route of the at least two scenario routes, a scenario segment corresponding to the second scenario route on the work presentation page.
In an optional design of this embodiment, the display module 1220 is configured to display a plot element selection control of the audiovisual work;
the interaction module 1240 is used for responding to the element selection operation on the plot element selection control, and determining the selected first plot element;
the display module 1260 is configured to display, on the work display page, a scenario segment corresponding to a first scenario route and the first scenario element in response to a selection operation on the first scenario route of the at least two scenario routes.
In an optional design of this embodiment, the interaction module 1240 is configured to determine, in response to an element selection operation on the scenario element selection control, a selected second scenario element;
the display module 1260 is configured to display, on the work display page, the scenario segments corresponding to the first scenario route and the second scenario element in the audiovisual work in a switching manner.
In an optional design of this embodiment, the display module 1260 is configured to highlight, on the map control, the map point corresponding to the plot segment being played; or to highlight, on the map control, the map point corresponding to the target plot character currently being presented;
wherein the highlighting comprises at least one of: bold display, enlarged display, inverted-color display, a foreground color change, a background color change, and an added animation effect.
In an optional design of this embodiment, the display module 1220 is configured to display a map control of the audiovisual work in response to a first trigger condition being met; the first trigger condition includes any one of the following conditions:
playback reaching a designated progress position on the playback progress bar;
receiving a start presentation operation of the audiovisual work;
receiving a dragging operation or a jump operation on a playback progress bar of the audiovisual work;
receiving the dragging operation on the playback progress bar of the audiovisual work, wherein a dragging distance of the dragging operation is greater than a first threshold;
receiving the jump operation on the playback progress bar of the audiovisual work, wherein the number of times the jump operation is performed is greater than a second threshold;
receiving a fast forward operation or a fast rewind operation of the audiovisual work;
receiving an open operation, a jump operation, a forward operation, or a backward operation on a directory of the audiovisual work.
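The following sketch evaluates a few of the alternative first trigger conditions listed above; an implementation would typically enable only one of them, and the event names and threshold values shown are purely illustrative assumptions.

```typescript
// Player events relevant to the first trigger condition (all fields assumed).
interface PlayerEvent {
  kind:
    | "progress" // playback reached some position
    | "start" // the work started playing
    | "drag" // progress bar dragged
    | "jump" // progress bar jump/seek
    | "fast-forward"
    | "rewind"
    | "directory"; // directory opened or used to navigate
  progressSec?: number;
  dragDistancePx?: number;
  jumpCount?: number;
}

// Returns true when any of the listed conditions is met and the map control
// should be displayed (threshold defaults are illustrative only).
function shouldShowMapControl(
  e: PlayerEvent,
  designatedProgressSec = 600,
  dragThresholdPx = 200, // "first threshold"
  jumpThreshold = 3, // "second threshold"
): boolean {
  switch (e.kind) {
    case "progress":
      return (e.progressSec ?? 0) >= designatedProgressSec;
    case "start":
    case "fast-forward":
    case "rewind":
    case "directory":
      return true;
    case "drag":
      return (e.dragDistancePx ?? 0) > dragThresholdPx;
    case "jump":
      return (e.jumpCount ?? 0) > jumpThreshold;
    default:
      return false;
  }
}

console.log(shouldShowMapControl({ kind: "drag", dragDistancePx: 350 })); // true
```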
In an optional design of this embodiment, the display module 1220 is configured to cancel display of the map control of the audiovisual work in response to a second trigger condition being met; the second trigger condition includes any one of the following conditions (see the sketch after this list):
receiving a closing operation for closing the map control;
the duration of time for which no operation is received on the map control is greater than a third threshold;
starting to play the plot segments;
and receiving a human-computer interaction operation in an interaction area other than the map control.
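Correspondingly, a hedged sketch of the second trigger condition that dismisses the map control; the event names and the idle timeout value (the "third threshold") are assumptions.

```typescript
// Events relevant to the second trigger condition (names assumed).
type MapEvent =
  | { kind: "close" } // user closed the map control
  | { kind: "idle"; idleSeconds: number } // no operation on the control
  | { kind: "segment-playback-started" }
  | { kind: "interaction-outside-map" };

// Returns true when the map control should be dismissed; the idle timeout
// corresponds to the "third threshold" and its default is illustrative.
function shouldHideMapControl(e: MapEvent, idleThresholdSec = 10): boolean {
  switch (e.kind) {
    case "close":
    case "segment-playback-started":
    case "interaction-outside-map":
      return true;
    case "idle":
      return e.idleSeconds > idleThresholdSec;
  }
}

console.log(shouldHideMapControl({ kind: "idle", idleSeconds: 15 })); // true
```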
Fig. 13 is a block diagram of a terminal according to an example embodiment. The terminal 1300 includes a Central Processing Unit (CPU) 1301, a system memory 1304 including a Random Access Memory (RAM) 1302 and a Read-Only Memory (ROM) 1303, and a system bus 1305 connecting the system memory 1304 and the CPU 1301. The computer device 1300 further includes a basic Input/Output system (I/O system) 1306 that facilitates the transfer of information between components within the computer device, and a mass storage device 1307 for storing an operating system 1313, application programs 1314, and other program modules 1315.
The basic input/output system 1306 includes a display 1308 for displaying information and an input device 1309, such as a mouse or keyboard, for a user to input information. The display 1308 and the input device 1309 are both connected to the central processing unit 1301 through an input/output controller 1310 connected to the system bus 1305. The basic input/output system 1306 may further include the input/output controller 1310 for receiving and processing input from a number of other devices, such as a keyboard, a mouse, or an electronic stylus. Similarly, the input/output controller 1310 also provides output to a display screen, a printer, or another type of output device.
The mass storage device 1307 is connected to the central processing unit 1301 through a mass storage controller (not shown) connected to the system bus 1305. The mass storage device 1307 and its associated computer-device-readable media provide non-volatile storage for the computer device 1300. That is, the mass storage device 1307 may include a computer-device-readable medium (not shown) such as a hard disk or a Compact Disc Read-Only Memory (CD-ROM) drive.
Without loss of generality, the computer-device-readable media may include computer device storage media and communication media. Computer device storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-device-readable instructions, data structures, program modules, or other data. Computer device storage media include RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), CD-ROM, Digital Video Disk (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer device storage media are not limited to the foregoing. The system memory 1304 and the mass storage device 1307 described above may be collectively referred to as memory.
According to various embodiments of the present disclosure, the computer device 1300 may also operate through a remote computer connected to a network, such as the Internet. That is, the computer device 1300 may be connected to the network 1311 through a network interface unit 1312 coupled to the system bus 1305, or may be connected to other types of networks or remote computer systems (not shown) using the network interface unit 1312.
The memory further stores one or more programs, and the central processing unit 1301 executes the one or more programs to implement all or part of the steps of the above method for presenting an audiovisual work.
In an exemplary embodiment, a computer-readable storage medium is further provided, having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method for presenting an audiovisual work provided by the various method embodiments described above.
The present application further provides a computer-readable storage medium having at least one instruction, at least one program, a set of codes, or a set of instructions stored therein, which is loaded and executed by a processor to implement the method for presenting an audiovisual work provided by the above-mentioned method embodiments.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The computer instructions are read by a processor of a computer device from a computer-readable storage medium, and the computer instructions are executed by the processor to cause the computer device to perform the methods provided in the various alternative implementations of the audiovisual work presentation aspect described above.
Optionally, the present application also provides a computer program product containing instructions which, when run on a computer device, cause the computer device to perform the method of presenting an audiovisual work as described in the above aspects.
The serial numbers of the above embodiments of the present application are for description only and do not indicate the relative merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description is merely an exemplary embodiment of the present application and is not intended to limit the present application; any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall fall within the protection scope of the present application.
Claims (14)
1. A method of presenting an audiovisual work, the method comprising:
displaying a work presentation page of the audiovisual work;
displaying a map control of the audiovisual work, wherein the map control displays a map associated with a plot of the audiovisual work, and at least two plot routes are displayed on the map;
and in response to a selection operation on a first plot route of the at least two plot routes, displaying a plot segment corresponding to the first plot route on the work presentation page.
2. The method of claim 1, wherein a location marking control is displayed on the map control;
the displaying, in response to a selection operation on a first plot route of the at least two plot routes, a plot segment corresponding to the first plot route on the work presentation page includes:
in response to a first moving operation of moving the location marking control onto a first plot route of the at least two plot routes, displaying a plot segment corresponding to the first plot route on the work presentation page.
3. The method of claim 2, wherein the displaying, on the work presentation page, a plot segment corresponding to a first plot route of the at least two plot routes in response to a first moving operation of moving the location marking control onto the first plot route comprises:
in response to a first moving operation of moving the location marking control onto a first plot route of the at least two plot routes, determining a first map marking point indicated by the location marking control on the first plot route;
displaying, on the work presentation page, a first plot segment corresponding to the first plot route and occurring at the first map marking point.
4. The method of claim 3, further comprising:
in response to a second moving operation of moving the location marking control from the first map marking point on the first plot route, determining a second map marking point indicated by the location marking control on the first plot route;
displaying, on the work presentation page, a second plot segment corresponding to the first plot route and occurring at the second map marking point.
5. The method of claim 4, further comprising:
displaying a fast-forward animation in a case that the first plot segment is earlier than the second plot segment;
and displaying a fast-rewind animation in a case that the first plot segment is later than the second plot segment.
6. The method of any of claims 1 to 4, further comprising:
and in response to a third selection operation on a second plot route of the at least two plot routes, displaying a plot segment corresponding to the second plot route on the work presentation page.
7. The method of any of claims 1 to 5, further comprising:
displaying a plot element selection control of the audiovisual work;
in response to an element selection operation on the plot element selection control, determining a selected first plot element;
the displaying, in response to a selection operation on a first plot route of the at least two plot routes, a plot segment corresponding to the first plot route on the work presentation page includes:
in response to the selection operation on the first plot route of the at least two plot routes, displaying, on the work presentation page, a plot segment corresponding to the first plot route and the first plot element.
8. The method of claim 6, further comprising:
in response to an element selection operation on the plot element selection control, determining a selected second plot element;
and switching to display, on the work presentation page, a plot segment in the audiovisual work that corresponds to the first plot route and the second plot element.
9. The method of any of claims 1 to 4, further comprising:
highlighting, on the map control, a map point corresponding to a plot segment being played; or highlighting, on the map control, a map point corresponding to a target plot character currently being played;
wherein the highlighting includes at least one of: bold display, enlarged display, inverted-color display, changing a foreground color, changing a background color, and adding an animation special effect.
10. The method of any of claims 1 to 4, wherein said displaying a map control of said audiovisual work comprises:
in response to a first trigger condition being met, displaying a map control of the audiovisual work; the first trigger condition includes any one of the following conditions:
playback reaching a designated progress position on the playback progress bar;
receiving a start presentation operation of the audiovisual work;
receiving a dragging operation or a jump operation on a playback progress bar of the audiovisual work;
receiving the dragging operation on the playback progress bar of the audiovisual work, wherein a dragging distance of the dragging operation is greater than a first threshold;
receiving the jump operation on the playback progress bar of the audiovisual work, wherein the number of times the jump operation is performed is greater than a second threshold;
receiving a fast forward operation or a fast rewind operation of the audiovisual work;
receiving an open operation, a jump operation, a forward operation, or a backward operation on a directory of the audiovisual work.
11. The method of any of claims 1 to 4, further comprising:
in response to a second trigger condition being met, canceling display of a map control of the audiovisual work; the second trigger condition includes any one of the following conditions:
receiving a closing operation for closing the map control;
the duration of time for which no operation is received on the map control is greater than a third threshold;
starting to play the plot segments;
and receiving a human-computer interaction operation in an interaction area other than the map control.
12. An apparatus for presenting an audiovisual work, the apparatus comprising:
a display module, configured to display a map control of the audiovisual work, wherein the map control displays a map and a plot route associated with a plot of the audiovisual work;
an interaction module, configured to determine a selected plot route on the map in response to a plot route selection operation on the map;
and the display module is further configured to display a plot segment corresponding to the selected plot route in the audiovisual work.
13. A computer device, characterized in that the computer device comprises: a processor and a memory, the memory storing a computer program that is loaded and executed by the processor to implement the method of presenting an audiovisual work as claimed in any of claims 1 to 11.
14. A computer-readable storage medium, characterized in that it stores a computer program which is loaded and executed by a processor to implement the method of presentation of an audiovisual work according to any of claims 1 to 11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110250896.9A CN115048010A (en) | 2021-03-08 | 2021-03-08 | Method, device, equipment and medium for displaying audiovisual works |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110250896.9A CN115048010A (en) | 2021-03-08 | 2021-03-08 | Method, device, equipment and medium for displaying audiovisual works |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115048010A true CN115048010A (en) | 2022-09-13 |
Family
ID=83156385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110250896.9A Pending CN115048010A (en) | 2021-03-08 | 2021-03-08 | Method, device, equipment and medium for displaying audiovisual works |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115048010A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116089728A (en) * | 2023-03-23 | 2023-05-09 | 深圳市人马互动科技有限公司 | Method and related device for generating voice interaction novel for children |
CN116089728B (en) * | 2023-03-23 | 2023-06-20 | 深圳市人马互动科技有限公司 | Method and related device for generating voice interaction novel for children |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111970577B (en) | Subtitle editing method and device and electronic equipment | |
US9092437B2 (en) | Experience streams for rich interactive narratives | |
US20140019865A1 (en) | Visual story engine | |
US20170180806A1 (en) | System and method for integrating interactive call-to-action, contextual applications with videos | |
US8174523B2 (en) | Display controlling apparatus and display controlling method | |
US9558784B1 (en) | Intelligent video navigation techniques | |
CN112969097B (en) | Content playing method and device, and content commenting method and device | |
CN113849258B (en) | Content display method, device, equipment and storage medium | |
US20230054388A1 (en) | Method and apparatus for presenting audiovisual work, device, and medium | |
CN109947979B (en) | Song identification method, device, terminal and storage medium | |
GB2503888A (en) | Expandable video playback timeline that includes the location of tag content. | |
US11544322B2 (en) | Facilitating contextual video searching using user interactions with interactive computing environments | |
US9076489B1 (en) | Circular timeline for video trimming | |
CN113014985A (en) | Interactive multimedia content processing method and device, electronic equipment and storage medium | |
CN112752132A (en) | Cartoon picture bullet screen display method and device, medium and electronic equipment | |
CN113553466A (en) | Page display method, device, medium and computing equipment | |
KR102353797B1 (en) | Method and system for suppoting content editing based on real time generation of synthesized sound for video content | |
CN114579030A (en) | Information stream display method, device, apparatus, storage medium, and program | |
CN114925285A (en) | Book information processing method, device, equipment and storage medium | |
WO2019146466A1 (en) | Information processing device, moving-image retrieval method, generation method, and program | |
CN115048010A (en) | Method, device, equipment and medium for displaying audiovisual works | |
KR20160086031A (en) | Method and apparatus for providing contents complex | |
KR20130108684A (en) | Annotation method and augmenting video process in video stream for smart tv contents and system thereof | |
JP2021060991A (en) | Method for displaying dynamic digital content, graphical user interface, and system of the same | |
CN112653931B (en) | Control method and device for resource information playing, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40072619; Country of ref document: HK |
SE01 | Entry into force of request for substantive examination | ||