WO2023124864A1 - Interaction method, system and electronic device - Google Patents
Interaction method, system and electronic device
- Publication number: WO2023124864A1 (PCT application PCT/CN2022/137344)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- audio
- interaction
- interactive
- segment
- Prior art date
Classifications
All classifications fall under H (Electricity) › H04 (Electric communication technique) › H04N (Pictorial communication, e.g. television) › H04N21/00 (Selective content distribution, e.g. interactive television or video on demand [VOD]):
- H04N21/4722 — End-user interface for requesting additional data associated with the content
- H04N21/472 — End-user interface for requesting content, additional data or services, or for interacting with content, e.g. for content reservation, setting reminders, requesting event notification, or manipulating displayed content
- H04N21/47217 — End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
- H04N21/4788 — Supplemental services communicating with other users, e.g. chatting
- H04N21/6587 — Control parameters, e.g. trick play commands, viewpoint selection
- H04N21/8456 — Structuring of content by decomposing it in the time domain, e.g. into time segments
Definitions
- the present application relates to the field of computer technology, in particular to an interactive method, system and electronic equipment.
- the popularity of an audio/video item can be gauged from its total number of likes, and users can decide whether to watch or listen based on that popularity. A user may decide to watch, yet be uninterested in certain segments, drag the progress bar to an arbitrary point in the video, and end up missing the highlights.
- the present application provides an interactive method, system and electronic device that solve the above problems or at least partially solve the above problems.
- an interaction method includes:
- an interaction method is also provided.
- the method includes:
- according to the interaction data set, determine the user interaction heat corresponding to different segments of the audio and video
- an interactive system is also provided.
- the system includes:
- the client is used to play audio and video; display, in the playback interface of the audio and video, an interactive control reflecting the playback progress; in response to an interactive operation triggered by the user through the interactive control, determine the interaction information and the playback progress at the moment the operation was triggered; and, based on the interaction information and the playback progress, send the user's interaction data for the audio and video clip to the server;
- the server is used to acquire the interaction data triggered by different users for different segments of the audio and video to obtain an interaction data set for the audio and video; determine, according to the interaction data set, the user interaction heat corresponding to the different segments; and generate streaming media data of the audio and video based on the user interaction heat corresponding to the different segments and the audio and video data, so that the client device can display, on the playback interface of the audio and video and based on the downloaded streaming media data, perceivable information reflecting the user interaction heat corresponding to the different segments.
- an interaction method is also provided.
- the method includes:
- a progress bar is displayed
- the interaction data of the user for the audio and video clip is determined.
- an electronic device in an embodiment of the present application includes a processor and a memory, wherein the memory is used to store one or more computer instructions; the processor is coupled to the memory and executes the one or more computer instructions to implement the steps of the interaction methods in the foregoing embodiments.
- audio and video mentioned in the various embodiments of the present application may be audio, video, or multimedia data including both audio and video.
- the playback interface of the audio and video played by the client displays an interactive control that reflects the playback progress, and the user can trigger an interactive operation on an audio and video clip through this control, which facilitates associating user interactions with specific audio and video clips.
- in response to the interactive operation triggered through the interactive control, the client can determine the interaction information and the playback progress at the moment the operation was triggered, and, based on these, send the user's interaction data for the audio and video clip to the server, so that the server can acquire interaction data triggered by different users for different segments of the audio and video.
- the server can obtain the interaction data set related to the audio and video based on the acquired interaction data triggered by different users for different segments of the audio and video, and determine the user interaction popularity corresponding to the different segments of the audio and video according to the interaction data set.
- the user interaction heat corresponding to different segments of the audio and video can be provided to client users for reference; that is, the server can generate streaming media data of the audio and video based on the user interaction heat corresponding to the different segments and the audio and video data, making it convenient for the client to display, according to the downloaded streaming media data, perceivable information reflecting the user interaction heat of the different segments in the playback interface.
- the displayed perceivable information reflecting the user interaction heat of different segments helps users understand the popularity of those segments and guides them toward the highlights while browsing.
- the probability of users missing highlights is thus reduced, which also helps to increase users' interest in the audio and video and to improve indicators such as the number of video views and content views.
- interactive controls may not be displayed on the playback interface; users can directly trigger interactive operations on the playback interface itself (such as double-clicking or sliding operations). In response to an interactive operation triggered on the playback interface, the client displays user-perceivable information corresponding to the operation at the current playback progress on the progress bar, determines the user's interaction data for the audio and video clip according to the interaction information of the operation and the playback progress at the moment it was triggered, and uploads that interaction data to the server.
- Fig. 1 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
- Fig. 2 shows a schematic diagram of the principle of audio-video interaction provided by an embodiment of the present application.
- Fig. 3 shows a schematic structural diagram of an interactive system provided by an embodiment of the present application.
- Fig. 4 shows a schematic flowchart of an interaction method provided by another embodiment of the present application.
- Fig. 5 shows a schematic flowchart of an interaction method provided by another embodiment of the present application.
- Fig. 6 shows a schematic diagram of the principle of audio-video interaction provided by another embodiment of the present application.
- Fig. 7 shows a schematic structural diagram of an interactive device provided by an embodiment of the present application.
- Fig. 8 shows a schematic structural diagram of an interactive management device provided by another embodiment of the present application.
- Fig. 9 shows a schematic structural diagram of an interactive management device provided by another embodiment of the present application.
- Fig. 10 shows a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- various embodiments of the present application provide an audio and video interaction solution in which users can interact with one or more clips, for example through a like operation or a negative-comment operation.
- the progress bar of the played audio and video is associated with an interactive control (such as a like control) that reflects the playback progress, which helps associate user interactions with specific audio and video clips.
- the buffering progress bar displayed on the progress bar in the playback interface can also display perceivable information reflecting the user interaction heat corresponding to different segments of the video, for the user's reference while watching.
- VV (Video View): the number of times videos are opened within a statistical period.
- CV (Content View): the number of times videos are opened and their main content (excluding advertisements) is successfully played within a statistical period.
- the value of CV is less than the value of VV because some users churn while advertisements are playing.
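As a rough illustration of the two metrics above (using hypothetical event records, not data from the application), VV counts every open while CV counts only opens whose main content actually played:

```python
# Hypothetical playback event records for one statistical period.
# Each record notes whether the main content (excluding ads) played successfully.
events = [
    {"video_id": "v1", "main_content_played": True},
    {"video_id": "v1", "main_content_played": False},  # user left during the ad
    {"video_id": "v2", "main_content_played": True},
]

# VV (Video View): number of times videos were opened in the period.
vv = len(events)

# CV (Content View): opens where the main content was successfully played.
cv = sum(1 for e in events if e["main_content_played"])

assert cv <= vv  # ad-stage churn means CV never exceeds VV
```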
- the interaction methods provided in the following embodiments of the present application can be applied to a system architecture composed of at least one client device and a server.
- the client 100 and the server 200 are connected through a network for communication.
- data transmission is performed between the client 100 and the server 200 according to a preset protocol.
- the preset protocol may include, but is not limited to, any of the following: the HTTP protocol (Hyper Text Transfer Protocol), the HTTPS protocol (HTTP over Secure Socket Layer), and the like.
- the server 200 may be a single server, or a server group composed of several functional servers, or a virtual server or cloud.
- the client 100 can be any electronic device with a network connection function; for example, a mobile device such as a personal computer, tablet computer, smartphone, personal digital assistant (PDA), or smart wearable device, or fixed equipment such as a desktop computer or digital TV.
- an audio and video playback application is installed on the client 100; users can use it to watch audio and video (such as TV dramas, movies, variety shows, and music) and, through the interactive controls that follow the progress mark on the progress bar of the playback interface (such as like controls and comment controls), perform operations such as liking, posting negative comments, or publishing comments (such as bullet comments) on the audio and video.
- the interaction information generated by these operations, together with the playback progress at the moment the user triggered the interactive control (such as a playback time of 1 minute 50 seconds or 10 minutes 00 seconds, or a frame identifier), is sent to the server 200 as the user's interaction data for the corresponding segment of the audio and video, so that the server 200 can collect interaction data triggered by different users for different segments.
- audio and video playback applications generally download and buffer data while playing, and a buffering progress bar is displayed on the audio and video progress bar. In the technical solution provided by this application, the buffering progress bar is displayed in segments, and different buffer segments carry different or identical display attributes that are related to user interaction heat.
- the user interaction heat is determined by the server 200 based on the collected interaction data triggered by different users for different segments of the audio and video. In this way, while watching, the user can infer from the display attributes of the buffer segments the user interaction heat of the upcoming segments to be played, and accordingly decide whether to perform operations such as double-speed playback or dragging the progress bar to quickly reach a playback point of interest.
- user A uses his own client (such as a notebook computer, mobile phone, etc.) to watch an audio and video (such as a TV series, movie).
- user A can trigger a like operation through the interactive control 01 (a like control) that moves along the progress bar 02 with the progress mark.
- the client 100 can take the like data generated by the user's like operation, together with the current playback progress of the audio and video (such as a playback time of 2 minutes 24 seconds or 10 minutes 00 seconds, or a frame identifier), as user A's interaction data for the segment corresponding to that playback progress, and send the interaction data to the server, so that the server can acquire interaction data triggered by different users for different audio and video segments.
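The interaction record the client assembles can be sketched as follows. This is a minimal illustration; the field names (`user_id`, `playback_ms`, etc.) are hypothetical and not specified by the application:

```python
def build_interaction_data(user_id, video_id, action, playback_ms):
    """Assemble one interaction record combining the interaction information
    with the playback progress at the moment the control was triggered.
    Field names are illustrative, not taken from the application."""
    return {
        "user_id": user_id,
        "video_id": video_id,
        "action": action,            # e.g. "like", "comment"
        "playback_ms": playback_ms,  # playback progress when triggered
    }

# User A likes the clip playing at 2 min 24 s (144000 ms).
record = build_interaction_data("user_a", "series_ep1", "like", 144_000)
```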
- the progress bar 02 also shows a buffering progress bar 03 corresponding to the buffered audio and video data; the buffering progress bar 03 includes three buffer segments, namely the first buffer segment 031, the second buffer segment 032, and the third buffer segment 033.
- these three buffer segments use different grayscale levels as their display attributes.
- the darker the grayscale, the higher the user interaction heat of the audio and video clip to be played corresponding to that buffer segment, and the higher the user interaction heat, the more exciting the corresponding content tends to be.
- the second buffer segment 032 has low user interaction heat, indicating that the audio and video segment corresponding to it is less exciting than the segments corresponding to the first buffer segment 031 and the third buffer segment 033.
- the user can therefore apply double-speed playback to the audio and video clip corresponding to the second buffer segment 032, or drag the progress mark to skip it entirely and directly watch the segment corresponding to the third buffer segment 033.
- the above example describes the user liking the played audio and video content through an interactive control (such as a like control).
- the interactive control can also be a comment control that moves with the progress mark on the progress bar; triggering it pops up a text box in which the user can input comment content for publication (this example is not shown in the attached figures).
- in the following, the like control is mainly used as the example of the interactive control.
- the user interaction heat corresponding to the audio and video clips is determined by the server based on interaction data triggered by different users for different audio and video clips; for the specific determination process, refer to the relevant content below.
- the present application also provides multiple method embodiments.
- the specific execution content of each end (such as the client and server) under the above system architecture will be described in the following embodiments.
- the execution subject of the interaction method provided in the following embodiments is the client or the server.
- embodiments of the interaction device will be provided below, and the interaction device is generally set in the client device and/or the server.
- Fig. 1 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
- the execution subject of the method provided in this embodiment is the client shown in FIG. 2 or FIG. 3 .
- the client can be, but is not limited to, a smartphone, a PC (Personal Computer), a mobile computer, a tablet computer, a personal digital assistant (PDA), a smart TV, or a smart wearable device (such as a smart bracelet or a wearable device embedded in clothing or accessories); there is no limitation here.
- the interactive method includes the following steps:
- the audio and video to be played may be audio and video played online or local audio and video files played offline.
- various types of preprocessing, such as resolution adjustment, can be performed on the frame images in the audio and video so that they are suitable for playing on the client. If the user does not perform any preprocessing on the frame images, the audio and video are played with the default configuration information.
- FIG. 3 shows a case where the interactive control is a thumbs-up control.
- a progress bar will be displayed on the audio and video playback interface, and a movable progress mark will also be displayed on the progress bar.
- the position of the progress mark on the progress bar can reflect the audio and video playback progress.
- the interactive control can be made to follow the movement of the progress mark on the progress bar, so that the interactive control can reflect the playback progress. That is, further, the method provided in this embodiment may further include the following steps:
- a progress bar 02 for the currently played audio and video can be displayed at the bottom of the playback interface, and a movable progress mark 04 that moves with the playback progress is displayed on the progress bar 02 to remind the user of the current playback progress.
- the specific form of the progress mark 04 may be an indicating circle (as shown in FIG. 3), an indicating rectangle, an indicating triangle, etc., which is not limited here. Because the progress mark 04 moves along the progress bar 02, it usually moves at a certain speed and in a certain direction. Therefore, to make the interactive control 01 follow the progress mark, the interactive control 01 can be associated with the progress mark 04, and the movement of the interactive control 01 can be controlled according to the acquired moving speed and moving direction of the progress mark 04.
- interactive controls may be displayed on the upper, lower, left or right parts of the progress bar.
- the interactive control can be displayed above or below the progress bar.
- FIG. 2 and FIG. 3 show the situation in which the interactive control 01 is displayed above the progress bar 02.
- the moving direction of the progress indicator is preset, and its moving direction can move from left to right or from right to left.
- most audio and video playback applications set the progress mark 04 to move from left to right (as shown in Figure 2 and Figure 3).
- the moving speed of the progress mark is often related to the network conditions of the client device and the clarity of the audio and video playback selected by the user.
- the audio and video playback application itself has a corresponding calculation strategy for calculating the moving speed of the progress mark.
- the moving speed of the progress mark will be calculated according to the built-in calculation strategy, so as to control the movement of the progress mark according to the moving speed and moving direction.
- the audio and video playback application determines the moving speed and direction of the progress mark in order to control its movement, and at the same time controls the movement of the interactive control based on that speed and direction, so that the interactive control follows the progress mark. The interactive control can thus not only provide the user with a way to interact with the audio and video but also reflect the playback progress.
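The linkage between the progress mark and the interactive control can be sketched as below. The constant-speed calculation strategy and the pixel-based coordinates are assumptions for illustration; the application only states that the playback application has its own built-in calculation strategy:

```python
def marker_speed_px_per_s(bar_width_px, duration_s):
    # One possible calculation strategy (an assumption): the marker
    # traverses the whole bar over the playback duration at constant speed.
    return bar_width_px / duration_s

def control_x(marker_x, offset_px=0):
    # The interactive control is anchored to the marker (here: directly
    # above it), so moving the marker moves the control identically.
    return marker_x + offset_px

speed = marker_speed_px_per_s(bar_width_px=600, duration_s=120)  # 5 px/s
marker_x = speed * 30  # marker position after 30 s of playback
assert control_x(marker_x) == marker_x  # control follows the marker
```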
- linking the interactive control with the progress mark is both more intuitive and more convenient for the user to operate.
- for example, the user concentrates on watching a movie and, because the plot is very exciting, performs no operations on the playback interface.
- the progress bar controls, time, movie title, and other information on the playback interface are therefore hidden.
- while reflecting on the movie, the user feels that some clips were very exciting and wants to like a certain clip.
- the user can click the playback interface, whereupon the progress bar, the progress mark, and the interactive control linked with the progress mark are all displayed.
- the user can then directly drag the interactive control; due to the linkage between the progress mark and the interactive control, the progress mark moves to the playback progress corresponding to the clip the user wants to replay and like, and releasing completes the like operation.
- the user does not need to move the progress bar and then move to another location on the interface to trigger the interactive operation.
- streaming media refers to continuous time-based media, such as audio, video, animation, or other multimedia files, transmitted over the Internet using streaming technology.
- the main feature of streaming media data is to transmit multimedia data in the form of streaming, and viewers or listeners can enjoy it while downloading.
- the downloaded streaming media data is cached in a buffer on the user's client, and the cached data can be referred to as buffered data. That is, when a user watches an online audio and video, the playback application continuously downloads streaming media data from the server (such as a CDN node) and caches it in the local buffer, so the user can watch while downloading.
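The play-while-downloading behavior described above amounts to a simple producer/consumer queue; a minimal sketch (chunk names and the queue structure are illustrative):

```python
from collections import deque

# Minimal sketch of play-while-downloading: downloaded chunks are
# appended to a local buffer queue and consumed by the player.
buffer_queue = deque()

def on_chunk_downloaded(chunk):
    buffer_queue.append(chunk)  # cache streaming data in the local buffer

def play_next():
    # Playback consumes buffered data in arrival order.
    return buffer_queue.popleft() if buffer_queue else None

for chunk in ("c0", "c1", "c2"):  # chunks arriving from a server/CDN node
    on_chunk_downloaded(chunk)

first = play_next()  # the earliest-buffered chunk is played first
```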
- a buffering progress bar corresponding to the buffered video data to be played is also displayed on the progress bar.
- the buffering progress bar displayed on the progress bar can ordinarily only indicate roughly how much playing time of the current audio and video has been buffered. It cannot convey information such as how exciting or interesting the buffered data to be played is, so when the user is not interested in the currently playing content and manually drags the progress mark to skip it, the dragging is blind and the user may miss the points of interest.
- the buffering progress bar is displayed in segments; it may include at least one buffer segment, and each buffer segment corresponds to one buffered audio and video segment to be played.
- the display attributes of each displayed buffer segment are related to user interaction heat, which the server determines based on interaction data (such as like data and comment data) triggered by different users for different segments. By looking at the display attributes of each buffer segment, the user can understand the user interaction information of the corresponding upcoming segment, thereby indirectly learning how exciting the segment to be played is, which provides an auxiliary reference when dragging the progress bar. That is, further, the technical solution provided in this embodiment may also include the following steps:
- the buffered data corresponding to the audio and video in the buffer area can be the streaming media data for the audio and video that is received over the network from the server while the playback application provides the playback function.
- the streaming media data is stored in a local buffer queue (that is, the buffer in this embodiment).
- for each frame of streaming media data, the appropriate decoders can be called to reconstruct the original data format, after which the frames are synchronized and played out by the audio and video application.
- the buffered data corresponding to the audio and video obtained from the buffer area through step S21 may also include information representing the user interaction popularity of the segment to be played.
- the buffer segments and the segments to be played are consistent in quantity, and the length of the buffering progress bar is known.
- the length of the buffer segment corresponding to each segment to be played can be determined according to the data amount of the at least one segment to be played included in the buffered data, so as to divide the buffering progress bar into at least one buffer segment corresponding to the at least one segment to be played; the display attributes of the corresponding at least one buffer segment are then determined based on the information of the user interaction heat corresponding to the at least one segment to be played.
- the above-mentioned buffered data may include at least one segment to be played and information representing the user interaction heat corresponding to the segment to be played; correspondingly, the above-mentioned S22 "determine, based on the buffered data, at least one buffer segment of the buffering progress bar and the display attributes of the buffer segment" may specifically include the following steps:
- the data amount of a segment to be played represents the size of the segment, and can be calculated from the audio bit rate, video bit rate, and duration of the segment.
- the formula for the data amount M of a segment to be played is: M = (audio bit rate + video bit rate) × duration / 8, where the bit rates are in bits per second and dividing by 8 converts bits to bytes.
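The formula above, together with the proportional division of the known buffering-progress-bar length, can be sketched as follows (a minimal illustration; the function names, units in bits per second and seconds, and the pixel-based bar length are assumptions, not part of the original):

```python
def segment_data_amount(audio_bitrate_bps, video_bitrate_bps, duration_s):
    """Approximate size in bytes of one segment to be played:
    M = (audio bit rate + video bit rate) * duration / 8."""
    return (audio_bitrate_bps + video_bitrate_bps) * duration_s / 8

def buffer_segment_lengths(segments, bar_length_px):
    """Split the known buffering-progress-bar length among the buffered
    segments in proportion to each segment's data amount."""
    amounts = [segment_data_amount(a, v, d) for (a, v, d) in segments]
    total = sum(amounts)
    return [bar_length_px * m / total for m in amounts]

# Example: 128 kbps audio + 2 Mbps video, 10 s segment.
print(segment_data_amount(128_000, 2_000_000, 10))  # 2660000.0 (bytes)
```

A longer segment thus receives a proportionally longer buffer segment on the bar, which is consistent with the length determination described above.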
- the display attributes of a buffer segment may be, but are not limited to, color, pattern, and so on. For example, different colors are used to reflect the user interaction heat of the corresponding segment to be played:
- a segment with high user interaction heat is displayed in red
- a segment with low user interaction heat is displayed in gray
- a segment with medium user interaction heat is displayed in orange, and so on.
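The heat-to-color mapping above can be sketched as a simple lookup (the numeric thresholds here are hypothetical; the embodiment only specifies the high/medium/low-to-red/orange/gray correspondence):

```python
def heat_to_color(heat):
    """Map a segment's user-interaction heat (assumed normalized to [0, 1])
    to a display attribute: high -> red, medium -> orange, low -> gray."""
    if heat >= 0.7:      # hypothetical "high" threshold
        return "red"
    if heat >= 0.3:      # hypothetical "medium" threshold
        return "orange"
    return "gray"

print([heat_to_color(h) for h in (0.9, 0.5, 0.1)])  # ['red', 'orange', 'gray']
```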
- the user interaction heat is determined from interaction data (such as like data, comment data, etc.); statistical analysis of such interaction data can determine the user interaction heat corresponding to different segments of the audio and video.
- the information about the user interaction popularity of the audio and video segment is related to the number of likes and/or comments triggered by the user on the segment.
- the buffered data includes 3 segments to be played (not shown in the figure), denoted as the first segment to be played, the second segment to be played, and the third segment to be played.
- the display attributes of the buffer segments may be: the display attribute of the first buffer segment 031 is dark gray, that of the second buffer segment 032 is light gray, and that of the third buffer segment 033 is medium gray.
- the determination of the segments to be played is related to the division of the audio and video into different segments; for that division, refer to the relevant content of the following embodiments, which is not introduced in detail here.
- the buffering progress bar with the at least one buffer segment can be displayed on the progress bar according to preset display rules.
- the display attribute of the buffer segment is color
- the display attributes corresponding to the first buffer segment 031, the second buffer segment 032, and the third buffer segment 033 of the determined buffering progress bar 03 are dark gray, light gray, and medium gray respectively; the colors in those attributes can be displayed by filling or rendering dark gray, light gray, and medium gray in the first buffer segment 031, the second buffer segment 032, and the third buffer segment 033 respectively, so that the progress bar 02 shown in Figure 3 is displayed in the audio and video playback interface.
- the audio and video playback interface displays an interactive control that follows the progress mark on the progress bar, and through this interactive control the user can like, comment on, and otherwise act on the currently playing audio and video content, thereby interacting with the audio and video.
- a buffering progress bar with at least one buffer segment will also be displayed, with buffer segments of different display attributes displayed distinctly. Because the display attributes are related to user interaction heat, the displayed buffering progress bar can provide an auxiliary reference for users watching the segments to be played.
- the user interaction heat corresponding to different segments of audio and video may be determined by the server based on the acquired interaction data generated by different users triggering interaction controls for different segments of the audio and video.
- the client device executes the above step 103: in response to the interactive operation (such as a like operation) triggered by the user through the interactive control (such as a like control), it determines the interaction information (such as like data) and the playback progress when the interactive operation was triggered (such as a playback moment of the video, e.g. 1 minute 50 seconds or 10 minutes 00 seconds, or a frame identifier); further, based on the interaction information and the playback progress, the user's interaction data for the audio and video segment can be sent to the server, so that the server can obtain the interaction data generated by different users triggering interactions for different segments of the audio and video.
- the client determines the corresponding segment locally according to the current playback progress, and when uploading data to the server directly uploads the user's interaction information together with the determined segment identifier, so that the server can compile statistics based on the segment identifier. That is, in a specific achievable technical solution, the above-mentioned step 104 "send, based on the interaction information and the playback progress, the user's interaction data for the audio and video segment to the server" may specifically include:
- the audio and video segments in this embodiment can be divided by the server, that is, the data corresponding to the audio and video includes segment division information.
- the interaction information and the playback progress can be sent directly to the server as the interaction data.
- determine the playback moment (such as 2 minutes and 24 seconds) or the frame identifier corresponding to the audio and video when the user triggers the interactive control; obtain, from the segment division information in the audio and video data, the segment where the playback moment or frame identifier is located; that segment is the target segment of the interaction information for the audio and video.
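The lookup of the target segment from the playback moment can be sketched as follows, assuming (not stated in the original) that the segment division information is available as a sorted list of segment start times in seconds:

```python
import bisect

def find_target_segment(segment_starts, playback_time_s):
    """Given the sorted start times of the audio/video segments (from the
    segment division information), return the index of the segment that
    contains the given playback moment."""
    # bisect_right finds the first start time strictly greater than the
    # playback moment; the target segment is the one just before it.
    return bisect.bisect_right(segment_starts, playback_time_s) - 1

# Example: segments start at 0 s, 60 s, and 120 s; a like triggered at
# 2 minutes 24 seconds (144 s) targets the third segment (index 2).
starts = [0, 60, 120]
print(find_target_segment(starts, 144))  # 2
```

The same lookup works for frame identifiers if the division information stores each segment's start frame instead of its start time.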
- the server may transmit streaming media data corresponding to audio and video to the client, and the streaming media data may include segment division information.
- when the client determines the interaction information generated by the user triggering the interactive control and the corresponding playback progress, it can also determine, based on the playback progress (such as the playback moment or frame identifier), the segment where that playback progress falls, and then send the interaction information together with the fragment identifier of that fragment (i.e., the target fragment) to the server as the interaction data.
- the content included in the interaction data sent from the client to the server is not specifically limited, as long as the server can determine the target segment of the audio and video for the interaction information based on the interaction data.
- the above client device can not only determine the interaction information and the playback progress when the interactive operation is triggered, but can also display the corresponding dynamic effect on the audio and video playback interface.
- the dynamic effect corresponding to the interactive operation may include, but is not limited to, one or a combination of audio, text, and image.
- a dynamic effect as shown in Figure 2 can be displayed around the thumb-shaped like control.
- the dynamic effect may also include the text content of the number of consecutive likes by the user this time, and this embodiment does not specifically limit the dynamic effect.
- interactive controls reflecting the playback progress are displayed in the playback interface of the played audio and video, and through these controls the user can trigger interactive operations on video segments, which makes it easy to establish the relationship between user interactions and audio and video segments. Further, after the client responds to an interactive operation triggered through the interactive control, it can determine the interaction information and the playback progress when the operation was triggered, and based on them send the user's interaction data for the target audio and video segment to the server, so that the server can obtain the interaction data triggered by different users for different segments of the audio and video; this helps provide data support for the server to determine the user interaction heat corresponding to different audio and video segments.
- the above content introduces the audio and video interaction solution provided by the embodiments of the present application from the perspective of the client; the following describes it from the perspective of the server. Specifically,
- FIG. 4 shows a schematic flowchart of an interaction method provided by an embodiment of the present application.
- the interactive method may include the following steps:
- according to the interaction data set, determine the user interaction heat corresponding to different segments of the audio and video
- the interaction data is determined by the client device, after the user triggers an interactive operation on the interactive control in the playback interface, based on the user's interaction information and the playback progress corresponding to the interactive operation.
- the client determines the interaction information and the playback progress, it can directly send the interaction information and the playback progress to the server.
- the server, as the execution subject of this embodiment, determines the fragment identifier of the target audio and video segment targeted by the interaction information according to the playback progress.
- the client sends the interaction information and the segment identifier of the target segment to the server.
- after receiving the interaction data triggered by different users for different segments of the audio and video sent by different client devices, the server can obtain the interaction data set related to the audio and video.
- statistical analysis can be performed on the interaction information (such as like data, comment data, etc.) in the interaction data set to count the number of likes and/or comments, so as to determine the user interaction heat corresponding to different audio and video segments based on those counts.
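The server-side statistics described above can be sketched as a per-segment count over the interaction data set (the record layout and the heat score being a plain likes-plus-comments sum are assumptions for illustration):

```python
from collections import defaultdict

def interaction_heat(interaction_data_set):
    """Count likes and comments per segment identifier and return a
    per-segment heat score (here simply likes + comments)."""
    counts = defaultdict(lambda: {"likes": 0, "comments": 0})
    for record in interaction_data_set:
        seg = record["segment_id"]
        counts[seg][record["kind"] + "s"] += 1  # "like" -> "likes", etc.
    return {seg: c["likes"] + c["comments"] for seg, c in counts.items()}

data = [
    {"segment_id": "seg1", "kind": "like"},
    {"segment_id": "seg1", "kind": "comment"},
    {"segment_id": "seg2", "kind": "like"},
]
print(interaction_heat(data))  # {'seg1': 2, 'seg2': 1}
```

A weighted score (e.g. comments counted more heavily than likes) would fit the same structure; the embodiment only requires that heat be derived from the like and/or comment counts.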
- for the specific realization of the client determining, based on the playback progress, the target segment of the audio and video for the interaction information, refer to the relevant content of the above embodiments.
- the interaction data obtained by the server at this time also includes the interaction information and playback progress.
- to determine the user interaction heat, the server needs to determine, based on the received playback progress, the target segment of the audio and video for the interaction information, so as to construct an interaction data set related to the audio and video from the interaction information and the identifiers of the target segments, and then, through the above step 202, determine the user interaction heat corresponding to different segments of the audio and video based on the interaction data set.
- the method provided in this embodiment further includes:
- the prerequisite for the server to determine the user interaction popularity corresponding to different segments of the audio and video based on the interaction data set is that the audio and video data of the audio and video stored by the server includes segment division information of the audio and video.
- the fragment division information may be obtained by dividing audio and video into fragments based on a built-in correlation algorithm of the server.
- the interaction method provided in this embodiment may also include the following steps:
- A21: Divide the audio and video into segments to obtain the segment division information of the audio and video;
- A22: Add the segment division information of the audio and video to the audio and video data of the audio and video.
- the audio and video are divided into segments.
- a relatively simple implementation is to directly divide the audio and video at equal intervals into multiple segments of equal playback duration. This division method uses the duration as the segment division information.
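The equal-interval division can be sketched as follows (a minimal illustration; times in seconds and the tail segment simply being shorter are assumptions):

```python
def equal_interval_segments(total_duration_s, segment_duration_s):
    """Divide the audio/video at equal intervals into segments of equal
    playback duration; the duration itself is the division information.
    The final segment may be shorter if the total is not a multiple."""
    segments = []
    start = 0.0
    while start < total_duration_s:
        end = min(start + segment_duration_s, total_duration_s)
        segments.append((start, end))
        start = end
    return segments

# A 95 s video split into 30 s segments.
print(equal_interval_segments(95.0, 30.0))
# [(0.0, 30.0), (30.0, 60.0), (60.0, 90.0), (90.0, 95.0)]
```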
- a shot refers to a video segment captured continuously by a camera in one take, while a scene refers to a video segment composed of several semantically related continuous shots that together express common semantic content.
- if the audio and video are divided directly at equal intervals, the content of a segment may be expressed incompletely, which gives the user a sense of discontinuity in the audio and video content during viewing and results in a poor experience.
- this embodiment therefore also provides scene segmentation of the audio and video according to its content, obtaining segments corresponding to multiple groups of scene segmentation sequences. This division method uses the start frame and end frame of each segment as the segment division information.
- scene segmentation based on audio and video frame data works as follows: using the visual feature information of the audio and video, shot segmentation is first performed by analyzing the similarity between adjacent frames of the video; then, according to the correlation between shots, related shots are merged to form a scene with certain semantics, thereby completing the semantic segmentation of scenes.
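The first step (shot segmentation via adjacent-frame similarity) can be sketched with toy grayscale frames; the histogram-intersection measure and the cut threshold are illustrative choices, not the embodiment's specified algorithm:

```python
def histogram(frame, bins=8):
    """Grayscale histogram of a frame given as a flat list of 0-255 values."""
    h = [0] * bins
    for px in frame:
        h[min(px * bins // 256, bins - 1)] += 1
    return h

def similarity(f1, f2):
    """Histogram-intersection similarity between two frames, in [0, 1]."""
    h1, h2 = histogram(f1), histogram(f2)
    return sum(min(a, b) for a, b in zip(h1, h2)) / len(f1)

def shot_boundaries(frames, threshold=0.5):
    """Indices where the similarity between adjacent frames drops below
    the threshold, i.e. candidate shot cuts."""
    return [i for i in range(1, len(frames))
            if similarity(frames[i - 1], frames[i]) < threshold]

# Toy example: three dark frames, then three bright frames -> one cut.
dark, bright = [10] * 100, [240] * 100
print(shot_boundaries([dark, dark, dark, bright, bright, bright]))  # [3]
```

Merging the resulting shots into semantic scenes would then compare features across shot boundaries, as described above.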
- the technical solution provided by this embodiment also considers that sound is an important part of audio and video and can provide a large amount of effective information for segmentation. For example, in a TV series, a scene with background music, a set of dialogue scenes, a set of monologue scenes, or a set of commentary all develop around one plot or theme. For this reason, sound information can also serve as an important reference for dividing audio and video scenes.
- the popularity of audio and video is another important aspect: for example, the playback status of audio and video segments on the server (such as skipped, fast-forwarded, played normally, or played multiple times) and users' instant evaluation information (such as bullet comments) can reflect the popularity of a video segment, and can also serve as an important basis for audio and video scene segmentation.
- this embodiment implements semantic segmentation of audio and video scenes by integrating audio and video frame data, sound information, and popularity.
- the achievable manner of the above-mentioned A21 "dividing the audio and video into segments to obtain the segment division information of the audio and video" may include any of the following:
- A211: Divide the audio and video at equal intervals into multiple segments of equal playback duration; use the duration as the segment division information.
- A212: According to the audio and video content of the audio and video, perform scene segmentation on the audio and video to obtain segments corresponding to multiple sets of scene segmentation sequences; use the start frame and end frame of each segment as the segment division information.
- the streaming media data of the audio and video can be generated based on the user interaction heat corresponding to the different segments of the audio and video and the audio and video data, so that the client device can download the streaming media data from the server and, in the process of playing the audio and video, display on the playback interface perceivable information reflecting the user interaction heat corresponding to the different segments.
- the perceivable information may be, but is not limited to, color, and can be displayed through a progress bar.
- the colors of different gray levels displayed by the different buffer segments 03 of the buffering progress bar 02 shown in FIG. 3 can assist the user in selectively watching the video to be played.
- the technical solution provided by this embodiment, on the basis of obtaining the interaction data set related to the audio and video from the acquired interaction data triggered by different users for different segments, can determine the user interaction heat corresponding to the different segments according to the interaction data set, and can further generate the streaming media data of the audio and video based on that user interaction heat and the audio and video data.
- Perceivable information reflecting user interaction heat corresponding to different segments of the audio and video is displayed on the playback interface of the audio and video.
- the perceivable information displayed on the client reflecting the user interaction heat of different audio and video segments can help users understand the popularity of different segments and avoid missing the parts they are interested in while browsing; moreover, to a certain extent this also helps improve indicators such as the number of audio and video plays and the amount of content played.
- An embodiment of the present application also provides an interactive system.
- the structure of the interactive system is as shown in FIG. 3 .
- the interactive system includes: a client 100 and a server 200; wherein,
- the client 100 is used to: play audio and video; display, in the playback interface of the audio and video, an interactive control reflecting the playback progress; in response to an interactive operation triggered by the user through the interactive control, determine the interaction information and the playback progress when the interactive operation is triggered; and send, based on the interaction information and the playback progress, the user's interaction data for the audio and video segment to the server;
- the server 200 is configured to: obtain interaction data triggered by different users for different segments of the audio and video to obtain an interaction data set related to the audio and video; determine, according to the interaction data set, the user interaction heat corresponding to the different segments; and generate, based on the user interaction heat corresponding to the different segments and the audio and video data, the streaming media data of the audio and video, so that the client device, based on the downloaded streaming media data, displays on the playback interface perceivable information reflecting the user interaction heat corresponding to the different segments.
- the audio and video playback interface may have no interactive controls; the user triggers an interactive operation through the playback interface itself (such as double-clicking it), and the dynamic effect corresponding to the operation is displayed at the playback progress on the progress bar, making it clear to the user that he or she has triggered an interaction for the audio and video segment to which the current playback progress belongs.
- the interactive control (such as the like control) does not move and is displayed at a fixed position. After the user clicks it, user-perceivable information can be displayed at the playback progress on the progress bar, allowing the user to see or hear the effect of having triggered an interaction for this segment.
- an embodiment of the present application also provides an interaction method.
- the execution flow of the interaction method is shown in the schematic flowchart of FIG. 5.
- the subject of execution of the method is a client, and the client may be, but not limited to, a personal computer, a tablet computer, a smart phone, and the like.
- the interactive method may include the following steps:
- the interactive operation triggered by the user on the playback interface may be, but not limited to, an operation of double-clicking any position on the playback interface; or an operation that triggers a fixed interactive control set on the playback interface, which is not limited here.
- the client responds to the interactive operation triggered by the user on the playback interface.
- the user-perceivable information corresponding to the interactive operation displayed at the playback progress on the progress bar can be the dynamic effect corresponding to the interactive operation.
- the dynamic effect may be, but is not limited to, a like sound effect, a like picture animation, and the like, which is not limited here. Through this dynamic effect, the user can know that he or she has triggered an interaction for the audio and video segment to which the current playback progress belongs.
- Fig. 7 shows a schematic structural diagram of an interactive device provided by an embodiment of the present application.
- the device includes: a playback module 11 , a display module 12 , a determination module 13 and a sending module 14 .
- the playback module 11 is used to play audio and video;
- the display module 12 is used to display interactive controls reflecting the progress of the playback in the playback interface of the audio and video;
- a determining module 13 configured to determine the interactive information and the playback progress when the interactive operation is triggered in response to the interactive operation triggered by the user through the interactive control;
- the sending module 14 is configured to send user interaction data for audio and video segments to the server based on the interaction information and the playback progress, so that the server can obtain interaction data triggered by different users for different audio and video segments.
- the device provided in this embodiment further includes a display module configured to display, in the audio and video playback interface, a progress bar and a progress mark that moves on the progress bar to reflect the playback progress; the display module 12 is further configured to display the interactive control linked with the progress bar.
- when used to display the interactive control linked with the progress bar, the display module 12 is specifically configured to: display the interactive control around the progress bar; acquire the moving speed and moving direction of the progress mark; and control the interactive control to move according to the moving speed and the moving direction.
- the device provided in this embodiment further includes an acquisition module configured to acquire the buffered data corresponding to the audio and video in the buffer; and the above-mentioned determination module 13 is further configured to determine, based on the buffered data, at least one buffer segment of the buffering progress bar and the display attributes of the buffer segment.
- the display module 12 is further configured to display the buffering progress bar with the at least one buffer segment on the progress bar according to the rule that different display attributes correspond to different display modes.
- the above-mentioned buffered data includes at least one segment to be played and information representing the user interaction heat corresponding to the segment to be played; correspondingly, when used to determine, based on the buffered data, at least one buffer segment of the buffering progress bar and the display attributes of the buffer segment, the above-mentioned determination module 13 is specifically configured to: determine the length of the buffer segment corresponding to each segment to be played according to the data amount of the at least one segment to be played included in the buffered data; and determine the display attributes of the corresponding buffer segment based on the information of the user interaction heat.
- when used to send the user's interaction data for the audio and video segment to the server based on the interaction information and the playback progress, the above-mentioned sending module is specifically configured to: send the interaction information and the playback progress to the server as the interaction data, with the server determining the target segment of the interaction information based on the playback progress; or determine the target segment of the audio and video according to the playback progress, and send the interaction information and the fragment identifier of the target segment to the server as the interaction data.
- the device provided in this embodiment further includes a response module configured to, in response to the interactive operation triggered by the user through the interactive control, display the dynamic effect corresponding to the interactive operation on the audio and video playback interface.
- the interactive device provided by the above embodiment can realize the technical solution described in the embodiment of the interaction method shown in Figure 1 above; for the specific implementation principles of the above modules or units, reference may be made to the corresponding content in that method embodiment, which will not be repeated here.
- Fig. 8 shows a schematic structural diagram of an interaction device provided by another embodiment of the present application.
- as shown in the figure, the interaction device includes: an acquisition module 21, a determination module 22, and a generation module 23; wherein,
- the acquiring module 21 is configured to acquire interactive data triggered by different users for different segments of the audio and video, and obtain an interactive data set related to the audio and video;
- a determination module 22 configured to determine the user interaction heat corresponding to different segments of the audio and video according to the interaction data set
- the generation module 23 is used to generate the streaming media data of the audio and video based on the user interaction heat corresponding to different segments of the audio and video and the audio and video data, so that the client device can, based on the downloaded streaming media data, display in the playback interface perceivable information reflecting the user interaction heat corresponding to different segments of the audio and video.
- the interaction device provided in this embodiment further includes: a receiving module, configured to receive the interaction information and playback progress sent by the client for the audio and video; and, the above-mentioned determination module 22 is also configured to, according to the playback progress, A target segment targeted by the interaction information is determined.
- the interactive device provided in this embodiment further includes: a division module, configured to divide the audio and video into segments to obtain segment division information of the audio and video; and an adding module, configured to segment the audio and video The segment division information is added to the audio and video data of the audio and video.
- when used to divide the audio and video into segments to obtain the segment division information, the above division module is specifically used for any of the following: divide the audio and video at equal intervals into multiple segments with the same playback duration and use the duration as the segment division information; or, according to the audio and video content, perform scene segmentation on the audio and video to obtain segments corresponding to multiple sets of scene segmentation sequences, and use the start frame and end frame of each segment as the segment division information.
- the interaction device provided by the above embodiment can realize the technical solution described in the embodiment of the interaction method shown in FIG. 4; for the specific implementation principles of the above modules or units, reference may be made to the corresponding content in the method embodiments, which will not be repeated here.
- FIG. 9 Another embodiment of the present application provides an interactive device.
- the structure of the interactive device is as shown in Figure 9, including: a playback module 31, a display module 32 and a determination module 33; wherein,
- the playback module 31 is used to play audio and video;
- the display module 32 is used to display a progress bar in the playback interface of the audio and video; and is also used to respond to the interactive operation triggered by the user on the playback interface, at the playback progress on the progress bar, displaying user-perceivable information corresponding to the interactive operation;
- the determination module 33 is configured to determine the user's interaction data for the audio and video segment according to the interaction information corresponding to the interaction operation and the playback progress when the interaction operation is triggered.
- the interactive device provided by the above embodiment can realize the technical solution described in the interaction method embodiment shown in FIG. 5 above; for the specific implementation principles, reference may be made to the corresponding content in the method embodiments, which will not be repeated here.
- Fig. 10 shows a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- the electronic device includes a processor 42 and a memory 41 .
- the memory 41 is used to store one or more computer instructions;
- the processor 42, coupled with the memory 41, is used to execute the one or more computer instructions (such as computer instructions for implementing data storage logic), so as to realize the steps in the above interaction method embodiments.
- the memory 41 can be realized by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
- the electronic device further includes: a communication component 43 , a power supply component 45 , a display 44 and other components.
- FIG. 10 only schematically shows some components, which does not mean that the electronic device only includes the components shown in FIG. 10 .
- the embodiments of the present application also provide a computer program product, which includes a computer program or instructions that, when executed by a processor, cause the processor to implement the steps in the above method embodiments.
- the embodiments of the present application also provide a computer-readable storage medium storing a computer program, and when the computer program is executed by a computer, the method steps or functions provided by the foregoing embodiments can be realized.
- the device embodiments described above are only illustrative. The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which can be understood and implemented by those skilled in the art without creative effort.
- each embodiment can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware.
- the essence of the above technical solution, or the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes several instructions that cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the various embodiments or in parts of the embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- User Interface Of Digital Computer (AREA)
Claims (16)
- An interaction method, characterized by comprising: playing audio and video; displaying, in a playback interface of the audio and video, an interactive control reflecting playback progress; in response to an interactive operation triggered by a user through the interactive control, determining interaction information and the playback progress at the time the interactive operation is triggered; and sending, based on the interaction information and the playback progress, the user's interaction data for an audio and video segment to a server, so that the server obtains interaction data triggered by different users for different segments of the audio and video.
- The method according to claim 1, characterized by further comprising: displaying, in the playback interface of the audio and video, a progress bar and a progress marker that moves along the progress bar to reflect the playback progress; and displaying the interactive control linked with the progress marker.
- The method according to claim 2, characterized in that displaying the interactive control linked with the progress marker comprises: displaying the interactive control around the progress marker; acquiring a moving speed and a moving direction of the progress marker; and controlling the interactive control to move according to the moving speed and the moving direction.
- The method according to any one of claims 1 to 3, characterized by further comprising: acquiring buffered data corresponding to the audio and video in a buffer; determining, based on the buffered data, at least one buffer segment of a buffering progress bar and a display attribute of each buffer segment, wherein the display attribute is related to user interaction popularity; and displaying, on the progress bar, the buffering progress bar with the at least one buffer segment according to a rule that different display attributes correspond to different display manners.
- The method according to claim 4, characterized in that the buffered data comprises at least one to-be-played segment and information characterizing the user interaction popularity corresponding to the to-be-played segment; and determining, based on the buffered data, the at least one buffer segment of the buffering progress bar and the display attribute of each buffer segment comprises: determining the length of the buffer segment corresponding to a to-be-played segment according to the data amount of the at least one to-be-played segment included in the buffered data; and determining the display attribute of the corresponding buffer segment according to the information, included in the buffered data, characterizing the user interaction popularity corresponding to the to-be-played segment.
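The buffering logic of claims 4 and 5 can be sketched as follows. This is a hypothetical illustration: the popularity thresholds, the color names used as display attributes, and the bytes-per-unit scale are all assumptions, not values from the patent.

```python
def display_attribute(heat: int) -> str:
    # Assumed rule: segments with higher interaction popularity get a
    # more salient color on the buffering progress bar.
    if heat >= 100:
        return "red"
    if heat >= 10:
        return "orange"
    return "gray"

def buffer_segments(buffered: list, bytes_per_unit: int = 1024) -> list:
    """For each to-be-played segment in the buffer, derive the buffer
    segment's length from its data amount and its display attribute from
    its interaction-popularity score."""
    segments = []
    for item in buffered:
        segments.append({
            "length_units": item["data_bytes"] / bytes_per_unit,
            "attribute": display_attribute(item["heat"]),
        })
    return segments

segs = buffer_segments([{"data_bytes": 2048, "heat": 120},
                        {"data_bytes": 1024, "heat": 3}])
```

The renderer would then draw each buffer segment at its computed length, styled according to its attribute, so hot segments stand out before they are even played.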
- The method according to any one of claims 1 to 5, characterized in that sending, based on the interaction information and the playback progress, the user's interaction data for the audio and video segment to the server comprises: sending the interaction information and the playback progress to the server as the interaction data, so that the server determines, based on the playback progress, the target segment of the audio and video to which the interaction information is directed; or determining the target segment of the audio and video according to the playback progress, and sending the interaction information and a segment identifier of the target segment to the server as the interaction data.
- The method according to any one of claims 1 to 6, characterized by further comprising: in response to the interactive operation triggered by the user through the interactive control, displaying a dynamic effect corresponding to the interactive operation on the playback interface of the audio and video.
- An interaction method, characterized by comprising: acquiring interaction data triggered by different users for different segments of audio and video, to obtain an interaction data set related to the audio and video; determining, according to the interaction data set, the user interaction popularity corresponding to different segments of the audio and video; and generating streaming media data of the audio and video based on the user interaction popularity corresponding to the different segments and the audio and video data of the audio and video, so that a client device, based on the downloaded streaming media data, displays in the playback interface of the audio and video perceivable information reflecting the user interaction popularity corresponding to the different segments of the audio and video.
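The server-side aggregation step in the claim above — turning an interaction data set into per-segment popularity — can be sketched as a simple count. A minimal sketch, assuming each interaction record carries a `segment_id` field as in the client-side example; a real service would also weight interaction types and window the counts over time.

```python
from collections import Counter

def segment_heat(interaction_dataset: list) -> Counter:
    """Count interactions per segment id to obtain each segment's
    user-interaction popularity."""
    return Counter(item["segment_id"] for item in interaction_dataset)

heat = segment_heat([{"segment_id": 1}, {"segment_id": 1}, {"segment_id": 2}])
# segment 1 has two interactions, segment 2 has one
```

The resulting per-segment scores are what would be embedded alongside the media data in the generated streaming media, for the client to render as perceivable information.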
- The method according to claim 8, characterized by further comprising: receiving interaction information and playback progress sent by a client for the audio and video; and determining, according to the playback progress, the target segment to which the interaction information is directed.
- The method according to claim 8, characterized by further comprising: segmenting the audio and video to obtain segmentation information of the audio and video; and adding the segmentation information of the audio and video to the audio and video data of the audio and video.
- The method according to claim 10, characterized in that segmenting the audio and video to obtain the segmentation information of the audio and video comprises either of the following: dividing the audio and video, at equal time intervals, into multiple segments of equal playback duration, and taking the interval duration as the segmentation information; or performing scene segmentation on the audio and video according to its audio and video content to obtain segments respectively corresponding to multiple groups of scene segmentation sequences, and taking the start frame and end frame of each segment as the segmentation information.
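The first segmentation option in the claim above — equal-interval division — can be sketched directly; the function name and the convention that a final shorter segment is allowed are illustrative assumptions, not taken from the patent.

```python
import math

def equal_interval_segments(total_duration_s: int, interval_s: int) -> list:
    """Divide the media timeline into equal-duration segments given by
    (start, end) times in seconds; the last segment may be shorter if the
    total duration is not an exact multiple of the interval."""
    n = math.ceil(total_duration_s / interval_s)
    return [(i * interval_s, min((i + 1) * interval_s, total_duration_s))
            for i in range(n)]

segments = equal_interval_segments(25, 10)
# → [(0, 10), (10, 20), (20, 25)]
```

Under this scheme, only the interval duration needs to be stored as segmentation information, since every segment boundary can be recomputed from it; the scene-segmentation alternative instead records each segment's start and end frames explicitly.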
- An interaction system, characterized by comprising: a client, configured to play audio and video; display, in a playback interface of the audio and video, an interactive control reflecting playback progress; in response to an interactive operation triggered by a user through the interactive control, determine interaction information and the playback progress at the time the interactive operation is triggered; and send, based on the interaction information and the playback progress, the user's interaction data for an audio and video segment to a server; and the server, configured to acquire interaction data triggered by different users for different segments of the audio and video, to obtain an interaction data set related to the audio and video; determine, according to the interaction data set, the user interaction popularity corresponding to different segments of the audio and video; and generate streaming media data of the audio and video based on the user interaction popularity corresponding to the different segments and the audio and video data of the audio and video, so that a client device, based on the downloaded streaming media data, displays in the playback interface of the audio and video perceivable information reflecting the user interaction popularity corresponding to the different segments of the audio and video.
- An interaction method, characterized by comprising: playing audio and video; displaying a progress bar in a playback interface of the audio and video; in response to an interactive operation triggered by a user on the playback interface, displaying, at the playback progress on the progress bar, user-perceivable information corresponding to the interactive operation; and determining the user's interaction data for the audio and video segment according to the interaction information corresponding to the interactive operation and the playback progress at the time the interactive operation is triggered.
- An electronic device, characterized by comprising a processor and a memory, wherein the memory is configured to store one or more computer instructions, and the processor, coupled with the memory, is configured to execute the one or more computer instructions so as to implement the steps in the method according to any one of claims 1 to 7, or the steps in the method according to any one of claims 8 to 11, or the steps in the method according to claim 12.
- A computer-readable storage medium, characterized in that, when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to execute the steps in the method according to any one of claims 1 to 7, or the steps in the method according to any one of claims 8 to 11, or the steps in the method according to claim 12.
- A computer program product, characterized in that, when instructions in the computer program product are executed by a processor of an electronic device, the electronic device is enabled to execute the steps in the method according to any one of claims 1 to 7, or the steps in the method according to any one of claims 8 to 11, or the steps in the method according to claim 12.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22914119.7A EP4380171A1 (en) | 2021-12-29 | 2022-12-07 | Interaction method, system, and electronic device |
US18/590,702 US20240205505A1 (en) | 2021-12-29 | 2024-02-28 | Interaction method, system, and electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111640926.3A CN114125566B (zh) | 2021-12-29 | 2021-12-29 | 互动方法、系统及电子设备 |
CN202111640926.3 | 2021-12-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/590,702 Continuation US20240205505A1 (en) | 2021-12-29 | 2024-02-28 | Interaction method, system, and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023124864A1 true WO2023124864A1 (zh) | 2023-07-06 |
Family
ID=80363722
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/137344 WO2023124864A1 (zh) | 2021-12-29 | 2022-12-07 | 互动方法、系统及电子设备 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240205505A1 (zh) |
EP (1) | EP4380171A1 (zh) |
CN (1) | CN114125566B (zh) |
WO (1) | WO2023124864A1 (zh) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114125566B (zh) * | 2021-12-29 | 2024-03-08 | 阿里巴巴(中国)有限公司 | 互动方法、系统及电子设备 |
CN115079911A (zh) * | 2022-06-14 | 2022-09-20 | 北京字跳网络技术有限公司 | 一种数据处理方法、装置、设备及存储介质 |
CN115209233B (zh) * | 2022-06-25 | 2023-08-25 | 平安银行股份有限公司 | 视频播放方法以及相关装置、设备 |
CN116546264A (zh) * | 2023-04-10 | 2023-08-04 | 北京度友信息技术有限公司 | 视频处理方法及装置、电子设备和存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105812889A (zh) * | 2016-03-31 | 2016-07-27 | 北京奇艺世纪科技有限公司 | 一种播放进度条的显示方法及系统 |
CN105939494A (zh) * | 2016-05-25 | 2016-09-14 | 乐视控股(北京)有限公司 | 音视频片段提供方法及装置 |
US20160330524A1 (en) * | 2015-05-04 | 2016-11-10 | Facebook, Inc. | Presenting video content to online system users in response to user interactions with video content presented in a feed of content items |
CN113259780A (zh) * | 2021-07-15 | 2021-08-13 | 中国传媒大学 | 全息多维音视频播放进度条生成、显示和控制播放方法 |
CN114125566A (zh) * | 2021-12-29 | 2022-03-01 | 阿里巴巴(中国)有限公司 | 互动方法、系统及电子设备 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120087638A1 (en) * | 2009-06-19 | 2012-04-12 | Shenzhen Tcl New Technology Co., Ltd. | Playing progress indicating method for time-shifted television and television set |
TWI407790B (zh) * | 2010-01-08 | 2013-09-01 | Chunghwa Telecom Co Ltd | 影音互動系統及其方法 |
US9066145B2 (en) * | 2011-06-30 | 2015-06-23 | Hulu, LLC | Commenting correlated to temporal point of video data |
CN105792006B (zh) * | 2016-03-04 | 2019-10-08 | 广州酷狗计算机科技有限公司 | 互动信息显示方法及装置 |
CN105828145B (zh) * | 2016-03-18 | 2019-07-19 | 广州酷狗计算机科技有限公司 | 互动方法及装置 |
EP3435680B1 (en) * | 2017-07-27 | 2019-09-04 | Vestel Elektronik Sanayi ve Ticaret A.S. | Technique for selecting a path of a multipath audio and/or video stream |
CN107454465B (zh) * | 2017-07-31 | 2020-12-29 | 北京小米移动软件有限公司 | 视频播放进度展示方法及装置、电子设备 |
CN109597981B (zh) * | 2017-09-30 | 2022-05-17 | 腾讯科技(深圳)有限公司 | 一种文本互动信息的展示方法、装置及存储介质 |
CN108769814B (zh) * | 2018-06-01 | 2022-02-01 | 腾讯科技(深圳)有限公司 | 视频互动方法、装置、终端及可读存储介质 |
CN109818952A (zh) * | 2019-01-15 | 2019-05-28 | 美都科技(天津)有限公司 | 一种基于websocket实现的语音、视频分节点即时互动、精准统计的架构和方法 |
CN111698547A (zh) * | 2019-03-11 | 2020-09-22 | 腾讯科技(深圳)有限公司 | 视频互动方法、装置、存储介质和计算机设备 |
CN112399263A (zh) * | 2019-08-18 | 2021-02-23 | 聚好看科技股份有限公司 | 一种互动方法、显示设备及移动终端 |
CN112492370A (zh) * | 2019-09-12 | 2021-03-12 | 上海哔哩哔哩科技有限公司 | 进度条的展示方法、装置、计算机设备及可读存储介质 |
CN110933509A (zh) * | 2019-12-09 | 2020-03-27 | 北京字节跳动网络技术有限公司 | 一种信息发布的方法、装置、电子设备及存储介质 |
CN111679776B (zh) * | 2020-06-10 | 2022-06-14 | Oppo广东移动通信有限公司 | 广告播放控制方法、装置、电子装置及存储介质 |
CN111580724B (zh) * | 2020-06-28 | 2021-12-10 | 腾讯科技(深圳)有限公司 | 一种信息互动方法、设备及存储介质 |
CN111836114A (zh) * | 2020-07-08 | 2020-10-27 | 北京达佳互联信息技术有限公司 | 视频互动方法、装置、电子设备及存储介质 |
CN112423087A (zh) * | 2020-11-17 | 2021-02-26 | 北京字跳网络技术有限公司 | 一种视频互动信息展示方法及终端设备 |
CN113110783B (zh) * | 2021-04-16 | 2022-05-20 | 北京字跳网络技术有限公司 | 控件的显示方法、装置、电子设备和存储介质 |
CN113411680B (zh) * | 2021-06-18 | 2023-03-21 | 腾讯科技(深圳)有限公司 | 多媒体资源播放方法、装置、终端及存储介质 |
CN113784195A (zh) * | 2021-08-20 | 2021-12-10 | 北京字跳网络技术有限公司 | 视频的页面显示方法、装置、电子设备和存储介质 |
CN113797532A (zh) * | 2021-09-22 | 2021-12-17 | 网易(杭州)网络有限公司 | 信息处理方法、装置及电子设备 |
- 2021-12-29 CN CN202111640926.3A patent/CN114125566B/zh active Active
- 2022-12-07 WO PCT/CN2022/137344 patent/WO2023124864A1/zh active Application Filing
- 2022-12-07 EP EP22914119.7A patent/EP4380171A1/en active Pending
- 2024-02-28 US US18/590,702 patent/US20240205505A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160330524A1 (en) * | 2015-05-04 | 2016-11-10 | Facebook, Inc. | Presenting video content to online system users in response to user interactions with video content presented in a feed of content items |
CN105812889A (zh) * | 2016-03-31 | 2016-07-27 | 北京奇艺世纪科技有限公司 | 一种播放进度条的显示方法及系统 |
CN105939494A (zh) * | 2016-05-25 | 2016-09-14 | 乐视控股(北京)有限公司 | 音视频片段提供方法及装置 |
CN113259780A (zh) * | 2021-07-15 | 2021-08-13 | 中国传媒大学 | 全息多维音视频播放进度条生成、显示和控制播放方法 |
CN114125566A (zh) * | 2021-12-29 | 2022-03-01 | 阿里巴巴(中国)有限公司 | 互动方法、系统及电子设备 |
Also Published As
Publication number | Publication date |
---|---|
CN114125566B (zh) | 2024-03-08 |
EP4380171A1 (en) | 2024-06-05 |
CN114125566A (zh) | 2022-03-01 |
US20240205505A1 (en) | 2024-06-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2023124864A1 (zh) | 互动方法、系统及电子设备 | |
US11438637B2 (en) | Computerized system and method for automatic highlight detection from live streaming media and rendering within a specialized media player | |
US10798035B2 (en) | System and interface that facilitate selecting videos to share in a messaging application | |
US9715901B1 (en) | Video preview generation | |
CN112383566B (zh) | 流媒体呈现系统 | |
US10334300B2 (en) | Systems and methods to present content | |
CA2943975C (en) | Method for associating media files with additional content | |
US20140237365A1 (en) | Network-based rendering and steering of visual effects | |
US20230056898A1 (en) | Systems and methods for creating a non-curated viewing perspective in a video game platform based on a curated viewing perspective | |
US20110258545A1 (en) | Service for Sharing User Created Comments that Overlay and are Synchronized with Video | |
US9524278B2 (en) | Systems and methods to present content | |
EP2834972B9 (fr) | Navigation video multi-sources | |
WO2015103636A9 (en) | Injection of instructions in complex audiovisual experiences | |
US20240214629A1 (en) | Methonds and systems for presenting information | |
US11386152B1 (en) | Automatic generation of highlight clips for events | |
JP2021510026A (ja) | 以前に視聴されたコンテンツの視聴ステータスを更新するための進捗バーを提供するためのシステムおよび方法 | |
US20230326489A1 (en) | Generation of visual effects based on text | |
WO2023128864A2 (en) | Visual effect design using multiple preview windows | |
US11928078B2 (en) | Creating effect assets while avoiding size inflation | |
US20230283864A1 (en) | Content creation using interactive effects | |
EP3228117A1 (en) | Systems and methods to present content |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22914119 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022914119 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022914119 Country of ref document: EP Effective date: 20240229 |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112024007154 Country of ref document: BR |