CN111970532A - Video playing method, device and equipment - Google Patents

Video playing method, device and equipment

Info

Publication number
CN111970532A
Authority
CN
China
Prior art keywords
data
live broadcast
target
broadcast picture
client
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010875845.0A
Other languages
Chinese (zh)
Other versions
CN111970532B (en)
Inventor
庄宇轩
孙静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202010875845.0A
Publication of CN111970532A
Application granted
Publication of CN111970532B
Legal status: Active

Classifications

    All classifications fall under H04N 21/00 (Selective content distribution, e.g. interactive television or video on demand [VOD]) and H04N 23/00 (Cameras or camera modules comprising electronic image sensors; Control thereof):
    • H04N 21/2187 Live feed
    • H04N 21/242 Synchronization processes, e.g. processing of PCR [Program Clock References]
    • H04N 21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4355 Processing of additional data involving reformatting operations of additional data, e.g. HTML pages on a television screen
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/4788 Supplemental services communicating with other users, e.g. chatting
    • H04N 21/4884 Data services, e.g. news ticker, for displaying subtitles
    • H04N 23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Abstract

Embodiments of the invention provide a video playing method, device and equipment. The method includes: acquiring, in real time, live broadcast picture data collected by an anchor client, and sending the live broadcast picture data to an audience client so that the audience client displays a panoramic video live broadcast picture according to the live broadcast picture data; receiving bullet screen (barrage) data sent by the audience client; determining, in the bullet screen data, target bullet screen data that satisfies a preset rule; acquiring rendering data corresponding to the target bullet screen data; synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data; and synchronizing the target live broadcast picture to the anchor client and the audience client, so that both display the panoramic video live broadcast picture according to the target live broadcast picture. The embodiment improves the user's viewing experience and, in turn, the playback volume of the video.

Description

Video playing method, device and equipment
Technical Field
The embodiment of the invention relates to the technical field of data processing, in particular to a video playing method, device and equipment.
Background
Panoramic technology stitches one or more groups of photos, shot by a camera over 360 degrees, into a single image and, based on the real scene captured in the panoramic image, uses computer technology to restore and display that scene through omnidirectional interactive viewing.
In recent years, with the development of Internet technology, video websites have implemented playback modes that play videos with panoramic technology. When a video is played with panoramic technology, with the support of a playback plug-in, the user can control the viewing direction as needed, viewing the scene left and right, up and down, and near and far, which brings a realistic interactive experience and sense of presence.
In the prior art, however, a video played with panoramic technology can only be pre-recorded, can only be played back in sequence, and cannot interact with the user in real time. This reduces the user's viewing experience and affects the playback volume of the video.
Disclosure of Invention
Embodiments of the invention provide a video playing method, device and equipment that improve the user's viewing experience and, in turn, the playback volume of videos.
In a first aspect, an embodiment of the present invention provides a video playing method, including:
acquiring, in real time, live broadcast picture data collected by an anchor client, and sending the live broadcast picture data to an audience client, so that the audience client displays a panoramic video live broadcast picture according to the live broadcast picture data;
receiving barrage data sent by the audience client;
determining target bullet screen data meeting a preset rule in the bullet screen data;
acquiring rendering data corresponding to the target barrage data;
synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data;
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a panoramic video live broadcast picture according to the target live broadcast picture.
Optionally, the determining target bullet screen data meeting a preset rule in the bullet screen data includes:
determining, in the bullet screen data, target bullet screen data that contains a preset keyword,
and/or,
determining, in the bullet screen data, target bullet screen data that has moved to a preset position.
Optionally, the rendering data includes: animation data corresponding to the target bullet screen data and/or play control data corresponding to the target bullet screen data.
Optionally, when the rendering data is animation data corresponding to the target bullet screen data, synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data includes:
determining splicing boundary information of the live broadcast picture data according to a preset splicing algorithm;
spreading and tiling the live broadcast picture data according to splicing boundary information of the live broadcast picture data to obtain tiled live broadcast picture data;
determining a target position of the target barrage data in the live broadcast picture data, and determining position coordinates of the target barrage data according to the target position and the tiled live broadcast picture data;
and synthesizing a target live broadcast picture according to the position coordinates of the target bullet screen data, the tiled live broadcast picture data and animation data corresponding to the target bullet screen data.
Optionally, when the rendering data is play control data corresponding to the target bullet screen data, synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data includes:
determining splicing boundary information of the live broadcast picture data according to a preset splicing algorithm;
spreading and tiling the live broadcast picture data according to splicing boundary information of the live broadcast picture data to obtain tiled live broadcast picture data;
determining a target position of the target barrage data in the live broadcast picture data, and determining position coordinates of the target barrage data according to the target position and the tiled live broadcast picture data;
and synthesizing a target live broadcast picture according to the position coordinates of the target bullet screen data, the tiled live broadcast picture data and the play control data corresponding to the target bullet screen data.
Optionally, the synchronizing the target live broadcast picture to the anchor client and the audience client to enable the anchor client and the audience client to display a panoramic video live broadcast picture according to the target live broadcast picture includes:
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a first live broadcast picture according to the target live broadcast picture in a panoramic way and display animation data corresponding to the target bullet screen data on the first live broadcast picture.
Optionally, the synchronizing the target live broadcast picture to the anchor client and the audience client to enable the anchor client and the audience client to display a panoramic video live broadcast picture according to the target live broadcast picture includes:
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a second live broadcast picture according to the target live broadcast picture in a panoramic manner and display play control data corresponding to the target barrage data on the second live broadcast picture.
Optionally, the animation data corresponding to the target bullet screen data includes animation data corresponding to a virtual article issuing behavior, animation data corresponding to a voting behavior, animation data corresponding to a virtual article receiving behavior, or animation data corresponding to a panoramic bullet screen special effect,
after the synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a first live broadcast picture according to the target live broadcast picture in a panoramic manner and display animation data corresponding to the target bullet screen data on the first live broadcast picture, the method further includes:
receiving first touch operation data, wherein the first touch operation data is obtained by the audience client responding to a touch operation acting on animation data corresponding to the behavior of issuing the virtual article, animation data corresponding to the behavior of voting, or animation data corresponding to the behavior of receiving the virtual article;
and determining a touch operation result corresponding to the touched target animation data according to the first touch operation data, and returning the touch operation result to the audience client.
Optionally, after the synchronizing the target live broadcast picture to the anchor client and the audience client to enable the anchor client and the audience client to display a second live broadcast picture according to the target live broadcast picture in a panoramic manner, and display, on the second live broadcast picture, play control data corresponding to the target barrage data, the method further includes:
receiving second touch operation data, wherein the second touch operation data is obtained by the audience client responding to the touch operation acting on the play control data corresponding to the target bullet screen data;
determining an interface address corresponding to the play control data according to the second touch operation data, and returning the interface address corresponding to the play control data to the audience client, so that the audience client jumps to an interface corresponding to the interface address according to the interface address corresponding to the play control data.
Optionally, the method further includes:
determining the distance between the position coordinates of the target bullet screen data and the splicing boundary information in the tiled live broadcast picture data;
and if the distance is smaller than a preset distance threshold value, performing layer covering and conversion processing on the target bullet screen data, and performing splicing processing on the layer covered, converted target bullet screen data and the live broadcast picture data according to the position coordinates of the target bullet screen data to obtain a target live broadcast picture.
Optionally, the method further includes:
if the fact that the bullet screen data do not include bullet screen data meeting preset rules is determined, the bullet screen data and live broadcast picture data corresponding to the bullet screen data are synchronized to the anchor client and the audience client, so that the anchor client and the audience client can display a panoramic video live broadcast picture according to the bullet screen data and the live broadcast picture data corresponding to the bullet screen data.
Optionally, before synchronizing the barrage data and the live view data corresponding to the barrage data to the anchor client and the viewer client, the method further includes:
and determining corresponding live broadcast picture data according to the timestamp data of the bullet screen data.
Optionally, the method further includes:
pre-storing animation data corresponding to the target bullet screen data,
and/or,
pre-storing play control data corresponding to the target bullet screen data.
In a second aspect, an embodiment of the present invention provides a video playing apparatus, including:
the receiving module is used for acquiring, in real time, live broadcast picture data collected by an anchor client and sending the live broadcast picture data to an audience client, so that the audience client can display a panoramic video live broadcast picture according to the live broadcast picture data;
the receiving module is further used for receiving barrage data sent by the audience client;
the processing module is used for determining target bullet screen data meeting a preset rule in the bullet screen data;
the processing module is further configured to obtain rendering data corresponding to the target barrage data;
the processing module is further used for synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data;
the processing module is further configured to synchronize the target live broadcast picture to the anchor client and the audience client, so that the anchor client and the audience client display a panoramic video live broadcast picture according to the target live broadcast picture.
In a third aspect, an embodiment of the present invention provides a video playback device, including: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes computer-executable instructions stored by the memory to cause the at least one processor to perform the video playback method of any of the first aspects.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, in which computer-executable instructions are stored; when a processor executes the computer-executable instructions, the video playing method according to any one of the first aspect is implemented.
Embodiments of the invention provide a video playing method, device and equipment. With this scheme, live broadcast picture data collected by the anchor client can be acquired in real time and synchronized to the other audience clients watching the live broadcast, so that the audience clients display a panoramic video live broadcast picture according to the live broadcast picture data. During the live broadcast, bullet screen data sent by the audience clients can be received, rendering data corresponding to target bullet screen data satisfying a preset rule can be acquired, and a target live broadcast picture can then be synthesized from the rendering data and the live broadcast picture data, so that the anchor client and the audience clients display the panoramic video live broadcast picture according to the target live broadcast picture. In this way, panoramic technology is combined with live video technology to realize panoramic video live broadcast, and a preset processing rule can be triggered when the bullet screen data meets a preset condition, realizing real-time interaction with the user during the panoramic live broadcast. This improves the user's viewing experience and, in turn, the playback volume of the live video.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic structural diagram of an application system of a video playing method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a video playing method according to an embodiment of the present invention;
fig. 3 is a schematic diagram of unfolding and tiling live broadcast picture data according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an application corresponding to animation data according to an embodiment of the present invention;
fig. 5 is an application schematic diagram of a play control corresponding to play control data provided in an embodiment of the present invention;
fig. 6 is a schematic diagram illustrating an application of a touch operation according to an embodiment of the present invention;
fig. 7 is a schematic diagram illustrating an application of a touch operation according to another embodiment of the present invention;
fig. 8 is a schematic structural diagram of a video playing apparatus according to an embodiment of the present invention;
fig. 9 is a schematic diagram of a hardware structure of a video playback device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of including other sequential examples in addition to those illustrated or described. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In recent years, with the development of Internet technology, video websites have implemented playback modes that play videos with panoramic technology. When a video is played with panoramic technology, with the support of a playback plug-in, the user can control the viewing direction as needed, viewing the scene left and right, up and down, and near and far, which brings a realistic interactive experience and sense of presence.
In the prior art, however, a video played with panoramic technology can only be pre-recorded, that is, only videos already provided by a video playing application or a video playing web page can be played. During playback, the panoramic video can only be played in time order; the user can only watch it in sequence or simply send bullet screen comments, and cannot control the live broadcast process. In other words, the video cannot interact with the user in real time during playback, which reduces the user's viewing experience and affects the playback volume of the video.
To address the above problem, this application proposes combining panoramic technology with live video technology to realize panoramic video live broadcast, then determining target bullet screen data that satisfies a preset rule and processing it to obtain synthesized live broadcast picture data. This realizes real-time interaction with the user during the panoramic live broadcast: the user is no longer limited to watching the video in sequence or simply sending bullet screen comments, which improves the user's viewing experience and, in turn, the playback volume of the video.
Fig. 1 is a schematic structural diagram of an application system of a video playing method according to an embodiment of the present invention. As shown in fig. 1, the system may include a server 101, an anchor client 102 and an audience client 103. The anchor client is the client used by the video anchor, and an audience client is the client used by a user watching the live video. There may be one or more audience clients watching the live video; in this example there is one audience client 103. The server 101 acquires live broadcast picture data collected by the anchor client 102 and transmits it to the other audience clients 103 watching the live broadcast, so that the audience clients 103 can play the live broadcast picture panoramically. In addition, the server 101 may receive bullet screen data sent by the audience client 103, determine rendering data corresponding to target bullet screen data that satisfies a preset rule, process that rendering data together with the live broadcast picture data to obtain a target live broadcast picture, and display the target live broadcast picture synchronously on the anchor client 102 and the audience client 103. The anchor client and the audience client may be smart phones, tablets, personal computers and the like. The live broadcast picture data may be captured by a panoramic live broadcast device, for example a panoramic camera; the panoramic live broadcast device may be a separate device or integrated into a client.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 2 is a schematic flowchart of a video playing method according to an embodiment of the present invention, where the method of this embodiment may be executed by a server. As shown in fig. 2, the method of this embodiment may include:
s201: and acquiring live broadcast picture data acquired by the anchor client in real time, and sending the live broadcast picture data to the audience client so that the audience client can display a panoramic video live broadcast picture according to the live broadcast picture data.
In this embodiment, when the anchor corresponding to the anchor client is streaming live through the anchor client, live broadcast picture data may first be collected and then sent to the server through the anchor client; the server then sends the live broadcast picture data to the audience client, so that the audience client displays a panoramic video live broadcast picture according to the live broadcast picture data.
In addition, the live broadcast picture data may be image-type data or video-type data. If it is video-type data, the server cannot process it directly; after acquiring the live broadcast picture data collected by the anchor client, the server may first cut the live broadcast picture data into frames to obtain cut live broadcast picture data, and then send the cut live broadcast picture data to the audience client, so that the audience client displays a panoramic video live broadcast picture according to the live broadcast picture data.
Illustratively, the live broadcast picture data may be cut at 24 frames per second to obtain the cut live broadcast picture data.
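The following is a minimal, illustrative sketch of such frame cutting, assuming an OpenCV-readable stream; the function name, the stream URL handling and the 24 fps target are assumptions for illustration and not the patent's implementation.

```python
import cv2  # OpenCV is assumed to be available on the server

def cut_live_frames(stream_url, target_fps=24):
    """Yield individual frames from a video-type live stream at roughly target_fps."""
    capture = cv2.VideoCapture(stream_url)
    source_fps = capture.get(cv2.CAP_PROP_FPS) or target_fps  # fall back if fps is unknown
    step = max(1, round(source_fps / target_fps))              # keep every Nth frame
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % step == 0:
            yield frame  # one cut live picture, ready to forward to the audience client
        index += 1
    capture.release()
```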
S202: and receiving bullet screen data sent by the audience client.
In this embodiment, while the audience client is displaying the panoramic video live broadcast picture, a user watching the live broadcast may, based on the picture content, send bullet screen text that expresses an opinion or that can trigger a related action. On receiving the bullet screen text, the audience client obtains bullet screen data from the text and its corresponding timestamp, and then sends the bullet screen data to the server. Illustratively, the bullet screen data may be "good, 17:38:49", "excellent, 12:23:21", "happy holiday, 08:01:19", and the like.
S203: and determining target bullet screen data meeting preset rules in the bullet screen data.
In this embodiment, after receiving the bullet screen data, the server may first determine whether the bullet screen data satisfies the preset rule. The preset rule may be set according to the actual application scene, for example, the preset rule may be a rule including a preset keyword or a rule that bullet screen data is moved to a preset position.
If the preset rule is a rule containing a preset keyword, the specific implementation manner of determining the target bullet screen data meeting the preset rule in the bullet screen data may be as follows: and determining target bullet screen data containing preset keywords in the bullet screen data.
If the preset rule is a rule that the bullet screen data moves to a preset position, the specific implementation manner of determining the target bullet screen data meeting the preset rule in the bullet screen data may be as follows: and determining target bullet screen data which are moved to a preset position in the bullet screen data.
In addition, the preset rule can be a rule which meets two conditions that the preset keyword is included and the bullet screen data moves to the preset position.
Specifically, a preset keyword may be a word or phrase capable of triggering a related action; for example, it may be "vote", "start vote", "red packet", and the like. The preset position may be coordinate information set according to actual requirements; for example, it may be the center coordinate of the display screen of the client, in which case it is determined whether the coordinate of the lower-left corner of the bullet screen text has moved to the center coordinate of the display screen.
In addition, the embodiments of the present application only list a few specific rules, and other rules meeting the requirements are also within the scope of the present application.
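As a hedged illustration only, the sketch below shows how the two rules above might be checked on the server; the keyword list, the screen-center tolerance and the bullet screen data layout are assumptions, not values taken from the patent.

```python
PRESET_KEYWORDS = ("vote", "start vote", "red packet")

def is_target_bullet_screen(bullet, screen_center, tolerance=10):
    """Return True if a bullet screen item satisfies either preset rule."""
    contains_keyword = any(k in bullet["text"] for k in PRESET_KEYWORDS)
    (x, y), (cx, cy) = bullet["position"], screen_center  # lower-left corner vs. screen center
    reached_position = abs(x - cx) <= tolerance and abs(y - cy) <= tolerance
    return contains_keyword or reached_position

# A "red packet" bullet sent at 17:38:49 triggers the keyword rule.
sample = {"text": "red packet", "timestamp": "17:38:49", "position": (40, 200)}
print(is_target_bullet_screen(sample, screen_center=(960, 540)))  # True
```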
S204: and acquiring rendering data corresponding to the target bullet screen data.
In this embodiment, the rendering data corresponding to the target bullet screen data may be pre-stored, or may be synthesized in real time according to the text content of the target bullet screen data. The rendering data may include: animation data corresponding to the target bullet screen data and/or play control data corresponding to the target bullet screen data.
In a specific implementation, if the target bullet screen data corresponds to animation data, the animation data corresponding to the target bullet screen data may be pre-stored; if the target bullet screen data corresponds to play control data, the play control data corresponding to the target bullet screen data may be pre-stored.
S205: and synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data.
S206: and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display the panoramic video live broadcast picture according to the target live broadcast picture.
In this embodiment, there may be two kinds of rendering data, and different kinds of rendering data correspond to different ways of synthesizing the target live broadcast picture. Specifically:
if the rendering data is animation data corresponding to the target barrage data, a specific implementation manner of synthesizing the target live broadcast picture according to the live broadcast picture data and the rendering data may include:
and determining splicing boundary information of the live broadcast picture data according to a preset splicing algorithm.
And unfolding and tiling the live broadcast picture data according to the splicing boundary information of the live broadcast picture data to obtain the tiled live broadcast picture data.
And determining the target position of the target barrage data in the live broadcast picture data, and determining the position coordinates of the target barrage data according to the target position and the tiled live broadcast picture data.
And synthesizing a target live broadcast picture according to the position coordinates of the target bullet screen data, the tiled live broadcast picture data and animation data corresponding to the target bullet screen data.
Specifically, the animation data may be animation capable of interacting with the user or animation that generates special effects that attract the attention of the user. For example, the animation data may be animation data corresponding to a behavior of issuing a virtual article, animation data corresponding to a behavior of voting, animation data corresponding to a behavior of receiving a virtual article, or animation data corresponding to a panoramic bullet screen special effect.
Further, after the live broadcast picture data is acquired, the splicing boundary information of the live broadcast picture frames can be acquired first, and the live broadcast picture data can then be unfolded and tiled according to the splicing boundary information to obtain the tiled live broadcast picture data. The live broadcast picture data is collected by different cameras and spliced according to a splicing algorithm; the splicing boundary information may be the specific coordinates of each point on a splicing boundary, and may be determined according to the splicing algorithm.
In addition, the splicing algorithm can be determined according to the panoramic live broadcast device, and different panoramic live broadcast devices can correspond to different splicing schemes. For example, the splicing scheme may be implemented by existing algorithms such as a multi-camera coordination algorithm and a time-alignment algorithm.
Furthermore, after the live broadcast picture data is unfolded and tiled according to the splicing boundary information, each point in the tiled live broadcast picture data corresponds to a two-dimensional position coordinate. When determining the position coordinate of the target bullet screen data, the target position of the target bullet screen data in the un-tiled live broadcast picture data may be determined first, and the corresponding position coordinate of that target position after the live broadcast picture data is unfolded and tiled may then be determined; this coordinate is the position coordinate of the target bullet screen data.
The target bullet screen data may move from the right side of the terminal device's screen to the left at a preset speed, and its specific position in the un-tiled live broadcast picture data may be related to the angle at which the panoramic image is being viewed at the current viewing angle; that is, the target position may be a point of the panoramic image at the rightmost edge of the screen at a preset height.
In addition, the target position of the target barrage data can be determined by moving the target barrage data to the preset position according to the preset rule, that is, when the target barrage data moves to the preset position, the preset rule can be triggered to execute the relevant flow of target live broadcast picture synthesis.
In addition, the position coordinate of the target bullet screen data may also be determined by directly generating the target bullet screen data at the target position in the live broadcast picture data according to a preset rule.
Fig. 3 is a schematic diagram of live broadcast picture data after being unfolded and tiled according to an embodiment of the present invention. As shown in fig. 3, the live broadcast picture data is a panoramic live broadcast picture composed of images A, B, C and D; the splicing boundary between A and B is the first splicing boundary Pab, the boundary between A and C is the second splicing boundary Pac, the boundary between B and C is the third splicing boundary Pbc, and the boundary between C and D is the fourth splicing boundary Pcd. The panoramic live broadcast picture is placed in a two-dimensional coordinate system, so that each point in the images from the four angles A, B, C and D has a corresponding position coordinate. The target position corresponding to the target bullet screen data is E, and the position coordinate of E can be regarded as the position coordinate of the target bullet screen data.
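A simplified sketch of this mapping is given below. It assumes an equirectangular-style layout in which the viewing direction maps linearly to pixel columns and rows, and it stands in for the device-specific splicing algorithm with a plain side-by-side concatenation; all names and numbers are illustrative assumptions.

```python
import numpy as np

def tile_panorama(views):
    """Toy stand-in for unfolding and tiling: lay the per-camera views side by side."""
    return np.concatenate(views, axis=1)

def position_coordinates(yaw_deg, pitch_deg, tiled_width, tiled_height):
    """Map a viewing direction (the bullet screen's target position) to a pixel
    coordinate in the tiled live broadcast picture, assuming an equirectangular layout."""
    x = int((yaw_deg % 360) / 360.0 * tiled_width)
    y = int((90 - pitch_deg) / 180.0 * tiled_height)
    return x, y

# Four camera views standing in for images A, B, C and D
views = [np.zeros((480, 640, 3), dtype=np.uint8) for _ in "ABCD"]
tiled = tile_panorama(views)
print(position_coordinates(yaw_deg=45, pitch_deg=10,
                           tiled_width=tiled.shape[1], tiled_height=tiled.shape[0]))
```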
Fig. 4 is a schematic diagram of an application corresponding to animation data according to an embodiment of the present invention. As shown in fig. 4, in this example, during the anchor's live broadcast, the audience client of a user sends bullet screen data whose bullet screen text is "red packet". This triggers the rule corresponding to the red-packet animation, and in addition to the original panoramic live video content, a red-packet animation effect may be displayed in the client's display interface.
If the rendering data is the play control data corresponding to the target bullet screen data, a specific implementation manner of synthesizing the target live broadcast picture according to the live broadcast picture data and the rendering data may include:
and determining splicing boundary information of the live broadcast picture data according to a preset splicing algorithm.
And unfolding and tiling the live broadcast picture data according to the splicing boundary information of the live broadcast picture data to obtain the tiled live broadcast picture data.
And determining the target position of the target barrage data in the live broadcast picture data, and determining the position coordinates of the target barrage data according to the target position and the tiled live broadcast picture data.
And synthesizing a target live broadcast picture according to the position coordinates of the target bullet screen data, the tiled live broadcast picture data and the play control data corresponding to the target bullet screen data.
Specifically, in addition to animation data, a play control corresponding to the play control data may be displayed during the panoramic video live broadcast. The play control may be a control used to jump to another interface, for example a control that jumps to the link of a recommended item, or a control that jumps to the live channel of another cooperating anchor. The processing rule for play control data is roughly similar to that for animation data; the only difference is that, when synthesizing the target live broadcast picture, the synthesis uses the position coordinate of the target bullet screen data, the tiled live broadcast picture data and the play control data corresponding to the target bullet screen data.
Illustratively, fig. 5 is a schematic diagram of the play control corresponding to play control data according to an embodiment of the present invention. As shown in fig. 5, in this example, during the anchor's live broadcast, the audience client of a user sends bullet screen data whose bullet screen text is "buy and buy". This triggers the rule for synthesizing and displaying a control that can jump to the interface of a recommended item, and in addition to the original panoramic live video content, a jump control B corresponding to the recommended-item interface may be displayed in the client's display interface.
By adopting this scheme, live broadcast picture data collected by the anchor client can be acquired in real time and synchronized to the other audience clients watching the live broadcast, so that the audience clients display a panoramic video live broadcast picture according to the live broadcast picture data. During the live broadcast, bullet screen data sent by the audience clients can be received, rendering data corresponding to target bullet screen data satisfying a preset rule can be acquired, and a target live broadcast picture can then be synthesized from the rendering data and the live broadcast picture data, so that the anchor client and the audience clients display the panoramic video live broadcast picture according to the target live broadcast picture. In this way, panoramic technology is combined with live video technology to realize panoramic video live broadcast, and a preset processing rule can be triggered when the bullet screen data meets a preset condition, realizing real-time interaction with the user during the panoramic live broadcast. This improves the user's viewing experience and, in turn, the playback volume of the live video.
Based on the method of fig. 2, the present specification also provides some specific embodiments of the method, which are described below.
In another embodiment, if the rendering data is animation data, synchronizing the target live view to the anchor client and the viewer client, so that the anchor client and the viewer client display a panoramic video live view according to the target live view may include:
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a first live broadcast picture according to the target live broadcast picture in a panoramic way and display animation data corresponding to the target bullet screen data on the first live broadcast picture.
In addition, the animation data corresponding to the target bullet screen data may include animation data corresponding to a virtual article issuing behavior, animation data corresponding to a voting behavior, and animation data corresponding to a virtual article receiving behavior or animation data corresponding to a panoramic bullet screen special effect, and after synchronizing the target live broadcast picture to the anchor client and the audience client, so that the anchor client and the audience client display a first live broadcast picture according to the target live broadcast picture in a panoramic manner, and display the animation data corresponding to the target bullet screen data on the first live broadcast picture, the method may further include:
receiving first touch operation data, wherein the first touch operation data is obtained by the audience client responding to a touch operation acting on animation data corresponding to the behavior of issuing the virtual article, animation data corresponding to the behavior of voting, or animation data corresponding to the behavior of receiving the virtual article.
And determining a touch operation result corresponding to the touched target animation data according to the first touch operation data, and returning the touch operation result to the audience client.
In this embodiment, when the animation corresponding to the behavior of issuing the virtual article, the animation corresponding to the behavior of voting, or the animation corresponding to the behavior of receiving the virtual article is played on the audience client, the user may participate in the live panoramic video broadcast process by touching the control corresponding to the animation. And then, a touch operation result can be determined according to the touch operation of the user, and then the touch operation result is returned to the audience client so as to remind the user of the participation result, and the watching experience of the user is improved.
Fig. 6 is a schematic diagram of a touch operation according to an embodiment of the present invention. Continuing the example of fig. 4, in addition to the original panoramic live video content, a red-packet animation effect may be displayed in the client's display interface. As shown in part a of fig. 6, the user touches one of the red-packet controls, control A. After receiving the first touch operation data, the server determines the operation result corresponding to it, which is one of two touch operation results, a prize-winning result or a no-prize result, and may also include further details such as the specific prize amount. As shown in part b of fig. 6, in this example, after the user touches red-packet control A, a no-prize result is displayed.
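A hedged sketch of such a server-side handler follows; the control identifiers, the winning probability and the prize amounts are made-up illustration values, not figures from the patent.

```python
import random

def handle_first_touch(touch_data):
    """Resolve first touch-operation data (e.g. a touched red-packet control) to a result."""
    if touch_data.get("control_type") != "red_packet":
        return {"result": "ignored"}
    if random.random() < 0.3:  # assumed winning probability
        return {"result": "prize", "amount": round(random.uniform(0.1, 10.0), 2)}
    return {"result": "no_prize"}

print(handle_first_touch({"control_type": "red_packet", "control_id": "A"}))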
In another embodiment, if the rendering data is play control data, synchronizing the target live view to the anchor client and the audience client, so that the anchor client and the audience client display a panoramic video live view according to the target live view, may include:
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a second live broadcast picture according to the target live broadcast picture in a panoramic manner and display play control data corresponding to the target barrage data on the second live broadcast picture.
After synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a second live broadcast picture according to the target live broadcast picture in a panoramic manner and display play control data corresponding to the target barrage data on the second live broadcast picture, the method may further include:
and receiving second touch operation data, wherein the second touch operation data is obtained by the audience client responding to the touch operation acting on the play control data corresponding to the target bullet screen data.
Determining an interface address corresponding to the play control data according to the second touch operation data, and returning the interface address corresponding to the play control data to the audience client, so that the audience client jumps to an interface corresponding to the interface address according to the interface address corresponding to the play control data.
Specifically, after the play control corresponding to the play control data is displayed on the display interface of the audience client, second touch operation data corresponding to a touch operation applied to that play control may be received, and the client then jumps to the interface corresponding to the play control according to the second touch operation data.
Fig. 7 is a schematic diagram of a touch operation according to another embodiment of the present invention. Continuing the example of fig. 5, in addition to the original panoramic live video content, a jump control B corresponding to the interface of a recommended item may be displayed in the client's display interface. As shown in part a of fig. 7, the user touches the jump control B; as shown in part b of fig. 7, after receiving the second touch operation data, the server may directly control the jump to the interface corresponding to jump control B.
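For illustration only, the server-side lookup that returns the interface address might look like the sketch below; the mapping table and the URL are hypothetical.

```python
# Hypothetical mapping from a jump control to the interface address it opens
JUMP_TABLE = {"B": "https://example.com/recommended-item"}

def handle_second_touch(touch_data):
    """Resolve second touch-operation data to the interface address the client should open."""
    address = JUMP_TABLE.get(touch_data.get("control_id"))
    return {"interface_address": address} if address else {"error": "unknown control"}

print(handle_second_touch({"control_id": "B"}))
```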
Additionally, in another embodiment, the method may further include:
and determining the distance between the position coordinates of the target bullet screen data and the splicing boundary information in the tiled live broadcast picture data.
And if the distance is smaller than a preset distance threshold value, performing layer covering and conversion processing on the target bullet screen data, and performing splicing processing on the layer covered, converted target bullet screen data and the live broadcast picture data according to the position coordinates of the target bullet screen data to obtain a target live broadcast picture.
Specifically, the target bullet screen data may overlap the splicing boundary when it is generated or while it is moving, for example when the position coordinate at which the bullet screen is generated falls within the coordinate range of the splicing boundary. To avoid this problem, a colored masking layer can be added over the black edge, and the edge of the masking layer can be blurred and blended. The preset distance threshold may be set as needed according to actual conditions, for example between 0 and 5 pixels; illustratively, it may be 2 pixels.
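The following sketch illustrates this boundary check and masking step, assuming the splicing boundary is available as a list of pixel coordinates; the OpenCV calls are real, but the mask size, color, blur kernel and the 2-pixel threshold are assumptions.

```python
import numpy as np
import cv2

def distance_to_boundary(position, boundary_points):
    """Euclidean distance from the bullet screen position to the nearest boundary point."""
    px, py = position
    return min(((px - bx) ** 2 + (py - by) ** 2) ** 0.5 for bx, by in boundary_points)

def mask_bullet_region(tiled_frame, position, size=(120, 40)):
    """Blend a dark, blur-edged masking layer over the bullet screen region."""
    x, y = position
    mask = np.zeros_like(tiled_frame)
    cv2.rectangle(mask, (x, max(0, y - size[1])), (x + size[0], y), (30, 30, 30), -1)
    mask = cv2.GaussianBlur(mask, (15, 15), 0)  # soften the layer edges
    return cv2.addWeighted(tiled_frame, 1.0, mask, 0.6, 0)

boundary_pab = [(640, y) for y in range(0, 480, 10)]    # boundary Pab as sampled points
frame = np.zeros((480, 1280, 3), dtype=np.uint8)
if distance_to_boundary((638, 200), boundary_pab) < 2:  # assumed 2-pixel threshold
    frame = mask_bullet_region(frame, (638, 200))
```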
In another embodiment, the method may further comprise:
if the bullet screen data which meet the preset rules are not included in the bullet screen data, the bullet screen data and the live broadcast picture data corresponding to the bullet screen data are synchronized to the anchor client and the audience client, so that the anchor client and the audience client display a panoramic video live broadcast picture according to the bullet screen data and the live broadcast picture data corresponding to the bullet screen data.
In this embodiment, while determining the target bullet screen data, bullet screen data that does not satisfy the preset rule may also be obtained. When the bullet screen data does not satisfy the preset rule, no additional related action needs to be triggered; the bullet screen data and the live broadcast picture data corresponding to it can be synchronized directly to the anchor client and to the other audience clients watching the live broadcast, so that they display the panoramic video live broadcast picture according to the bullet screen data and the corresponding live broadcast picture data.
In addition, before synchronizing the barrage data and the live broadcast picture data corresponding to the barrage data to the anchor client and the audience client, the method may further include:
and determining corresponding live broadcast picture data according to the timestamp data of the bullet screen data.
Specifically, each frame image in the live broadcast picture data may have corresponding timestamp data; once the timestamp data of the bullet screen text in the bullet screen data is determined, the live broadcast picture data corresponding to that timestamp can be acquired directly. Illustratively, if the timestamp data of the bullet screen text is 12:23:21, the live broadcast picture data with timestamp 12:23:21 is acquired when determining the corresponding live broadcast picture data.
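As a toy illustration, matching a bullet screen timestamp to the corresponding live broadcast picture frame might look like this; the in-memory frame store and the plain "HH:MM:SS" keys are assumptions.

```python
# Assumed in-memory store mapping frame timestamps to frame identifiers
frame_store = {
    "12:23:20": "frame_0296",
    "12:23:21": "frame_0297",  # returned for the "excellent, 12:23:21" bullet screen
    "12:23:22": "frame_0298",
}

def frame_for_bullet(bullet_timestamp, frames=frame_store):
    """Pick the live broadcast picture frame whose timestamp matches the bullet screen."""
    return frames.get(bullet_timestamp)

print(frame_for_bullet("12:23:21"))  # frame_0297
```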
Based on the same idea, an embodiment of this specification further provides an apparatus corresponding to the foregoing method. Fig. 8 is a schematic structural diagram of a video playing apparatus according to an embodiment of the present invention; as shown in fig. 8, the apparatus may include:
the receiving module 801 is configured to obtain live broadcast picture data acquired by a main broadcast client in real time, and send the live broadcast picture data to a viewer client, so that the viewer client displays a panoramic video live broadcast picture according to the live broadcast picture data.
The receiving module 801 is further configured to receive barrage data sent by the viewer client.
The processing module 802 is configured to determine target bullet screen data that meets a preset rule in the bullet screen data.
In this embodiment, the processing module 802 is further configured to:
determining, in the bullet screen data, target bullet screen data that contains a preset keyword,
and/or,
determining, in the bullet screen data, target bullet screen data that has moved to a preset position.
The processing module 802 is further configured to obtain rendering data corresponding to the target barrage data.
In this embodiment, the rendering data may include: animation data corresponding to the target bullet screen data and/or play control data corresponding to the target bullet screen data.
Wherein animation data corresponding to the target bullet screen data is pre-stored,
and/or,
play control data corresponding to the target bullet screen data is pre-stored.
The processing module 802 is further configured to synthesize a target live view according to the live view data and the rendering data.
In this embodiment, if the rendering data is animation data corresponding to the target barrage data, the processing module 802 is further configured to:
and determining splicing boundary information of the live broadcast picture data according to a preset splicing algorithm.
And unfolding and tiling the live broadcast picture data according to the splicing boundary information of the live broadcast picture data to obtain the tiled live broadcast picture data.
And determining the target position of the target barrage data in the live broadcast picture data, and determining the position coordinates of the target barrage data according to the target position and the tiled live broadcast picture data.
And synthesizing a target live broadcast picture according to the position coordinates of the target bullet screen data, the tiled live broadcast picture data and the animation data corresponding to the target bullet screen data (a minimal composition sketch is given after this step).
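Purely as an illustration of the tiling and composition steps above, the sketch below reduces the panoramic picture to a NumPy array, "unfolds" it by rolling it at an assumed seam column, converts a normalized target position to pixel coordinates, and pastes an animation patch at those coordinates. The seam column, the normalized coordinates and the patch representation are assumptions of this sketch, not the claimed splicing algorithm.

    # Illustrative sketch of synthesizing a target live frame (NumPy-based).
    import numpy as np

    def tile_panorama(panorama, seam_column):
        """'Unfold' the panorama so the stitching seam sits at the left edge."""
        return np.roll(panorama, -seam_column, axis=1)

    def to_tile_coords(target_pos, tiled_shape):
        """Map a normalized (u, v) position to (row, col) pixel coordinates."""
        u, v = target_pos
        h, w = tiled_shape[:2]
        return int(v * (h - 1)), int(u * (w - 1))

    def compose_target_frame(panorama, seam_column, target_pos, overlay_patch):
        tiled = tile_panorama(panorama, seam_column)
        row, col = to_tile_coords(target_pos, tiled.shape)
        ph = min(overlay_patch.shape[0], tiled.shape[0] - row)
        pw = min(overlay_patch.shape[1], tiled.shape[1] - col)
        tiled[row:row + ph, col:col + pw] = overlay_patch[:ph, :pw]   # draw the patch
        return tiled

    panorama = np.zeros((180, 360, 3), dtype=np.uint8)                # toy panorama
    patch = np.full((10, 20, 3), 255, dtype=np.uint8)                 # toy animation frame
    target_frame = compose_target_frame(panorama, seam_column=90,
                                        target_pos=(0.25, 0.5), overlay_patch=patch)
    print(target_frame.shape)                                         # -> (180, 360, 3)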
If the rendering data is the playing control data corresponding to the target barrage data, the processing module 802 is further configured to:
and determining splicing boundary information of the live broadcast picture data according to a preset splicing algorithm.
And unfolding and tiling the live broadcast picture data according to the splicing boundary information of the live broadcast picture data to obtain the tiled live broadcast picture data.
And determining the target position of the target barrage data in the live broadcast picture data, and determining the position coordinates of the target barrage data according to the target position and the tiled live broadcast picture data.
And synthesizing a target live broadcast picture according to the position coordinates of the target bullet screen data, the tiled live broadcast picture data and the play control data corresponding to the target bullet screen data.
The processing module 802 is further configured to synchronize the target live broadcast picture to the anchor client and the viewer client, so that the anchor client and the viewer client display a panoramic video live broadcast picture according to the target live broadcast picture.
In another embodiment, the processing module 802 is further configured to:
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a first live broadcast picture according to the target live broadcast picture in a panoramic way and display animation data corresponding to the target bullet screen data on the first live broadcast picture.
In another embodiment, the processing module 802 is further configured to:
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a second live broadcast picture according to the target live broadcast picture in a panoramic manner and display play control data corresponding to the target barrage data on the second live broadcast picture.
In another embodiment, the animation data corresponding to the target bullet screen data includes animation data corresponding to a virtual article issuing behavior, animation data corresponding to a voting behavior, animation data corresponding to a virtual article receiving behavior, or animation data corresponding to a panoramic bullet screen special effect, and the processing module 802 is further configured to:
receiving first touch operation data, wherein the first touch operation data is obtained by the audience client responding to a touch operation acting on animation data corresponding to the behavior of issuing the virtual article, animation data corresponding to the behavior of voting, or animation data corresponding to the behavior of receiving the virtual article.
And determining a touch operation result corresponding to the touched target animation data according to the first touch operation data, and returning the touch operation result to the audience client (a minimal handler sketch is given after this step).
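The handling of first touch operation data might, for instance, look like the Python sketch below; the touch-data fields, the in-memory article pool and the vote counters are all hypothetical and serve only to illustrate how a touch operation result could be determined and returned to the audience client.

    # Illustrative sketch: determine a touch operation result from first touch
    # operation data. Field names and server-side state are assumptions.
    VIRTUAL_ARTICLE_POOL = {"red_packet_001": 10}   # remaining quantity per article
    VOTE_COUNTS = {"option_a": 0, "option_b": 0}

    def handle_first_touch(touch_data):
        kind = touch_data["animation_kind"]
        if kind in ("issue_virtual_article", "receive_virtual_article"):
            article_id = touch_data["article_id"]
            if VIRTUAL_ARTICLE_POOL.get(article_id, 0) > 0:
                VIRTUAL_ARTICLE_POOL[article_id] -= 1
                return {"result": "received", "article_id": article_id}
            return {"result": "sold_out", "article_id": article_id}
        if kind == "vote":
            option = touch_data["option"]
            VOTE_COUNTS[option] = VOTE_COUNTS.get(option, 0) + 1
            return {"result": "voted", "option": option, "count": VOTE_COUNTS[option]}
        return {"result": "ignored"}

    # The result would then be returned to the audience client that sent the touch.
    print(handle_first_touch({"animation_kind": "vote", "option": "option_a"}))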
In another embodiment, the processing module 802 is further configured to:
and receiving second touch operation data, wherein the second touch operation data is obtained by the audience client responding to the touch operation acting on the play control data corresponding to the target bullet screen data.
Determining an interface address corresponding to the play control data according to the second touch operation data, and returning the interface address to the audience client, so that the audience client jumps to the interface corresponding to that address (a minimal lookup sketch is given after this step).
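A corresponding sketch for second touch operation data is given below: the server looks up an interface address for the tapped play control and returns it, and the audience client would then jump to that address. The mapping table and the address scheme are assumptions of this sketch.

    # Illustrative sketch: resolve the interface address for a tapped play control.
    PLAY_CONTROL_INTERFACES = {
        "control_vote.json": "app://live/vote",    # assumed address scheme
        "control_shop.json": "app://live/shop",
    }

    def handle_second_touch(touch_data):
        control_id = touch_data["play_control_id"]
        address = PLAY_CONTROL_INTERFACES.get(control_id)
        if address is None:
            return {"status": "unknown_control"}
        return {"status": "ok", "interface_address": address}

    print(handle_second_touch({"play_control_id": "control_vote.json"}))
    # -> {'status': 'ok', 'interface_address': 'app://live/vote'}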
In another embodiment, the processing module 802 is further configured to:
and determining the distance between the position coordinates of the target bullet screen data and the splicing boundary information in the tiled live broadcast picture data.
And if the distance is smaller than a preset distance threshold, performing layer covering and conversion processing on the target bullet screen data, and splicing the layer-covered and converted target bullet screen data with the live broadcast picture data according to the position coordinates of the target bullet screen data to obtain a target live broadcast picture (a minimal seam-handling sketch is given after this step).
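The seam-proximity handling above is illustrated by the following sketch, which treats the tiled picture as a horizontally wrapping strip: if the barrage patch sits closer to the stitching boundary than a threshold, it is drawn on an overlay layer split across the seam instead of being spliced in directly. The column-based model and the threshold value are assumptions of this sketch.

    # Illustrative sketch of the boundary-distance check and layer covering.
    def distance_to_seam(col, seam_col, width):
        d = abs(col - seam_col)
        return min(d, width - d)                   # the panorama wraps horizontally

    def place_barrage(width, col, patch_width, seam_col=0, threshold=16):
        """Return the (mode, start_col, span) segments in which the patch is drawn."""
        if distance_to_seam(col, seam_col, width) >= threshold:
            return [("splice", col, patch_width)]  # far from the seam: splice directly
        # Near the seam: draw on an overlay layer and wrap the overflow to column 0.
        end = col + patch_width
        if end <= width:
            return [("overlay", col, patch_width)]
        return [("overlay", col, width - col), ("overlay", 0, end - width)]

    print(place_barrage(360, col=350, patch_width=20))
    # -> [('overlay', 350, 10), ('overlay', 0, 10)]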
In this embodiment, the processing module 802 is further configured to:
If it is determined that the bullet screen data does not include bullet screen data meeting the preset rule, synchronizing the bullet screen data and the live broadcast picture data corresponding to the bullet screen data to the anchor client and the audience client, so that the anchor client and the audience client display a panoramic video live broadcast picture according to the bullet screen data and the corresponding live broadcast picture data.
In this embodiment, the processing module 802 is further configured to:
and determining corresponding live broadcast picture data according to the timestamp data of the bullet screen data.
The apparatus provided in the embodiment of the present invention may implement the method of the embodiment shown in fig. 2; the implementation principles and technical effects are similar and are not described herein again.
Fig. 9 is a schematic diagram of a hardware structure of a video playback device according to an embodiment of the present invention. As shown in fig. 9, the present embodiment provides an apparatus 900 including: at least one processor 901 and memory 902. The processor 901 and the memory 902 are connected via a bus 903.
In a specific implementation process, the at least one processor 901 executes computer-executable instructions stored in the memory 902, so that the at least one processor 901 performs the method in the above-described method embodiment.
For the specific implementation process of the processor 901, reference may be made to the above method embodiments; the implementation principles and technical effects are similar and are not described again in this embodiment.
In the embodiment shown in fig. 9, it should be understood that the processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the present invention may be performed directly by a hardware processor, or by a combination of hardware and software modules within the processor.
The memory may comprise a high-speed RAM memory and may also include a non-volatile memory (NVM), such as at least one magnetic disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, the bus in the figures of the present application is shown as a single line, but this does not mean that there is only one bus or one type of bus.
The embodiment of the present invention further provides a computer-readable storage medium in which computer-executable instructions are stored; when a processor executes the computer-executable instructions, the video playing method of the above method embodiments is implemented.
The computer-readable storage medium may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an Application Specific Integrated Circuit (ASIC). Of course, the processor and the readable storage medium may also reside as discrete components in the device.
Those of ordinary skill in the art will understand that all or a portion of the steps for implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the method embodiments described above. The aforementioned storage medium includes various media that can store program code, such as ROM, RAM, magnetic disks or optical disks.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (16)

1. A video playback method, comprising:
acquiring live broadcast picture data acquired by an anchor client in real time, and sending the live broadcast picture data to an audience client, so that the audience client displays a panoramic video live broadcast picture according to the live broadcast picture data;
receiving barrage data sent by the audience client;
determining target bullet screen data meeting a preset rule in the bullet screen data;
acquiring rendering data corresponding to the target barrage data;
synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data;
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a panoramic video live broadcast picture according to the target live broadcast picture.
2. The method of claim 1, wherein the determining the target bullet screen data meeting the preset rule in the bullet screen data comprises:
determining target bullet screen data containing preset keywords in the bullet screen data,
and/or,
and determining target bullet screen data which are moved to a preset position in the bullet screen data.
3. The method of claim 1, wherein the rendering data comprises: animation data corresponding to the target bullet screen data and/or play control data corresponding to the target bullet screen data.
4. The method according to claim 3, wherein when the rendering data is animation data corresponding to the target barrage data, the synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data comprises:
determining splicing boundary information of the live broadcast picture data according to a preset splicing algorithm;
spreading and tiling the live broadcast picture data according to splicing boundary information of the live broadcast picture data to obtain tiled live broadcast picture data;
determining a target position of the target barrage data in the live broadcast picture data, and determining position coordinates of the target barrage data according to the target position and the tiled live broadcast picture data;
and synthesizing a target live broadcast picture according to the position coordinates of the target bullet screen data, the tiled live broadcast picture data and animation data corresponding to the target bullet screen data.
5. The method according to claim 3, wherein when the rendering data is play control data corresponding to the target barrage data, the synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data comprises:
determining splicing boundary information of the live broadcast picture data according to a preset splicing algorithm;
spreading and tiling the live broadcast picture data according to splicing boundary information of the live broadcast picture data to obtain tiled live broadcast picture data;
determining a target position of the target barrage data in the live broadcast picture data, and determining position coordinates of the target barrage data according to the target position and the tiled live broadcast picture data;
and synthesizing a target live broadcast picture according to the position coordinates of the target bullet screen data, the tiled live broadcast picture data and the play control data corresponding to the target bullet screen data.
6. The method of claim 3 or 4, wherein the synchronizing the target live broadcast picture to the anchor client and the audience client, so that the anchor client and the audience client display a panoramic video live broadcast picture according to the target live broadcast picture, comprises:
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a first live broadcast picture according to the target live broadcast picture in a panoramic way and display animation data corresponding to the target bullet screen data on the first live broadcast picture.
7. The method of claim 3 or 5, wherein the synchronizing the target live broadcast picture to the anchor client and the audience client, so that the anchor client and the audience client display a panoramic video live broadcast picture according to the target live broadcast picture, comprises:
and synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a second live broadcast picture according to the target live broadcast picture in a panoramic manner and display play control data corresponding to the target barrage data on the second live broadcast picture.
8. The method of claim 6, wherein the animation data corresponding to the target barrage data comprises animation data corresponding to a virtual article issuing behavior, animation data corresponding to a voting behavior, animation data corresponding to a virtual article receiving behavior, or animation data corresponding to a panoramic barrage special effect,
after the synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a first live broadcast picture according to the target live broadcast picture in a panoramic manner and display animation data corresponding to the target bullet screen data on the first live broadcast picture, the method further includes:
receiving first touch operation data, wherein the first touch operation data is obtained by the audience client responding to a touch operation acting on animation data corresponding to the behavior of issuing the virtual article, animation data corresponding to the behavior of voting, or animation data corresponding to the behavior of receiving the virtual article;
and determining a touch operation result corresponding to the touched target animation data according to the first touch operation data, and returning the touch operation result to the audience client.
9. The method of claim 7, further comprising, after the synchronizing the target live broadcast picture to the anchor client and the audience client so that the anchor client and the audience client display a second live broadcast picture according to the target live broadcast picture in a panoramic manner and display play control data corresponding to the target barrage data on the second live broadcast picture:
receiving second touch operation data, wherein the second touch operation data is obtained by the audience client responding to the touch operation acting on the play control data corresponding to the target bullet screen data;
determining an interface address corresponding to the play control data according to the second touch operation data, and returning the interface address corresponding to the play control data to the audience client, so that the audience client jumps to an interface corresponding to the interface address according to the interface address corresponding to the play control data.
10. The method of claim 4 or 5, further comprising:
determining the distance between the position coordinates of the target bullet screen data and the splicing boundary information in the tiled live broadcast picture data;
and if the distance is smaller than a preset distance threshold value, performing layer covering and conversion processing on the target bullet screen data, and performing splicing processing on the layer covered, converted target bullet screen data and the live broadcast picture data according to the position coordinates of the target bullet screen data to obtain a target live broadcast picture.
11. The method of any one of claims 1-5, further comprising:
if it is determined that the bullet screen data does not include bullet screen data meeting the preset rule, synchronizing the bullet screen data and the live broadcast picture data corresponding to the bullet screen data to the anchor client and the audience client, so that the anchor client and the audience client display a panoramic video live broadcast picture according to the bullet screen data and the corresponding live broadcast picture data.
12. The method of claim 11, further comprising, before the synchronizing the barrage data and the live broadcast picture data corresponding to the barrage data to the anchor client and the audience client:
and determining corresponding live broadcast picture data according to the timestamp data of the bullet screen data.
13. The method according to any one of claims 3-5, further comprising:
animation data corresponding to the target bullet screen data is pre-stored,
and/or,
and playing control data corresponding to the target bullet screen data is pre-stored.
14. A video playback apparatus, comprising:
the receiving module is used for acquiring live broadcast picture data acquired by an anchor client in real time and sending the live broadcast picture data to an audience client so that the audience client can display a panoramic video live broadcast picture according to the live broadcast picture data;
the receiving module is further used for receiving barrage data sent by the audience client;
the processing module is used for determining target bullet screen data meeting a preset rule in the bullet screen data;
the processing module is further configured to obtain rendering data corresponding to the target barrage data;
the processing module is further used for synthesizing a target live broadcast picture according to the live broadcast picture data and the rendering data;
the processing module is further configured to synchronize the target live broadcast picture to the anchor client and the audience client, so that the anchor client and the audience client display a panoramic video live broadcast picture according to the target live broadcast picture.
15. A video playback device, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executing the computer-executable instructions stored by the memory causes the at least one processor to perform the video playback method of any of claims 1-13.
16. A computer-readable storage medium having computer-executable instructions stored thereon, which when executed by a processor implement the video playback method of any one of claims 1 to 13.
CN202010875845.0A 2020-08-27 2020-08-27 Video playing method, device and equipment Active CN111970532B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010875845.0A CN111970532B (en) 2020-08-27 2020-08-27 Video playing method, device and equipment

Publications (2)

Publication Number Publication Date
CN111970532A true CN111970532A (en) 2020-11-20
CN111970532B CN111970532B (en) 2022-07-15

Family

ID=73391554

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010875845.0A Active CN111970532B (en) 2020-08-27 2020-08-27 Video playing method, device and equipment

Country Status (1)

Country Link
CN (1) CN111970532B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180227617A1 (en) * 2016-02-01 2018-08-09 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for pushing information
CN105916043A (en) * 2016-04-19 2016-08-31 乐视控股(北京)有限公司 Barrage control method and device
CN106303735A (en) * 2016-09-07 2017-01-04 腾讯科技(深圳)有限公司 A kind of barrage display system, method, device and service customer end
CN109660878A (en) * 2017-10-10 2019-04-19 武汉斗鱼网络科技有限公司 Living broadcast interactive method, storage medium, electronic equipment and system based on barrage
US20190377956A1 (en) * 2018-06-08 2019-12-12 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for processing video
CN110312169A (en) * 2019-07-30 2019-10-08 腾讯科技(深圳)有限公司 Video data handling procedure, device, terminal and server
CN111526412A (en) * 2020-04-30 2020-08-11 广州华多网络科技有限公司 Panoramic live broadcast method, device, equipment and storage medium

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113038239A (en) * 2021-04-12 2021-06-25 上海哔哩哔哩科技有限公司 Bullet screen setting method, device and system
CN113038239B (en) * 2021-04-12 2023-04-11 上海哔哩哔哩科技有限公司 Bullet screen setting method, device and system
CN113490006A (en) * 2021-07-01 2021-10-08 北京云生万物科技有限公司 Live broadcast interaction method and equipment based on bullet screen
CN113490061A (en) * 2021-07-01 2021-10-08 北京云生万物科技有限公司 Live broadcast interaction method and equipment based on bullet screen
CN113490061B (en) * 2021-07-01 2022-12-27 北京云生万物科技有限公司 Live broadcast interaction method and equipment based on bullet screen
CN113949914A (en) * 2021-08-19 2022-01-18 广州博冠信息科技有限公司 Live broadcast interaction method and device, electronic equipment and computer readable storage medium
WO2023207516A1 (en) * 2022-04-27 2023-11-02 北京字跳网络技术有限公司 Live streaming video processing method and apparatus, electronic device, and storage medium
CN114979736A (en) * 2022-05-10 2022-08-30 海信视像科技股份有限公司 Display device and sound picture synchronization method
CN115022701A (en) * 2022-05-30 2022-09-06 北京达佳互联信息技术有限公司 Video playing method, terminal, device, electronic equipment, medium and program product
CN115022701B (en) * 2022-05-30 2023-09-26 北京达佳互联信息技术有限公司 Video playing method, terminal, device, electronic equipment, medium and program product
CN115460431A (en) * 2022-09-23 2022-12-09 北京爱奇艺科技有限公司 Media stream live broadcasting method, system, computer equipment and storage medium
CN115460431B (en) * 2022-09-23 2023-10-10 北京爱奇艺科技有限公司 Media stream live broadcast method, system, computer device and storage medium

Also Published As

Publication number Publication date
CN111970532B (en) 2022-07-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant