CN108462883A - Live-streaming interaction method, apparatus, terminal device and storage medium - Google Patents

Live-streaming interaction method, apparatus, terminal device and storage medium

Info

Publication number
CN108462883A
CN108462883A (application CN201810014420.3A)
Authority
CN
China
Prior art keywords
video
interactive
client
video stream
live
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810014420.3A
Other languages
Chinese (zh)
Other versions
CN108462883B (en)
Inventor
张国梁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN201810014420.3A (CN108462883B)
Priority to PCT/CN2018/077327 (WO2019134235A1)
Publication of CN108462883A
Application granted
Publication of CN108462883B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/23424 Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement
    • H04N21/238 Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2387 Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316 Content or additional data rendering for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting

Abstract

The invention discloses a live-streaming interaction method, apparatus, terminal device and storage medium. The method includes the following steps executed by the server side: receiving, in real time, the live video stream uploaded by a first client; performing animation composition on the live video stream to obtain a first composite video stream; sending the first composite video stream to a second client for playback; if an interaction instruction sent by the second client is received, obtaining the target reaction video corresponding to that instruction; applying an animation transformation to the target reaction video and compositing the transformed target reaction video with the current live video stream to obtain a second composite video stream; and sending the second composite video stream to the second client for playback. The technical solution of the present invention improves the interactivity between the streamer and each viewer during a live broadcast, increases viewers' enthusiasm and initiative to interact, and raises the level of intelligence of live-streaming interaction.

Description

Live-streaming interaction method, apparatus, terminal device and storage medium
Technical field
The present invention relates to the field of Internet technology, and more particularly to a live-streaming interaction method, apparatus, terminal device and storage medium.
Background technology
Network live streaming draws on and extends the advantages of the Internet by broadcasting over the Web with video signals. Content such as product launches, meetings, background briefings, product reviews, online surveys, interviews, and online training can be published to the Internet on site, exploiting the Internet's intuitiveness, speed, expressiveness, rich content, strong interactivity, and freedom from geographic restrictions to amplify the promotional effect of an event. After a live broadcast ends, replay and on-demand viewing can continue to be offered at any time, effectively extending the broadcast in time and space and maximizing the value of the live content.
Currently, with the rise of the live-streaming industry, live streaming as a communication medium is increasingly popular with the public, and its user base keeps growing. When a streamer broadcasts in a live room, audience participation is ever higher, and viewers interact with the streamer more and more frequently through bullet comments (danmu) or virtual gifts.
However, because a live broadcast is a one-to-many relationship between streamer and viewers, when viewers interact through bullet comments or gifts, the streamer cannot realistically respond to every individual viewer's request, which to some extent limits viewers' initiative to interact.
Summary of the invention
Embodiments of the present invention provide a live-streaming interaction method, apparatus, terminal device and storage medium, to solve the problem that existing one-to-many live-streaming interaction modes leave interaction shallow and give viewers little initiative.
In a first aspect, an embodiment of the present invention provides a live-streaming interaction method, including the following steps executed by the server side:
receiving, in real time, the live video stream uploaded by a first client;
performing animation composition on the live video stream to obtain a first composite video stream;
sending the first composite video stream to a second client for playback;
during playback of the first composite video stream, if an interaction instruction sent by the second client is received, obtaining the target reaction video corresponding to the interaction instruction according to a preset mapping between interaction operations and reaction videos, wherein the interaction instruction is generated by the second client according to a touch operation of the user;
applying an animation transformation to the target reaction video, and compositing the transformed target reaction video with the current live video stream to obtain a second composite video stream;
sending the second composite video stream to the second client for playback.
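As a rough illustration, the server-side steps above can be sketched as follows. This is a minimal sketch under stated assumptions, not the patent's implementation: frames are stood in for by strings, the animation transformation is symbolic, and all names (`compose_animation`, `handle_interaction`, `REACTION_VIDEOS`) are hypothetical.

```python
# Hypothetical sketch of the server-side flow; names and data shapes are illustrative.

REACTION_VIDEOS = {              # preset mapping: interaction instruction -> reaction video
    "flick_ear": "ear_reaction.mp4",
    "stroke_face": "face_reaction.mp4",
}

def compose_animation(live_frames):
    """First composition: pair each live frame with a derived animation frame."""
    return [(f, "anim(" + f + ")") for f in live_frames]

def handle_interaction(instruction, current_live_frames):
    """Second composition: merge the matched reaction video with the live stream."""
    reaction = REACTION_VIDEOS.get(instruction)
    if reaction is None:
        return None                          # unknown instruction: no second stream
    return [(f, reaction) for f in current_live_frames]

first_stream = compose_animation(["frame0", "frame1"])
second_stream = handle_interaction("flick_ear", ["frame2"])
```

In a real system both composition steps would operate on decoded video frames and re-encode the result, as the detailed description below explains.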
In a second aspect, an embodiment of the present invention provides a live-streaming interaction method, including the following steps executed by a second client:
receiving and playing the first composite video stream sent by the server side, so that the first live video contained in the first composite video stream is displayed in the live window of the second client, and the first animation video corresponding to the first live video is displayed in the animation window of the second client;
during playback of the first composite video stream, performing image recognition on frames of the first live video to determine the region of an interactive object within the live window;
if a specific touch operation by the user in the live window is detected, determining the interaction instruction corresponding to that touch operation according to the touch operation and the region of the interactive object;
sending the interaction instruction to the server side;
if a second composite video stream sent by the server side is received, playing the second composite video stream, so that the second live video contained in it is displayed in the live window and the second animation video is displayed in the animation window, wherein the second animation video is the animation video corresponding to the target reaction video obtained according to the interaction instruction.
In a third aspect, an embodiment of the present invention provides a live-streaming interaction apparatus comprising a server side, the server side including:
a live video receiving module, for receiving in real time the live video stream uploaded by a first client;
a first video composition module, for performing animation composition on the live video stream to obtain a first composite video stream;
a first video sending module, for sending the first composite video stream to a second client for playback;
a reaction video matching module, for, during playback of the first composite video stream, if an interaction instruction sent by the second client is received, obtaining the target reaction video corresponding to the interaction instruction according to a preset mapping between interaction operations and reaction videos, wherein the interaction instruction is generated by the second client according to a touch operation of the user;
a second video composition module, for applying an animation transformation to the target reaction video and compositing the transformed target reaction video with the current live video stream to obtain a second composite video stream;
a second video sending module, for sending the second composite video stream to the second client for playback.
In a fourth aspect, an embodiment of the present invention provides a live-streaming interaction apparatus comprising a second client, the second client including:
a first video playback module, for receiving and playing the first composite video stream sent by the server side, so that the live window of the second client displays the first live video contained in the first composite video stream and the animation window of the second client displays the first animation video corresponding to the first live video;
an object recognition module, for performing image recognition on frames of the first live video during playback of the first composite video stream, to determine the region of an interactive object within the live window;
an interaction instruction generation module, for, if a specific touch operation by the user in the live window is detected, determining the interaction instruction corresponding to that touch operation according to the touch operation and the region of the interactive object;
an interaction instruction sending module, for sending the interaction instruction to the server side;
a second video playback module, for, if a second composite video stream sent by the server side is received, playing the second composite video stream, so that the live window displays the second live video contained in it and the animation window displays the second animation video, wherein the second animation video is the animation video corresponding to the target reaction video obtained according to the interaction instruction.
In a fifth aspect, an embodiment of the present invention provides a terminal device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor implements the steps of the live-streaming interaction method when executing the computer program.
In a sixth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, wherein the steps of the live-streaming interaction method are implemented when the computer program is executed by a processor.
Compared with the prior art, the embodiments of the present invention have the following advantages. In the live-streaming interaction method, apparatus, terminal device and storage medium provided by the embodiments, during a broadcast the server side performs animation composition on the live video stream to obtain a first composite video stream and sends it to the second client for playback, so that the live window of the second client displays the live video while the animation window simultaneously displays the corresponding animation video with animation effects. When the user of the second client performs a specific touch operation in the live window, the second client determines the corresponding interaction instruction from that touch operation and the region of the interactive object in the live window, and sends the instruction to the server side. The server side matches the instruction to its corresponding target reaction video, applies an animation transformation to it, and composites the transformed reaction video with the current live video stream to obtain a second composite video stream, which it sends to the second client for playback, so that while the live window continues to show the live video, the animation window shows the animation corresponding to the reaction video. A viewer of the second client who initiates a specific action in the live window thus sees, in the animation window, the animated reaction to that specific action, and the animation shown is related only to that viewer's own action. The one-to-many interaction mode between viewers and streamer is thereby turned into a one-to-one interaction mode, improving the interactivity between the streamer and each viewer during the broadcast, increasing viewers' enthusiasm and initiative to interact, and raising the level of intelligence of live-streaming interaction.
Description of the drawings
To explain the technical solutions of the embodiments of the present invention more clearly, the drawings needed for the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of the network structure of the live-streaming interaction system in the live-streaming interaction method provided by Embodiment 1 of the present invention;
Fig. 2 is a flowchart of the live-streaming interaction method provided by Embodiment 1 of the present invention;
Fig. 3 is a flowchart of the server side establishing the mapping between interaction operations and reaction videos in the live-streaming interaction method provided by Embodiment 1 of the present invention;
Fig. 4 is a flowchart of the server side generating an interaction operation record in the live-streaming interaction method provided by Embodiment 1 of the present invention;
Fig. 5 is a flowchart of step S106 in the live-streaming interaction method provided by Embodiment 1 of the present invention;
Fig. 6 is a schematic diagram of the live-streaming interaction apparatus provided by Embodiment 2 of the present invention;
Fig. 7 is a schematic diagram of the terminal device provided by Embodiment 4 of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Embodiment 1
The live-streaming interaction method of the embodiments of the present invention is applied in a live-streaming interaction system. As shown in Fig. 1, which is a schematic diagram of the network structure of the live-streaming interaction system, the system includes a first client, a second client, and a server side. The first client is the streamer side of the live broadcast, the second client is the viewer side, and multiple second clients may exist simultaneously. Information is exchanged between the first client and the second clients through the server side.
The first client and the second client may specifically be intelligent terminals such as mobile phones, tablet computers, or personal computers (PCs). The server side may specifically be a streaming media server, which may be a single server, a server cluster composed of several servers, or a cloud computing service center.
The first client and the server side, and the second client and the server side, are connected through network communication media, which may specifically be wired, wireless, or the like.
The live-streaming interaction method in the embodiments of the present invention specifically includes the following steps executed by the server side:
receiving, in real time, the live video stream uploaded by the first client;
performing animation composition on the live video stream to obtain a first composite video stream;
sending the first composite video stream to the second client for playback;
during playback of the first composite video stream, if an interaction instruction sent by the second client is received, obtaining the target reaction video corresponding to the interaction instruction according to a preset mapping between interaction operations and reaction videos, wherein the interaction instruction is generated by the second client according to a touch operation of the user;
applying an animation transformation to the target reaction video, and compositing the transformed target reaction video with the current live video stream to obtain a second composite video stream;
sending the second composite video stream to the second client for playback.
The live-streaming interaction method in the embodiments of the present invention further specifically includes the following steps executed by the second client:
receiving and playing the first composite video stream sent by the server side, so that the live window of the second client displays the first live video contained in the first composite video stream and the animation window of the second client displays the first animation video corresponding to the first live video;
during playback of the first composite video stream, performing image recognition on frames of the first live video to determine the region of the interactive object within the live window;
if a specific touch operation by the user in the live window is detected, determining the interaction instruction corresponding to that touch operation according to the touch operation and the region of the interactive object;
sending the interaction instruction to the server side;
if a second composite video stream sent by the server side is received, playing the second composite video stream, so that the live window displays the second live video contained in it and the animation window displays the second animation video, wherein the second animation video is the animation video corresponding to the target reaction video obtained according to the interaction instruction.
Referring to Fig. 2, which shows the implementation flow of the live-streaming interaction method provided by this embodiment, the details are as follows:
S101: The server side receives, in real time, the live video stream uploaded by the first client.
In the embodiments of the present invention, the live video stream is the video stream transmitted to the server side by the streamer of the first client while broadcasting. Specifically, the live video stream may be transmitted via the Real-Time Messaging Protocol (RTMP).
S102: The server side performs animation composition on the live video stream to obtain a first composite video stream.
Specifically, the server side decodes the live video stream to obtain consecutive groups of live video frames, each group containing a preset number of frames. Using a preset animation converter, the server side converts each group of live frames into a group of animation frames with animation effects, then composites the live frame group with the animation frame group to obtain the first composite video stream.
Preferably, the live video stream is an elementary stream (ES) in the H.264 coding format. The server side decodes it with FFmpeg to obtain a YUV bitstream, then converts the YUV bitstream into an RGB bitstream, i.e. the consecutive group of live video frames, and applies the animation converter to that group; the resulting animation frame group is also an RGB bitstream. In the composition step, the server side merges each animation frame with the corresponding live frame according to a preset position relationship, converts the composited RGB bitstream back to YUV, and encodes the YUV bitstream with FFmpeg to obtain the first composite video stream.
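The merge of an animation frame with a live frame "according to a preset position relationship" amounts to pasting one RGB frame onto another at a fixed anchor. A minimal sketch, assuming frames are plain lists of RGB-tuple rows rather than real decoded bitstreams (`merge_frames` and the anchor value are illustrative assumptions, not part of the patent):

```python
def merge_frames(live, anim, anchor=(0, 0)):
    """Paste the animation frame onto a copy of the live frame at a preset
    position (the 'preset position relationship' in the text).
    Frames are lists of rows; each row is a list of (R, G, B) tuples."""
    out = [row[:] for row in live]          # never mutate the live frame
    y0, x0 = anchor
    for dy, row in enumerate(anim):
        for dx, px in enumerate(row):
            out[y0 + dy][x0 + dx] = px
    return out

BLACK, WHITE = (0, 0, 0), (255, 255, 255)
live = [[BLACK] * 4 for _ in range(4)]      # 4x4 black live frame
anim = [[WHITE] * 2 for _ in range(2)]      # 2x2 white animation frame
merged = merge_frames(live, anim, anchor=(0, 2))
```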
S103: The server side sends the first composite video stream to the second client for playback.
Specifically, the server side transfers the first composite video stream to the second client via the RTMP protocol.
S104: The second client receives and plays the first composite video stream sent by the server side, so that the live window of the second client displays the live video and the animation window of the second client displays the first animation video, the first animation video being the animation video corresponding to that live video.
Specifically, the second client receives the first composite video stream sent by the server side via the RTMP protocol and plays it in the playback interface of the second client.
The playback interface of the second client contains a live window and an animation window arranged according to a preset position relationship. The live window displays the first live video from the first client, and the animation window simultaneously displays the first animation video, with animation effects, corresponding to that live video; it is readily understood that the first live video is the current live video stream.
It should be noted that the preset position relationship between the live window and the animation window is the same as the position relationship used by the server side in step S102 when merging animation frames with live frames.
S105: During playback of the first composite video stream, the second client performs image recognition on frames of the live video to determine the region of the interactive object within the live window.
Specifically, during playback the second client decodes the first composite video stream in real time to obtain consecutive video frames in RGB format.
According to the preset position relationship between the live window and the animation window, the second client determines the live region of each video frame, and at every predetermined time interval performs image recognition on the live region of a preset number of frames, so as to determine the region of the interactive object within the live window. Here the interactive object refers to the streamer's face in the live video, including facial features such as the eyes, ears, nose, forehead, and mouth.
Preferably, when performing image recognition on the live region, the second client extracts features from the image of the live region in each frame and matches the extracted features against pre-stored feature templates of each facial feature. If matching succeeds for the preset number of frames, the region of the facial feature within the live region, i.e. the region of the interactive object within the live window, is determined from the image region corresponding to the matched features.
As another preferred implementation of the embodiments of the present invention, since facial features occupy relatively fixed positions on a face, the second client may instead use face recognition to identify the face within the live region, then determine the positions of the facial features from their preset positions on the face, and thereby obtain the region of each feature within the live region, i.e. the region of the interactive object within the live window.
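The second approach, deriving feature positions from their preset positions on a detected face, can be sketched as a proportional mapping from a face bounding box to feature sub-regions. The ratios below are purely illustrative assumptions; a real client would calibrate them or use trained templates:

```python
# Illustrative proportions only: each feature region is expressed as
# (x_frac, y_frac, w_frac, h_frac) relative to the face bounding box.
FEATURE_RATIOS = {
    "eyes":  (0.125, 0.25, 0.75, 0.125),
    "nose":  (0.375, 0.5,  0.25, 0.25),
    "mouth": (0.25,  0.75, 0.5,  0.125),
}

def feature_regions(face_box):
    """Map a detected face bounding box (x, y, w, h) to feature regions
    using preset position ratios, as in the second approach above."""
    x, y, w, h = face_box
    return {
        name: (x + int(fx * w), y + int(fy * h), int(fw * w), int(fh * h))
        for name, (fx, fy, fw, fh) in FEATURE_RATIOS.items()
    }

regions = feature_regions((100, 50, 200, 200))
```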
It should be noted that the predetermined time interval may be set to 300 milliseconds, but is not limited to this; it can be configured according to the needs of the actual application and is not restricted here.
Understandably, the shorter the interval, the more often the second client performs image recognition and the more accurate the result, though frequent recognition reduces execution efficiency; the longer the interval, the fewer recognitions are performed and the lower the accuracy, but the higher the efficiency.
S106: If the second client detects a specific touch operation by the user in the live window, it determines the corresponding interaction instruction according to that touch operation and the region of the interactive object within the live window.
In the embodiments of the present invention, while watching the broadcast, the user of the second client can perform a specific touch operation on the streamer's facial features in the live window. The specific touch operation includes, but is not limited to, operations such as tap, swipe, pinch, scroll, and double-tap, as well as combinations of them.
Different touch operations represent different interactive actions; for example, a "tap" represents a "flick", a "swipe" represents a "stroke", a "pinch" represents a "squeeze", and so on.
Specifically, when the second client detects a specific touch operation by the user in the live window, it identifies the target interactive action that the touch operation represents and obtains the touch position of the operation. From the region of the interactive object determined in step S105, it determines the target interactive object corresponding to that touch position.
The second client then determines the interaction instruction corresponding to the touch operation from the target interactive action and the target interactive object.
For example, if the specific touch operation is a tap performed by the user on the streamer's ear in the live window, the target interactive action is "flick" and the target interactive object is "ear", so the corresponding interaction instruction is "flick the ear".
S107: the second client sends the interactive operation instruction to the server.
Specifically, the second client may send the interactive operation instruction to the server through socket inter-process communication.
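The patent names only socket communication for this step; the length-prefixed JSON framing in the sketch below is an illustrative assumption, not part of the disclosed method.

```python
import json
import socket


def send_instruction(host, port, instruction):
    """Send an interactive operation instruction to the server over a
    TCP socket as a length-prefixed JSON payload (framing and payload
    format are assumptions; the text specifies only socket IPC)."""
    payload = json.dumps(instruction).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        # 4-byte big-endian length prefix, then the JSON body.
        sock.sendall(len(payload).to_bytes(4, "big") + payload)
```

A real implementation would also need reconnection and error handling; the sketch only shows the direction of the data flow from the second client to the server.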
S108: during playback of the first synthetic video stream, if the server receives an interactive operation instruction sent by the second client, it obtains the target reaction video corresponding to that instruction according to the preset mapping between interactive operations and reaction videos.
In this embodiment of the present invention, different interactive operations correspond to different reaction videos. A reaction video may be a video file saved in advance on the server, or one recorded in advance by the anchor user of the first client and uploaded to the server for storage; the mapping between interactive operations and reaction videos is likewise set in advance by the anchor user of the first client and uploaded to the server.
Specifically, upon receiving the interactive operation instruction sent by the second client, the server identifies the interactive operation information carried in the instruction and looks up the corresponding target reaction video in the mapping according to that information.
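The server-side lookup can be sketched with the mapping modeled as a dict; the instruction strings and file paths below are illustrative assumptions.

```python
# Hypothetical preset mapping between interactive operations and
# reaction videos, as configured by the anchor user in advance.
REACTION_MAP = {
    "flick the forehead": "reactions/startled.mp4",
    "flick the ear": "reactions/ear_flick.mp4",
    "pinch the face": "reactions/pinched.mp4",
}


def find_reaction(instruction):
    """Return the target reaction video for an interactive operation
    instruction, or None if the anchor configured no video for it."""
    return REACTION_MAP.get(instruction)
```

Returning None for an unconfigured operation matches the optional reminder flow described later, where the server prompts the anchor to configure a missing reaction video.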
S109: the server performs animation conversion on the target reaction video and synthesizes the converted target reaction video stream with the current live video stream to obtain a second synthetic video stream.
Specifically, the server decodes the video stream of the target reaction video into consecutive target reaction video frame groups, each containing a preset number of frames, and applies a preset animation converter to each group to obtain target reaction animated frame groups with animation effects. Meanwhile, the server decodes the live video stream currently being received into consecutive live video frame groups, each likewise containing the preset number of frames, and synthesizes the live frame groups with the target reaction animated frame groups to obtain the second synthetic video stream.
It should be noted that the specific video synthesis process in this step is the same as in step S102 and is not repeated here.
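As a toy sketch of the frame-group pairing in S109, frames can be modeled as opaque labels, with "synthesis" pairing each live frame with the animation-converted reaction frame destined for the animation window. Real pixel-level compositing and decoding are assumed away.

```python
def compose_second_stream(live_group, reaction_group):
    """Combine a live frame group and a converted reaction frame
    group of the same preset length into composite frames, one dict
    per output frame (a stand-in for real video compositing)."""
    if len(live_group) != len(reaction_group):
        raise ValueError("frame groups must share the preset length")
    return [{"live_window": lf, "animation_window": rf}
            for lf, rf in zip(live_group, reaction_group)]
```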
S110: the server sends the second synthetic video stream to the second client for playback.
Specifically, the server transmits the second synthetic video stream to the second client over the RTMP protocol.
S111: if the second client receives the second synthetic video stream sent by the server, it plays that stream, so that the live video is shown in the live window and the second animated video is shown in the animation window, where the second animated video is the animated video corresponding to the target reaction video obtained according to the interactive operation instruction.
Specifically, the second client plays the received second synthetic video stream: the live video is shown in the live window, and the animated video corresponding to the target reaction video is shown in the animation window.
Taking the interactive operation instruction "flick the ear" as an example, after the user of the second client clicks the anchor's ear in the live window, the second client continues to show the live video in the live window while showing in the animation window the animated video corresponding to the target reaction video for "flick the ear".
It can be understood that, for the user of each second client, the animated video shown in that client's animation window relates only to that user's own specific touch operations; since the users of different second clients perform different touch operations, the animated videos shown in their animation windows also differ.
In the embodiment corresponding to Fig. 1, during the live broadcast the server performs animation synthesis processing on the live video stream to obtain the first synthetic video stream and sends it to the second client for playback, so that the live window of the second client shows the live video while the animation window simultaneously shows the animated video, with animation effects, corresponding to that live video. When the user of the second client performs a specific touch operation in the live window, the second client determines the interactive operation instruction corresponding to that operation from the operation itself and the regions the interactive objects occupy in the live window, and sends the instruction to the server. The server matches the instruction to the corresponding target reaction video, performs animation conversion on it, and synthesizes the converted target reaction video stream with the current live video stream to obtain the second synthetic video stream, which it sends to the second client for playback, so that the live window of the second client continues to show the live video while the animation window shows the animated video corresponding to the target reaction video. A viewer user of the second client who initiates a specific action in the live window can thus see, in the animation window, the animated effect of the reaction video corresponding to that action, and the effect shown relates only to that user's own action. Live interaction between users and the anchor is thereby changed from a one-to-many mode into a one-to-one mode, which improves the live interaction between the anchor and each user during network broadcasting, increases users' enthusiasm and initiative for interaction, and raises the intelligence level of live interaction.
Next, on the basis of the embodiment corresponding to Fig. 2, before the real-time reception of the live video stream uploaded by the first client mentioned in step S101, the server further performs the process of establishing the mapping between interactive operations and reaction videos.
Referring to Fig. 3, which shows the specific implementation flow by which the server establishes the mapping between interactive operations and reaction videos, the details are as follows:
S201: the server sets a preset number of interactive operations.
Specifically, the server presets the given number of interactive operations available in interactive mode. An interactive operation is composed of an interaction action and an interactive object, for example "pinch the face", "flick the forehead", "rub the nose", or "flick the ear".
Interactive mode is a switch for the mode in which the anchor user of the first client interacts one-to-one with the users of the second clients.
It should be noted that the preset number can be chosen according to the practical application and is not restricted here.
S202: the server obtains the interactive operations chosen by the first client and receives the reaction videos the first client uploads under the chosen interactive operations.
Specifically, from the interactive operations provided by the server, the anchor user of the first client chooses those for which reaction videos need to be configured and sets at least one corresponding reaction video for each; a single interactive operation may be given several reaction videos at once.
A reaction video may be a clip the anchor user records independently, or an existing video file the anchor user selects. The anchor user can select the interactive operations to configure according to the needs of the application and send each operation together with its corresponding reaction video to the server.
For example, if the anchor user selects the interactive operation "flick the forehead", the corresponding reaction video uploaded by the anchor user may be a clip showing a startled expression.
S203: the server establishes the mapping between the chosen interactive operations and the reaction videos, and saves the chosen interactive operations in association with the reaction videos.
Specifically, after receiving a chosen interactive operation and its reaction video from the first client, the server establishes the mapping between the two and saves the interactive operation in association with the reaction video.
Reaction videos are saved as reaction video files, which support a variety of common video formats, such as MPEG-1, MPEG-2, MPEG-4, AVI, RM, ASF, and WMV.
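The association step of S203 can be sketched as a dict from operation to a list of uploaded reaction video files, with a simple extension check against the formats listed above. The extension set and function names are illustrative assumptions.

```python
import os

# Assumed file extensions for the formats named in the text.
SUPPORTED_EXTS = {".mpg", ".mpeg", ".mp4", ".avi", ".rm", ".asf", ".wmv"}


def register_reaction(mapping, operation, video_file):
    """Save a chosen interactive operation in association with an
    uploaded reaction video file; one operation may hold several
    videos, matching S202."""
    if os.path.splitext(video_file)[1].lower() not in SUPPORTED_EXTS:
        raise ValueError("unsupported reaction video format")
    mapping.setdefault(operation, []).append(video_file)
    return mapping
```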
Optionally, the anchor user of the first client can turn on interactive mode before starting the broadcast or during it, and select one or more of the interactive operations provided by the server for interacting with the users of the second clients. If a selected interactive operation has not yet been given a corresponding reaction video, the server sends a reaction-video setup reminder to the first client to prompt the anchor user to configure one.
In the embodiment corresponding to Fig. 3, the anchor user of the first client configures reaction videos for the interactive operations preset by the server, so that the server establishes the mapping between interactive operations and reaction videos according to the anchor user's customization. On the one hand, the interaction content is thus customizable by the anchor user; on the other, during the broadcast the users of the second clients can carry out a variety of interactions with the anchor user through different specific actions, which enhances the flexibility and diversity of the interaction modes.
On the basis of the embodiments corresponding to Fig. 2 or Fig. 3, after the second synthetic video stream is sent to the second client for playback as mentioned in step S110, the server further performs the process of generating an interactive operation record.
Referring to Fig. 4, which shows the specific implementation flow by which the server generates the interactive operation record, the details are as follows:
S301: the server obtains the user information of the second client.
Specifically, according to the interactive operation instruction received, the server determines the user information of the second client that sent the instruction.
The user information includes, but is not limited to, the user's nickname and the time at which the user sent the interactive operation instruction.
S302: the server generates an interactive operation record according to the user information of the second client and the interactive operation instruction.
Specifically, the server generates the corresponding interactive operation record from the interactive operation instruction received from the second client and the user information obtained in step S301.
For example, if the nickname of the second client's user is Zhang San, the interactive operation instruction the user sent is "flick the forehead", and the instruction was sent at 9:10, then the interactive operation record generated by the server is "Zhang San flicked the anchor's forehead at 9:10".
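The record generation of S302 amounts to composing a display string from the user info and the instruction; the wording template below is an illustrative assumption based on the Zhang San example.

```python
def make_interaction_record(nickname, instruction, sent_at):
    """Compose an interactive operation record from the user's
    nickname, the interactive operation instruction, and the time
    the instruction was sent (template wording is hypothetical)."""
    return f"{nickname} performed '{instruction}' on the anchor at {sent_at}"
```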
S303: the server sends the interactive operation record to the first client and the second client for display.
Specifically, after the server sends the interactive operation record determined in step S302 to the first client and the second client, both clients receive the record and show it at a preset display position.
Further, the server may also send the interactive operation record to the other second clients participating in the broadcast for display.
In the embodiment corresponding to Fig. 4, the server generates the interactive operation record from the user information of the second client and the interactive operation instruction, and sends the record to the first client and the second client for real-time display, so that the anchor user of the first client and the user of the second client can learn the content of the interaction in time. Meanwhile, sending the record to the other second clients participating in the broadcast for display also helps publicize the interactive operation method to other users, raising their enthusiasm and initiative to join the interaction.
On the basis of the embodiment corresponding to Fig. 2, a specific embodiment is used below to describe in detail the concrete implementation, mentioned in step S106, of determining the interactive operation instruction corresponding to a specific touch operation from that operation and the regions of the interactive objects when the second client detects the user's specific touch operation in the live window.
Referring to Fig. 5, which shows the specific implementation flow of step S106 provided by this embodiment of the present invention, the details are as follows:
S401: if the second client detects the user's specific touch operation in the live window, it determines the target interaction action corresponding to that operation according to the preset correspondence between touch operations and interaction actions.
Specifically, the second client presets the correspondence between touch operations and interaction actions, which defines the different interaction actions that different touch operations represent; for example, a click represents a flick, a slide represents a stroke, a zoom represents a pinch, and so on.
When the second client detects the user's specific touch operation in the live window, it identifies the operation and, according to the preset correspondence between touch operations and interaction actions, determines the target interaction action corresponding to that operation. For example, if the user's specific touch operation in the live window is a slide, the second client determines from the preset correspondence that the interaction action corresponding to "slide" is "stroke".
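The preset correspondence of S401 can be modeled directly as a dict using the pairs named in the text; the key and value strings are English renderings and thus assumptions.

```python
# Preset correspondence between touch operations and interaction
# actions, per the examples in the text.
TOUCH_TO_ACTION = {
    "click": "flick",
    "slide": "stroke",
    "zoom": "pinch",
}


def target_interaction_action(touch_op):
    """Return the target interaction action for a detected touch
    operation, or None for touch types with no preset entry."""
    return TOUCH_TO_ACTION.get(touch_op)
```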
S402: the second client obtains the target position of the specific touch operation within the live window.
Specifically, while identifying the specific touch operation, the second client also obtains the time at which the operation occurred and its position on the display screen of the second client.
From the obtained position information and the position of the live window on the display screen, the second client determines the target position of the specific touch operation within the live window.
Preferably, the second client determines the target position of the specific touch operation within the live window through a coordinate-system conversion between display-screen coordinates and live-window coordinates. For example, if the user of the second client performs a click at display-screen coordinates (200, 100), and the top-left vertex of the live window lies at display-screen coordinates (100, 10), then after conversion the target position of the user's click within the live window is (100, 90).
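The coordinate conversion above reduces to subtracting the window origin, as a minimal sketch shows:

```python
def to_window_coords(screen_point, window_origin):
    """Convert a touch point from display-screen coordinates to
    live-window coordinates by subtracting the screen position of
    the window's top-left vertex."""
    sx, sy = screen_point
    ox, oy = window_origin
    return (sx - ox, sy - oy)
```

With the worked example from the text, a click at (200, 100) on a screen whose live window starts at (100, 10) lands at (100, 90) in window coordinates.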
S403: the second client matches the target position against the regions the interactive objects occupy in the live window to obtain the target interactive object corresponding to the target position.
Specifically, in step S105 the second client determined the region each interactive object occupies in the live window by performing image recognition on the images of the live video. The second client therefore matches the target position of the specific touch operation obtained in step S402 against the region of each interactive object; if the starting coordinates of the target position fall within the region of an interactive object, the match succeeds, and that interactive object is taken as the target interactive object corresponding to the target position.
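The matching step of S403 is a hit test of the target position against each object's region; the (x, y, w, h) region representation below is an assumption.

```python
def hit_test(target_position, regions):
    """Match a live-window touch position against each interactive
    object's region (x, y, w, h); return the first object whose
    region contains the point, or None if no match succeeds."""
    px, py = target_position
    for name, (x, y, w, h) in regions.items():
        if x <= px <= x + w and y <= py <= y + h:
            return name
    return None
```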
S404: the second client determines the interactive operation instruction corresponding to the specific touch operation according to the target interaction action and the target interactive object.
Specifically, the second client combines the target interaction action determined in step S401 with the target interactive object obtained in step S403 to obtain the interactive operation instruction corresponding to the touch operation.
For example, if the specific touch operation is a click performed by the user on the anchor's forehead in the live window, the target interaction action is "flick" and the target interactive object is "forehead", so combining "flick" and "forehead" yields the interactive operation instruction "flick the forehead".
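The whole of S401–S404 can be condensed into one self-contained sketch: map the touch type to an action, hit-test the window position against the object regions, then combine action and object into the instruction. All names and region values are illustrative assumptions.

```python
# Assumed preset correspondence (S401).
TOUCH_TO_ACTION = {"click": "flick", "slide": "stroke", "zoom": "pinch"}


def resolve_instruction(touch_op, target_position, regions):
    """Return the interactive operation instruction for a touch at
    target_position (live-window coords) over the given object
    regions, or None if either lookup fails."""
    action = TOUCH_TO_ACTION.get(touch_op)  # S401
    px, py = target_position
    obj = next((name for name, (x, y, w, h) in regions.items()
                if x <= px <= x + w and y <= py <= y + h), None)  # S403
    if action is None or obj is None:
        return None
    return f"{action} the {obj}"  # S404
```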
In the embodiment corresponding to Fig. 5, the second client determines the target interaction action by identifying the specific touch operation, determines the target interactive object by matching the target position of the operation within the live window against the regions the interactive objects occupy, and then determines the interactive operation instruction corresponding to the operation from the target interactive object and the target interaction action. The second client can thus convert the user's specific touch operation promptly and accurately into the corresponding interactive operation instruction, so that the server can match the instruction to the corresponding target reaction video and the animation window of the second client can play the animated video corresponding to that reaction video in time. A user of the second client can therefore interact with the anchor user of the first client through simple touch operations, which, while improving live interaction, increases users' enthusiasm and initiative for interaction and raises the intelligence level of live interaction.
It should be understood that the step numbers in the above embodiments do not imply any order of execution; the execution order of each process should be determined by its function and internal logic, and it places no limitation on the implementation of the embodiments of the present invention.
Embodiment 2
Corresponding to the live interaction method of Embodiment 1, Fig. 6 shows a live interaction apparatus in one-to-one correspondence with that method. The live interaction apparatus includes a server, a first client, and a second client. For ease of description, only the parts relevant to this embodiment of the present invention are shown.
As shown in Fig. 6, the server of the live interaction apparatus includes a live video receiving module 611, a first video synthesis module 612, a first video sending module 613, a reaction video matching module 614, a second video synthesis module 615, and a second video sending module 616. Each functional module is described in detail as follows:
Live video receiving module 611, configured to receive in real time the live video stream uploaded by the first client;
First video synthesis module 612, configured to perform animation synthesis processing on the live video stream to obtain the first synthetic video stream;
First video sending module 613, configured to send the first synthetic video stream to the second client for playback;
Reaction video matching module 614, configured to, during playback of the first synthetic video stream, if an interactive operation instruction sent by the second client is received, obtain the target reaction video corresponding to that instruction according to the preset mapping between interactive operations and reaction videos, where the interactive operation instruction is generated by the second client according to the user's touch operation;
Second video synthesis module 615, configured to perform animation conversion on the target reaction video and synthesize the converted target reaction video with the current live video stream to obtain the second synthetic video stream;
Second video sending module 616, configured to send the second synthetic video stream to the second client for playback.
Further, the server of the live interaction apparatus also includes:
Setup module 601, configured to set a preset number of interactive operations;
Reaction video receiving module 602, configured to obtain the interactive operations chosen by the first client and receive the reaction videos the first client uploads under the chosen interactive operations;
Mapping module 603, configured to establish the mapping between the interactive operations chosen by the first client and the reaction videos, and to save the chosen interactive operations in association with the reaction videos.
Further, the server of the live interaction apparatus also includes:
Information obtaining module 617, configured to obtain the user information of the second client;
Record generation module 618, configured to generate the interactive operation record according to the user information and the interactive operation instruction;
Display module 619, configured to send the interactive operation record to the first client and the second client for display.
Continuing to refer to Fig. 6, the second client of the live interaction apparatus includes a first video playback module 621, an object identification module 622, an interaction instruction generation module 623, an interaction instruction sending module 624, and a second video playback module 625. Each functional module is described in detail as follows:
First video playback module 621, configured to receive and play the first synthetic video stream sent by the server, so that the live window of the second client shows the first live video contained in the first synthetic video stream and the animation window of the second client shows the first animated video corresponding to that live video;
Object identification module 622, configured to, during playback of the first synthetic video stream, perform image recognition on the images of the first live video to determine the regions the interactive objects occupy in the live window;
Interaction instruction generation module 623, configured to, if the user's specific touch operation in the live window is detected, determine the interactive operation instruction corresponding to that touch operation according to the operation and the regions of the interactive objects;
Interaction instruction sending module 624, configured to send the interactive operation instruction to the server;
Second video playback module 625, configured to, if the second synthetic video stream sent by the server is received, play that stream, so that the live window shows the second live video contained in the second synthetic video stream and the animation window shows the second animated video, where the second animated video is the animated video corresponding to the target reaction video obtained according to the interactive operation instruction.
Further, the interaction instruction generation module 623 includes:
Interaction action determination unit 6231, configured to, if the user's specific touch operation in the live window is detected, determine the target interaction action corresponding to that touch operation according to the preset correspondence between touch operations and interaction actions;
Target position obtaining unit 6232, configured to obtain the target position of the specific touch operation within the live window;
Interactive object determination unit 6233, configured to match the target position against the regions of the interactive objects to obtain the target interactive object corresponding to the target position;
Interaction instruction determination unit 6234, configured to determine the interactive operation instruction corresponding to the specific touch operation according to the target interaction action and the target interactive object.
For the process by which each module/unit of the live interaction apparatus provided by this embodiment realizes its function, reference may be made to the description of Embodiment 1 above, which is not repeated here.
Embodiment 3
This embodiment provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements the live interaction method of Embodiment 1; alternatively, when executed by a processor, it realizes the functions of each module/unit of the live interaction apparatus of Embodiment 2. To avoid repetition, the details are not described again here.
Embodiment 4
Fig. 7 is a schematic diagram of the terminal device provided by an embodiment of the present invention. As shown in Fig. 7, the terminal device 7 of this embodiment includes a processor 71, a memory 72, and a computer program 73, such as a live interaction program, stored in the memory 72 and runnable on the processor 71. When executing the computer program 73, the processor 71 implements the steps of the live interaction method of Embodiment 1 above, such as steps S101 to S111 shown in Fig. 1; alternatively, when executing the computer program 73, the processor 71 realizes the functions of each module/unit of the live interaction apparatus of Embodiment 2 above, such as the functions of modules 611 to 616 of the server and modules 621 to 625 of the second client shown in Fig. 6.
Illustratively, the computer program 73 may be divided into one or more modules/units, which are stored in the memory 72 and executed by the processor 71 to accomplish the present invention. The one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, the instruction segments describing the execution of the computer program 73 in the terminal device 7. For example, the computer program 73 may be divided into a live video receiving module, a first video synthesis module, a first video sending module, a reaction video matching module, a second video synthesis module, and a second video sending module. Each functional module is described in detail as follows:
Live video receiving module, configured to receive in real time the live video stream uploaded by the first client;
First video synthesis module, configured to perform animation synthesis processing on the live video stream to obtain the first synthetic video stream;
First video sending module, configured to send the first synthetic video stream to the second client for playback;
Reaction video matching module, configured to, during playback of the first synthetic video stream, if an interactive operation instruction sent by the second client is received, obtain the target reaction video corresponding to that instruction according to the preset mapping between interactive operations and reaction videos, where the interactive operation instruction is generated by the second client according to the user's touch operation;
Second video synthesis module, configured to perform animation conversion on the target reaction video and synthesize the converted target reaction video with the current live video stream to obtain the second synthetic video stream;
Second video sending module, configured to send the second synthetic video stream to the second client for playback.
Further, the computer program 73 may also be divided so as to include:
Setup module, configured to set a preset number of interactive operations;
Reaction video receiving module, configured to obtain the interactive operations chosen by the first client and receive the reaction videos the first client uploads under the chosen interactive operations;
Mapping module, configured to establish the mapping between the interactive operations chosen by the first client and the reaction videos, and to save the chosen interactive operations in association with the reaction videos.
Further, the computer program 73 may also be divided so as to include:
Information obtaining module, configured to obtain the user information of the second client;
Record generation module, configured to generate the interactive operation record according to the user information and the interactive operation instruction;
Display module, configured to send the interactive operation record to the first client and the second client for display.
The computer program 73 may also be divided into a first video playback module, an object identification module, an interaction instruction generation module, an interaction instruction sending module, and a second video playback module. The functions of these modules are detailed as follows:
The first video playback module is configured to receive and play the first synthetic video stream sent by the server, so that the live-streaming window of the second client displays the first live video included in the first synthetic video stream, and the animation window of the second client displays the first animated video corresponding to the first live video.
The object identification module is configured to, during playback of the first synthetic video stream, perform image recognition on images of the first live video to determine the region locations of the interactive objects in the live-streaming window.
The interaction instruction generation module is configured to, if a specific touch operation of the user in the live-streaming window is detected, determine the interactive operation instruction corresponding to the specific touch operation according to the specific touch operation and the region locations of the interactive objects.
The interaction instruction sending module is configured to send the interactive operation instruction to the server.
The second video playback module is configured to, upon receiving a second synthetic video stream sent by the server, play the second synthetic video stream, so that the live-streaming window displays the second live video included in the second synthetic video stream and the animation window displays a second animated video, wherein the second animated video is the animated video corresponding to the target reaction video obtained according to the interactive operation instruction.
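The client-side playback loop with per-frame object identification can be sketched as follows. The recognizer is a stub returning fixed boxes; a real client would run a face or landmark detection model, and all names and coordinates here are illustrative assumptions.

```python
# Client-side sketch: while the first synthetic stream plays, each frame of
# the first live video passes through image recognition so the client always
# holds up-to-date region locations of the interactive objects.

def recognize_objects(frame):
    """Object identification module (stub): regions are (x, y, width, height)
    boxes expressed in live-streaming-window coordinates."""
    return {"head": (120, 40, 80, 80), "hand": (30, 200, 60, 60)}

def playback_loop(frames):
    regions = {}
    for frame in frames:
        regions = recognize_objects(frame)  # refreshed every frame
        # ... render the frame in the live-streaming window ...
    return regions

latest = playback_loop(frames=[{"frame": 0}, {"frame": 1}])
print(sorted(latest))  # ['hand', 'head']
```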
Further, the interaction instruction generation module includes:
an interactive action determination unit, configured to, if a specific touch operation of the user in the live-streaming window is detected, determine the target interactive action corresponding to the specific touch operation according to preset correspondences between touch operations and interactive actions;
a target location obtaining unit, configured to obtain the target location of the specific touch operation in the live-streaming window;
an interactive object determination unit, configured to match the target location with the region locations of the interactive objects to obtain the target interactive object corresponding to the target location;
an interaction instruction determination unit, configured to determine the interactive operation instruction corresponding to the specific touch operation according to the target interactive action and the target interactive object.
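The four units above can be illustrated together in one short sketch: map the touch type to a target interactive action, hit-test the touch position against the recognized object regions, and combine the two into the instruction. The touch-to-action table, the region data, and all names are illustrative assumptions.

```python
# Sketch of interaction-instruction generation from a specific touch operation.

TOUCH_TO_ACTION = {"double_tap": "pat", "long_press": "poke"}  # preset correspondences

def hit_test(pos, regions):
    """Interactive object determination unit: match the target location
    against the region locations (x, y, width, height)."""
    x, y = pos
    for name, (rx, ry, rw, rh) in regions.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return name
    return None

def make_instruction(touch_type, pos, regions):
    action = TOUCH_TO_ACTION.get(touch_type)  # target interactive action
    target = hit_test(pos, regions)           # target interactive object
    if action is None or target is None:
        return None                           # not a specific touch operation
    return {"action": action, "object": target}

regions = {"head": (120, 40, 80, 80), "hand": (30, 200, 60, 60)}
print(make_instruction("double_tap", (150, 60), regions))
# {'action': 'pat', 'object': 'head'}
```

The resulting instruction is what the interaction instruction sending module forwards to the server, where it keys the reaction-video lookup.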
The terminal device 7 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device. The terminal device 7 may include, but is not limited to, a processor 71, a memory 72, and a computer program 73. Those skilled in the art will understand that Fig. 7 is merely an example of the terminal device 7 and does not constitute a limitation on it; the terminal device 7 may include more or fewer components than illustrated, combine certain components, or use different components. For example, the terminal device 7 may also include input/output devices, network access devices, buses, and the like.
The processor 71 may be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 72 may be an internal storage unit of the terminal device 7, such as a hard disk or memory of the terminal device 7. The memory 72 may also be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card equipped on the terminal device 7. Further, the memory 72 may include both an internal storage unit and an external storage device of the terminal device 7. The memory 72 is used to store the computer program and other programs and data required by the terminal device 7, and may also be used to temporarily store data that has been output or is to be output.
Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division into the above functional units and modules is illustrated as an example. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the device may be divided into different functional units or modules to complete all or part of the functions described above.
In addition, the functional units in each embodiment of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the flows of the methods of the above embodiments by instructing relevant hardware through a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program can implement the steps of each of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, certain intermediate forms, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and so on. It should be noted that the content included in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in certain jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments, or make equivalent replacements of some of the technical features; such modifications or replacements do not depart the essence of the corresponding technical solutions from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all be included within the protection scope of the present invention.

Claims (10)

1. A live-streaming interaction method, characterized in that the live-streaming interaction method includes the following steps executed by a server:
receiving, in real time, a live video stream uploaded by a first client;
performing animation synthesis processing on the live video stream to obtain a first synthetic video stream;
sending the first synthetic video stream to a second client for playback;
during playback of the first synthetic video stream, if an interactive operation instruction sent by the second client is received, obtaining a target reaction video corresponding to the interactive operation instruction according to preset mapping relations between interactive operations and reaction videos, wherein the interactive operation instruction is generated by the second client according to a touch operation of a user;
performing animation conversion on the target reaction video, and synthesizing the converted target reaction video with the current live video stream to obtain a second synthetic video stream;
sending the second synthetic video stream to the second client for playback.
2. The live-streaming interaction method according to claim 1, characterized in that, before the receiving in real time the live video stream uploaded by the first client, the live-streaming interaction method further includes the following steps executed by the server:
setting a preset number of interactive operations;
obtaining the interactive operation selected by the first client, and receiving the reaction video uploaded by the first client under the selected interactive operation;
establishing a mapping relation between the selected interactive operation and the reaction video, and storing the selected interactive operation in association with the reaction video.
3. The live-streaming interaction method according to claim 1 or 2, characterized in that, after the sending the second synthetic video stream to the second client for playback, the live-streaming interaction method further includes the following steps executed by the server:
obtaining user information of the second client;
generating an interactive operation record according to the user information and the interactive operation instruction;
sending the interactive operation record to the first client and the second client for display.
4. A live-streaming interaction method, characterized in that the live-streaming interaction method includes the following steps executed by a second client:
receiving and playing a first synthetic video stream sent by a server, so that a live-streaming window of the second client displays a first live video included in the first synthetic video stream, and an animation window of the second client displays a first animated video corresponding to the first live video;
during playback of the first synthetic video stream, performing image recognition on images of the first live video to determine region locations of interactive objects in the live-streaming window;
if a specific touch operation of a user in the live-streaming window is detected, determining an interactive operation instruction corresponding to the specific touch operation according to the specific touch operation and the region locations of the interactive objects;
sending the interactive operation instruction to the server;
if a second synthetic video stream sent by the server is received, playing the second synthetic video stream, so that the live-streaming window displays a second live video included in the second synthetic video stream and the animation window displays a second animated video, wherein the second animated video is an animated video corresponding to a target reaction video obtained according to the interactive operation instruction.
5. The live-streaming interaction method according to claim 4, characterized in that the determining, if a specific touch operation of the user in the live-streaming window is detected, the interactive operation instruction corresponding to the specific touch operation according to the specific touch operation and the region locations of the interactive objects includes:
if the specific touch operation of the user in the live-streaming window is detected, determining a target interactive action corresponding to the specific touch operation according to preset correspondences between touch operations and interactive actions;
obtaining a target location of the specific touch operation in the live-streaming window;
matching the target location with the region locations of the interactive objects to obtain a target interactive object corresponding to the target location;
determining the interactive operation instruction corresponding to the specific touch operation according to the target interactive action and the target interactive object.
6. A live-streaming interaction device, characterized in that the live-streaming interaction device includes a server, and the server includes:
a live video receiving module, configured to receive, in real time, a live video stream uploaded by a first client;
a first video synthesis module, configured to perform animation synthesis processing on the live video stream to obtain a first synthetic video stream;
a first video sending module, configured to send the first synthetic video stream to a second client for playback;
a reaction video matching module, configured to, during playback of the first synthetic video stream, if an interactive operation instruction sent by the second client is received, obtain a target reaction video corresponding to the interactive operation instruction according to preset mapping relations between interactive operations and reaction videos, wherein the interactive operation instruction is generated by the second client according to a touch operation of a user;
a second video synthesis module, configured to perform animation conversion on the target reaction video and synthesize the converted target reaction video with the current live video stream to obtain a second synthetic video stream;
a second video sending module, configured to send the second synthetic video stream to the second client for playback.
7. A live-streaming interaction device, characterized in that the live-streaming interaction device includes a second client, and the second client includes:
a first video playback module, configured to receive and play a first synthetic video stream sent by a server, so that a live-streaming window of the second client displays a first live video included in the first synthetic video stream, and an animation window of the second client displays a first animated video corresponding to the first live video;
an object identification module, configured to, during playback of the first synthetic video stream, perform image recognition on images of the first live video to determine region locations of interactive objects in the live-streaming window;
an interaction instruction generation module, configured to, if a specific touch operation of a user in the live-streaming window is detected, determine an interactive operation instruction corresponding to the specific touch operation according to the specific touch operation and the region locations of the interactive objects;
an interaction instruction sending module, configured to send the interactive operation instruction to the server;
a second video playback module, configured to, if a second synthetic video stream sent by the server is received, play the second synthetic video stream, so that the live-streaming window displays a second live video included in the second synthetic video stream and the animation window displays a second animated video, wherein the second animated video is an animated video corresponding to a target reaction video obtained according to the interactive operation instruction.
8. The live-streaming interaction device according to claim 7, characterized in that the interaction instruction generation module includes:
an interactive action determination unit, configured to, if a specific touch operation of the user in the live-streaming window is detected, determine a target interactive action corresponding to the specific touch operation according to preset correspondences between touch operations and interactive actions;
a target location obtaining unit, configured to obtain a target location of the specific touch operation in the live-streaming window;
an interactive object determination unit, configured to match the target location with the region locations of the interactive objects to obtain a target interactive object corresponding to the target location;
an interaction instruction determination unit, configured to determine the interactive operation instruction corresponding to the specific touch operation according to the target interactive action and the target interactive object.
9. A terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that, when executing the computer program, the processor implements the steps of the live-streaming interaction method according to any one of claims 1 to 3, or the processor implements the steps of the live-streaming interaction method according to claim 4 or 5.
10. A computer-readable storage medium storing a computer program, characterized in that, when executed by a processor, the computer program implements the steps of the live-streaming interaction method according to any one of claims 1 to 3, or the computer program implements the steps of the live-streaming interaction method according to claim 4 or 5.
CN201810014420.3A 2018-01-08 2018-01-08 A kind of living broadcast interactive method, apparatus, terminal device and storage medium Active CN108462883B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810014420.3A CN108462883B (en) 2018-01-08 2018-01-08 A kind of living broadcast interactive method, apparatus, terminal device and storage medium
PCT/CN2018/077327 WO2019134235A1 (en) 2018-01-08 2018-02-27 Live broadcast interaction method and apparatus, and terminal device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810014420.3A CN108462883B (en) 2018-01-08 2018-01-08 A kind of living broadcast interactive method, apparatus, terminal device and storage medium

Publications (2)

Publication Number Publication Date
CN108462883A true CN108462883A (en) 2018-08-28
CN108462883B CN108462883B (en) 2019-10-18

Family

ID=63221253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810014420.3A Active CN108462883B (en) 2018-01-08 2018-01-08 A kind of living broadcast interactive method, apparatus, terminal device and storage medium

Country Status (2)

Country Link
CN (1) CN108462883B (en)
WO (1) WO2019134235A1 (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103634681A (en) * 2013-11-29 2014-03-12 腾讯科技(成都)有限公司 Method, device, client end, server and system for live broadcasting interaction
CN105959718A (en) * 2016-06-24 2016-09-21 乐视控股(北京)有限公司 Real-time interaction method and device in video live broadcasting
CN106131692A (en) * 2016-07-14 2016-11-16 广州华多网络科技有限公司 Interactive control method based on net cast, device and server
CN106231368A (en) * 2015-12-30 2016-12-14 深圳超多维科技有限公司 Main broadcaster's class interaction platform stage property rendering method and device, client
CN106231434A (en) * 2016-07-25 2016-12-14 武汉斗鱼网络科技有限公司 A kind of living broadcast interactive specially good effect realization method and system based on Face datection
CN106878820A (en) * 2016-12-09 2017-06-20 北京小米移动软件有限公司 Living broadcast interactive method and device
CN107071580A (en) * 2017-03-20 2017-08-18 北京潘达互娱科技有限公司 Data processing method and device
CN107360160A (en) * 2017-07-12 2017-11-17 广州华多网络科技有限公司 live video and animation fusion method, device and terminal device


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11528535B2 (en) 2018-11-19 2022-12-13 Tencent Technology (Shenzhen) Company Limited Video file playing method and apparatus, and storage medium
WO2020103657A1 (en) * 2018-11-19 2020-05-28 腾讯科技(深圳)有限公司 Video file playback method and apparatus, and storage medium
CN110113256A (en) * 2019-05-14 2019-08-09 北京达佳互联信息技术有限公司 Information interaction method, device, server, user terminal and readable storage medium storing program for executing
CN110740346A (en) * 2019-10-23 2020-01-31 北京达佳互联信息技术有限公司 Video data processing method, device, server, terminal and storage medium
CN110989910A (en) * 2019-11-28 2020-04-10 广州虎牙科技有限公司 Interaction method, system, device, electronic equipment and storage medium
CN113038174A (en) * 2019-12-09 2021-06-25 上海幻电信息科技有限公司 Live video interaction method and device and computer equipment
CN113038149A (en) * 2019-12-09 2021-06-25 上海幻电信息科技有限公司 Live video interaction method and device and computer equipment
US11889127B2 (en) 2019-12-09 2024-01-30 Shanghai Hode Information Technology Co., Ltd. Live video interaction method and apparatus, and computer device
US11778263B2 (en) 2019-12-09 2023-10-03 Shanghai Hode Information Technology Co., Ltd. Live streaming video interaction method and apparatus, and computer device
CN113448475A (en) * 2021-06-30 2021-09-28 广州博冠信息科技有限公司 Interaction control method and device for virtual live broadcast room, storage medium and electronic equipment
CN113542844A (en) * 2021-07-28 2021-10-22 北京优酷科技有限公司 Video data processing method, device and storage medium
CN114051151A (en) * 2021-11-23 2022-02-15 广州博冠信息科技有限公司 Live broadcast interaction method and device, storage medium and electronic equipment
CN114051151B (en) * 2021-11-23 2023-11-28 广州博冠信息科技有限公司 Live interaction method and device, storage medium and electronic equipment
CN114501103A (en) * 2022-01-25 2022-05-13 腾讯科技(深圳)有限公司 Interaction method, device and equipment based on live video and storage medium
CN114501103B (en) * 2022-01-25 2023-05-23 腾讯科技(深圳)有限公司 Live video-based interaction method, device, equipment and storage medium

Also Published As

Publication number Publication date
WO2019134235A1 (en) 2019-07-11
CN108462883B (en) 2019-10-18

Similar Documents

Publication Publication Date Title
CN108462883B (en) A kind of living broadcast interactive method, apparatus, terminal device and storage medium
CN110784752B (en) Video interaction method and device, computer equipment and storage medium
CN109922377B (en) Play control method and device, storage medium and electronic device
CN110213601A (en) A kind of live broadcast system and live broadcasting method based on cloud game, living broadcast interactive method
US8990842B2 (en) Presenting content and augmenting a broadcast
CN108010037B (en) Image processing method, device and storage medium
JP6550156B2 (en) Live streaming video generation method and apparatus, live service providing method and apparatus, and live streaming system
CN103067776B (en) Program push method, system and intelligent display device, cloud server
CN104869438B (en) The cloud dissemination method of live video cloud delivery system based on mobile terminal
CN110267053A (en) Live broadcasting method, apparatus and system
WO2019057034A1 (en) Method, device, storage medium and electronic device for determining video segment
CN104918061B (en) A kind of recognition methods of television channel and system
CN108174233A (en) A kind of live broadcasting method, device, server and medium
US20200404345A1 (en) Video system and video processing method, device and computer readable medium
CN109874059A (en) Method for showing interface, client and storage medium, computer equipment is broadcast live
US20170111667A1 (en) System and method for distributing media content associated with an event
WO2023083186A1 (en) Live streaming content processing method, electronic device, readable storage medium, and computer program product
CN109361954B (en) Video resource recording method and device, storage medium and electronic device
CN106375859B (en) A kind of media processing method, device and terminal
CN108040038A (en) The live method of network interdynamic, network main broadcaster end and user terminal
CN106162357A (en) Obtain the method and device of video content
CN111629222B (en) Video processing method, device and storage medium
CN113630630A (en) Method, device and equipment for processing dubbing information of video commentary
CN108289231A (en) A kind of panorama player of fusion
CN106375784A (en) Method and apparatus for commenting program and receiving comment information of program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant