CN107360160A - live video and animation fusion method, device and terminal device - Google Patents

Live video and animation fusion method, device and terminal device

Info

Publication number
CN107360160A
CN107360160A
Authority
CN
China
Prior art keywords
animation
video stream
data
superimposed
upper layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710564127.XA
Other languages
Chinese (zh)
Inventor
库宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Huaduo Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huaduo Network Technology Co Ltd filed Critical Guangzhou Huaduo Network Technology Co Ltd
Priority to CN201710564127.XA
Publication of CN107360160A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/61 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238 Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/437 Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Abstract

The embodiments of the present invention disclose a live video and animation fusion method, device and terminal device. The live video and animation fusion method comprises the following steps: reading video data recorded at the anchor end and pre-processing it to obtain an original video stream to be uploaded; in response to an animation fusion instruction for the original video stream, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream to obtain a composite video stream; and uploading the composite video stream to a predetermined server so that the composite video stream is pushed to the live room. By fusing the original video stream with the animation, implementations of the present invention can effectively enhance the video interaction effect in network live streaming and improve the user experience.

Description

Live video and animation fusion method, device and terminal device
Technical field
The present invention relates to the field of media transmission technology, and in particular to a live video and animation fusion method, device and terminal device.
Background art
With the growing popularity of Internet technology and the development of streaming media technology and terminal devices, the network live streaming industry has matured rapidly and major live streaming platforms have sprung up one after another. Network live streaming is being accepted by an ever-larger mainstream audience: with its novel format and convenient services it brings netizens a brand-new viewing experience and visual impact, and the two-way, interactive nature of the network gives the audience due participation, which makes it especially popular among young netizens.
The most distinctive feature of network live streaming is interaction. Because the broadcast takes place on a network platform, the audience's autonomous choice and participation have been greatly extended. The interaction mode of network live streaming has also evolved from plain text to text with images, then to voice, and has now entered the era of video interaction, for example having the server write watermarks into the video stream, or letting viewers add bullet-screen (barrage) comments to the video stream by posting comments at the viewer end.
However, in the prior art, the anchor end often sends the video stream to the server for rendering, and the server then delivers it to the live room. This undoubtedly increases the burden on the server and aggravates data delay, making real-time interaction difficult and degrading the live streaming and interaction effects.
Summary of the invention
An object of the present invention is to address at least one of the above deficiencies by providing a live video and animation fusion method, device and terminal device that can enhance the video interaction effect in network live streaming.
To achieve this object, the present invention adopts the technical solutions of the following aspects:
In a first aspect, an embodiment of the present invention provides a live video and animation fusion method, comprising the following steps:
reading video data recorded at the anchor end and pre-processing it to obtain an original video stream to be uploaded;
in response to an animation fusion instruction for the original video stream, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream to obtain a composite video stream;
uploading the composite video stream to a predetermined server so that the composite video stream is pushed to the live room.
With reference to the first aspect, in a first implementation of the first aspect of the present invention, the animation data includes Flash animation data, and reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream includes:
setting the parameter of the Wmode attribute of a Flash control to Transparent or Opaque, or calling a setmode function in the Flash control;
reading the Flash animation data through the Flash control and superimposing it on the upper layer of the original video stream.
With reference to the first aspect, in a second implementation of the first aspect of the present invention, the animation data includes Gif animation data, and reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream includes:
reading the Gif animation data, and obtaining the number of animation frames of the Gif animation data and the time interval at which adjacent animation frames are displayed;
superimposing each animation frame of the Gif animation data in turn, according to the time interval, at a predetermined position on the upper layer of the original video stream.
With reference to the first or second implementation of the first aspect, in a third implementation of the first aspect of the present invention, the animation data is stored on the local device, and before the step of reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream, the method further includes: in response to the animation fusion instruction or according to a predetermined time period, processing the animation data in at least one of the following ways:
performing an MD5 check on one or more items of animation data, and deleting any animation data whose check value does not match and/or deleting the control corresponding to that animation data in the user interface;
looking up one or more items of animation data according to the path information corresponding to the animation data recorded in a configuration file, and deleting the path information corresponding to any animation data that cannot be found and/or deleting the control corresponding to that animation data in the user interface.
With reference to the third implementation of the first aspect, in a fourth implementation of the first aspect of the present invention, the configuration file is in an encrypted state, and before looking up the one or more items of animation data according to the path information corresponding to the animation data recorded in the configuration file, the method further includes: decrypting the configuration file with a preset algorithm.
With reference to the second implementation of the first aspect, in a fifth implementation of the first aspect of the present invention, superimposing each animation frame of the Gif animation data in turn, according to the time interval, at the predetermined position on the upper layer of the original video stream includes:
obtaining the popularity value of the live room corresponding to the current anchor end;
adjusting the time interval accordingly to the popularity value, and superimposing each animation frame of the Gif animation data in turn at the predetermined position on the upper layer of the original video stream with the adjusted time interval.
With reference to the first or second implementation of the first aspect, in a sixth implementation of the first aspect of the present invention, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream includes:
reading at least two corresponding items of animation data, and superimposing the at least two items of animation data in turn at different predetermined positions on the upper layer of the original video stream; or
reading the corresponding animation data, and upon detecting that a previous item of animation data has not yet finished being superimposed, superimposing the current animation data at a predetermined position on the upper layer of the original video stream such that the currently obtained animation data does not completely cover the previous animation data.
With reference to the second implementation of the first aspect, in a seventh implementation of the first aspect of the present invention, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream further includes:
obtaining the current pixel value of the original video stream, and adjusting the animation data accordingly to the current pixel value so that the pixel value of the animation data matches the current pixel value.
In a second aspect, an embodiment of the present invention provides a live video and animation fusion device, including:
a pre-processing module, configured to read video data recorded on the local device and pre-process it to obtain an original video stream to be uploaded;
a synthesis module, configured to, in response to an animation fusion instruction for the original video stream, read the corresponding animation data according to the animation fusion instruction and superimpose it on the upper layer of the original video stream to obtain a composite video stream;
a pushing module, configured to upload the composite video stream to a predetermined server so that the composite video stream is pushed to the live room.
In a third aspect, an embodiment of the present invention provides a terminal device including a display screen, a memory and one or more processors, the memory being configured to store information including an application program, and the processors being configured to control execution of the application program; when executing the application program, the processor implements the live video and animation fusion method according to the first aspect or any one of its implementations.
Compared with the prior art, the solution of the present invention has at least the following advantages:
First, in the present invention the original video stream and the animation are fused at the anchor end, and the composite video stream is uploaded to the server to be pushed to the live room. During this process the server no longer needs to render the video stream and only needs to push it to the live room, which reduces the bandwidth pressure on the server and effectively improves the video interaction effect.
Second, by fusing the animation data with the video stream, the present invention allows the anchor, whenever a scene during the broadcast calls for a Flash or Gif animation, to simply click the live special-effect function button and select the desired animation in a floating window; the corresponding effect is then fused into the anchor's live video. This alleviates picture delay, makes the broadcast more real-time and vivid, and optimizes the interaction between the anchor and the audience.
In addition, the animation fusion process can easily be adjusted according to the popularity value of the live room corresponding to the anchor end, the animation data used for fusion can be adjusted according to the pixel value of the original video stream, and, when multiple items of animation data are fused, the animation data can be superimposed in turn at predetermined positions on the upper layer of the original video stream. This enriches the interaction modes between the anchor and the audience and improves the interactive enjoyment and the user experience.
Additional aspects and advantages of the present invention will be set forth in part in the following description; they will become apparent from that description or be learned through practice of the present invention.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a live video and animation fusion method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the interface before the live video is fused with an animation according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the interface after the live video is fused with an animation according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the layers when the live video is fused with a Flash animation according to an embodiment of the present invention;
Fig. 5 is a flowchart of fusing the live video with a Gif animation according to an embodiment of the present invention;
Fig. 6 is an application architecture diagram according to an embodiment of the present invention;
Fig. 7 is a structural block diagram of a live video and animation fusion device according to an embodiment of the present invention;
Fig. 8 is a structural schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed description of the embodiments
To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings of the embodiments.
Those skilled in the art will appreciate that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs. It should also be understood that terms such as those defined in ordinary dictionaries should be understood to have meanings consistent with their meaning in the context of the prior art, and, unless specifically defined herein, will not be interpreted in an idealized or overly formal sense.
Those skilled in the art will appreciate that the file format of the "Flash animation data" used herein may include the swf format of Flash animations, a file format for publishing vector graphics and animations on the Internet for the Macromedia Flash player that can be parsed efficiently. The file format of the "Gif animation data" used herein is the Gif format of the Graphics Interchange Format; a Gif animation is composed of a group of animation frames (pictures) displayed at a specified time interval from one another.
Those skilled in the art will appreciate that a "terminal device" used herein includes both devices with a wireless signal receiver, i.e. devices with only a non-transmitting, receive-only wireless signal receiver, and devices with receiving and transmitting hardware capable of two-way communication over a bidirectional communication link. The method of the present invention is mainly applied to terminal devices with communication functions such as smart mobile phones, tablet computers or other terminals, and is not restricted to any particular type of operating system.
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings, in which the same or similar reference numerals denote, throughout, the same or similar elements or elements with the same or similar functions. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to Fig. 1 to Fig. 5, a live video and animation fusion method of the present invention comprises the following steps S11-S13:
Step S11: read the video data recorded at the anchor end and pre-process it to obtain an original video stream to be uploaded.
In one embodiment, the terminal device usually has a built-in or external camera, and the anchor end can record video through the camera. In another embodiment, the terminal device is pre-installed with corresponding video recording software, which can record activities in the user interface, such as playing a game or operating software, as video, so that the anchor end can carry out game live streaming, software teaching live streaming and the like. Therefore, the video data recorded at the anchor end may specifically include video data formed in any of the above ways.
After the above video data is read, it can be pre-processed according to a pre-processing instruction from the user or a program, including processing the video data according to a predetermined protocol, to obtain the original video stream to be uploaded. Of course, the above pre-processing may also include obtaining a video algorithm, such as a whitening algorithm or a video special-effect algorithm, from a video algorithm set and processing the video data or the original video stream with the video algorithm.
Step S12: in response to an animation fusion instruction for the original video stream, read the corresponding animation data according to the animation fusion instruction and superimpose it on the upper layer of the original video stream to obtain a composite video stream.
After a user or program triggers the animation fusion instruction for the original video stream obtained at the current anchor end, the corresponding animation data is read according to the animation fusion instruction and superimposed on the upper layer of the original video stream. In one embodiment, referring to Fig. 2, the operation button corresponding to the animation fusion instruction is displayed in the user interface of the anchor end. The animation data corresponding to the animation fusion instruction is generally pre-stored on the local device or cached in real time; reading the animation data and superimposing it on the upper layer of the original video stream yields the composite video stream, which makes the live broadcast more vivid and achieves a better interaction effect. In one embodiment, referring to Fig. 3, the composite video stream obtained after the live video is fused with the animation can also be played back instantly in the user interface.
Specifically, the above animation data may include Flash animation data or Gif animation data. Correspondingly, embodiments of the present invention preferably read the corresponding animation data and superimpose it on the upper layer of the original video stream in one of the following two ways:
first, setting the parameter of the Wmode attribute of a Flash control to Transparent or Opaque, or calling a setmode function in the Flash control, and then reading the Flash animation data through the Flash control and superimposing it on the upper layer of the original video stream;
second, reading the Gif animation data, obtaining the number of animation frames of the Gif animation data and the time interval at which adjacent animation frames are displayed, and superimposing each animation frame of the Gif animation data in turn, according to the time interval, at a predetermined position on the upper layer of the original video stream.
For example, in a possible embodiment of the present invention, referring to Fig. 4, the Flash animation data is played on the upper layer of the original video stream, so that the Flash animation is superimposed as a layer over the live video. Specifically, in order to prevent the live video from being completely covered by the Flash animation, a corresponding mode string can be set in the Wmode attribute of the AXShockWaveFile control, that is, the parameter of the Wmode attribute is set to Transparent or Opaque, or a setmode function is called in the Flash control using VC++, so that the Flash control reads the Flash animation data and superimposes it on the upper layer of the original video stream to obtain the composite video stream.
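As an illustration of this step, a minimal sketch follows. It assumes an MFC wrapper class (here called CShockwaveFlash) generated from the Shockwave Flash ActiveX type library, whose generated methods include SetWMode, SetMovie and Play; the header name and swf path are illustrative, and the setmode variant mentioned above is not shown because its signature is not given in the text.

    // Sketch: put the Flash control into windowless Transparent (or Opaque) mode, then
    // read the Flash animation data and play it over the live video underneath.
    #include "CShockwaveFlash.h"   // wrapper generated from the Flash ActiveX type library (assumed name)
    #include <tchar.h>

    void OverlayFlashAnimation(CShockwaveFlash& flashCtrl)
    {
        flashCtrl.SetWMode(_T("Transparent"));                // keep the original video stream visible below
        flashCtrl.SetMovie(_T("C:\\animations\\gift.swf"));   // locally stored Flash animation data (illustrative path)
        flashCtrl.Play();                                     // the control now renders on the upper layer
    }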
In addition, in other possible embodiments of the present invention, referring to Fig. 5, step S12 may include step S122: after the Gif animation data is read, obtain the number of animation frames of the Gif animation data and the time interval at which adjacent animation frames are displayed, and superimpose each animation frame of the Gif animation data in turn, according to the time interval, at a predetermined position on the upper layer of the original video stream. Specifically, GDI+ can be used to play the Gif animation: the number of animation frames in the Gif animation data is obtained by calling the GetFrameDimensionsCount() function (together with GetFrameCount()), and the time interval between each pair of adjacent animation frames is obtained through the GetPropertyItem() function of the Image object. The currently valid frame data is then set and, according to the obtained time interval, the animation frames of the Gif animation data are displayed one after another at the predetermined position on the upper layer of the original video stream, so that the Gif animation is played on the upper layer of the live video and the composite video stream is obtained.
As an example, assuming the file storage location of the Gif animation data is "F:\byebye.gif", the specific code for playing the Gif animation with GDI+ can be as follows.
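A minimal sketch of such a listing is given below, using only documented GDI+ calls (GetFrameDimensionsCount, GetFrameDimensionsList, GetFrameCount, GetPropertyItemSize, GetPropertyItem); the variable names are illustrative and the timer that actually displays the frames is omitted.

    // Sketch: read the Gif animation data, obtain the number of animation frames and
    // the per-frame display delays that drive the superimposition interval.
    #include <windows.h>
    #include <objidl.h>
    #include <gdiplus.h>
    #include <vector>
    #pragma comment(lib, "gdiplus.lib")
    using namespace Gdiplus;

    int main()
    {
        GdiplusStartupInput startupInput;
        ULONG_PTR token;
        GdiplusStartup(&token, &startupInput, NULL);
        {
            Image gif(L"F:\\byebye.gif");                  // Gif animation data from the example

            // A time-based Gif has one frame dimension: FrameDimensionTime
            UINT dimCount = gif.GetFrameDimensionsCount();
            std::vector<GUID> dims(dimCount);
            gif.GetFrameDimensionsList(dims.data(), dimCount);

            // Number of animation frames in the Gif animation data
            UINT frameCount = gif.GetFrameCount(&dims[0]);

            // Display time of each frame, stored in units of 1/100 second
            UINT propSize = gif.GetPropertyItemSize(PropertyTagFrameDelay);
            std::vector<BYTE> buf(propSize);
            PropertyItem* delays = reinterpret_cast<PropertyItem*>(buf.data());
            gif.GetPropertyItem(PropertyTagFrameDelay, propSize, delays);
            long firstIntervalMs = static_cast<long*>(delays->value)[0] * 10;

            // frameCount and firstIntervalMs would then drive a timer that superimposes
            // each animation frame at the predetermined position on the upper layer.
            (void)frameCount; (void)firstIntervalMs;
        }
        GdiplusShutdown(token);
        return 0;
    }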
On this basis, in order to achieve a better live display or interaction effect, in some embodiments the present invention can also adjust, as actually needed, the time interval and/or the predetermined position used for superimposing the Gif animation data. For example, in one embodiment, superimposing each animation frame of the Gif animation data in turn, according to the time interval, at the predetermined position on the upper layer of the original video stream includes: obtaining the popularity value of the live room corresponding to the current anchor end; adjusting the time interval accordingly to the popularity value, and superimposing each animation frame of the Gif animation data in turn at the predetermined position on the upper layer of the original video stream with the adjusted time interval. Generally, the popularity value of the live room can be obtained from user messages retrieved through the server. The anchor end either adjusts the time interval to the interval mapped by the current popularity value according to a preset mapping between popularity values and time intervals, or, according to a preset mapping between popularity values and delay multiples, adjusts the time interval by multiplying it by the delay multiple mapped by the current popularity value, and the result is used for superimposing the Gif animation data.
In the process of adjusting the time interval according to the popularity value, the mapping between popularity value and time interval may be a positive or a negative correlation. In one possible design of the embodiment of the present invention, for example, the original time interval is 0.3 seconds; when the popularity value is 10,000 the delay multiple is 1.1, so the time interval used for superimposing the Gif animation data is 0.33 seconds; when the popularity value is 20,000 the delay multiple is 1.2, so the time interval used for superimposing the Gif animation data is 0.36 seconds. Adjusting the interval at which the animation frames of the Gif animation data are superimposed according to the popularity value of the live room corresponding to the anchor end can reduce the influence of bandwidth limitations and transmission packet loss on the live display and optimize the live interaction effect. In another possible design of the embodiment of the present invention, adjusting the time interval accordingly to the popularity value includes: when the popularity value is determined to be greater than a preset value, reducing the time interval to 0.5-0.8 times its original value; or, when the popularity value is determined to be greater than a first preset value, reducing the time interval to 0.8 times, and when the popularity value is determined to be greater than a second preset value, reducing it to 0.5 times, the first preset value being smaller than the second preset value. Adjusting the time interval accordingly to the popularity value links the speed at which the animation frames are displayed to the popularity value and improves the live viewing experience. In the above example, the higher the popularity value, the larger the audience of the current live room, and reducing the time interval speeds up the display of the animation frames. On the one hand, in a crowded live room, displaying the animation frames faster helps the audience finish watching them quickly and go on watching and reading what other users have posted, so that they do not miss other users' messages; on the other hand, faster animation helps to build a lively atmosphere and promote interaction, and this effect is more evident in a crowded live room.
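As a concrete illustration of the two designs just described, here is a small sketch; the thresholds and multipliers are taken from the examples above, and the function names are illustrative.

    // Positive-correlation design: map the popularity value to a delay multiple
    double DelayAdjustedInterval(double baseIntervalSec, int popularityValue)
    {
        if (popularityValue >= 20000) return baseIntervalSec * 1.2;   // e.g. 0.3 s -> 0.36 s
        if (popularityValue >= 10000) return baseIntervalSec * 1.1;   // e.g. 0.3 s -> 0.33 s
        return baseIntervalSec;
    }

    // Negative-correlation design: shrink the interval above the preset popularity values
    double ShrunkInterval(double baseIntervalSec, int popularityValue,
                          int firstPreset, int secondPreset)          // firstPreset < secondPreset
    {
        if (popularityValue > secondPreset) return baseIntervalSec * 0.5;
        if (popularityValue > firstPreset)  return baseIntervalSec * 0.8;
        return baseIntervalSec;
    }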
In another embodiment, in order to prevent multiple items of animation data that are read from covering one another and impairing the fusion effect of the live video and the animations, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream includes: reading at least two corresponding items of animation data and superimposing them in turn at different predetermined positions on the upper layer of the original video stream; or reading the corresponding animation data and, upon detecting that a previous item of animation data has not yet finished being superimposed, superimposing the current animation data at a predetermined position on the upper layer of the original video stream such that the currently obtained animation data does not completely cover the previous animation data, which effectively improves the user experience.
In practical application scenarios, the animation data stored on the local device and the currently available original video stream often differ greatly in color, or their colors do not match well. Therefore, further, in one embodiment of the present invention, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream may also include: obtaining the current pixel value of the original video stream, and adjusting the animation data accordingly to the current pixel value so that the pixel value of the animation data matches the current pixel value. Specifically, according to the current pixel value of the original video stream and based on a user-defined or client-preset matching strategy, the pixel values of the animation data to be superimposed are processed so that the pixel value of the animation data matches the current pixel value, producing effects such as fading or transparency changes in the animation superimposed on the upper layer of the original video stream and thereby making the fusion of the live video and the animation more intelligent.
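A rough sketch of one way such matching could be done follows; the text only requires that the pixel values "match", so the mean-luminance gain used here is an assumed strategy and the function name is illustrative.

    #include <algorithm>
    #include <numeric>
    #include <vector>

    // Scale the animation frame's pixels so that their mean matches the mean pixel
    // value of the current original video frame (8-bit samples assumed).
    void MatchAnimationToVideo(std::vector<unsigned char>& animPixels,
                               const std::vector<unsigned char>& videoPixels)
    {
        if (animPixels.empty() || videoPixels.empty()) return;
        double videoMean = std::accumulate(videoPixels.begin(), videoPixels.end(), 0.0) / videoPixels.size();
        double animMean  = std::accumulate(animPixels.begin(),  animPixels.end(),  0.0) / animPixels.size();
        double gain = (animMean > 0.0) ? videoMean / animMean : 1.0;
        for (unsigned char& p : animPixels)
            p = static_cast<unsigned char>(std::min(255.0, p * gain));   // fade/brighten toward the video
    }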
As mentioned above, since the animation data is stored on the local device, in one embodiment the anchor end can perform compatibility processing on the animation data to ensure that the fusion of the live video and the animation runs normally. For example, in order to promote healthy ("green") live streaming and prevent the animation data from being tampered with by the user, including being replaced with illegal files, before reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream, the method further includes: in response to the animation fusion instruction or according to a predetermined time period, processing the animation data in at least one of the following ways:
performing an MD5 check on one or more items of animation data, and deleting any animation data whose check value does not match and/or deleting the control corresponding to that animation data in the user interface;
looking up one or more items of animation data according to the path information corresponding to the animation data recorded in a configuration file, and deleting the path information corresponding to any animation data that cannot be found and/or deleting the control corresponding to that animation data in the user interface.
In addition, the configuration file can also be set to an encrypted state according to actual needs; in that case, before the configuration file is called and the animation data is looked up according to the path information corresponding to the animation data recorded in it, the configuration file is decrypted with a preset algorithm, so that ordinary users cannot open and arbitrarily edit it, which further ensures the security of the configuration file and the animation data.
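The following sketch illustrates this local check. The MD5 routine is left as a caller-supplied function (for example one built on the Windows CryptoAPI), and the configuration-file entry layout is an assumption, since the text specifies neither.

    #include <filesystem>
    #include <functional>
    #include <map>
    #include <string>

    struct AnimationEntry
    {
        std::string path;          // path information recorded in the configuration file
        std::string expectedMd5;   // expected MD5 check value
    };

    // Remove animation data that cannot be found or whose MD5 check value does not match;
    // the corresponding user-interface controls would be removed alongside the entries.
    void VerifyLocalAnimations(std::map<std::string, AnimationEntry>& entries,
                               const std::function<std::string(const std::string&)>& md5Of)
    {
        for (auto it = entries.begin(); it != entries.end(); )
        {
            const AnimationEntry& e = it->second;
            bool missing   = !std::filesystem::exists(e.path);
            bool corrupted = !missing && md5Of(e.path) != e.expectedMd5;
            if (missing || corrupted)
                it = entries.erase(it);   // also delete the entry's control in the UI
            else
                ++it;
        }
    }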
Step S13: upload the composite video stream to a predetermined server so that the composite video stream is pushed to the live room.
After the composite video stream is obtained, it is uploaded to a predetermined server so that it is pushed to the live room. It should be noted that in some embodiments, for example when the present invention is implemented in an auxiliary application at the anchor end, the process of uploading the composite video stream to the predetermined server may also include inter-process communication between at least two processes; specifically, the composite video stream may be transmitted to the anchor end through inter-process communication, after which the anchor end uploads the composite video stream to the server so that it is pushed to the live room. The modes of inter-process communication may include broadcast, interface access, object access and shared access.
The above process of uploading the composite video stream to the predetermined server may further include uploading the composite video stream to the server according to a predetermined rule. The predetermined rule may be a behavioral specification for uploading the composite video stream to the live streaming server. The specific process may be an instruction to check the data integrity of the data packet generated from the composite video stream; this data packet includes not only the composite video stream but may also include an audio stream, broadcast data sent by the anchor during the broadcast, and barrage data. When broadcast data and/or barrage data are present, they are merged into the data packet, and when the upload instruction is executed on the data packet a conversion instruction is triggered to convert the data packet of the composite video stream into an electrical signal suitable for transmission.
In a possible embodiment of the present invention, referring to Fig. 6, the application architecture related to the live video and animation fusion method can be divided into three layers: an application layer, a multimedia interface layer and a multimedia service layer.
The application layer is responsible for the main business logic, including displaying the main function interface of the application program to the user and presenting the interface logic when the user uses a specific function. For example, it may mainly include:
Mix Video Manager (mixed video manager), which provides the function interfaces for related operations while the application program generates the mixed video stream;
Video Container (video container), which holds the related data input or output by the application program;
Mix Audio Manager (mixed audio manager), which provides the function interfaces for related operations while the application program generates the mixed audio stream.
The components of the application layer present different interface layouts, improve the effectiveness of the interface display and provide a more convenient platform for audio and video processing.
In the embodiment of the present invention, the multimedia service layer mainly encapsulates and abstracts the related algorithm classes, including whitening algorithm classes, video special-effect algorithm classes, noise-reduction algorithm classes and so on. A concrete algorithm implementation class is concerned only with what it inputs and outputs, so that the application layer and the multimedia service layer are completely isolated from each other logically. This layer may mainly include: CCaptureVideo (video capture), CAccompanyMusic (accompaniment music), CMediaVideo (media video), CMoodMusic (atmosphere music), CDesktopVideo (desktop video) and CDecorationLayer (decoration layer). Among them, CAccompanyMusic, CMediaVideo and CMoodMusic share common related algorithm classes and are thereby unified; CDecorationLayer mainly abstracts out the related algorithms.
In the embodiment of the present invention, the application layer and the multimedia service layer communicate bidirectionally mainly through the multimedia interface layer, which is mainly responsible for providing the logical communication channel between the two layers. It may mainly include: IDecorationLayerNotify (decoration layer notification), IPlayAccompanyMusic (accompaniment music playback), ICaptureVideoNotify (video capture notification), IPlayMediaVideo (media video playback), IPaintPhoto (picture drawing), IDesktopVideoCB (desktop video), ICaptureVideo (video capture), IAccompanyMusic (accompaniment music), IMediaVideo (media video), IMoodMusic (atmosphere music), IDecorationLayer (decoration layer) and IVcamManger. The distribution of its unit modules simplifies the multimedia interface and reduces the occupation of interface resources.
Those skilled in the art will understand that, because the multimedia core module code (vcambiz) had become increasingly chaotic and redundant, this three-layer application architecture makes subsequent maintenance of the program more convenient and implements an extensible, plug-in multimedia data stream service framework.
From the above disclosure of the live video and animation fusion method of the present invention, it can be seen that implementations of the present invention can, by fusing the original video stream with the animation, effectively enhance the video interaction effect in network live streaming and improve the user experience.
Following the idea of modular design, the present invention further proposes a live video and animation fusion device on the basis of the above live video and animation fusion method.
Referring to Fig. 2 to Fig. 7, a live video and animation fusion device of the present invention includes a pre-processing module 11, a synthesis module 12 and a pushing module 13. The functions of the units are introduced as follows:
The pre-processing module 11 is configured to read video data recorded on the local device and pre-process it to obtain an original video stream to be uploaded.
In one embodiment, the terminal device usually has a built-in or external camera, and the anchor end can record video through the camera. In another embodiment, the terminal device is pre-installed with corresponding video recording software, which can record activities in the user interface, such as playing a game or operating software, as video, so that the anchor end can carry out game live streaming, software teaching live streaming and the like. Therefore, the video data recorded at the anchor end may specifically include video data formed in any of the above ways.
After the pre-processing module 11 reads the above video data, the video data can be pre-processed according to a pre-processing instruction from the user or a program, including processing it according to a predetermined protocol, to obtain the original video stream to be uploaded. Of course, the above pre-processing may also include obtaining a video algorithm, such as a whitening algorithm or a video special-effect algorithm, from a video algorithm set and processing the video data or the original video stream with the video algorithm.
The synthesis module 12 is configured to, in response to an animation fusion instruction for the original video stream, read the corresponding animation data according to the animation fusion instruction and superimpose it on the upper layer of the original video stream to obtain a composite video stream.
After a user or program triggers the animation fusion instruction for the original video stream obtained at the current anchor end, the corresponding animation data is read according to the animation fusion instruction and superimposed on the upper layer of the original video stream. In one embodiment, referring to Fig. 2, the operation button corresponding to the animation fusion instruction is displayed in the user interface of the anchor end. The animation data corresponding to the animation fusion instruction is generally pre-stored on the local device or cached in real time; reading the animation data and superimposing it on the upper layer of the original video stream yields the composite video stream, which makes the live broadcast more vivid and achieves a better interaction effect. In one embodiment, referring to Fig. 3, the composite video stream obtained after the live video is fused with the animation can also be played back instantly in the user interface.
Specifically, the above animation data may include Flash animation data or Gif animation data. Correspondingly, in embodiments of the present invention the synthesis module 12 preferably reads the corresponding animation data and superimposes it on the upper layer of the original video stream in one of the following two ways:
first, setting the parameter of the Wmode attribute of a Flash control to Transparent or Opaque, or calling a setmode function in the Flash control, and then reading the Flash animation data through the Flash control and superimposing it on the upper layer of the original video stream;
second, reading the Gif animation data, obtaining the number of animation frames of the Gif animation data and the time interval at which adjacent animation frames are displayed, and superimposing each animation frame of the Gif animation data in turn, according to the time interval, at a predetermined position on the upper layer of the original video stream.
For example, in a possible embodiment of the present invention, referring to Fig. 4, the Flash animation data is played on the upper layer of the original video stream, so that the Flash animation is superimposed as a layer over the live video. Specifically, in order to prevent the live video from being completely covered by the Flash animation, a corresponding mode string can be set in the Wmode attribute of the AXShockWaveFile control, that is, the parameter of the Wmode attribute is set to Transparent or Opaque, or a setmode function is called in the Flash control using VC++, so that the Flash control reads the Flash animation data and superimposes it on the upper layer of the original video stream to obtain the composite video stream.
In addition, in other possible embodiments of the present invention, referring to Fig. 5, after the Gif animation data is read, the number of animation frames of the Gif animation data and the time interval at which adjacent animation frames are displayed are obtained, and each animation frame of the Gif animation data is superimposed in turn, according to the time interval, at a predetermined position on the upper layer of the original video stream. Specifically, GDI+ can be used to play the Gif animation: the number of animation frames in the Gif animation data is obtained by calling the GetFrameDimensionsCount() function (together with GetFrameCount()), and the time interval between each pair of adjacent animation frames is obtained through the GetPropertyItem() function of the Image object. The currently valid frame data is then set and, according to the obtained time interval, the animation frames of the Gif animation data are displayed one after another at the predetermined position on the upper layer of the original video stream, so that the Gif animation is played on the upper layer of the live video and the composite video stream is obtained.
As an example, assuming the file storage location of the Gif animation data is "F:\byebye.gif", the specific code for playing the Gif animation with GDI+ can be as in the listing given for step S12 above; a complementary sketch of advancing the frames is shown below.
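As a complement to that listing, the following fragment sketches how the frames would then be selected in turn; the timer wiring and the exact overlay position are omitted, and the function name is illustrative.

    #include <windows.h>
    #include <objidl.h>
    #include <gdiplus.h>
    #pragma comment(lib, "gdiplus.lib")
    using namespace Gdiplus;

    // Select the next animation frame along the Gif's time dimension and draw it at the
    // predetermined position on the upper layer; called once per adjusted time interval.
    void ShowNextGifFrame(Image& gif, UINT& currentFrame, UINT frameCount, Graphics& upperLayer)
    {
        gif.SelectActiveFrame(&FrameDimensionTime, currentFrame);
        upperLayer.DrawImage(&gif, 0, 0);                 // (0, 0) stands in for the predetermined position
        currentFrame = (currentFrame + 1) % frameCount;   // advance to the next animation frame
    }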
On this basis, in order to achieve a better live display or interaction effect, in some embodiments the present invention can also adjust, as actually needed, the time interval and/or the predetermined position used for superimposing the Gif animation data. For example, in one embodiment, superimposing each animation frame of the Gif animation data in turn, according to the time interval, at the predetermined position on the upper layer of the original video stream includes: obtaining the popularity value of the live room corresponding to the current anchor end; adjusting the time interval accordingly to the popularity value, and superimposing each animation frame of the Gif animation data in turn at the predetermined position on the upper layer of the original video stream with the adjusted time interval. Generally, the popularity value of the live room can be obtained from user messages retrieved through the server. The anchor end either adjusts the time interval to the interval mapped by the current popularity value according to a preset mapping between popularity values and time intervals, or, according to a preset mapping between popularity values and delay multiples, adjusts the time interval by multiplying it by the delay multiple mapped by the current popularity value, and the result is used for superimposing the Gif animation data.
In the process of adjusting the time interval according to the popularity value, the mapping between popularity value and time interval may be a positive or a negative correlation. In one possible design of the embodiment of the present invention, for example, the original time interval is 0.3 seconds; when the popularity value is 10,000 the delay multiple is 1.1, so the time interval used for superimposing the Gif animation data is 0.33 seconds; when the popularity value is 20,000 the delay multiple is 1.2, so the time interval used for superimposing the Gif animation data is 0.36 seconds. Adjusting the interval at which the animation frames of the Gif animation data are superimposed according to the popularity value of the live room corresponding to the anchor end can reduce the influence of bandwidth limitations and transmission packet loss on the live display and optimize the live interaction effect. In another possible design of the embodiment of the present invention, adjusting the time interval accordingly to the popularity value includes: when the popularity value is determined to be greater than a preset value, reducing the time interval to 0.5-0.8 times its original value; or, when the popularity value is determined to be greater than a first preset value, reducing the time interval to 0.8 times, and when the popularity value is determined to be greater than a second preset value, reducing it to 0.5 times, the first preset value being smaller than the second preset value. Adjusting the time interval accordingly to the popularity value links the speed at which the animation frames are displayed to the popularity value and improves the live viewing experience. In the above example, the higher the popularity value, the larger the audience of the current live room, and reducing the time interval speeds up the display of the animation frames. On the one hand, in a crowded live room, displaying the animation frames faster helps the audience finish watching them quickly and go on watching and reading what other users have posted, so that they do not miss other users' messages; on the other hand, faster animation helps to build a lively atmosphere and promote interaction, and this effect is more evident in a crowded live room.
In another embodiment, in order to prevent multiple items of animation data that are read from covering one another and impairing the fusion effect of the live video and the animations, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream includes: reading at least two corresponding items of animation data and superimposing them in turn at different predetermined positions on the upper layer of the original video stream; or reading the corresponding animation data and, upon detecting that a previous item of animation data has not yet finished being superimposed, superimposing the current animation data at a predetermined position on the upper layer of the original video stream such that the currently obtained animation data does not completely cover the previous animation data, which effectively improves the user experience.
In practical application scenarios, the animation data stored on the local device and the currently available original video stream often differ greatly in color, or their colors do not match well. Therefore, further, in one embodiment of the present invention, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream may also include: obtaining the current pixel value of the original video stream, and adjusting the animation data accordingly to the current pixel value so that the pixel value of the animation data matches the current pixel value. Specifically, according to the current pixel value of the original video stream and based on a user-defined or client-preset matching strategy, the pixel values of the animation data to be superimposed are processed so that the pixel value of the animation data matches the current pixel value, producing effects such as fading or transparency changes in the animation superimposed on the upper layer of the original video stream and thereby making the fusion of the live video and the animation more intelligent.
As mentioned above, since the animation data is stored on the local device, in one embodiment the anchor end can perform compatibility processing on the animation data to ensure that the fusion of the live video and the animation runs normally. For example, in order to promote healthy ("green") live streaming and prevent the animation data from being tampered with by the user, including being replaced with illegal files, before reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream, the method further includes: in response to the animation fusion instruction or according to a predetermined time period, processing the animation data in at least one of the following ways:
performing an MD5 check on one or more items of animation data, and deleting any animation data whose check value does not match and/or deleting the control corresponding to that animation data in the user interface;
looking up one or more items of animation data according to the path information corresponding to the animation data recorded in a configuration file, and deleting the path information corresponding to any animation data that cannot be found and/or deleting the control corresponding to that animation data in the user interface.
In addition, the configuration file can also be set to an encrypted state according to actual needs; in that case, before the configuration file is called and the animation data is looked up according to the path information corresponding to the animation data recorded in it, the configuration file is decrypted with a preset algorithm, so that ordinary users cannot open and arbitrarily edit it, which further ensures the security of the configuration file and the animation data.
The pushing module 13 is configured to upload the synthetic video stream to a predetermined server so that the synthetic video stream is pushed to the live broadcast room.
After the synthetic video stream is obtained, the pushing module 13 uploads it to a predetermined server so that the synthetic video stream is pushed to the live broadcast room. It should be noted that in certain embodiments, for example when the present invention is implemented as an assistant application at the anchor client, the process of uploading the synthetic video stream to the predetermined server may also involve inter-process communication between at least two processes; specifically, the synthetic video stream may be transferred to the anchor client through inter-process communication, and the anchor client then uploads the synthetic video stream to the server so that it is pushed to the live broadcast room. The modes of inter-process communication may include broadcast, interface access, object access, shared access and the like. A sketch of handing the stream to the anchor process over a local channel is given below.
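The sketch below shows the assistant-process side handing encoded frames to the anchor-client process over a local socket. A socket-based channel is only one possible IPC mechanism, chosen here for illustration; the port number and the length-prefixed framing are assumptions not specified by the patent.

import socket
import struct
from typing import Iterable

def send_frame(sock: socket.socket, frame: bytes) -> None:
    """Length-prefix each encoded frame so the receiving process can reassemble the stream."""
    sock.sendall(struct.pack(">I", len(frame)) + frame)

def assistant_side(frames: Iterable[bytes], host: str = "127.0.0.1", port: int = 9500) -> None:
    # The anchor client is assumed to listen on this local port and forward the stream to the server.
    with socket.create_connection((host, port)) as sock:
        for frame in frames:
            send_frame(sock, frame)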
The process in which the pushing module 13 uploads the synthetic video stream to the predetermined server may further include uploading the synthetic video stream to the server according to a predefined rule. The predefined rule may be a behavioural specification for uploading the synthetic video stream to the live broadcast server; a specific process may be a data-integrity check on the data packet generated from the synthetic video stream. The data packet includes not only the synthetic video stream but may also include an audio stream, broadcast data sent by the anchor during the live broadcast, and barrage (bullet-screen) data. When broadcast data and/or barrage data are present, the broadcast data and/or barrage data are merged into the data packet according to a conversion instruction, and when the upload instruction is executed on the data packet, the conversion instruction is triggered so as to convert the packet carrying the synthetic video stream into an electrical signal suitable for transmission. A sketch of assembling such a packet is given below.
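The sketch below assembles an upload packet that bundles the synthetic video payload with optional audio, broadcast and barrage data and a checksum for the integrity check. The field names and the checksum-plus-JSON layout are illustrative assumptions; the patent does not fix a packet format.

import hashlib
import json

def build_packet(video: bytes, audio: bytes = b"", broadcast=None, barrage=None) -> bytes:
    body = {
        "video": video.hex(),
        "audio": audio.hex(),
        "broadcast": broadcast or [],
        "barrage": barrage or [],
    }
    payload = json.dumps(body).encode("utf-8")
    digest = hashlib.md5(payload).hexdigest().encode("ascii")
    # The receiver re-hashes the payload to perform the data-integrity check.
    return digest + b"\n" + payload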
In a possible embodiment of the present invention, referring to Fig. 6, the live video application architecture related to the animation fusing device can be divided into three levels, namely an application layer, a multimedia interface layer and a multimedia service layer.
The application layer is responsible for the main business logic, including displaying the main functional interface of the application program to the user and the interface logic shown when the user uses a specific function. For example, it may mainly include:
Mix Video Manager (mixed video manager), which provides function interfaces for related operations in the process in which the application program generates the mixed video stream;
Video Container (video container), which is used to hold the related data input or output by the application program;
Mix Audio Manager (mixed audio manager), which provides function interfaces for related operations in the process in which the application program generates the mixed audio stream.
The components of the application layer present different interface layouts, improve the effectiveness of the interface display, and provide a more convenient platform for audio and video processing.
In the embodiment of the present invention, the multimedia service layer mainly encapsulates and abstracts the related algorithm classes, including skin-whitening algorithm classes, video special-effect algorithm classes, noise-reduction algorithm classes and the like. Each concrete algorithm implementation class is only concerned with its own input and output, so that the application layer and the multimedia service layer are completely isolated from each other logically. The layer may mainly include: CCapture Video (video capture), CAccompany Music (accompaniment music), CMedia Video (media video), CMood Music (atmosphere music), CDesktop Video (desktop video) and CDecoration Layer (decorative layer). Among them, CAccompany Music, CMedia Video and CMood Music share common related algorithm classes and are implemented in a unified way, while CDecoration Layer mainly abstracts its related algorithms.
In the embodiment of the present invention, the application layer and the multimedia service layer communicate bidirectionally mainly through the multimedia interface layer, which is responsible for providing the logical communication channel between the two layers. It may mainly include: IDecoration Layer Notify (decorative layer notification), IPlay Accompany Music (accompaniment music playback), ICapture Video Notify (video capture notification), IPlay Media Video (media video playback), IPaint Photo (picture drawing), IDesktop Video CB (desktop video), ICapture Video (video capture), IAccompany Music (accompaniment music), IMedia Video (media video), IMood Music (atmosphere music), IDecoration Layer (decorative layer) and IVcam Manger. The distribution of these unit modules simplifies the multimedia interfaces and reduces the occupation of interface resources. A sketch of how such an interface layer isolates the two layers is given below.
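The sketch below illustrates the layered isolation: the application layer depends only on an abstract interface, while the service layer supplies the concrete algorithm class. The class names mirror the patent's naming, but the Python shape is an illustrative assumption (the original implementation is described as C++-style classes).

from abc import ABC, abstractmethod

class IDecorationLayer(ABC):
    """Multimedia interface layer: the logical channel seen by the application layer."""
    @abstractmethod
    def apply(self, frame: bytes) -> bytes: ...

class CDecorationLayer(IDecorationLayer):
    """Multimedia service layer: concrete algorithm class concerned only with its own I/O."""
    def apply(self, frame: bytes) -> bytes:
        return frame  # the decoration algorithm would process the frame here

def application_layer_render(layer: IDecorationLayer, frame: bytes) -> bytes:
    # The application layer calls only the interface, never the concrete class,
    # keeping the two layers logically isolated.
    return layer.apply(frame)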
It will be understood by those skilled in the art that, because the multimedia core module code (vcambiz) has become increasingly chaotic and redundant, the three-level framework of the application program makes subsequent maintenance of the application more convenient and realizes a plug-in multimedia data-stream service framework with rich extensibility.
It can be seen from the above disclosure of the live video and animation fusing device of the present invention that the implementation of the invention, by fusing the original video stream with the animation, effectively enhances the video interaction effect in webcasting and improves the user experience.
Referring to Fig. 8, another embodiment of the present invention further provides a terminal device, including a display screen 701, a memory 702 and one or more processors 704. The memory 702 is used to store information including an application program 705, and the processor 704 is used to control execution of the application program 705; when executing the application program 705, the processor 704 implements the steps in the above live video and animation fusion method.
Specifically, the display screen 701 can be used to display information input by the user or information provided to the user, as well as the various menus of the mobile phone. The display unit may include a display panel, which may optionally be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, or the like. Further, a touch panel may cover the display panel; after the touch panel detects a touch operation on or near it, the operation is passed to the processor 704 to determine the type of the touch event, and the processor 704 then provides a corresponding visual output on the display panel according to the type of the touch event. Although in some embodiments the touch panel and the display panel realize the input and output functions as two independent components, in other embodiments the touch panel and the display panel may be integrated to realize the input and output functions of the mobile phone.
The memory 702 can be used to store application programs and modules, and the processor 704 executes the various functional applications and data processing of the terminal device by running the application programs and modules stored in the memory 702. The memory 702 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, the application program 705 required by at least one function (such as a sound playing function, an image playing function, etc.), and so on, and the data storage area may store data created according to the use of the mobile phone, and the like. In addition, the memory 702 may include a high-speed random access memory area and may also include a non-volatile storage area, for example at least one magnetic disk storage device, flash memory device or other non-volatile solid-state storage component.
The communication interface 703 is used for communication between the terminal device and other devices in the above interaction. The communication interface 703 is the interface through which the processor 704 communicates with external subsystems, and is used for transmitting information between the processor 704 and the external systems so as to control those subsystems.
The processor 704 is the control centre of the terminal device. It connects the various parts of the whole terminal device through the various communication interfaces 703, and performs the various functions and data processing of the terminal device by running or executing the application programs and/or modules stored in the memory 702 and calling the data stored in the memory 702, thereby monitoring the terminal device as a whole. Optionally, the processor 704 may include one or more processing units; preferably, the processor 704 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, the application program 705 and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 704.
One or more application programs 705 are preferably stored in the memory 702 and configured to be executed by the one or more processors 704, the one or more programs being configured to perform the functions realized by any embodiment of the live video and animation fusion method.
In the embodiments of the present invention, the processor 704 included in the terminal device also has the following functions:
reading video data recorded at the anchor client and pre-processing it to obtain an original video stream to be uploaded;
in response to an animation fusion instruction for the original video stream, reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream to obtain a synthetic video stream;
uploading the synthetic video stream to a predetermined server so that the synthetic video stream is pushed to the live broadcast room.
An embodiment of the present invention further provides a computer storage medium for storing the computer software instructions used by the above terminal device, which contains the program designed for execution by the above terminal device.
It is apparent to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, devices and units described above may refer to the corresponding processes in the foregoing method embodiments and will not be repeated here.
It can be seen from the above disclosure of the terminal device of the present invention that the implementation of the invention, by fusing the original video stream with the animation, effectively enhances the video interaction effect in webcasting and improves the user experience.
Those of ordinary skill in the art will appreciate that all or some of the steps in the various methods of the above embodiments can be completed by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium may include, but is not limited to: any type of disk (including floppy disks, hard disks, optical discs, CD-ROMs and magneto-optical disks), ROM (Read-Only Memory), RAM (Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory, magnetic cards or optical cards.
The live video and animation fusion method, device and terminal device provided by the present invention have been described in detail above. For those of ordinary skill in the art, changes may be made to the specific embodiments and the scope of application without departing from the principles of the present invention; in summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A live video and animation fusion method, characterised by comprising the following steps:
reading video data recorded at an anchor client and pre-processing it to obtain an original video stream to be uploaded;
in response to an animation fusion instruction for the original video stream, reading corresponding animation data according to the animation fusion instruction and superimposing it on an upper layer of the original video stream to obtain a synthetic video stream;
uploading the synthetic video stream to a predetermined server so that the synthetic video stream is pushed to a live broadcast room.
2. The live video and animation fusion method according to claim 1, characterised in that the animation data comprises Flash animation data, and reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream comprises:
setting the Wmode attribute of a Flash control to Transparent or Opaque, or calling a Setmode function in the Flash control;
reading the Flash animation data through the Flash control and superimposing it on the upper layer of the original video stream.
3. The live video and animation fusion method according to claim 1, characterised in that the animation data comprises Gif animation data, and reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream comprises:
reading the Gif animation data, and obtaining the number of animation frames of the Gif animation data and the time interval between the display of adjacent animation frames;
superimposing each animation frame of the Gif animation data in turn at a predetermined position on the upper layer of the original video stream according to the time interval.
4. The live video and animation fusion method according to claim 2 or 3, characterised in that the animation data is stored locally;
before the step of reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream, the method further comprises: in response to the animation fusion instruction or according to a predetermined time period, processing the animation data in at least one of the following ways:
performing an MD5 check on one or more animation data items, and deleting any animation data item whose check value does not match and/or deleting the control corresponding to that animation data item in the user interface;
looking up one or more animation data items according to path information corresponding to the animation data recorded in a configuration file, and deleting the path information corresponding to any animation data item that cannot be found and/or deleting the control corresponding to that animation data item in the user interface.
5. The live video and animation fusion method according to claim 4, characterised in that the configuration file is in an encrypted state;
before looking up the one or more animation data items according to the path information corresponding to the animation data recorded in the configuration file, the method further comprises: decrypting the configuration file with a preset algorithm.
6. The live video and animation fusion method according to claim 3, characterised in that superimposing each animation frame of the Gif animation data in turn at the predetermined position on the upper layer of the original video stream according to the time interval comprises:
obtaining a popularity value of the live broadcast room currently corresponding to the anchor client;
adjusting the time interval accordingly according to the popularity value, and superimposing each animation frame of the Gif animation data in turn at the predetermined position on the upper layer of the original video stream with the adjusted time interval.
7. The live video and animation fusion method according to claim 2 or 3, characterised in that reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream comprises:
reading at least two corresponding animation data items, and superimposing the at least two animation data items, one after another, at different predetermined positions on the upper layer of the original video stream; or
reading the corresponding animation data and, upon detecting that a previous animation data item has not yet finished being superimposed, superimposing the current animation data item at a predetermined position on the upper layer of the original video stream such that the currently obtained animation data does not completely cover the previous animation data item.
8. The live video and animation fusion method according to claim 3, characterised in that reading the corresponding animation data according to the animation fusion instruction and superimposing it on the upper layer of the original video stream further comprises:
obtaining current pixel values of the original video stream, and adjusting the animation data accordingly according to the current pixel values so that the pixel values of the animation data match the current pixel values.
9. A live video and animation fusing device, characterised by comprising:
a pre-processing module, configured to read video data recorded locally and pre-process it to obtain an original video stream to be uploaded;
a synthesis module, configured to, in response to an animation fusion instruction for the original video stream, read corresponding animation data according to the animation fusion instruction and superimpose it on an upper layer of the original video stream to obtain a synthetic video stream;
a pushing module, configured to upload the synthetic video stream to a predetermined server so that the synthetic video stream is pushed to a live broadcast room.
10. A terminal device, characterised by comprising a display screen, a memory and one or more processors, wherein the memory is used to store information including an application program, the processor is used to control execution of the application program, and when executing the application program the processor implements the steps in the live video and animation fusion method according to any one of claims 1 to 8.
CN201710564127.XA 2017-07-12 2017-07-12 live video and animation fusion method, device and terminal device Pending CN107360160A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710564127.XA CN107360160A (en) 2017-07-12 2017-07-12 live video and animation fusion method, device and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710564127.XA CN107360160A (en) 2017-07-12 2017-07-12 live video and animation fusion method, device and terminal device

Publications (1)

Publication Number Publication Date
CN107360160A true CN107360160A (en) 2017-11-17

Family

ID=60293559

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710564127.XA Pending CN107360160A (en) 2017-07-12 2017-07-12 live video and animation fusion method, device and terminal device

Country Status (1)

Country Link
CN (1) CN107360160A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140347477A1 (en) * 2013-05-24 2014-11-27 International Business Machines Corporation Integrating Street View with Live Video Data
CN104427259A (en) * 2013-09-03 2015-03-18 广州市千钧网络科技有限公司 Method and device for playing video and audio special effects in real time
CN103686450A (en) * 2013-12-31 2014-03-26 广州华多网络科技有限公司 Video processing method and system
CN103702040A (en) * 2013-12-31 2014-04-02 广州华多网络科技有限公司 Real-time video graphic decoration superposing processing method and system
CN106028119A (en) * 2016-05-30 2016-10-12 徐文波 Multimedia special effect customizing method and device
CN105959718A (en) * 2016-06-24 2016-09-21 乐视控股(北京)有限公司 Real-time interaction method and device in video live broadcasting
CN106131591A (en) * 2016-06-30 2016-11-16 广州华多网络科技有限公司 Live broadcasting method, device and terminal
CN106658205A (en) * 2016-11-22 2017-05-10 广州华多网络科技有限公司 Studio video streaming synthesis control method, device and terminal equipment
CN106937130A (en) * 2017-03-14 2017-07-07 引力互动科技(武汉)有限公司 A kind of system and method that advertisement is delivered in net cast

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107886557A (en) * 2017-12-02 2018-04-06 天津大行道动漫文化发展有限公司 A kind of cartoon generation system based on Media Stream
CN108462883A (en) * 2018-01-08 2018-08-28 平安科技(深圳)有限公司 A kind of living broadcast interactive method, apparatus, terminal device and storage medium
WO2019134235A1 (en) * 2018-01-08 2019-07-11 平安科技(深圳)有限公司 Live broadcast interaction method and apparatus, and terminal device and storage medium
CN108391139A (en) * 2018-01-15 2018-08-10 上海掌门科技有限公司 A kind of video enhancement method, medium and equipment in net cast
CN108377243A (en) * 2018-02-24 2018-08-07 武汉斗鱼网络科技有限公司 A kind of transmission method and device of live TV stream
CN108377243B (en) * 2018-02-24 2021-03-16 武汉斗鱼网络科技有限公司 Live streaming transmission method and device
CN108521584A (en) * 2018-04-20 2018-09-11 广州虎牙信息科技有限公司 Interactive information processing method, device, main broadcaster's side apparatus and medium
CN108521584B (en) * 2018-04-20 2020-08-28 广州虎牙信息科技有限公司 Interactive information processing method, device, anchor side equipment and medium
CN108712661A (en) * 2018-05-28 2018-10-26 广州虎牙信息科技有限公司 A kind of live video processing method, device, equipment and storage medium
CN108712661B (en) * 2018-05-28 2022-02-25 广州虎牙信息科技有限公司 Live video processing method, device, equipment and storage medium
CN112262570B (en) * 2018-06-12 2023-11-14 E·克里奥斯·夏皮拉 Method and computer system for automatically modifying high resolution video data in real time
CN112262570A (en) * 2018-06-12 2021-01-22 E·克里奥斯·夏皮拉 Method and system for automatic real-time frame segmentation of high-resolution video streams into constituent features and modification of features in individual frames to create multiple different linear views from the same video source simultaneously
CN110830736B (en) * 2018-08-09 2022-05-13 广州小鹏汽车科技有限公司 Method for video synthesis, electronic device and computer-readable storage medium
CN110830736A (en) * 2018-08-09 2020-02-21 广州小鹏汽车科技有限公司 Method for video synthesis, electronic device and computer-readable storage medium
CN109040766A (en) * 2018-08-27 2018-12-18 百度在线网络技术(北京)有限公司 live video processing method, device and storage medium
US11068720B2 (en) 2018-08-27 2021-07-20 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for adding augmented reality data in live video, and storage medium
CN109168027A (en) * 2018-10-25 2019-01-08 北京字节跳动网络技术有限公司 Instant video methods of exhibiting, device, terminal device and storage medium
CN109168027B (en) * 2018-10-25 2020-12-11 北京字节跳动网络技术有限公司 Instant video display method and device, terminal equipment and storage medium
CN109413508A (en) * 2018-10-26 2019-03-01 广州虎牙信息科技有限公司 Method, apparatus, equipment, plug-flow method and the live broadcast system of image blend
CN109348252A (en) * 2018-11-01 2019-02-15 腾讯科技(深圳)有限公司 Video broadcasting method, video transmission method, device, equipment and storage medium
CN109348252B (en) * 2018-11-01 2020-01-10 腾讯科技(深圳)有限公司 Video playing method, video transmission method, device, equipment and storage medium
CN111182348A (en) * 2018-11-09 2020-05-19 阿里巴巴集团控股有限公司 Live broadcast picture display method and device
CN111182348B (en) * 2018-11-09 2022-06-14 阿里巴巴集团控股有限公司 Live broadcast picture display method and device, storage device and terminal
CN109327727A (en) * 2018-11-20 2019-02-12 网宿科技股份有限公司 Live streaming method for stream processing and plug-flow client in a kind of WebRTC
CN109660859A (en) * 2018-12-25 2019-04-19 北京潘达互娱科技有限公司 A kind of animated show method and mobile terminal
CN111314773A (en) * 2020-01-22 2020-06-19 广州虎牙科技有限公司 Screen recording method and device, electronic equipment and computer readable storage medium
CN113225587A (en) * 2020-02-06 2021-08-06 阿里巴巴集团控股有限公司 Video processing method, video processing device and electronic equipment
CN111327920A (en) * 2020-03-24 2020-06-23 上海万面智能科技有限公司 Live broadcast-based information interaction method and device, electronic equipment and readable storage medium
CN111787080B (en) * 2020-06-21 2021-01-29 广东友易互联科技有限公司 Data processing method based on artificial intelligence and Internet of things interaction and cloud computing platform
CN111787080A (en) * 2020-06-21 2020-10-16 张伟 Data processing method based on artificial intelligence and Internet of things interaction and cloud computing platform
CN111726687A (en) * 2020-06-30 2020-09-29 北京百度网讯科技有限公司 Method and apparatus for generating display data
CN111726687B (en) * 2020-06-30 2022-12-27 北京百度网讯科技有限公司 Method and apparatus for generating display data
CN111833421A (en) * 2020-07-20 2020-10-27 福建天晴在线互动科技有限公司 Method and system for realizing animation effect based on GDI +
CN111833421B (en) * 2020-07-20 2023-06-16 福建天晴在线互动科技有限公司 Method and system for realizing animation effect based on GDI +
CN113810754A (en) * 2021-09-01 2021-12-17 广州博冠信息科技有限公司 Live broadcast picture generation method, device and system, electronic equipment and storage medium
CN114157896A (en) * 2021-12-09 2022-03-08 创盛视联数码科技(北京)有限公司 Video processing method and device and related products
CN114157896B (en) * 2021-12-09 2023-12-05 创盛视联数码科技(北京)有限公司 Video processing method and device
WO2023146469A3 (en) * 2022-01-31 2023-08-31 Lemon Inc. Content creation using interactive effects
CN114584798A (en) * 2022-03-02 2022-06-03 深圳禾苗通信科技有限公司 Private customized live broadcast method and device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
CN107360160A (en) live video and animation fusion method, device and terminal device
CN108401175A (en) A kind of processing method, device, storage medium and the electronic equipment of barrage message
CN112351302B (en) Live broadcast interaction method and device based on cloud game and storage medium
WO2020083021A1 (en) Video recording method and apparatus, video playback method and apparatus, device, and storage medium
JP5767108B2 (en) Medium generation system and method
CN106792214A (en) A kind of living broadcast interactive method and system based on digital audio-video place
CN108124167A (en) A kind of play handling method, device and equipment
CN104796448B (en) The data processing method and device of network system
CN106792228A (en) A kind of living broadcast interactive method and system
US20090013263A1 (en) Method and apparatus for selecting events to be displayed at virtual venues and social networking
CN108683954A (en) Pop-up animation producing method and device, pop-up animation, network direct broadcasting server
CN105939495A (en) Electronic device, computer implementation method and non-volatile computer-readable media
CN106448297A (en) Cloud audio-video remote interactive class system
CN108769775A (en) Data processing method and device, network direct broadcasting system in network direct broadcasting
CN107979772A (en) There is provided using sharing with the collaborative of the personalized user function of personal device
WO2022022485A1 (en) Content provision method and apparatus, content display method and apparatus, and electronic device and storage medium
CN108769724A (en) Method and apparatus, the network direct broadcasting system of pop-up are pushed in network direct broadcasting
CN109361954A (en) Method for recording, device, storage medium and the electronic device of video resource
CN105727560A (en) Video colloquy and game interactive fusion method and apparatus
CN110324653A (en) Game interaction exchange method and system, electronic equipment and the device with store function
Noam The content, impact, and regulation of streaming video: The next generation of media emerges
CN109788327A (en) Multi-screen interaction method, device and electronic equipment
CN109688347A (en) Multi-screen interaction method, device and electronic equipment
CN106790196A (en) The interactive method and apparatus of red packet
CN103501467A (en) Method and equipment used for video resource access control

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20210118

Address after: 511442 3108, 79 Wanbo 2nd Road, Nancun Town, Panyu District, Guangzhou City, Guangdong Province

Applicant after: GUANGZHOU CUBESILI INFORMATION TECHNOLOGY Co.,Ltd.

Address before: 511442 29 floor, block B-1, Wanda Plaza, Huambo business district, Panyu District, Guangzhou, Guangdong.

Applicant before: GUANGZHOU HUADUO NETWORK TECHNOLOGY Co.,Ltd.

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20171117