CN106658215A - Method and device for pushing live file

Method and device for pushing live file

Info

Publication number
CN106658215A
Authority
CN
China
Prior art keywords
frame
live
image
local resource
shared
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611164454.8A
Other languages
Chinese (zh)
Inventor
刘飞
林锦滨
陈雍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201611164454.8A priority Critical patent/CN106658215A/en
Publication of CN106658215A publication Critical patent/CN106658215A/en
Pending legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/27Server based end-user applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439Processing of audio elementary streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention relates to a method and a device for pushing a live file. The method merges a local resource to be shared with live content into a single-stream live file and pushes the live file to a server. With this method, a user can share local resources with the viewers during a live broadcast, which enriches the live content and makes the live broadcast more interesting, while the fluency of the interaction between the user and the viewers is preserved, so that the sharing and interaction experience during the live broadcast is improved overall.

Description

Method and device for pushing live file
Technical field
The present disclosure relates to the field of communication technology, and in particular to a method and device for pushing a live file.
Background
Nowadays, many users of electronic devices such as mobile phones and tablet computers like to broadcast live using live-streaming software. With this software, a user shares the content captured by the audio and video capture system with viewers in real time over the network, and at the same time the user and the viewers can chat and interact. However, during a live broadcast the content the user can share with the viewers is limited to what the audio and video capture system captures, so the live content is rather monotonous. The sharing and interaction experience offered by current live-streaming technology therefore leaves room for improvement.
Summary
To overcome the problems in the related art, the present disclosure provides a method and device for pushing a live file.
According to a first aspect of the embodiments of the present disclosure, there is provided a method for pushing a live file, applied to a user terminal, the method including:
merging a local resource to be shared and live content into a single-stream live file, the local resource including at least one of an image, a video, or audio in the user terminal;
pushing the live file to a server.
Optionally, merging the local resource to be shared and the live content into the single-stream live file includes:
decoding the local resource to be shared to obtain data frames of the local resource;
merging the data frames of the local resource with data frames of the live content;
encoding the merged data frames to obtain the single-stream live file.
Optionally, merging the data frames of the local resource with the data frames of the live content includes:
when the local resource to be shared is a still image, processing the data frames of the still image according to the user's operation on the still image;
merging the processed data frames of the still image with the data frames of the live content.
Optionally, merging the local resource to be shared and the live content into the single-stream live file includes:
decoding an image in the user terminal to be shared to obtain data frames of the image;
merging the data frames of the image with data frames of the image in the live content to obtain data frames of an image part;
encoding the data frames of the image part together with data frames of the audio in the live content to obtain the single-stream live file.
Optionally, merging the local resource to be shared and the live content into the single-stream live file includes:
decoding audio in the user terminal to be shared to obtain data frames of the audio;
merging the data frames of the audio with data frames of the audio in the live content to obtain data frames of an audio part;
encoding the data frames of the audio part together with data frames of the image in the live content to obtain the single-stream live file.
Optionally, merging the local resource to be shared and the live content into the single-stream live file includes:
decoding a video in the user terminal to be shared to obtain data frames of the video;
merging the data frames of the image in the video with data frames of the image in the live content to obtain data frames of an image part;
merging the data frames of the audio in the video with data frames of the audio in the live content to obtain data frames of an audio part;
encoding the data frames of the audio part and the data frames of the image part to obtain the single-stream live file.
According to a second aspect of the embodiments of the present disclosure, there is provided a device for pushing a live file, the device including:
a merging module configured to merge a local resource to be shared and live content into a single-stream live file, the local resource including at least one of an image, a video, or audio in the user terminal;
a pushing module configured to push the live file to a server.
Optionally, the merging module includes:
a first decoding submodule configured to decode the local resource to be shared to obtain data frames of the local resource;
a first merging submodule configured to merge the data frames of the local resource with data frames of the live content;
a first encoding submodule configured to encode the merged data frames to obtain the single-stream live file.
Optionally, the first merging submodule includes:
a processing submodule configured to, when the local resource to be shared is a still image, process the data frames of the still image according to the user's operation on the still image;
a second merging submodule configured to merge the processed data frames of the still image with the data frames of the live content.
Optionally, the merging module includes:
a second decoding submodule configured to decode an image in the user terminal to be shared to obtain data frames of the image;
a third merging submodule configured to merge the data frames of the image with data frames of the image in the live content to obtain data frames of an image part;
a second encoding submodule configured to encode the data frames of the image part together with data frames of the audio in the live content to obtain the single-stream live file.
Optionally, the merging module includes:
a third decoding submodule configured to decode audio in the user terminal to be shared to obtain data frames of the audio;
a fourth merging submodule configured to merge the data frames of the audio with data frames of the audio in the live content to obtain data frames of an audio part;
a third encoding submodule configured to encode the data frames of the audio part together with data frames of the image in the live content to obtain the single-stream live file.
Optionally, the merging module includes:
a fourth decoding submodule configured to decode a video in the user terminal to be shared to obtain data frames of the video;
a fifth merging submodule configured to merge the data frames of the image in the video with data frames of the image in the live content to obtain data frames of an image part;
a sixth merging submodule configured to merge the data frames of the audio in the video with data frames of the audio in the live content to obtain data frames of an audio part;
a fourth encoding submodule configured to encode the data frames of the audio part and the data frames of the image part to obtain the single-stream live file.
According to a third aspect of the embodiments of the present disclosure, there is provided a device for pushing a live file, including:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
merge a local resource to be shared and live content into a single-stream live file, the local resource including at least one of an image, a video, or audio in the user terminal, the image including a still image and a dynamic image;
push the live file to a server.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium; when the instructions in the storage medium are executed by a processor of a user terminal, the user terminal is enabled to perform a method for pushing a live file, the method including:
merging a local resource to be shared and live content into a single-stream live file, the local resource including at least one of an image, a video, or audio in the user terminal;
pushing the live file to a server.
In the method for pushing a live file provided by the present disclosure, the local resource to be shared and the live content are merged into a single-stream live file, which is then pushed to the server. In this way, the user can share local resources with the viewers during the live broadcast, which enriches the live content and makes the live broadcast more interesting; at the same time, the fluency of the interaction between the user and the viewers is preserved, so that the sharing and interaction experience during the live broadcast is improved overall.
It should be understood that the foregoing general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
Fig. 1a is a schematic diagram of a live picture according to an exemplary embodiment.
Fig. 1b is a schematic diagram of sharing a local resource according to an exemplary embodiment.
Fig. 1c is another schematic diagram of a live picture according to an exemplary embodiment.
Fig. 2 is a flowchart of a method for pushing a live file according to an exemplary embodiment.
Fig. 3 is another flowchart of a method for pushing a live file according to an exemplary embodiment.
Fig. 4 is another flowchart of a method for pushing a live file according to an exemplary embodiment.
Fig. 5 is another flowchart of a method for pushing a live file according to an exemplary embodiment.
Fig. 6 is another flowchart of a method for pushing a live file according to an exemplary embodiment.
Fig. 7 is another flowchart of a method for pushing a live file according to an exemplary embodiment.
Fig. 8 is a schematic diagram of sharing a picture according to an exemplary embodiment.
Fig. 9 is a block diagram of a device for pushing a live file according to an exemplary embodiment.
Fig. 10 is a block diagram of a merging module in a device for pushing a live file according to an exemplary embodiment.
Fig. 11 is a block diagram of a first merging submodule in a device for pushing a live file according to an exemplary embodiment.
Fig. 12 is another block diagram of a merging module in a device for pushing a live file according to an exemplary embodiment.
Fig. 13 is another block diagram of a merging module in a device for pushing a live file according to an exemplary embodiment.
Fig. 14 is another block diagram of a merging module in a device for pushing a live file according to an exemplary embodiment.
Fig. 15 is a block diagram of a device for pushing a live file according to an exemplary embodiment.
Detailed description
Exemplary embodiments will now be described in detail, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as recited in the appended claims.
To enrich live content and make the live broadcasting process more interesting, the present disclosure provides a method and device for pushing a live file, so that while broadcasting live, a user can also share local resources such as interesting images and videos with the viewers. The method and device for pushing a live file provided by the present disclosure are described below.
Before the method for pushing a live file provided by the present disclosure is described, and for ease of understanding, the live picture that can be achieved with the method is first illustrated.
Referring to Fig. 1a, Fig. 1a is a schematic diagram of a live picture according to an exemplary embodiment. As shown in Fig. 1a, a user (that is, the anchor) is broadcasting live with live-streaming software, and the live content includes the environment around the user and the user's voice. If the user now wants to also share a local resource related to the live content, such as an image or a video, with the viewers, the user can open the related local resource in the user terminal (for example, a mobile phone), such as the kitten cartoon picture shown in Fig. 1b (Fig. 1b is a schematic diagram of sharing a local resource according to an exemplary embodiment), and confirm that the picture is to be shared in the live broadcast. As a result, the live picture shown in Fig. 1c appears (Fig. 1c is another schematic diagram of a live picture according to an exemplary embodiment): the kitten cartoon picture the user wants to share now appears in the original live picture (shown in Fig. 1a). It should be noted that Fig. 1c shows only one possible live picture; actual use is not limited to this kind of live picture.
With the method for pushing a live file provided by the present disclosure, local resources can be shared with the viewers during a live broadcast as described above. The method is described below.
Referring to Fig. 2, Fig. 2 is a flowchart of a method for pushing a live file according to an exemplary embodiment. As shown in Fig. 2, the method includes step S21 and step S22.
In step S21, a local resource to be shared and live content are merged into a single-stream live file, the local resource including at least one of an image, a video, or audio in the user terminal.
In step S22, the live file is pushed to a server.
To allow the viewers to see the local resource related to the live content while watching the live content, the anchor side needs to push the live content and the local resource to the server together, and the viewer side then also pulls the live content and the local resource from the server together. In this process, however, if the live content and the local resource were transmitted as separate streams, pushing and pulling the files could be affected by the network and the viewer side, so that on the viewer side the live content and the local resource would not be synchronized in playback timing. The live pictures on the anchor side and the viewer side would then differ, which would seriously affect the interaction between the anchor and the viewers.
To ensure that the live pictures on the anchor side and the viewer side are consistent and that the anchor and the viewers can interact smoothly, the method for pushing a live file provided by the present disclosure works as follows: when the user (that is, the anchor) decides to share a local resource in the live broadcast, the anchor side merges the local resource to be shared and the original live content into a single-stream live file, and then pushes the single-stream live file to the server. The local resource may be an image, a video, or audio in the user terminal.
In this way, the live content and the local resource are transmitted to the server as a single stream, and the viewer side also pulls the live content and the local resource from the server as a single stream. The live content and the local resource are therefore integrated and played simultaneously in one playback window (as shown in Fig. 1c), the live pictures on the viewer side and the anchor side are consistent, and the anchor and the viewers can interact smoothly.
With the method for pushing a live file provided by the present disclosure, the user can share local resources with the viewers during the live broadcast, which enriches the live content and makes the live broadcast more interesting; at the same time, the fluency of the interaction between the user and the viewers is preserved, so that the sharing and interaction experience during the live broadcast is improved overall.
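By way of illustration only, the following Kotlin sketch shows one possible shape of steps S21 and S22 on the anchor side. The names pushLiveFile, mergeIntoSingleStream, and pushToServer are hypothetical, the merge is reduced to a placeholder concatenation (the real decode/merge/encode pipeline is described with Figs. 3 to 7), and a plain HTTP POST stands in for whatever streaming protocol an actual client would use; none of this is prescribed by the present disclosure.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Placeholder for step S21: the real merge decodes the local resource, merges its
// data frames with the live content frames, and re-encodes everything as one stream.
fun mergeIntoSingleStream(localResource: ByteArray, liveContent: ByteArray): ByteArray =
    liveContent + localResource

// Step S22: push the single-stream live file to the server (illustrative HTTP POST).
fun pushToServer(serverUrl: String, liveFile: ByteArray) {
    val conn = URL(serverUrl).openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.outputStream.use { it.write(liveFile) }
    conn.inputStream.close()
}

fun pushLiveFile(localResource: ByteArray, liveContent: ByteArray, serverUrl: String) {
    val liveFile = mergeIntoSingleStream(localResource, liveContent)  // step S21
    pushToServer(serverUrl, liveFile)                                 // step S22
}
```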
Optionally, referring to Fig. 3, Fig. 3 is another flowchart of a method for pushing a live file according to an exemplary embodiment. As shown in Fig. 3, the method includes step S311, step S312, step S313, and the above step S22.
In step S311, the local resource to be shared is decoded to obtain data frames of the local resource.
In step S312, the data frames of the local resource are merged with data frames of the live content.
In step S313, the merged data frames are encoded to obtain a single-stream live file.
Merging the local resource to be shared and the live content into a single-stream live file thus involves decoding, merging, and encoding.
First, the local resource to be shared is decoded to obtain its data frames. During the live broadcast, the live content captured by the audio and video capture system is already held and uploaded as data frames, so its data frames are available without decoding; the local resource, however, must be decoded to obtain its raw data frames.
After the data frames of the local resource have been obtained by decoding, they are merged with the data frames of the live content. The merged data frames are then encoded to obtain a single-stream live file. Finally, the encoded live file is pushed to the server.
Through this decode, merge, and encode process, a single-stream live file is obtained. The operation is simple, the resulting live file plays back well, and the user is given an entirely new experience of sharing local resources during a live broadcast, while the fluency of the interaction between the user and the viewers is preserved.
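The decode, merge, and encode pipeline of steps S311 to S313 could be sketched as follows. The Decoder and Encoder interfaces and the buildLiveFile function are illustrative assumptions (on Android they might be backed by MediaCodec, for example), and the live content is assumed to be available as raw data frames already, as noted above.

```kotlin
// Hypothetical codec interfaces; the disclosure does not prescribe a specific codec.
interface Decoder { fun decode(resource: ByteArray): List<ByteArray> }   // -> raw data frames
interface Encoder { fun encode(frames: List<ByteArray>): ByteArray }     // -> single-stream file

fun buildLiveFile(
    localResource: ByteArray,
    liveFrames: List<ByteArray>,                     // live content already held as data frames
    decoder: Decoder,
    encoder: Encoder,
    mergeFrames: (ByteArray, ByteArray) -> ByteArray // how one live frame and one local frame are combined
): ByteArray {
    val localFrames = decoder.decode(localResource)  // S311: decode the local resource
    // S312: merge frame by frame (zip pairs frames until the shorter list runs out)
    val merged = liveFrames.zip(localFrames, mergeFrames)
    return encoder.encode(merged)                    // S313: encode into one single-stream live file
}
```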
The local resource to be shared may be an image, a video, or audio in the user terminal. For local resources of different content, the merging and encoding processes that produce the single-stream live file differ. The various cases are described separately below.
Optionally, referring to Fig. 4, Fig. 4 is another flowchart of a method for pushing a live file according to an exemplary embodiment. As shown in Fig. 4, the method includes step S411, step S412, step S413, and the above step S22.
In step S411, an image in the user terminal to be shared is decoded to obtain data frames of the image.
In step S412, the data frames of the image are merged with data frames of the image in the live content to obtain data frames of an image part.
In step S413, the data frames of the image part and data frames of the audio in the live content are encoded to obtain a single-stream live file.
Because the live content captured by the audio and video capture system includes both image and audio, when the live content and the local resource are merged into one live file, the data frames of the corresponding parts should be merged separately before encoding. That is, the image parts of the live content and the local resource are merged, and likewise the audio parts of the live content and the local resource are merged; the data frames of the resulting image part and audio part are then encoded together to obtain the single-stream live file.
When the local resource to be shared is an image in the user terminal, for example a picture or a photo, the local resource contains only an image and no audio, so only the data frames of the local image need to be merged with the data frames of the image in the live content to obtain the data frames of the image part. The data frames of the merged image part are then encoded together with the data frames of the audio in the live content, yielding the single-stream live file.
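A minimal sketch of the image-part merge of step S412, assuming frames are raw ARGB pixel buffers and that the shared picture is simply overlaid onto the live video frame; the Frame type and overlay function are illustrative names, and the audio data frames of the live content would be passed unchanged to the encoding of step S413.

```kotlin
// Illustrative raw video frame: ARGB pixels in row-major order.
class Frame(val width: Int, val height: Int, val pixels: IntArray)

// S412 (image case): place the decoded local picture over the live frame at a given offset.
fun overlay(live: Frame, local: Frame, offsetX: Int = 0, offsetY: Int = 0): Frame {
    val out = live.pixels.copyOf()
    for (y in 0 until local.height) {
        for (x in 0 until local.width) {
            val dx = x + offsetX
            val dy = y + offsetY
            if (dx < live.width && dy < live.height) {
                out[dy * live.width + dx] = local.pixels[y * local.width + x]
            }
        }
    }
    return Frame(live.width, live.height, out)   // merged image-part frame
}
```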
Optionally, referring to Fig. 5, Fig. 5 is another flowchart of a method for pushing a live file according to an exemplary embodiment. As shown in Fig. 5, the method includes step S511, step S512, step S513, and the above step S22.
In step S511, audio in the user terminal to be shared is decoded to obtain data frames of the audio.
In step S512, the data frames of the audio are merged with data frames of the audio in the live content to obtain data frames of an audio part.
In step S513, the data frames of the audio part and data frames of the image in the live content are encoded to obtain a single-stream live file.
When the local resource to be shared is audio in the user terminal, for example a voice recording, the local resource contains only audio and no image, so only the data frames of the local audio need to be merged with the data frames of the audio in the live content to obtain the data frames of the audio part. The data frames of the merged audio part are then encoded together with the data frames of the image in the live content, yielding the single-stream live file.
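A minimal sketch of the audio-part merge of step S512, assuming both the local audio and the live audio are decoded to 16-bit PCM samples; mixPcm is an illustrative helper, and the resulting audio-part frames would then be encoded together with the live image frames in step S513.

```kotlin
// S512 (audio case): mix two decoded PCM frames sample by sample, clamping to the 16-bit range.
fun mixPcm(liveAudio: ShortArray, localAudio: ShortArray): ShortArray {
    val n = maxOf(liveAudio.size, localAudio.size)
    return ShortArray(n) { i ->
        val a = if (i < liveAudio.size) liveAudio[i].toInt() else 0
        val b = if (i < localAudio.size) localAudio[i].toInt() else 0
        (a + b).coerceIn(Short.MIN_VALUE.toInt(), Short.MAX_VALUE.toInt()).toShort()
    }
}
```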
Optionally, referring to Fig. 6, Fig. 6 is another flowchart of a method for pushing a live file according to an exemplary embodiment. As shown in Fig. 6, the method includes step S611, step S612, step S613, step S614, and the above step S22.
In step S611, a video in the user terminal to be shared is decoded to obtain data frames of the video.
In step S612, the data frames of the image in the video are merged with data frames of the image in the live content to obtain data frames of an image part.
In step S613, the data frames of the audio in the video are merged with data frames of the audio in the live content to obtain data frames of an audio part.
In step S614, the data frames of the audio part and the data frames of the image part are encoded to obtain a single-stream live file.
When the local resource to be shared is a video in the user terminal, the local resource contains both image and audio. The data frames of the image in the local video therefore need to be merged with the data frames of the image in the live content, and the data frames of the audio in the local video need to be merged with the data frames of the audio in the live content. Finally, the data frames of the resulting image part and audio part are encoded together, yielding the single-stream live file.
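Building on the illustrative Frame, overlay, and mixPcm helpers from the two sketches above, the video case of steps S612 to S614 might merge both tracks before a single encode pass, for example:

```kotlin
// S612/S613: merge the image track and the audio track of the decoded local video
// with the corresponding tracks of the live content; S614 then encodes both parts.
fun mergeLocalVideo(
    liveImages: List<Frame>, liveAudio: List<ShortArray>,
    localImages: List<Frame>, localAudio: List<ShortArray>
): Pair<List<Frame>, List<ShortArray>> {
    val imagePart = liveImages.zip(localImages) { live, local -> overlay(live, local) }  // S612
    val audioPart = liveAudio.zip(localAudio) { live, local -> mixPcm(live, local) }     // S613
    return imagePart to audioPart   // handed to the encoder to produce one single-stream live file (S614)
}
```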
Optionally, referring to Fig. 7, Fig. 7 is another flowchart of a method for pushing a live file according to an exemplary embodiment. As shown in Fig. 7, the method includes, in addition to the above steps S311, S313, and S22, step S711 and step S712.
In step S711, when the local resource to be shared is a still image, the data frames of the still image are processed according to the user's operation on the still image.
In step S712, the processed data frames of the still image are merged with the data frames of the live content.
When the local resource to be shared is a still image in the user terminal, such as a photo or a picture, the user may, depending on the desired live effect, operate on the still image, for example by zooming in or zooming out. For instance, referring to Fig. 8, which is a schematic diagram of sharing a picture according to an exemplary embodiment: while sharing the kitten cartoon picture during the live broadcast, the user finds the kitten's head in the picture particularly cute and zooms into that part of the picture so that the viewers can see the kitten's head. As shown in Fig. 8, compared with Fig. 1c, the kitten cartoon picture in Fig. 8 is enlarged and only the kitten's head is displayed.
To achieve the display effect of the live picture shown in Fig. 8, the user terminal processes the data frames of the picture accordingly: the part of the kitten cartoon picture other than the kitten's head is cropped away to obtain the data frames of the picture showing only the kitten's head. The processed data frames of the picture are then merged with the data frames of the live content, encoded into a live file, and pushed to the server.
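A minimal sketch of the still-image processing of step S711, assuming the zoom operation of Fig. 8 amounts to cropping a rectangular region of the picture's data frame before the merge of step S712; crop is an illustrative helper and Frame is the illustrative pixel-buffer type used in the sketches above.

```kotlin
// S711: keep only the region the anchor zoomed into (e.g. the kitten's head in Fig. 8).
fun crop(src: Frame, left: Int, top: Int, width: Int, height: Int): Frame {
    val out = IntArray(width * height)
    for (y in 0 until height) {
        for (x in 0 until width) {
            out[y * width + x] = src.pixels[(top + y) * src.width + (left + x)]
        }
    }
    return Frame(width, height, out)   // processed frame, then merged with the live content (S712)
}
```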
In this way, the viewers can understand the anchor's live content more precisely, which effectively improves the viewing experience of the viewers and further improves the interaction between the anchor and the viewers.
The present disclosure also provides a device for pushing a live file. Referring to Fig. 9, Fig. 9 is a block diagram of a device for pushing a live file according to an exemplary embodiment. As shown in Fig. 9, the device 100 includes a merging module 101 and a pushing module 102.
The merging module 101 is configured to merge a local resource to be shared and live content into a single-stream live file, the local resource including at least one of an image, a video, or audio in the user terminal.
The pushing module 102 is configured to push the live file to a server.
Optionally, referring to Fig. 10, Fig. 10 is a block diagram of a merging module in a device for pushing a live file according to an exemplary embodiment. As shown in Fig. 10, the merging module 101 includes:
a first decoding submodule 301 configured to decode the local resource to be shared to obtain data frames of the local resource;
a first merging submodule 302 configured to merge the data frames of the local resource with data frames of the live content;
a first encoding submodule 303 configured to encode the merged data frames to obtain the single-stream live file.
Optionally, referring to Fig. 11, Fig. 11 is a block diagram of a first merging submodule in a device for pushing a live file according to an exemplary embodiment. As shown in Fig. 11, the first merging submodule 302 includes:
a processing submodule 401 configured to, when the local resource to be shared is a still image, process the data frames of the still image according to the user's operation on the still image;
a second merging submodule 402 configured to merge the processed data frames of the still image with the data frames of the live content.
Optionally, referring to Fig. 12, Fig. 12 is another block diagram of a merging module in a device for pushing a live file according to an exemplary embodiment. As shown in Fig. 12, the merging module 101 includes:
a second decoding submodule 501 configured to decode an image in the user terminal to be shared to obtain data frames of the image;
a third merging submodule 502 configured to merge the data frames of the image with data frames of the image in the live content to obtain data frames of an image part;
a second encoding submodule 503 configured to encode the data frames of the image part together with data frames of the audio in the live content to obtain the single-stream live file.
Optionally, referring to Fig. 13, Fig. 13 is another block diagram of a merging module in a device for pushing a live file according to an exemplary embodiment. As shown in Fig. 13, the merging module 101 includes:
a third decoding submodule 601 configured to decode audio in the user terminal to be shared to obtain data frames of the audio;
a fourth merging submodule 602 configured to merge the data frames of the audio with data frames of the audio in the live content to obtain data frames of an audio part;
a third encoding submodule 603 configured to encode the data frames of the audio part together with data frames of the image in the live content to obtain the single-stream live file.
Optionally, referring to Fig. 14, Fig. 14 is another block diagram of a merging module in a device for pushing a live file according to an exemplary embodiment. As shown in Fig. 14, the merging module 101 includes:
a fourth decoding submodule 701 configured to decode a video in the user terminal to be shared to obtain data frames of the video;
a fifth merging submodule 702 configured to merge the data frames of the image in the video with data frames of the image in the live content to obtain data frames of an image part;
a sixth merging submodule 703 configured to merge the data frames of the audio in the video with data frames of the audio in the live content to obtain data frames of an audio part;
a fourth encoding submodule 704 configured to encode the data frames of the audio part and the data frames of the image part to obtain the single-stream live file.
With respect to the devices in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method, and will not be elaborated here.
Fig. 15 is a block diagram of a device 800 for pushing a live file according to an exemplary embodiment. For example, the device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to Fig. 15, the device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 typically controls the overall operation of the device 800, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 802 may include one or more processors 820 to execute instructions so as to perform all or part of the steps of the above method for pushing a live file. In addition, the processing component 802 may include one or more modules that facilitate the interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate the interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support the operation of the device 800. Examples of such data include instructions for any application or method operated on the device 800, contact data, phonebook data, messages, pictures, videos, and so on. The memory 804 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The power component 806 supplies power to the various components of the device 800. The power component 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen providing an output interface between the device 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the device 800 is in an operating mode, such as a photographing mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC), which is configured to receive external audio signals when the device 800 is in an operating mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signal may be further stored in the memory 804 or sent via the communication component 816. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing status assessments of various aspects of the device 800. For example, the sensor component 814 may detect the open/closed state of the device 800 and the relative positioning of components, for example the display and keypad of the device 800; the sensor component 814 may also detect a change in position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in temperature of the device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the device 800 and other devices. The device 800 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method for pushing a live file.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, for example the memory 804 including instructions, which can be executed by the processor 820 of the device 800 to complete the above method for pushing a live file. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the present disclosure will readily occur to those skilled in the art after considering the specification and practicing the disclosure. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

1. A method for pushing a live file, characterized in that the method is applied to a user terminal and comprises:
merging a local resource to be shared and live content into a single-stream live file, the local resource comprising at least one of an image, a video, or audio in the user terminal;
pushing the live file to a server.
2. The method according to claim 1, characterized in that merging the local resource to be shared and the live content into the single-stream live file comprises:
decoding the local resource to be shared to obtain data frames of the local resource;
merging the data frames of the local resource with data frames of the live content;
encoding the merged data frames to obtain the single-stream live file.
3. The method according to claim 2, characterized in that merging the data frames of the local resource with the data frames of the live content comprises:
when the local resource to be shared is a still image, processing the data frames of the still image according to the user's operation on the still image;
merging the processed data frames of the still image with the data frames of the live content.
4. The method according to claim 1, characterized in that merging the local resource to be shared and the live content into the single-stream live file comprises:
decoding an image in the user terminal to be shared to obtain data frames of the image;
merging the data frames of the image with data frames of the image in the live content to obtain data frames of an image part;
encoding the data frames of the image part together with data frames of the audio in the live content to obtain the single-stream live file.
5. The method according to claim 1, characterized in that merging the local resource to be shared and the live content into the single-stream live file comprises:
decoding audio in the user terminal to be shared to obtain data frames of the audio;
merging the data frames of the audio with data frames of the audio in the live content to obtain data frames of an audio part;
encoding the data frames of the audio part together with data frames of the image in the live content to obtain the single-stream live file.
6. The method according to claim 1, characterized in that merging the local resource to be shared and the live content into the single-stream live file comprises:
decoding a video in the user terminal to be shared to obtain data frames of the video;
merging the data frames of the image in the video with data frames of the image in the live content to obtain data frames of an image part;
merging the data frames of the audio in the video with data frames of the audio in the live content to obtain data frames of an audio part;
encoding the data frames of the audio part and the data frames of the image part to obtain the single-stream live file.
7. A device for pushing a live file, characterized in that the device is applied to a user terminal and comprises:
a merging module configured to merge a local resource to be shared and live content into a single-stream live file, the local resource comprising at least one of an image, a video, or audio in the user terminal;
a pushing module configured to push the live file to a server.
8. The device according to claim 7, characterized in that the merging module comprises:
a first decoding submodule configured to decode the local resource to be shared to obtain data frames of the local resource;
a first merging submodule configured to merge the data frames of the local resource with data frames of the live content;
a first encoding submodule configured to encode the merged data frames to obtain the single-stream live file.
9. The device according to claim 8, characterized in that the first merging submodule comprises:
a processing submodule configured to, when the local resource to be shared is a still image, process the data frames of the still image according to the user's operation on the still image;
a second merging submodule configured to merge the processed data frames of the still image with the data frames of the live content.
10. The device according to claim 7, characterized in that the merging module comprises:
a second decoding submodule configured to decode an image in the user terminal to be shared to obtain data frames of the image;
a third merging submodule configured to merge the data frames of the image with data frames of the image in the live content to obtain data frames of an image part;
a second encoding submodule configured to encode the data frames of the image part together with data frames of the audio in the live content to obtain the single-stream live file.
11. The device according to claim 7, characterized in that the merging module comprises:
a third decoding submodule configured to decode audio in the user terminal to be shared to obtain data frames of the audio;
a fourth merging submodule configured to merge the data frames of the audio with data frames of the audio in the live content to obtain data frames of an audio part;
a third encoding submodule configured to encode the data frames of the audio part together with data frames of the image in the live content to obtain the single-stream live file.
12. The device according to claim 7, characterized in that the merging module comprises:
a fourth decoding submodule configured to decode a video in the user terminal to be shared to obtain data frames of the video;
a fifth merging submodule configured to merge the data frames of the image in the video with data frames of the image in the live content to obtain data frames of an image part;
a sixth merging submodule configured to merge the data frames of the audio in the video with data frames of the audio in the live content to obtain data frames of an audio part;
a fourth encoding submodule configured to encode the data frames of the audio part and the data frames of the image part to obtain the single-stream live file.
13. A device for pushing a live file, characterized by comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
merge a local resource to be shared and live content into a single-stream live file, the local resource comprising at least one of an image, a video, or audio in the user terminal;
push the live file to a server.
CN201611164454.8A 2016-12-15 2016-12-15 Method and device for pushing live file Pending CN106658215A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611164454.8A CN106658215A (en) 2016-12-15 2016-12-15 Method and device for pushing live file

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611164454.8A CN106658215A (en) 2016-12-15 2016-12-15 Method and device for pushing live file

Publications (1)

Publication Number Publication Date
CN106658215A true CN106658215A (en) 2017-05-10

Family

ID=58822532

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611164454.8A Pending CN106658215A (en) 2016-12-15 2016-12-15 Method and device for pushing live file

Country Status (1)

Country Link
CN (1) CN106658215A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107396166A (en) * 2017-08-07 2017-11-24 北京小米移动软件有限公司 The method and device of live middle display video
CN107566878A (en) * 2017-08-07 2018-01-09 北京小米移动软件有限公司 The method and device of live middle display picture
CN109151594A (en) * 2018-09-27 2019-01-04 广州虎牙信息科技有限公司 Direct playing and playback video broadcasting method, device and electronic equipment
CN110012336A (en) * 2019-03-12 2019-07-12 广州虎牙信息科技有限公司 Picture configuration method, terminal and the device at interface is broadcast live
CN110225020A (en) * 2019-06-04 2019-09-10 杭州网易云音乐科技有限公司 Audio frequency transmission method, system, electronic equipment and computer readable storage medium
CN110636321A (en) * 2019-09-30 2019-12-31 北京达佳互联信息技术有限公司 Data processing method, device, system, mobile terminal and storage medium
CN110662082A (en) * 2019-09-30 2020-01-07 北京达佳互联信息技术有限公司 Data processing method, device, system, mobile terminal and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105120301A (en) * 2015-08-25 2015-12-02 小米科技有限责任公司 Video processing method and apparatus, and intelligent equipment
CN105791889A (en) * 2016-05-04 2016-07-20 武汉斗鱼网络科技有限公司 Advertisement inter-cut method for video live broadcasting and advertisement inter-cut device for video live broadcasting
CN105791958A (en) * 2016-04-22 2016-07-20 北京小米移动软件有限公司 Method and device for live broadcasting game
CN106028137A (en) * 2016-06-22 2016-10-12 北京小米移动软件有限公司 Live streaming processing method and apparatus
CN106131583A (en) * 2016-06-30 2016-11-16 北京小米移动软件有限公司 A kind of live processing method, device, terminal unit and system
CN106131700A (en) * 2016-07-04 2016-11-16 天脉聚源(北京)传媒科技有限公司 A kind of sharing files method and device during net cast
CN106165430A (en) * 2016-06-29 2016-11-23 北京小米移动软件有限公司 Net cast method and device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105120301A (en) * 2015-08-25 2015-12-02 小米科技有限责任公司 Video processing method and apparatus, and intelligent equipment
CN105791958A (en) * 2016-04-22 2016-07-20 北京小米移动软件有限公司 Method and device for live broadcasting game
CN105791889A (en) * 2016-05-04 2016-07-20 武汉斗鱼网络科技有限公司 Advertisement inter-cut method for video live broadcasting and advertisement inter-cut device for video live broadcasting
CN106028137A (en) * 2016-06-22 2016-10-12 北京小米移动软件有限公司 Live streaming processing method and apparatus
CN106165430A (en) * 2016-06-29 2016-11-23 北京小米移动软件有限公司 Net cast method and device
CN106131583A (en) * 2016-06-30 2016-11-16 北京小米移动软件有限公司 A kind of live processing method, device, terminal unit and system
CN106131700A (en) * 2016-07-04 2016-11-16 天脉聚源(北京)传媒科技有限公司 A kind of sharing files method and device during net cast


Similar Documents

Publication Publication Date Title
CN106658215A (en) Method and device for pushing live file
CN104469437B (en) Advertisement sending method and device
CN105791958A (en) Method and device for live broadcasting game
CN105828201B (en) Method for processing video frequency and device
CN106331761A (en) Live broadcast list display method and apparatuses
CN106804000A (en) Direct playing and playback method and device
CN106210757A (en) Live broadcasting method, live broadcast device and live broadcast system
CN104731688B (en) Point out the method and device of reading progress
CN106488251A (en) Realize the live middle even method and device of wheat, main broadcaster's client and subscription client
EP3264774B1 (en) Live broadcasting method and device for live broadcasting
CN106911967A (en) Direct playing and playback method and device
CN106231378A (en) The display packing of direct broadcasting room, Apparatus and system
CN104391711B (en) A kind of method and device that screen protection is set
CN106412710A (en) Method and device for exchanging information through graphical label in live video streaming
CN105120301B (en) Method for processing video frequency and device, smart machine
CN105468767A (en) Method and device for acquiring calling card information
CN105447150A (en) Face album based music playing method and apparatus, and terminal device
CN106550252A (en) The method for pushing of information, device and equipment
CN106919629A (en) The method and device of information sifting is realized in group chat
CN105117111A (en) Rendering method and device for virtual reality interaction frames
CN105120155A (en) Panoramic photograph generation method and device
CN110121106A (en) Video broadcasting method and device
CN107122430A (en) Search result display methods and device
CN105872573A (en) Video playing method and apparatus
CN106792118A (en) The player method and device of multimedia file

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170510