CN106604101A - Live streaming interaction method and device - Google Patents


Info

Publication number
CN106604101A
CN106604101A
Authority
CN
China
Prior art keywords
image
target
prop
interactive
data frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611034111.XA
Other languages
Chinese (zh)
Inventor
乔梁
李志刚
韩尚佑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201611034111.XA priority Critical patent/CN106604101A/en
Publication of CN106604101A publication Critical patent/CN106604101A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44016Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/61Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/612Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80Responding to QoS
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/47815Electronic shopping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Abstract

The invention provides a live streaming interaction method and device. The method includes: acquiring a target interaction prop; determining a target interaction image derived from a live streaming data frame; and outputting the target interaction prop based on the target interaction image. The method can enhance interaction between the audience and the host, and can effectively improve the viewing rate of a live program.

Description

Live streaming interaction method and device
Technical field
The present disclosure relates to the technical field of network live streaming, and in particular to a live streaming interaction method and device.
Background technology
With the continuous development of terminal technology, live streaming on terminals has gained increasing advantages. Current live streaming, however, is rather monotonous; to improve the effect of a live program, interactions between the audience and the host are sometimes added during the broadcast, so as to make the program more attractive and raise its viewing rate. Audience-host interaction may include, for example, viewers sending gifts to the host. In the related art, sending a gift merely plays a simple animation on the host's side; this mode of interaction is not very engaging, and it is difficult to effectively improve the viewing rate of a live program.
Summary of the Invention
To solve the above technical problem, the present disclosure provides a live streaming interaction method and device.
According to a first aspect of embodiments of the present disclosure, a live streaming interaction method is provided, including:
acquiring a target interaction prop;
determining a target interaction image derived from a live streaming data frame;
outputting the target interaction prop based on the target interaction image.
Optionally, acquiring the target interaction prop includes:
outputting an interaction prop selection interface;
acquiring the interaction prop selected by a user through the selection interface as the target interaction prop.
Optionally, determining the target interaction image derived from the live streaming data frame includes:
collecting candidate images based on live streaming data frames;
storing the candidate images;
when an interaction instruction is detected, acquiring the target interaction image from the candidate images.
Optionally, collecting candidate images based on live streaming data frames includes:
collecting a live streaming data frame;
identifying a target region from the live streaming data frame;
extracting the image corresponding to the target region as a candidate image.
Optionally, collecting candidate images based on live streaming data frames includes:
acquiring an image captured from a live streaming data frame as a candidate image.
Optionally, acquiring the target interaction image from the candidate images includes:
outputting the candidate images;
acquiring an image selected by the user from the candidate images as the target interaction image.
Optionally, acquiring the target interaction image from the candidate images includes:
finding, from the candidate images, images matching the target interaction prop as first images;
acquiring the target interaction image from the first images.
Optionally, outputting the target interaction prop based on the target interaction image includes:
determining the position and size of an image insertion region in the target interaction prop;
inserting the target interaction image into the image insertion region according to the position and size of the image insertion region;
outputting the target interaction prop with the target interaction image inserted.
Optionally, determining the target interaction image derived from the live streaming data frame includes:
when an interaction instruction is detected, acquiring the current live streaming data frame in real time;
identifying whether the current live streaming data frame contains a region with a predetermined feature;
determining a live streaming data frame containing a region with the predetermined feature as the current target interaction image.
Optionally, outputting the target interaction prop based on the target interaction image includes:
identifying, from the current target interaction image, a region matching the target interaction prop as a prop display region;
displaying the target interaction prop in the prop display region.
According to a second aspect of embodiments of the present disclosure, a live streaming interaction device is provided, including:
an acquisition module configured to acquire a target interaction prop;
a determining module configured to determine a target interaction image derived from a live streaming data frame;
an output module configured to output the target interaction prop based on the target interaction image.
Optionally, the acquisition module includes:
a first output submodule configured to output an interaction prop selection interface;
a first acquisition submodule configured to acquire the interaction prop selected by a user through the selection interface as the target interaction prop.
Optionally, the determining module includes:
a collection submodule configured to collect candidate images based on live streaming data frames;
a storage submodule configured to store the candidate images;
a second acquisition submodule configured to, when an interaction instruction is detected, acquire the target interaction image from the candidate images.
Optionally, the collection submodule is configured to:
collect a live streaming data frame;
identify a target region from the live streaming data frame;
extract the image corresponding to the target region as a candidate image.
Optionally, the collection submodule is configured to:
acquire an image captured from a live streaming data frame as a candidate image.
Optionally, the second acquisition submodule is configured to:
output the candidate images;
acquire an image selected by the user from the candidate images as the target interaction image.
Optionally, the second acquisition submodule is configured to:
find, from the candidate images, images matching the target interaction prop as first images;
acquire the target interaction image from the first images.
Optionally, the output module includes:
a determination submodule configured to determine the position and size of an image insertion region in the target interaction prop;
an insertion submodule configured to insert the target interaction image into the image insertion region according to the position and size of the image insertion region;
a second output submodule configured to output the target interaction prop with the target interaction image inserted.
Optionally, the determining module includes:
a frame acquisition submodule configured to, when an interaction instruction is detected, acquire the current live streaming data frame in real time;
an identification submodule configured to identify whether the current live streaming data frame contains a region with a predetermined feature;
an image determination submodule configured to determine a live streaming data frame containing a region with the predetermined feature as the current target interaction image.
Optionally, the output module includes:
a region identification submodule configured to identify, from the current target interaction image, a region matching the target interaction prop as a prop display region;
a display submodule configured to display the target interaction prop in the prop display region.
According to a third aspect of embodiments of the present disclosure, a live streaming interaction device is provided, including:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
acquire a target interaction prop;
determine a target interaction image derived from a live streaming data frame;
output the target interaction prop based on the target interaction image.
The technical solutions provided by embodiments of the present disclosure may include the following beneficial effects:
In the live streaming interaction method provided by the above embodiments of the present disclosure, a target interaction prop is acquired, a target interaction image derived from a live streaming data frame is determined, and the target interaction prop is output based on the target interaction image. This enhances the interaction between viewers and the host and effectively improves the viewing rate of a live program.
In the live streaming interaction method provided by the above embodiments of the present disclosure, an interaction prop selection interface is output, and the interaction prop selected by a user through the selection interface is acquired as the target interaction prop. A target interaction image derived from a live streaming data frame is determined, and the target interaction prop is output based on the target interaction image. This helps enhance the interaction between viewers and the host, further effectively improving the viewing rate of a live program.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings herein, which are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
Fig. 1 is a flowchart of a live streaming interaction method according to an exemplary embodiment of the present disclosure;
Fig. 2 is a flowchart of another live streaming interaction method according to an exemplary embodiment of the present disclosure;
Fig. 3 is a block diagram of a live streaming interaction device according to an exemplary embodiment of the present disclosure;
Fig. 4 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the present disclosure;
Fig. 5 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the present disclosure;
Fig. 6 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the present disclosure;
Fig. 7 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the present disclosure;
Fig. 8 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the present disclosure;
Fig. 9 is a schematic structural diagram of a live streaming interaction device according to an exemplary embodiment of the present disclosure.
Detailed Description
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. On the contrary, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
The terms used in the present disclosure are for the purpose of describing particular embodiments only and are not intended to limit the present disclosure. The singular forms "a", "said", and "the" used in the present disclosure and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in the present disclosure to describe various information, such information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, the first information may also be referred to as the second information, and similarly, the second information may also be referred to as the first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "while", or "in response to determining".
As shown in Fig. 1, Fig. 1 is a flowchart of a live streaming interaction method according to an exemplary embodiment. The method may be applied to a terminal or to a server. In this embodiment, for ease of understanding, a terminal device with a display screen is used for illustration. Those skilled in the art will appreciate that the terminal device may include, but is not limited to, mobile terminal devices such as smartphones, smart wearable devices, tablet computers, personal digital assistants, laptop computers, and desktop computers. The method includes the following steps:
In step 101, a target interaction prop is acquired.
In general, when watching a live stream, a viewer can interact with the host, for example by sending gifts to the host. When interacting with the host, the prop to be used in the interaction needs to be determined first. In this embodiment, the target interaction prop is the prop used when the viewer interacts with the host. Specifically, an interaction prop selection interface may first be output, in which various interaction props to be selected are displayed for the user to choose from. Then, the interaction prop selected by the user through the selection interface is acquired as the target interaction prop.
In step 102, a target interaction image derived from a live streaming data frame is determined.
In step 103, the target interaction prop is output based on the target interaction image.
In this embodiment, a live streaming data frame is a video image frame of the viewer or the host that is output during the live stream. The target interaction image may be determined from a live streaming data frame by capturing part of the frame as a static image to serve as the target interaction image. Alternatively, each live frame may be identified in real time, and a partial image with a predetermined feature (such as a face feature, a lip feature, or an eye feature) may be recognized as the target interaction image. In either case, the target interaction image is derived from a live streaming data frame.
Then, the target interaction prop may be output based on the target interaction image. For example, the target interaction prop may be output with the target interaction image inserted directly at a predetermined position in the prop. Alternatively, the target interaction prop may be displayed at the position with the predetermined feature in the current live streaming data frame.
In the live streaming interaction method provided by the above embodiment of the present disclosure, a target interaction prop is acquired, a target interaction image derived from a live streaming data frame is determined, and the target interaction prop is output based on the target interaction image. This enhances the interaction between viewers and the host and effectively improves the viewing rate of a live program.
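The three-step flow of Fig. 1 (acquire a prop, determine a target image from a live frame, output the prop based on that image) can be sketched as a minimal pipeline. This is an illustrative sketch only, not the patent's implementation; function names such as `acquire_prop` and the dictionary-based data shapes are assumptions made for the example.

```python
# Minimal sketch of the Fig. 1 flow: step 101 acquires a prop, step 102 picks a
# target interaction image out of a live frame, step 103 combines the two.
# All names and data shapes here are illustrative assumptions.

def acquire_prop(selection, available):
    """Step 101: return the prop the viewer picked from the selection interface."""
    return available[selection]

def determine_target_image(frame, feature):
    """Step 102: find a region with the predetermined feature in a live frame."""
    for region in frame["regions"]:          # regions found by an image recognizer
        if region["feature"] == feature:
            return region
    return None

def output_prop(prop, target_image):
    """Step 103: attach the target interaction image to the prop for display."""
    return {"prop": prop["name"], "image_at": target_image["bbox"]}

# Usage under the assumed data shapes:
props = {"glasses": {"name": "glasses", "feature": "eyes"}}
frame = {"regions": [{"feature": "face", "bbox": (10, 10, 80, 80)},
                     {"feature": "eyes", "bbox": (25, 30, 50, 12)}]}
prop = acquire_prop("glasses", props)
image = determine_target_image(frame, prop["feature"])
print(output_prop(prop, image))  # {'prop': 'glasses', 'image_at': (25, 30, 50, 12)}
```

The three functions map one-to-one onto steps 101-103, which is also how the device of the second aspect splits them across the acquisition, determining, and output modules.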
As shown in Fig. 2, Fig. 2 is a flowchart of another live streaming interaction method according to an exemplary embodiment. This embodiment describes in detail the process of acquiring the target interaction prop. The method may be applied to a terminal or to a server, and includes the following steps:
In step 201, an interaction prop selection interface is output.
In step 202, the interaction prop selected by the user through the selection interface is acquired as the target interaction prop.
In this embodiment, the interaction prop selection interface may first be output on the viewer side. Various candidate interaction props are displayed on the selection interface; for example, these props may be the image of a car, a pair of glasses, or a ring. Through the selection interface, the user can choose the interaction prop he or she wants to use as the target interaction prop. After the user selects an interaction prop through the selection interface, the selected prop is acquired as the target interaction prop.
In step 203, a target interaction image derived from a live streaming data frame is determined.
In one implementation of this embodiment, candidate images may be collected based on live streaming data frames and stored. When an interaction instruction is detected, the target interaction image is acquired from the candidate images.
In this embodiment, collecting candidate images based on live streaming data frames may be implemented as follows: live streaming data frames are collected in real time, or a live streaming data frame is collected at certain intervals. Then, image recognition is performed on the collected frames to identify target regions. A target region may be a region with a predetermined feature, such as a region with a face feature, a lip feature, an eye feature, or a hand feature. The image corresponding to the target region may be extracted as a candidate image; for example, the image corresponding to a face region, lip region, eye region, or hand region may be taken as a candidate image.
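The collection step above (sample frames at an interval, detect feature regions, crop them out as candidates) can be sketched as follows. The frames are plain 2D lists standing in for pixel data, and `detect_regions` is a deliberately naive stand-in for a real image recognizer; both are assumptions for illustration only.

```python
# Sketch of candidate-image collection: sample live frames at an interval,
# run a (mocked) feature detector, and crop each detected region out of the
# frame as a candidate image.

def detect_regions(frame):
    """Stand-in for an image recognizer: report bright pixels as 'face' regions."""
    regions = []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if pixel >= 200:                       # naive brightness threshold
                regions.append({"feature": "face", "bbox": (x, y, 1, 1)})
    return regions

def crop(frame, bbox):
    """Cut the bounding box (x, y, w, h) out of a 2D pixel list."""
    x, y, w, h = bbox
    return [row[x:x + w] for row in frame[y:y + h]]

def collect_candidates(frames, sample_every=2):
    """Sample every Nth frame, detect target regions, crop them as candidates."""
    candidates = []
    for i, frame in enumerate(frames):
        if i % sample_every:                       # skip frames between samples
            continue
        for region in detect_regions(frame):
            candidates.append({"feature": region["feature"],
                               "image": crop(frame, region["bbox"])})
    return candidates

frames = [[[0, 0], [0, 255]],   # frame 0: one bright pixel at (1, 1)
          [[255, 0], [0, 0]],   # frame 1: skipped by the sampling interval
          [[0, 0], [0, 0]]]     # frame 2: nothing detected
cands = collect_candidates(frames)
print(len(cands), cands[0]["image"])  # 1 [[255]]
```

In practice the detector would be a face/lip/eye/hand recognizer over real video frames; only the sample-detect-crop structure is the point here.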
In this embodiment, collecting candidate images based on live streaming data frames may also be implemented as follows: the viewer selects a certain region from a live streaming data frame using a screenshot tool, and the selected region is cut out of the frame as a candidate image.
In this embodiment, acquiring the target interaction image from the candidate images may be implemented as follows: a target interaction image selection interface is output, in which some or all of the candidate images are displayed. The user selects an image from the candidate images, and the image selected by the user is acquired as the target interaction image.
In this embodiment, acquiring the target interaction image from the candidate images may also be implemented as follows: images matching the target interaction prop are found from the candidate images as first images, and the target interaction image is acquired from the first images. In general, different types of interaction props may match different types of interaction images; for example, a glasses-shaped prop may match interaction images with an eye feature, a ring-shaped prop may match interaction images with a finger feature, and a hat-shaped prop may match interaction images with a face feature. Specifically, the identification information of the target interaction prop may first be determined, and then candidate images associated with that identification information may be found as the first images. The target interaction image may then be acquired from the first images. For example, the first images may be output for the user to choose from, and the first image chosen by the user is taken as the target interaction image. Alternatively, the target interaction prop may be further matched against the first images, e.g., in terms of size or color, and the first image with the highest degree of matching is taken as the target interaction image.
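The two-stage matching just described (filter candidates by the prop's associated feature type, then rank the remaining first images by closeness of fit) can be sketched as below. The association table and the size-distance score are illustrative assumptions; the patent leaves the matching criterion open (size, color, etc.).

```python
# Sketch of prop-to-image matching: stage 1 keeps only candidates whose
# feature tag is associated with the prop's identification info (glasses->eyes,
# ring->finger, hat->face); stage 2 ranks those "first images" by how close
# their size is to the prop's display size.

PROP_FEATURE = {"glasses": "eyes", "ring": "finger", "hat": "face"}  # assumed table

def first_images(candidates, prop_id):
    """Stage 1: filter candidates associated with the prop's identification info."""
    wanted = PROP_FEATURE[prop_id]
    return [c for c in candidates if c["feature"] == wanted]

def best_match(candidates, prop_id, prop_size):
    """Stage 2: pick the first image whose size is closest to the prop's size."""
    firsts = first_images(candidates, prop_id)
    if not firsts:
        return None
    return min(firsts, key=lambda c: abs(c["size"][0] - prop_size[0])
                                     + abs(c["size"][1] - prop_size[1]))

candidates = [{"feature": "eyes", "size": (50, 20)},
              {"feature": "face", "size": (80, 80)},   # filtered out in stage 1
              {"feature": "eyes", "size": (40, 16)}]
target = best_match(candidates, "glasses", (42, 15))
print(target["size"])  # (40, 16)
```

The alternative described in the text, letting the user pick among the first images, would simply replace `best_match` with a selection interface over `first_images(...)`.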
In another implementation of this embodiment, when an interaction instruction is detected, the current live streaming data frame may be acquired in real time, and whether the current frame contains a region with a predetermined feature is identified. A live streaming data frame containing a region with the predetermined feature is determined as the current target interaction image.
In this embodiment, when an interaction instruction is detected, the current live streaming data frame may be acquired in real time. Then, for each frame acquired, image recognition is used in real time to determine whether the frame contains a region with a predetermined feature. A live streaming data frame containing such a region is determined as the current target interaction image. The region with a predetermined feature may be a region with a face feature, a lip feature, an eye feature, or a hand feature. It should be understood that the predetermined feature may be any recognizable feature; the present disclosure does not limit the specific content of the predetermined feature.
In step 204, the target interaction prop is output based on the target interaction image.
In this embodiment, the target interaction prop may be output with the target interaction image inserted directly at a predetermined position in the prop. Alternatively, the target interaction prop may be displayed at the position with the predetermined feature in the current live streaming data frame.
Specifically, in one implementation, the position and size of an image insertion region in the target interaction prop may first be determined. Then, the target interaction image is inserted into the image insertion region according to the position and size of the region. In particular, the target interaction image may be processed according to the size of the image insertion region (e.g., scaled or rotated) so that the processed target interaction image matches the size of the insertion region. The target interaction image is then inserted at the position of the image insertion region, and the target interaction prop with the target interaction image inserted is output.
For example, in a car-shaped interaction prop, there may be an image insertion region at the display position of the driver's seat, in which an image with a face feature can be displayed. The face image of the host collected from a live streaming data frame may be processed and inserted at the driver's seat position, and output together with the car-shaped interaction prop.
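The scale-then-insert operation of this implementation can be sketched with 2D lists standing in for pixel data. Nearest-neighbour resampling is one assumed way to realize the scaling the text mentions; a real implementation would work on actual image buffers.

```python
# Sketch of the insertion step: scale the target interaction image to the
# insertion region's size with nearest-neighbour resampling, then paste it
# into the prop image at the region's position (x, y, w, h).

def resize(image, new_w, new_h):
    """Nearest-neighbour scaling so the image fits the insertion region."""
    old_h, old_w = len(image), len(image[0])
    return [[image[y * old_h // new_h][x * old_w // new_w]
             for x in range(new_w)] for y in range(new_h)]

def insert(prop_image, target_image, region):
    """Paste the resized target image into the prop at the region's position."""
    x, y, w, h = region
    scaled = resize(target_image, w, h)
    out = [row[:] for row in prop_image]          # don't mutate the prop asset
    for dy in range(h):
        for dx in range(w):
            out[y + dy][x + dx] = scaled[dy][dx]
    return out

prop = [[0] * 4 for _ in range(4)]                # 4x4 "car" prop, blank pixels
face = [[9, 9], [9, 9]]                           # 2x2 face image from the frame
composited = insert(prop, face, (1, 1, 2, 2))     # driver's-seat region at (1, 1)
print(composited)
# [[0, 0, 0, 0], [0, 9, 9, 0], [0, 9, 9, 0], [0, 0, 0, 0]]
```

Rotation, mentioned as another possible processing step, would be a further transform applied before pasting; only scaling is shown here.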
In another implementation, a region matching the target interaction prop may be identified from the current target interaction image as the prop display region. The target interaction prop is then displayed in the prop display region.
For example, when the interaction prop selected by the user is a glasses-shaped prop, the region matching the glasses-shaped prop, namely the region of the host's eyes, may be identified from the current target interaction image. The display region corresponding to the glasses-shaped prop is determined according to this region, and the glasses-shaped prop may then be displayed in that display region.
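This second output mode (place the prop onto the frame at the matched feature region, rather than inserting the image into the prop) can be sketched as follows. The prop-to-feature table and the detected-region list are illustrative assumptions, reusing the associations suggested by the examples above.

```python
# Sketch of the second output mode: find the region in the current target
# interaction image that matches the prop type (e.g. the eyes for a glasses
# prop) and emit a draw instruction placing the prop there, scaled to fit.

PROP_FEATURE = {"glasses": "eyes", "ring": "finger", "hat": "face"}  # assumed table

def prop_display_region(frame_regions, prop_id):
    """Identify the detected region that matches the prop type."""
    wanted = PROP_FEATURE[prop_id]
    for region in frame_regions:
        if region["feature"] == wanted:
            return region["bbox"]
    return None

def place_prop(prop_id, frame_regions):
    """Return a draw instruction: which prop goes where, scaled to the region."""
    bbox = prop_display_region(frame_regions, prop_id)
    if bbox is None:
        return None                # no matching feature in the current frame
    x, y, w, h = bbox
    return {"prop": prop_id, "x": x, "y": y, "width": w, "height": h}

regions = [{"feature": "face", "bbox": (100, 60, 200, 200)},
           {"feature": "eyes", "bbox": (140, 120, 120, 40)}]
print(place_prop("glasses", regions))
# {'prop': 'glasses', 'x': 140, 'y': 120, 'width': 120, 'height': 40}
```

Returning `None` when no matching feature is found corresponds to frames that are not determined as the current target interaction image.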
It should be noted that for the step identical with Fig. 1 embodiments, no longer being gone to live in the household of one's in-laws on getting married in above-mentioned Fig. 2 embodiments State, related content can be found in Fig. 1 embodiments.
The method of the living broadcast interactive that above-described embodiment of the disclosure is provided, by the interactive stage property selection interface of output, obtains The interactive stage property that user is selected by the selection interface, as target interaction stage property.It is determined that from the target of live data frame Interactive image, and based on target interactive image output target interaction stage property.It is interactive with main broadcaster so as to contribute to enhancing spectators, Further effectively improve the viewing rate of programme televised live.
It should be noted that although the operations of the disclosed method are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed to achieve the desired result. On the contrary, the steps depicted in the flowcharts may be executed in a different order. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
Corresponding to the foregoing embodiments of the live streaming interaction method, the disclosure further provides embodiments of a live streaming interaction device.
As shown in Fig. 3, Fig. 3 is a block diagram of a live streaming interaction device according to an exemplary embodiment of the disclosure. The device includes: an acquisition module 301, a determining module 302 and an output module 303.
The acquisition module 301 is configured to obtain a target interactive prop.
The determining module 302 is configured to determine a target interactive image from live data frames.
The output module 303 is configured to output the target interactive prop based on the target interactive image.
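The acquisition / determining / output split above can be sketched as a small pipeline. Every class and method name here is an assumption made for illustration; the patent describes modules abstractly and does not prescribe this structure.

```python
# Minimal sketch of the three-module device: an acquisition step resolves
# the user's prop choice, a determining step finds a target interactive
# image in the frame stream, and an output step combines the two.

class LiveInteraction:
    def __init__(self, detect_target, render):
        self.detect_target = detect_target  # determining module: frame -> image or None
        self.render = render                # output module: (prop, image) -> output

    def acquire_prop(self, user_choice, catalog):
        # acquisition module: resolve the prop the user selected
        return catalog[user_choice]

    def run(self, user_choice, catalog, frames):
        prop = self.acquire_prop(user_choice, catalog)
        for frame in frames:
            target = self.detect_target(frame)
            if target is not None:
                return self.render(prop, target)
        return None

pipeline = LiveInteraction(
    detect_target=lambda f: f if "face" in f else None,
    render=lambda prop, img: f"{prop}+{img}",
)
result = pipeline.run("glasses", {"glasses": "GLASSES"}, ["empty", "face#1"])
# result == "GLASSES+face#1"
```

Keeping the three responsibilities behind separate callables mirrors the module boundaries in Fig. 3 and lets each embodiment below swap in its own submodules.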
As shown in Fig. 4, Fig. 4 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the disclosure. On the basis of the embodiment shown in Fig. 3, the acquisition module 301 may include: a first output submodule 401 and a first acquisition submodule 402.
The first output submodule 401 is configured to output an interactive prop selection interface.
The first acquisition submodule 402 is configured to obtain the interactive prop selected by the user through the selection interface as the target interactive prop.
As shown in Fig. 5, Fig. 5 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the disclosure. On the basis of the embodiment shown in Fig. 3, the determining module 302 may include: a collection submodule 501, a storage submodule 502 and a second acquisition submodule 503.
The collection submodule 501 is configured to collect alternative images based on live data frames.
The storage submodule 502 is configured to store the alternative images.
The second acquisition submodule 503 is configured to, when an interaction instruction is detected, obtain the target interactive image from the alternative images.
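The collect, store, fetch-on-instruction flow of these three submodules can be sketched as follows. The bounded ring buffer and all names are assumptions for the sketch; the patent does not specify a storage policy.

```python
# Sketch of the determining module of Fig. 5: frames are mined for
# alternative images, stored, and queried when an interaction instruction
# arrives. Strings stand in for frames/images.

from collections import deque

class AlternativeImageStore:
    def __init__(self, capacity=32):
        self._images = deque(maxlen=capacity)   # storage submodule

    def collect(self, frame, extract):
        img = extract(frame)                    # collection submodule
        if img is not None:
            self._images.append(img)

    def on_interaction(self, select):
        # second acquisition submodule: pick the target interactive image
        return select(list(self._images)) if self._images else None

store = AlternativeImageStore()
for frame in ["ad", "face-a", "face-b"]:
    store.collect(frame, extract=lambda f: f if f.startswith("face") else None)
target = store.on_interaction(select=lambda imgs: imgs[-1])  # latest image
# target == "face-b"
```

Passing `extract` and `select` as callables keeps the store neutral between the two collection variants (region extraction vs. screenshot capture) described next.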
In some optional embodiments, the collection submodule is configured to: collect a live data frame, identify a target region from the live data frame, and extract the image corresponding to the target region as an alternative image.
In some optional embodiments, the collection submodule is configured to: obtain an image captured from the live data frame as an alternative image.
In some optional embodiments, the second acquisition submodule is configured to: output the alternative images, and obtain the image selected by the user from the alternative images as the target interactive image.
In some optional embodiments, the second acquisition submodule is configured to: find, from the alternative images, the images matching the target interactive prop as first images, and obtain the target interactive image from the first images.
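The filter-then-choose step of the last variant can be sketched as below. The matching predicate and the score field are assumptions; the patent leaves the matching criterion abstract.

```python
# Sketch of "find the images matching the target prop as first images,
# then obtain the target interactive image from them".

def pick_target_image(candidates, matches_prop, prefer=max):
    """Filter candidates by the prop-matching predicate, then choose one."""
    first_images = [c for c in candidates if matches_prop(c)]
    if not first_images:
        return None
    return prefer(first_images, key=lambda c: c["score"])

candidates = [
    {"id": 1, "has_face": False, "score": 0.2},
    {"id": 2, "has_face": True,  "score": 0.7},
    {"id": 3, "has_face": True,  "score": 0.9},
]
best = pick_target_image(candidates, matches_prop=lambda c: c["has_face"])
# best["id"] == 3
```

For a glasses-form prop, `matches_prop` might test whether a face (and thus an eye region) was detected in the candidate; here that is reduced to a boolean flag.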
As shown in Fig. 6, Fig. 6 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the disclosure. On the basis of the embodiment shown in Fig. 3, the output module 303 may include: a determination submodule 601, an insertion submodule 602 and a second output submodule 603.
The determination submodule 601 is configured to determine the position and size of the image insertion region in the target interactive prop.
The insertion submodule 602 is configured to insert the target interactive image into the image insertion region according to the position and size of the image insertion region.
The second output submodule 603 is configured to output the target interactive prop with the target interactive image inserted.
As shown in Fig. 7, Fig. 7 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the disclosure. On the basis of the embodiment shown in Fig. 3, the determining module 302 may include: a data frame acquisition submodule 701, a recognition submodule 702 and an image determination submodule 703.
The data frame acquisition submodule 701 is configured to, when an interaction instruction is detected, obtain the current live data frame in real time.
The recognition submodule 702 is configured to recognize whether the current live data frame contains a region with a predetermined characteristic.
The image determination submodule 703 is configured to determine the live data frame containing the region with the predetermined characteristic as the current target interactive image.
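The real-time variant of Fig. 7 can be sketched as a scan over incoming frames. The frame format and the `has_characteristic` predicate are assumptions; in practice the predicate would be a detector for the predetermined characteristic (e.g. a face).

```python
# Sketch: on an interaction instruction, scan live frames and take the
# first one containing a region with the predetermined characteristic
# as the current target interactive image.

def current_target_image(frames, has_characteristic):
    """Return the first live frame that contains the predetermined region."""
    for frame in frames:
        if has_characteristic(frame):
            return frame    # becomes the current target interactive image
    return None

frames = [{"t": 0, "regions": []},
          {"t": 1, "regions": ["face"]},
          {"t": 2, "regions": ["face"]}]
target = current_target_image(frames, lambda f: "face" in f["regions"])
# target["t"] == 1
```

Unlike the Fig. 5 variant, nothing is stored here: the decision is made on the live frame stream at the moment the interaction instruction arrives.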
As shown in Fig. 8, Fig. 8 is a block diagram of another live streaming interaction device according to an exemplary embodiment of the disclosure. On the basis of the embodiment shown in Fig. 3, the output module 303 may include: a region recognition submodule 801 and a display submodule 802.
The region recognition submodule 801 is configured to identify, from the current target interactive image, the region matching the target interactive prop as the prop display area.
The display submodule 802 is configured to display the target interactive prop in the prop display area.
It should be understood that the above device may be preset in a terminal or a server, or may be loaded into a terminal or a server by means such as downloading. The corresponding modules in the above device can cooperate with modules in the terminal or the server to realize the live streaming interaction scheme.
As the device embodiments basically correspond to the method embodiments, refer to the description of the method embodiments for the relevant parts. The device embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the disclosed scheme. Those of ordinary skill in the art can understand and implement this without creative effort.
Accordingly, the disclosure further provides a live streaming interaction device. The live streaming interaction device includes a processor and a memory for storing processor-executable instructions, wherein the processor is configured to:
obtain a target interactive prop;
determine a target interactive image from live data frames;
output the target interactive prop based on the target interactive image.
Fig. 9 is a schematic structural diagram of a live streaming interaction device 9900 according to an exemplary embodiment. For example, the device 9900 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to Fig. 9, the device 9900 may include one or more of the following components: a processing component 9902, a memory 9904, a power component 9906, a multimedia component 9908, an audio component 9910, an input/output (I/O) interface 9912, a sensor component 9914, and a communication component 9916.
The processing component 9902 generally controls the overall operation of the device 9900, such as operations associated with display, telephone calls, data communication, camera operation and recording operation. The processing component 9902 may include one or more processors 9920 to execute instructions so as to complete all or part of the steps of the above method. Additionally, the processing component 9902 may include one or more modules to facilitate interaction between the processing component 9902 and other components. For example, the processing component 9902 may include a multimedia module to facilitate interaction between the multimedia component 9908 and the processing component 9902.
The memory 9904 is configured to store various types of data to support operation on the device 9900. Examples of such data include instructions for any application or method operating on the device 9900, contact data, phone book data, messages, pictures, video, etc. The memory 9904 may be realized by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disc.
The power component 9906 provides power to the various components of the device 9900. The power component 9906 may include a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the device 9900.
The multimedia component 9908 includes a screen providing an output interface between the device 9900 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 9908 includes a front camera and/or a rear camera. When the device 9900 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 9910 is configured to output and/or input audio signals. For example, the audio component 9910 includes a microphone (MIC), which is configured to receive external audio signals when the device 9900 is in an operating mode such as a call mode, a recording mode or a speech recognition mode. The received audio signal may be further stored in the memory 9904 or sent via the communication component 9916. In some embodiments, the audio component 9910 also includes a loudspeaker for outputting audio signals.
The I/O interface 9912 provides an interface between the processing component 9902 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, etc. These buttons may include but are not limited to: a home button, volume buttons, a start button and a lock button.
The sensor component 9914 includes one or more sensors for providing state assessments of various aspects of the device 9900. For example, the sensor component 9914 can detect the open/closed state of the device 9900 and the relative positioning of components, e.g., the display and keypad of the device 9900; the sensor component 9914 can also detect a change in position of the device 9900 or of a component of the device 9900, the presence or absence of user contact with the device 9900, the orientation or acceleration/deceleration of the device 9900, and a change in temperature of the device 9900. The sensor component 9914 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 9914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 9914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, a microwave sensor or a temperature sensor.
The communication component 9916 is configured to facilitate wired or wireless communication between the device 9900 and other devices. The device 9900 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 9916 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 9916 also includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module can be realized based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the device 9900 may be realized by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for performing the above method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 9904 including instructions; the above instructions can be executed by the processor 9920 of the device 9900 to complete the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, etc.
Other embodiments of the disclosure will readily occur to those skilled in the art after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses or adaptations of the disclosure that follow the general principles of the disclosure and include common knowledge or conventional techniques in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the disclosure is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes can be made without departing from its scope. The scope of the disclosure is limited only by the appended claims.

Claims (21)

1. A live streaming interaction method, characterized in that the method comprises:
obtaining a target interactive prop;
determining a target interactive image from live data frames;
outputting the target interactive prop based on the target interactive image.
2. The method according to claim 1, characterized in that obtaining the target interactive prop comprises:
outputting an interactive prop selection interface;
obtaining the interactive prop selected by the user through the selection interface as the target interactive prop.
3. The method according to claim 1, characterized in that determining the target interactive image from live data frames comprises:
collecting alternative images based on live data frames;
storing the alternative images;
when an interaction instruction is detected, obtaining the target interactive image from the alternative images.
4. The method according to claim 3, characterized in that collecting alternative images based on live data frames comprises:
collecting a live data frame;
identifying a target region from the live data frame;
extracting the image corresponding to the target region as an alternative image.
5. The method according to claim 3, characterized in that collecting alternative images based on live data frames comprises:
obtaining an image captured from the live data frame as an alternative image.
6. The method according to claim 3, characterized in that obtaining the target interactive image from the alternative images comprises:
outputting the alternative images;
obtaining the image selected by the user from the alternative images as the target interactive image.
7. The method according to claim 3, characterized in that obtaining the target interactive image from the alternative images comprises:
finding, from the alternative images, the images matching the target interactive prop as first images;
obtaining the target interactive image from the first images.
8. The method according to claim 3, characterized in that outputting the target interactive prop based on the target interactive image comprises:
determining the position and size of an image insertion region in the target interactive prop;
inserting the target interactive image into the image insertion region according to the position and size of the image insertion region;
outputting the target interactive prop with the target interactive image inserted.
9. The method according to claim 1, characterized in that determining the target interactive image from live data frames comprises:
when an interaction instruction is detected, obtaining the current live data frame in real time;
recognizing whether the current live data frame contains a region with a predetermined characteristic;
determining the live data frame containing the region with the predetermined characteristic as the current target interactive image.
10. The method according to claim 9, characterized in that outputting the target interactive prop based on the target interactive image comprises:
identifying, from the current target interactive image, the region matching the target interactive prop as a prop display area;
displaying the target interactive prop in the prop display area.
11. A live streaming interaction device, characterized in that the device comprises:
an acquisition module, configured to obtain a target interactive prop;
a determining module, configured to determine a target interactive image from live data frames;
an output module, configured to output the target interactive prop based on the target interactive image.
12. The device according to claim 11, characterized in that the acquisition module comprises:
a first output submodule, configured to output an interactive prop selection interface;
a first acquisition submodule, configured to obtain the interactive prop selected by the user through the selection interface as the target interactive prop.
13. The device according to claim 11, characterized in that the determining module comprises:
a collection submodule, configured to collect alternative images based on live data frames;
a storage submodule, configured to store the alternative images;
a second acquisition submodule, configured to, when an interaction instruction is detected, obtain the target interactive image from the alternative images.
14. The device according to claim 13, characterized in that the collection submodule is configured to:
collect a live data frame;
identify a target region from the live data frame;
extract the image corresponding to the target region as an alternative image.
15. The device according to claim 13, characterized in that the collection submodule is configured to:
obtain an image captured from the live data frame as an alternative image.
16. The device according to claim 13, characterized in that the second acquisition submodule is configured to:
output the alternative images;
obtain the image selected by the user from the alternative images as the target interactive image.
17. The device according to claim 13, characterized in that the second acquisition submodule is configured to:
find, from the alternative images, the images matching the target interactive prop as first images;
obtain the target interactive image from the first images.
18. The device according to claim 13, characterized in that the output module comprises:
a determination submodule, configured to determine the position and size of an image insertion region in the target interactive prop;
an insertion submodule, configured to insert the target interactive image into the image insertion region according to the position and size of the image insertion region;
a second output submodule, configured to output the target interactive prop with the target interactive image inserted.
19. The device according to claim 11, characterized in that the determining module comprises:
a data frame acquisition submodule, configured to, when an interaction instruction is detected, obtain the current live data frame in real time;
a recognition submodule, configured to recognize whether the current live data frame contains a region with a predetermined characteristic;
an image determination submodule, configured to determine the live data frame containing the region with the predetermined characteristic as the current target interactive image.
20. The device according to claim 19, characterized in that the output module comprises:
a region recognition submodule, configured to identify, from the current target interactive image, the region matching the target interactive prop as a prop display area;
a display submodule, configured to display the target interactive prop in the prop display area.
21. A live streaming interaction device, characterized in that it comprises:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
obtain a target interactive prop;
determine a target interactive image from live data frames;
output the target interactive prop based on the target interactive image.
CN201611034111.XA 2016-11-15 2016-11-15 Live streaming interaction method and device Pending CN106604101A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611034111.XA CN106604101A (en) 2016-11-15 2016-11-15 Live streaming interaction method and device


Publications (1)

Publication Number Publication Date
CN106604101A true CN106604101A (en) 2017-04-26

Family

ID=58592666

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611034111.XA Pending CN106604101A (en) 2016-11-15 2016-11-15 Live streaming interaction method and device

Country Status (1)

Country Link
CN (1) CN106604101A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104104703A (en) * 2013-04-09 2014-10-15 广州华多网络科技有限公司 Multi-person audio and video interaction method and system, client, and server
CN104618797A (en) * 2015-02-06 2015-05-13 腾讯科技(北京)有限公司 Information processing method and device and client
CN104780458A (en) * 2015-04-16 2015-07-15 美国掌赢信息科技有限公司 Method and electronic equipment for loading effects in instant video
CN105334963A (en) * 2015-10-29 2016-02-17 广州华多网络科技有限公司 Method and system for displaying virtual article


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107277632A (en) * 2017-05-12 2017-10-20 武汉斗鱼网络科技有限公司 A kind of method and apparatus for showing virtual present animation
CN107820132A (en) * 2017-11-21 2018-03-20 广州华多网络科技有限公司 Living broadcast interactive method, apparatus and system
CN107820132B (en) * 2017-11-21 2019-12-06 广州华多网络科技有限公司 Live broadcast interaction method, device and system
CN108134964A (en) * 2017-11-22 2018-06-08 上海掌门科技有限公司 Net cast stage property stacking method, computer equipment and storage medium
CN109195001A (en) * 2018-07-02 2019-01-11 广州虎牙信息科技有限公司 Methods of exhibiting, device, storage medium and the terminal of present is broadcast live


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170426