CN108966031A - Method, device, and electronic equipment for realizing playing-content control in a video session - Google Patents
Method, device, and electronic equipment for realizing playing-content control in a video session
- Publication number
- CN108966031A CN108966031A CN201710352069.4A CN201710352069A CN108966031A CN 108966031 A CN108966031 A CN 108966031A CN 201710352069 A CN201710352069 A CN 201710352069A CN 108966031 A CN108966031 A CN 108966031A
- Authority
- CN
- China
- Prior art keywords
- data
- video session
- image data
- touch event
- trace image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/475—End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
Abstract
The present disclosure provides a method, a device, an electronic equipment, and a computer-readable storage medium for realizing playing-content control in a video session. The method is applied to a terminal participating in a video session and includes: responding to a triggered drawing function, and creating a drawing area on the interface where the video session is carried out; tracking touch events triggered within the drawing area, generating a touch-event track, and obtaining corresponding trace image data; and fusing the trace image data with the video session data obtained at the corresponding moment, so that the fused trace image data causes the touch-event track to be displayed during playing of the video session data. With the technical solution provided by the present disclosure, during a video session and without resorting to a physical carrier, the user can draw text or figures by touch within the drawing area; the drawn pattern is displayed during playback of the video session data, making the played content of the video session richer.
Description
Technical field
The present disclosure relates to the field of computer technology, and in particular to a method, a device, an electronic equipment, and a computer-readable storage medium for realizing playing-content control in a video session.
Background art
At present, in mobile live streaming, the common interaction forms between viewers and the anchor are text comments, likes, virtual gifts (props), and the like. Although these interaction forms meet part of the demand for exchange between the anchor and the viewers, they are mainly one-way feedback from viewers to the anchor. Besides expressing information through body language and speech, the anchor may have more flexible demands; for example, the anchor may wish to give viewers more intuitive guidance by drawing, or, when the anchor wants to convey some relatively complex text information to the viewers during live streaming, the information usually has to be written on a carrier such as a blank sheet of paper and then shown to the viewers.

In existing live-streaming interaction, viewers mainly post comments, give likes, or send gifts. Taking comments as an example, when a viewer posts a comment, the text is transmitted to the server through a TCP request, and the anchor's live-streaming APP obtains and displays the comment information by means such as polling requests. The anchor can also transmit text information through a bulletin-like function, with the notice information sent to the server through a TCP request.

TCP is not a real-time network protocol and may introduce considerable delay when transmitting image information such as strokes. In live video streaming, text or image information too complex for the anchor to express in words can only be conveyed through speech or body movements. The interaction forms during playing are therefore single and cannot satisfy current live-streaming demands.
Summary of the invention
In order to solve the problem, existing in the related art, that text or image information too complex for the anchor to express in words during live video streaming cannot be conveyed, the present disclosure provides a method for realizing playing-content control in a video session.

Accordingly, the present disclosure provides a method for realizing playing-content control in a video session. The method is applied to a terminal participating in a video session and comprises:
responding to a triggered drawing function, and creating a drawing area on the interface where the video session is carried out;

tracking touch events triggered within the drawing area, generating a touch-event track, and obtaining corresponding trace image data; and

superposing the trace image data with the video session data obtained at the corresponding moment, so that the superposed trace image data causes the touch-event track to be displayed during playing of the video session data.
In another aspect, the present disclosure further provides a device for realizing playing-content control in a video session. The device is applied to a terminal participating in a video session and comprises:

a drawing-area creation module, configured to respond to a triggered drawing function and create a drawing area on the interface where the video session is carried out;

a trace-image obtaining module, configured to track touch events triggered within the drawing area, generate a touch-event track, and obtain corresponding trace image data; and

a data fusion module, configured to superpose the trace image data with the video session data obtained at the corresponding moment, so that the superposed trace image data causes the touch-event track to be displayed during playing of the video session data.
The present disclosure further provides an electronic equipment, comprising:

a processor; and

a memory for storing instructions executable by the processor;

wherein the processor is configured to execute the above method for realizing playing-content control in a video session.

In addition, the present disclosure further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above method for realizing playing-content control in a video session.
The technical solutions provided by the embodiments of the present disclosure may bring the following beneficial effects:

By creating a drawing area, tracking the touch events within the drawing area, generating a touch-event track, and superposing the obtained trace image data with the video session data obtained at the corresponding moment, the present disclosure enables the pattern of the touch-event track to be displayed while the video session data is played. Thus, during a video session and without resorting to a physical carrier, the user can draw text or figures by touch within the drawing area, and the drawn pattern is displayed during playback of the video session data, making the played content of the video session richer. This solves the current problems that during live streaming the anchor can only transmit information to viewers through speech or other body movements, cannot exchange complex information with viewers in real time, and the interaction forms of the live-streaming process are single.

It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present invention and, together with the specification, serve to explain the principles of the present invention.
Fig. 1 is a schematic diagram of an implementation environment involved in the present disclosure;

Fig. 2 is a schematic diagram of another implementation environment involved in the present disclosure;

Fig. 3 is a block diagram of a device according to an exemplary embodiment;

Fig. 4 is a flowchart of a method for realizing playing-content control in a video session according to an exemplary embodiment;

Fig. 5 is a rendering of a superposed video display according to an exemplary embodiment;

Fig. 6 is a schematic flow diagram of superposing trace image data and video session data according to an exemplary embodiment;

Fig. 7 is a flowchart describing the details of step S430 of the embodiment corresponding to Fig. 4;

Fig. 8 is a block diagram of a device for realizing playing-content control in a video session according to an exemplary embodiment;

Fig. 9 is a block diagram describing the details of the trace-image obtaining module of the embodiment corresponding to Fig. 8;

Fig. 10 is a block diagram describing further details of the trace-image obtaining module of the embodiment corresponding to Fig. 9.
Detailed description of the embodiments
Exemplary embodiments will be described in detail here, with examples illustrated in the accompanying drawings. In the following description, when the accompanying drawings are referred to, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention; rather, they are merely examples of devices and methods consistent with some aspects of the invention as detailed in the appended claims.
Fig. 1 is a schematic diagram of an implementation environment involved in the present disclosure. The implementation environment includes an anchor-side terminal 110, a viewer-side terminal 120, and a background server 130; the anchor-side terminal 110 and the viewer-side terminal 120 communicate through the background server 130. The anchor-side terminal 110 can collect video session data for live streaming, and superpose the trace image data of the touch-event track onto the video session data collected at the corresponding moment. The background server 130 is configured to distribute and broadcast the video session data superposed by the anchor-side terminal 110. The viewer-side terminal 120 receives the superposed video session data broadcast by the background server 130, and then plays the video session data containing the touch-event track.
Fig. 2 is a schematic diagram of another possible implementation environment involved in the present disclosure. The implementation environment includes a plurality of mobile terminals 210 and the association between the mobile terminals 210, including hardware, network association modes and/or protocols, and the data exchanged between them. Video calls can be conducted between the mobile terminals 210, and each mobile terminal 210 can serve as a terminal of the video session, transmitting to the other mobile terminals 210 video session data on which trace image data has been superposed.
Fig. 3 is a block diagram of a device 300 according to an exemplary embodiment. For example, the device 300 may be the anchor-side terminal 110 in the implementation environment shown in Fig. 1 or a mobile terminal 210 in the implementation environment shown in Fig. 2. The anchor-side terminal 110 and the mobile terminal 210 may be, for example, a smart phone or a tablet computer.
Referring to Fig. 3, the device 300 may include one or more of the following components: a processing component 302, a memory 304, a power supply component 306, a multimedia component 308, an audio component 310, a sensor component 314, and a communication component 316.

The processing component 302 generally controls the overall operations of the device 300, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 302 may include one or more processors 318 to execute instructions so as to complete all or part of the steps of the methods below. In addition, the processing component 302 may include one or more modules to facilitate interaction between the processing component 302 and other components. For example, the processing component 302 may include a multimedia module to facilitate interaction between the multimedia component 308 and the processing component 302.
The memory 304 is configured to store various types of data to support operation on the device 300. Examples of such data include instructions of any application or method operated on the device 300. The memory 304 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The memory 304 also stores one or more modules, which are configured to be executed by the one or more processors 318 to complete all or part of the steps of the method for realizing playing-content control in a video session shown in any of Fig. 4, Fig. 6, and Fig. 7 below.
The power supply component 306 provides power for the various components of the device 300. The power supply component 306 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 300.
The multimedia component 308 includes a screen providing an output interface between the device 300 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel. If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. The screen may also include an organic light-emitting display (OLED).
The audio component 310 is configured to output and/or input audio signals. For example, the audio component 310 includes a microphone (MIC); when the device 300 is in an operation mode such as a call mode, a recording mode, or a speech recognition mode, the microphone is configured to receive external audio signals. The received audio signals may be further stored in the memory 304 or sent via the communication component 316. In some embodiments, the audio component 310 further includes a loudspeaker for outputting audio signals.
The sensor component 314 includes one or more sensors for providing state assessments of various aspects of the device 300. For example, the sensor component 314 can detect the open/closed state of the device 300 and the relative positioning of components; the sensor component 314 can also detect a position change of the device 300 or of a component of the device 300, as well as a temperature change of the device 300. In some embodiments, the sensor component 314 may further include a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 316 is configured to facilitate wired or wireless communication between the device 300 and other equipment. The device 300 can access a wireless network based on a communication standard, such as WiFi (Wireless Fidelity). In an exemplary embodiment, the communication component 316 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 316 further includes a near-field communication (NFC) module to promote short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth technology, and other technologies.
In an exemplary embodiment, the device 300 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors, digital signal processing devices, programmable logic devices, field-programmable gate arrays, controllers, microcontrollers, microprocessors, or other electronic components, for executing the following methods.
Fig. 4 is a flowchart of a method for realizing playing-content control in a video session according to an exemplary embodiment. Regarding the scope of application and the executing subject of the method, it can be used in a terminal carrying out a video session. For example, the method can be used in the anchor-side terminal 110 of the implementation environment shown in Fig. 1 or a mobile terminal 210 of the implementation environment shown in Fig. 2. As shown in Fig. 4, the control method, executed by the terminal carrying out the video session, may include the following steps.
In step S410, a triggered drawing function is responded to, and a drawing area is created on the interface where the video session is carried out.

For example, the terminal carrying out the video session may be the anchor-side terminal of a live stream, or any one of a plurality of mobile terminals conducting a video call, such as a terminal making a video call with WeChat or QQ. Specifically, when the user triggers the drawing function button, the terminal carrying out the video session responds to the triggered drawing function, starts the charting component, and creates a drawing area on the interface where the video session is carried out. Within the drawing area, the user can draw a pattern, such as text or a figure, with a finger or a stylus.
In step S430, the touch events triggered within the drawing area are tracked, a touch-event track is generated, and corresponding trace image data is obtained.

Specifically, when the user draws a pattern in the drawing area with a finger or a stylus, the charting component monitors the touch events triggered within the drawing area. The occurrence positions of the touch events form points, and consecutive points constitute lines, that is, the touch-event track. Optionally, after the touch-event track is generated, it can be displayed in the drawing area; that is, the lines constituted by the occurrence positions of consecutive touch events, or the pattern composed of multiple lines, are displayed. As needed, the color of the lines can be set to red, and the color of the drawing area other than the lines can be set to transparent.
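As an illustrative sketch (not part of the patent text), the track described above can be modeled as a list of touch positions rasterized onto a transparent RGBA canvas, with opaque red pixels marking the lines; the function and variable names, canvas size, and colors here are assumptions:

```python
# Hypothetical sketch: rasterize a touch-event track onto a transparent RGBA canvas.
# Red opaque pixels mark the track; everything else stays fully transparent,
# mirroring the "red lines, transparent elsewhere" choice described above.

RED = (255, 0, 0, 255)    # opaque red line color
CLEAR = (0, 0, 0, 0)      # fully transparent background

def rasterize_track(points, width, height):
    """Return a height x width grid of RGBA tuples with the track drawn in red."""
    canvas = [[CLEAR for _ in range(width)] for _ in range(height)]
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            canvas[y][x] = RED
    return canvas

# Consecutive touch positions reported while the finger or stylus moves:
track = [(1, 1), (2, 1), (3, 2)]
frame = rasterize_track(track, width=5, height=4)
```

In a real implementation, consecutive points would additionally be connected by line segments rather than plotted individually; the sketch only shows the transparent-background idea.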
According to the touch-event track, the trace image data corresponding to the touch-event track can be obtained. Specifically, for the lines constituted by the occurrence positions of consecutive touch events in the drawing area, the line-image data at different moments can be obtained, that is, the trace image data corresponding to the touch-event track. The trace image data may be a continuous sequence of frames of image data, each including the pixel value of every pixel.
In step S450, the trace image data is superposed with the video session data obtained at the corresponding moment, so that the superposed trace image data causes the touch-event track to be displayed during playing of the video session data.

The video session data may include video image data and voice data, or only video image data. The video session data can be collected by the image collecting device and voice collecting device with which the terminal is configured.
Specifically, the terminal can fuse the trace image data obtained at a certain moment with the video session data obtained at the corresponding moment (for example, at the same moment or within a certain time interval) through an image fusion technique, for example by superposing the pixel value of each pixel in the trace image data with the pixel value of the corresponding pixel in the video session data before transmission. Thus, the other party in the video call with the terminal can display the touch-event track corresponding to the trace image data when playing the video session data; alternatively, the viewer client watching the live stream can display the touch-event track corresponding to the trace image data when playing the video session data. As shown in Fig. 5, the anchor can draw figures on the anchor-side terminal; the drawn figures generate trace image data, which is written into the live video data stream and then transmitted to the viewer-side terminal for graphical display. Thus, the interaction forms between the anchor and the viewers can be enriched. The superposition of trace image data and video session data can be realized using the prior art. For example, as shown in Fig. 5, after superposition the video image can be displayed in a first area 51, and the touch-event track can be displayed in a second area 52.
It should be noted that, since the second area 52 is located within the first area 51, the second area 52 may block the display of the video image in the first area 51. Preferably, in step S450, superposing the trace image data with the video session data obtained at the corresponding moment may specifically include: superposing the trace image data, in the manner of a video watermark, with the video session data obtained at the corresponding moment. That is, the touch-event track can be displayed in the manner of a watermark during the playing of the video session data, so that displaying the touch-event track does not affect the playing of the video session data, i.e., does not block the display of the video image. Existing video watermarking algorithms can be used to superpose the trace image data with the video session data, so as to embed the watermark pattern drawn on the terminal side into the video image.
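A minimal sketch of such a watermark-style superposition, assuming RGBA trace frames and RGB video frames of the same size (the function name and sample pixel values are illustrative, not from the patent):

```python
# Hypothetical sketch: overlay an RGBA trace frame onto an RGB video frame.
# Fully transparent trace pixels leave the video untouched, so the track is
# shown without masking the rest of the video image (watermark-style overlay).

def overlay_watermark(trace, video):
    """Alpha-blend each trace pixel over the corresponding video pixel."""
    out = []
    for trace_row, video_row in zip(trace, video):
        row = []
        for (tr, tg, tb, ta), (vr, vg, vb) in zip(trace_row, video_row):
            a = ta / 255.0  # alpha of the trace pixel (0 = transparent)
            row.append((
                round(tr * a + vr * (1 - a)),
                round(tg * a + vg * (1 - a)),
                round(tb * a + vb * (1 - a)),
            ))
        out.append(row)
    return out

trace = [[(255, 0, 0, 255), (0, 0, 0, 0)]]   # one red pixel, one transparent
video = [[(10, 20, 30), (10, 20, 30)]]
blended = overlay_watermark(trace, video)
```

This is the standard "over" compositing rule; a production pipeline would run it on the GPU per frame rather than per pixel in Python.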
Referring to Fig. 6, the superposition of trace image data and video session data can be based on GPUImage (an open-source third-party library for picture and video processing). As can be seen from Fig. 6, the watermark layer containing the text or graphic information (i.e., the touch-event track) first generates a watermark image (i.e., the trace image data), while the frame information of the video file is packaged into a video image. Taking the watermark image and the video image as input, the watermarked video image is generated through the processing of GPUImage. The watermarked video image is then output to the live stream. Of course, when the watermarked video image is output, the audio data is output at the same time. Driven by a timer, the process shown in Fig. 6 is executed in a loop, successively superposing the trace image data with the video session data frame by frame.
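The timer-driven loop above can be sketched as follows; this is a simplified stand-in for the GPUImage pipeline, and the function name and the tuple-based "composition" are assumptions made for illustration:

```python
# Hypothetical sketch: timer-driven loop that, on each tick, composes the
# current watermark image (trace frame) with the video frame captured at the
# same tick and appends the watermarked result to the outgoing live stream.

def run_overlay_loop(trace_frames, video_frames):
    """Compose trace/video frame pairs in order, one pair per timer tick."""
    stream = []
    for watermark, frame in zip(trace_frames, video_frames):
        watermarked = (frame, watermark)  # stand-in for the GPUImage composition
        stream.append(watermarked)        # output to the live stream
    return stream

stream = run_overlay_loop(["w0", "w1", "w2"], ["f0", "f1", "f2"])
```

In the real pipeline, each tick would also carry the audio data for the same interval alongside the watermarked video frame.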
Taking the application scenario shown in Fig. 1 as an example, the terminal carrying out the video session is the anchor-side terminal. In the prior art, the user of the anchor-side terminal (i.e., the anchor) can only transmit information to the users of the viewer-side terminals (i.e., the viewers) through speech and other body movements; the anchor cannot exchange complex information with the viewers in real time, the interaction forms during live streaming are single, and current live-streaming demands cannot be satisfied. The method for realizing playing-content control in a video session provided by the embodiments of the present disclosure can be used by the above anchor-side terminal for live video streaming, and can certainly also be used in a video call.
By creating a drawing area, tracking the touch events within the drawing area, generating a touch-event track, and superposing the obtained trace image data with the video session data obtained at the corresponding moment, the present disclosure enables the pattern of the touch-event track to be displayed while the video session data is played. Thus, during a video session and without resorting to a physical carrier, the user can draw text or figures by touch within the drawing area, and the drawn pattern is displayed during playback of the video session data, making the played content of the video session richer. This solves the current problems that during live streaming the anchor can only transmit information to viewers through speech or other body movements, cannot exchange complex information with viewers in real time, and the interaction forms of the live-streaming process are single.
Fig. 7 is a flowchart describing the details of step S430 of the embodiment corresponding to Fig. 4. As shown in Fig. 7, step S430, i.e., tracking the touch events triggered within the drawing area, generating a touch-event track, and obtaining the corresponding trace image data, may specifically include the following steps.

In step S431, when a touch event triggered within the drawing area is listened to, a timer function is started and the trigger position coordinates of the touch event are recorded, wherein consecutive position coordinates constitute the touch-event track.
Specifically, when the user triggers the drawing function, the charting component is invoked, the drawing area is created by the charting component, and the touch events triggered within the drawing area are listened for, i.e., the touching and moving events of the user's finger or stylus are monitored. When a triggered touch event is listened to, in other words, when the finger or stylus starts to touch the screen where the drawing area is located, the timer function is started, and the position coordinates of the current finger or stylus, that is, the trigger position coordinates of the touch event, are recorded. As the finger or stylus moves on the screen, the consecutive trigger position coordinates constitute lines, that is, the touch-event track. As needed, the lines of the touch-event track can be set to red, the other regions except the lines can be set to transparent, and the pattern constituted by the lines can be displayed on the screen of the terminal.
In step S432, frames of trace image data corresponding to the touch-event track are continuously generated at a preset time interval.

For example, a timer can be set to fire every 0.05 seconds, each firing generating one frame of trace image data containing the line figure of the touch-event track. When the user draws a pattern in the drawing area with a finger or a stylus, the consecutive trigger position coordinates form the touch-event track, and, as time passes, one frame of trace image data about the touch-event track is formed every 0.05 seconds.
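A rough sketch of this interval-driven frame generation, using simulated timestamps in place of a real timer (the function name, the point representation, and the snapshot logic are illustrative assumptions):

```python
# Hypothetical sketch: snapshot the accumulated track once per `interval`.
# Each snapshot is one frame of trace image data containing all points drawn
# up to that tick, mirroring a 0.05 s timer capturing the growing track.

def snapshot_track(timed_points, interval, duration):
    """timed_points: list of (timestamp, point); returns one snapshot per tick."""
    frames = []
    tick = interval
    while tick <= duration + 1e-9:  # small epsilon guards float round-off
        frames.append([p for t, p in timed_points if t <= tick])
        tick += interval
    return frames

# Three touch positions recorded at 0.01 s, 0.04 s, and 0.07 s:
points = [(0.01, (1, 1)), (0.04, (2, 1)), (0.07, (3, 2))]
frames = snapshot_track(points, interval=0.05, duration=0.10)
```

Each frame is cumulative: the track only grows between ticks, so later frames contain every earlier point, which is what lets a single stroke appear to extend across the played video.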
Further, in step S450, superposing the trace image data with the video session data obtained at the corresponding moment specifically includes: successively superposing the frames of trace image data generated at the preset time interval with the video session data obtained at the corresponding moments.

It should be noted that, after step S432 generates frames of trace image data corresponding to the touch-event track at the set time interval (e.g., 0.05 seconds), the frames of trace image data are sequentially superposed onto the video session data obtained at the corresponding moments.
For example, the frame of trace image data generated at 0.05 seconds is superposed with the video session data obtained at 0.05 seconds; the next frame of trace image data generated at 0.1 seconds is superposed with the video session data obtained at 0.1 seconds; the frame after that, generated at 0.15 seconds, is superposed with the video session data obtained at 0.15 seconds; and so on, the generated frames of trace image data are successively superposed with the video session data obtained at the same moments.
In view of data transmission delay, the sequentially generated frames of trace image data can also be successively superposed with the video session data that lags by a certain time interval. For example, the frame of trace image data generated at 0.05 seconds can be superposed with the video session data obtained at 0.06 seconds.
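This delay-tolerant pairing can be sketched as matching each trace frame to the video frame whose timestamp is closest to the trace timestamp plus a fixed offset; the timestamps (in milliseconds, to avoid floating-point comparisons) and the offset value are illustrative:

```python
# Hypothetical sketch: pair each trace frame with the video frame closest to
# (trace timestamp + offset), accommodating a known transmission delay.
# All times are integer milliseconds.

def pair_frames(trace_times, video_times, offset):
    """Return, per trace timestamp, the index of the best-matching video frame."""
    pairs = []
    for t in trace_times:
        target = t + offset
        best = min(range(len(video_times)),
                   key=lambda i: abs(video_times[i] - target))
        pairs.append((t, best))
    return pairs

# Trace frames every 50 ms; video frames every 20 ms; 10 ms delay allowance.
pairs = pair_frames([50, 100], [20, 40, 60, 80, 100, 120], offset=10)
```

On a tie between two equally close video frames, `min` keeps the earlier index, which biases toward the older (already received) frame.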
Further, if no further touch event is detected in the drawing area within a period of time after a touch event occurs, it can be concluded that the user's finger or stylus has left the drawing area and the touch event has ended. The touch event track in the cache can then be cleared while waiting for the next trigger.
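A minimal sketch of this end-of-stroke detection, assuming a hypothetical 0.5-second inactivity timeout and a periodic `tick` callback (neither is specified by the disclosure):

```python
INACTIVITY_TIMEOUT = 0.5  # seconds without a new touch event ends the stroke

class TouchTracker:
    """Caches the touch event track and clears it once no touch event has
    been triggered in the drawing area for INACTIVITY_TIMEOUT seconds."""
    def __init__(self):
        self.track = []          # cached touch event track
        self.last_event_ts = None

    def on_touch(self, x, y, ts):
        """Record a trigger position coordinate and its timestamp."""
        self.track.append((x, y))
        self.last_event_ts = ts

    def tick(self, now):
        """Called periodically; clears the cached track if the stroke ended."""
        if self.last_event_ts is not None and now - self.last_event_ts > INACTIVITY_TIMEOUT:
            self.track.clear()       # touch event ended; wait for the next trigger
            self.last_event_ts = None

tracker = TouchTracker()
tracker.on_touch(10, 20, ts=0.0)
tracker.tick(now=0.1)   # still within the timeout: track kept
tracker.tick(now=1.0)   # past the timeout: cached track cleared
```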
In an exemplary embodiment, the method for realizing playback content control in a video session provided by the present disclosure is applied to an anchor-side terminal used for live streaming. After the trace image data are superimposed on the video session data obtained at the corresponding moment in step S450, the method further comprises:
generating streaming media data from the video session data superimposed with the trace image data, and sending the streaming media data to a live room server, the transmission of the streaming media data triggering the live room server to broadcast the streaming media data to audience clients of the live room.
Specifically, the anchor-side terminal generates streaming media data from the video session data superimposed with the trace image data, and then sends the streaming media data to the live room server. After receiving the streaming media data, the live room server distributes and broadcasts it to the audience clients according to the audience information of the live room. After receiving the streaming media data, an audience client displays the anchor-side video image containing the watermark pattern, thereby enriching the forms in which the anchor conveys information to the audience.
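The anchor-to-server-to-audience flow can be sketched as follows; the class and method names are hypothetical stand-ins for the live room server's actual distribution logic:

```python
def broadcast(superimposed_frames, server):
    """Anchor side: pack the superimposed video session data into
    streaming media data and hand it to the live room server."""
    stream = {"type": "stream", "frames": list(superimposed_frames)}
    server.receive(stream)
    return stream

class LiveRoomServer:
    """Distributes received streaming media data to every audience
    client currently in the live room."""
    def __init__(self, audience):
        self.audience = audience   # audience information of the room
        self.delivered = {}

    def receive(self, stream):
        for client in self.audience:
            self.delivered[client] = stream  # broadcast to each client

server = LiveRoomServer(audience=["viewer_a", "viewer_b"])
broadcast(["frame0", "frame1"], server)
```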
In an exemplary embodiment, the method for realizing playback content control in a video session provided by the present disclosure is applied to a terminal carrying out a video call. After the trace image data are superimposed on the video session data obtained at the corresponding moment in step S450, the method further comprises:
generating streaming media data from the video session data superimposed with the trace image data, and sending the streaming media data to a video call server, the transmission of the streaming media data triggering the video call server to send the streaming media data to the remote terminal of the video call.
Specifically, the first terminal of the video call generates streaming media data from the video session data superimposed with the trace image data, and then sends the streaming media data to the video call server. After receiving the streaming media data, the video call server sends it to the second terminal according to the address of the second terminal in the video call with the first terminal, and the second terminal displays the first-terminal-side video image containing the watermark pattern. Likewise, the second terminal can also use the method provided by the present disclosure to transmit, through the video call server, video session data superimposed with trace image data to the first terminal, so that the first terminal displays the second-terminal-side video image containing the watermark pattern. In this way, the forms of exchange between users carrying out a video call can be enriched.
The following are apparatus embodiments of the present disclosure, which can be used to execute the above method embodiments of realizing playback content control in a video session performed by a terminal carrying out a video session. For details not disclosed in the apparatus embodiments, please refer to the method embodiments of realizing playback content control in a video session of the present disclosure.
Fig. 9 is a block diagram of an apparatus for realizing playback content control in a video session according to an exemplary embodiment. The apparatus can be used in the anchor-side terminal 110 of the implementation environment shown in Fig. 1 or in the mobile terminal 210 of the implementation environment shown in Fig. 2 to execute all or part of the steps of the method for realizing playback content control in a video session shown in any of Figs. 4, 7, and 8. As shown in Fig. 9, the apparatus includes but is not limited to: a drawing area creation module 910, a trace image obtaining module 930, and a data fusion module 950.
The drawing area creation module 910 is configured to respond to a triggered drawing function by creating a drawing area on the interface of the video session.
The trace image obtaining module 930 is configured to track a touch event triggered in the drawing area, generate a touch event track, and obtain corresponding trace image data.
The data fusion module 950 is configured to superimpose the trace image data on the video session data obtained at the corresponding moment, the superimposed trace image data causing the touch event track to be displayed during playback of the video session data.
The implementation details of the functions and effects of the modules in the above apparatus are described in the corresponding steps of the above method for realizing playback content control in a video session, and are not repeated here.
The drawing area creation module 910 may, for example, be a processor of a certain physical structure, such as the processor 318 in Fig. 3.
The trace image obtaining module 930 and the data fusion module 950 may also be functional modules for executing the corresponding steps of the above method for realizing playback content control in a video session. It can be appreciated that these modules may be implemented by hardware, software, or a combination of both. When implemented in hardware, these modules may be embodied as one or more hardware modules, such as one or more application-specific integrated circuits. When implemented in software, these modules may be embodied as one or more computer programs executed on one or more processors, such as a program stored in the memory 304 and executed by the processor 318 of Fig. 3.
Optionally, as shown in Fig. 10, the trace image obtaining module 930 may include but is not limited to:
a trigger recording unit 931, configured to, when a touch event triggered in the drawing area is detected, start a timer function and record the trigger position coordinates of the touch event, wherein continuous position coordinates constitute the touch event track; and
a data generating unit 932, configured to continuously generate, at the preset time interval, frames of trace image data corresponding to the touch event track.
Optionally, the data fusion module 950 may include but is not limited to:
a data fusion unit, configured to sequentially superimpose the trace image data generated at the preset time interval on the video session data obtained at the corresponding moments.
Optionally, the data fusion module 950 may include but is not limited to:
a watermark fusion unit, configured to superimpose the trace image data, in the form of a video watermark, on the video session data obtained at the corresponding moment.
Optionally, the apparatus may also include but is not limited to:
a track display module, configured to display the touch event track in the drawing area.
Optionally, the apparatus may also include but is not limited to:
a live data sending module, configured to generate streaming media data from the video session data superimposed with the trace image data and send the streaming media data to a live room server, the transmission of the streaming media data triggering the live room server to broadcast the streaming media data to audience clients of the live room.
Optionally, the apparatus may also include but is not limited to:
a call data sending module, configured to generate streaming media data from the video session data superimposed with the trace image data and send the streaming media data to a video call server, the transmission of the streaming media data triggering the video call server to send the streaming media data to the remote terminal of the video call.
Optionally, the present disclosure also provides an electronic device, which can be used in the anchor-side terminal 110 of the implementation environment shown in Fig. 1 or in the mobile terminal 210 of the implementation environment shown in Fig. 2 to execute all or part of the steps of the method for realizing playback content control in a video session shown in any of Figs. 4, 7, and 8. The electronic device includes:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the method for realizing playback content control in a video session described in the above exemplary embodiments.
The specific manner in which the processor of the device in this embodiment performs operations has been described in detail in the related method embodiments of realizing playback content control in a video session, and is not elaborated here.
In an exemplary embodiment, a storage medium is also provided. The storage medium is a computer-readable storage medium, for example a temporary or non-transitory computer-readable storage medium containing instructions. The storage medium stores a computer program, which can be executed by the processor 318 of the device 300 to complete the method for realizing playback content control in a video session described in the above embodiments.
It should be understood that the present invention is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.
Claims (15)
1. A method for realizing playback content control in a video session, wherein the method is applied to a terminal participating in a video session, and the method comprises:
responding to a triggered drawing function by creating a drawing area on an interface of the video session;
tracking a touch event triggered in the drawing area, generating a touch event track, and obtaining corresponding trace image data; and
superimposing the trace image data on video session data obtained at a corresponding moment, wherein the superimposed trace image data causes the touch event track to be displayed during playback of the video session data.
2. The method according to claim 1, wherein tracking the touch event triggered in the drawing area, generating the touch event track, and obtaining the corresponding trace image data specifically comprise:
when the touch event triggered in the drawing area is detected, starting a timer function and recording trigger position coordinates of the touch event, wherein continuous position coordinates constitute the touch event track; and
continuously generating, at a preset time interval, frames of trace image data corresponding to the touch event track.
3. The method according to claim 2, wherein superimposing the trace image data on the video session data obtained at the corresponding moment comprises:
sequentially superimposing the trace image data generated at the preset time interval on the video session data obtained at the corresponding moments.
4. The method according to claim 1, wherein superimposing the trace image data on the video session data obtained at the corresponding moment comprises:
superimposing the trace image data, in the form of a video watermark, on the video session data obtained at the corresponding moment.
5. The method according to claim 1, wherein after tracking the touch event triggered in the drawing area and generating the touch event track, the method further comprises:
displaying the touch event track in the drawing area.
6. The method according to claim 1, wherein after the trace image data are superimposed on the video session data obtained at the corresponding moment, the method further comprises:
generating streaming media data from the video session data superimposed with the trace image data, and sending the streaming media data to a live room server, wherein the transmission of the streaming media data triggers the live room server to broadcast the streaming media data to audience clients of the live room.
7. The method according to claim 1, wherein after the trace image data are superimposed on the video session data obtained at the corresponding moment, the method further comprises:
generating streaming media data from the video session data superimposed with the trace image data, and sending the streaming media data to a video call server, wherein the transmission of the streaming media data triggers the video call server to send the streaming media data to the remote terminal of the video call.
8. An apparatus for realizing playback content control in a video session, wherein the apparatus is applied to a terminal participating in a video session, and the apparatus comprises:
a drawing area creation module, configured to respond to a triggered drawing function by creating a drawing area on an interface of the video session;
a trace image obtaining module, configured to track a touch event triggered in the drawing area, generate a touch event track, and obtain corresponding trace image data; and
a data fusion module, configured to superimpose the trace image data on video session data obtained at a corresponding moment, wherein the superimposed trace image data causes the touch event track to be displayed during playback of the video session data.
9. The apparatus according to claim 8, wherein the trace image obtaining module specifically comprises:
a trigger recording unit, configured to, when the touch event triggered in the drawing area is detected, start a timer function and record trigger position coordinates of the touch event, wherein continuous position coordinates constitute the touch event track; and
a data generating unit, configured to continuously generate, at a preset time interval, frames of trace image data corresponding to the touch event track.
10. The apparatus according to claim 9, wherein the data fusion module comprises:
a data fusion unit, configured to sequentially superimpose the trace image data generated at the preset time interval on the video session data obtained at the corresponding moments.
11. The apparatus according to claim 8, wherein the data fusion module comprises:
a watermark fusion unit, configured to superimpose the trace image data, in the form of a video watermark, on the video session data obtained at the corresponding moment.
12. The apparatus according to claim 8, wherein the apparatus further comprises:
a track display module, configured to display the touch event track in the drawing area.
13. The apparatus according to claim 8, wherein the apparatus further comprises:
a live data sending module, configured to generate streaming media data from the video session data superimposed with the trace image data and send the streaming media data to a live room server, wherein the transmission of the streaming media data triggers the live room server to broadcast the streaming media data to audience clients of the live room.
14. An electronic device, wherein the electronic device comprises:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the method for realizing playback content control in a video session according to any one of claims 1-7.
15. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and the computer program can be executed by a processor to complete the method for realizing playback content control in a video session according to any one of claims 1-7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710352069.4A CN108966031B (en) | 2017-05-18 | 2017-05-18 | Method and device for realizing playing content control in video session and electronic equipment |
PCT/CN2018/085494 WO2018210136A1 (en) | 2017-05-18 | 2018-05-03 | Method and apparatus for realizing content playback control in video session, and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710352069.4A CN108966031B (en) | 2017-05-18 | 2017-05-18 | Method and device for realizing playing content control in video session and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108966031A true CN108966031A (en) | 2018-12-07 |
CN108966031B CN108966031B (en) | 2021-06-04 |
Family
ID=64273285
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710352069.4A Active CN108966031B (en) | 2017-05-18 | 2017-05-18 | Method and device for realizing playing content control in video session and electronic equipment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108966031B (en) |
WO (1) | WO2018210136A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109547836A (en) * | 2018-12-05 | 2019-03-29 | 网易(杭州)网络有限公司 | Exchange method and device, electronic equipment, storage medium is broadcast live |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070200925A1 (en) * | 2006-02-07 | 2007-08-30 | Lg Electronics Inc. | Video conference system and method in a communication network |
CN103702040A (en) * | 2013-12-31 | 2014-04-02 | 广州华多网络科技有限公司 | Real-time video graphic decoration superposing processing method and system |
CN105187930A (en) * | 2015-09-18 | 2015-12-23 | 广州酷狗计算机科技有限公司 | Video live broadcasting-based interaction method and device |
CN105959718A (en) * | 2016-06-24 | 2016-09-21 | 乐视控股(北京)有限公司 | Real-time interaction method and device in video live broadcasting |
CN106162230A (en) * | 2016-07-28 | 2016-11-23 | 北京小米移动软件有限公司 | The processing method of live information, device, Zhu Boduan, server and system |
CN106454199A (en) * | 2016-10-31 | 2017-02-22 | 维沃移动通信有限公司 | Video communication method and mobile terminal |
CN106534875A (en) * | 2016-11-09 | 2017-03-22 | 广州华多网络科技有限公司 | Barrage display control method and device and terminal |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110536094A (en) * | 2019-08-27 | 2019-12-03 | 上海盛付通电子支付服务有限公司 | A kind of method and apparatus transmitting information in video call process |
WO2021036561A1 (en) * | 2019-08-27 | 2021-03-04 | 上海盛付通电子支付服务有限公司 | Method and apparatus for transferring information during video call |
CN111524210A (en) * | 2020-04-10 | 2020-08-11 | 北京百度网讯科技有限公司 | Method and apparatus for generating drawings |
CN113709389A (en) * | 2020-05-21 | 2021-11-26 | 北京达佳互联信息技术有限公司 | Video rendering method and device, electronic equipment and storage medium |
CN111491174A (en) * | 2020-05-29 | 2020-08-04 | 广州华多网络科技有限公司 | Virtual gift acquisition and display method, device, equipment and storage medium |
CN111796900A (en) * | 2020-07-23 | 2020-10-20 | 深圳利亚德光电有限公司 | Display control method and device of electronic conference system and electronic conference system |
CN112000252A (en) * | 2020-08-14 | 2020-11-27 | 广州市百果园信息技术有限公司 | Virtual article sending and displaying method, device, equipment and storage medium |
CN112383793A (en) * | 2020-11-12 | 2021-02-19 | 咪咕视讯科技有限公司 | Picture synthesis method and device, electronic equipment and storage medium |
CN112383793B (en) * | 2020-11-12 | 2023-07-07 | 咪咕视讯科技有限公司 | Picture synthesis method and device, electronic equipment and storage medium |
CN113613060A (en) * | 2021-08-03 | 2021-11-05 | 广州繁星互娱信息科技有限公司 | Drawing live broadcast method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2018210136A1 (en) | 2018-11-22 |
CN108966031B (en) | 2021-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108966031A (en) | Method and device, the electronic equipment of broadcasting content control are realized in video session | |
US11151889B2 (en) | Video presentation, digital compositing, and streaming techniques implemented via a computer network | |
WO2020083021A1 (en) | Video recording method and apparatus, video playback method and apparatus, device, and storage medium | |
US9471902B2 (en) | Proxy for asynchronous meeting participation | |
CN106162230A (en) | The processing method of live information, device, Zhu Boduan, server and system | |
US20070067707A1 (en) | Synchronous digital annotations of media data stream | |
CN109089059A (en) | Method, apparatus, electronic equipment and the computer storage medium that video generates | |
JP2009145883A (en) | Learning system, storage medium, and learning method | |
CN106804000A (en) | Direct playing and playback method and device | |
CN102447877A (en) | Optimized telepresence using mobile device gestures | |
TW201408053A (en) | Method and device for displaying content, and method for providing additional information about content | |
CN105791950A (en) | Power Point video recording method and device | |
CN111508531B (en) | Audio processing method and device | |
CN207882853U (en) | A kind of intelligent information release system | |
CN105872822A (en) | Video playing method and video playing system | |
CN108259988A (en) | A kind of video playing control method, terminal and computer readable storage medium | |
CN109451849A (en) | Paging message method of sending and receiving and device, base station, user equipment | |
JP2014532330A (en) | Strengthen video conferencing | |
CN110719529A (en) | Multi-channel video synchronization method, device, storage medium and terminal | |
CN112738544A (en) | Live broadcast room interaction method and device, electronic equipment and storage medium | |
CN107105339A (en) | A kind of methods, devices and systems for playing live video | |
CN107895006A (en) | Audio frequency playing method, device, storage medium and electronic equipment | |
CN109361954A (en) | Method for recording, device, storage medium and the electronic device of video resource | |
CN106888155A (en) | Information gathering and shared method, client and system | |
US9671939B2 (en) | Systems and methods for automatic generation and consumption of hypermeetings |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||