CN110602521A - Method, system, computer-readable medium and device for measuring mixed-picture delay - Google Patents

Method, system, computer-readable medium and device for measuring mixed-picture delay

Info

Publication number
CN110602521A
CN110602521A (application CN201910959170.5A)
Authority
CN
China
Prior art keywords
video data
data
timing
client
time delay
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910959170.5A
Other languages
Chinese (zh)
Other versions
CN110602521B (en)
Inventor
李劲
冯迅
陈宇辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Huaduo Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huaduo Network Technology Co Ltd
Priority to CN201910959170.5A
Publication of CN110602521A
Application granted
Publication of CN110602521B
Legal status: Active (current)
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2365Multiplexing of several video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention provides a method, a system, a computer-readable medium and a device for measuring mixed-picture delay. The method includes: acquiring timing data in each video data of a mixed-picture stream, where the mixed-picture stream includes a plurality of video data and the plurality of video data are timed synchronously; and calculating the mixed-picture delay between first video data and second video data using the timing data in the first video data and the timing data in the second video data, where the first video data and the second video data are any two video data in the mixed-picture stream. The aim of testing the mixed-picture delay of multi-user interactive live broadcasting is thereby achieved.

Description

Method, system, computer-readable medium and device for measuring mixed-picture delay
Technical Field
The invention relates to the technical field of live streaming, and in particular to a method, a system, a computer-readable medium and a device for measuring mixed-picture delay.
Background
With the continuing development of technology and its growing influence on daily life, the Internet has become inseparable from everyday life, and many new forms of content have appeared on it; the webcast industry was born out of this rise. Webcasting is an emerging way of social networking, and live-streaming platforms have become a brand-new social medium. Webcasts mainly consist of real-time broadcasts of games, films, television series and the like. In addition, webcasting absorbs and extends the advantages of the Internet: independent signal-acquisition equipment installed on site captures audio data and video data, the captured data is fed into a directing terminal (a directing device or platform), uploaded to a server over the network, and distributed to a website for people to watch.
At present, webcast platforms can support interactive live broadcasting among multiple users, for example real-time multi-person co-hosted (Lianmai) interactive video broadcasts. In multi-user interactive live broadcasting, several anchors send their respective video streams to a server, and the server mixes the video streams into a single mixed-picture stream that is pushed to viewers. The mixed-picture delay of each anchor's picture is therefore an important index of the mixed-picture stream, and it is necessary to test the mixed-picture delay of multi-user interactive live broadcasting.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method, a system, a computer-readable medium and a device for measuring mixed-picture delay, which are used to test the mixed-picture delay of multi-user interactive live broadcasting.
In order to achieve the above purpose, the embodiments of the present invention provide the following technical solutions:
The first aspect of the present invention provides a method for measuring mixed-picture delay, including:
acquiring timing data in each video data of a mixed-picture stream; the mixed-picture stream includes a plurality of video data, and the plurality of video data are timed synchronously;
calculating the mixed-picture delay between first video data and second video data using the timing data in the first video data and the timing data in the second video data; where the first video data and the second video data are any two video data in the mixed-picture stream.
Optionally, the acquiring timing data in each video data of the mixed-picture stream includes:
capturing a plurality of frames of images from the mixed-picture stream;
and recognizing each captured frame of the mixed-picture stream to obtain, in each frame, the timing data corresponding to each video data of the mixed-picture stream.
Optionally, the calculating the mixed-picture delay between the first video data and the second video data using the timing data in the first video data and the timing data in the second video data includes:
calculating an initial mixed-picture delay between the first video data and the second video data in each frame of image using the timing data corresponding to the first video data and the second video data in that frame;
and averaging the initial mixed-picture delays of the first video data and the second video data over the frames to obtain the mixed-picture delay between the first video data and the second video data.
The second aspect of the present invention provides a method for measuring mixed-picture delay, applied to a first client, the method including:
generating first video data; the first video data is used to generate a mixed-picture stream together with at least one second video data, the first video data carries timing data synchronized with each second video data, and the timing data of the first video data and the timing data of any one second video data are used to calculate the mixed-picture delay between the first video data and that second video data.
Optionally, the generating the first video data includes:
acquiring second video data generated by a second client; where the second video data carries timing data;
capturing, in real time, a video playing image of the display screen of the first client while the display screen of the first client plays the second video data;
and generating the first video data using the video playing images captured in real time.
Optionally, the generating the first video data includes:
acquiring second video data generated by a second client; where the second video data carries timing data;
acquiring video data shot by a front-facing camera of the first client while the display screen of the first client plays the second video data; where the front-facing camera of the first client shoots the second video data played on the display screen of the first client.
The third aspect of the present invention provides a device for measuring mixed-picture delay, including:
an acquisition unit, configured to acquire timing data in each video data of a mixed-picture stream; the mixed-picture stream includes a plurality of video data, and the plurality of video data are timed synchronously;
and a calculating unit, configured to calculate the mixed-picture delay between first video data and second video data using the timing data in the first video data and the timing data in the second video data; where the first video data and the second video data are any two video data in the mixed-picture stream.
A fourth aspect of the present invention provides a client, including:
a generation unit, configured to generate first video data; the first video data is used to generate a mixed-picture stream together with at least one second video data, the first video data carries timing data synchronized with each second video data, and the timing data of the first video data and the timing data of any one second video data are used to calculate the mixed-picture delay between the first video data and that second video data.
A fifth aspect of the invention provides a computer-readable medium having a computer program stored thereon, where the program, when executed by a processor, implements the method according to any implementation of the first aspect of the invention or any implementation of the second aspect of the invention.
A sixth aspect of the present invention provides an apparatus comprising:
one or more processors;
a storage device having one or more programs stored thereon;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any implementation of the first aspect of the invention or any implementation of the second aspect of the invention.
The seventh aspect of the present invention provides a system for measuring mixed-picture delay, including:
a mixed-picture delay measuring device, configured to perform the method according to any implementation of the first aspect of the invention;
and a client, configured to perform the method according to any implementation of the second aspect of the invention.
According to the above scheme, in the method, system, computer-readable medium and device for measuring mixed-picture delay provided by the invention, timing data in each video data of a mixed-picture stream is acquired, the mixed-picture stream including a plurality of video data that are timed synchronously; the mixed-picture delay between first video data and second video data is then calculated using the timing data in the first video data and the timing data in the second video data, where the first video data and the second video data are any two video data in the mixed-picture stream. The aim of testing the mixed-picture delay of multi-user interactive live broadcasting is thereby achieved.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario of a method for measuring mixed-picture delay according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a live broadcast device used in the method for measuring mixed-picture delay according to an embodiment of the present invention;
fig. 3 is a specific flowchart of a method for measuring mixed-picture delay according to an embodiment of the present invention;
fig. 4 is a schematic diagram of adding an online stopwatch in the method for measuring mixed-picture delay according to an embodiment of the present invention;
fig. 5 is a detailed flowchart of an implementation manner of step S301 according to another embodiment of the present invention;
fig. 6 is a detailed flowchart of an implementation manner of step S301 according to another embodiment of the present invention;
fig. 7 is a detailed flowchart of an implementation manner of step S303 according to another embodiment of the present invention;
fig. 8 is a detailed flowchart of an implementation manner of step S304 according to another embodiment of the present invention;
FIG. 9 is a diagram illustrating an image frame of a mixed-picture stream according to another embodiment of the present invention;
FIG. 10 is a diagram illustrating an image frame of a mixed-picture stream according to another embodiment of the present invention;
fig. 11 is a schematic diagram of a device for measuring mixed-picture delay according to another embodiment of the present invention;
fig. 12 is a schematic diagram of an acquisition unit 1101 according to another embodiment of the present invention;
fig. 13 is a schematic diagram of a client according to another embodiment of the present invention;
fig. 14 is a schematic diagram of a generating unit 1301 provided in another embodiment of the present invention;
fig. 15 is a schematic diagram of a generating unit 1301 provided in another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings in the embodiments of the present invention. It is obvious that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort fall within the protection scope of the present invention.
At present, webcasts in China fall roughly into two categories. In the first, television signals are provided on the Internet for viewing, for example live broadcasts of sports competitions and cultural events; the principle is that television (analog) signals are captured, converted into digital signals, input into a computer and uploaded to a website in real time for people to watch, which amounts to "network television". The second is webcasting in the true sense: independent signal-acquisition equipment is set up at the broadcast site to capture audio data and video data, which are fed into a directing terminal (generally a directing device or platform), uploaded to a server over the network and published to a website for user clients to watch. The biggest difference between this kind of webcast and the former lies in its autonomy: independently controlled audio and video acquisition is entirely different from merely relaying a television signal. Webcasting can also serve applications that television media find difficult to broadcast live, such as open government meetings, public hearings, court trials, civil-service examination training, product launches, corporate meetings, industry conferences and exhibitions.
As webcasting develops, more and more anchors raise their popularity by co-hosting with other anchors, or wish to commentate an event or watch a video together with other anchors. In such cases the server needs to mix the anchor's video data with the video data of the other anchors into a mixed-picture stream and push the mixed-picture stream to viewers. The mixed-picture delay of each anchor's picture is therefore an important index of the mixed-picture stream.
As shown in fig. 1, an embodiment of the present invention provides an application scenario of the method for measuring mixed-picture delay, in which a server 10 is communicatively connected to a live broadcast initiating terminal 20 and a live broadcast receiving terminal 30. In this application scenario, the server 10, the live broadcast initiating terminal 20 and the live broadcast receiving terminal 30 are simulated by devices in the product testing stage. The server 10 may be a server that provides a live broadcast service for the live broadcast initiating terminal 20 and the live broadcast receiving terminal 30. The live broadcast initiating terminal 20 may correspond to the anchor-side terminal device after the product goes online; through it an anchor may request the server 10 to join a live broadcast room and send a live video stream to the server 10. The live broadcast receiving terminal 30 may correspond to the viewer-side (user-side) terminal device after the product goes online; through it a viewer may obtain the interactive live video stream sent by the server 10 and watch the live video.
In some implementation scenarios, the live initiator 20 and the live receiver 30 may be used interchangeably. For example, the anchor may use the live originator 20 to provide a live video service to viewers, or may use the live originator 20 as a live viewer to view live video provided by other anchors. For another example, the viewer may use the live broadcast receiving end 30 to watch live video provided by the anchor, or may use the live broadcast receiving end 30 as the anchor to provide live video service for other viewers.
In the embodiment of the present invention, the live broadcast initiating terminal 20 and the live broadcast receiving terminal 30 may be, but are not limited to, a smartphone, a tablet computer, a personal computer, a notebook computer, a virtual-reality terminal device, an augmented-reality terminal device, and the like. The live broadcast initiating terminal 20 and the live broadcast receiving terminal 30 may have Internet products installed for providing Internet live broadcast services, for example applications (APPs) related to Internet live broadcast services used on a computer or smartphone, World Wide Web (Web) pages, applets, and the like.
Only a schematic diagram of the server 10 communicatively coupled to one live initiator 20 and one live receiver 30 is shown in fig. 1, it being understood that the server 10 of the present disclosure may be communicatively coupled to a plurality of live initiators 20 and a plurality of live receivers 30.
As shown in fig. 2, in one implementation of the embodiment of the present invention, the server 10, the live broadcast initiating terminal 20, the live broadcast receiving terminal 30 and the like may each include a storage device 12, a computer-readable medium 13 and a processor 14. The storage device 12 and the processor 14 are electrically connected, directly or indirectly, to enable the transfer or interaction of data; for example, they may be electrically connected via one or more communication buses or signal lines. The computer-readable medium 13 includes at least one software functional module that can be stored in the storage device 12 in the form of software or firmware. The processor 14 is configured to execute an executable computer program stored in the storage device 12, for example the software functional modules and computer programs included in the computer-readable medium 13, so as to implement the method for measuring mixed-picture delay disclosed in the embodiment of the present invention.
It is understood that the structure shown in fig. 2 is only an illustration, and the server 10, the live initiating terminal 20, the live receiving terminal 30, and the like may further include more or less components than those shown in fig. 2, or have a different configuration from that shown in fig. 2, for example, the server 10, the live initiating terminal 20, the live receiving terminal 30, and the like may further include a communication unit for information interaction with other devices. Wherein the components shown in fig. 2 may be implemented in hardware, software, or a combination thereof.
Based on the above application scenario, a method for measuring mixed-picture delay is introduced below.
Referring to fig. 3, the method for measuring mixed-picture delay disclosed in the embodiment of the present invention includes:
s301, the first client generates first video data.
The first video data generated by the first client carries timing data, and the timing data is used for timing the playing of the first video data, and can be understood as the playing time of the first video data. After the first client generates the first video data, the first video data is uploaded to the server 10.
It should be noted that the first client may generate the first video data from the video content that it needs to play. In this case, the timing data may be added manually at the first client: for example, during the broadcast the anchor opens any online stopwatch web page as the timing reference and adds the online stopwatch to the video data by means of screen capture. One way of adding the online stopwatch to the video data is shown in fig. 4, although the method is not limited to what is shown there; after the online stopwatch has been added, the video data carrying it is sent to the server 10.
It should be noted that, whether before or after the product goes online, the tester or expert adding the timing data may select any position in the video image at which to add it.
It should also be noted that the timing data is not limited to an online stopwatch; different timing tools may be used in different application scenarios, and the invention is not limited in this respect.
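As an illustration only, the following Python sketch shows one way a test harness could stamp a synchronized reference clock onto outgoing frames so that the timing data travels with the pixels; it uses OpenCV, and the shared `reference_epoch` as well as the overlay position are assumptions of this sketch rather than details fixed by the patent.

```python
# Minimal sketch (not the patented implementation): draw a synchronized
# elapsed-time label onto each outgoing frame so the mixed-picture stream
# carries visible timing data for later measurement.
import time

import cv2


def stamp_timing_overlay(frame, reference_epoch):
    """Draw the elapsed time since reference_epoch (ms precision) on the frame."""
    elapsed = time.time() - reference_epoch          # synchronized timing data
    label = f"{elapsed:.3f}s"
    cv2.putText(frame, label, (20, 40), cv2.FONT_HERSHEY_SIMPLEX,
                1.0, (0, 255, 0), 2, cv2.LINE_AA)
    return frame
```

In a test, each client would call `stamp_timing_overlay` against the same `reference_epoch` (for example an NTP-aligned start time) so that the overlays on all streams tick in lockstep.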
Optionally, in another embodiment of the present invention, as shown in fig. 5, another implementation manner of step S301 includes:
s501, second video data generated by a second client side are obtained.
Wherein the second video data carries timing data.
In a specific implementation process of this embodiment, the second video data generated by the second client may be directly obtained, or the second video data generated by the second client may be obtained by the server 10.
S502, in the process of playing the second video data on the display screen of the first client, a video playing image of the display screen of the first client is captured in real time.
Specifically, in the process of playing the second video data on the display screen of the first client, the video playing image of each frame in the display screen of the first client may be continuously captured by using the screen capture function of the first client.
S503, generating first video data using the video playing images captured in real time.
Specifically, the timing data in the first video data may be obtained as follows: when the anchor (i.e. the first client) co-hosts with another anchor (i.e. the second client) and pulls the other anchor's video data, if that video data contains timing data, the captured video playing images containing the timing data are combined to form the first video data.
It should be noted that, if there is more than one other anchor, the video data of one anchor is selected as the reference stream; all the anchors simultaneously acquire the timing data in the reference stream, capture the video playing images containing that timing data, and combine them to form their respective first video data.
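As a rough illustration of steps S501 to S503, the following Python sketch grabs the screen region in which the second video data (already carrying timing data) is being played and re-encodes the grabs as the first video data. It relies on the `mss` screen-capture library and OpenCV; the capture region, frame rate, frame count and output path are illustrative assumptions, not values prescribed by the patent.

```python
# Sketch under stated assumptions: capture the on-screen player region in
# real time while the second video data plays, and write the captured
# frames out as the first video data (the timing overlay is captured too).
import cv2
import mss
import numpy as np


def build_first_video(region, out_path="first_video.mp4", fps=25, num_frames=250):
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, (region["width"], region["height"]))
    with mss.mss() as grabber:
        for _ in range(num_frames):
            shot = np.array(grabber.grab(region))           # BGRA screenshot
            frame = cv2.cvtColor(shot, cv2.COLOR_BGRA2BGR)  # drop alpha channel
            writer.write(frame)
    writer.release()


# Hypothetical region covering the player window on the first client's screen.
build_first_video({"left": 0, "top": 0, "width": 1280, "height": 720})
```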
Optionally, in another embodiment of the present invention, as shown in fig. 6, another implementation manner of step S301 may include:
s601, second video data generated by a second client side are obtained.
Wherein the second video data carries timing data.
The second video data generated by the second client may be directly obtained, or may also be obtained by the server 10.
S602, in the process of playing the second video data on the display screen of the first client, video data shot by the front camera of the first client is obtained.
The front-facing camera of the first client shoots second video data played by a display screen of the first client.
In a specific implementation of this embodiment, if the first client has no screen-capture function, a mirror facing the screen of the first client can be used to reflect the second video data, and the front-facing camera of the first client is then turned on to shoot the second video data in the mirror in real time.
And S302, the second client generates second video data.
As in step S301, the second video data generated by the second client also carries timing data, and the timing data carried in the first video data and the second video data are synchronized, that is, the first client and the second client keep time synchronously. The second client likewise uploads the second video data to the server 10.
As noted above, the timing data is not limited to an online stopwatch; different timing tools may be used in different application scenarios, and the invention is not limited in this respect.
In a specific implementation of this embodiment, the first client and the second client can be understood as live broadcast initiating terminals 20 in the application scenario. The first video data and the second video data are used to generate a mixed-picture stream, the first video data carries timing data synchronized with each second video data, and the timing data of the first video data and the timing data of the second video data are used to calculate the mixed-picture delay between the first video data and the second video data.
It can be understood that in this embodiment the first client and the second client are used only to illustrate the testing stage before the product goes online; in actual use after the product goes online there may be a plurality of first clients and second clients, but the specific implementation is unchanged.
S303, the server acquires timing data in each video data of the mixed-picture stream.
The server 10 receives the first video data and the second video data and generates a mixed-picture stream from them. Since the first video data and the second video data both carry timing data, each video data of the mixed-picture stream also carries timing data.
After the server generates the mixed-picture stream, in order to calculate the mixed-picture delay between the video data in the mixed-picture stream, the timing data in each video data of the mixed-picture stream is acquired.
In this embodiment, first video data generated by a first client and second video data generated by a second client are taken as an example. It is understood that the mixed-picture stream may also include more than two video data, that is, the server 10 receives video data uploaded by more than two clients, generates the mixed-picture stream from the received video data, and measures the mixed-picture delay between any two video data in the mixed-picture stream.
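The mix layout itself is not fixed by this embodiment; purely as an assumption for illustration, the sketch below composites synchronized frames from several clients side by side, so that every frame of the mixed-picture stream shows all of the timing overlays at once.

```python
# Sketch (the layout is an assumption, not specified by the patent): build one
# mixed-picture frame by resizing each client's frame and placing the frames
# side by side, so all timing overlays appear in a single composite frame.
import cv2


def mix_frames(frames, tile_size=(640, 360)):
    resized = [cv2.resize(f, tile_size) for f in frames]
    return cv2.hconcat(resized)
```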
Optionally, in another embodiment of the present invention, as shown in fig. 7, an implementation manner of step S303 includes:
and S701, intercepting a plurality of frames of images in the comic stream.
Specifically, the server 10 intercepts each frame of image in the comic stream in which the comic delay is to be calculated, where the frame of image includes video data and timing data.
It should be noted that, when capturing the image in the comic stream, only one frame of image in the comic stream may be captured, but capturing the image containing the timing data under multiple frames in the comic stream may make the calculation of the subsequent comic delay more accurate.
S702, recognizing each captured frame of the mixed-picture stream to obtain, in each frame, the timing data corresponding to each video data of the mixed-picture stream.
When recognizing each captured frame of the mixed-picture stream, the timing data in each frame can be extracted using computer image-recognition technology; other methods of recognizing and extracting the timing data may equally be used, and the invention is not limited in this respect.
Specifically, since the mixed-picture stream is a video stream in which a plurality of video data are composited frame by frame according to a fixed layout, the mixed-picture stream can be analyzed according to that layout, and the position of each video data within the mixed-picture stream can be determined.
It should be noted that during testing the tester chooses the position in the image at which the timing data is added, so after capturing multiple frames of the mixed-picture stream the timing data can be read at the corresponding positions in each image.
Of course, the content of each frame of the mixed-picture stream can also be recognized to locate the timing data, the image at that location can be cropped, and the timing data corresponding to each video data of the mixed-picture stream in each frame can be obtained.
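One concrete shape the image recognition in S702 could take is optical character recognition of the stopwatch digits at the known layout positions. The Python sketch below uses `pytesseract` and OpenCV; both the choice of recognizer and the crop boxes are assumptions of this sketch, not requirements of the embodiment.

```python
# Sketch under stated assumptions: crop each video's region from a
# mixed-picture frame according to the fixed layout and OCR the stopwatch
# digits to recover the timing data per stream.
import cv2
import pytesseract


def read_timing_data(frame, layout_boxes):
    """layout_boxes maps a stream name to its (x, y, w, h) crop in the mix layout."""
    readings = {}
    for name, (x, y, w, h) in layout_boxes.items():
        crop = frame[y:y + h, x:x + w]
        gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
        text = pytesseract.image_to_string(
            gray, config="--psm 7 -c tessedit_char_whitelist=0123456789.s")
        readings[name] = float(text.strip().rstrip("s"))  # e.g. "11.825s" -> 11.825
    return readings
```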
S304, the server calculates the mixed-picture delay between the first video data and the second video data using the timing data in the first video data and the timing data in the second video data.
The first video data and the second video data are any two video data in the mixed-picture stream; the mixed-picture stream may be formed from a plurality of video data.
It should be noted that this embodiment only explains how the mixed-picture delay is calculated; the calculation is not limited to two video data, and the mixed-picture delays of more video data may be calculated at the same time.
It should be noted that the steps S303 and S304 may be executed by the server 10, the live broadcast initiator 20, or the live broadcast receiver 30.
Optionally, in another embodiment of the present invention, as shown in fig. 8, an implementation manner of step S304 includes:
S801, calculating the initial mixed-picture delay between the first video data and the second video data in each frame of image using the timing data corresponding to the first video data and the second video data in that frame.
Specifically, as shown in fig. 9, for a frame containing both the first video data and the second video data, the timing data in the first video data reads 11.825 s and the timing data in the second video data reads 11.696 s. Taking the timing data in the first video data as the reference, the initial mixed-picture delay between the first video data and the second video data is 11.825 s - 11.696 s = 0.129 s.
In a specific implementation of the embodiment of the present invention, multiple frames of the image data shown in fig. 9 may be obtained; the timing data changes over time, and a plurality of initial mixed-picture delays are obtained by the method above.
Specifically, when the mixed-picture stream contains more than two video data, take three video data as an example, as shown in fig. 10: the timing data of the first video data is 52.181 s, that of the second video data is 52.096 s, and that of the third video data is 52.138 s. Since the images of the second and third video data contain the image of the first video data, the timing data in the second and third video data is obtained through the first video data, so the initial mixed-picture delays are calculated with the timing data in the first video data as the reference. As in the embodiment above, the initial mixed-picture delay between the first and second video data is 52.181 s - 52.096 s = 0.085 s, and the initial mixed-picture delay between the first and third video data is 52.181 s - 52.138 s = 0.043 s.
In a specific implementation of the embodiment of the present invention, multiple frames of the image data shown in fig. 10 may be obtained; the timing data changes over time, and a plurality of initial mixed-picture delays are obtained by the method above.
It should be noted that calculating the initial mixed-picture delay is not limited to simple differencing; the result of the difference calculation may also be weighted according to preset weights, among other variations, and the invention is not limited in this respect.
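The per-frame calculation in S801 reduces to a difference against the reference stream's reading; a minimal sketch, reusing the figure values quoted above, is:

```python
# Sketch of the S801 differencing step: initial mixed-picture delay of every
# stream relative to the reference stream whose timing the others re-display.
def initial_delays(readings, reference="video1"):
    ref = readings[reference]
    return {name: round(ref - value, 3)
            for name, value in readings.items() if name != reference}


# Figure 9 example: 11.825 s vs 11.696 s -> 0.129 s
print(initial_delays({"video1": 11.825, "video2": 11.696}))
# Figure 10 example: 52.181 s vs 52.096 s and 52.138 s -> 0.085 s and 0.043 s
print(initial_delays({"video1": 52.181, "video2": 52.096, "video3": 52.138}))
```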
S802, averaging the initial mixed-picture delays of the first video data and the second video data over the frames of images to obtain the mixed-picture delay between the first video data and the second video data.
It should be noted that if the first video data and the second video data have only one frame of image, the initial mixed-picture delay is the mixed-picture delay between them. If they have multiple frames of images, averaging the per-frame initial mixed-picture delays makes the resulting mixed-picture delay more accurate; however, if the per-frame initial mixed-picture delays are found to fluctuate widely during the calculation, warning information is sent to the technical staff or an expert group so that the system can be analyzed after the mixed-picture delay has been calculated. For example, suppose the preset mixed-picture delay is 0.6 s, the initial mixed-picture delay of the first frame is 1 s, and the initial mixed-picture delay of the second frame is 0.1 s. Averaging the two initial delays gives a mixed-picture delay of 0.55 s, which is smaller than the preset value, but because the first frame's initial delay is 1 s and the second frame's is 0.1 s, the system still needs to be analyzed.
Therefore, in the implementation of the embodiment of the present invention, before the per-frame initial mixed-picture delays of the first and second video data are averaged to obtain the mixed-picture delay, they need to be analyzed preliminarily: the fluctuation of the per-frame initial mixed-picture delays is judged, and only when the fluctuation is smaller than a preset fluctuation value are the per-frame initial delays averaged to obtain the final mixed-picture delay between the first and second video data; otherwise, warning information is sent directly to the technical staff or an expert group so that the system can be analyzed.
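A minimal sketch of S802 together with the fluctuation check just described follows; the fluctuation threshold and the way the warning is raised are illustrative assumptions, not values fixed by this embodiment.

```python
# Sketch under stated assumptions: average the per-frame initial delays only
# when their spread is below a preset fluctuation bound, otherwise flag the
# measurement for manual analysis instead of reporting a misleading average.
def mixing_delay(per_frame_delays, max_fluctuation=0.5):
    spread = max(per_frame_delays) - min(per_frame_delays)
    if spread >= max_fluctuation:
        raise ValueError(
            f"initial delays fluctuate by {spread:.3f} s; warn technical staff")
    return sum(per_frame_delays) / len(per_frame_delays)


# Stable case: three frames close to 0.129 s average to about 0.129 s.
print(mixing_delay([0.129, 0.131, 0.127]))
# The text's 1 s / 0.1 s example would raise the warning instead of averaging.
```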
According to the above scheme, in the method for measuring mixed-picture delay provided by the invention, timing data in each video data of a mixed-picture stream is acquired, the mixed-picture stream including a plurality of video data that are timed synchronously; the mixed-picture delay between first video data and second video data is then calculated using the timing data in the first video data and the timing data in the second video data, where the first video data and the second video data are any two video data in the mixed-picture stream. The aim of testing the mixed-picture delay of multi-user interactive live broadcasting is thereby achieved.
Another embodiment of the present invention provides a device for measuring mixed-picture delay, as shown in fig. 11, including:
an acquisition unit 1101, configured to acquire timing data in each video data of the mixed-picture stream.
The mixed-picture stream includes a plurality of video data, and the plurality of video data are timed synchronously; the plurality of video data are the video data sent by a plurality of live broadcast initiating terminals 20, and in the application scenario the video data is the content that the anchors want to broadcast live, which may come from screen captures, cameras or media files.
A calculating unit 1102, configured to calculate the mixed-picture delay between the first video data and the second video data using the timing data in the first video data and the timing data in the second video data.
The first video data and the second video data are any two video data in the mixed-picture stream.
For the specific working process of the unit disclosed in the above embodiment of the present invention, reference may be made to the content of the corresponding method embodiment, as shown in fig. 3, which is not described herein again.
Optionally, in another embodiment of the present invention, an implementation of the acquisition unit 1101, as shown in fig. 12, includes:
a clipping unit 1201, configured to clip a plurality of frames of images in the comic stream.
The identifying unit 1202 is configured to identify each frame image in the captured comic stream, and obtain timing data corresponding to each video data of the comic stream in each frame image.
For a specific working process of the unit disclosed in the above embodiment of the present invention, reference may be made to the content of the corresponding method embodiment, as shown in fig. 7, which is not described herein again.
Optionally, in another embodiment of the present invention, an implementation of the calculating unit 1102 includes:
a calculating subunit, configured to calculate the initial mixed-picture delay between the first video data and the second video data in each frame of image using the timing data corresponding to the first video data and the second video data in that frame;
the calculating subunit is further configured to average the initial mixed-picture delays of the first video data and the second video data over the frames of images to obtain the mixed-picture delay between the first video data and the second video data.
For the specific working process of the units disclosed in the above embodiments of the present invention, reference may be made to the contents of the corresponding method embodiments, as shown in fig. 8, fig. 9, and fig. 10, which are not described herein again.
According to the above scheme, the device for measuring mixed-picture delay provided by the invention acquires, through the acquisition unit 1101, the timing data in each video data of the mixed-picture stream, the mixed-picture stream including a plurality of video data that are timed synchronously; the calculating unit 1102 then calculates the mixed-picture delay between the first video data and the second video data using the timing data in the first video data and the timing data in the second video data, where the first video data and the second video data are any two video data in the mixed-picture stream. The aim of testing the mixed-picture delay of multi-user interactive live broadcasting is thereby achieved.
Another embodiment of the present invention provides a client, as shown in fig. 13, including:
a generating unit 1301 is configured to generate first video data.
The first video data is used to generate a mixed-picture stream together with at least one second video data, the first video data carries timing data synchronized with each second video data, and the timing data of the first video data and the timing data of any one second video data are used to calculate the mixed-picture delay between the first video data and that second video data.
For the specific working process of the unit disclosed in the above embodiment of the present invention, reference may be made to the content of the corresponding method embodiment, which is not described herein again.
Optionally, in another embodiment of the present invention, an implementation manner of the generating unit 1301, as shown in fig. 14, includes:
a second video data obtaining unit 1401, configured to obtain second video data generated by a second client.
Wherein the second video data carries timing data.
The video playing image capturing unit 1402 is configured to capture a video playing image of the display screen of the first client in real time during the process that the display screen of the first client plays the second video data.
A generating sub-unit 1403, configured to generate the first video data by using the video playing image obtained by real-time capturing.
For the specific working process of the unit disclosed in the above embodiment of the present invention, reference may be made to the content of the corresponding method embodiment, as shown in fig. 5, which is not described herein again.
Optionally, in another embodiment of the present invention, as shown in fig. 15, another implementation manner of the generating unit 1301 includes:
a second video data acquisition unit 1501 is configured to acquire second video data generated by a second client.
Wherein the second video data carries timing data.
It should be noted that the second video data acquisition unit 1501 has the same function as the second video data obtaining unit 1401, which is not described again here.
The shooting unit 1502 is configured to obtain video data shot by a front-facing camera of the first client in a process of playing the second video data on the display screen of the first client.
The front-facing camera of the first client is used for shooting second video data played by a display screen of the first client.
For a specific working process of the unit disclosed in the above embodiment of the present invention, reference may be made to the content of the corresponding method embodiment, as shown in fig. 6, which is not described herein again.
According to the above scheme, the client generates the first video data using the generating unit; the first video data is used to generate a mixed-picture stream together with at least one second video data, the first video data carries timing data synchronized with each second video data, and the timing data of the first video data and of any one second video data is subsequently sent to the mixed-picture delay measuring device to calculate the mixed-picture delay between the first video data and that second video data. The aim of testing the mixed-picture delay of multi-user interactive live broadcasting is thereby achieved.
Another embodiment of the invention provides a computer readable medium having a computer program stored thereon, wherein the program when executed by a processor implements the method as in any one of the embodiments above.
Another embodiment of the present invention provides a system for measuring mixed-picture delay, including:
a mixed-picture delay measuring device and a client.
The mixed-picture delay measuring device is configured to perform the method of any of the above embodiments, as shown in figs. 3, 7 and 8; the client is configured to perform the method shown in fig. 5 or fig. 6 of the above embodiments.
In the above embodiments of the present disclosure, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus and method embodiments described above are illustrative only, as the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present disclosure may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part. The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a live broadcast device, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (11)

1. A method for measuring mixed-picture delay, comprising:
acquiring timing data in each video data of a mixed-picture stream; the mixed-picture stream comprises a plurality of video data, and the plurality of video data are timed synchronously;
calculating the mixed-picture delay between first video data and second video data using the timing data in the first video data and the timing data in the second video data; wherein the first video data and the second video data are any two video data in the mixed-picture stream.
2. The measurement method according to claim 1, wherein the acquiring timing data in each video data of the mixed-picture stream comprises:
capturing a plurality of frames of images from the mixed-picture stream;
and recognizing each captured frame of the mixed-picture stream to obtain, in each frame, the timing data corresponding to each video data of the mixed-picture stream.
3. The method according to claim 2, wherein the calculating the mixed-picture delay between the first video data and the second video data using the timing data in the first video data and the timing data in the second video data comprises:
calculating an initial mixed-picture delay between the first video data and the second video data in each frame of image using the timing data corresponding to the first video data and the second video data in that frame;
and averaging the initial mixed-picture delays of the first video data and the second video data over the frames of images to obtain the mixed-picture delay between the first video data and the second video data.
4. A method for measuring mixed-picture delay, applied to a first client, the method comprising:
generating first video data; wherein the first video data is used to generate a mixed-picture stream together with at least one second video data, the first video data carries timing data synchronized with each second video data, and the timing data of the first video data and the timing data of any one second video data are used to calculate the mixed-picture delay between the first video data and that second video data.
5. The measurement method of claim 4, wherein the generating first video data comprises:
acquiring second video data generated by a second client; wherein the second video data carries timing data;
capturing, in real time, a video playing image of the display screen of the first client in the process of playing the second video data on the display screen of the first client;
and generating the first video data using the video playing images captured in real time.
6. The measurement method of claim 4, wherein the generating first video data comprises:
acquiring second video data generated by a second client; wherein the second video data carries timing data;
acquiring video data shot by a front-facing camera of the first client in the process of playing the second video data on the display screen of the first client; wherein the front-facing camera of the first client shoots the second video data played on the display screen of the first client.
7. A device for measuring mixed-picture delay, comprising:
an acquisition unit, configured to acquire timing data in each video data of a mixed-picture stream; the mixed-picture stream comprises a plurality of video data, and the plurality of video data are timed synchronously;
and a calculating unit, configured to calculate the mixed-picture delay between first video data and second video data using the timing data in the first video data and the timing data in the second video data; wherein the first video data and the second video data are any two video data in the mixed-picture stream.
8. A client, comprising:
a generation unit, configured to generate first video data; wherein the first video data is used to generate a mixed-picture stream together with at least one second video data, the first video data carries timing data synchronized with each second video data, and the timing data of the first video data and the timing data of any one second video data are used to calculate the mixed-picture delay between the first video data and that second video data.
9. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1 to 3, or the method of any one of claims 4 to 6.
10. An apparatus, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-3 or the method of any of claims 4-6.
11. A system for measuring mixed-picture delay, comprising:
a mixed-picture delay measuring device, configured to perform the method of any one of claims 1 to 3;
and a client, configured to perform the method of any one of claims 4 to 6.
CN201910959170.5A 2019-10-10 2019-10-10 Method, system, computer-readable medium and device for measuring mixed-picture delay Active CN110602521B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910959170.5A CN110602521B (en) 2019-10-10 2019-10-10 Method, system, computer-readable medium and device for measuring mixed-picture delay

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910959170.5A CN110602521B (en) 2019-10-10 2019-10-10 Method, system, computer-readable medium and device for measuring mixed-picture delay

Publications (2)

Publication Number Publication Date
CN110602521A true CN110602521A (en) 2019-12-20
CN110602521B CN110602521B (en) 2020-10-30

Family

ID=68866224

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910959170.5A Active CN110602521B (en) 2019-10-10 2019-10-10 Method, system, computer-readable medium and device for measuring mixed-picture delay

Country Status (1)

Country Link
CN (1) CN110602521B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2706754A2 (en) * 2012-09-11 2014-03-12 Comcast Cable Communications, LLC Synchronizing program presentation
CN110213635A (en) * 2018-04-08 2019-09-06 腾讯科技(深圳)有限公司 Video mixed flow method, video flow mixing device and storage medium
CN108900859A (en) * 2018-08-17 2018-11-27 广州酷狗计算机科技有限公司 Live broadcasting method and system
CN109257618A (en) * 2018-10-17 2019-01-22 北京潘达互娱科技有限公司 Company wheat interflow method, apparatus and server in a kind of live streaming

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022156721A1 (en) * 2021-01-20 2022-07-28 华为技术有限公司 Photographing method and electronic device

Also Published As

Publication number Publication date
CN110602521B (en) 2020-10-30

Similar Documents

Publication Publication Date Title
CN108574879B (en) Combined live broadcast method and device and electronic equipment
CN110472099B (en) Interactive video generation method and device and storage medium
CN108989883B (en) Live broadcast advertisement method, device, equipment and medium
CN108881956B (en) Live stream transmission method and device and related equipment
CN108243318B (en) Method and device for realizing live broadcast of multiple image acquisition devices through single interface
CN109120990B (en) Live broadcast method, device and storage medium
CN110602521B (en) Method, system, computer readable medium and device for measuring mixed drawing time delay
CN112337100A (en) Live broadcast-based data processing method and device, electronic equipment and readable medium
CN109286760B (en) Entertainment video production method and terminal thereof
WO2018058726A1 (en) Grouping action method, terminal and system
CN108322764B (en) Real-time interaction realization method and device
CN109523844B (en) Virtual live broadcast simulation teaching system and method
CN107968942B (en) Method and system for measuring audio and video time difference of live broadcast platform
Salas et al. Subjective quality evaluations using crowdsourcing
CN105228010B (en) A kind of method and device that TV interaction systems interactive information is set
CN113271474B (en) Method, device, equipment and storage medium for testing streaming media server
CN110166825B (en) Video data processing method and device and video playing method and device
CN116962746A (en) Online chorus method and device based on continuous wheat live broadcast and online chorus system
CN113297065A (en) Data processing method, game-based processing method and device and electronic equipment
CN111901351A (en) Remote teaching system, method and device and voice gateway router
CN112601048A (en) Online examination monitoring method, electronic device and storage medium
CN113645470A (en) Video playing method and device and computer storage medium
Joskowicz et al. Automation of subjective video quality measurements
RU2658893C1 (en) Virtual events attendance service provision method
Zhang et al. A Subjective Quality Assessment Database for Mobile Video Coding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210114

Address after: 510000 3108, 79 Wanbo 2nd Road, Nancun Town, Panyu District, Guangzhou City, Guangdong Province

Patentee after: GUANGZHOU CUBESILI INFORMATION TECHNOLOGY Co.,Ltd.

Address before: 28th floor, block B1, Wanda Plaza, Nancun Town, Panyu District, Guangzhou City, Guangdong Province

Patentee before: GUANGZHOU HUADUO NETWORK TECHNOLOGY Co.,Ltd.

EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20191220

Assignee: GUANGZHOU HUADUO NETWORK TECHNOLOGY Co.,Ltd.

Assignor: GUANGZHOU CUBESILI INFORMATION TECHNOLOGY Co.,Ltd.

Contract record no.: X2021440000053

Denomination of invention: Method, system, computer readable medium and equipment for measuring mixing time delay

Granted publication date: 20201030

License type: Common License

Record date: 20210208
